Reading an entire file into memory can be troublesome, especially if the file is several gigabytes in size. A better way of handling large files is to stream the data instead. In this video we'll stream the data from a file and read individual chunks. Let's go to the exercise files. Open up chapter nine and video two, and then copy those two files to the Desktop. Now let's go to Terminal and change directory to the Desktop. It's capital D, and then press Return.
Now let's take a look at these files. I'm going to take a look at stream.js and pull it into Sublime Text. On line one, I'm requiring the FS module. The FS module is a core module that deals with the file system. Then on line three, I'm calling the create read stream method of FS. I'm creating a stream out of data.json. Instead of reading all the contents of the file at once and storing them in memory, it's creating a stream so that we can read the data one chunk at a time.
Then on line five and on line 12, I'm defining two different listeners for the data event. Both listeners are functions that accept a chunk as their single argument. In the first listener I'm marking the beginning and the end of each chunk, and then in the middle, I'm converting the chunk to a string. In the second listener, I'm simply outputting the length of the chunk. There's one more listener in this file. It's listening for the end event. This event is going to get fired when the stream reaches its end.
And then all I'm doing in this listener is outputting that we've reached the end of the file. So let's go back to Terminal and run this file. I'm going to type node stream.js. Now I'm going to scroll all the way back up to where this started. Now in this case, our file is pretty small, and the stream finished in one chunk. However, both data listeners fired. The chunk was output to the console, and we also displayed the length of the data in the chunk.
If this were a much larger file, this would be a more efficient way of reading the data than trying to wait for the entire file to load in memory. Streaming a file makes it possible to read data before the entire file has loaded. One or more listeners can be attached to a file stream to handle chunks, and end listeners can perform any finishing cleanup. In the next video, we'll have a look at how streams can be paused.