Introduction to Node.js and Its Core Modules
Node.js is an open-source, cross-platform runtime environment built on Chrome's
V8 JavaScript engine, which allows developers to execute JavaScript code on the
server-side.
It was introduced by Ryan Dahl in 2009.
Node.js comes with several features that make it a popular choice for building
scalable and efficient web applications.
Here are some of the main features of Node.js:
1. JavaScript Everywhere:
- Node.js allows developers to use JavaScript on both the client-side and server-
side, providing a unified language for full-stack development. This facilitates code
reuse and reduces context switching between different languages.
2. Fast Execution:
- Node.js is built on the V8 JavaScript engine, which compiles JavaScript code into
machine code before execution. This leads to fast performance, making Node.js
suitable for building high-performance applications.
3. Cross-Platform Compatibility:
- Node.js is cross-platform and runs on various operating systems such as
Windows, macOS, and Linux. This allows developers to write code once and deploy it
across different environments without modification.
4. Community Support:
- Node.js has a vibrant and active community of developers, contributing to its
growth and evolution. The community provides support, documentation, tutorials,
and a vast ecosystem of modules to help developers build robust applications.
5. Real-Time Applications:
- Node.js is well-suited for building real-time applications such as chat
applications, online gaming, collaboration tools, and streaming platforms, where
instant communication and data updates are essential.
These features collectively make Node.js a powerful and versatile platform for
building a wide range of web applications, from simple APIs to complex, real-time
systems.
Node.js process model vs traditional Web Server
Node.js runs in a single process, and the application code runs in a single
thread, thereby needing fewer resources than other platforms.
All user requests to your web application are handled by that single
thread, and all I/O work or long-running jobs are performed
asynchronously for a particular request.
The thread need not wait for a request to complete; it can take up the next
request.
When the asynchronous I/O work completes, the thread processes the request
further and sends the response.
Node.js uses an event-loop architecture: the event loop maintains an event
queue to keep watch on asynchronous jobs and assigns each job to an internal
C++ thread pool, whose worker threads carry out the work.
Node.js is not a good fit for applications that perform CPU-intensive
operations, such as image processing or other heavy computation.
Core Modules of Node.js
1. http:
- The `http` module provides functionality to create HTTP servers and make HTTP
requests. It allows you to build web servers and handle incoming HTTP requests.
2. fs (File System):
- The `fs` module provides an API for interacting with the file system. It allows you
to perform file operations such as reading, writing, updating, and deleting files and
directories.
3. path:
- The `path` module provides utilities for working with file and directory paths. It
offers methods for resolving and normalizing paths, joining paths, and extracting
path components.
4. events:
- The `events` module provides an event emitter pattern implementation. It allows
you to create custom event emitters and subscribe to events, enabling event-driven
programming in Node.js.
5. util:
- The `util` module provides various utility functions and classes that are
commonly used in Node.js applications. It includes functions for debugging,
formatting, and inspecting objects.
6. os (Operating System):
- The `os` module provides information about the operating system on which
Node.js is running. It allows you to access system-related information such as CPU
architecture, memory usage, and network interfaces.
7. http(s):
- In addition to the `http` module, Node.js also provides the `https` module for
creating HTTPS servers and making secure HTTPS requests. It extends the
functionality of the `http` module to support SSL/TLS encryption.
8. net (Networking):
- The `net` module provides networking functionality for creating TCP servers and
clients. It allows you to build networked applications that communicate over the TCP
protocol.
9. crypto (Cryptography):
- The `crypto` module provides cryptographic functionality for generating hashes,
encrypting and decrypting data, and creating digital signatures. It includes various
algorithms for secure data manipulation.
10. stream:
- The `stream` module provides a mechanism for streaming data between
different sources and destinations. It allows you to work with streams of data in a
memory-efficient and asynchronous manner.
Ex2:
Example application using the fs module (the readFile call was missing from the
notes; 'input.txt' is a placeholder filename):
var fs = require('fs');
fs.readFile('input.txt', function(err, data) {
if (err) throw err;
console.log(data.toString());
});
File System ('fs') Module
The built-in File System module allows us to manipulate files, such as:
.open(filename, [encoding], callback(err, file)): open a file or create a new file.
.readFile(filename, callback(err, data)): reads the entire contents of filename.
.writeFile(filename, data, callback(err)): writes data into filename, erasing existing
contents.
.appendFile(filename, data, callback(err)): appends given data to filename.
.unlink(filename, callback(err)): deletes the file.
.rename(oldFilename, newFilename, callback(err)): renames the file.
Stream Module
A Stream is a sequence of data that is being moved from one point to another over
time and processed in sequential manner.
Ex: a stream of data over the internet transferred from one computer to another.
Ex: a stream of data moving from one file to another within the same computer.
Streams process data in chunks as they arrive, instead of waiting for the entire
data to be available before processing.
Ex: watching a video on YouTube: the data arrives in chunks and you watch in
chunks while the rest of the data arrives/downloads over time.
Benefits: preventing unnecessary downloads and memory usage.
Node.js includes a built-in module called stream which lets us work with streaming
data.
Handling HTTP requests
Reading /Writing to Files
Making Socket Communication
Pipelines allow us to build applications involving multiple streams.
Gluing the streams together can be done using pipe() or pipeline() of the stream
module.
Streams basically provide two major advantages compared to other data handling
methods:
Memory efficiency: you don’t need to load large amounts of data in memory before
you are able to process it
Time efficiency: processing can start as soon as the first chunk of data arrives,
rather than waiting until the entire payload has been transmitted.
Readable Streams
A readable stream can read data from a particular data source, most commonly, from
a file system. Other common uses of readable streams in Node.js applications are:
process.stdin - To read user input via stdin in a terminal application.
http.IncomingMessage - To read an incoming request's content in an HTTP server or
to read the server HTTP response in an HTTP client.
Writable Streams
Writable streams allow us to write data from an application to a specific destination,
for example, a file.
process.stdout can be used to write data to standard output and is used internally
by console.log.
Duplex and transform streams can be considered 'hybrid' stream types built on
readable and writable streams.
Duplex Streams
A duplex stream is a combination of both readable and writable streams.
It provides the capability to write data to a particular destination and read data from
a source.
The most common example of a duplex stream is net.Socket, used to read and write
data to and from a socket.
Transform Streams
A transform stream is similar to a duplex stream, but in a transform stream the
readable side is connected to the writable side.
A good example would be the crypto.Cipher class which implements an encryption
stream.
Using a crypto.Cipher stream, an application can write plain text data into the
writable side of a stream and read encrypted ciphertext out of the readable side of the
stream.
The transformative nature of this type of stream is why they are called 'transform
streams'.
Transform streams are also used for compression, encryption and data validation.
Buffers
The Buffer class was introduced as part of the Node.js API to make it possible to
manipulate or interact with streams of binary data.
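A short sketch of working with Buffers directly; strings are decoded to and from raw bytes with an explicit encoding.

```javascript
// Build a buffer from a UTF-8 string.
const buf = Buffer.from('Node.js', 'utf8');

console.log(buf.length);                 // 7 (byte length)
console.log(buf.toString('hex'));        // hex view of the raw bytes
console.log(buf.toString('utf8', 0, 4)); // "Node" (decode only a slice)

// Allocate a zero-filled buffer and write a 32-bit value into it.
const buf2 = Buffer.alloc(4);
buf2.writeUInt32BE(0xdeadbeef, 0);
console.log(buf2.toString('hex'));       // deadbeef
```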
Streaming applications
ReadableStream and WriteableStream
Ex1:
const fs = require("fs");
// create a readable stream that reads ./sayHello.js in chunks of at most 20 bytes
const readable = fs.createReadStream("./sayHello.js", { highWaterMark: 20 });
Ex2:
const {Readable} = require('stream');
// the inStream constructor call was missing from the notes
const inStream = new Readable({ read() {} });
inStream.push('Web');
inStream.push('Engineering');
inStream.push(null); // null signals the end of the stream
inStream.pipe(process.stdout);
Ex3:
const fs = require("fs");
const readable = fs.createReadStream("./sayHello.js", { highWaterMark: 20 });
let bytesRead = 0;
console.log(
`before attaching 'data' handler. is flowing: ${readable.readableFlowing}`
);
readable.on("data", (chunk) => {
console.log(`Read ${chunk.length} bytes`);
bytesRead += chunk.length;
});
Ex4:
var fs = require('fs');
var readableStream = fs.createReadStream('sometext.txt');
var writableStream = fs.createWriteStream('file2.txt');
readableStream.setEncoding('utf8');
readableStream.on('data', function(chunk) {
writableStream.write(chunk);
});
Ex5:
var fs = require('fs');
var readableStream = fs.createReadStream('./output.txt',{highWaterMark:20});
var writableStream = fs.createWriteStream('./output2.txt');
readableStream.setEncoding('utf8');
readableStream.on('open', ()=>{
console.log("Opened readstream for reading the file");
});
readableStream.on('end',()=>{
console.log("end of reading from a readable stream.");
});
let i=0;
readableStream.on('data', function(chunk) {
console.log("writing on the file using writeable streams");
console.log(`${++i}`);
writableStream.write(chunk);
});
writableStream.on('drain', ()=>{
console.log('now it is appropriate for the stream to resume writing to the file.');
});
Ex6:
const fs=require('fs');
//creating a writeablestream
const writestream=fs.createWriteStream('./output3.txt');
writestream.on('finish',()=>{
console.log("Finished writing on to the file");
})
// end() closes the stream; this call appears to have been dropped from the
// notes, and any write after it triggers the 'error' event handled below
writestream.end('closing the stream');
writestream.write('this cannot be written');
const msg=`trying to write on the stream after closing`;
writestream.on('error',()=>{
console.log(`${msg}: an error has occurred`);
})
Events module:
Ex1:
const EventEmitter = require('events');
// the two emitter instances were missing from the notes
const ob1 = new EventEmitter();
const ob2 = new EventEmitter();
ob1.on('event1', () => {
console.log('Event1 is triggered on inst1!');
});
ob2.on('event1', () => {
console.log('Event1 is triggered on inst2!');
});
ob1.on('event3',()=>{
console.log("Event3 is triggered on inst1!");
})
ob1.emit('event1');
ob2.emit('event1');
ob1.emit('event3');
console.log(ob1.eventNames());
Ex2:
const EventEmitter = require('events');
// only the final line of this example survived in the notes; a minimal
// reconstruction: a stock object that announces price changes
const stock = new EventEmitter();
stock.on('priceChange', (p) => console.log(`price is now ${p}`));
stock.price = 700;
stock.emit('priceChange', stock.price);
Ex3:
const EventEmitter = require('events');
// the emitter and the 'log' listener were missing from the notes
const emitter = new EventEmitter();
const log = (arg) => console.log(`saved: id=${arg.id}, name=${arg.name}, hit=${arg.hit}`);
emitter.on('saved', log);
let count = 0;
// emit the saved event
emitter.emit('saved', {
id: 501,
name: 'event1',
hit:++count
});
// remove the event listener
emitter.off('saved', log);
// no effect
emitter.emit('saved', {
id: 502,
name: 'event2',
hit:++count
});
OS Module:
Ex1:
const os=require('os');
console.log(os.arch());
console.log(os.homedir());
console.log(os.hostname());
const cpus=os.cpus();
console.log(cpus);
console.log((os.totalmem()/1073741824)+ "GB" );
console.log(`freemem: ${os.freemem()}`);
console.log(os.networkInterfaces());
console.log(os.platform());
console.log(os.type());
console.log(os.userInfo())