Introduction of NodeJs and Its Core Modules

Node.js

Node.js is an open-source, cross-platform runtime environment built on Chrome's
V8 JavaScript engine, which allows developers to execute JavaScript code on the
server side. It was introduced by Ryan Dahl in 2009.

Node.js comes with several features that make it a popular choice for building
scalable and efficient web applications.
Here are some of the main features of Node.js:

1. Asynchronous and Event-Driven:


- Node.js uses an event-driven, non-blocking I/O model, which makes it lightweight
and efficient. It can handle a large number of concurrent connections without getting
blocked, making it suitable for building real-time applications.

2. JavaScript Everywhere:
- Node.js allows developers to use JavaScript on both the client-side and server-
side, providing a unified language for full-stack development. This facilitates code
reuse and reduces context switching between different languages.

3. Fast Execution:
- Node.js is built on the V8 JavaScript engine, which compiles JavaScript code into
machine code before execution. This leads to fast performance, making Node.js
suitable for building high-performance applications.

4. Single-Threaded, Event Loop:


- Node.js uses a single-threaded event loop architecture, where all I/O operations
are performed asynchronously. This allows Node.js to handle many concurrent
connections efficiently while avoiding the overhead of managing threads.

5. NPM (Node Package Manager):


- Node.js comes with a rich ecosystem of libraries and packages available through
npm, the largest package registry in the world. Developers can easily find and install
reusable modules to extend the functionality of their applications.

6. Scalability:
- Node.js is designed to be scalable, both vertically and horizontally. It can handle a
large number of concurrent connections and scale across multiple CPU cores using
clustering or load balancing techniques.

7. Cross-Platform Compatibility:
- Node.js is cross-platform and runs on various operating systems such as
Windows, macOS, and Linux. This allows developers to write code once and deploy it
across different environments without modification.

8. Community Support:
- Node.js has a vibrant and active community of developers, contributing to its
growth and evolution. The community provides support, documentation, tutorials,
and a vast ecosystem of modules to help developers build robust applications.

9. Real-Time Applications:
- Node.js is well-suited for building real-time applications such as chat
applications, online gaming, collaboration tools, and streaming platforms, where
instant communication and data updates are essential.

10. Microservices Architecture:


- Node.js is often used in microservices architecture, where applications are built
as a collection of small, loosely coupled services. Its lightweight and modular nature
make it well-suited for building and deploying microservices.

These features collectively make Node.js a powerful and versatile platform for
building a wide range of web applications, from simple APIs to complex, real-time
systems.
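The asynchronous, event-driven model described in the features above can be sketched in a few lines: a callback handed to setTimeout is queued on the event loop, so the synchronous code that follows always runs first.

```javascript
// Sketch of the non-blocking, event-driven model: the timer callback is
// queued on the event loop, so the synchronous statement runs first.
const order = [];

setTimeout(() => {
  order.push('async callback');
  console.log(order.join(' -> ')); // synchronous code -> async callback
}, 0);

order.push('synchronous code');
```

This ordering holds even with a 0 ms delay, because the callback can only run once the current synchronous code has finished.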
Node.js process model vs traditional Web Server

Traditional Web Server Model

- Each request is handled by a dedicated thread from the thread pool.
- If no thread is available in the pool, the request must wait until a thread
becomes available.
- A dedicated thread executes a particular request and does not return to the
thread pool until it completes execution and returns a response.

Node.js Process Model

 Node.js runs in a single process, and the application code runs in a single
thread, thereby needing fewer resources than other platforms.
 All user requests to your web application are handled by a single thread,
and all I/O work or long-running jobs are performed asynchronously for a
particular request.
 The thread need not wait for a request to complete; it can take up the next
request.
 When the asynchronous I/O work completes, Node.js processes the request
further and sends the response.
 Node.js uses an event loop architecture, which maintains an event queue to
keep watch on async jobs and assigns each of these jobs to an internal C++
thread pool whose worker threads carry out the job.
 Node.js is not a good fit for applications that perform CPU-intensive
operations such as image processing or other heavy computation.
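The warning about CPU-intensive work can be demonstrated directly: a synchronous busy-loop occupies the single JavaScript thread, so even a timer scheduled for 1 ms cannot fire until the loop finishes.

```javascript
// A CPU-bound loop blocks the single JavaScript thread, delaying the timer
// callback far past its scheduled 1 ms.
const start = Date.now();

setTimeout(() => {
  console.log(`timer fired after ${Date.now() - start} ms`); // ~100 ms, not 1 ms
}, 1);

// Busy-wait for about 100 ms; the event loop cannot run anything meanwhile.
while (Date.now() - start < 100) { /* spin */ }
```

This is why heavy computation is usually pushed to worker threads or separate services rather than run on the main event loop.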

Node.js Core Modules:


Node.js provides a set of core modules that are included with every installation.
These modules offer fundamental functionality for various tasks such as file system
operations, networking, and utilities. Some of the major core modules of Node.js
include:

1. http:
- The `http` module provides functionality to create HTTP servers and make HTTP
requests. It allows you to build web servers and handle incoming HTTP requests.

2. fs (File System):
- The `fs` module provides an API for interacting with the file system. It allows you
to perform file operations such as reading, writing, updating, and deleting files and
directories.

3. path:
- The `path` module provides utilities for working with file and directory paths. It
offers methods for resolving and normalizing paths, joining paths, and extracting
path components.

4. events:
- The `events` module provides an event emitter pattern implementation. It allows
you to create custom event emitters and subscribe to events, enabling event-driven
programming in Node.js.

5. util:
- The `util` module provides various utility functions and classes that are
commonly used in Node.js applications. It includes functions for debugging,
formatting, and inspecting objects.

6. os (Operating System):
- The `os` module provides information about the operating system on which
Node.js is running. It allows you to access system-related information such as CPU
architecture, memory usage, and network interfaces.

7. https:
- In addition to the `http` module, Node.js provides the `https` module for
creating HTTPS servers and making secure HTTPS requests. It extends the
functionality of the `http` module to support SSL/TLS encryption.

8. net (Networking):
- The `net` module provides networking functionality for creating TCP servers and
clients. It allows you to build networked applications that communicate over the TCP
protocol.

9. crypto (Cryptography):
- The `crypto` module provides cryptographic functionality for generating hashes,
encrypting and decrypting data, and creating digital signatures. It includes various
algorithms for secure data manipulation.

10. stream:
- The `stream` module provides a mechanism for streaming data between
different sources and destinations. It allows you to work with streams of data in a
memory-efficient and asynchronous manner.

Example application using http module:

const http = require('http');

const hostname = 'localhost';
const port = 3000;
const sum = (a, b) => a + b;

const server = http.createServer((req, res) => {
  res.statusCode = 200;
  res.setHeader('Content-Type', 'text/plain');
  res.end('Hello World');
});

server.listen(port, hostname, () => {
  console.log(`Server running at http://${hostname}:${port}/`);
  console.log(`Hello good morning to all the students ${sum(20, 30)}`);
});

Ex2:
Example application using the fs module:

var fs = require('fs');

fs.readFile('sometext.txt', function (err, data) {
  if (err) throw err;
  console.log(data.toString());
});
File System ('fs') Module
The built-in File System module allows us to manipulate files, for example:
.open(filename, [flags], callback(err, fd)): open a file (or create a new one)
and pass its file descriptor to the callback.
.readFile(filename, callback(err, data)): read the entire contents of filename.
.writeFile(filename, data, callback(err)): write data into filename, erasing any
existing contents.
.appendFile(filename, data, callback(err)): append the given data to filename.
.unlink(filename, callback(err)): delete a file.
.rename(oldFilename, newFilename, callback(err)): rename a file.

Uses of url module


Here are some common tasks and functionalities provided by the url module:
Parsing URLs: You can use the url.parse() method to parse a URL string and break it
down into its individual components, such as protocol, host, pathname, query
parameters, and more.
Formatting URLs: The url.format() method allows you to construct a URL string
from an object that contains its components. This is useful when you want to create
URLs programmatically.
Resolving URLs: The url.resolve() method helps resolve a relative URL against a base
URL, producing an absolute URL.
URL Component Encoding/Decoding: The global encodeURIComponent() and
decodeURIComponent() functions can be used for encoding and decoding URL
components, ensuring that special characters are correctly represented in URLs.
Operations in url module
Parsing url: You can use the url.parse() method to parse a URL string and
extract its components such as protocol, hostname, query parameters, and more.
This is used for analyzing and manipulating URLs.
const url = require('url');
const urlString = 'https://www.example.com:8080/path?param1=value1&param2=value2';
const parsedUrl = url.parse(urlString, true);
Formatting url: The url.format() method allows you to construct a URL string
from an object that contains the individual components.
const formattedUrl = url.format({
  protocol: 'https:',
  hostname: 'www.example.com',
  port: 8080,
  pathname: '/path',
  query: { param1: 'value1', param2: 'value2' },
});
Accessing URL Components: Once you have a parsed URL object (for example,
myUrl = new URL(urlString)), you can access its components directly as
properties.
console.log(myUrl.protocol); // 'https:'
console.log(myUrl.hostname); // 'www.example.com'
console.log(myUrl.pathname); // '/path'
console.log(myUrl.searchParams.get('param1')); // 'value1'
Resolving URL: The url.resolve() method resolves a relative URL against a base
URL, producing an absolute URL.
const baseUrl = 'https://www.example.com';
const relativeUrl = '/subpath/page.html';
const resolvedUrl = url.resolve(baseUrl, relativeUrl);
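Note that url.parse() and url.resolve() belong to the legacy API; in current Node.js the WHATWG URL class (available as a global, no require needed) is the recommended way to obtain the same components:

```javascript
// WHATWG URL class: the modern replacement for the legacy url.parse().
const myUrl = new URL('https://www.example.com:8080/path?param1=value1&param2=value2');

console.log(myUrl.protocol);                    // 'https:'
console.log(myUrl.hostname);                    // 'www.example.com'
console.log(myUrl.port);                        // '8080'
console.log(myUrl.pathname);                    // '/path'
console.log(myUrl.searchParams.get('param2'));  // 'value2'
```

searchParams also supports iteration and mutation (set, append, delete), which the legacy parsed object does not.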

Stream Module
A stream is a sequence of data that is moved from one point to another over time
and processed in a sequential manner.
Ex: a stream of data over the internet transferred from one computer to another.
Ex: a stream of data moving from one file to another within the same computer.
Streams process data in chunks as it arrives, instead of waiting for the entire
payload to be available before processing.
Ex: watching a video on YouTube: the data arrives in chunks, and you watch in
chunks while the rest of the data is downloaded over time.
Benefits: preventing unnecessary downloads and memory usage.
Node.js includes a built-in module called stream which lets us work with
streaming data, for example:
Handling HTTP requests
Reading/writing files
Socket communication
Pipelines allow us to build applications involving multiple streams; gluing
streams together can be done using the pipe() or pipeline() functions of the
stream module.
Streams provide two major advantages compared to other data handling methods:
Memory efficiency: you don't need to load large amounts of data in memory
before you are able to process it.
Time efficiency: it takes significantly less time to start processing data as
soon as you have it, rather than having to wait until the entire payload has
been transmitted.

Types of Node.js Streams


Node.js provides four types of streams:
Readable Streams
Writable Streams
Duplex Streams
Transform Streams

Readable Streams
A readable stream can read data from a particular data source, most commonly, from
a file system. Other common uses of readable streams in Node.js applications are:
process.stdin - To read user input via stdin in a terminal application.
http.IncomingMessage - To read an incoming request's content in an HTTP server or
to read the server HTTP response in an HTTP client.

Writable Streams
Writable streams allow us to write data from an application to a specific destination,
for example, a file.
process.stdout can be used to write data to standard output and is used internally
by console.log.
Duplex and transform streams can be considered 'hybrid' stream types built on
readable and writable streams.

Duplex Streams
A duplex stream is a combination of both readable and writable streams.
It provides the capability to write data to a particular destination and read data from
a source.
The most common example of a duplex stream is net.Socket, used to read and write
data to and from a socket.

Transform Streams
A transform stream is similar to a duplex stream, except that in a transform
stream the readable side is connected to the writable side.
A good example would be the crypto.Cipher class which implements an encryption
stream.
Using a crypto.Cipher stream, an application can write plain text data into the
writable side of a stream and read encrypted ciphertext out of the readable side of the
stream.
The transformative nature of this type of stream is why they are called 'transform
streams'.
Transform streams are also used for compression, encryption and data validation.

Buffers
The Buffer class was introduced as part of the Node.js API to make it possible to
manipulate or interact with streams of binary data.

// Demonstrating the functionality provided by the Node.js Buffer class

const buffer = Buffer.from('RAMESH');
console.log(buffer.toJSON());
console.log(buffer);
console.log(buffer.toString());
console.log(buffer.length);
console.log(buffer.slice(0, 3));
buffer.write("Digital Transformation"); // only the first 6 bytes fit
console.log(buffer.toString());
const buffer1 = Buffer.from([72, 73, 74, 75]);
console.log(buffer1);
const buffer2 = Buffer.alloc(20);
console.log(buffer2);
buffer2.fill('a');
console.log(buffer2);
buffer2.write("Learning from WE Teacher!!!"); // truncated to 20 bytes
console.log(buffer2);
console.log(buffer2.length);
console.log(buffer2.toString());
console.log(buffer2.toString("utf-8", 0, 10));
const buffer4 = Buffer.from(buffer2);
console.log(buffer4.toString());
const buffer5 = Buffer.concat([buffer1, buffer2]);
console.log(buffer5.length);
console.log(buffer5.toString());

Ex2: file I/O operation using buffering


var fs = require("fs");
var buf = Buffer.alloc(1024);
console.log("opening an existing file");
fs.open('index.txt', 'r+', function (err, fd) {
  if (err) {
    return console.error(err);
  }
  console.log("File opened successfully!");
  console.log("reading the file");

  fs.read(fd, buf, 0, buf.length, 0, function (err, bytes) {
    if (err) {
      console.log(err);
    }
    console.log(bytes + " bytes read");

    // Print only the bytes actually read to avoid junk.
    if (bytes > 0) {
      console.log(buf.slice(0, bytes).toString());
    }
  });
});

Streaming applications
ReadableStream and WritableStream
Ex1:

const fs = require("fs");
const readable = fs.createReadStream("./sayHello.js", { highWaterMark: 20 });

readable.on("data", (chunk) => {


console.log(`Read ${chunk.length} bytes\n"${chunk.toString()}"\n`);
});

Ex2:
const {Readable} = require('stream');

const inStream = new Readable({


read() {}
});

inStream.push('Web');
inStream.push('Engineering');
inStream.push(null);
inStream.pipe(process.stdout);

Ex3:
const fs = require("fs");
const readable = fs.createReadStream("./sayHello.js", { highWaterMark: 20 });

let bytesRead = 0;

console.log(
`before attaching 'data' handler. is flowing: ${readable.readableFlowing}`
);
readable.on("data", (chunk) => {
console.log(`Read ${chunk.length} bytes`);
bytesRead += chunk.length;

// Pause the readable stream after reading 60 bytes from it.


if (bytesRead === 60) {
readable.pause();
console.log(`after pause() call. is flowing: ${readable.readableFlowing}`);

// resume the stream after waiting for 1s.


setTimeout(() => {
readable.resume();
console.log(
`after resume() call. is flowing: ${readable.readableFlowing}`
);
}, 1000);
}
});
console.log(
`after attaching 'data' handler. is flowing: ${readable.readableFlowing}`
);

Ex4:
var fs = require('fs');
var readableStream = fs.createReadStream('sometext.txt');
var writableStream = fs.createWriteStream('file2.txt');

readableStream.setEncoding('utf8');

readableStream.on('data', function(chunk) {
writableStream.write(chunk);
});

Ex5:
var fs = require('fs');
const { writeHeapSnapshot } = require('v8');
var readableStream = fs.createReadStream('./output.txt',{highWaterMark:20});
var writableStream = fs.createWriteStream('./output2.txt');

readableStream.setEncoding('utf8');
readableStream.on('open', ()=>{
console.log("Opened readstream for reading the file");
});

readableStream.on('end',()=>{
console.log("end of reading from a readable stream.");
});
let i=0;
readableStream.on('data', function(chunk) {
console.log("writing on the file using writeable streams");
console.log(`${++i}`);
writableStream.write(chunk);
});

writableStream.on('drain', () => {
  console.log('now it is appropriate for the stream to resume writing to the file.');
});

Ex6:
const fs=require('fs');
//creating a writeablestream
const writestream=fs.createWriteStream('./output3.txt');

writestream.write("Hello guys this is the first line\n");


writestream.write("This is the second message\n");
writestream.end();

writestream.on('finish', () => {
  console.log("Finished writing on to the file");
});
const msg = `trying to write on the stream after closing`;
writestream.on('error', () => {
  console.log(`${msg}: an error has occurred and been reported`);
});
writestream.write('this cannot be written'); // writing after end() emits 'error'

Events module:
Ex1:

const event = require('events');

class Obj1 extends event.EventEmitter { }

class Obj2 extends event.EventEmitter { }

const ob1 = new Obj1();

const ob2 = new Obj2();

ob1.on('event1', () => {
console.log('Event1 is triggered inst1!');
});

ob2.on('event1', () => {
console.log('Event1 is triggered on inst2!');
});
ob1.on('event3',()=>{
console.log("Event3 is triggered on inst1!");
})
ob1.emit('event1');
ob2.emit('event1');
ob1.emit('event3');
console.log(ob1.eventNames());

Ex2:
const EventEmitter = require('events');

class Stock extends EventEmitter {


constructor(symbol, price) {
super();
this._symbol = symbol;
this._price = price;
}
set price(newPrice) {
if (newPrice !== this._price) {
this.emit('PriceChanged', {
symbol: this._symbol,
oldPrice: this._price,
newPrice: newPrice,
adjustment: ((newPrice - this._price) * 100 / this._price).toFixed(2)
});
this._price = newPrice; // keep the stored price in sync
}
}
get price() {
return this._price;
}
get symbol() {
return this._symbol;
}
}
const stock = new Stock('AAPL', 700);

stock.on('PriceChanged', (arg) => {
console.log(`The price of the stock ${arg.symbol} has changed ${arg.adjustment}%`);
})

stock.price = 720; // a changed price triggers the 'PriceChanged' event

Ex3:
const EventEmitter = require('events');

const emitter = new EventEmitter();

// declare the event handler


function log(arg) {
console.log(`A saved event occurred, name: ${arg.name}, id: ${arg.id}, hit: ${arg.hit}`);
}

// attach the event listener to the saved event


emitter.on('saved', log);

let count = 0;
// emit the saved event
emitter.emit('saved', {
id: 501,
name: 'event1',
hit:++count
});
// remove the event listener
emitter.off('saved', log);

// no effect
emitter.emit('saved', {
id: 502,
name: 'event2',
hit:++count
});

OS Module:
Ex1:
const os = require('os');
console.log(os.arch());
console.log(os.homedir());
console.log(os.hostname());

const cpus=os.cpus();
console.log(cpus);
console.log((os.totalmem()/1073741824)+ "GB" );
console.log(`freemem: ${os.freemem()}`);
console.log(os.networkInterfaces());
console.log(os.platform());
console.log(os.type());
console.log(os.userInfo())
