Node.js Interview Questions
Node.js
Source: tutorialspoint.com
● Asynchronous and Event Driven - All APIs of the Node.js library are asynchronous, that is,
non-blocking. It essentially means a Node.js based server never waits for an API to
return data. The server moves to the next API after calling it, and a notification mechanism
of Node.js events helps the server get the response from the previous API call.
● Very Fast - Being built on Google Chrome's V8 JavaScript Engine, the Node.js library is
very fast in code execution.
● Single Threaded but Highly Scalable - Node.js uses a single-threaded model with
event looping. The event mechanism helps the server respond in a non-blocking way
and makes the server highly scalable, as opposed to traditional servers which create
a limited number of threads to handle requests. Node.js uses a single-threaded program, and the
same program can service a much larger number of requests than a traditional server
like Apache HTTP Server.
● No Buffering - Node.js applications never buffer any data. These applications
simply output the data in chunks.
Source: tutorialspoint.com
Node.js has built-in events and built-in event listeners. Node.js also provides functionality
to create custom events and custom event listeners.
Source: lazyquestion.com
When Node.js gets an I/O request, it creates or uses a thread to perform that I/O operation, and
once the operation is done it pushes the result to the event queue. On each such event, the
event loop runs and checks the queue, and if the execution stack of Node.js is empty it
adds the queued result to the execution stack.
Source: codeforgeek.com
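A minimal sketch of that flow, assuming a file named example.txt exists: the read is offloaded, the synchronous log runs first, and the callback is picked up from the queue once the stack is empty.

const fs = require('fs');

// The read is handed off to libuv; Node.js does not wait for it.
fs.readFile('example.txt', 'utf8', (err, data) => {
  if (err) throw err;
  console.log('2) callback runs when the event loop picks it up');
});

console.log('1) synchronous code keeps running immediately');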
Q13: What is Callback Hell? ☆☆
Answer: An asynchronous function takes a callback as a parameter. When multiple
asynchronous functions are chained (nested) together, the callback hell situation comes
up.
Source: codeforgeek.com
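A small runnable illustration (step1, step2 and step3 are hypothetical async operations): each additional step adds another level of nesting, producing the "pyramid of doom".

function step1(value, cb) { setTimeout(() => cb(null, value + 1), 100); }
function step2(value, cb) { setTimeout(() => cb(null, value * 2), 100); }
function step3(value, cb) { setTimeout(() => cb(null, value - 3), 100); }

step1(1, (err, a) => {
  if (err) return console.error(err);
  step2(a, (err, b) => {
    if (err) return console.error(err);
    step3(b, (err, c) => {
      if (err) return console.error(err);
      console.log('final result:', c); // nesting keeps growing with each step
    });
  });
});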
The child process module has the following three major ways to create child processes:
● child_process.exec() - runs a command in a shell and buffers the output.
● child_process.spawn() - launches a new process with a given command and streams its I/O.
● child_process.fork() - a special case of spawn() that creates a new Node.js process with a built-in communication channel.
Source: codeforgeek.com
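A small sketch of spawn(), the streaming variant (the ls command and its arguments are just illustrative):

const { spawn } = require('child_process');

// Launch a child process and stream its stdout back to us.
const child = spawn('ls', ['-l']);

child.stdout.on('data', (chunk) => {
  console.log(`stdout: ${chunk}`);
});

child.on('close', (code) => {
  console.log(`child exited with code ${code}`);
});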
● EventEmitter
● Stream
● FS
● Net
● Global Objects
Source: github.com/jimuyouyou
// Using return stops execution of the lines that follow the callback:
return callback();
// some more lines of code - won't be executed

// Without return, execution continues after the callback:
callback();
// some more lines of code - will be executed

Of course, returning also helps the context calling the async function get the value returned by
the callback:

function do2(callback) {
  console.log('Execute function: do2');
  return callback('do2 callback param');
}

const do2Result = do2((param) => param);
console.log(`print ${do2Result}`);

Output:

Execute function: do2
print do2 callback param
Source: stackoverflow.com
● Read - Reads the user's input, parses it into a JavaScript data structure and stores
it in memory.
● Eval - Evaluates the data structure.
● Print - Prints the result.
● Loop - Loops the above commands until the user presses ctrl-c twice.
Source: tutorialspoint.com
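A short illustrative REPL session (started with the node command, exited with .exit):

$ node
> const x = 21
undefined
> x * 2
42
> .exit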
This makes Node.js highly scalable, as it can process a high number of requests without
waiting for any function to return results.
Source: tutorialspoint.com
Every I/O operation requires a callback - once the operation is done, its callback is pushed onto the
event loop for execution. Since most modern kernels are multi-threaded, they can handle multiple
operations executing in the background. When one of these operations completes, the
kernel tells Node.js so that the appropriate callback may be added to the poll queue to
eventually be executed.
Source: blog.risingstack.com
Q34: How to avoid callback hell in Node.js? ☆☆☆
Answer: Node.js internally uses a single-threaded event loop to process queued events.
But this approach may lead to blocking the entire process if there is a task running longer
than expected.
Beyond that, heavy use of callbacks can lead to complex and unreadable code: the more
callbacks there are, the longer the chain of returning callbacks becomes.
There are four solutions which can address the callback hell problem:
● Modularization - split the logic into smaller modules, and then join them together from the
main module to achieve the desired result.
● Use the async module - the async module has the <async.waterfall> API, which passes data from one
operation to the other using the next callback. Another async API, <async.map>, allows iterating over
a list of items in parallel and calls back with another list of results. With the async approach, the
caller's callback gets called only once; the caller here is the main method using the async module.
● Use promises - promises give an alternate way to write async code. They either return the result of
execution or the error/exception. Implementing promises requires the use of the <.then()>
function, which waits for the promise object to settle. It takes two optional arguments,
both functions; depending on the state of the promise, only one of them will get called.
The first function is called if the promise gets fulfilled; however, if the promise gets
rejected, the second function will get called.
● Use generators - generators are lightweight routines; they make a function wait and resume via the yield
keyword. Generator functions use a special syntax, <function* ()>. They can also suspend
and resume asynchronous operations using constructs such as promises, and turn
synchronous-looking code into asynchronous execution.
Source: techbeamers.com
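A minimal sketch of the async.waterfall style mentioned above (it assumes the third-party async package is installed; the task names are hypothetical):

const async = require('async');

async.waterfall([
  function getNumber(callback) {
    callback(null, 21);            // hand 21 to the next task
  },
  function doubleIt(n, callback) {
    callback(null, n * 2);         // hand 42 to the final callback
  }
], function done(err, result) {
  if (err) return console.error(err);
  console.log(result);             // 42 - the caller's callback runs exactly once
});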
Node.js works asynchronously, using the event loop and callback functions to handle
multiple requests coming in parallel. The event loop is the mechanism that handles and
processes all external events and converts them into callback invocations. It invokes
all the event handlers at the proper time. Thus, a lot of work is done in the background while
a single request is being processed, so that a new incoming request doesn't have to wait for that
processing to complete.
While processing a request, Node.js attaches a callback function to it and moves it to the
background. Whenever its response is ready, an event is emitted which triggers the
associated callback function to send the response.
Source: techbeamers.com
If threading support is desired in a Node.js application, there are tools available to enable
it, such as the ChildProcess module.
Source: lazyquestion.com
Domains provide a way to handle multiple different I/O operations as a single group. So,
by having your application, or part of it, running in a separate domain, you can safely
handle exceptions at the domain level, before they reach the Process level.
Source: lazyquestion.com
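A minimal sketch of domain-level handling (note that the domain module is deprecated in current Node.js releases; the file path is just illustrative):

const domain = require('domain');
const fs = require('fs');

const d = domain.create();

// Errors thrown from async work started inside d.run() are routed here
// instead of crashing the process.
d.on('error', (err) => {
  console.error('Handled at the domain level:', err.message);
});

d.run(() => {
  fs.readFile('/no/such/file', (err, data) => {
    if (err) throw err; // thrown asynchronously, caught by the domain
    console.log(data.toString());
  });
});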
Q19: What is stream and what are types of streams available in Node.js?
☆☆☆
Answer: Streams are collections of data that might not be available all at once and don't
have to fit in memory. Streams provide chunks of data in a continuous manner, which is useful
for reading a large set of data and processing it. There are four types of streams:
● Readable.
● Writeable.
● Duplex.
● Transform.
Readable streams, as the name suggests, are used for reading a large chunk of data from a
source. Writable streams are used for writing a large chunk of data to a destination.
Duplex streams are both readable and writable (e.g. a socket). A Transform stream is a
duplex stream which is used to modify the data (e.g. zip creation).
Source: codeforgeek.com
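A small sketch of consuming a readable stream chunk by chunk (the file name is just illustrative):

const fs = require('fs');

// The file is delivered in chunks instead of being loaded into memory at once.
const readable = fs.createReadStream('big-file.log', { encoding: 'utf8' });

readable.on('data', (chunk) => {
  console.log(`received ${chunk.length} characters`);
});

readable.on('end', () => console.log('done'));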
Q5: What tools can be used to assure consistent code style? ☆☆☆☆
Answer: You have plenty of options to do so; common choices include linters and formatters
such as ESLint, JSHint and Prettier, usually combined with a shared style guide such as the
Airbnb or Standard style.
These tools are really helpful when developing code in teams, to enforce a given style
guide and to catch common errors using static analysis.
Source: blog.risingstack.com
A use-case can be a file read, when you do not want to read an actual file:

var fs = require('fs');
var sinon = require('sinon'); // stubbing library assumed by this example
var readFileStub = sinon.stub(fs, 'readFile').callsFake((path, cb) => cb(null, 'filecontent'));

expect(readFileStub).to.be.called;
readFileStub.restore();
Source: blog.risingstack.com
However, Node.js can facilitate deployment on multi-core systems, where it does use the
additional hardware. It ships with a cluster module which is capable of starting
multiple Node.js worker processes that will share the same port.
Source: techbeamers.com
Q8: Is Node.js entirely based on a single-thread? ☆☆☆☆
Answer: Yes, it's true that Node.js processes all requests on a single thread. But that is just a
part of the theory behind the Node.js design. In fact, beyond the single-thread mechanism,
it makes use of events and callbacks to handle a large number of requests asynchronously.
Moreover, Node.js has an optimized design which utilizes both JavaScript and C++ to
guarantee maximum performance. JavaScript executes on the server side in Google
Chrome's V8 engine, and the C++ libuv library takes care of the non-sequential I/O via
background workers.
To explain it practically, let's assume there are hundreds of requests lined up in the Node.js queue.
As per the design, the main thread of the Node.js event loop receives all of them and forwards
them to background workers for execution. Once the workers finish processing a request, the
registered callback is notified on the event loop thread to pass the result back to the user.
Source: techbeamers.com
● Joyent's Guide
● Debugger
● Node Inspector
● Visual Studio Code
● Cloud9
● Brackets
Source: stackoverflow.com
function asyncTask() {
return functionA()
.then((valueA) => functionB(valueA))
.then((valueB) => functionC(valueB))
.then((valueC) => functionD(valueC))
.catch((err) => logger.error(err))
}
Source: stackoverflow.com
Q19: Are you familiar with the differences between Node.js modules and ES6
modules? ☆☆☆
Answer: The modules used in Node.js follow a module specification known as
the CommonJS specification. The recent updates to the JavaScript language,
in the form of ES6, specify changes to the language, adding things like new class syntax
and a module system. This module system is different from Node.js modules. To import an
ES6 module, we'd use the ES6 import syntax.
ES6 modules are incompatible with Node.js CommonJS modules. This has to do with the way
modules are loaded differently between the two formats. If you use a compiler like Babel,
you can mix and match module formats.
Source: stackoverflow.com
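A small illustration of the two formats (the file names are hypothetical):

// CommonJS (Node.js) module - e.g. math.js
module.exports.add = (a, b) => a + b;
// consumer
const { add } = require('./math');

// ES6 module - e.g. math.mjs
export const add = (a, b) => a + b;
// consumer
import { add } from './math.mjs';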
Q20: What are the use cases for the Node.js "vm" core module? ☆☆☆
Answer: It can be used to safely execute a piece of code contained in a string or file. The
execution is performed in a separate environment that by default has no access to the
environment of the program that created it. Moreover, you can specify execution timeout
and context-specific error handling.
Source: quora.com
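A brief sketch of running such code in its own context with a timeout (the snippet string is just illustrative):

const vm = require('vm');

const sandbox = { result: 0 };
vm.createContext(sandbox);  // contextify the sandbox object

// The code only sees the sandbox, not this program's variables,
// and is aborted if it runs longer than 50 ms.
vm.runInContext('result = 2 + 3', sandbox, { timeout: 50 });

console.log(sandbox.result); // 5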
Q1: What is Piping in Node? ☆☆☆☆
Answer: Piping is a mechanism to connect output of one stream to another stream. It is
normally used to get data from one stream and to pass output of that stream to another
stream. There is no limit on piping operations.
Source: tutorialspoint.com
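A typical piping sketch - reading a file, compressing it through a Transform stream and writing the result (the file names are illustrative):

const fs = require('fs');
const zlib = require('zlib');

// The output of each stream becomes the input of the next one.
fs.createReadStream('input.txt')
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('input.txt.gz'));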
Q12: How to gracefully shut down a Node.js server? ☆☆☆☆
Answer: We can gracefully shut down a Node.js server by handling the generic termination
signals SIGTERM or SIGINT. When one of these signals is received, we stop accepting new
connections, clean up the resources used by the program (open connections, file handles,
etc.) and then let the process exit.
Source: codingdefined.com
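A minimal sketch of that pattern (the port number is arbitrary):

const http = require('http');

const server = http.createServer((req, res) => res.end('ok'));
server.listen(3000);

function shutdown() {
  // Stop accepting new connections; exit once in-flight requests finish.
  server.close(() => process.exit(0));
}

process.on('SIGTERM', shutdown);
process.on('SIGINT', shutdown);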
function cb() {
  console.log('Processed in next iteration');
}
process.nextTick(cb);
console.log('Processed in the first iteration');

Output:

Processed in the first iteration
Processed in next iteration
Source: github.com/jimuyouyou
Q15: What are LTS releases of Node.js and why should you care? ☆☆☆☆
Answer: An LTS (Long Term Support) version of Node.js receives all the critical bug fixes,
security updates and performance improvements.
LTS versions of Node.js are supported for at least 18 months and are indicated by even
version numbers (e.g. 4, 6, 8). They're best for production since the LTS release line is
focused on stability and security, whereas the Current release line has a shorter lifespan
and more frequent updates to the code. Changes to LTS versions are limited to bug fixes
for stability, security updates, possible npm updates, documentation updates and certain
performance improvements that can be demonstrated to not break existing applications.
Source: github.com/i0natan/nodebestpractices
Q16: Provide some example of config file separation for dev and prod
environments ☆☆☆☆
Answer: A perfect and flawless configuration setup should ensure that keys can be read
from a file and from environment variables, that secrets are kept outside the committed
code, and that the config is hierarchical for easier findability. For example, a hierarchical
config object can keep per-environment values apart:

var config = {
  production: {
    mongo: {
      billing: '****'
    }
  },
  default: {
    mongo: {
      billing: '****'
    }
  }
};
Source: github.com/i0natan/nodebestpractices
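A possible way to pick the right section at runtime, assuming the object above is exported from a config.js module:

// config.js exports the object above; NODE_ENV selects the section.
const allConfig = require('./config');
const env = process.env.NODE_ENV || 'default';
const config = allConfig[env] || allConfig.default;

console.log(config.mongo.billing);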
Q17: How would you handle errors for async code in Node.js? ☆☆☆☆
Answer: Handling async errors in callback style (the error-first approach) is probably the
fastest way to hell (a.k.a. the pyramid of doom). It's better to use a reputable promise
library or async/await instead, which enables a much more compact and familiar code
syntax like try-catch.
doWork()
  .then(doWork)
  .then(doOtherWork)
  .then((result) => doWork)
  .catch((error) => { throw error; })
  .then(verify);
or using async/await:
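A sketch of what that could look like (assuming the same doWork, doOtherWork and verify helpers):

async function run() {
  try {
    const first = await doWork();
    const second = await doOtherWork(first);
    await verify(second);
  } catch (error) {
    // one familiar try/catch handles every awaited step
    console.error(error);
  }
}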
Source: github.com/i0natan/nodebestpractices
● dependencies - Dependencies that your project needs to run, like a library that
provides functions that you call from your code. They are installed transitively (if A
depends on B depends on C, npm install on A will install B and C).
● devDependencies - Dependencies you only need during development or testing, like test
frameworks or build tools. They are not installed transitively (if A depends on B, npm
install on A will not install B's devDependencies).
Source: stackoverflow.com
Answer:
// Promises can be used together with async/await (ES2017) to make the program flow wait
// for a fulfilled result
async function foo() {
  var result = await divisionAPI(1, 2); // awaits the fulfilled result
  console.log(result);
}

readfile('/some/file')
  .then((data) => {
    /** ... **/
  })
  .catch((err) => {
    /** ... **/
  });
Source: stackoverflow.com
Async/await is syntactic sugar built on top of promises. Async functions can make use of the
await expression, which pauses the async function and waits for the Promise to resolve
before moving on.
{
  console.time("loop");
  for (var i = 0; i < 1000000; i += 1) {
    // Do nothing
  }
  console.timeEnd("loop");
}

The time required to run this code in Google Chrome is considerably more than the time
required to run it in Node.js. Explain why this is so, even though both use the V8 JavaScript
engine.
Source: codeforgeek.com
● cloning using the Cluster module.
● decomposing the application into smaller services - i.e. microservices.
Source: codeforgeek.com
At the same time, there is an Event Loop which iterates over the items in the Event Queue.
Every event has a callback function associated with it, and that callback function is invoked
when the Event Loop iterates.
Source: hackernoon.com
● Error throwing - a well-established pattern, in which a function does its thing and, if an
error situation arises, it simply bails out by throwing an error. This can leave you in an
unstable state, and it requires extra work to catch the errors. Wrapping the async calls
in try/catch won't help because the errors happen asynchronously; to fix this, we
need domains, which provide an asynchronous try...catch.
try {
  validateObject('123');
}
catch (err) {
  console.log('Thrown: ' + err.message);
}
● Error callback - returning an error via a callback is the most common error handling
pattern in Node.js. Handling error callbacks can become a mess (callback hell or
the pyramid of doom).
● Error emitting - when emitting errors, the errors are broadcast to any interested
subscribers and handled within the same process tick, in the order subscribed.
doWork()
  .then(doWork)
  .then(doError)
  .then(doWork)
  .catch(errorHandler)
  .then(verify);

async function f() {
  try {
    let response = await fetch('http://no-such-url');
  } catch (err) {
    alert(err); // TypeError: failed to fetch
  }
}
f();
Source: gist.github.com
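A minimal sketch of the error-first callback pattern mentioned above (getQuote and the quote object are hypothetical):

function getQuote(symbol, callback) {
  // Simulate async work; errors travel as the first callback argument.
  setImmediate(() => {
    if (!symbol) {
      return callback(new Error('symbol is required'));
    }
    callback(null, { symbol, price: 42 });
  });
}

getQuote('NODE', (err, quote) => {
  if (err) return console.error('Failed:', err.message);
  console.log('Quote:', quote);
});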
Q8: Why should you separate Express 'app' and 'server'? ☆☆☆☆☆
Answer: Keeping the API declaration separated from the network related configuration
(port, protocol, etc) allows testing the API in-process, without performing network calls,
with all the benefits that it brings to the table: fast testing execution and getting coverage
metrics of the code. It also allows deploying the same API under flexible and different
network conditions. Bonus: better separation of concerns and cleaner code.
/**
 * Get port from environment and store in Express.
 */
var port = process.env.PORT || '3000';
app.set('port', port);

/**
 * Create HTTP server.
 */
var server = http.createServer(app);
Source: github.com/i0natan/nodebestpractices
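A condensed sketch of that separation (the app.js / server.js file names are hypothetical):

// app.js - declares the API only, no networking
const express = require('express');
const app = express();
app.get('/health', (req, res) => res.send('ok'));
module.exports = app;

// server.js - owns the network-related configuration
const http = require('http');
const app = require('./app');
http.createServer(app).listen(process.env.PORT || 3000);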
There is an effort by Microsoft to allow the Chakra JavaScript engine (the engine in
Edge) to be used with Node.js. Node.js can actually function to some extent without V8,
through use of the node-chakracore project. There is ongoing work to reduce the tight
coupling between V8 and Node.js, so that different JavaScript engines can be used in its place.
Source: stackoverflow.com
V8 was first designed to increase the performance of the JavaScript execution inside web
browsers. In order to obtain speed, V8 translates JavaScript code into more efficient
machine code instead of using an interpreter. It compiles JavaScript code into machine
code at execution by implementing a JIT (Just-In-Time) compiler like a lot of modern
JavaScript engines such as SpiderMonkey or Rhino (Mozilla) are doing. The main difference
with V8 is that it doesn’t produce bytecode or any intermediate code.
Source: nodejs.org
Accessing types and properties efficiently is a first big challenge for V8. Instead of
using a dictionary-like data structure for storing object properties and doing a dynamic
lookup to resolve the property location (like most JavaScript engines do), V8 creates
hidden classes at runtime in order to have an internal representation of the type system
and to improve property access time.
Source: thibaultlaurens.github.io
● A "Full" compiler that can generate good code for any JavaScript: good but not
great JIT code. The goal of this compiler is to generate code quickly. To achieve its
goal, it doesn't do any type analysis and doesn't know anything about types.
Instead, it uses an Inline Cache or "IC" strategy to refine knowledge about types
while the program runs. ICs are very efficient and bring about a 20x speed
improvement.
Source: thibaultlaurens.github.io
Libuv by default creates a thread pool with four threads to offload asynchronous work to.
Today’s operating systems already provide asynchronous interfaces for many I/O tasks
(e.g. AIO on Linux). Whenever possible, libuv will use those asynchronous interfaces,
avoiding usage of the thread pool.
The event loop as a process is a set of phases with specific tasks that are processed in a
round-robin manner. Each phase has a FIFO queue of callbacks to execute. While each
phase is special in its own way, generally, when the event loop enters a given phase, it will
perform any operations specific to that phase, then execute callbacks in that phase's
queue until the queue has been exhausted or the maximum number of callbacks has
executed. When the queue has been exhausted or the callback limit is reached, the event
loop will move to the next phase, and so on.
Source: nodejs.org
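A small experiment that makes the phase ordering visible: inside an I/O callback (the poll phase), setImmediate() callbacks (check phase) run before a zero-delay setTimeout() (the next timers phase).

const fs = require('fs');

fs.readFile(__filename, () => {
  setTimeout(() => console.log('timeout'), 0);
  setImmediate(() => console.log('immediate'));
  // prints "immediate" first, then "timeout"
});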
Q16: How does the cluster module work? What’s the difference between
it and a load balancer? ☆☆☆☆
Answer: The cluster module forks your server process (which at that moment is already
an OS process), thus creating several worker processes. The cluster module supports two
methods of distributing incoming connections.
● The first one (and the default one on all platforms except Windows) is the round-
robin approach, where the master process listens on a port, accepts new
connections and distributes them across the workers in a round-robin fashion, with
some built-in smarts to avoid overloading a worker process.
● The second approach is where the master process creates the listen socket and
sends it to interested workers. The workers then accept incoming connections
directly.
The difference between the cluster module and a load balancer is that the cluster module
distributes load between worker processes of the same application, whereas a load balancer
distributes incoming requests between separate servers.
Source: imasters.com
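A compact sketch of the cluster module in the default round-robin mode (the port is arbitrary):

const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isMaster) {
  // Fork one worker per CPU core; the master hands connections to them.
  os.cpus().forEach(() => cluster.fork());
} else {
  // Every worker shares the same port.
  http.createServer((req, res) => {
    res.end(`handled by worker ${process.pid}`);
  }).listen(3000);
}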
● Function Template is the blueprint for a single function. You create a JavaScript
instance of the template by calling the template's GetFunction method from within the
context in which you wish to instantiate the JavaScript function. You can also
associate a C++ callback with a function template, which is called when the
JavaScript function instance is invoked.
Source: blog.ghaiklor.com
1. You may want to access some native APIs that are difficult to reach from JS alone.
2. You may want to integrate a third-party library written in C/C++ and use it directly
in Node.js.
3. You may want to rewrite some of the modules in C++ for performance reasons.
N-API (pronounced N as in the letter, followed by API) is an API for building native
addons.
Source: nodejs.org
// e.g. animal.js
module.exports = class Animal {
  constructor(name) {
    this.name = name;
  }
  print() {
    console.log('Name is :' + this.name);
  }
};

Once imported into another module, you can treat it as if it were defined in that file:

// e.g. app.js
const Animal = require('./animal');
new Animal('Cat').print();
Source: stackoverflow.com
Q20: Why do Node.js devs tend to lean towards Module Requiring vs
Dependency Injection? ☆☆☆☆☆
Answer: Dependency injection is somewhat the opposite of normal module design. In
normal module design, a module uses require() to load in all the other modules that it
needs with the goal of making it simple for the caller to use your module. The caller can
just require() in your module and your module will load all the other things it needs.
With dependency injection, rather than the module loading the things it needs, the caller is
required to pass in things (usually objects) that the module needs. This can make certain
types of testing easier and it can make mocking certain things for testing purposes easier.
Modules and dependency injection are orthogonal: if you need dependency injection for
testability or extensibility then use it. If not, importing modules is fine. The great thing
about JS is that you can modify just about anything to achieve what you want. This comes
in handy when it comes to testing.
Source: reddit.com
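A tiny sketch of the dependency-injection style (the db collaborator and findUser function are hypothetical):

// Instead of the module calling require('./db') itself,
// the caller passes the collaborator in.
function makeGetUser(db) {
  return (id) => db.findUser(id);
}

// In production you would inject the real db module;
// in a test you can inject a simple mock:
const getUser = makeGetUser({ findUser: (id) => ({ id, name: 'test user' }) });
console.log(getUser(1)); // { id: 1, name: 'test user' }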
const EventEmitter = require('events');
const crazy = new EventEmitter();

crazy.on('event1', function () {
  console.log('event1 fired!');
  crazy.emit('event2');
});

crazy.on('event2', function () {
  console.log('event2 fired!');
  crazy.emit('event3');
});

crazy.on('event3', function () {
  console.log('event3 fired!');
  crazy.emit('event1');
});

crazy.emit('event1');
Answer: You'll get an exception that basically says the call stack has exploded. Why?
Every emit invokes synchronous code. Because all callbacks are executed in a
synchronous manner, it will just recursively call itself to infinity and beyond.
Output:
console.js:165
if (isStackOverflowError(e))
^
Source: codementor.io
const EventEmitter = require('events');
const crazy = new EventEmitter();

crazy.on('event1', function () {
  console.log('event1 fired!');
  setImmediate(function () {
    crazy.emit('event2');
  });
});

crazy.on('event2', function () {
  console.log('event2 fired!');
  setImmediate(function () {
    crazy.emit('event3');
  });
});

crazy.on('event3', function () {
  console.log('event3 fired!');
  setImmediate(function () {
    crazy.emit('event1');
  });
});

crazy.emit('event1');
Answer: In short - the app will run infinitely. Any function passed as the setImmediate()
argument is a callback that's executed in the next iteration of the event loop.
Without setImmediate, all callbacks are executed in a synchronous manner.
With setImmediate, each callback is executed as a part of the next event loop iteration, so no
recursion or stack overflow occurs.
Source: codementor.io
Q3: What will happen when this code is executed? ☆☆☆☆☆
Details:
What will happen when this code is executed?
const EventEmitter = require('events');
const crazy = new EventEmitter();

crazy.on('event1', function () {
  console.log('event1 fired!');
  process.nextTick(function () {
    crazy.emit('event2');
  });
});

crazy.on('event2', function () {
  console.log('event2 fired!');
  process.nextTick(function () {
    crazy.emit('event3');
  });
});

crazy.on('event3', function () {
  console.log('event3 fired!');
  process.nextTick(function () {
    crazy.emit('event1');
  });
});

crazy.emit('event1');
Answer: It'll get stuck! And if you wait long enough, about 30 seconds, it'll eventually give
you a "process out of memory" exception. Now, the problem is not a stack overflow, it's the GC
not being able to reclaim memory. Every handler has its own closure to access the crazy
emitter in the outer scope, and this cost comes out of the heap. Though you might not be 100%
sure why the GC can't successfully get the memory back, you can probably guess that the program got
stuck in some event loop phase, because there's always another process.nextTick callback
to be processed. So essentially, the event loop is blocked completely.
Source: codementor.io

Consider the code:
function doubleAfter2Seconds(x) {
  return new Promise(resolve => {
    setTimeout(() => {
      resolve(x * 2);
    }, 2000);
  });
}

What if we want to run a few different values through our function and add the result?

function addPromise(x) {
  return new Promise(resolve => {
    doubleAfter2Seconds(10).then((a) => {
      doubleAfter2Seconds(20).then((b) => {
        doubleAfter2Seconds(30).then((c) => {
          resolve(x + a + b + c);
        });
      });
    });
  });
}

addPromise(10).then((sum) => {
  console.log(sum);
});
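The async/await counterpart is flatter; a sketch of the addAsync function that the call below assumes (reusing the same doubleAfter2Seconds helper):

async function addAsync(x) {
  const a = await doubleAfter2Seconds(10);
  const b = await doubleAfter2Seconds(20);
  const c = await doubleAfter2Seconds(30);
  return x + a + b + c;
}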
addAsync(10).then((sum) => {
console.log(sum);
});
Source: medium.com
Follow me on:
● LinkedIn: https://www.linkedin.com/in/tranphuong0211/
● Facebook: https://www.facebook.com/phuongtake.itrecruiter