Sequential crawling of links
The pattern
Parallel execution
Web spider version 3
The pattern
Fixing race conditions with concurrent tasks
Limited parallel execution
Limiting the concurrency
Globally limiting the concurrency
Queues to the rescue
Web spider version 4
The async library
Sequential execution
Sequential execution of a known set of tasks
Sequential iteration
Parallel execution
Limited parallel execution
Summary
Index
1
Welcome to the Node.js platform
Some principles and design patterns literally define the Node.js platform and its ecosystem;
the most peculiar ones are probably its asynchronous nature and its programming style that
makes heavy use of callbacks. It's important that we first dive into these fundamental principles and patterns, not only to write correct code, but also to make effective design decisions when it comes to solving bigger and more complex problems.
Another aspect that characterizes Node.js is its philosophy. Approaching Node.js is in fact
way more than simply learning a new technology; it's also embracing a culture and a
community. We will see how this greatly influences the way we design our applications
and components, and the way they interact with those created by the community.
In addition to these aspects, it's worth knowing that the latest versions of Node.js introduced support for many of the features described by ES2015, which makes the language even more expressive and enjoyable to use. It is important to embrace these new syntactic and functional additions to the language to be able to produce more concise and readable code, and to come up with alternative ways to implement the design patterns that we are going to see throughout this book.
None of these rules are imposed and they should always be applied with common sense;
however, they can prove to be tremendously useful when we are looking for a source of
inspiration while designing our programs.
Small core
The Node.js core itself has its foundations built on a few principles; one of these is having the smallest possible set of functionality, leaving the rest to the so-called userland (or userspace), the ecosystem of modules living outside the core. This principle has an enormous impact on the Node.js culture, as it gives the community the freedom to experiment and iterate quickly on a broader set of solutions within the scope of the userland modules, instead of having one slowly evolving solution imposed by the more tightly controlled and stable core. Keeping the core set of functionality to the bare minimum, then, is not only convenient in terms of maintainability, but also in terms of the positive cultural impact that it brings to the evolution of the entire ecosystem.
Small modules
Node.js uses the concept of a module as the fundamental means to structure the code of a program. It is the building brick for creating applications and reusable libraries called packages (a package is also frequently referred to as just a module since, usually, it has one single module as an entry point). In Node.js, one of the most evangelized principles is to design small modules, not only in terms of code size but, most importantly, in terms of scope.
This principle has its roots in the Unix philosophy, particularly in two of its precepts, which
are as follows:
“Small is beautiful.”
“Make each program do one thing well.”
Node.js brought these concepts to a whole new level. With the help of npm, the official package manager, Node.js helps solve the dependency hell problem by making sure that each installed package has its own separate set of dependencies, thus enabling a program to depend on a lot of packages without incurring conflicts. The Node way, in fact, involves extreme levels of reusability, whereby applications are composed of a high number of small, well-focused dependencies. While this can be considered impractical or even totally unfeasible on other platforms, in Node.js this practice is encouraged. As a consequence, it is not rare to find npm packages containing fewer than 100 lines of code or exposing only one single function.
Besides the clear advantage in terms of reusability, a small module is also easier to understand, use, test, and maintain. Having smaller and more focused modules empowers everyone to share or reuse even the smallest piece of code; it's the Don't Repeat Yourself (DRY) principle applied at a whole new level.
In Node.js, a very common pattern for defining modules is to expose only one piece of functionality, such as a function or a constructor, while letting more advanced aspects or secondary features become properties of the exported function or constructor. This helps the user identify what is important and what is secondary. It is not rare to find modules that expose only one function and nothing else, for the simple fact that it provides a single, unmistakable entry point.
Another characteristic of many Node.js modules is the fact that they are created to be used
rather than extended. Locking down the internals of a module by forbidding any possibility
of an extension might sound inflexible, but it actually has the advantage of reducing the use
cases, simplifying its implementation, facilitating its maintenance, and increasing its
usability.
“Simplicity is the ultimate sophistication.”
– Leonardo da Vinci
Richard P. Gabriel, a prominent computer scientist, coined the term worse is better to describe the model whereby less and simpler functionality is a good design choice for software. In his essay, The Rise of Worse is Better, he says:
“The design must be simple, both in implementation and interface. It is more important for
the implementation to be simple than the interface. Simplicity is the most important
consideration in a design.”
In Node.js, this principle is also enabled by JavaScript, which is a very pragmatic language. It's not rare, in fact, to see simple functions, closures, and object literals replacing complex class hierarchies. Pure object-oriented designs often try to replicate the real world using the mathematical terms of a computer system, without considering the imperfection and the complexity of the real world itself. The truth is that our software is always an approximation of reality, and we would probably have more success in trying to get something working sooner and with reasonable complexity, instead of trying to create near-perfect software with a huge effort and tons of code to maintain.
Throughout this book, we will see this principle in action many times. For example, a considerable number of traditional design patterns, such as Singleton or Decorator, can have a trivial, even if sometimes not foolproof, implementation, and we will see how an uncomplicated, practical approach is, most of the time, preferred to a pure, flawless design.
Throughout this book, we will widely adopt some of these new features in the code examples. These concepts are still fresh within the Node.js community, so it's worth having a quick look at the most important ES2015-specific features currently supported in Node.js. Our reference version is Node.js 5, more specifically version 5.1.
Many of these features will work correctly only when strict mode is enabled. Strict mode can be easily enabled by adding a 'use strict'; statement at the very beginning of your script. For the sake of brevity, we will not write this line in our code examples, but you should remember to add it to be able to run them correctly.
The following list is not meant to be exhaustive but just an introduction to the ES2015
features supported in Node, so that you can easily understand all the code examples in the
rest of this book.
Arrow functions
One of the most appreciated features introduced by ES2015 is the support for arrow functions. An arrow function is a more concise syntax for defining functions, especially useful when defining a callback. To better understand the advantages of this syntax, let's first see an example of classic filtering on an array:
var numbers = [2, 6, 7, 8, 1];
var even = numbers.filter(function(x) {
return x%2 === 0;
});
This code above can be rewritten as follows using the arrow function syntax:
var numbers = [2, 6, 7, 8, 1];
var even = numbers.filter((x) => x%2 === 0);
The filter callback can be defined inline; the function keyword is removed, leaving only the list of parameters, which is followed by => (the arrow), which in turn is followed by the body of the function. When the body of the function is just one line, there's no need to write the return keyword as it is applied implicitly. If we need to add more lines of code to the body of the function, we can wrap them in curly brackets, but beware that in this case the return value is no longer implicit and must be stated with an explicit return keyword.
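To illustrate the last point, here is a small sketch (reusing the numbers array from above) of an arrow function with a multiline body, where the braces make an explicit return necessary:

```javascript
let numbers = [2, 6, 7, 8, 1];
let even = numbers.filter((x) => {
  // with curly brackets, the return is no longer implicit
  let remainder = x % 2;
  return remainder === 0;
});
console.log(even); // [2, 6, 8]
```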
But there is another important feature to know about arrow functions: arrow functions are
bound to their lexical scope. This means that inside an arrow function the value of this is
the same as in the parent block. Let's clarify this concept with an example:
function DelayedGreeter(name) {
this.name = name;
}
DelayedGreeter.prototype.greet = function() {
setTimeout( function cb() {
console.log('Hello ' + this.name);
}, 500);
};
In this code, we are defining a simple “greeter” prototype, which accepts a name as an argument. Then we are adding the greet method to the prototype. This function is supposed to print “Hello” and the name defined in the current instance 500 milliseconds after it has been called. But this function is broken, because inside the timeout callback function (cb), the scope of the function is different from the scope of the greet method, and the value of this is undefined.
Before Node.js introduced support for arrow functions, to fix this we needed to change the
greet function using bind as follows:
DelayedGreeter.prototype.greet = function() {
setTimeout( (function cb() {
console.log('Hello ' + this.name);
}).bind(this), 500);
};
But since we have now arrow functions and since they are bound to their lexical scope, we
can just use an arrow function as callback to solve the issue:
DelayedGreeter.prototype.greet = function() {
setTimeout( () => console.log('Hello ' + this.name), 500);
};
This is a very handy feature; most of the time, it makes our code more concise and straightforward.
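Let and const
Historically, JavaScript offered only function scope and global scope to control the lifetime and the visibility of a variable. For instance, a sketch of the var-based example this passage refers to (the variable name is assumed from the let version that follows):

```javascript
if (false) {
  var x = "hello"; // var declarations are hoisted to the enclosing scope
}
console.log(x); // prints undefined instead of raising a ReferenceError
```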
This code will not fail as we might expect; it will just print undefined in the console. This behavior has been the cause of many bugs and frustrations, and that is the reason why ES2015 introduced the let keyword to declare variables that respect the block scope. Let's replace var with let in our previous example:
if (false) {
let x = "hello";
}
console.log(x);
This code will raise a ReferenceError: x is not defined because we are trying to
print a variable that has been defined inside another block scope.
To give a more meaningful example we can use the let keyword to define a temporary
variable to be used as an index for a loop:
for (let i=0; i < 10; i++) {
// do something here
}
console.log(i);
As in the previous example, this code will raise a ReferenceError: i is not defined
error.
This protective behavior introduced with let allows us to write safer code, because if we accidentally access variables that belong to another scope, we will get an error that will allow us to easily spot the bug and avoid potentially dangerous side effects.
ES2015 also introduces the const keyword, which allows us to declare read-only variables. Let's see a quick example:
const x = 'This will never change';
x = '...';
This code will raise a “TypeError: Assignment to constant variable” error because
we are trying to change the value of a constant.
Constants are extremely useful when you want to protect a value from being accidentally
changed in your code.
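It is worth noting that const makes the binding read-only, not the value it holds; this sketch (names and values assumed) shows that the contents of a constant object can still be mutated:

```javascript
const person = { name : 'George' };
person.name = 'Alan'; // allowed: only the binding is protected, not the object
console.log(person.name); // 'Alan'
// person = {}; // this, instead, would raise a TypeError
```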
Class syntax
ES2015 introduces a new syntax to leverage prototypal inheritance in a way that should sound more familiar to all the developers coming from classic object-oriented languages such as Java or C#. It's important to underline that this new syntax does not change the way objects are managed internally by the JavaScript runtime; they still inherit properties and functions through prototypes and not through classes. While this new alternative syntax can be very handy and readable, as developers, it is important to understand that it is just syntactic sugar.
Let's see how it works with a trivial example. First of all, let's describe a Person using the classic prototype-based approach:
function Person(name, surname, age) {
this.name = name;
this.surname = surname;
this.age = age;
}
Person.prototype.getFullName = function() {
return this.name + ' ' + this.surname;
}
Person.older = function(person1, person2) {
return (person1.age >= person2.age) ? person1 : person2;
}
As you can see, a person has a name, a surname, and an age. We are providing our prototype with a helper function that allows us to easily get the full name of a person object, and a generic helper function, accessible directly from the Person constructor, that returns the older of two given persons.
Let's see now how we can implement the same example using the new handy ES2015 class
syntax:
class Person {
constructor (name, surname, age) {
this.name = name;
this.surname = surname;
this.age = age;
}
getFullName () {
return this.name + ' ' + this.surname;
}
static older (person1, person2) {
return (person1.age >= person2.age) ? person1 : person2;
}
}
This syntax is more readable and straightforward to understand. We are explicitly stating what the constructor for the class is and declaring the function older as a static method.
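To see that both forms really are interchangeable, here is a small self-contained sketch (instance values assumed) that instantiates the class and shows that the method still lives on the prototype:

```javascript
class Person {
  constructor (name, surname, age) {
    this.name = name;
    this.surname = surname;
    this.age = age;
  }
  getFullName () {
    return this.name + ' ' + this.surname;
  }
}
let ada = new Person('Ada', 'Lovelace', 36);
console.log(ada.getFullName()); // 'Ada Lovelace'
console.log(typeof Person.prototype.getFullName); // 'function': still prototype-based
```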
Both implementations are completely interchangeable, but the real killer feature of the new syntax is the possibility of extending the Person prototype using the extends and super keywords. Let's assume we want to create the PersonWithMiddlename class:
class PersonWithMiddlename extends Person {
constructor (name, middlename, surname, age) {
super(name, surname, age);
this.middlename = middlename;
}
getFullName () {
return this.name + ' ' + this.middlename + ' ' + this.surname;
}
}
What is worth noticing in this second example is that the syntax really resembles what is common in other object-oriented languages. We are declaring the class from which we want to extend, we define a new constructor that can call the parent one using the keyword super, and we override the getFullName method to add support for our middle name.
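Enhanced object literals
ES2015 also enhances the object literal syntax, allowing, among other things, variables to be assigned as object properties in a shorthand form. The example discussed next can be sketched as follows (variable names taken from the text, values from the stated result):

```javascript
let x = 22;
let y = 17;
let obj = { x, y }; // shorthand for { x: x, y: y }
```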
obj will be an object containing the keys x and y, respectively with the values 22 and 17.
In this case we are writing a module that exports the functions square and cube mapped to
properties with the same name. Notice that we don't need to specify the keyword
function.
Let's see in another example how we can use computed property names:
let namespace = '-webkit-';
let style = {
[namespace + 'box-sizing'] : 'border-box',
[namespace + 'box-shadow'] : '10px 10px 5px #888888'
};
In this case, the resulting object will contain the properties -webkit-box-sizing and -webkit-box-shadow.
Let's see now how we can use the new setter and getter syntax by jumping directly to an
example:
let person = {
name : 'George',
surname : 'Boole',
get fullname () {
return this.name + ' ' + this.surname;
},
set fullname (fullname) {
let parts = fullname.split(' ');
this.name = parts[0];
this.surname = parts[1];
}
};
console.log(person.fullname); // "George Boole"
console.log(person.fullname = 'Alan Turing'); // "Alan Turing"
console.log(person.name); // "Alan"
In this example, we are defining three properties: two normal ones, name and surname, and a computed property, fullname, defined through the set and get syntax. As you can see from the result of the console.log calls, we can access the computed property as if it were a regular property inside the object, for both reading and writing the value. It's worth noticing that the second call to console.log prints “Alan Turing”. This happens because, when an assignment is used as an expression, its value is the value that was assigned, in this case the string passed to set fullname.
Map and Set collections
ES2015 introduces the Map prototype: a container specifically designed for key-value hash maps. Let's see it in action, assuming, for example, the following three entries:
let profiles = new Map();
profiles.set('twitter', '@adalovelace');
profiles.set('facebook', 'adalovelace');
profiles.set('googleplus', 'ada');
profiles.size; // 3
profiles.has('twitter'); // true
profiles.get('twitter'); // "@adalovelace"
profiles.has('youtube'); // false
profiles.delete('facebook');
profiles.has('facebook'); // false
profiles.get('facebook'); // undefined
for (let entry of profiles) {
console.log(entry);
}
As you can see, the Map prototype offers several handy methods, such as set, get, has, and delete, and the size attribute. We can also iterate through all the entries using the for ... of syntax. Every entry in the loop will be an array containing the key as the first element and the value as the second element. This interface is very intuitive and self-explanatory.
But what makes maps really interesting is the possibility of using functions and objects as keys of the map, something that is not entirely possible with plain objects, because with objects all the keys are automatically cast to strings. This opens new opportunities; for example, we can build a micro testing framework leveraging this feature:
let tests = new Map();
tests.set(() => 2+2, 4);
tests.set(() => 2*2, 4);
tests.set(() => 2/2, 1);
for (let entry of tests) {
console.log((entry[0]() === entry[1]) ? 'PASS' : 'FAIL');
}
As you can see in this last example, we are storing functions as keys and expected results as values, then iterating through our hash map to execute every function and compare its result against the expected one. It's also worth noticing that, when we iterate through the map, all the entries respect the order in which they were inserted; this is something that was not always guaranteed with plain objects.
Along with Map, ES2015 also introduces the Set prototype, which allows us to easily construct sets, that is, lists of unique values:
let s = new Set([0, 1, 2, 3]);
As you can see in this example, the interface is quite similar to the one we have just seen for Map. We have the methods add (instead of set), has, and delete, and the property size.
We can also iterate through the set; in this case, every entry is a value, in our example one of the numbers in the set. Finally, sets can also contain objects and functions as values.
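A short sketch expanding on the Set interface just described:

```javascript
let s = new Set([0, 1, 2, 3]);
s.add(3);              // duplicates are silently ignored
console.log(s.size);   // 4
console.log(s.has(0)); // true
s.delete(0);
console.log(s.has(0)); // false
for (let entry of s) {
  console.log(entry);  // 1, 2, 3 (insertion order is preserved)
}
```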
WeakMap is quite similar to Map in terms of interface; there are only two main differences: there is no way to iterate over its entries, and it allows only objects as keys. While this might seem like a limitation, there is a good reason behind it; in fact, the distinctive feature of WeakMap is that it allows the objects used as keys to be garbage collected when their only reference left is inside the WeakMap. This is extremely useful when we are storing metadata associated with an object that might get deleted during the regular lifetime of the application. Let's see an example:
let obj = {};
let map = new WeakMap();
map.set(obj, {metadata: "some_metadata"});
console.log(map.get(obj)); // {metadata: "some_metadata"}
obj = undefined; // now obj and metadata will be cleaned up in the next gc cycle
In this code we are creating a plain object called obj. Then we store some metadata for this
object in a new WeakMap called map. We can access this metadata with the map.get method.
Later, when we cleanup the object by assigning its variable to undefined, the object will be
correctly garbage collected and its metadata removed from the map.
Similarly to WeakMap, WeakSet is the weak version of Set: it exposes the same interface as Set, but it allows us to store only objects and cannot be iterated. Again, the difference from Set is that WeakSet allows objects to be garbage collected when their only reference left is in the weak set.
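A minimal sketch of the WeakSet interface (object names assumed):

```javascript
let obj1 = {};
let ws = new WeakSet([obj1]);
console.log(ws.has(obj1)); // true
ws.delete(obj1);
console.log(ws.has(obj1)); // false
// obj1 = undefined; // would make the object eligible for garbage collection
```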
It's important to understand that WeakMap and WeakSet are not better or worse than Map
and Set, they are simply more suitable for different use cases.
Template Literals
ES2015 offers a new, alternative, and more powerful syntax to define strings: template literals. This syntax uses backticks (`) as delimiters and offers several benefits compared to regular quoted (') or double-quoted (") strings. The main ones are that template literal syntax can interpolate variables or expressions using ${expression} inside the string (this is the reason why this syntax is called “template”) and that strings can finally span multiple lines. Let's see a quick example:
let name = "Leonardo";
let interests = ["arts", "architecture", "science", "music",
"mathematics"];
let birth = { year : 1452, place : 'Florence' };
let text = `${name} was an Italian polymath interested in many topics such
as ${interests.join(', ')}.
He was born in ${birth.year} in ${birth.place}.`;
console.log(text);
A more extended and up to date list of all the supported ES2015 features is
available in the official Node.js documentation:
https://nodejs.org/en/docs/es6/
I/O is slow
I/O is definitely the slowest among the fundamental operations of a computer. Accessing RAM is in the order of nanoseconds (10^-9 seconds), while accessing data on the disk or the network is in the order of milliseconds (10^-3 seconds). The same applies to bandwidth: RAM has a transfer rate consistently in the order of GB/s, while disk and network vary from MB/s to, optimistically, GB/s. I/O is usually not expensive in terms of CPU, but it adds a delay between the moment the request is sent and the moment the operation completes. On top of that, we also have to consider the human factor; often, the input of an
application comes from a real person, for example, the click of a button or a message sent in
a real-time chat application, so the speed and frequency of I/O don't depend only on
technical aspects, and they can be many orders of magnitude slower than the disk or
network.
Blocking I/O
In traditional blocking I/O programming, the function call corresponding to an I/O request
will block the execution of the thread until the operation completes. This can go from a few
milliseconds, in case of a disk access, to minutes or even more, in case the data is generated
from user actions, such as pressing a key. The following pseudocode shows a typical
blocking read performed against a socket:
//blocks the thread until the data is available
data = socket.read();
//data is available
print(data);
It is trivial to notice that a web server that is implemented using blocking I/O will not be
able to handle multiple connections in the same thread; each I/O operation on a socket will
block the processing of any other connection. For this reason, the traditional approach to
handle concurrency in web servers is to kick off a thread or a process (or to reuse one taken
from a pool) for each concurrent connection that needs to be handled. This way, when a
thread gets blocked for an I/O operation it will not impact the availability of the other
requests, because they are handled in separate threads.
The preceding image emphasizes the amount of time each thread is idle, waiting for new data to be received from the associated connection. Now, if we also consider that any type of I/O can possibly block a request, for example, while interacting with databases or with the filesystem, we soon realize how many times a thread has to block in order to wait for the result of an I/O operation. Unfortunately, a thread is not cheap in terms of system resources; it consumes memory and causes context switches, so having a long-running thread for each connection, and not using it most of the time, is not the best compromise in terms of efficiency.
Non-blocking I/O
In addition to blocking I/O, most modern operating systems support another mechanism to
access resources, called non-blocking I/O. In this operating mode, the system call always
returns immediately without waiting for the data to be read or written. If no results are
available at the moment of the call, the function will simply return a predefined constant,
indicating that there is no data available to return at that moment.
For example, in Unix operating systems, the fcntl() function is used to manipulate an
existing file descriptor to change its operating mode to non-blocking (with the O_NONBLOCK
flag). Once the resource is in non-blocking mode, any read operation will fail with a return
code, EAGAIN, in case the resource doesn't have any data ready to be read.
The most basic pattern for accessing this kind of non-blocking I/O is to actively poll the resource within a loop until some actual data is returned; this is called busy-waiting. The following pseudocode shows you how it's possible to read from multiple resources using a busy-waiting loop:
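Such a busy-waiting loop can be sketched in pseudocode as follows (constant and method names assumed, following the conventions of the demultiplexer example later in this chapter):

```
resources = [socketA, pipeB];
while(!resources.isEmpty()) {
  for(i = 0; i < resources.length; i++) {
    resource = resources[i];
    //try to read
    data = resource.read();
    if(data === NO_DATA_AVAILABLE)
      //there is no data to read at the moment
      continue;
    if(data === RESOURCE_CLOSED)
      //the resource was closed, remove it from the list
      resources.remove(i);
    else
      //some actual data was received, process it
      consumeData(data);
  }
}
```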
You can see that, with this simple technique, it is already possible to handle different resources in the same thread, but it's still not efficient. In fact, in the preceding example, the loop consumes precious CPU time just to iterate over resources that are unavailable most of the time. Polling algorithms usually result in a huge amount of wasted CPU time.
Event demultiplexing
Busy-waiting is definitely not an ideal technique for processing non-blocking resources, but luckily, most modern operating systems provide a native mechanism to handle concurrent, non-blocking resources in an efficient way; this mechanism is called the synchronous event demultiplexer, or event notification interface. This component collects and queues I/O events that come from a set of watched resources, and blocks until new events are available to process. The following is the pseudocode of an algorithm that uses a generic synchronous event demultiplexer to read from two different resources:
socketA, pipeB;
watchedList.add(socketA, FOR_READ); //[1]
watchedList.add(pipeB, FOR_READ);
while(events = demultiplexer.watch(watchedList)) { //[2]
//event loop
foreach(event in events) { //[3]
//This read will never block and will always return data
data = event.resource.read();
if(data === RESOURCE_CLOSED)
//the resource was closed, remove it from the watched list
demultiplexer.unwatch(event.resource);
else
//some actual data was received, process it
consumeData(data);
}
}
1. The resources are added to a data structure, associating each one of them with a
specific operation, in our example a read.
2. The event notifier is set up with the group of resources to be watched. This call is
synchronous and blocks until any of the watched resources is ready for a read.
When this occurs, the event demultiplexer returns from the call and a new set of
events is available to be processed.
3. Each event returned by the event demultiplexer is processed. At this point, the
resource associated with each event is guaranteed to be ready to read and to not
block during the operation. When all the events are processed, the flow will block
again on the event demultiplexer until new events are again available to be
processed. This is called the event loop.
It's interesting to see that with this pattern, we can now handle several I/O operations
inside a single thread, without using a busy-waiting technique. The following image shows
us how a web server would be able to handle multiple connections using a synchronous
event demultiplexer and a single thread:
The preceding image shows that using only one thread does not impair our ability to run multiple I/O-bound tasks concurrently. The tasks are spread over time, instead of being spread across multiple threads. This has the clear advantage of minimizing the total idle time of the thread, as clearly shown in the image. This is not the only reason for choosing this model; having only a single thread also has a beneficial impact on the way programmers approach concurrency in general. Throughout the book, we will see how the absence of in-process race conditions and multiple threads to synchronize allows us to use much simpler concurrency strategies.
In the next chapter, we will have the opportunity to talk more about the concurrency model
of Node.js.
5. The handler, which is part of the application code, may request new asynchronous operations during its execution, causing new operations to be inserted in the Event Demultiplexer (1), before the control is given back to the Event Loop.
6. When all the items in the Event Queue are processed, the loop will block again on
the Event Demultiplexer which will then trigger another cycle.
The asynchronous behavior is now clear: the application expresses the interest to access a
resource at one point in time (without blocking) and provides a handler, which will then be
invoked at another point in time when the operation completes.
Pattern (reactor): handles I/O by blocking until new events are available from a set of
observed resources, and then reacting by dispatching each event to an associated handler.
Besides abstracting the underlying system calls, libuv also implements the reactor pattern,
thus providing an API for creating event loops, managing the event queue, running
asynchronous I/O operations, and queuing other types of tasks.
A great resource to learn more about libuv is the free online book created
by Nikhil Marathe, which is available at
http://nikhilm.github.io/uvbook/
A set of bindings responsible for wrapping and exposing libuv and other low-level functionality to JavaScript.
V8, the JavaScript engine originally developed by Google for the Chrome browser. This is one of the reasons why Node.js is so fast and efficient. V8 is acclaimed for its revolutionary design, its speed, and its efficient memory management.
A core JavaScript library (called node-core) that implements the high-level Node.js API.
Finally, this is the recipe of Node.js, and the following image represents its final
architecture:
Summary
In this chapter, we have seen how the Node.js platform is based on a few important
principles that provide the foundation to build efficient and reusable code. The philosophy
and the design choices behind the platform have, in fact, a strong influence on the structure
and behavior of every application and module we create. Often, for a developer moving from another technology, these principles might seem unfamiliar, and the usual instinctive reaction is to fight the change by trying to find more familiar patterns inside a world which, in reality, requires a real shift in mindset.
On one hand, the asynchronous nature of the reactor pattern requires a different
programming style made of callbacks and things that happen at a later time, without
worrying too much about threads and race conditions. On the other hand, the module pattern and its principles of simplicity and minimalism create interesting new scenarios in terms of reusability, maintenance, and usability.
Finally, besides the obvious technical advantages of being fast, efficient, and based on
JavaScript, Node.js is attracting so much interest because of the principles we have just
discovered. For many, grasping the essence of this world feels like returning to the origins,
to a more humane way of programming in both size and complexity, and that's why developers end up falling in love with Node.js. The introduction of ES2015 makes things even more interesting and opens new scenarios for embracing all these advantages with an even more expressive syntax.
In the next chapter, we will dive deep into the two basic asynchronous patterns used in Node.js: the callback pattern and the event emitter. We will also understand the difference between synchronous and asynchronous code, and how to avoid writing unpredictable functions.
2
Node.js Essential Patterns
Embracing the asynchronous nature of Node.js is not trivial at all, especially when coming from a language such as PHP, where it is not usual to deal with asynchronous code.
Node.js offers a series of tools and design patterns to deal optimally with asynchronous code, and it's important to learn how to use them to gain confidence with asynchronous coding and write applications that are performant, easy to understand, and easy to debug.
In this chapter, we will see two of the most important asynchronous patterns: callback and
event emitter.
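As a small foretaste of the callback pattern discussed in this chapter, compare a direct style function with its continuation-passing style equivalent (function names and the artificial delay are illustrative):

```javascript
// direct style: the result is returned to the caller
function add(a, b) {
  return a + b;
}

// continuation-passing style: the result is propagated to a callback,
// invoked asynchronously when the operation completes
function addAsync(a, b, callback) {
  setTimeout(() => callback(a + b), 100);
}

addAsync(2, 3, (result) => console.log('Result: ' + result)); // Result: 5
```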