Nest.js: A Progressive Node.js Framework
What is Nest.js?
There are so many available web frameworks, and with the advent of
Node.js, even more have been released. JavaScript frameworks go in
and out of style very quickly as web technologies change and grow.
Nest.js is a good starting point for many developers looking for a modern web framework, because it is built on a language very close to the most used language on the web today: JavaScript. Many developers were taught programming in languages such as Java or C/C++, which are strict, statically typed languages, so working in JavaScript can feel awkward and error-prone given its lack of type safety. Nest.js uses TypeScript, which is a happy medium: it offers the simplicity and power of JavaScript with the type safety of the languages you may be used to. That type safety exists only at compile time, because the Nest.js server is compiled to a Node.js Express server that runs plain JavaScript. This is still a major advantage, since it lets you catch whole classes of errors before your program ever runs.
This will create a local copy of the project on your computer, which you
can run locally by building the project with Docker:
docker-compose up
docker ps
This will run the database migrations so that your Nest.js app can read
and write to the database with the correct schema.
If you don’t want to use Docker, or cannot use Docker, you can instead install the project dependencies with your choice of package manager, npm or yarn:
npm install
or
yarn
Then start the development server with:
npm run start:dev
or
yarn start:dev
These will run nodemon, which will cause your Nest.js application to
restart if any changes are made, saving you from having to stop,
rebuild, and start your application again.
Topics discussed
Each of the topics below will be discussed in more detail in the following chapters.
1. Dependency Injection
2. Authentication
3. ORM
4. REST API
5. Websockets
6. Microservices
7. Routing
8. Explanation of Nest specific tools
9. OpenApi (Swagger) Documentation
10. Command Query Responsibility Segregation (CQRS)
11. Testing
12. Server-side rendering with Universal and Angular
Nest CLI
New in version 5 of Nest there is a CLI that allows for command line
generation of projects and files. The CLI can be installed globally with:
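npm install -g @nestjs/cli

Once the CLI is installed, a new project is scaffolded with the new command. The commands above and below are a sketch: the package name @nestjs/cli reflects the CLI as published on npm, and the project name is a placeholder, so check the documentation for your exact version.

nest new [project-name]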
This process will create the project from a typescript-starter and will ask for the name, description, version (defaults to 0.0.0), and author (this would be your name). After this process is finished you will have a fully set up Nest project with the dependencies installed in your node_modules folder. The new command will also ask which package manager you would like to use, either yarn or npm; Nest gives you this choice during creation.
The most used command from the CLI will be the generate (g) command, which allows you to create new controllers, modules, services, or any other components that Nest supports. The list of available components is:
1. class (cl)
2. controller (co)
3. decorator (d)
4. exception (e)
5. filter (f)
6. gateway (ga)
7. guard (gu)
8. interceptor (i)
9. middleware (mi)
10. module (mo)
11. pipe (pi)
12. provider (pr)
13. service (s)
Note that the string in the brackets is the alias for that specific command. This means that instead of typing:
nest generate service [service-name]
you can type:
nest g s [service-name]
Lastly, the Nest CLI provides the info (i) command to display
information about your project. This command will output information
that looks something like:
[System Information]
OS Version : macOS High Sierra
NodeJS Version : v8.9.0
YARN Version : 1.5.1
[Nest Information]
microservices version : 5.0.0
websockets version : 5.0.0
testing version : 5.0.0
common version : 5.0.0
core version : 5.0.0
Dependency Injection
Dependency Injection is the technique of supplying a dependent object,
such as a module or component, with a dependency like a service,
thereby injecting it into the component’s constructor. An example of
this taken from the sequelize chapter is below. Here we are injecting the UserRepository provider into the constructor of the UserService, thereby providing access to the User database repository from inside the UserService component.
@Injectable()
export class UserService implements IUserService {
    constructor(
        @Inject('UserRepository') private readonly UserRepository: typeof User,
    ) {}
    /* ... */
}
Authentication
Authentication is one of the most important aspects of developing. As
developers, we always want to make sure that users can only access
the resources they have permission to access. Authentication can take
many forms, from showing your drivers license or passport to
providing a username and password for a login portal. In recent years these authentication methods have grown more complicated, but we still need the same server-side logic to make sure that authenticated users are always who they say they are, and to persist that authentication so they do not need to reauthenticate on every single call to a REST API or WebSocket, which would make for a terrible user experience. The library chosen for this is, fittingly, named Passport as well; it is very well known and widely used in the Node.js ecosystem. When integrated into Nest it uses a JWT (JSON Web Token) strategy. Passport is middleware that the HTTP call passes through before hitting the endpoint at the controller. This is
the AuthenticationMiddleware written for the example project that
extends NestMiddleware, authenticating each user based on the email in
the request payload.
@Injectable()
export class AuthenticationMiddleware implements NestMiddleware {
    constructor(private userService: UserService) {}
    /* ... */
}
ORM
An ORM is an Object-Relational Mapper and is one of the most important concepts when dealing with communication between a server and a database. An ORM provides a mapping between objects in memory (defined classes such as User or Comment) and relational tables in a database. This allows you to create Data Transfer Objects that know how to write objects held in memory to a database, and to read results from SQL, or another query language, back into memory. In this book, we will talk about three different ORMs: two relational and one for a NoSQL database. TypeORM is one of the most mature and popular ORMs for Node.js and thus has a very wide and fleshed-out feature set. It is also one of the packages for which Nest provides its own integration: @nestjs/typeorm. It is incredibly powerful and has support for many databases like MySQL, PostgreSQL, MariaDB, SQLite, MS SQL Server, Oracle, and WebSQL. Along with TypeORM, Sequelize is another ORM for relational data.
REST API
REST is one of the main design paradigms for creating APIs. It stands for Representational State Transfer, and uses JSON as a transfer format, which is in line with how Nest stores objects, making it a natural fit for
consuming and returning HTTP calls. A REST API is a combination of
many techniques that are talked about in this book. They are put
together in a certain way; a client makes an HTTP call to a server. That
server will Route the call to the correct Controller based on the URL
and HTTP verb, optionally passing it through one or more Middlewares
prior to reaching the Controller. The Controller will then hand it off to
a Service for processing, which could include communication with a
Database through an ORM. If all goes well, the server will return an OK
response to the client with an optional body if the client requested
resources (GET request), or just a 200/201 HTTP OK if it was a
POST/PUT/DELETE and there is no response body.
WebSockets
WebSockets are another way to connect to and send/receive data from
a server. With WebSockets, a client will connect to the server and then
subscribe to certain channels. The clients can then push data to a
subscribed channel. The server will receive this data and then
broadcast it to every client that is subscribed to that specific channel.
This allows multiple clients to all receive real-time updates without
having to make API calls manually, potentially flooding the server with
GET requests. Most chat apps use WebSockets to allow for real-time
communication, and everyone in a group message will receive the
message as soon as one of the other members sends one. WebSockets allow for more of a streaming approach to data transfer than traditional request-response APIs, because WebSockets broadcast data as it is received.
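As a quick illustration of how this looks in Nest, the sketch below shows a hypothetical gateway (the class name and the 'message' event are invented for this example) that rebroadcasts incoming messages to every subscribed client, using the decorators from @nestjs/websockets:

import {
    WebSocketGateway,
    WebSocketServer,
    SubscribeMessage,
} from '@nestjs/websockets';

@WebSocketGateway()
export class ChatGateway {
    // reference to the underlying WebSocket server instance
    @WebSocketServer() server;

    @SubscribeMessage('message')
    handleMessage(client, data) {
        // broadcast the received payload to every client subscribed to 'message'
        this.server.emit('message', data);
    }
}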
Microservices
Microservices allow for a Nest application to be structured as a
collection of loosely coupled services. In Nest, microservices are
slightly different, because they are an application that uses a different
transport layer other than HTTP. This layer can be TCP or Redis
pub/sub, among others. Nest supports TCP and Redis, although if you
are married to another transport layer it can be implemented by using
the CustomTransportStrategy interface. Microservices are great because
they allow a team to work on their own service within the global
project and make changes to the service without affecting the rest of
the project since it is loosely coupled. This allows for continuous delivery and continuous integration independent of the other teams' microservices.
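As a minimal sketch, bootstrapping a Nest application as a TCP-based microservice could look like the following (the port and module name are placeholders, and the callback-style listen follows the Nest 5 era of this book; newer versions return a promise instead):

import { NestFactory } from '@nestjs/core';
import { Transport } from '@nestjs/microservices';
import { AppModule } from './app.module';

async function bootstrap() {
    // create the application using TCP as the transport layer instead of HTTP
    const app = await NestFactory.createMicroservice(AppModule, {
        transport: Transport.TCP,
        options: { port: 4000 },
    });
    app.listen(() => console.log('Microservice is listening'));
}
bootstrap();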
GraphQL
As we saw above, REST is one paradigm when designing APIs, but
there is a new way to think about creating and consuming APIs:
GraphQL. With GraphQL, instead of each resource having its own URL
pointing to it, a URL will accept a query parameter with a JSON object
in it. This JSON object defines the type and format of the data to return.
Nest provides functionality for this through
the @nestjs/graphql package. This will include the GraphQLModule in the
project, which is a wrapper around the Apollo server. GraphQL is a
topic that could have an entire book written about it, so we don’t go
into it any further in this book.
Routing
Routing is one of the core principles when discussing web frameworks.
Somehow the clients need to know how to access the endpoints for the
server. Each of these endpoints describes how to
retrieve/create/manipulate data that is stored on the server.
Each Component that describes an API endpoint must have a @Controller('prefix') decorator that describes the API prefix for this component's set of endpoints.
@Controller('hello')
export class HelloWorldController {
    @Get('world')
    printHelloWorld() {
        return 'Hello World';
    }
}
The above Controller is the API endpoint for GET /hello/world and will
return an HTTP 200 OK with Hello World in the body. This will be
discussed more in the Routing chapter where you will learn about
using URL params, Query params, and the Request object.
OpenAPI (Swagger)
Documentation is very important when writing a Nest server, especially when creating an API that will be consumed by others; otherwise, the developers writing the clients that will eventually consume the API will not know what to send or what they will get back.
One of the most popular documentation engines out there is Swagger.
Like with others, Nest provides a dedicated module for the OpenAPI
(Swagger) spec, @nestjs/swagger. This module provides decorators to
help describe the inputs/outputs and endpoints of your API. This
documentation is then accessible through an endpoint on the server.
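As a rough sketch of the wiring involved (the title, description, and documentation path are invented for this example), setting up the Swagger module in the bootstrap file could look like this:

import { NestFactory } from '@nestjs/core';
import { SwaggerModule, DocumentBuilder } from '@nestjs/swagger';
import { AppModule } from './app.module';

async function bootstrap() {
    const app = await NestFactory.create(AppModule);

    // describe the API; these values are placeholders
    const options = new DocumentBuilder()
        .setTitle('Blog API')
        .setDescription('Example documentation for the blog API')
        .setVersion('1.0')
        .build();
    const document = SwaggerModule.createDocument(app, options);

    // the interactive documentation becomes available at /api/docs
    SwaggerModule.setup('api/docs', app, document);

    await app.listen(3000);
}
bootstrap();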
Testing
Testing your Nest server is imperative so that once it is deployed there are no unforeseen issues and everything runs smoothly. There are two
different kinds of tests you will learn about in this book: Unit Tests and
E2E Tests (End-to-end Tests). Unit Testing is the art of testing small
snippets or blocks of code, and this could be as granular as testing
individual functions or writing a test for a Controller, Interceptor, or
any other Injectable. There are many popular unit testing frameworks
out there, and Jasmine and Jest are two popular ones. Nest provides
special packages, @nestjs/testing specifically, for writing unit tests
in *.spec.ts and *.test.ts classes.
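For illustration, a minimal unit test sketch using @nestjs/testing with Jest might look like the following; the controller and service names are borrowed from the example app, the import paths and the stubbed return value are assumptions for this example:

import { Test } from '@nestjs/testing';
import { EntryController } from './entry.controller';
import { EntryService } from './entry.service';

describe('EntryController', () => {
    let entryController: EntryController;

    beforeEach(async () => {
        // build a lightweight testing module with a stubbed service
        const module = await Test.createTestingModule({
            controllers: [EntryController],
            providers: [
                { provide: EntryService, useValue: { findAll: () => [] } },
            ],
        }).compile();

        entryController = module.get<EntryController>(EntryController);
    });

    it('returns an empty list of entries', () => {
        expect(entryController.index()).toEqual([]);
    });
});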
E2E Testing is the other form of testing that is commonly used and is
different from unit testing only in that it tests entire functionality
rather than individual functions or components, which is where the
name end-to-end testing came from. Eventually applications will
become so large that it is hard to test absolutely every piece of code
and endpoint. In this case you can use E2E tests to test the application from beginning to end to make sure everything works along the way. For E2E testing, a Nest application can again use the Jest library to mock components. Along with Jest you can use the supertest library to simulate HTTP requests.
Testing is a very important part of writing applications and should not
be ignored. This is a chapter that will be relevant no matter what
language or framework you end up working with. Most large scale
development companies have entire teams dedicated to writing tests
for the code that is pushed to production applications, and these are
called QA developers.
Summary
Throughout this book, you will go through each of the above topics in
more detail, continuously building on top of prior concepts. Nest
provides a clean well-organized framework that implements each of
these concepts in a simple yet efficient way that is consistent across all
modules because of the modular design of the framework.
Chapter 2. Overview
In this chapter we’ll take an overview of Nest.js and look at the core
concepts that you’ll need to build a Nest.js application.
Controllers
Controllers in Nest are responsible for handling incoming requests and
returning responses to the client. Nest will route incoming requests to
handler functions in controller classes. We use
the @Controller() decorator to create a controller class.
@Controller('entries')
export class EntryController {
    // the EntryService is assumed to be injected here, as in the module definition below
    constructor(private readonly entriesService: EntryService) {}

    @Get()
    index(): Entry[] {
        const entries: Entry[] = this.entriesService.findAll();
        return entries;
    }
}
Providers
Providers in Nest are used to create services, factories, helpers, and
more that can be injected into controllers and other providers using
Nest’s built-in dependency injection. The @Injectable() decorator is
used to create a provider class.
@Injectable()
export class AuthenticationService {
constructor(private readonly userService: UserService) {}
async validateUser(payload: {
email: string;
password: string;
}): Promise<boolean> {
const user = await this.userService.findOne({
where: { email: payload.email }
});
return !!user;
}
}
Modules
A Nest.js application is organized into modules. If you’re familiar with
modules in Angular, then the module syntax Nest uses will look very
familiar.
Property | Description
imports | The list of modules to import that export components that are required in this module
exports | The list of components from this module to be made available to other modules that import this module
In our example application, the root Module is named AppModule and the
application is split up into a number of sub-modules that handle the
major parts of the application such as authentication, comments,
database access, blog entries and users.
@Module({
components: [],
controllers: [],
imports: [
DatabaseModule,
AuthenticationModule.forRoot('jwt'),
UserModule,
EntryModule,
CommentModule,
UserGatewayModule,
CommentGatewayModule
],
exports: [],
})
export class AppModule implements NestModule {}
The AppModule imports the modules that are needed for the
application. The root module in our application doesn’t need to have
any exports since no other modules import it.
@Module({
components: [entryProvider, EntryService],
controllers: [EntryController],
imports: [],
exports: [EntryService],
})
export class EntryModule implements NestModule {}
Modules in Nest.js are singletons by default. This means that you can
share the same instance of an exported component, such as
the EntryService above, between modules without any effort.
Bootstrapping
Every Nest.js application needs to be bootstrapped. This is done by using the NestFactory to create the application from the root module and calling the listen() method.
In our example application, the entry point is main.ts, and we use the async / await pattern to create the application from the AppModule and call listen().
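The exact contents of main.ts are not included in this excerpt; a minimal sketch of the standard bootstrap code (the port number is an assumption) looks like this:

import { NestFactory } from '@nestjs/core';
import { AppModule } from './app.module';

async function bootstrap() {
    // create the Nest application from the root module and start listening for requests
    const app = await NestFactory.create(AppModule);
    await app.listen(3000);
}
bootstrap();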
Middleware
Nest.js middleware is either a function or a class decorated with
the @Injectable() decorator that implements
the NestMiddleware interface. Middleware is called before route handlers. Middleware has access to the request and response objects, and it can make changes to them before the route handler runs.
import {
MiddlewareFunction,
HttpStatus,
Injectable,
NestMiddleware
} from '@nestjs/common';
import * as passport from 'passport';
import { UserService } from '../../modules/user/user.service';
@Injectable()
export class AuthenticationMiddleware implements NestMiddleware {
    constructor(private userService: UserService) {}
    /* ... */
}
@Module({
imports: [
DatabaseModule,
AuthenticationModule.forRoot('jwt'),
UserModule,
EntryModule,
CommentModule,
UserGatewayModule,
CommentGatewayModule,
KeywordModule
],
controllers: [],
providers: []
})
export class AppModule implements NestModule {
public configure(consumer: MiddlewareConsumer) {
const userControllerAuthenticatedRoutes = [
{ path: '/users', method: RequestMethod.GET },
{ path: '/users/:id', method: RequestMethod.GET },
{ path: '/users/:id', method: RequestMethod.PUT },
{ path: '/users/:id', method: RequestMethod.DELETE }
];
consumer
.apply(AuthenticationMiddleware)
.with(strategy)
.forRoutes(
...userControllerAuthenticatedRoutes,
EntryController,
CommentController
);
}
}
Guards
Guards are classes that are decorated with the @Injectable() decorator and implement the CanActivate interface. A guard is responsible for determining whether a request should be handled by a route handler. Guards are executed after every middleware, but before pipes. Unlike middleware, guards have access to the ExecutionContext object, so they know exactly what is going to be evaluated.
@Injectable()
export class CheckLoggedInUserGuard implements CanActivate {
canActivate(
context: ExecutionContext
): boolean | Promise<boolean> | Observable<boolean> {
const req = context.switchToHttp().getRequest();
return Number(req.params.userId) === req.user.id;
}
}
@Controller('users')
export class UserController {
    constructor(private readonly userService: UserService) {}

    @Get(':userId')
    @UseGuards(CheckLoggedInUserGuard)
    show(@Param('userId') userId: number) {
        const user: User = this.userService.findById(userId);
        return user;
    }
}
Summary
In this chapter we covered Nest.js controllers, providers, modules,
bootstrapping, and middleware. In the next chapter we will go over
Nest.js authentication.
Chapter 3. Nest.js authentication
Nest.js, as of version 5, provides the @nestjs/passport package, which allows you to implement the authentication strategy that you need. Of course, you can also do this manually using passport directly.
In this chapter you will see how to use passport by integrating it into your Nest.js project. We also cover what a strategy is, and how to configure a strategy to use with passport.
Passport
Passport is a well-known library that is popular and flexible to use. In fact, passport is flexible middleware that can be fully customized. Passport supports a number of different strategies for authenticating a user.
In order to use passport you have to install the following package: npm i passport. Before you see how to implement the authentication, you must implement the userService and the userModel.
Manual implementation
In this section, we will implement the authentication manually using
passport without using the Nest.js package.
Implementation
In order to configure passport, three things need to be configured:
Before using passport, you must configure the strategy, and in this case we will use the passport-jwt strategy.
AUTHENTICATION MODULE
In order to have a working example, you must implement some
modules, and we will start with AuthenticationModule.
The AuthenticationModule will configure the strategy using the jwt
strategy. To configure the strategy we will extend the Strategy class
provided by the passport-jwt package.
Strategy
@Injectable()
export default class JwtStrategy extends Strategy {
    constructor(private readonly authenticationService: AuthenticationService) {
        super(
            {
                jwtFromRequest: ExtractJwt.fromAuthHeaderAsBearerToken(),
                passReqToCallback: true,
                secretOrKey: 'secret'
            },
            async (req, payload, next) => {
                return await this.verify(req, payload, next);
            }
        );
        passport.use(this);
    }
    /* ... */
}
Authentication service
@Injectable()
export class AuthenticationService {
    constructor(private readonly userService: UserService) {}
    /* ... */
}
The service injects the UserService in order to find the user using the payload passed to the validateUser method. If the email in the payload allows you to find the user, and if that user has a valid token, they can continue the authentication process.
In order to provide a token for a user who tries to log in, implement the createToken method, which takes as parameters an email and an optional ttl. The ttl (time to live) configures the token to be valid for a limited period. The value of the ttl is expressed in seconds, and the default value we have defined is 60 * 60, which means 1 hour.
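The body of createToken is not shown in this excerpt; a minimal sketch of what it could look like using the jsonwebtoken package is below. The secret, the import path of UserService, and the shape of the returned object are assumptions for this example.

import { Injectable } from '@nestjs/common';
import * as jwt from 'jsonwebtoken';
import { UserService } from '../user/user.service'; // path is hypothetical

@Injectable()
export class AuthenticationService {
    constructor(private readonly userService: UserService) {}

    public createToken(email: string, ttl?: number) {
        const expiresIn = ttl || 60 * 60; // default: one hour, in seconds
        const secretOrKey = 'secret';     // must match the secret configured in the strategy
        const token = jwt.sign({ email }, secretOrKey, { expiresIn });
        return {
            expires_in: expiresIn,
            access_token: token,
        };
    }
}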
Authentication controller
@Controller()
export class AuthenticationController {
    constructor(
        private readonly authenticationService: AuthenticationService,
        private readonly userService: UserService
    ) {}

    @Post('login')
    @HttpCode(HttpStatus.OK)
    public async login(@Body() body: any, @Res() res): Promise<any> {
        if (!body.email || !body.password) {
            return res.status(HttpStatus.BAD_REQUEST).send('Missing email or password.');
        }

        // look up the user by email (this lookup is elided in the original excerpt)
        const user = await this.userService.findOne({ where: { email: body.email } });
        if (!user) {
            return res.status(HttpStatus.NOT_FOUND).send('No user found with this email.');
        }

        const result = this.authenticationService.createToken(user.email);
        return res.json(result);
    }
}
Module
@Module({})
export class AuthenticationModule {
    static forRoot(strategy?: 'jwt' | 'OAuth' | 'Facebook'): DynamicModule {
        strategy = strategy ? strategy : 'jwt';
        const strategyProvider = {
            provide: 'Strategy',
            useFactory: async (authenticationService: AuthenticationService) => {
                const Strategy = (await import(`./passports/${strategy}.strategy`)).default;
                return new Strategy(authenticationService);
            },
            inject: [AuthenticationService]
        };
        return {
            module: AuthenticationModule,
            imports: [UserModule],
            controllers: [AuthenticationController],
            providers: [AuthenticationService, strategyProvider],
            exports: [strategyProvider]
        };
    }
}
As you can see, the module is not defined as a normal module: it has no components or controllers defined in the @Module() decorator. In fact, this module is a dynamic module. In order to support multiple strategies, we implement a static method on the class so that it can be called when the module is imported by another one. This forRoot method takes as a parameter the name of the strategy that you want to use and creates a strategyProvider, which is added to the providers list of the returned module. This provider will instantiate the strategy and provide the AuthenticationService as a dependency.
USER MODULE
The UserModule provides a service, a controller, and a model (see the
sequelize chapter for the User model). We create some methods in
the UserService in order to manipulate the data concerning the user.
These methods are used in the UserController in order to provide some
features to the user of the API.
Not all of these features can be used by every user, and some are restricted in the data that they return.
User service
@Injectable()
export class UserService {
    // The SequelizeInstance comes from the DatabaseModule; have a look at the Sequelize chapter
    constructor(
        @Inject('UserRepository') private readonly UserRepository: typeof User,
        @Inject('SequelizeInstance') private readonly sequelizeInstance
    ) {}
    /* ... */
}
{
    where: {
        email: 'some@email.test',
        firstName: 'someFirstName'
    }
}
Using this criteria, we can find the corresponding user. This method
will return only one result.
@Injectable()
export class UserService {
    /* ... */
    // sketch: the original body is elided; it passes the criteria to Sequelize's findOne
    public async findOne(options?: object): Promise<User | null> {
        return await this.UserRepository.findOne<User>(options);
    }
    /* ... */
}
@Injectable()
export class UserService {
    /* ... */
    /* ... */
}
Then we need a way to create a new user in the database, passing a user object that respects the IUser interface. As you can see, this method uses a this.sequelizeInstance.transaction transaction to avoid reading the data before everything is finished. It also passes the returning parameter to the create function in order to get back the instance of the user that has been created.
@Injectable()
export class UserService {
    /* ... */
    // sketch (original body elided): create the user inside a transaction, returning the instance
    public async create(user: IUser): Promise<User> {
        return await this.sequelizeInstance.transaction(async transaction =>
            this.UserRepository.create<User>(user, { returning: true, transaction }));
    }
    /* ... */
}
Of course, if you can create a user, you also need the possibility to update it, with the following method that also follows the IUser interface. This method too will return the instance of the user that has been updated.
@Injectable()
export class UserService {
    /* ... */
    public async update(id: number, newValue: IUser): Promise<User | null> {
        return await this.sequelizeInstance.transaction(async transaction => {
            let user = await this.UserRepository.findById<User>(id, { transaction });
            if (!user) throw new Error('The user was not found.');
            /* ... */
        });
    }
    /* ... */
}
@Injectable()
export class UserService {
    /* ... */
    // sketch (original body elided): delete the user inside a transaction
    public async delete(id: number): Promise<void> {
        await this.sequelizeInstance.transaction(async transaction =>
            this.UserRepository.destroy({ where: { id }, transaction }));
    }
    /* ... */
}
In all of the previous examples, we have defined a complete UserService that allows us to manipulate the data: we can create, read, update, and delete a user.
User model
If you wish to see the implementation of the user model, you can refer
to the Sequelize chapter.
User controller
@Controller()
export class UserController {
constructor(private readonly userService: UserService) { }
/* ... */
}
We provide a GET users route that allows access to all of the users in the database. As you will see, we don't want a user to access the data of all of the users, only their own. This is why we use a guard that only allows a user to access their own data.
@Controller()
export class UserController {
/* ... */
@Get('users')
@UseGuards(CheckLoggedInUserGuard)
public async index(@Res() res) {
const users = await this.userService.findAll();
return res.status(HttpStatus.OK).json(users);
}
/* ... */
}
The user also has access to a route that allows creating a new user. Of course, a new user has to be able to register without being logged in, so this route is left unrestricted.
@Controller()
export class UserController {
/* ... */
@Post('users')
public async create(@Body() body: any, @Res() res) {
if (!body || (body && Object.keys(body).length === 0)) throw
new Error('Missing some information.');
await this.userService.create(body);
return res.status(HttpStatus.CREATED).send();
}
/* ... */
}
We also provide a GET users/:id route that allows you to get a user by their ID. Of course, a logged-in user should not be able to access the data of another user, even through this route. This route is also protected by a guard that only allows a user access to himself and not to another user.
@Controller()
export class UserController {
    /* ... */
    @Get('users/:id')
    @UseGuards(CheckLoggedInUserGuard)
    public async show(@Param() id: number, @Res() res) {
        if (!id) throw new Error('Missing id.');
        /* ... */
    }
    /* ... */
}
A user may want to update some of their own information, which is why we provide a way to update a user through the following PUT users/:id route. This route is also protected by a guard to prevent a user from updating another user.
@Controller()
export class UserController {
    /* ... */
    @Put('users/:id')
    @UseGuards(CheckLoggedInUserGuard)
    public async update(@Param() id: number, @Body() body: any, @Res() res) {
        if (!id) throw new Error('Missing id.');
        /* ... */
    }
    /* ... */
}
The deletion route finishes off the handlers. This route also has to be protected by a guard to prevent a user from deleting another user; the only user a user can delete is himself.
@Delete('users/:id')
@UseGuards(CheckLoggedInUserGuard)
public async delete(@Param() id: number, @Res() res) {
if (!id) throw new Error('Missing id.');
await this.userService.delete(id);
return res.status(HttpStatus.OK).send();
}
}
@Module({
imports: [],
controllers: [UserController],
providers: [userProvider, UserService],
exports: [UserService]
})
export class UserModule {}
APP MODULE
The AppModule imports three modules for our example. In the end, the module should look like the following example.
@Module({
imports: [
DatabaseModule,
// Here we specify the strategy
AuthenticationModule.forRoot('jwt'),
UserModule
]
})
export class AppModule implements NestModule {
public configure(consumer: MiddlewaresConsumer) {
consumer
.apply(AuthenticationMiddleware)
.with(strategy)
            .forRoutes(
                { path: '/users', method: RequestMethod.GET },
                { path: '/users/:id', method: RequestMethod.GET },
                { path: '/users/:id', method: RequestMethod.PUT },
                { path: '/users/:id', method: RequestMethod.DELETE }
            );
}
}
Authentication middleware
As seen in the previous section, we have applied the AuthenticationMiddleware, and we have seen that passport is middleware used to authenticate the user. This middleware executes the passport.authenticate method using the jwt strategy, passing a callback function that receives the result of the authentication. As a result we can receive the payload corresponding to the token, or an error in case the authentication does not work.
@Injectable()
export class AuthenticationMiddleware implements NestMiddleware {
    constructor(private userService: UserService) {}

    async resolve(strategy: string): Promise<ExpressMiddleware> {
        return async (req, res, next) => {
            return passport.authenticate(strategy, async (...args: any[]) => {
                const [, payload, err] = args;
                if (err) {
                    return res.status(HttpStatus.BAD_REQUEST).send('Unable to authenticate the user.');
                }
                /* ... */
            })(req, res, next);
        };
    }
}
If the authentication works, we are able to store the user on the request req so that it can be used by the controller or the guard. The middleware implements the NestMiddleware interface in order to implement the resolve function. It also injects the UserService in order to find the authenticated user.
The guards are executed after every middleware and before any pipes. The point of doing this is to separate the restriction logic from the middleware and to keep that restriction in one place.
Imagine using a guard to manage access to a specific route when you want this route to be accessible only to the logged-in user. To do that we have implemented a new guard, which has to return true if the user accessing the route is the same as the owner of the resource being accessed. With this kind of guard, you prevent a user from accessing another user's data.
@Injectable()
export class CheckLoggedInUserGuard implements CanActivate {
    canActivate(context: ExecutionContext): boolean | Promise<boolean> | Observable<boolean> {
        const req = context.switchToHttp().getRequest();
        return Number(req.params.userId) === req.user.id;
    }
}
As you can see, you get the request from the execution context, which corresponds to the route handler on the controller where the guard is applied. You then take the userId from the request parameters and compare it to the logged-in user stored on the request. If the user who wants to access the data is the same, then he can access the resource referenced in the request parameter; otherwise he will receive a 403 Forbidden.
To apply the guard to the route handler, see the following example.
@Controller()
@UseGuards(CheckLoggedInUserGuard)
export class UserController {/*...*/}
To use the @nestjs/passport package, you can reuse the exact same AuthenticationService that you implemented in the previous section, but remember to follow the next code sample.
@Injectable()
export class AuthenticationService {
    constructor(private readonly userService: UserService) {}
    /* ... */
}

@Injectable()
export default class JwtStrategy extends PassportStrategy(Strategy) {
    constructor(private readonly authenticationService: AuthenticationService) {
        super({
            jwtFromRequest: ExtractJwt.fromAuthHeaderAsBearerToken(),
            passReqToCallback: true,
            secretOrKey: 'secret'
        });
    }
    /* ... */
}
As you can see, in this new implementation of the JwtStrategy you don't need to implement the callback anymore. This is because you now extend PassportStrategy(Strategy), where Strategy is the member imported from the passport-jwt library. PassportStrategy is a mixin that will call the validate method that we have implemented, named according to the abstract member of the mixin class. This method will be called by the strategy as the validation method of the payload.
The AuthGuard takes as a parameter the name of the strategy that you want to apply, which in our example is jwt, and can also take some other parameters that follow the AuthGuardOptions interface. This interface defines three options that can be used:
session as a boolean
property as a string, to define the name of the property added to the request for the authenticated user
callback as a function that allows you to implement your own logic
By default, session is set to false and property is set to user. By default, the callback will return the user or throw an UnauthorizedException.
And that’s it, you can now authenticate the user on any controller
method and get the user from the request.
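For example, protecting a controller method with the guard provided by the package could look like the following sketch (the controller and route shown here are hypothetical; the user is read from the default 'user' property described above):

import { Controller, Get, Req, UseGuards } from '@nestjs/common';
import { AuthGuard } from '@nestjs/passport';

@Controller('profile')
export class ProfileController {
    @Get()
    @UseGuards(AuthGuard('jwt'))
    findMe(@Req() req) {
        // the authenticated user is attached to the request under the default 'user' property
        return req.user;
    }
}

The strategy itself just needs to be registered as a provider in the module, as shown below.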
@Module({
imports: [UserModule],
providers: [AuthService, JwtStrategy],
})
export class AuthModule {}
Summary
In this chapter you have learned what passport is and how to configure the different parts of a passport strategy in order to authenticate the user and store it on the request. You have also seen how to implement the different modules, the AuthenticationModule and the UserModule, in order to let a user log in and to provide some endpoints accessible to the user. Of course, we have restricted access to some data by applying the AuthenticationMiddleware and the CheckLoggedInUserGuard for more security.
You have also seen the new @nestjs/passport package, which allows you to implement classes such as AuthenticationService and JwtStrategy in a faster way, and to authenticate any user on any controller method using the AuthGuard provided by the package.
In the next chapter you will learn about the Dependency Injection
pattern.
Chapter 4. Dependency Injection system of
Nest.js
This chapter provides an overview of the Dependency Injection (DI)
pattern, which is frequently used today by the biggest frameworks. It
is a way to keep code clean and easier to use. By using this pattern
you end up with fewer coupled components and more reusable ones,
which helps accelerate the development process time.
Here we examine how dependencies were handled before the pattern existed, and how injection has changed over time, arriving at the Nest.js approach that uses TypeScript and decorators. You will also see snippets that show the advantages of this type of pattern, along with the modules provided by the framework.
export class AuthenticationService {
    public userService: UserService;
    constructor() {
        this.userService = new UserService();
    }
}
As you can see, you have to manage all of the related dependencies in
the class itself to be used inside the AuthenticationService.
// Rewritten AuthenticationService
export class AuthenticationService {
    /*
       Declaring the parameter as public also declares
       the property belonging to the class at the same time
    */
    constructor(public userService: UserService) {}
}

// Now you can instantiate the AuthenticationService like this
const userService = new UserService();
const authenticationService = new AuthenticationService(userService);
You can easily share the userService instance through all of the objects,
and it is no longer the AuthenticationService, which has to create
a UserService instance.
This makes life easier because the injector system will allow you to do
all of this without needing to instantiate the dependencies. Let’s see
this using the previous class in the next section.
This metadata will help make the framework aware that those objects
can be manipulated, injecting the needed dependencies.
@Injectable()
export class UserService { /*...*/ }
@Injectable()
export class AuthenticationService {
constructor(private userService: UserService) { }
}
This decorator will be transpiled and metadata will be added to the class. In particular, using a decorator on the class makes the design:paramtypes metadata available, which allows the injector to know the types of the arguments that the AuthenticationService depends on.
Generally, if you would like to create your own class decorator, this
one will take as parameter the target that represents the type of your
class. In the previous example, the type of the AuthenticationService is
the AuthenticationService itself. The purpose of this custom class
decorator will be to register the target in a Map of services.
// Sketch of the custom injector; the surrounding class wrapper and the services map
// are reconstructed here, since only the two methods appear in the original excerpt.
export const CustomInjector = new class {
    protected services: Map<string, Type<any>> = new Map<string, Type<any>>();

    resolve<T>(target: Type<any>): T {
        const tokens = Reflect.getMetadata('design:paramtypes', target) || [];
        const injections = tokens.map(token => CustomInjector.resolve<any>(token));
        return new target(...injections);
    }

    set(target: Type<any>) {
        this.services.set(target.name, target);
    }
};
const authenticationService =
CustomInjector.resolve<AuthenticationService>(AuthenticationService);
const isValid = authenticationService.validateUser(/* payload */);
From the main module, Nest will know all of the related modules that
you have imported, and then create the application tree to manage all
of the Dependency Injections and the scope of the modules.
After it has created the container, it will initialize the app and, during the initialization, it will instantiate an InstanceLoader and a DependenciesScanner (scanner.ts), through which Nest.js is able to scan every module and all of the metadata related to it. It does this to resolve all of the dependencies and generate the instances of all modules and services with their own injections.
If you want to know the details of the engine, we recommend that you
go deep into the two classes: InstanceLoader and DependenciesScanner.
ApplicationModule
AuthenticationModule
UserModule
@Module({
imports: [UserModule, AuthenticationModule]
})
export class ApplicationModule {/*...*/}
@Module({
imports: [UserModule],
providers: [AuthenticationService]
})
export class AuthenticationModule {/*...*/}
@Injectable()
export class AuthenticationService {
constructor(private userService: UserService) {}
}
And the UserModule:
@Module({
providers: [UserService],
exports: [UserService]
})
export class UserModule {/*...*/}
@Injectable()
export class UserService {/*...*/}
Here is an example:
@Module({
imports: [DatabaseModule, UserModule]
})
export class ApplicationModule {/*...*/}
@Global()
@Module({
providers: [databaseProvider],
exports: [databaseProvider]
})
export class DatabaseModule {/*...*/}
@Module({
providers: [UserService],
exports: [UserService]
})
export class UserModule {/*...*/}
@Injectable()
export class UserService {
    // SequelizeInstance is provided by the DatabaseModule, which is registered as a global module
    constructor(@Inject('SequelizeInstance') private readonly sequelizeInstance) {}
}
With all the previous information, you should now be familiar with the
mechanism of the Nest.js dependency injection and have a better
understanding of how they work together.
The difference between Nest.js and Angular DI
Even though Nest.js is largely based on Angular, there is a major difference between them. In Angular, each service is a singleton, the same as in Nest.js, but it is possible to ask Angular to provide a new instance of a service. To do that in Angular, you can use the providers property of the @Component() decorator to register a provider for that component, so that a new instance is available only to that component. That can be useful to avoid different components overwriting each other's state.
Summary
To recap, in this chapter we have seen how inflexible and hard to test an object is without Dependency Injection. We have also seen how the technique for providing dependencies has evolved: first instantiating the dependencies inside the dependent itself, then passing them manually into the constructor, and finally arriving at the injector system, which resolves the dependency tree and injects the dependencies into the constructor automatically. This is how Nest.js uses the pattern.
In the next chapter we will see how Nest.js uses TypeORM, an Object
Relational Mapping (ORM) that works with several different relational
databases.
Chapter 5. TypeORM
Almost every time you use Nest.js in the real world, you need some
kind of persistence for your data. That is, you need to save the data
that the Nest.js app receives somewhere, and you need to read data
from somewhere so that you can then pass that data as a response to
the requests that the Nest.js app receives.
TypeORM allows you to use both the data mapper pattern, as well as
the active record pattern. We will focus on the active record pattern
as it greatly reduces the amount of boilerplate code needed to use in
the context of a typical Nest.js architecture, like the one explained
throughout the book.
TypeORM can also work with MongoDB, though in this case using a
dedicated NoSQL ORM such as Mongoose is a more common
approach.
About MariaDB
MariaDB is an open source, community-driven project led by some of
the original developers of MySQL. It was forked from MySQL when
Oracle acquired the latter with the intention of keeping it free and
open under the GNU General Public License.
The original idea of the project was to act as a drop-in replacement for MySQL. This remains largely true for versions up to 5.5, while MariaDB kept its version numbers in sync with MySQL's.
Getting started
TypeORM is of course distributed as an npm package. You need to
run npm install typeorm @nestjs/typeorm.
You also need a TypeORM database driver; in this case, we will install
the MySQL/MariaDB one with npm install mysql.
version: '3'

volumes:
  # for persistence between restarts
  mariadb_data:

services:
  mariadb:
    image: mariadb:latest
    restart: always
    ports:
      - "3306:3306"
    environment:
      MYSQL_ROOT_PASSWORD: secret
      MYSQL_DATABASE: nestbook
      MYSQL_USER: nest
      MYSQL_PASSWORD: nest
    volumes:
      - mariadb_data:/var/lib/mysql

  api:
    build:
      context: .
      dockerfile: Dockerfile
      args:
        - NODE_ENV=development
    depends_on:
      - mariadb
    links:
      - mariadb
    environment:
      PORT: 3000
    ports:
      - "3000:3000"
    volumes:
      - .:/app
      - /app/node_modules
    command: >
      npm run start:dev
Here is an example configuration file that suits our use case (i.e. using Docker Compose with the configuration previously proposed).
ormconfig.json
{
    "type": "mariadb",
    "host": "mariadb",
    "port": 3306,
    "username": "nest",
    "password": "nest",
    "database": "nestbook",
    "synchronize": true,
    "entities": ["src/**/*.entity.ts"]
}
The
properties host, port, username, password and database need to
match the ones specified earlier in the docker-compose.yml file;
otherwise, TypeORM will not be able to connect to the
MariaDB Docker image.
The synchronize property tells TypeORM whether to
create or update the database schema whenever the
application starts, so that the schemas match the entities
declared in the code. Setting this property to true can easily
lead to loss of data, so make sure you know what you’re
doing before enabling this property in production
environments.
Initialize TypeORM
Now that the database is running and you are able to successfully
establish a connection between it and our Nest.js app, we need to
instruct Nest.js to use TypeORM as a module.
@Module({
imports: [
TypeOrmModule.forRoot(),
...
]
})
With that said, let’s create our first entity, which we will name Entry.
We will use this entity to store entries (posts) for our blog. We will
create a new file at src/entries/entry.entity.ts; that way TypeORM will
be able to find this entity file since earlier in our configuration we
specified that entity files will follow the src/**/*.entity.ts file naming
convention.
@Entity()
export class Entry {}
The @Entity() decorator from the typeorm npm package is used to mark
the Entry class as an entity. This way, TypeORM will know that it needs
to create a table in our database for these kinds of objects.
The Entry entity is still a bit too simple: we haven’t defined a single
property for it. We will probably need things like a title, a body, an
image and a date for our blog entries, right? Let’s do it!
@Entity()
export class Entry {
@Column() title: string;
Not bad! Each property we define for our entity is marked with
a @Column decorator. Again, this decorator tells TypeORM how to treat
the property: in this case, we are asking for each property to be stored
in a column of the database.
Sadly, this entity will not work with this code. This is because each
entity needs to have at least one primary column, and we didn’t mark
any column as such.
Our best bet is to create an id property for each entry and store that on
a primary column.
@Entity()
export class Entry {
@PrimaryColumn() id: number;
Ah, that’s better! Our first entity is working now. Let’s use it!
@Component()
export class EntriesService {
    constructor(
        // we create a repository for the Entry entity
        // and then we inject it as a dependency in the service
        @InjectRepository(Entry) private readonly entry: Repository<Entry>
    ) {}
    /* ... */
}
Coming back to the latest service code, once you have created and
injected the Entry repository, use it to .find() and .save() entries from
the database, among other things. These helpful methods are added
when we create a repository for the entity.
Now that we have taken care of both the data model and the service,
let’s write the code for the last link: the controller.
The controller
Let’s create a controller for exposing the Entry model to the outside
world through a RESTful API. The code is really simple, as you can see.
@Controller('entries')
export class EntriesController {
constructor(private readonly entriesSrv: EntriesService) {}
@Get()
findAll() {
return this.entriesSrv.findAll();
}
@Get(':entryId')
findOneById(@Param('entryId') entryId) {
return this.entriesSrv.findOneById(entryId);
}
@Post()
create(@Body() entry) {
return this.entriesSrv.create(entry);
}
}
@Module({
imports: [TypeOrmModule.forFeature([Entry])],
controllers: [EntriesController],
components: [EntriesService],
})
export class EntriesModule {}
Anyway, let’s import the new EntriesModule into the AppModule. If you
neglect this step, your main app module won’t be aware of the
existence of the EntriesModule and your app will not work as intended.
src/app.module.ts
@Module({
imports: [
TypeOrmModule.forRoot(),
EntriesModule,
...
]
})
That's it! Now you can fire requests to /entries and the endpoint will invoke writes and reads from the database.
It’s time to give our database a try! We will fire some requests to the
endpoints that we previously linked to the database and see if
everything works as expected.
[]
{
"id": 1,
"title": "This is our first post",
"body": "Bla bla bla bla bla",
"image": "http://lorempixel.com/400",
"created_at": "2018-04-15T17:42:13.911Z"
}
Yes! Our previous POST request triggered a write in the database and
now this last GET request is triggering a read from the database, and
returning the data previously saved!
[{
"id": 1,
"title": "This is our first post",
"body": "Bla bla bla bla bla",
"image": "http://lorempixel.com/400",
"created_at": "2018-04-15T17:42:13.911Z"
}]
Auto-generated IDs
All of the database entries need to have a unique ID. At this point, we are simply relying on the ID sent by the client when creating the entity (when sending the POST request), but this is less than desirable.
@Entity()
export class Entry {
@PrimaryGeneratedColumn() id: number;
@Entity()
export class Entry {
@PrimaryGeneratedColumn('uuid') id: string;
import {
Entity,
Column,
CreateDateColumn,
PrimaryGeneratedColumn,
} from 'typeorm';
@Entity()
export class Entry {
@PrimaryGeneratedColumn('uuid') id: string;
Nice, isn’t it? How about knowing also when the entry was last
modified, as well as how many revisions have been done to it? Again,
TypeORM makes both easy to do, and requires no additional code on
our side.
import {
Entity,
Column,
PrimaryGeneratedColumn,
CreateDateColumn,
UpdateDateColumn,
VersionColumn,
} from 'typeorm';
@Entity()
export class Entry {
@PrimaryGeneratedColumn('uuid') id: string;
Our entity will now automagically handle the modification date, as well as the revision number, on each subsequent save operation. You can track changes made to each instance of the entity without having to implement a single line of code!
Column types
When defining columns in our entities using decorators, as exposed
above, TypeORM will infer the type of database column from the used
property type. This basically means that when TypeORM finds a line
like @Column() title: string; (as in our Entry entity above), it maps the string property type to a varchar database column type.
This will work just fine a lot of the time, but on some occasions we might find ourselves needing to be more explicit about the type of the columns to be created in the database. Fortunately, TypeORM allows this kind of custom behavior with very little overhead.
The exact column types that can be used depend on the type of
database you are using.
For MySQL / MariaDB:
int, tinyint, smallint, mediumint, bigint, float, double, dec, decimal, numeric, date, datetime, timestamp, time, year, char, varchar, nvarchar, text, tinytext, mediumtext, blob, longtext, tinyblob, mediumblob, longblob, enum, json, binary, geometry, point, linestring, polygon, multipoint, multilinestring, multipolygon, geometrycollection

For PostgreSQL:
int, int2, int4, int8, smallint, integer, bigint, decimal, numeric, real, float, float4, float8, double precision, money, character varying, varchar, character, char, text, citext, hstore, bytea, bit, varbit, bit varying, timetz, timestamptz, timestamp, timestamp without time zone, timestamp with time zone, date, time, time without time zone, time with time zone, interval, bool, boolean, enum, point, line, lseg, box, path, polygon, circle, cidr, inet, macaddr, tsvector, tsquery, uuid, xml, json, jsonb, int4range, int8range, numrange, tsrange, tstzrange, daterange

For SQLite / Cordova / React Native:
int, int2, int8, integer, tinyint, smallint, mediumint, bigint, decimal, numeric, float, double, real, double precision, datetime, varying character, character, native character, varchar, nchar, nvarchar2, unsigned big int, boolean, blob, text, clob, date

For Microsoft SQL Server:
int, bigint, bit, decimal, money, numeric, smallint, smallmoney, tinyint, float, real, date, datetime2, datetime, datetimeoffset, smalldatetime, time, char, varchar, text, nchar, nvarchar, ntext, binary, image, varbinary, hierarchyid, sql_variant, timestamp, uniqueidentifier, xml, geometry, geography
NoSQL in SQL
TypeORM has still one last trick in the hat: a simple-json column type
that can be used in every supported database. With it, you can directly
save Plain Old JavaScript Objects in one of the relational database
columns. Yes, mindblowing!
import {
Entity,
Column,
PrimaryGeneratedColumn,
CreateDateColumn,
UpdateDateColumn,
VersionColumn,
} from 'typeorm';
@Entity()
export class Entry {
@PrimaryGeneratedColumn('uuid') id: string;
The simple-json column type allows you to directly store even complex
JSON trees without needing to define a model for them first. This can
come handy in situations where you appreciate a bit more flexibility
than the traditional relational database structure allows.
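As a sketch of what this could look like on our Entry entity (the metadata property name and its shape are invented for this example):

import { Entity, Column, PrimaryGeneratedColumn } from 'typeorm';

@Entity()
export class Entry {
    @PrimaryGeneratedColumn('uuid') id: string;

    // any JSON-serializable object can be stored here without defining a model for it first
    @Column('simple-json') metadata: { [key: string]: any };
}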
import {
Entity,
Column,
PrimaryGeneratedColumn,
CreateDateColumn,
UpdateDateColumn,
VersionColumn,
} from 'typeorm';
@Entity()
export class Comment {
@PrimaryGeneratedColumn('uuid') id: string;
You have probably noticed that the Comment entity is quite similar to
the Entry entity.
import {
Entity,
Column,
PrimaryGeneratedColumn,
CreateDateColumn,
UpdateDateColumn,
VersionColumn,
OneToMany,
} from 'typeorm';
@Entity()
export class Entry {
@PrimaryGeneratedColumn('uuid') id: string;
import {
Entity,
Column,
PrimaryGeneratedColumn,
CreateDateColumn,
UpdateDateColumn,
VersionColumn,
ManyToOne,
} from 'typeorm';
@Entity()
export class Comment {
@PrimaryGeneratedColumn('uuid') id: string;
The second argument that we're passing to both the @OneToMany() and the @ManyToOne() decorators specifies the inverse relationship that we're also creating on the other related entity. In other words, in the Entry we are saving the related Comment entities in a property named comments. That's why, in the Comment entity definition, we pass entry => entry.comments as the second argument to the decorator: it points to where the comments are stored on the Entry.
That’s it! Now each of our entries can have several comments.
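The excerpts above omit the actual relation properties; a minimal sketch of how they could be declared (shown in a single file for brevity, with property names following the prose above and the title/body columns as placeholders) is:

import {
    Entity,
    Column,
    PrimaryGeneratedColumn,
    OneToMany,
    ManyToOne,
} from 'typeorm';

@Entity()
export class Entry {
    @PrimaryGeneratedColumn('uuid') id: string;

    @Column() title: string;

    // one entry has many comments; the second argument points to the inverse side
    @OneToMany(type => Comment, comment => comment.entry)
    comments: Comment[];
}

@Entity()
export class Comment {
    @PrimaryGeneratedColumn('uuid') id: string;

    @Column() body: string;

    // many comments belong to one entry
    @ManyToOne(type => Entry, entry => entry.comments)
    entry: Entry;
}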
@Component()
export class CommentsService {
constructor(
@InjectRepository(Comment) private readonly comment:
Repository<Comment>
) {}
findAll() {
return this.comment.find();
}
findOneById(id: number) {
return this.comment.findOneById(id);
}
create(comment: Comment) {
return this.comment.save(comment);
}
}
The code sure looks familiar, doesn’t it? It’s very similar to
the EntriesService that we already had, since we are providing quite
the same functionality for both comments and entries.
@Get()
findAll() {
return this.entriesSrv.findAll();
}
@Get(':entryId')
findOneById(@Param('entryId') entryId) {
return this.entriesSrv.findOneById(entryId);
}
@Post()
async create(@Body() input: { entry: Entry; comments: Comment[] }) {
    const { entry, comments } = input;

    entry.comments = [];

    await comments.forEach(async comment => {
        await this.commentsSrv.create(comment);
        entry.comments.push(comment);
    });

    return this.entriesSrv.create(entry);
}
}
Setting cascade to true in our entity will mean that we’ll no longer need
to separately save each related entity; rather, saving the owner of the
relationship to the database will save those related entities at the same
time. This way, our previous code can be simplified.
First of all, let’s modify our Entry entity (which is the owner of the
relationship) to enable cascade.
src/entries/entry.entity.ts
import {
Entity,
Column,
PrimaryGeneratedColumn,
CreateDateColumn,
UpdateDateColumn,
VersionColumn,
OneToMany,
} from 'typeorm';
@Entity()
export class Entry {
@PrimaryGeneratedColumn('uuid') id: string;
This was really easy: we just added a {cascade: true} object as the third argument of the @OneToMany() decorator.
@Controller('entries')
export class EntriesController {
constructor(private readonly entriesSrv: EntriesService) {}
@Get()
findAll() {
return this.entriesSrv.findAll();
}
@Get(':entryId')
findOneById(@Param('entryId') entryId) {
    return this.entriesSrv.findOneById(entryId);
}
@Post()
async create(@Body() input: { entry: Entry; comments: Comment[] })
{
const { entry, comments } = input;
entry.comments = comments;
return this.entriesSrv.create(entry);
}
}
In this section we found out how to save entities that are related one to
another, while saving their relationship as well. This is a crucial step
for the success of our related entities. Nice job!
The idea in this case is that, when we request a blog entry (only one)
from the database, we also get the comments that belong to it.
We will need to modify the Entries service to achieve this. Again, it’s
going to be quite easy!
src/entries/entries.service.ts
@Component()
export class EntriesService {
constructor(
@InjectRepository(Entry) private readonly entry:
Repository<Entry>
) {}
findAll() {
return this.entry.find();
}
findOneById(id: number) {
return this.entry.findOneById(id, { relations: ['comments'] });
}
create(newEntry: Entry) {
this.entry.save(newEntry);
}
}
Lazy relationships
When working with TypeORM, regular relationships (like the ones we have written so far) are eager relationships. This means that when we read entities from the database, the find*() methods will return the related entities as well, without us needing to write joins or manually read them.
@Controller('entries')
export class EntriesController {
constructor(
private readonly entriesSrv: EntriesService,
private readonly commentsSrv: CommentsService
) {}
@Get()
findAll() {
return this.entriesSrv.findAll();
}
@Get(':entryId')
findOneById(@Param('entryId') entryId) {
    return this.entriesSrv.findOneById(entryId);
}
@Post()
async create(@Body() input: { entry: Entry; comments: Comment[] })
{
const { entry, comments } = input;
const resolvedComments = [];
await comments.forEach(async comment => {
await this.commentsSrv.create(comment);
resolvedComments.push(comment);
});
entry.comments = Promise.resolve(resolvedComments);
return this.entriesSrv.create(entry);
}
}
That said, be aware that TypeORM support for lazy relationships is still
in the experimental phase, so use them with care.
Let's start by creating a new entity called EntryMetadata. We will put the file in the src/entries folder, next to the entry.entity.ts file.
src/entries/entry_metadata.entity.ts
@Entity()
export class EntryMetadata {
@PrimaryGeneratedColumn('uuid') id: string;
import {
Entity,
Column,
PrimaryGeneratedColumn,
CreateDateColumn,
UpdateDateColumn,
VersionColumn,
OneToMany,
OneToOne,
JoinColumn,
} from 'typeorm';
@Entity()
export class Entry {
@PrimaryGeneratedColumn('uuid') id: string;
So, just for the sake of demonstration, we will include the inverse relationship from the EntryMetadata instance to the Entry instance, so that you know how it works.
src/entries/entry_metadata.entity.ts
@Entity()
export class EntryMetadata {
@PrimaryGeneratedColumn('uuid') id: string;
Make sure you don’t include the @JoinColumn() decorator on this second
entry. That decorator should only be used in the owner entity; in our
case, in Entry.
@Entity()
export class Entry {
@PrimaryGeneratedColumn('uuid') id: string;
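As a hedged sketch (the property names metadata and entry are assumptions for illustration), the owner side and the inverse side could look like this:

// in Entry (the owner side)
@OneToOne(type => EntryMetadata, metadata => metadata.entry, { cascade: true })
@JoinColumn()
metadata: EntryMetadata;

// in EntryMetadata (the inverse side; note there is no @JoinColumn() here)
@OneToOne(type => Entry, entry => entry.metadata)
entry: Entry;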
By the way, if you’re wondering how we could save and then retrieve these two related entities, I’ve got good news for you: it works the same way we saw with one-to-many relationships. So, either do it by hand as shown earlier in this chapter, or (my personal favorite) use “cascades” for saving them, and find*() to retrieve them!
Many-to-many
The last type of relationship that we can establish for our entities is
known as “many-to-many.” This means that multiple instances of the
owning entity can include multiple instances of the owned entity.
We will save some code here, because these relationships are declared in exactly the same way as the “one-to-one” relationships, only changing the @OneToOne() decorator to @ManyToMany(), as in the sketch below.
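A quick hedged sketch, assuming a hypothetical Tag entity (not part of our blog example); note that TypeORM also expects a @JoinTable() decorator on the owning side of a many-to-many relationship:

// in the owning entity
@ManyToMany(type => Tag)
@JoinTable()
tags: Tag[];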
Advanced TypeORM
Let’s take a look at security.
Security first
If you went through the Sequelize chapter in this same book, you might
be familiar with the concept of lifecycle hooks. In that chapter, we are
using a beforeCreate hook to encrypt the users’ passwords before we
save them to our database.
In case you’re wondering if such a thing exists also in TypeORM, the
answer is yes! Though the TypeORM documentation refers to them as
“listeners” instead.
So, to demonstrate its functionality, let’s write a very simple User entity
with a username and a password, and we will make sure to encrypt the
password before we save it to the database. The specific listener we
will be using is called beforeInsert in TypeORM.
import * as crypto from 'crypto';

@Entity()
export class User {
@PrimaryGeneratedColumn('uuid') id: string;

@Column() username: string;

@Column() password: string;

@BeforeInsert()
encryptPassword() {
this.password = crypto.createHmac('sha256', this.password).digest('hex');
}
}
Other listeners
In general, a listener is a method that gets triggered upon a specific
event within TypeORM, be it write-related or read-related. We just
learned about the @BeforeInsert() listener, but we have a few other
ones we can take advantage of:
@AfterLoad()
@BeforeInsert()
@AfterInsert()
@BeforeUpdate()
@AfterUpdate()
@BeforeRemove()
@AfterRemove()
Composing and extending entities
TypeORM offers two different ways of reducing code duplication
between entities. One of them follows the composition pattern, while
the other follows the inheritance pattern.
EMBEDDED ENTITIES
The way of composing entities in TypeORM is using an artifact known
as embedded entity.
Let’s go with the example: after reviewing the code we wrote earlier
for the entities of both Entry and Comment, we can easily see that there
are (among others) three duplicated
properties: created_at, modified_at and revision.
We will first create a Versioning entity (the name is not great, I know,
but should work for you to see the idea) with those three duplicated
properties.
src/common/versioning.entity.ts
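A minimal sketch of what this embeddable class could contain, based on the three duplicated properties mentioned above (note that embedded entities do not get the @Entity() decorator):

import { CreateDateColumn, UpdateDateColumn, VersionColumn } from 'typeorm';

export class Versioning {
  @CreateDateColumn() created_at: Date;

  @UpdateDateColumn() modified_at: Date;

  @VersionColumn() revision: number;
}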
So, now we will embed this new “embeddable” entity into our two
original entities.
src/entries/entry.entity.ts
import {
Entity,
Column,
PrimaryGeneratedColumn,
OneToMany,
OneToOne,
JoinColumn,
} from 'typeorm';
@Entity()
export class Entry {
@PrimaryGeneratedColumn('uuid') id: string;
src/comments/comment.entity.ts
@Entity()
export class Comment {
@PrimaryGeneratedColumn('uuid') id: string;
Even in this really simple case, we’ve reduced the two original entities
from three different properties to only one! In both the Entry entity and
the Comment entity, the versioning column will be actually replaced by
the properties inside the Versioning embedded entity when we invoke
any of their reading or writing methods.
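As a hedged sketch, embedding the class boils down to a single property in each entity (the property name versioning is an assumption):

// in both Entry and Comment
@Column(type => Versioning)
versioning: Versioning;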
ENTITY INHERITANCE
The second choice that TypeORM offers for reusing code between our
entities is using entity inheritance.
For this particular example, let’s imagine that our Nest.js-based blog
has been online for some time, and that it has become quite a success.
Now we would like to introduce sponsored blog entries so that we can
make a few bucks and invest them in a few more books.
The thing is, sponsored entries are going to be a lot like regular entries,
but with a couple of new properties: sponsor name and sponsor URL.
In this case, we might decide, after quite some thought, to extend our
original Entry entity and create a SponsoredEntry out of it.
src/entries/sponsored-entry.entity.ts
@Entity()
export class SponsoredEntry extends Entry {
@Column() sponsorName: string;
@Column() sponsorUrl: string;
}
Caching
TypeORM brings a caching layer out of the box. We can take advantage
of it with only a little overhead. This layer is specially useful if you are
designing an API that expects a lot of traffic and/or you need the best
performance you can get.
Both cases benefit increasingly from the cache as we use more complex data retrieval scenarios, such as complex find*() options, lots of related entities, etc.
The caching needs to be explicitly activated when connecting to the
database. In our case so far, this will be the ormconfig.json file that we
created at the beginning of the chapter.
ormconfig.json
"type": "mariadb",
"host": "db",
"port": 3306,
"username": "nest",
"password": "nest",
"database": "nestbook",
"synchronize": true,
"entities": ["src/**/*.entity.ts"],
"cache": true
}
The line of code above will make the .find() method return the
cached value if it’s present and not expired, or the value from the
corresponding database table otherwise. So, even if the method is fired
three thousand times within the expiration time window (see below),
only one database query would be actually executed.
ormconfig.json
"type": "mariadb",
...
"cache": {
"type": "redis",
"options": {
"host": "localhost",
"port": 6379
}
}
}
This way we can easily improve the performance of our API under
heavy load.
Building a query
TypeORM’s repository methods for retrieving data from our database isolate much of the complexity of querying away from us. They provide a very useful abstraction, so that we don’t need to bother with actual database queries. For more complex cases, though, TypeORM also offers a QueryBuilder, which we can use like this:
...
findOneById(id: number) {
return getRepository(Entry)
.createQueryBuilder('entry')
.where('entry.id = :id', { id })
.leftJoinAndSelect('entry.comments', 'comment')
.getOne();
}
...
There’s a nice open source project we can use to generate TypeORM entity files from an existing database: typeorm-model-generator. It’s packaged as a command-line tool and can be run with npx.
NOTE: In case you’re not familiar with it, npx is a command that comes
out of the box with npm > 5.2 and that allows us to run npm modules
from the command line without having to install them first. To use it,
you just need to prepend npx to the regular commands from the tool.
We would use npx ng new PROJECT-NAME on our command line, for
example, if we wanted to scaffold a new project with Angular CLI.
Since this is a useful tool for only some very specific use cases, we will
leave the configuration details out of this book. However, if you find
yourself using this tool, go ahead and check its GitHub repository.
Summary
TypeORM is a very useful tool and enables us to do a lot of heavy lifting
when dealing with databases, while greatly abstracting things like data
modelling, queries, and complex joins, thus simplifying our code.
It’s also very suitable for use in Nest.js-based projects, thanks to the great support the framework provides through the @nestjs/typeorm package.
All in all, we think that the more familiar you grow with Nest.js, the more comfortable you are likely to feel writing TypeORM code, since they look alike in a few aspects, such as their extensive use of TypeScript decorators.
Sequelize also comes with many hooks, providing you with the significant advantage of being able to check and manipulate your data at any level of a transaction.
Configure Sequelize
In order to be able to use Sequelize, we first have to set up the connection between Sequelize and our database. To do that, we will create the DatabaseModule, which will contain the provider of the Sequelize instance.
This instance needs to be aware of the models it should manage. To tell Sequelize which models we need, we use the addModels method on the instance and pass it an array of models.
Of course, in the following section we will see how to implement a new
model.
This provider will return the instance of Sequelize. This instance will
be useful to use the transaction provided by Sequelize. Also, in order to
be able to inject it, we have provided in the provide parameter, the
name of the token SequelizeInstance, which will be used to inject it.
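A minimal sketch of such a provider, assuming sequelize-typescript and placeholder connection values (the dialect, credentials, and the import path of the User model are assumptions to adjust to your own setup):

import { Sequelize } from 'sequelize-typescript';
import { User } from '../users/user.entity';

export const databaseProvider = {
  provide: 'SequelizeInstance',
  useFactory: async () => {
    const sequelize = new Sequelize({
      dialect: 'postgres',
      host: 'db',
      port: 5432,
      username: 'nest',
      password: 'nest',
      database: 'nestbook',
    });
    // register the models Sequelize should manage
    sequelize.addModels([User]);
    await sequelize.sync();
    return sequelize;
  },
};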
@Global()
@Module({
providers: [databaseProvider],
exports: [databaseProvider],
})
export class DatabaseModule {}
We defined the DatabaseModule as a global module so that it is available to all the other modules as a related module, letting you inject the SequelizeInstance provider into any of them, as we will see later in this chapter.
Create a model
After having set up the Sequelize connection, we have to implement our model. As seen in the previous section, we tell Sequelize that we will have the User model using sequelize.addModels([User]);.
@Table
This decorator allows you to configure our representation of the data. Here are some of the parameters it accepts:
const tableOptions = {
timestamps: true,
paranoid: true,
underscored: false,
freezeTableName: true,
tableName: 'my_very_custom_table_name'
};
The timestamps parameter tells Sequelize that you want createdAt and updatedAt columns. The paranoid parameter allows you to soft delete data instead of removing it permanently: if you pass true, Sequelize will expect a deletedAt column in order to set the date of the remove action.
@Column
This decorator helps define our columns. You can also omit all parameters, in which case Sequelize will try to infer the column type. The types that can be inferred are string, boolean, number, Date and Blob.
@Column({
type: DataType.STRING,
allowNull: false,
validate: {
isEmail: true,
isUnique: async (value: string, next: any): Promise<any> => {
const isExist = await User.findOne({ where: { email: value } });
if (isExist) {
const error = new Error('The email is already used.');
next(error);
}
next();
},
},
})
In the previous example, we passed some options, but we could also
use some decorator as @AllowNull(value: boolean), @Unique or
even @Default(value: any).
@Column(DataType.STRING)
public email: string;
You can also use the options shown in the @Column section in order
to validate and ensure the data of the email.
@Column({
type: DataType.STRING,
allowNull: false,
validate: {
isEmail: true,
isUnique: async (value: string, next: any): Promise<any> => {
const isExist = await User.findOne({
where: { email: value }
});
if (isExist) {
const error = new Error('The email is already used.');
next(error);
}
next();
},
},
})
public email: string;
You now know how to set up a column, so let’s set up the rest of the
model with the column that we need for a simple user.
@Table(tableOptions)
export class User extends Model<User> {
@PrimaryKey
@AutoIncrement @Column(DataType.BIGINT)
public id: number;
@Column({
type: DataType.STRING,
allowNull: false,
})
public firstName: string;
@Column({
type: DataType.STRING,
allowNull: false,
})
public lastName: string;
@Column({
type: DataType.STRING,
allowNull: false,
validate: {
isEmail: true,
isUnique: async (value: string, next: any): Promise<any> => {
const isExist = await User.findOne({ where: { email: value } });
if (isExist) {
const error = new Error('The email is already used.');
next(error);
}
next();
},
},
})
public email: string;
@Column({
type: DataType.TEXT,
allowNull: false,
})
public password: string;
@CreatedAt
public createdAt: Date;
@UpdatedAt
public updatedAt: Date;
@DeletedAt
public deletedAt: Date;
}
Among the added columns, you can see the password of type TEXT. Of course, you cannot store a password as plain text, so we have to hash it in order to protect it. To do that, we use the lifecycle hooks provided by Sequelize.
LifeCycle hooks
Sequelize comes with many lifecycle hooks that allow you to manipulate and check data along the process of creating, updating, or deleting it.
Here are some interesting hooks from Sequelize.
beforeBulkCreate(instances, options)
beforeBulkDestroy(options)
beforeBulkUpdate(options)
beforeValidate(instance, options)
afterValidate(instance, options)
beforeCreate(instance, options)
beforeDestroy(instance, options)
beforeUpdate(instance, options)
beforeSave(instance, options)
beforeUpsert(values, options)
afterCreate(instance, options)
afterDestroy(instance, options)
afterUpdate(instance, options)
afterSave(instance, options)
afterUpsert(created, options)
afterBulkCreate(instances, options)
afterBulkDestroy(options)
afterBulkUpdate(options)
@Table(tableOptions)
export class User extends Model<User> {
...
@BeforeCreate
public static async hashPassword(user: User, options: any) {
if (!options.transaction) throw new Error('Missing transaction.');
user.password = crypto.createHmac('sha256', user.password).digest('hex');
}
}
The BeforeCreate hook written above overrides the password property of the user before the object is inserted into the database, ensuring a minimum of security.
This provider defines the token to use for injection and takes as its value the User model that we implemented before, as in the sketch below.
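A minimal sketch of what that provider could look like:

export const userProvider = {
  provide: 'UserRepository',
  useValue: User,
};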
@Injectable()
export class UserService implements IUserService {
constructor(@Inject('UserRepository') private readonly
UserRepository: typeof User) { }
...
}
After injecting the model into the service, you will be able to use it to access and manipulate the data as you want. For example, you can execute this.UserRepository.findAll() to retrieve all of the users stored in your database.
@Module({
imports: [],
providers: [userProvider, UserService],
exports: [UserService]
})
export class UserModule {}
@Injectable()
export class UserService implements IUserService {
constructor(@Inject('UserRepository') private readonly
UserRepository: typeof User,
@Inject('SequelizeInstance') private readonly
sequelizeInstance) { }
...
}
We have injected both the model and the Sequelize instance that we
talked about earlier in this chapter.
Migration
With Sequelize you have a way to sync your model and your database. The thing is, this synchronization will remove all of your data in order to recreate all of the tables representing the model. So, this feature is useful for testing, but not in production.
When you are using the command npm run migrate up, which
executes ts-node migrate.ts, you can pass up/down as a parameter. In
order to track all of the migration already applied, a new table will be
created with the default name SequelizeMeta, and all of the applied
migrations will be stored in this table.
Our migration file can be found in the repository as the root with the
name migrate.ts. Also, all of the migrations files will be stored in
the migrations folder of the repository example.
migrations: {
params: [
sequelize,
sequelize.constructor, // DataTypes
],
path: './migrations',
pattern: /\.ts$/
},
logging: function () {
console.log.apply(null, arguments);
}
});
Create a migration
To execute the migration script, provide the migration that you want to apply. Imagine that you want to create the users table using a migration: you must define both an up and a down method, as in the sketch below.
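A hedged sketch of such a migration file (the file name and the exact column definitions are illustrative; the parameters come from the umzug configuration shown above):

// migrations/1.1.create-users.ts
export async function up(sequelize, DataTypes) {
  // create the users table
  await sequelize.getQueryInterface().createTable('users', {
    id: { type: DataTypes.BIGINT, primaryKey: true, autoIncrement: true },
    firstName: { type: DataTypes.STRING, allowNull: false },
    lastName: { type: DataTypes.STRING, allowNull: false },
    email: { type: DataTypes.STRING, allowNull: false },
    password: { type: DataTypes.TEXT, allowNull: false },
    createdAt: DataTypes.DATE,
    updatedAt: DataTypes.DATE,
    deletedAt: DataTypes.DATE,
  });
}

export async function down(sequelize) {
  await sequelize.getQueryInterface().dropTable('users');
}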
Summary
In this chapter you have seen how to set up the connection to the database by instantiating a Sequelize instance, using a factory so that the instance can be injected directly anywhere it is needed.
Getting started
As a first step, we need to install the Mongoose npm package, as well as the @nestjs/mongoose npm package.
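Assuming npm as the package manager, the installation could look like this:

npm install --save mongoose @nestjs/mongoose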
Let’s create a Docker Compose file to build and start both the database
we will be using, as well as our Nest.js app, and link them together so
that we can access the database later from our code.
version: '3'
volumes:
mongo_data:
services:
mongo:
image: mongo:latest
ports:
- "27017:27017"
volumes:
- mongo_data:/data/db
api:
build:
context: .
dockerfile: Dockerfile
args:
- NODE_ENV=development
depends_on:
- mongo
links:
- mongo
environment:
PORT: 3000
ports:
- "3000:3000"
volumes:
- .:/app
- /app/node_modules
command: >
npm run start:dev
@Module({
imports: [
MongooseModule.forRoot(),
...
],
})
export class AppModule {}
There’s a catch with the code above: it won’t work, because there’s still no way for Mongoose or the MongooseModule to figure out how to connect to our MongoDB instance.
The bad news here is that, in our case, the “default” example
connection string will not work, because we are running our database
instance inside a container linked from another container, a Node.js
one, which is the one that our code runs in.
The good news, though, is that we can use that Docker Compose link to
connect to our database, because Docker Compose establishes a virtual
network connection between both containers, the MongoDB one and
the Node.js one.
The connection string we need is mongodb://mongo:27017/nest, where mongo is the name of our MongoDB container (we specified this in the Docker Compose file), 27017 is the port that the MongoDB container is exposing (27017 being the default for MongoDB), and nest is the database we will store our documents in (you’re free to change it to your heart’s content.)
Now that we have adjusted our connection string, let’s modify our
original AppModule import.
@Module({
imports: [
MongooseModule.forRoot('mongodb://mongo:27017/nest'),
...
],
})
export class AppModule {}
Let’s create our first schema. We will use it as a blueprint for storing
our blog entries. We will also place the schema next to the other blog
entries related files, grouping our files by “domain” (that is, by
functionality.)
NOTE: You’re free to organize your schemas as you see fit. We (and the
official Nest.js documentation) suggest storing them near the module
where you use each one of them. In any case, you should be good with
any other structural approach as long as you correctly import your
schema files when you need them.
src/entries/entry.schema.ts
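A minimal sketch of what this schema could look like, based on the blog entry fields used throughout this chapter (the exact field types are assumptions):

import * as mongoose from 'mongoose';

export const EntrySchema = new mongoose.Schema({
  _id: mongoose.Schema.Types.ObjectId,
  title: String,
  body: String,
  image: String,
  created_at: { type: Date, default: Date.now },
});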
@Module({
imports: [
MongooseModule.forFeature([{ name: 'Entry', schema: EntrySchema
}]),
],
})
export class EntriesModule {}
@Module({
imports: [
MongooseModule.forRoot('mongodb://mongo:27017/nest'),
EntriesModule,
...
],
})
export class AppModule {}
The service
Let’s create a service for our blog entries that interact with
the Entry model.
src/entries/entries.service.ts
@Component()
export class EntriesService {
constructor(
@InjectModel(EntrySchema) private readonly entryModel:
Model<Entry>
) {}
// this method retrieves all entries
findAll() {
return this.entryModel.find().exec();
}
In the code above, the most important bit happens inside the
constructor: we are using the @InjectModel() decorator to instantiate
our model, by passing the desired schema (in this case, EntrySchema) as
a decorator argument.
On the other hand, it’s worth mentioning that, in the create() method,
we are adding an ID to the received entry object by using
the _id property (as we previously defined on our schema) and
generating a value using Mongoose’s built-in Types.ObjectId() method.
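A hedged sketch of what that create() method could look like (assuming Types is imported from mongoose):

create(entry) {
  // assign an id before persisting the document
  entry._id = new Types.ObjectId();
  const createdEntry = new this.entryModel(entry);
  return createdEntry.save();
}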
The controller
The last step we need to cover in the model -> service -> controller
chain is the controller. The controller will make it possible to make an
API request to the Nest.js app and either write to or read from the
database.
This is what our controller should look like:
src/entries/entries.controller.ts
@Controller('entries')
export class EntriesController {
constructor(private readonly entriesSrv: EntriesService) {}
@Get()
findAll() {
return this.entriesSrv.findAll();
}
@Get(':entryId')
findById(@Param('entryId') entryId) {
return this.entriesSrv.findById(entryId);
}
@Post()
create(@Body() entry) {
return this.entriesSrv.create(entry);
}
}
[]
Yes! Our previous POST request triggered a write in the database. Let’s
try to retrieve all entries once again.
[{
"id": 1,
"title": "This is our first post",
"body": "Bla bla bla bla bla",
"image": "http://lorempixel.com/400",
"created_at": "2018-04-15T17:42:13.911Z"
}]
Relationships
While it’s true that MongoDB is not a relational database, it’s also true
that it allows “join-like” operations for retrieving two (or more) related
documents at once.
Modelling relationships
Let’s go back to our blog example. Remember that so far we only had a
schema that defined our blog entries. We will create a second schema
that will allow us to create comments for each blog entry, and save
them to the database in a way that allows us later to retrieve both a
blog entry as well as the comments that belong to it, all in a single
database operation.
One notable new thing is the entry property. It will be used to store a
reference to the entry each comment belongs to. The ref option is what
tells Mongoose which model to use during population, which in our
case is the Entry model. All _id’s we store here need to be
document _id’s from the Entry model.
NOTE: We will ignore the Comment interface for brevity; it’s simple
enough for you to be able to complete it on your own. Don’t forget to
do it!
In other words, each entry can have multiple comments, but each
comment can only belong to one entry.
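A minimal sketch of what the comments schema could look like (the body and created_at fields are assumptions; the important part is the entry reference described above):

import * as mongoose from 'mongoose';

export const CommentSchema = new mongoose.Schema({
  _id: mongoose.Schema.Types.ObjectId,
  body: String,
  created_at: { type: Date, default: Date.now },
  entry: { type: mongoose.Schema.Types.ObjectId, ref: 'Entry' },
});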
Saving relationships
Once our relationship is modelled, we need to provide a method for
saving them into our MongoDB instance.
When working with Mongoose, storing a model instance and its related
instances requires some degree of manually nesting methods.
Fortunately, async/await will make the task much easier.
Let’s modify our EntriesService to save both the received blog entry and a comment associated with it; both will be sent to the POST endpoint as different objects.
src/entries/entries.service.ts
@Component()
export class EntriesService {
constructor(
@InjectModel(EntrySchema) private readonly entryModel:
Model<Entry>,
@InjectModel(CommentSchema) private readonly commentModel:
Model<Comment>
) {}
// this method retrieves all entries
findAll() {
return this.entryModel.find().exec();
}
This way we make sure that, inside the comment, we are successfully
storing a reference to the entry the comment belongs to. By the way,
note that we store the reference by entry’s ID.
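A hedged sketch of what that create() method could look like (the exact shape of the input object is an assumption):

async create(input: { entry: any; comment: any }) {
  const { entry, comment } = input;
  entry._id = new Types.ObjectId();
  comment._id = new Types.ObjectId();
  // store the reference to the parent entry by its id
  comment.entry = entry._id;
  await new this.entryModel(entry).save();
  return new this.commentModel(comment).save();
}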
The next step should obviously be providing a way of reading from the
database the related items we now are able to save to it.
Reading relationships
As covered a few sections ago, the method that Mongoose provides for
retrieving related documents from a database at once is called
“population,” and it’s invoked with the built-in .populate() method.
We’ll see how to use this method by changing the EntryService once
again; at this point, we will deal with the findById() method.
src/entries/entries.service.ts
@Component()
export class EntriesService {
constructor(
@InjectModel(EntrySchema) private readonly entryModel:
Model<Entry>,
@InjectModel(CommentSchema) private readonly commentModel:
Model<Comment>
) {}
// this method retrieves all entries
findAll() {
return this.entryModel.find().exec();
}
Summary
NoSQL databases are a powerful alternative to “traditional” relational
ones. MongoDB is arguably the best known of the NoSQL databases in
use today, and it works with documents encoded in a JSON variant.
Using a document-based database such as MongoDB allows developers
to use more flexible, loosely-structured data models and can improve
iteration time in a fast-moving project.
You have the possibility to create a full web socket app, but you can also add some web socket features to your REST API. In this chapter, we will see how to implement web sockets on top of a REST API using the decorators provided by Nest.js, and also how to validate an authenticated user using specific middleware.
@Module({
imports: [UserModule, CommentModule],
providers: [CommentGateway]
})
export class CommentGatewayModule { }
WebSocketGateway
To implement your first module using Nest.js web sockets, you will have to use the @WebSocketGateway decorator. This decorator can take an object as an argument to provide a way to configure how to use the adapter.
To use it, you have to create your first gateway class, so imagine a UserGateway:
@WebSocketGateway({
middlewares: [AuthenticationGatewayMiddleware]
})
export class UserGateway { /*....*/ }
By default, without any parameters, the socket will use the same port
as your express server (generally 3000). As you can see, in the previous
example we used a @WebSocketGateway, which uses the default
port 3000 without namespace and with one middleware that we will see
later.
Gateways
A gateway class, decorated as previously seen, contains all of the handlers that provide the results of an event.
Nest.js comes with a decorator, @WebSocketServer, that allows you to access the server instance. You have to use it on a property of your class, as in the sketch below.
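As a hedged sketch, a CommentGateway using this decorator could look like the following (the injected CommentService is assumed from the modules shown earlier):

@WebSocketGateway()
export class CommentGateway {
  constructor(private readonly commentService: CommentService) { }

  @WebSocketServer() server;
}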
The comment service allows you to return the appropriate result for
the next handlers.
@SubscribeMessage('indexComment')
async index(client, data): Promise<WsResponse<any>> {
if (!data.entryId) throw new WsException('Missing entry id.');
@SubscribeMessage('showComment')
async show(client, data): Promise<WsResponse<any>> {
if (!data.entryId) throw new WsException('Missing entry id.');
if (!data.commentId) throw new WsException('Missing comment id.');
We now have two handlers, the indexComment and the showComment. To use
the indexComment handler we expect an entryId in order to provide the
appropriate comment, and for the showComment we expect an entryId,
and of course a commentId.
Authentication
We have set up our CommentModule, and now we want to authenticate the user using the token (have a look at the Authentication chapter). In this example we use a shared server for the REST API and the web socket event handlers, so we will share the authentication token in order to see how to validate the token received after a user has logged into the application.
@Injectable()
export class AuthenticationGatewayMiddleware implements
GatewayMiddleware {
constructor(private readonly userService: UserService) { }
resolve() {
return (socket, next) => {
if (!socket.handshake.query.auth_token) {
throw new WsException('Missing token.');
}
return jwt.verify(socket.handshake.query.auth_token,
'secret', async (err, payload) => {
if (err) throw new WsException(err);
The middleware used for the web socket is almost the same as for the REST API: we implement the GatewayMiddleware interface with its resolve function. The difference is that you have to return a function, which takes socket and the next function as its parameters. The socket contains the handshake with the query sent by the client and all of the parameters provided, in our case the auth_token.
Adapter
As mentioned in the beginning of this chapter, Nest.js comes with its own adapter, which uses socket.io. But the framework needs to be flexible, and it can be used with any third-party library. In order to provide a way to implement another library, you have the possibility to create your own adapter.
The adapter has to implement the WebSocketAdapter interface and its required methods. For example, we will use ws as the socket library in our new adapter. To use it, we will have to inject the app into the constructor as follows:
/* ... */
}
By doing this, we can get the httpServer in order to use it with the ws.
After that, we have to implement the create method in order to create
the socket server.
create(port: number) {
return new WebSocket.Server({
server: this.app.getHttpServer(),
verifyClient: ({ origin, secure, req }, next) => {
return (new WsAuthenticationGatewayMiddleware(
this.app.select(UserModule).get(UserService)
)).resolve()(req, next);
}
});
}
/* ... */
}
if (!req.token) {
throw new WsException('Missing token.');
}
/* ... */
}
To finish our new custom adapter, implement
the bindMessageHandlers in order to redirect the event and the data to
the appropriate handler of your gateway. This method will use
the bindMessageHandler in order to execute the handler and return the
result to the bindMessageHandlers method, which will return the result to
the client.
bindMessageHandler(buffer, handlers:
MessageMappingProperties[], process: (data) => Observable<any>):
Observable<any> {
const data = JSON.parse(buffer.data);
const messageHandler = handlers.find((handler) =>
handler.message === data.type);
if (!messageHandler) {
return Observable.empty();
}
const { callback } = messageHandler;
return process(callback(data));
}
/* ... */
}
Now, we have created our first custom adapter. In order to use it instead of the Nest.js IoAdapter, we have to call the useWebSocketAdapter provided by the app: INestApplication in your main.ts file as follows:
app.useWebSocketAdapter(new WsAdapter(app));
We pass the app instance into the adapter, to be used as we have seen in the previous examples.
Client side
In the previous section, we covered how to set up the web socket on
the server side and how to handle the event from the client side.
Now we will see how to set up your client side, in order to use the
Nest.js IoAdapter or our custom WsAdapter. In order to use the IoAdapter,
we have to get the socket.io-client library and set up our first HTML
file.
The file will define a simple script to connect the socket to the server with the token of the logged-in user. This token will be used to determine whether the user is properly authenticated.
<script>
const socket = io('http://localhost:3000', {
query: 'auth_token=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.
eyJlbWFpbCI6InRlc3QzQHRlc3QuZnIiLCJpYXQiOjE1MjQ5NDk3NTgs
ImV4cCI6MTUyNDk1MzM1OH0.QH_jhOWKockuV-w-vIKMgT_eLJb3dp6a
ByDbMvEY5xc'
});
</script>
As you can see, we pass an auth_token to the socket connection as a query parameter. We can pick it up from the socket handshake and then validate the socket.
socket.on('connect', function () {
socket.emit('showUser', { userId: 4 });
socket.emit('indexComment', { entryId: 2 });
socket.emit('showComment', { entryId: 2, commentId: 1 });
});
In this example, we are waiting for the connect event to know when the connection is established. Then we send three events: one to get the user, then an entry, and then the comments of the entry.
Using the following on event, we are able to get the data sent by the
server as a response to our previously emitted events.
Here we show in the console all of the data responded by the server,
and we have also implemented an event exception in order to catch all
exceptions that the server can return.
In cases where we would like to use the custom adapter, the process is
similar. We will open the connection to the server using
the WebSocket as follows:
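A minimal sketch of that connection (the authToken variable stands for the JWT obtained after login, and the name of the query parameter must match what your verifyClient middleware reads):

<script>
  const authToken = '...'; // token obtained after login
  const ws = new WebSocket('ws://localhost:3000/?token=' + authToken);
</script>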
We open the connection here on the localhost with the same port as
our HTTP server. We also pass the token as a query parameter in order
to pass the verifyClient method, which we have seen with
the WsAuthenticationGatewayMiddleware.
Then we will wait for the return of the server, to be sure that the
connection is successful and usable.
ws.onopen = function() {
console.log('open');
ws.send(JSON.stringify({ type: 'showComment', entryId: 2,
commentId: 1 }));
};
When the connection is usable, use the send method in order to send
the type of event we want to handle, which is here with showComment and
where we pass the appropriate parameters, just like we did with the
socket.io usage.
We will use the onmessage in order to get the data returned by the
server for our previous sent event. When the WebSocket receives an
event, a message event is sent to the manager that we can catch with the
following example.
ws.onmessage = function(ev) {
const _data = JSON.parse(ev.data);
console.log(_data);
};
You can now use this data as you’d like in the rest of your client app.
Summary
In this chapter you learned how to set up the server side, in order to use either the Nest.js IoAdapter or a custom ws-based adapter.
You also set up a gateway in order to handle the events sent by the
client side.
You’ve seen how to set up the client side to use the socket.io-
client or WebSocket client to connect the socket to the server. This was
done on the same port as the HTTP server, and you learned how to
send and catch the data returned by the server or the exception in case
of an error.
Server bootstrap
To get started, make sure @nestjs/microservices is installed within your
project. This module provides the client, server, and accompanying
utilities needed to convert a Nest.js API application into a
microservices application. Finally, we will modify our blog
application’s bootstrap to enable microservices.
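A sketch of how the modified bootstrap might begin (the TCP transport and port 5667 match the @Client configuration used later in this chapter); the listing then finishes by starting the microservice and the HTTP listener:

const app = await NestFactory.create(AppModule);
app.connectMicroservice({
  transport: Transport.TCP,
  options: { port: 5667 },
});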
await app.startAllMicroservicesAsync();
await app.listen(3001);
}
Configuration
The configuration parameters passed to connectMicroservice depend on the transport we use. A transport is a combination of a client and server that work in unison to transmit microservice requests and responses between the NestApplication and NestMicroservice contexts. Nest.js comes with a number of built-in transports and provides the ability to create custom transports. For now, we will use the TCP transport, and we will cover other transports later. The possible options for the TCP transport are:
@Controller()
export class UserController {
@Get('users')
public async index(@Res() res) {
const users = await this.userService.findAll();
return res.status(HttpStatus.OK).json(users);
}
@MessagePattern({cmd: 'users.index'})
public async rpcIndex() {
const users = await this.userService.findAll();
return users;
}
}
The microservice method itself can perform the same business logic
that a normal controller handler can to respond in almost the same
manner. Unlike a normal controller handler, a microservice handler has
no HTTP context. In fact, decorators like @Get, @Body, and @Req make no
sense and should not be used in a microservice controller. To complete
the processing of a message, a simple value, promise, or RxJS
Observable can be returned from the handler.
Sending data
The previous microservice handler was very contrived. It is more likely
that microservice handlers will be implemented to perform some
processing on data and return some value. In a normal HTTP handler,
we would use @Req or @Body to extract the data from the HTTP request’s
body. Since microservice handlers have no HTTP context, they take
input data as a method parameter.
@Controller()
export class UserController {
@Client({transport: Transport.TCP, options: { port: 5667 }})
client: ClientProxy
@Post('users')
public async create(@Req() req, @Res() res) {
this.client.send({cmd: 'users.index'}, {}).subscribe({
next: users => {
res.status(HttpStatus.OK).json(users);
},
error: error => {
res.status(HttpStatus.INTERNAL_SERVER_ERROR).json(error);
}
});
}
@MessagePattern({cmd: 'users.create'})
public async rpcCreate(data: any) {
if (!data || (data && Object.keys(data).length === 0))
throw new Error('Missing some information.');
await this.userService.create(data);
}
}
Exception filters
Exception filters provide a means to transform exceptions thrown
from microservice handlers into meaningful objects. For example,
our rpcCreate method currently throws an error with a string, but what happens when the UserService, or possibly the ORM, throws an error? This method could throw a number of different errors, and the only way the calling method knows what happened is to parse the error string. That’s simply unacceptable, so let’s fix it.
Start with creating a new exception class. Notice that our microservice
exception extends RpcException and does not pass a HTTP status code
in the constructor. These are the only differences between
microservice exceptions and normal Nest.js API exceptions.
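A minimal sketch of such an exception class (the validationErrors property is assumed here because the filter below reads it):

import { RpcException } from '@nestjs/microservices';
import { ValidationError } from 'class-validator';

export class RpcValidationException extends RpcException {
  constructor(public readonly validationErrors: ValidationError[] = []) {
    super('Validation failed');
  }
}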
We can now change the rpcCreate method to throw this exception when
the data is not valid.
@MessagePattern({cmd: 'users.create'})
public async rpcCreate(data: any) {
if (!data || (data && Object.keys(data).length === 0)) throw new
RpcValidationException();
await this.userService.create(data);
}
Note the throwError method is from the RxJS version 6 package. If you
are still using RxJS version 5, use Observable.throw.
@Catch(RpcValidationException)
export class RpcValidationFilter implements RpcExceptionFilter {
public catch(exception: RpcValidationException):
ErrorObservable {
return throwError({
error_code: 'VALIDATION_FAILED',
error_message: exception.getError(),
errors: exception.validationErrors
});
}
}
All we have left is to act on our new exception when it occurs. Modify
the create method to catch any exceptions thrown from the
microservice client. In the catch, check if the error_code field has a value
of VALIDATION_FAILED. When it does, we can return a 400 HTTP status
code back to the user. This will allow the user’s client, the browser, to
treat the error differently, possibly showing the user some messaging
and allowing them to fix the data entered. This provides a much better
user experience compared to throwing all errors back to the client
as 500 HTTP status code.
@Post('users')
public async create(@Body() body: any, @Res() res) {
this.client.send({cmd: 'users.create'}, body).subscribe({
next: () => {
res.status(HttpStatus.CREATED).send();
},
error: error => {
if (error.error_code === 'VALIDATION_FAILED') {
res.status(HttpStatus.BAD_REQUEST).send(error);
} else {
res.status(HttpStatus.INTERNAL_SERVER_ERROR).send(error);
}
}
});
}
Pipes
The most common pipe used with and provided by Nest.js is the
ValidationPipe. This pipe, however, cannot be used with microservice
handlers because it throws exceptions extending HttpException. All
exceptions thrown in a microservice must extend RpcException. To fix
this, we can extend the ValidationPipe, catch the HttpException, and
throw an RpcException.
@Injectable()
export class RpcValidationPipe extends ValidationPipe implements
PipeTransform<any> {
public async transform(value: any, metadata: ArgumentMetadata)
{
try {
await super.transform(value, metadata);
} catch (error) {
if (error instanceof BadRequestException) {
throw new RpcValidationException();
}
throw error;
}
return value;
}
}
class CreateUserRequest {
@IsEmail()
@IsNotEmpty()
@IsDefined()
@IsString()
public email: string;
@Length(8)
@Matches(/^(?=.*[a-z])(?=.*[A-Z])(?=.*\d)\S+$/)
@IsDefined()
@IsString()
public password: string;
@IsNotEmpty()
@IsDefined()
@IsString()
public firstName: string;
@IsNotEmpty()
@IsDefined()
@IsString()
public lastName: string;
}
@MessagePattern({cmd: 'users.create'})
@UsePipes(new RpcValidationPipe())
@UseFilters(new RpcValidationFilter())
public async rpcCreate(data: CreateUserRequest) {
await this.userService.create(data);
}
Guards
Guards in microservices serve the same purpose as they do in normal
APIs. They determine if a specific microservice handler should handle a
request. Up to now, we have used guards to protect API handlers from
unauthorized access. We should do the same thing to our microservice
handlers. Although in our application, our microservice handler is only
called from our already protected API handler, we should never
assume that will always be the case.
@Injectable()
export class RpcCheckLoggedInUserGuard implements CanActivate {
canActivate(context: ExecutionContext): boolean |
Promise<boolean> | Observable<boolean> {
const data = context.switchToRpc().getData();
return Number(data.userId) === data.user.id;
}
}
The new guard looks exactly like the API CheckLoggedInUserGuard guard.
The difference is in the parameters that are passed to
the canActivate method. Since this guard is being executed in the
context of a microservice, it will be given a microservice data object
instead of the API request object.
We use the new microservice guard the same way we did our API
guard. Simply decorate our microservice handler with @UseGuards and
our guard will now protect our microservice from misuse. Let’s make a
new microservice for retrieving the current user’s information.
@Get('users/:userId')
@UseGuards(CheckLoggedInUserGuard)
public async show(@Param('userId') userId: number, @Req() req, @Res()
res) {
this.client.send({cmd: 'users.show'}, {userId, user:
req.user}).subscribe({
next: user => {
res.status(HttpStatus.OK).json(user);
},
error: error => {
res.status(HttpStatus.INTERNAL_SERVER_ERROR).send(error);
}
});
}
@MessagePattern({cmd: 'users.show'})
@UseGuards(RpcCheckLoggedInUserGuard)
public async rpcShow(data: any) {
return await this.userService.findById(data.userId);
}
The show API handler now offloads the heavy lifting of accessing the
database to the NestMicroservice context. The guard on the
microservice handler ensures, if the handler is somehow invoked
outside of the show API handler, it will still protect the user data from
being exposed to unauthorized requests. But there is still a problem.
This example returns the entire user object from the database,
including the hashed password. This is a security vulnerability best
solved by interceptors.
Interceptors
Microservice interceptors function no differently from normal API
interceptors. The only difference is that the interceptor is passed the
data object sent to the microservice handler instead of an API request
object. This means you can actually write interceptors once and use
them in both contexts. Just like API interceptors, microservice
interceptors are executed before the microservice handler and must
return an Observable. To secure our rpcShow microservice endpoint, we
will create a new interceptor that will expect a User database object
and remove the password field.
@Injectable()
export class CleanUserInterceptor implements NestInterceptor {
intercept(context: ExecutionContext, stream$: Observable<any>):
Observable<any> {
return stream$.pipe(
map(user => JSON.parse(JSON.stringify(user))),
map(user => {
return {
...user,
password: undefined
};
})
);
}
}
@MessagePattern({cmd: 'users.show'})
@UseGuards(RpcCheckLoggedInUserGuard)
@UseInterceptors(CleanUserInterceptor)
public async rpcShow(data: any) {
return await this.userService.findById(data.userId);
}
The response from the rpcShow microservice handler will now have
the password field removed. Notice in the interceptor we had to convert
the User database object to and from JSON. This may differ depending
on the ORM you make use of. With Sequelize, we need to get the raw
data from the database response. This is because the response from
the ORM is actually a class containing many different ORM methods
and properties. By converting it to JSON and back, we can use the
spread operator with password: undefined to delete the password field.
Built-in transports
The TCP transport is only one of several transports Nest.js has built-in.
Using the TCP transport, we had to bind our NestMicroservice context
to an additional port, taking up yet another port on the server, and
ensuring our NestMicroservice context was running before starting the
NestApplication context. Other built-in transports can overcome these
limitations and add additional benefits.
Redis
Redis is a simple in-memory data store that can be used as a pub-sub
message broker. The Redis transport makes use of the redis NPM
package and a Redis server to pass messages between the
NestApplication and NestMicroservice contexts. To use the Redis
transport, we need to update our bootstrap method to use the correct
NestMicroservice configuration.
await app.startAllMicroservicesAsync();
await app.listen(3001);
}
You would also have to update all locations where you have used the @Client decorator with the same settings. Instead, let’s centralize this configuration so we are not duplicating code and can switch out the transport more easily.
await app.startAllMicroservicesAsync();
await app.listen(3001);
}
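A minimal sketch of such a shared configuration object (the file name and the Redis URL are assumptions, and exact option names can vary between Nest.js versions):

// src/microservice.config.ts
import { Transport } from '@nestjs/microservices';

export const microserviceConfig = {
  transport: Transport.REDIS,
  options: {
    url: 'redis://localhost:6379',
  },
};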
@Controller()
export class UserController {
@Client(microserviceConfig)
client: ClientProxy
}
Start up a Redis server, such as the redis Docker image, and the application, and all of our microservice transactions will now be processed through the Redis server. The diagram below shows a simplified view of the data flow when a new user is created and we are using the Redis transport.
Both the client and the server make a connection with the Redis server.
When client.send is called, the client alters the message pattern on the
fly to create pub and sub channels. The server consumes the message
and removes the message pattern modification to find the correct
microservice handler. Once processing is complete in the microservice
handler, the pattern is modified again to match the sub channel. The
client consumes this new message, unsubscribes from the sub channel,
and passes the response back to the caller.
MQTT
MQTT is a simple message protocol designed to be used when network
bandwidth is at a premium. The MQTT transport makes use of
the mqtt NPM package and a remote MQTT server to pass messages
between the NestApplication and NestMicroservice contexts. The data
flow and how the microservice client and server operate are almost
identical to the Redis transport. To make use of the MQTT transport,
let’s update the microserviceConfig configuration object.
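A hedged sketch of that updated configuration, using the default MQTT URL mentioned below:

export const microserviceConfig = {
  transport: Transport.MQTT,
  options: {
    url: 'mqtt://localhost:1883',
  },
};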
The MQTT transport can take several options, all of which are detailed
in the Github repository for the mqtt NPM package. Most notably, the
transport defaults the url option to mqtt://localhost:1883 and there is
no connection retrying. If the connection to the MQTT server is lost,
microservice messages will no longer be passed.
NATS
NATS is an open source message broker server that touts extremely
high throughput. The NATS transport makes use of the nats NPM
package and a remote NATS server to pass messages between the
NestApplication and NestMicroservice contexts.
Start up a NATS server, such as the nats Docker image, and the application, and all of our microservice transactions will now be processed through the NATS server. The diagram below shows a
simplified view of the data flow when a new user is created and we are
using the NATS transport.
Both the client and the server make a connection with the NATS server.
When client.send is called, the client alters the message pattern on the
fly to create pub and sub queues. One of the most notable differences
between the Redis transport and the NATS transport is that the NATS
transport makes use of queue groups. This means we can now have
multiple NestMicroservice contexts and the NATS server will load
balance messages between them. The server consumes the message
and removes the message pattern modification to find the correct
microservice handler. Once processing is complete in the microservice
handler, the pattern is modified again to match the sub channel. The
client consumes this new message, unsubscribes from the sub channel,
and passes the response back to the caller.
gRPC
gRPC is a remote procedural call client and server designed to be used
with Google’s Protocol Buffers. gRPC and protocol buffers are extensive
subjects worthy of their own books. For that reason, we will stick to
discussing the setup and use of gRPC within a Nest.js application. To
get started, we will need the grpc NPM package. Before we can write
any code for our Nest.js application, we must write a Protocol Buffer
file.
syntax = "proto3";
package example.nestBook;
message User {
string firstName = 1;
string lastName = 2;
string email = 3;
}
message ShowUserRequest {
double userId = 1;
}
message ShowUserResponse {
User user = 1;
}
service UserService {
rpc show (ShowUserRequest) returns
(ShowUserResponse);
}
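A hedged sketch of the microserviceConfig for the gRPC transport (the protoPath is an assumption about where the .proto file lives; the package name and url come from this chapter):

import { Transport } from '@nestjs/microservices';
import { join } from 'path';

export const microserviceConfig = {
  transport: Transport.GRPC,
  options: {
    url: '0.0.0.0:5667',
    package: 'example.nestBook',
    protoPath: join(__dirname, 'user.proto'),
  },
};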
@Controller()
export class UserController implements OnModuleInit {
@Client(microserviceConfig)
private client: ClientGrpc;
private protoUserService: IProtoUserService;
constructor(
private readonly userService: UserService
) {
}
public onModuleInit() {
this.protoUserService =
this.client.getService<IProtoUserService>('UserService');
}
}
Notice that we still have the client property decorated with @Client, but we have a new type, ClientGrpc, and a new property, protoUserService.
The client injected when using the gRPC transport no longer contains
a send method. Instead, it has a getService method that we must use to
retrieve the service we defined in our proto file. We use
the onModuleInit lifecycle hook so the gRPC service is retrieved
immediately after Nest.js has instantiated our modules and before any
clients try to use the controller APIs. The getService method is a
generic and doesn’t actually contain any method definitions. Instead,
we need to provide our own.
import { Observable } from 'rxjs';
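A minimal sketch of such an interface, mirroring the show RPC defined in the proto file:

export interface IProtoUserService {
  show(data: { userId: number }): Observable<any>;
}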
We could be a little more explicit with our interface but this gets the
point across. Now the protoUserService property in our controller will
have a show method allowing us to call the show gRPC service method.
@Get('users/:userId')
@UseGuards(CheckLoggedInUserGuard)
public async show(@Param('userId') userId: number, @Req() req, @Res()
res) {
this.protoUserService.show({ userId: parseInt(userId.toString(),
10) }).subscribe({
next: user => {
res.status(HttpStatus.OK).json(user);
},
error: error => {
res.status(HttpStatus.INTERNAL_SERVER_ERROR).json(error);
}
});
}
@GrpcMethod('UserService', 'show')
public async rpcShow(data: any) {
const user = await this.userService.findById(data.userId);
return {
user: {
firstName: user.firstName,
lastName: user.lastName,
email: user.email
}
};
}
If we had not defined the service name and service method when calling the @GrpcMethod decorator, Nest.js would have automatically mapped these values to the method and class name. In this example, that would be equivalent to @GrpcMethod('UserController', 'rpcShow').
You may have noticed that we are using 0.0.0.0:5667 as the URL of our
gRPC server. When we start up the Nest.js application, it will create a
gRPC server on the localhost and listen on port 5667. On the surface,
this may look like a more complex version of the TCP transport.
However, the power of the gRPC transport is directly derived from the
language and platform agnostic nature of protocol buffers. This means
we can create a Nest.js application that exposes microservices using
gRPC that may be used by any other language or platform, as long as it
also uses protocol buffers to connect to our microservices. We can also
create Nest.js applications that connect to microservices that may be
exposed in another language like Go.
When using the gRPC transport to connect to services at two or more
different URLs, we need to create an equal number of gRPC client
connections, one for each server. The above diagram shows how
processing would look if we offloaded the crud operations for
comments in our example blog application to a Go server. We use a
gRPC client to connect to the user microservices hosted in our Nest.js
application and a separate one to connect to the comment
microservices hosted in the Go application.
The same setup can be obtained by using any of the other transports.
However, you would have to write the extra code to serialize and
deserialize the messages between the Nest.js application and the Go
application hosting the microservice. By using the gRPC transport,
protocol buffers take care of that for you.
Custom transport
A custom transport allows you to define a new microservice client and
server for communicating between the NestApplication and
NestMicroservice contexts. You may want to create a custom transport
strategy for a number of reasons: you or your company already have a
message broker service that does not have a built-in Nest.js
transport, or you need to customize how one of the built-in transports
works. For the purpose of our example, we will work through
implementing a new RabbitMQ transport.
constructor(
private readonly url: string,
private readonly queue: string
) {
super();
}
}
public close() {
this.channel && this.channel.close();
this.server && this.server.close();
}
if (!handler) {
return this.sendMessage({
id: packet.id,
err: NO_PATTERN_MESSAGE
});
}
constructor(
private readonly url: string,
private readonly queue: string) {
super();
}
private getQueues() {
return { pub: `${this.queue}_pub`, sub: `${this.queue}_sub`
};
}
}
this.responsesSubject.asObservable().pipe(
pluck('content'),
map(content => JSON.parse(content.toString()) as WritePacket
& PacketId),
filter(message => message.id === packet.id),
take(1)
).subscribe(({err, response, isDisposed}) => {
if (isDisposed || err) {
callback({
err,
response: null,
isDisposed: true
});
}
callback({err, response});
});
this.channel.sendToQueue(sub, Buffer.from(JSON.stringify(packet)));
}
Before we can use our new transport, we will need to update the
microservice configuration object we created earlier.
await app.startAllMicroservicesAsync();
await app.listen(3001);
}
The last piece of our custom transport is in our controller. Since we are using a custom transport, we can no longer use the @Client decorator. Instead, we have to instantiate our custom transport ourselves. You could do this in the constructor like so:
@Controller()
export class UserController {
client: ClientProxy;
Wait! You have now created a hard binding between the controller and
the custom transport client. This makes it more difficult to migrate to a
different strategy in the future and very difficult to test. Instead, let’s
make use of Nest.js’s fabulous Dependency Injection to create our
client. Start off with creating a new module to house and expose our
custom transport client.
const ClientProxy = {
provide: 'ClientProxy',
useFactory: () => new
RabbitMQTransportClient(microserviceConfig.url, 'nestjs_book')
};
@Module({
imports: [],
controllers: [],
providers: [ClientProxy],
exports: [ClientProxy]
})
export class RabbitMQTransportModule {}
@Controller()
export class UserController {
constructor(
private readonly userService: UserService,
@Inject('ClientProxy')
private readonly client: ClientProxy
) {
}
The data flow and how the microservice client and server operate in
our RabbitMQ example are almost identical to the NATS transport. Just
like with NATS, RabbitMQ provides the ability to have multiple
NestMicroservice contexts consuming messages. RabbitMQ will work
to load balance between all the consumers.
Hybrid application
When we first started our microservice implementation in this chapter,
we modified the bootstrap method to call connectMicroservice. This is a
special method that converts our Nest.js application into a hybrid
application. This simply means our application now contains multiple
context types. Simple enough but this has some implications that you
should be aware of. Specifically, using the hybrid application approach,
you will no longer be able to attach global filters, pipes, guards, and
interceptors for the NestMicroservice context. This is because the
NestMicroservice context is immediately bootstrapped, but not
connected, in a hybrid application. To get around this limitation, we
can create our two contexts independently.
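A sketch of how that bootstrap might begin, assuming the shared microserviceConfig from earlier in this chapter (the fragment below then finishes by starting both contexts):

const app = await NestFactory.create(AppModule);
const rpcApp = await NestFactory.createMicroservice(AppModule, microserviceConfig);
// global filters can now be attached to the standalone microservice context
rpcApp.useGlobalFilters(new RpcValidationFilter());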
await rpcApp.listenAsync();
await app.listen(process.env.PORT || 3000);
}
@MessagePattern({cmd: 'users.create'})
public async rpcCreate(data: CreateUserRequest) {
if (!data || (data && Object.keys(data).length === 0)) throw new
RpcValidationException();
await this.userService.create(data);
}
We can extend this approach of bootstrapping our application to split
even more of our application into separate contexts. This still does not
make use of multiple processes or threads, but employing some more
advanced architecture design, we can gain those benefits.
However, that is not to say you can’t get these benefits. Nest.js just
doesn’t provide the tools out of the box. In most material found on the
subject of running a NodeJS application in production, the one thing
that is typically always covered and recommended is the use of the
NodeJS cluster module. We can do the same thing with our Nest.js
application.
await rpcApp.listenAsync();
}
if (cluster.isMaster) {
const appWorkers = [];
const rpcWorkers = [];
appWorkers.push(app);
rpcWorkers.push(rpc);
}
Summary
At the beginning of this chapter, we stated “microservice” was a
misleading name for this part of Nest.js. In fact, that could still be the
case, but it really depends on a number of factors. Our initial example
using the TCP transport could hardly qualify as a microservice by all
conventional definitions. Both the NestApplication and
NestMicroservice context were executing from the same process,
meaning a catastrophic failure in one could bring both down.
In the next chapter we will learn about routing and request handling in
Nest.js.
Chapter 10. Routing and request handling
in Nest.js
Routing and request handling in Nest.js is handled by the controllers
layer. Nest.js routes requests to handler methods, which are defined
inside controller classes. Adding a routing decorator such as @Get() to
a method in a controller tells Nest.js to create an endpoint for this
route path and route every corresponding request to this handler.
Request handlers
A basic GET request handler for the /entries route registered in the
EntryController could look like this:
@Controller('entries')
export class EntryController {
    @Get()
    index(): Entry[] {
        const entries: Entry[] = this.entriesService.findAll();
        return entries;
    }
}
The route path can also be declared on the method decorator instead of the controller decorator, producing the same /entries endpoint:
@Controller()
export class EntryController {
    @Get('entries')
    index(): Entry[] {
        const entries: Entry[] = this.entriesService.findAll();
        return entries;
    }
}
Generating responses
Nest.js provides two approaches for generating responses.
Standard approach
Using the standard and recommended approach, which has been
available since Nest.js 4, Nest.js will automatically serialize the
JavaScript object or array returned from the handler method to JSON
and send it in the response body. If a string is returned, Nest.js will just
send the string without serializing it to JSON.
The default response status code is 200, except for POST requests, which use 201. The response code can easily be changed for a handler method by using the @HttpCode(...) decorator. For example:
@HttpCode(204)
@Post()
create() {
// This handler will return a 204 status response
}
Express approach
An alternate approach to generating responses in Nest.js is to use a
response object directly. You can ask Nest.js to inject a response object
into a handler method using the @Res() decorator. Nest.js uses Express response objects.
You can rewrite the response handler seen earlier using a response
object as shown here.
@Controller('entries')
export class EntryController {
@Get()
index(@Res() res: Response) {
const entries: Entry[] = this.entriesService.findAll();
return res.status(HttpStatus.OK).json(entries);
}
}
The typings for the Response object come from express. Add
the @types/express package to your devDependencies in package.json to
use these typings.
Route parameters
Nest.js makes it easy to accept parameters from the route path. To do
so, you simply specify route parameters in the path of the route as
shown below.
@Controller('entries')
export class EntryController {
@Get(':entryId')
show(@Param() params) {
const entry: Entry =
this.entriesService.find(params.entryId);
return entry;
}
}
@Controller('entries')
export class EntryController {
@Get(':entryId')
show(@Param('entryId') entryId) {
const entry: Entry = this.entriesService.findOne(entryId);
return entry;
}
}
Request body
To access the body of a request, use the @Body() decorator.
@Controller('entries')
export class EntryController {
@Post()
create(@Body() body: Entry) {
this.entryService.create(body);
}
}
Request object
To access the client request details, you can ask Nest.js to inject the
request object into a handler using the @Req() decorator. Nest.js
uses express request objects.
For example,
@Controller('entries')
export class EntryController {
@Get()
index(@Req() req: Request): Entry[] {
const entries: Entry[] = this.entriesService.findAll();
return entries;
}
The typings for the Request object come from express. Add
the @types/express package to your devDependencies in package.json to
use these typings.
Asynchronous handlers
All of the examples shown so far in this chapter assume that handlers
are synchronous. In a real application, many handlers will need to be
asynchronous.
Async/await
Nest.js has support for async request handler functions.
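For example, assuming the same entryService used in the Promise example below, whose findAll() returns a Promise, an async handler could look like this:
@Controller('entries')
export class EntryController {
    @Get()
    async index(): Promise<Entry[]> {
        // Await the asynchronous service call before returning the entries.
        const entries: Entry[] = await this.entryService.findAll();
        return entries;
    }
}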
Promise
Similarly, you can also just return a promise from a handler function
directly without using async/await.
@Controller('entries')
export class EntryController {
@Get()
index(): Promise<Entry[]> {
const entriesPromise: Promise<Entry[]> =
this.entryService.findAll();
return entriesPromise;
}
Observables
Nest.js request handlers can also return RxJS Observables.
@Controller('entries')
export class EntryController {
@Get()
    index(): Observable<Entry[]> {
        const entries$: Observable<Entry[]> = this.entryService.findAll();
        return entries$;
    }
}
Error responses
Nest.js has an exception layer, which is responsible for catching
unhandled exceptions from request handlers and returning an
appropriate response to the client.
HttpException
If an exception thrown from a request handler is an HttpException, the global exception filter will transform it into a JSON response.
For example, you can throw an HttpException from the create() handler
function if the body is not valid as shown.
@Controller('entries')
export class EntryController {
@Post()
create(@Body() entry: Entry) {
if (!entry) throw new HttpException('Bad request',
HttpStatus.BAD_REQUEST);
this.entryService.create(entry);
}
}
"statusCode": 400,
"message": "Bad request"
}
@Controller('entries')
export class EntryController {
    @Post()
    create(@Body() entry: Entry) {
        if (!entry) throw new HttpException({ status: HttpStatus.BAD_REQUEST, error: 'Entry required' }, HttpStatus.BAD_REQUEST);
        this.entryService.create(entry);
    }
}
"statusCode": 400,
"error": "Entry required"
}
Unrecognized exceptions
If the exception is not recognized, meaning it is not HttpException or a
class that inherits from HttpException, then the client will receive the
JSON response below.
"statusCode": 500,
"message": "Internal server error"
}
Summary
With the help of using the EntryController from our example blog
application, this chapter has covered aspects of routing and request
handling in Nest.js. You should now understand various approaches
that you can use to write request handlers.
Document Settings
Each swagger document can contain a basic set of properties such as
the title of the application. This information can be configured using
the various public methods found on the DocumentBuilder class. These
methods all return the document instance allowing you to chain as
many of the methods as you need. Be sure to finish your configuration
before calling the build method. Once the build method has been called,
the document settings are no longer modifiable.
These methods are used to configure the info section of the swagger
document. The swagger specification requires
the title and version fields to be provided, but Nest.js will default these
values to an empty string and "1.0.0", respectively. If your project has
terms of service and a license, you can
use setTermsOfService and setLicense to provide a URL to those
resources within your application.
The setHost method should contain only the server name and port used to access the APIs.
If, in your application, you use setGlobalPrefix to configure a base path
for the Nest.js application, set the same value in the swagger document
using setBasePath. The swagger specification uses a schemes array to
describe the transfer protocol used by the APIs. While the swagger
specification supports the ws and wss protocols as well as multiple
values, Nest.js limits the value to either http or https. Metadata and
external documentation can also be added to provide users of the
swagger document additional details regarding how the APIs work.
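As a sketch, a chained configuration could look like the following; the titles, host, and paths are placeholder values, not the book's exact configuration:
const swaggerOptions = new DocumentBuilder()
    .setTitle('Blog Application')
    .setDescription('APIs for the example blog application')
    .setVersion('1.0.0')
    .setHost('localhost:3000')
    .setBasePath('/api')
    .setSchemes('http')
    .build();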
Documenting authentication
The swagger specification supports three types of authentication:
basic, API key, and Oauth2. Nest.js provides two different methods that
can be used to auto-configure the swagger document authentication
information with the possibility for some settings to be overridden.
Keep in mind, this is describing how users will authenticate with your
application.
The first parameter to the addOAuth2 method is the OAuth2 flow the
APIs use for authentication. In this example, we use the password flow to
indicate the user should send a username and password to the API.
You can also use the implicit, application, and accessCode flows. The second
and third parameters are the URLs where the user will authorize
access to the APIs and request a refresh token, respectively. The last
parameter is an object of all the scopes with descriptions that are
available in the application.
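For instance, the password flow described above could be added to the builder from the previous sketch; the URLs and scope names here are placeholders:
const swaggerOptions = new DocumentBuilder()
    // ...the document settings shown earlier...
    .addOAuth2(
        'password',                                   // OAuth2 flow
        'https://blog.example.com/oauth/authorize',    // authorization URL
        'https://blog.example.com/oauth/token',        // token/refresh URL
        { read: 'Read blog entries', write: 'Create and update blog entries' } // available scopes
    )
    .build();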
For the blog application, we will keep the configuration simple and
store the configuration in a new file in the shared/config directory.
Having a central location will let us write the configuration once and
implement multiple times.
Our first implementation will use the configuration and the Nest.js
swagger module to produce two new endpoints in our application: one
to serve the swagger UI application and one to serve the swagger
document as raw JSON.
Swagger UI
The swagger module is unlike most other Nest.js modules. Instead of
being imported into your application’s primary app module, the
swagger module is configured within the main bootstrap of your
application.
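A minimal sketch of that bootstrap wiring, assuming the swaggerOptions configuration built above and a '/swagger' mount path, might look like this:
async function bootstrap() {
    const app = await NestFactory.create(AppModule);
    // Builds the swagger document from the application's decorator metadata.
    const document = SwaggerModule.createDocument(app, swaggerOptions);
    // Serves the Swagger UI at /swagger and the raw JSON document alongside it.
    SwaggerModule.setup('/swagger', app, document);
    await app.listen(process.env.PORT || 3000);
}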
If you have followed along with the book and created the blog
application, you may find that the Swagger UI produced does not
contain a lot of information about the APIs in the application. Since the
swagger document is built using Typescript decorator metadata, you
may need to alter your types or make use of the additional decorators
found in the Nest.js swagger module.
@Controller('entries/:entryId')
export class CommentController {
@Put('comments/:commentId')
public async update(
@Body() body: UpdateCommentRequest,
@Param('commentId') comment: string,
@Query('testQuery') testQuery: string,
@Headers('testHeader') testHeader: string
) {
}
}
From the example, we can see the header of this API card uses a
combination of the @Controller and @Put decorators to construct the
path to the API. The parameters section is built using
the @Body, @Param, @Query, and @Headers decorators. The types we provide for the decorated parameters are used in the Swagger UI as a hint to the user regarding what is expected in each parameter.
Clicking the Try it out button in the header of the API card changes the
card into a set of inputs. This allows the user to fill in the required and
optional parameters of the API and execute the API call. We will cover
the remaining sections of the API card later. For now, let’s review the
basic parameter decorators in more detail.
@Body
You may have noticed in our example, the parameter we decorated
with @Body had a type of UpdateCommentRequest. Your application may or
may not have this class already. If not, let’s write it now.
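A minimal sketch of such a request class, assuming the comment content is carried in a single body property:
export class UpdateCommentRequest {
    @ApiModelPropertyOptional()
    public body: string;
}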
The request class is very basic and makes use of the first decorator we will cover from the Nest.js swagger module, @ApiModelPropertyOptional. This decorator informs the swagger module that the body property of the request class is an optional property that can be included in the request body when calling the API. It is actually a shortcut for the @ApiModelProperty decorator, which can express the same thing with an explicit required option.
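A sketch of that equivalent, longer form:
export class UpdateCommentRequest {
    @ApiModelProperty({ required: false })
    public body: string;
}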
The property that has been decorated with the @Body decorator should
always have a type that is a class. Typescript interfaces cannot be
decorated and do not provide the same metadata that a class with
decorators can. If, in your application, any of your APIs have a parameter with the @Body decorator and an interface type, the Nest.js swagger module will not be able to correctly create the swagger document. In fact, the Swagger UI will most likely not display the body parameter at all.
@Param
The @Param decorator in our example contained a string value indicating
which URL parameter to use for the comment parameter of our
controller method. When the Nest.js swagger module encounters this
decorator with the provided string, it is able to determine the name of
the URL parameter and includes it in the swagger document along with
the type provided for the method parameter. However, we could have
also written the controller method without passing a string to
the @Param decorator to get an object containing all of the URL
parameters. If we do this, Nest.js will only be able to determine the
names and types of the URL parameters if we use a class as the type for
the comment parameter or use the @ApiImplicitParam decorator provided
by the Nest.js swagger module on the controller method. Let’s create a
new class to describe our URL params and see how it affects the
swagger UI.
export class UpdateCommentParams {
    @ApiModelProperty()
    public entryId: string;
    @ApiModelProperty()
    public commentId: string;
}
@Put('comments/:commentId')
public async update(
@Body() body: UpdateCommentRequest,
@Param() params: UpdateCommentParams,
@Query('testQuery') testQuery: string,
@Headers('testHeader') testHeader: string
) {
}
The swagger UI has been updated to show the comment put API takes
two required URL parameters: entryId and commentId. If you will be writing APIs that use a single parameter in your controller method to hold all of the URL parameters, a class should be your preferred method of informing the Nest.js swagger module what to expect as URL parameters. Using a class as the type for your URL parameters not only informs the Nest.js swagger module of the URL parameters, it also helps in writing your application by providing type checking and code auto-completion.
If, however, you don't want to make a new class to use as the type for your URL parameters, you use an interface instead, or one or more of the URL parameters are handled in a Nest.js guard, middleware, or custom decorator but not in the controller method, you can still inform the Nest.js swagger module about the URL parameters using the @ApiImplicitParam decorator.
@Put('comments/:commentId')
@ApiImplicitParam({ name: 'entryId' })
public async update(
@Body() body: UpdateCommentRequest,
@Param('commentId') comment: string,
@Query('testQuery') testQuery: string,
@Headers('testHeader') testHeader: string
) {
}
@Put('comments/:commentId')
@ApiImplicitParam({ name: 'entryId' })
@ApiImplicitParam({ name: 'commentId' })
public async update(
@Body() body: UpdateCommentRequest,
@Query('testQuery') testQuery: string,
@Headers('testHeader') testHeader: string
) {
}
@Query
The @Query decorator in our example contained a string value indicating
which query parameter to use for the testQuery parameter of our
controller method. When the Nest.js swagger module encounters this
decorator with the provided string, it is able to determine the name of
the query parameter and includes it in the swagger document along
with the type provided for the method parameter. However, we could have also written the controller method without passing a string to
the @Query decorator to get an object containing all the query
parameters. If we do this, Nest.js will only be able to determine the
names and types of the query parameters if we use a class as the type
for the testQuery parameter or use the @ApiImplicitQuery decorator
provided by the Nest.js swagger module on the controller method.
Let’s create a new class to describe our query params and see how it
affects the Swagger UI.
export class UpdateCommentQuery {
    @ApiModelPropertyOptional()
    public testQueryA: string;
    @ApiModelPropertyOptional()
    public testQueryB: string;
}
In the UpdateCommentQuery class, we have created two properties and used the @ApiModelPropertyOptional decorator so the Nest.js swagger module knows to include these properties with their types in the swagger document. We can change our comment PUT controller method to use the new class.
@Put('comments/:commentId')
public async update(
@Body() body: UpdateCommentRequest,
@Param('commentId') comment: string,
@Query() queryParameters: UpdateCommentQuery,
@Headers('testHeader') testHeader: string
) {
}
However, if you do not wish to make a new class to use as the type for your query parameters, you use an interface instead, or the query parameters are used in a Nest.js guard, middleware, or custom decorator but not in the controller method, you can still inform the Nest.js swagger module about the query parameters using the @ApiImplicitQuery decorator.
@Put('comments/:commentId')
@ApiImplicitQuery({ name: 'testQueryA' })
@ApiImplicitQuery({ name: 'testQueryB' })
public async update(
@Param('commentId') comment: string,
@Body() body: UpdateCommentRequest,
@Query() testQuery: any,
@Headers('testHeader') testHeader: string
) {
}
@Put('comments/:commentId')
@ApiImplicitQuery({ name: 'testQueryA' })
@ApiImplicitQuery({ name: 'testQueryB' })
public async update(
@Param('commentId') comment: string,
@Body() body: UpdateCommentRequest,
@Headers('testHeader') testHeader: string
) {
}
@Headers
The @Headers decorator in our example contained a string value
indicating which request header value to use for
the testHeader parameter of our controller method. When the Nest.js
swagger module encounters this decorator with the provided string, it
is able to determine the name of the request header and includes it in
the swagger document along with the type provided for the method
parameter. However, we could have also written the controller method
without passing a string to the @Headers decorator to get an object
containing all the request headers. If we do this, Nest.js will only be
able to determine the names and types of the request headers if we use
a class as the type for the testHeader parameter or use
the @ApiImplicitHeader decorator provided by the Nest.js swagger module on the controller method. Let's create a new class to describe our expected headers and see how it affects the Swagger UI.
export class UpdateCommentHeaders {
    @ApiModelPropertyOptional()
    public testHeaderA: string;
    @ApiModelPropertyOptional()
    public testHeaderB: string;
}
@Put('comments/:commentId')
public async update(
@Body() body: UpdateCommentRequest,
@Param('commentId') comment: string,
@Query('testQuery') testQuery: string,
@Headers() headers: UpdateCommentHeaders
) {
}
If, however, you do not wish to make a new class to use as the type for your expected headers, you use an interface instead, or the headers are used in a Nest.js guard, middleware, or custom decorator but not in the controller method, you can still inform the Nest.js swagger module about the request headers using the @ApiImplicitHeader or @ApiImplicitHeaders decorators.
@Put('comments/:commentId')
@ApiImplicitHeader({ name: 'testHeader' })
public async update(
@Body() body: UpdateCommentRequest,
@Param('commentId') comment: string,
@Query('testQuery') testQuery: string,
@Headers() headers: any
) {
}
@Put('comments/:commentId')
@ApiImplicitHeader({ name: 'testHeaderA' })
@ApiImplicitHeader({ name: 'testHeaderB' })
public async update(
@Body() body: UpdateCommentRequest,
@Param('commentId') comment: string,
@Query('testQuery') testQuery: string,
) {
}
@Put('comments/:commentId')
@ApiImplicitHeaders([
    { name: 'testHeaderA' },
    { name: 'testHeaderB' }
])
public async update(
@Body() body: UpdateCommentRequest,
@Param('commentId') comment: string,
@Query('testQuery') testQuery: string,
) {
}
Authentication
It is very likely that you will need to have some form of authentication
in your application at some point. The blog example application uses
a username and password combination to authenticate a user and
provides a JSON web token to allow the user to access the APIs.
However you decide to set up authentication, one thing is for sure: you will require either query parameters or headers to maintain an authentication state, and you will most likely use Nest.js middleware or
guards to check a user’s authentication state. You do this because
writing that code in every controller method creates a lot of code
duplication and would complicate every controller method.
@Put('comments/:commentId')
@ApiBearerAuth()
public async update(
@Body() body: UpdateCommentRequest,
@Param('commentId') comment: string,
@Query('testQuery') testQuery: string,
@Headers('testHeader') testHeader: string
) {
}
@Put('comments/:commentId')
@ApiOAuth2Auth(['test'])
public async update(
@Body() body: UpdateCommentRequest,
@Param('commentId') comment: string,
@Query('testQuery') testQuery: string,
@Headers('testHeader') testHeader: string
) {
}
This example depicts a single controller method, API, that requires a
specific set of OAuth2 roles to be able to use the API.
The @ApiOAuth2Auth decorator takes an array of all the roles the user
should have in order to have access to the API.
All of the APIs we have covered in our example blog application follow a typical model of accepting inputs in the form of JSON. However, it is possible that an application may need to take a different input type,
often referred to as a MIME type. For example, we could allow users of
our example blog application to upload an avatar image. An image
cannot easily be represented as JSON so we would need to build an API
that takes an input MIME type of image/png. We can ensure this
information is present in our application’s swagger document by using
the @ApiConsumes decorator.
@Put('comments/:commentId')
@ApiConsumes('image/png')
public async update(
@Body() body: UpdateCommentRequest,
@Param('commentId') comment: string,
@Query('testQuery') testQuery: string,
@Headers('testHeader') testHeader: string
) {
}
Similarly, the @ApiProduces decorator informs users of the swagger document which MIME type they can expect the API to return.
@Put('comments/:commentId')
@ApiProduces('image/png')
public async update(
@Body() body: UpdateCommentRequest,
@Param('commentId') comment: string,
@Query('testQuery') testQuery: string,
@Headers('testHeader') testHeader: string
) {
}
The @ApiResponse decorator describes the status codes and response types an API can return. It can be placed on the controller class for responses shared by every method, or on an individual controller method.
@Controller('entries/:entryId')
@ApiResponse({
status: 500,
description: 'An unknown internal server error occurred'
})
export class CommentController {
@Put('comments/:commentId')
@ApiResponse({
status: 200,
description: 'The comment was successfully updated',
type: UpdateCommentResponse
})
public async update(
@Body() body: UpdateCommentRequest,
@Param('commentId') comment: string,
@Query('testQuery') testQuery: string,
@Headers('testHeader') testHeader: string
) {
}
}
The @ApiOperation decorator provides a title, a description, and a unique operationId for an individual API operation in the swagger document.
@Put('comments/:commentId')
@ApiOperation({
title: 'Comment Update',
description: 'Updates a specific comment with new content',
operationId: 'commentUpdate'
})
public async update(
@Body() body: UpdateCommentRequest,
@Param('commentId') comment: string,
@Query('testQuery') testQuery: string,
@Headers('testHeader') testHeader: string
) {
}
Lastly, the @ApiUseTags decorator assigns one or more tags to all of the APIs in a controller so they are grouped together in the Swagger UI.
@Controller('entries/:entryId')
@ApiUseTags('comments')
export class CommentController {
}
The swagger document does not have to be served by the running application; it can also be generated and written to disk with a small script:
async function writeDoc() {
    const app = await NestFactory.create(AppModule);
    // swaggerOptions is the DocumentBuilder configuration kept in shared/config
    const document = SwaggerModule.createDocument(app, swaggerOptions);
    fs.ensureDirSync(path.join(process.cwd(), 'dist'));
    fs.writeJsonSync(path.join(process.cwd(), 'dist', 'api-doc.json'), document, { spaces: 2 });
}
writeDoc();
You can place this file in the root of your project or in the source
directory and use an NPM script entry to execute it or run it using
NodeJS. The example code will use the Nest.js swagger module to build
a swagger document and fs-extras to write the document to
the dist directory as a JSON file.
Summary
In this chapter, we covered how the Nest.js swagger module makes use
of the existing decorators you use in your application to create a
swagger v2 specification document. We also covered all the additional
decorators the Nest.js swagger module provides to enhance the
information in the swagger document. We also set up the example blog application to expose the Swagger UI.
But what happens when we are dealing with a large scale application
that may have unique and complex business logic for saving data?
Or maybe we would like to initiate some logic in the background so
the UI is able to call APIs without having to wait for all the business
logic to finish. These are areas where CQRS makes sense. CQRS can
be used to isolate and break apart complex business logic, initiate
that business logic synchronously or asynchronously, and compose
the isolated pieces to solve new business problems.
To ensure the UI does not suffer any performance loss, all keyword
entity operations will be done asynchronously. Keywords will be
stored on the blog entry entity as a string to provide the UI a quick
reference without having to query the keyword table in the database.
Before getting started, be sure you ran npm install @nestjs/cqrs in
your project. To see a working example, remember you can clone the
accompanying Git repository for this book:
git clone https://github.com/backstopmedia/nest-book-example.git
Notice some differences with the update and delete commands? For
the update command, we need to know which database model we are
updating. Likewise, for the delete command, we only need to know the
id of the database model we are deleting. In both cases, having
the userId does not make sense since a blog entry can never be moved
to another user and the userId has no influence on the deletion of a
blog entry.
Command handlers
Now that we have commands for our database write operations, we
need some command handlers. Each command should have an
accompanying handler in a one-to-one fashion. The command handler
is much like our current blog entry service. It will take care of all the
database operations. Typically, the command handlers are placed in a
sub-directory of the module similar to commands/handlers.
@CommandHandler(CreateEntryCommand)
export class CreateEntryCommandHandler implements
ICommandHandler<CreateEntryCommand> {
constructor(
@Inject('EntryRepository') private readonly
entryRepository: typeof Entry,
@Inject('SequelizeInstance') private readonly
sequelizeInstance
) { }
resolve();
}
If you are following along with the example project, you may notice
our execute method looks almost like the create method of the blog
entry service. In fact, almost all of the code for the command handler is
a direct copy from the blog entry service. The big difference is that we do not return a value. Instead, the execute method of every command handler takes a callback as its second argument.
} finally {
resolve();
}
}
Notice we invoke the resolve callback in the finally block. This is done
to ensure that, no matter the outcome, the command handler will
complete execution and the API will finish processing. But what happens when an exception is thrown from our ORM? The blog entry wasn't saved to the database, but since the API controller did not know an error occurred, it will return a 200 HTTP status to the UI. To prevent this, we can catch the error and pass it as an argument to the resolve method. This deviates slightly from the pure CQRS pattern, but it is better to let the UI know something went wrong than to assume the blog entry was saved.
try {
await this.sequelizeInstance.transaction(async transaction
=> {
return await
this.entryRepository.create<Entry>(command, {
returning: true,
transaction
});
});
} catch (error) {
caught = error
} finally {
resolve(caught);
}
}
Note: Nest.js does not provide any stipulation for when the callback
method must be invoked. We could invoke the callback at the
beginning of the execute method. Nest.js would return processing back
to the controller, so the UI is immediately updated, and process the remaining pieces of the execute method afterwards.
@CommandHandler(UpdateEntryCommand)
export class UpdateEntryCommandHandler implements
ICommandHandler<UpdateEntryCommand> {
constructor(
@Inject('EntryRepository') private readonly
entryRepository: typeof Entry,
@Inject('SequelizeInstance') private readonly
sequelizeInstance: Sequelize,
private readonly databaseUtilitiesService:
DatabaseUtilitiesService
) { }
try {
await this.sequelizeInstance.transaction(async
transaction => {
let entry = await
this.entryRepository.findById<Entry>(command.id, { transaction });
if (!entry) throw new Error('The blog entry was
not found.');
entry = this.databaseUtilitiesService.assign(
entry,
{
...command,
id: undefined
}
);
return await entry.save({
returning: true,
transaction,
});
});
} catch (error) {
caught = error
} finally {
resolve(caught);
}
}
}
@CommandHandler(DeleteEntryCommand)
export class DeleteEntryCommandHandler implements
ICommandHandler<DeleteEntryCommand> {
constructor(
@Inject('EntryRepository') private readonly
entryRepository: typeof Entry,
@Inject('SequelizeInstance') private readonly
sequelizeInstance: Sequelize
) { }
async execute(command: DeleteEntryCommand, resolve: (error?:
Error) => void) {
let caught: Error;
try {
await this.sequelizeInstance.transaction(async
transaction => {
return await this.entryRepository.destroy({
where: { id: command.id },
transaction,
});
});
        } catch (error) {
            caught = error;
        } finally {
            resolve(caught);
        }
    }
}
@Controller()
export class EntryController {
constructor(
private readonly entryService: EntryService,
private readonly commandBus: CommandBus
) { }
@Post('entries')
public async create(@User() user: IUser, @Body() body: any,
@Res() res) {
if (!body || (body && Object.keys(body).length === 0))
return res.status(HttpStatus.BAD_REQUEST).send('Missing some
information.');
if (error) {
return
res.status(HttpStatus.INTERNAL_SERVER_ERROR).send(result);
} else {
return res.set('location',
`/entries/${result.id}`).status(HttpStatus.CREATED).send();
}
}
@Controller()
export class EntryController {
constructor(
private readonly entryService: EntryService,
private readonly commandBus: CommandBus
) { }
@Get('entries')
public async index(@User() user: IUser, @Res() res) {
const entries = await this.entryService.findAll();
return res.status(HttpStatus.OK).json(entries);
}
@Post('entries')
public async create(@User() user: IUser, @Body() body: any,
@Res() res) {
if (!body || (body && Object.keys(body).length === 0))
return res.status(HttpStatus.BAD_REQUEST).send('Missing some
information.');
if (error) {
return
res.status(HttpStatus.INTERNAL_SERVER_ERROR).send(result);
} else {
return res.set('location',
`/entries/${result.id}`).status(HttpStatus.CREATED).send();
}
}
@Get('entries/:entryId')
public async show(@User() user: IUser, @Entry() entry: IEntry,
@Res() res) {
return res.status(HttpStatus.OK).json(entry);
}
@Put('entries/:entryId')
public async update(@User() user: IUser, @Entry() entry: IEntry,
@Param('entryId') entryId: number, @Body() body: any, @Res() res) {
if (user.id !== entry.userId) return
res.status(HttpStatus.NOT_FOUND).send('Unable to find the entry.');
const error = await this.commandBus.execute(new
UpdateEntryCommand(
entryId,
body.title,
body.content,
user.id
));
if (error) {
return
res.status(HttpStatus.INTERNAL_SERVER_ERROR).send(error);
} else {
return res.status(HttpStatus.OK).send();
}
}
@Delete('entries/:entryId')
public async delete(@User() user: IUser, @Entry() entry: IEntry,
@Param('entryId') entryId: number, @Res() res) {
if (user.id !== entry.userId) return
res.status(HttpStatus.NOT_FOUND).send('Unable to find the entry.');
const error = await this.commandBus.execute(new
DeleteEntryCommand(entryId));
if (error) {
return
res.status(HttpStatus.INTERNAL_SERVER_ERROR).send(error);
} else {
return res.status(HttpStatus.OK).send();
}
}
}
You can see from the example that the controller has been updated so
the blog entry service is only used for retrievals and all modification
methods now dispatch commands on the command bus. The last thing
we need to configure is the blog entry module. To make this easier,
let's first set up a TypeScript barrel to export all our handlers as a single variable, as sketched below.
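A barrel for the entry command handlers could be as simple as the following sketch; the file corresponds to src/modules/entry/commands/handlers/index.ts in the repository layout shown later, and the exported name EntryCommandHandlers matches the module registration below:
import { CreateEntryCommandHandler } from './createEntry.handler';
import { UpdateEntryCommandHandler } from './updateEntry.handler';
import { DeleteEntryCommandHandler } from './deleteEntry.handler';

// Export all entry command handlers as a single array for registration.
export const EntryCommandHandlers = [
    CreateEntryCommandHandler,
    UpdateEntryCommandHandler,
    DeleteEntryCommandHandler
];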
Import the barrel into the blog entry module and hook up the module
to the command bus.
@Module({
    imports: [CQRSModule],
    controllers: [EntryController],
    components: [entryProvider, EntryService, ...EntryCommandHandlers],
    exports: [EntryService]
})
export class EntryModule implements NestModule, OnModuleInit {
    public constructor(
        private readonly moduleRef: ModuleRef,
        private readonly commandBus: CommandBus
    ) {}

    public onModuleInit() {
        this.commandBus.setModuleRef(this.moduleRef);
        this.commandBus.register(EntryCommandHandlers);
    }
}
@Table(tableOptions)
export class Entry extends Model<Entry> {
@Column({
type: DataType.TEXT,
allowNull: true,
})
public keywords: string;
The ORM definition for the new database column will depend on the
ORM and database server you are using. Here, we are using
the TEXT data type. This data type is widely supported in many different
database servers and offers a large limit to the amount of data we can
store. For example, Microsoft SQL Server limits this field to a maximum
of 2^30 - 1 characters, while Postgres does not impose a limit. Since we
are using an ORM with migrations, we will also need to create a
migration script. If you are unsure of how to do this, reference back to
the TypeORM or Sequelize chapters.
If you are following along, your entries database table should now have
a keywords column. Testing the index API in the blog entries controller
should now return objects with a keywords value. We still need to
update the blog entry commands, command handlers, and controller to
process keywords for new and updated blog entries.
@Controller()
export class EntryController {
@Post('entries')
public async create(@User() user: IUser, @Body() body: any,
@Res() res) {
if (!body || (body && Object.keys(body).length === 0))
return res.status(HttpStatus.BAD_REQUEST).send('Missing some
information.');
if (error) {
return
res.status(HttpStatus.INTERNAL_SERVER_ERROR).send(result);
} else {
return res.set('location',
`/entries/${result.id}`).status(HttpStatus.CREATED).send();
}
}
@Put('entries/:entryId')
public async update(@User() user: IUser, @Entry() entry: IEntry,
@Param('entryId') entryId: number, @Body() body: any, @Res() res) {
if (user.id !== entry.userId) return
res.status(HttpStatus.NOT_FOUND).send('Unable to find the entry.');
const error = await this.commandBus.execute(new
UpdateEntryCommand(
entryId,
body.title,
body.content,
body.keywords,
user.id
));
if (error) {
return
res.status(HttpStatus.INTERNAL_SERVER_ERROR).send(error);
} else {
return res.status(HttpStatus.OK).send();
}
}
}
@CommandHandler(CreateEntryCommand)
export class CreateEntryCommandHandler implements
ICommandHandler<CreateEntryCommand> {
try {
await this.sequelizeInstance.transaction(async
transaction => {
return await this.EntryRepository.create<Entry>({
...command,
keywords: JSON.stringify(command.keywords)
}, {
returning: true,
transaction
});
});
} catch (error) {
caught = error;
} finally {
resolve(caught);
}
}
}
@CommandHandler(UpdateEntryCommand)
export class UpdateEntryCommandHandler implements
ICommandHandler<UpdateEntryCommand> {
                entry = this.databaseUtilitiesService.assign(
                    entry,
                    {
                        ...command,
                        id: undefined,
                        keywords: JSON.stringify(command.keywords)
                    }
                );
                return await entry.save({
                    returning: true,
                    transaction,
                });
            });
});
} catch (error) {
caught = error;
} finally {
resolve(caught);
}
}
}
Both
the CreateEntryCommandHandler and UpdateEntryCommandHandler command
handlers have been updated to convert the keywords string array into
a JSON string. Keywords also need to be stored individually in their
own table with a list of blog entries they apply to and the last updated
date. To do this, we will need to make a new Nest.js module with an
entity. We will come back later to add more functionality. First, create
the new entity.
@Table(tableOptions)
export class Keyword extends Model<Keyword> {
@Column({
type: DataType.STRING,
allowNull: false,
validate: {
isUnique: async (value: string, next: any):
Promise<any> => {
const isExist = await Keyword.findOne({ where: {
keyword: value } });
if (isExist) {
const error = new Error('The keyword
already exists.');
next(error);
}
next();
},
},
})
public keyword: string;
@CreatedAt
public createdAt: Date;
@UpdatedAt
public updatedAt: Date;
@DeletedAt
public deletedAt: Date;
}
@Table(tableOptions)
export class KeywordEntry extends Model<KeywordEntry> {
@ForeignKey(() => Keyword)
@Column({
type: DataType.BIGINT,
allowNull: false
})
public keywordId: number;
@ForeignKey(() => Entry)
@Column({
type: DataType.BIGINT,
allowNull: false
})
public entryId: number;
@CreatedAt
public createdAt: Date;
}
Our ORM will use the @ForeignKey decorators to link entries in this
database table to the keywords and entries tables. We are also adding
a createdAt column to help us find the latest keywords that have been
linked to a blog entry. We will use this to create our list of “hot
keywords.” Next, create the migration script to add the new tables to
the database.
@Module({
imports: [],
controllers: [],
components: [keywordProvider, keywordEntryProvider],
exports: []
})
export class KeywordModule {}
Keyword events
Events can be thought of as commands with a few differences. Outside
of not being module scoped, they are also asynchronous, they are
typically dispatched by models or entities, and each event can have any
number of event handlers. This makes them perfect for handling
background updates to the keywords database table when blog entries
are created and updated.
Before we start writing code, let’s give some thought to how we want
our application to work. When a new blog entry is created, the
application needs to inform the keyword module that a blog entry has
been associated with a keyword. We should leave it up to the keyword
module to determine if the keyword is new and needs to be created or
already exists and simply needs to be updated. The same logic should
apply to updates made to blog entries but we can make our blog entry
update process simpler if we do not try to determine which keywords
are new and which have been removed. To support both scenarios, we
should create a generic event to update all keyword links for the blog
entry.
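A sketch of that generic event; its constructor arguments match how it is dispatched from the entry model later in this chapter:
import { IEvent } from '@nestjs/cqrs';

export class UpdateKeywordLinksEvent implements IEvent {
    constructor(
        public readonly entryId: number,
        public readonly keywords: string[]
    ) { }
}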
The event classes should look pretty similar to the command classes
we wrote earlier in this chapter. The difference is the event classes
implement the IEvent interface to let Nest.js know instances of these
classes are CQRS events. We also need to setup handlers for these
events. Just like command handlers, our event handlers will take care
of all the database operations. Typically, the event handlers are placed
in a sub-directory of the module similar to events/handlers.
@EventsHandler(UpdateKeywordLinksEvent)
export class UpdateKeywordLinksEventHandler implements
IEventHandler<UpdateKeywordLinksEvent> {
constructor(
@Inject('KeywordRepository') private readonly
keywordRepository: typeof Keyword,
@Inject('SequelizeInstance') private readonly
sequelizeInstance: Sequelize,
) { }
Event handlers are simple classes with a single method, handle, that is
responsible for handling the event. Implementing
the IEventHandler<UpdateKeywordLinksEvent> interface helps ensure we
write our event handler correctly. Nest.js uses the @EventsHandler decorator in our example to know this class is meant to handle our new UpdateKeywordLinksEvent event.
For the update links event handler, we should split out the logic into
separate methods to make the class a little easier to read and manage.
Let’s write the handle method so it loops through all the keywords and
ensures the keyword exists and the blog entry is associated with the
keyword. Finally, we should ensure the blog entry is not associated
with any keywords that are not in the event keywordsarray.
@EventsHandler(UpdateKeywordLinksEvent)
export class UpdateKeywordLinksEventHandler implements
IEventHandler<UpdateKeywordLinksEvent> {
constructor(
@Inject('KeywordRepository') private readonly
keywordRepository: typeof Keyword,
@Inject('SequelizeInstance') private readonly
sequelizeInstance: Sequelize,
) { }
keywordEntities.forEach(keywordEntity => {
if
(event.keywords.indexOf(keywordEntity.keyword) === -1) {
removedKeywords.push(keywordEntity);
}
});
event.keywords.forEach(keyword => {
if (keywordEntities.findIndex(keywordEntity
=> keywordEntity.keyword === keyword) === -1) {
newKeywords.push(keyword)
}
});
await Promise.all(
newKeywords.map(
keyword =>
this.ensureKeywordLinkExists(transaction, keyword, event.entryId)
)
);
await Promise.all(
removedKeywords.map(
keyword => keyword.$remove('entries',
event.entryId, { transaction })
)
);
});
} catch (error) {
console.log(error);
}
}
The event handler logic starts with finding all keywords the blog entry
is currently linked to. We loop through those and pull out any that are
not in the new keywords array. To find all new keywords, we loop through the keywords array in the event to find those that are not in the keywordEntities array. The new keywords are processed by the ensureKeywordLinkExists method.
The ensureKeywordLinkExists method uses ensureKeywordExists to create or find the keyword in the keywords database table and adds the blog entry to the keyword's entries array. The $add and $remove methods are provided by sequelize-typescript and are used to quickly add and remove blog
entries without having to query for the blog entry. All processing is
done using transactions to ensure any errors will cancel all database
updates. If an error does happen, the database will become out of sync,
but since we are dealing with metadata, it’s not a big deal. We log the
error out so application admins will know they need to re-sync the
metadata.
Even though we only have one event handler, we should still create a
Typescript barrel to export it in an array. This will ensure adding new
events later is a simple process.
Import the barrel in the keyword module and connect the event bus.
@Module({
imports: [CQRSModule],
controllers: [],
components: [keywordProvider, ...keywordEventHandlers],
exports: []
})
export class KeywordModule implements OnModuleInit {
public constructor(
private readonly moduleRef: ModuleRef,
private readonly eventBus: EventBus
) {}
public onModuleInit() {
this.eventBus.setModuleRef(this.moduleRef);
this.eventBus.register(keywordEventHandlers);
}
}
In the module, import the CQRSModule and add ModuleRef and EventBus as
constructor params. Implement the OnModuleInit interface and create
the onModuleInit method. In the onModuleInit method, we set the module
reference of event bus to the current module using setModuleRef and
use register to register all of the event handlers. Remember to also add
the event handlers to the components array or Nest.js will not be able to
instantiate the event handlers. Now that we have our event and event
handler written and linked in our keyword module, we are ready to
start invoking the event to store and update keyword links in the
database.
export class EntryModel extends AggregateRoot {
    constructor(private readonly id: number) {
        super();
    }
    updateKeywordLinks(keywords: string[]) {
        this.apply(new UpdateKeywordLinksEvent(this.id, keywords));
    }
}
We create our data model in the blog entry module since we will be
invoking our events when blog entries are created and updated. The
data model contains a single method, updateKeywordLinks, for refreshing
blog entry keyword links when a blog entry is created or updated. If
new events are needed, we will add more methods to the model to
handle invoking those events. The updateKeywordLinks method instantiates the event we created and calls the apply method found in the AggregateRoot abstract class with the event instance.
@CommandHandler(CreateEntryCommand)
export class CreateEntryCommandHandler implements
ICommandHandler<CreateEntryCommand> {
constructor(
@Inject('EntryRepository') private readonly
EntryRepository: typeof Entry,
@Inject('SequelizeInstance') private readonly
sequelizeInstance: Sequelize,
private readonly eventPublisher: EventPublisher
) { }
try {
const entry = await
this.sequelizeInstance.transaction(async transaction => {
return await this.EntryRepository.create<Entry>({
...command,
keywords: JSON.stringify(command.keywords)
}, {
returning: true,
transaction
});
});
const entryModel =
this.eventPublisher.mergeObjectContext(new EntryModel(entry.id));
entryModel.updateKeywordLinks(command.keywords);
entryModel.commit();
} catch (error) {
caught = error;
} finally {
resolve(caught);
}
}
}
@CommandHandler(UpdateEntryCommand)
export class UpdateEntryCommandHandler implements
ICommandHandler<UpdateEntryCommand> {
constructor(
@Inject('EntryRepository') private readonly
EntryRepository: typeof Entry,
@Inject('SequelizeInstance') private readonly
sequelizeInstance: Sequelize,
private readonly databaseUtilitiesService:
DatabaseUtilitiesService,
private readonly eventPublisher: EventPublisher
) { }
try {
await this.sequelizeInstance.transaction(async
transaction => {
let entry = await
this.EntryRepository.findById<Entry>(command.id, { transaction });
                    if (!entry) throw new Error('The blog entry was not found.');
entry = this.databaseUtilitiesService.assign(
entry,
{
...command,
id: undefined,
keywords:
JSON.stringify(command.keywords)
}
);
return await entry.save({
returning: true,
transaction,
});
});
const entryModel =
this.eventPublisher.mergeObjectContext(new EntryModel(command.id));
entryModel.updateKeywordLinks(command.keywords);
entryModel.commit();
} catch (error) {
caught = error;
} finally {
resolve(caught);
}
}
}
If you have followed along in your own project, you should now be able
to create or update a blog entry with new or existing keywords and see
the keyword links being created, updated, and deleted in the database.
Of course, we could make these changes easier to view by adding a new
API to return all the keywords and blog entries they are linked to.
The above diagram provides a visual representation of how the entry
command handlers work to keep the keywords updated. Notice how
the flow of control is unidirectional. The command handler invokes the
event using the entry model and then forgets about it. This is the
asynchronous nature of the event bus in Nest.js CQRS.
@Component()
export class KeywordService implements IKeywordService {
constructor(@Inject('KeywordRepository') private readonly
keywordRepository: typeof Keyword,
@Inject('KeywordEntryRepository') private
readonly keywordEntryRepository: typeof KeywordEntry) { }
if (search) {
if (!limit || limit < 1 || isNaN(limit)) {
limit = 10;
}
options = {
where: {
keyword: {
[Op.like]: `%${search}%`
}
},
limit
}
}
return await
this.keywordRepository.findAll<Keyword>(options);
}
return result;
}
}
The findAll method takes an optional search string and limit that can
be used to filter the keywords. The UI can use this to support keyword
search autocomplete. If the limit is not specified when searching, the
service will automatically limit the results to 10 items.
The findById method will support loading all information for a single
keyword, including the associated entries. The methods are relatively
basic and mimic methods in the services of the other modules.
The findHotLinks method, however, is a bit more complex.
Once we have the two lists, we reuse the service’s findById method to
load the complete data record for all the found keywords. This list is
then returned with the keywords that have the newest links first,
ordered newest to oldest, followed by the keywords with the most
links, ordered most to least. All that remains is to create a controller so
the UI can take advantage of our new query methods.
@Controller()
export class KeywordController {
constructor(
private readonly keywordService: KeywordService
) { }
@Get('keywords')
public async index(@Query('search') search: string,
@Query('limit') limit: string, @Res() res) {
const keywords = await this.keywordService.findAll(search,
Number(limit));
return res.status(HttpStatus.OK).json(keywords);
}
@Get('keywords/hot')
public async hot(@Res() res) {
const keywords = await this.keywordService.findHotLinks();
return res.status(HttpStatus.OK).json(keywords);
}
@Get('keywords/:keywordId')
public async show(@Param('keywordId') keywordId: string, @Res()
res) {
const keyword = await
this.keywordService.findById(Number(keywordId));
return res.status(HttpStatus.OK).json(keyword);
}
}
@CommandHandler(LinkKeywordEntryCommand)
export class LinkKeywordEntryCommandHandler implements
ICommandHandler<LinkKeywordEntryCommand> {
constructor(
@Inject('KeywordRepository') private readonly
keywordRepository: typeof Keyword,
@Inject('SequelizeInstance') private readonly
sequelizeInstance: Sequelize
) { }
    try {
        await this.sequelizeInstance.transaction(async transaction => {
            const keyword = await this.keywordRepository.findOrCreate<Keyword>({
                where: { keyword: command.keyword },
                transaction
            });
            await keyword[0].$add('entries', command.entryId, { transaction });
        });
    } catch (error) {
        caught = error;
    } finally {
        resolve(caught);
    }
    }
}
@CommandHandler(UnlinkKeywordEntryCommand)
export class UnlinkKeywordEntryCommandHandler implements
ICommandHandler<UnlinkKeywordEntryCommand> {
constructor(
@Inject('KeywordRepository') private readonly
keywordRepository: typeof Keyword,
@Inject('SequelizeInstance') private readonly
sequelizeInstance: Sequelize
) { }
try {
await this.sequelizeInstance.transaction(async
transaction => {
const keyword = await
this.keywordRepository.findOrCreate<Keyword>({
where: {
keyword: command.keyword
},
transaction
});
await keyword[0].$remove('entries',
command.entryId, { transaction });
});
} catch (error) {
caught = error;
} finally {
resolve(caught);
}
}
}
@Module({
imports: [CQRSModule],
controllers: [KeywordController],
components: [keywordProvider, keywordEntryProvider,
...keywordEventHandlers, KeywordService, ...keywordCommandHandlers],
exports: []
})
export class KeywordModule implements OnModuleInit {
public constructor(
private readonly moduleRef: ModuleRef,
private readonly eventBus: EventBus,
private readonly commandBus: CommandBus
) {}
public onModuleInit() {
this.commandBus.setModuleRef(this.moduleRef);
this.commandBus.register(keywordCommandHandlers);
this.eventBus.setModuleRef(this.moduleRef);
this.eventBus.register(keywordEventHandlers);
}
}
Just like the entry module, we created a Typescript barrel to export the
command handlers as an array. That gets imported into the module
definition and registered to the command bus using the register method.
Keyword saga
Sagas are always written as public methods inside component classes
to allow for Dependency Injection. Typically, you would create a single
saga class for each module you wish to implement sagas in, but
multiple classes would make sense when breaking up complex
business logic. For the update keyword saga, we will need a single saga
method that accepts the UpdateKeywordLinksEvent event and outputs
multiple LinkKeywordEntryCommand and UnlinkKeywordEntryCommand comman
ds.
@Component()
export class KeywordSagas {
constructor(
@Inject('KeywordRepository') private readonly
keywordRepository: typeof Keyword,
@Inject('SequelizeInstance') private readonly
sequelizeInstance: Sequelize,
) { }
public updateKeywordLinks(events$: EventObservable<any>) {
return events$.ofType(UpdateKeywordLinksEvent).pipe(
mergeMap(event =>
merge( // From the rxjs package
this.getUnlinkCommands(event),
this.getLinkCommands(event)
)
)
);
}
}
The remaining pieces of our saga are strictly RxJS functionality. You are free to use any RxJS operator, as long as the saga emits one or more CQRS commands. For our saga, we will be using mergeMap to flatten an inner observable stream of commands. Do not use switchMap here, or commands could be lost when the API is under heavy load, because switchMap cancels the in-flight inner observable each time the outer observable emits. The inner observable is a merging of two different observable
streams: this.getUnlinkCommands(event) is a stream
of UnlinkKeywordEntryCommand commands
and this.getLinkCommands(event) is a stream
of LinkKeywordEntryCommand commands.
Summary
CQRS is not just a Nest.js package. It is a pattern for designing and
laying out your application. It requires that you split the command,
creation and update of data, from the query, the retrieving of data, and
aspects of your application. For small applications, CQRS can add a lot
of unnecessary complexity so it’s not for every application. For
medium and large applications, CQRS can help break apart complex
business logic into more manageable pieces.
The next time you find yourself writing complex code to perform some
business logic based on how the user is interacting with your
application, consider giving the CQRS pattern a try. The complexity of
the pattern may be offset by the complexity, or eventual complexity, of your application's business logic.
In the next chapter we examine the architecture for two different types
of projects: A server application, and an app using Angular
universal with Nest.js and Angular 6.
Chapter 13. Architecture
As you now know, Nest.js is based on the same principles as
Angular, so it is a good idea to have a similar structure as Angular’s.
Before going into the file structure, we will see some guidelines
about the naming and about how to structure our different
directories and files in order to have an easy and more readable
project.
A server application
A more complete app using Angular universal with Nest.js
and Angular 6
By the end of the chapter, you should know how to structure your
app either for a server application or a complete app with a client
front-end.
Controller
The naming of the controller should respect the following principle:
user.controller.ts
@Controller()
export class UserController { /* ... */ }
Service
The naming of the service should respect the following principle:
user.service.ts
@Injectable()
export class UserService { /* ... */ }
Module
The naming of the module should respect the following principle:
user.module.ts
@Module()
export class UserModule { /* ... */ }
Middleware
The naming of the middleware should respect the following principle:
authentication.middleware.ts
@Injectable()
export class AuthenticationMiddleware { /* ... */ }
Exception filter
The naming of the exception filter should respect the following
principle:
forbidden.exception.ts
Pipe
The naming of the pipe should respect the following principle:
validation.pipe.ts
@Injectable()
export class ValidationPipe { /* ... */ }
Guard
The naming of the guard should respect the following principle:
roles.guard.ts
@Injectable()
export class RolesGuard { /* ... */ }
Interceptor
The naming of the interceptor should respect the following principle:
logging.interceptor.ts
@Injectable()
export class LoggingInterceptor { /* ... */ }
Custom decorator
The naming of the custom decorator should respect the following
principle:
comment.decorator.ts
Gateway
The naming of the gateway should respect the following principle:
comment.gateway.ts
@WebSocketGateway()
export class CommentGateway { /* ... */ }
Adapter
The naming of the adapter should respect the following principle:
ws.adapter.ts
Unit test
The naming of the unit test should respect the following principle:
user.service.spec.ts
E2E test
The naming of the e2e test should respect the following principle:
user.e2e-spec.ts
Now that we have overviewed the tools provided by Nest.js and put in place some naming guidelines, we can move on to the next part.
Directory structure
It is important to have a project with a well-structured directory file in
order for it to be much more readable, understandable, and easy to
work with.
So, let’s see how we can structure our directory in order for it to be
more clear. You will see in the following example the directory file
architecture used for the repository, which has been created for this
book using the naming convention described in the previous section.
Server architecture
For the server architecture, you will see a proposed architecture used
for the repository to have clean directories.
COMPLETE OVERVIEW
See the base file structure without entering into too much detail:
.
├── artillery/
├── scripts/
├── migrations/
├── src/
├── Dockerfile
├── README.md
├── docker-compose.yml
├── migrate.ts
├── nodemon.json
├── package-lock.json
├── package.json
├── tsconfig.json
├── tslint.json
└── yarn.lock
We have four folders for this server that contain all of the files that we
need for a complete server:
src
├── app.module.ts
├── main.cluster.ts
├── main.ts
├── gateways
│ ├── comment
│ └── user
├── modules
│ ├── authentication
│ ├── comment
│ ├── database
│ ├── entry
│ ├── keyword
│ └── user
└── shared
├── adapters
├── config
├── decorators
├── exceptions
├── filters
├── guards
├── interceptors
├── interfaces
├── middlewares
├── pipes
└── transports
Modules
src/modules
├── authentication
│ ├── authentication.controller.ts
│ ├── authentication.module.ts
│ ├── authentication.service.ts
│ ├── passports
│ │ └── jwt.strategy.ts
│ └── tests
│ ├── e2e
│ │ └── authentication.controller.e2e-spec.ts
│ └── unit
│ └── authentication.service.spec.ts
├── comment
│ ├── comment.controller.ts
│ ├── comment.entity.ts
│ ├── comment.module.ts
│ ├── comment.provider.ts
│ ├── comment.service.ts
│ ├── interfaces
│ │ ├── IComment.ts
│ │ ├── ICommentService.ts
│ │ └── index.ts
│ └── tests
│ ├── unit
│ │ └── comment.service.spec.ts
│ └── utilities.ts
├── database
│ ├── database-utilities.service.ts
│ ├── database.module.ts
│ └── database.provider.ts
├── entry
│ ├── commands
│ │ ├── handlers
│ │ │ ├── createEntry.handler.ts
│ │ │ ├── deleteEntry.handler.ts
│ │ │ ├── index.ts
│ │ │ └── updateEntry.handler.ts
│ │ └── impl
│ │ ├── createEntry.command.ts
│ │ ├── deleteEntry.command.ts
│ │ └── updateEntry.command.ts
│ ├── entry.controller.ts
│ ├── entry.entity.ts
│ ├── entry.model.ts
│ ├── entry.module.ts
│ ├── entry.provider.ts
│ ├── entry.service.ts
│ ├── interfaces
│ │ ├── IEntry.ts
│ │ ├── IEntryService.ts
│ │ └── index.ts
│ └── tests
│ ├── unit
│ │ └── entry.service.spec.ts
│ └── utilities.ts
├── keyword
│ ├── commands
│ │ ├── handlers
│ │ │ ├── index.ts
│ │ │ ├── linkKeywordEntry.handler.ts
│ │ │ └── unlinkKeywordEntry.handler.ts
│ │ └── impl
│ │ ├── linkKeywordEntry.command.ts
│ │ └── unlinkKeywordEntry.command.ts
│ ├── events
│ │ ├── handlers
│ │ │ ├── index.ts
│ │ │ └── updateKeywordLinks.handler.ts
│ │ └── impl
│ │ └── updateKeywordLinks.event.ts
│ ├── interfaces
│ │ ├── IKeyword.ts
│ │ ├── IKeywordService.ts
│ │ └── index.ts
│ ├── keyword.controller.ts
│ ├── keyword.entity.ts
│ ├── keyword.module.ts
│ ├── keyword.provider.ts
│ ├── keyword.sagas.ts
│ ├── keyword.service.ts
│ └── keywordEntry.entity.ts
└── user
├── interfaces
│ ├── IUser.ts
│ ├── IUserService.ts
│ └── index.ts
├── requests
│ └── create-user.request.ts
├── tests
│ ├── e2e
│ │ └── user.controller.e2e-spec.ts
│ ├── unit
│ │ └── user.service.spec.ts
│ └── utilities.ts
├── user.controller.ts
├── user.entity.ts
├── user.module.ts
├── user.provider.ts
└── user.service.ts
Finally, the main files defining the module itself, including the injectables, the controllers, and the entity, live in the root of the module directory.
Angular Universal architecture
For the complete application using Angular Universal, the base file structure looks like this:
├── e2e/
├── src/
├── License
├── README.md
├── angular.json
├── package.json
├── tsconfig.json
├── tslint.json
├── udk.container.js
└── yarn.lock
The src directory contains the app directory, which holds our client content following the Angular module-based architecture. It also contains the environments, which export a constant defining whether or not we are in production mode; this environment file is replaced by the production environment config in production mode. Finally, it contains the server and shared directories: the shared directory allows us to share files such as interfaces, while the server directory contains the whole server application, as we saw in the previous section.
But in this case, the server has changed a bit and now looks like this:
├── main.ts
├── app.module.ts
├── environments
│ ├── environment.common.ts
│ ├── environment.prod.ts
│ └── environment.ts
└── modules
├── client
│ ├── client.constants.ts
│ ├── client.controller.ts
│ ├── client.module.ts
│ ├── client.providers.ts
│ ├── interfaces
│ │ └── angular-universal-options.interface.ts
│ └── utils
│ └── setup-universal.utils.ts
└── heroes
├── heroes.controller.ts
├── heroes.module.ts
├── heroes.service.ts
└── mock-heroes.ts
The modules directory contains all of the Nest.js modules, exactly as we saw in the previous section. One of these is the client module, which serves the Universal app and all of the required assets, and also sets up the initializer that configures the engine and provides some Angular configuration.
Regarding the environments directory, it contains all of the configuration paths related to the Angular application. This configuration references the project configured in the angular.json file seen at the base of this section's project structure.
Summary
This chapter has shown you how to set up the architecture of your application in a way that is more understandable, readable, and easier to work with. We have seen how to define the directory architecture for a server application, but also for a complete application using Angular Universal. With these two examples, you should be able to structure your own project in a clearer way.
There are two main types of automated tests we are going to cover in
this book: unit tests and end-to-end tests.
Unit testing
As the name implies, each unit test covers one specific piece of functionality. The most important principles when dealing with unit tests are:
Isolation: each component has to be tested without any other related components; it cannot be affected by side effects and, likewise, it cannot emit any side effects.
Predictability: each test has to yield the same results as long as the input doesn't change.
Tooling
Unlike Angular, Nest.js doesn’t have an “official” toolset for running
tests; this means we are free to set up our own tooling for running
automated tests when we work in Nest.js projects.
Preparation
As you would expect, Jest is distributed as an npm package. Let’s go
and install it in our project. Run the following command from your
command line or terminal:
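A typical installation for a TypeScript project looks like the following (the ts-jest preprocessor and the @types/jest typings are our assumptions here, but they are the usual companions of Jest in a Nest.js codebase):
npm install --save-dev jest @types/jest ts-jest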
So, let's create a new JSON file in our project's root folder, which we will name jest.json; it will hold the Jest configuration.
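A minimal sketch of that configuration, assuming ts-jest handles the TypeScript files and keeping the testRegex consistent with the test output shown further below (the exact values are assumptions), could be:
/jest.json
{
  "moduleFileExtensions": ["ts", "js", "json"],
  "transform": {
    "^.+\\.ts$": "ts-jest"
  },
  "testRegex": "/src/.*\\.(test|spec).ts",
  "coveragePathIgnorePatterns": ["/node_modules/"]
}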
We also need to add a couple of new scripts to our package.json file:
/package.json
...
  "scripts": {
    ...
    "test": "jest --config=jest.json",
    "test:watch": "jest --watch --config=jest.json",
    ...
  }
}
These two new scripts will, respectively, run the tests once and run them in watch mode (they will re-run after each file save). Later in this chapter we will add a third script that also generates a code coverage report (output to a coverage folder).
Our testing environment is ready. If we now run npm test in our project
folder, we will most likely see the following:
No tests found
In /nest-book-example
54 files checked.
testMatch: - 54 matches
testPathIgnorePatterns: /node_modules/ - 54 matches
testRegex: /src/.*\.(test|spec).ts - 0 matches
Pattern: - 0 matches
npm ERR! Test failed. See above for more details.
The tests have failed! Well, they actually haven’t; we just have not
written any tests yet! Let’s write some now.
Writing our first test
If you have read other chapters of this book, you probably remember our blog entries and the code we wrote for them. Let's take a look back at the EntriesController. Depending on the chapter, the code looked something like the following:
/src/modules/entry/entry.controller.ts
import { Controller, Get } from '@nestjs/common';
import { EntriesService } from './entry.service';

@Controller('entries')
export class EntriesController {
  constructor(private readonly entriesSrv: EntriesService) {}

  @Get()
  findAll() {
    return this.entriesSrv.findAll();
  }
  ...
}
Let's write a unit test for the controller's findAll() method. We will be using a special Nest.js package called @nestjs/testing, which will allow us to wrap our service in a Nest.js module created specifically for the test.
Also, it's important to follow the convention and name the test file entry.controller.spec.ts, placing it next to the entry.controller.ts file so that it gets properly detected by Jest when we trigger a test run.
/src/modules/entry/entry.controller.spec.ts
import { Test } from '@nestjs/testing';
import { EntriesController } from './entry.controller';
import { EntriesService } from './entry.service';

describe('EntriesController', () => {
  let entriesController: EntriesController;
  let entriesSrv: EntriesService;

  beforeEach(async () => {
    const module = await Test.createTestingModule({
      controllers: [EntriesController],
    })
      .overrideComponent(EntriesService)
      .useValue({ findAll: () => null })
      .compile();

    entriesSrv = module.get<EntriesService>(EntriesService);
    entriesController = module.get<EntriesController>(EntriesController);
  });
});
Let's now take a closer look at what the test code is achieving.
First comes the beforeEach method. The code inside this method is executed right before each of the following tests runs. In that code we instantiate a Nest.js module for each test. Note that this is a particular kind of module, since we use the .createTestingModule() method from the Test class that comes from the @nestjs/testing package. So, let's think of this module as a "mock module," which will serve us for testing purposes only.
.overrideComponent(EntriesService)
.useValue({ findAll: () => null })
You can think of the result of the two code lines above as an empty, dummy service that only exposes the methods we will need to use later, without any real implementation inside.
Finally, the .compile() method is the one that actually instantiates the
module, so it gets bound to the module constant.
Once all this initial setup is done, we are good to start writing some
actual tests. Let’s implement one that checks whether
the findAll() method in our controller correctly returns an array of
entries, even if we only have one entry:
describe('EntriesController', () => {
  let entriesController: EntriesController;
  let entriesSrv: EntriesService;

  beforeEach(async () => {
    const module = await Test.createTestingModule({
      controllers: [EntriesController],
    })
      .overrideComponent(EntriesService)
      .useValue({ findAll: () => null })
      .compile();

    entriesSrv = module.get<EntriesService>(EntriesService);
    entriesController = module.get<EntriesController>(EntriesController);
  });

  describe('findAll', () => {
    it('should return an array of entries', async () => {
      expect(Array.isArray(await entriesController.findAll())).toBe(true);
    });
  });
});
The describe('findAll', () => { line is the one that starts the actual
test suite. We expect the resolved value
of entriesController.findAll() to be an array. This is basically how we
wrote the code in the first place, so it should work, right? Let’s run the
tests with npm test and check the test output.
FAIL  src/modules/entry/entry.controller.spec.ts
  EntriesController
    findAll
      ✕ should return an array of entries (4ms)

    expect(received).toBe(expected) // Object.is equality

      30 |     ];
      31 |     // jest.spyOn(entriesSrv, 'findAll').mockImplementation(() => result);
    > 32 |     expect(Array.isArray(await entriesController.findAll())).toBe(true);
      33 |   });
      34 |
      35 |   // it('should return the entries retrieved from the service', async () => {

      at src/modules/entry/entry.controller.spec.ts:32:64
      at fulfilled (src/modules/entry/entry.controller.spec.ts:3:50)
...
Remember that, when setting up the testing module, we stubbed the service like this:
...
.overrideComponent(EntriesService)
.useValue({ findAll: () => null })
.compile();
...
Since the stubbed findAll() returns null, which is not an array, the expectation fails. We can fix this by mocking the service's return value from within the test itself:
...
describe('findAll', () => {
  it('should return an array of entries', async () => {
    jest.spyOn(entriesSrv, 'findAll').mockImplementationOnce(() => [{}]);
    expect(Array.isArray(await entriesController.findAll())).toBe(true);
  });
});
...
In order to mock the findAll() method from the service, we are using two Jest methods. spyOn() takes an object and a method name as arguments, and starts watching the method for its execution (in other words, it sets up a spy). mockImplementationOnce(), as its name implies, changes the implementation of the method for its next call only; in this case, we change it to return an array containing one empty object.
PASS src/modules/entry/entry.controller.spec.ts
EntriesController
findAll
✓ should return an array of entries (3ms)
The test is passing now, so you can be sure that the findAll() method on the controller will always return an array, and other code components that depend on this output being an array won't break.
If this test started to fail at some point in the future, it would mean that
we had introduced a regression in our codebase. One of the great sides
of automated testing is that we will be notified about this regression
before it’s too late.
So, let's improve our tests to check that the method really returns the output coming from the service, without altering it.
describe('EntriesController', () => {
  let entriesController: EntriesController;
  let entriesSrv: EntriesService;

  beforeEach(async () => {
    const module = await Test.createTestingModule({
      controllers: [EntriesController],
    })
      .overrideComponent(EntriesService)
      .useValue({ findAll: () => null })
      .compile();

    entriesSrv = module.get<EntriesService>(EntriesService);
    entriesController = module.get<EntriesController>(EntriesController);
  });

  describe('findAll', () => {
    it('should return an array of entries', async () => {
      jest.spyOn(entriesSrv, 'findAll').mockImplementationOnce(() => [{}]);
      expect(Array.isArray(await entriesController.findAll())).toBe(true);
    });

    it('should return the entries retrieved from the service', async () => {
      const result = [{}];
      jest.spyOn(entriesSrv, 'findAll').mockImplementationOnce(() => result);
      expect(await entriesController.findAll()).toEqual(result);
    });
  });
});
We kept most of the test file as it was before, but added a new test, the last one, in which we mock the service's findAll() method to return a known value and then check that the controller returns that exact value. Running the tests again gives us:
PASS src/modules/entry/entry.controller.spec.ts
EntriesController
findAll
✓ should return an array of entries (2ms)
✓ should return the entries retrieved from the
service (1ms)
Both our tests pass. We accomplished quite a lot already. Now that we
have a solid foundation, extending our tests to cover as many test cases
as possible will be an easy task.
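As a sketch of how such an extension might look, here is one hypothetical additional test case (not part of the book's repository) that checks the controller delegates to the service exactly once:
it('should call the service findAll() exactly once', async () => {
  const spy = jest.spyOn(entriesSrv, 'findAll').mockImplementationOnce(() => []);
  await entriesController.findAll();
  expect(spy).toHaveBeenCalledTimes(1);
});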
Code coverage
Code coverage engines analyze our code and tests together, and check the number of lines, statements, and branches that are covered by the tests running in our suites, returning a percentage value.
Let’s add a script in our package.json file that, when executed, will
generate the coverage report:
...
  "scripts": {
    ...
    "test:coverage": "jest --config=jest.json --coverage --coverageDirectory=coverage",
    ...
  }
}
If we now run npm run test:coverage, we will get an output similar to the following:
PASS  src/modules/entry/entry.controller.spec.ts
  EntriesController
    findAll
      ✓ should return an array of entries (9ms)
      ✓ should return the entries retrieved from the service (2ms)

---------------------|---------|----------|---------|---------|-------------------
File                 | % Stmts | % Branch | % Funcs | % Lines | Uncovered Line #s
---------------------|---------|----------|---------|---------|-------------------
All files            |     100 |    66.67 |     100 |     100 |
 entry.controller.ts |     100 |    66.67 |     100 |     100 | 6
---------------------|---------|----------|---------|---------|-------------------
Test Suites: 1 passed, 1 total
Tests:       2 passed, 2 total
Snapshots:   0 total
Time:        4.62s
Ran all test suites.
As the coverage table shows, we are covering 100% of our code lines with our tests. This makes sense, since we wrote two tests for the only method in our controller.
All the imaginary developers working on this imaginary project will be adding new functionality (i.e. new code) all the time (as well as refactoring and deleting, but those cases don't really apply to this example), and they might forget to properly test that code. What would happen then? The coverage percentage of the project would go down.
Fortunately, Jest allows you to specify a coverage threshold for tests: if coverage falls below that threshold, the test run will fail even if all the tests passed. This way, our CI/CD pipeline can refuse to merge or deploy our code.
The coverage threshold has to be included in the Jest configuration
object; in our case, it lives in the jest.json file in our project’s root
folder.
...
  "coverageThreshold": {
    "global": {
      "branches": 80,
      "functions": 80,
      "lines": 80,
      "statements": 80
    }
  }
}
To demonstrate it, let’s run our controller tests with the coverage
threshold set as above. npm run test:coverage returns this:
PASS  src/modules/entry/entry.controller.spec.ts
  EntriesController
    findAll
      ✓ should return an array of entries (9ms)
      ✓ should return the entries retrieved from the service (1ms)

---------------------|---------|----------|---------|---------|-------------------
File                 | % Stmts | % Branch | % Funcs | % Lines | Uncovered Line #s
---------------------|---------|----------|---------|---------|-------------------
All files            |     100 |    66.67 |     100 |     100 |
 entry.controller.ts |     100 |    66.67 |     100 |     100 | 6
---------------------|---------|----------|---------|---------|-------------------
Jest: "global" coverage threshold for branches (80%) not met: 66.67%
Test Suites: 1 passed, 1 total
Tests:       2 passed, 2 total
Snapshots:   0 total
Time:        2.282s, estimated 4s
Ran all test suites.
npm ERR! code ELIFECYCLE
npm ERR! errno 1
npm ERR! nest-book-example@1.0.0 test:coverage: `jest --config=jest.json --coverage --coverageDirectory=coverage`
npm ERR! Exit status 1
npm ERR!
npm ERR! Failed at the nest-book-example@1.0.0 test:coverage script.
npm ERR! This is probably not a problem with npm. There is likely additional logging output above.
As you can see, the tests passed, yet the process exited with status 1 and returned an error. Also, Jest reported "global" coverage threshold for branches (80%) not met: 66.67%. We have successfully kept unacceptable code coverage away from our main branch and production environments.
E2E testing
While unit tests are isolated and independent by definition, end-to-end (or E2E) tests serve, in a way, the opposite purpose: they are intended to check the health of the system as a whole, and try to include as many components of the solution as possible. For this reason, in E2E tests we will focus on testing complete modules, rather than isolated components or controllers.
Preparation
Fortunately, we can use Jest for E2E testing just like we did for unit
testing. We will only need to install the supertest npm package to
perform API requests and assert their result. Let’s install it by
running npm install --save-dev supertest in your console.
Also, we will create a folder called e2e in our project’s root folder. This
folder will hold all of our E2E test files, as well as the configuration file
for them.
This brings us to the next step: create a new jest-e2e.json file inside
the e2e folder with the following contents:
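A minimal sketch of those contents, assuming the same ts-jest transform used for the unit tests (the exact regex and options are assumptions), could be:
/e2e/jest-e2e.json
{
  "moduleFileExtensions": ["ts", "js", "json"],
  "transform": {
    "^.+\\.ts$": "ts-jest"
  },
  "testRegex": "/e2e/.*\\.(e2e-test|e2e-spec)\\.ts$"
}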
As you can see, the new E2E configuration object is very similar to the one for unit tests; the main difference is the testRegex property, which now points to files in the /e2e/ folder that have an .e2e-test or .e2e-spec suffix.
The last step of the preparation is to add a new script to our package.json file for running the E2E tests:
/package.json
...
  "scripts": {
    ...
    "e2e": "jest --config=e2e/jest-e2e.json --forceExit"
  }
  ...
}
Let’s write the code for the test. Create a new folder
called entries inside the e2e folder, and then create a new file there
called entries.e2e-spec.ts with the following content:
// import paths below assume the directory layout shown earlier in this chapter
import { INestApplication } from '@nestjs/common';
import { Test } from '@nestjs/testing';
import { EntriesModule } from '../../src/modules/entry/entry.module';
import { EntriesService } from '../../src/modules/entry/entry.service';

describe('Entries', () => {
  let app: INestApplication;
  const mockEntriesService = { findAll: () => ['test'] };

  beforeAll(async () => {
    const module = await Test.createTestingModule({
      imports: [EntriesModule],
    })
      .overrideComponent(EntriesService)
      .useValue(mockEntriesService)
      .compile();

    app = module.createNestApplication();
    await app.init();
  });

  afterAll(async () => {
    await app.close();
  });
});
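The snippet above only sets up and tears down the application; it does not contain an actual test yet. A sketch of one, to be placed inside the describe block before the afterAll hook, could use supertest to hit the /entries route and assert that the mocked value comes back (the route and the expected payload are assumptions based on the mock above):
// add: import * as request from 'supertest'; at the top of the file
it('GET /entries should return the mocked entries', () => {
  return request(app.getHttpServer())
    .get('/entries')
    .expect(200)
    .expect(['test']);
});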
Summary
In this chapter we have explored the importance of adding automated tests to our projects and the kinds of benefits they bring.
Also, we got started with the Jest testing framework, and we learned how to configure it in order to use it seamlessly with TypeScript and Nest.js.
If you are just hopping into this book now and want to follow along, you can clone the example repository (see the beginning of the book for the clone instructions).
Angular is another topic that could have, and has had, an entire book written about it. We will be using an Angular 6 app that has been adapted for use in this book by one of the authors. The original repository can be found here:
https://github.com/patrickhousley/nest-angular-universal.git
  if (module.hot) {
    module.hot.accept();
    module.hot.dispose(() => app.close());
  }

  await app.listen(environment.port);
}

bootstrap()
  .then(() => console.log(`Server started on port ${environment.port}`))
  .catch(err => console.error(`Server startup failed`, err));
Let’s step through this function line by line.
if (environment.production) {
  enableProdMode();
}
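This check switches Angular into production mode when the production flag is set. The next step of bootstrap(), which the following paragraph describes, is creating the Nest application itself; a minimal sketch, assuming the ApplicationModule entry point discussed below (the import path and exact options are assumptions), looks like this:
import { INestApplication } from '@nestjs/common';
import { NestFactory } from '@nestjs/core';
import { ApplicationModule } from './app.module';

// A minimal sketch of this part of bootstrap(); the repository's actual code may differ.
async function bootstrap() {
  // ...production check shown above...
  // Create the Nest application using ApplicationModule as the entry point.
  const app: INestApplication = await NestFactory.create(ApplicationModule);
  // ...the hot-reloading block and the app.listen() call shown earlier follow here...
}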
This creates the Nest app variable of type INestApplication, using ApplicationModule from the app.module.ts file as the entry point. app will be the instance of the Nest app, running on port environment.port, which can be found in src/server/environments/environment.ts. There are three different environment files here: environment.common.ts, environment.ts, and environment.prod.ts.
If we are developing locally and want to have hot reloading, where the
server restarts if we change a file, then we need to have the following
included in our main.ts file.
if (module.hot) {
  module.hot.accept();
  module.hot.dispose(() => app.close());
}
This is set within the webpack.server.config.ts file based on
our NODE_ENV environment variable.
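The webpack file itself is not reproduced in this chapter; the relevant idea, sketched below with illustrative names, is that the HotModuleReplacementPlugin is only added outside of production, which is what makes module.hot defined at runtime:
// Hypothetical excerpt from webpack.server.config.ts; the option shapes are illustrative only.
import * as webpack from 'webpack';

const isProduction = process.env.NODE_ENV === 'production';

export const serverConfig: webpack.Configuration = {
  // ...entry, output, loaders, and the rest of the real configuration...
  plugins: isProduction ? [] : [new webpack.HotModuleReplacementPlugin()],
};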
await app.listen(environment.port);
This final line of bootstrap() starts the server listening on the configured port. We then call bootstrap(), which runs the function described above. At this stage we have our Nest server running, able to serve the Angular app and to listen for API requests.
@Module({
  imports: [
    HeroesModule,
    ClientModule.forRoot(),
  ],
})
export class ApplicationModule {}
Here we are importing two Nest.js modules: the HeroesModule, which supplies the API endpoints for the Tour of Heroes application, and the ClientModule, which is the module handling the Universal side of things. The ClientModule has a lot going on, but we will touch on the main pieces that handle setting up Universal; here is the code for this module.
@Module({
  controllers: [ClientController],
  components: [...clientProviders],
})
export class ClientModule implements NestModule {
  constructor(
    @Inject(ANGULAR_UNIVERSAL_OPTIONS)
    private readonly ngOptions: AngularUniversalOptions,
    @Inject(HTTP_SERVER_REF) private readonly app: NestApplication
  ) {}

  static forRoot(): DynamicModule {
    const requireFn = typeof __webpack_require__ === "function" ? __non_webpack_require__ : require;
    const options: AngularUniversalOptions = {
      viewsPath: environment.clientPaths.app,
      bundle: requireFn(join(environment.clientPaths.server, 'main.js')),
    };

    return {
      module: ClientModule,
      components: [
        {
          provide: ANGULAR_UNIVERSAL_OPTIONS,
          useValue: options,
        },
      ],
    };
  }

  ...
}
We will start with the @Module decorator at the top of the file. As with regular Nest.js modules (and Angular modules; remember that Nest.js is inspired by Angular?), there is a controllers property (for the endpoints) and a components property (for services, providers, and other components we want to be part of this module). Here we are including ClientController in the controllers array and ...clientProviders in components. The triple dot (...) is the spread operator; it essentially means "insert each of the array's elements into this array." Let's dissect each of these a bit more.
ClientController
@Controller()
export class ClientController {
  constructor(
    @Inject(ANGULAR_UNIVERSAL_OPTIONS)
    private readonly ngOptions: AngularUniversalOptions,
  ) {}

  @Get('*')
  render(@Res() res: Response, @Req() req: Request) {
    res.render(join(this.ngOptions.viewsPath, 'index.html'), { req });
  }
}
This is like any other controller that we have learned about, but with one small difference: for the catch-all route *, instead of supplying an API endpoint, the Nest.js server renders an HTML page, namely index.html, from the same viewsPath we have seen before in the environment files.
This is similar to how we define our own provider inside the return statement of ClientModule, but instead of useValue we use useFactory, which passes the Nest app and the AngularUniversalOptions we defined earlier into a setupUniversal(app, options) function. It has taken us a while, but this is where the Angular Universal server is actually created.
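The clientProviders array itself is not listed in this chapter, so the following is only a hedged sketch of what such a useFactory-based provider could look like; the provider token name is hypothetical, and the import locations are assumptions based on the directory structure shown earlier:
import { HTTP_SERVER_REF, NestApplication } from '@nestjs/core';
import { ANGULAR_UNIVERSAL_OPTIONS } from './client.constants';
import { AngularUniversalOptions } from './interfaces/angular-universal-options.interface';
import { setupUniversal } from './utils/setup-universal.utils';

export const clientProviders = [
  {
    // Hypothetical token name; the repository may use a different one.
    provide: 'UNIVERSAL_INITIALIZER',
    useFactory: (app: NestApplication, options: AngularUniversalOptions) =>
      setupUniversal(app, options),
    // Inject the running HTTP server reference and the options provided by forRoot().
    inject: [HTTP_SERVER_REF, ANGULAR_UNIVERSAL_OPTIONS],
  },
];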
setupUniversal(app, options)
In the ClientModule we can see the forRoot() function that was called when we imported the ClientModule into the ApplicationModule (the server entry point). Essentially, forRoot() returns a dynamic module that takes the place of the originally imported ClientModule (and is also called ClientModule). The module being returned has a single component, which provides ANGULAR_UNIVERSAL_OPTIONS, a token tied to the interface that defines what kind of object will be passed into the useValue property of the component.
One notable property of that AngularUniversalOptions interface is bundle:
bundle: {
  AppServerModuleNgFactory: any,
  LAZY_MODULE_MAP: any
};
Well, if you trace that require statement back (.. means going up one directory), you will see that we are setting the bundle property equal to another module, AppServerModule. This will be discussed in a bit, but that is the Angular app that will end up being served.
The last piece in the ClientModule is in the configure() function that will
tell the server where to find static assets.
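The configure() method is not reproduced here either; a hedged sketch of what it might look like, using Express's static-file middleware to serve the compiled client assets (the express usage and the catch-all route are assumptions), is:
// MiddlewareConsumer comes from '@nestjs/common'; express is the standard Express package.
// Hypothetical implementation of ClientModule.configure(); the repository's code may differ.
configure(consumer: MiddlewareConsumer): void {
  // Serve the compiled Angular client files (index.html, JS bundles, assets) as static content.
  consumer
    .apply(express.static(this.ngOptions.viewsPath))
    .forRoutes('*');
}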
Summary
And that is it! We have a working Angular Universal project to play around with. Angular is a great client-side framework that has been gaining a lot of ground lately. There is much more that can be done here, as this chapter only scratched the surface, especially in terms of Angular itself.
And, this is the last chapter in this book. We hope you are excited to
use Nest.js to create all sorts of apps.