Slides
02 A default subnet is created in the VPC. A subnet is created for each Availability Zone.
01 This is a physical location in the world where AWS has their data centers.
03 You can at any time stop, restart, start and terminate an instance.
Reserved Pricing
01 You also have 2 different offerings – Standard and Convertible Reserved Instances.
02 With Convertible Reserved Instances, which offer a smaller discount, you can opt to exchange the offering for another offering, which can also include changing the instance type.
03 You can sell the Standard Reserved Instance in the Reserved Instance Marketplace. But
this can’t be done for Convertible Reserved Instances.
Spot Instances
01 Here your EC2 instance can run on dedicated hardware for you as the AWS Customer.
02 EC2 Instances from the same AWS Customer can run on the same hardware.
03 You need to create a VPC based on dedicated tenancy and then launch the instance.
Dedicated Hosts
02 You can then deploy the EC2 Instances on that dedicated host.
03 This is great when you have software licenses that are based on per-socket or per-CPU terms.
Savings Plans
01 Here you are making a commitment to paying a price per hour for compute for a one-
or three-year term.
02 This plan helps you get discounts on EC2 and on other compute services as well, like AWS Lambda and AWS Fargate.
On-Demand Capacity Reservations
01 Here you want to ensure that you can create the required EC2 Instances when they are required.
02 The capacity reservation is mapped to a Region, Availability Zone, Instance type and
platform.
03 You can create a reservation to start for a particular time or create and cancel the
reservation at any point in time.
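As a rough illustration, a boto3 sketch of creating such a capacity reservation; the instance type, platform, Availability Zone and count are placeholder values, not from the course:

import boto3

ec2 = boto3.client("ec2")

# Reserve capacity for two Linux instances in a specific Availability Zone
ec2.create_capacity_reservation(
    InstanceType="m5.large",          # placeholder instance type
    InstancePlatform="Linux/UNIX",
    AvailabilityZone="us-east-1a",    # placeholder AZ
    InstanceCount=2,
    EndDateType="unlimited",          # keep the reservation until cancelled
)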
Storage
Elastic Block Store
Summary
Points
Amazon EBS Volumes
Provisioned IOPS (io1, io2) – These are also backed by solid-state drives. But they provide high performance. Great for critical workloads.
Cold HDD – This is good for workloads that are not accessed that frequently.
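A minimal boto3 sketch of creating a Provisioned IOPS volume; the size, IOPS value and Availability Zone are illustrative assumptions:

import boto3

ec2 = boto3.client("ec2")

# Create a 100 GiB io2 volume with 3000 provisioned IOPS
ec2.create_volume(
    AvailabilityZone="us-east-1a",
    Size=100,
    VolumeType="io2",
    Iops=3000,
)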
Amazon EBS Snapshots
01 This ensures the volume is fully initialized when it is restored from a snapshot.
02 Here you get all of the provisioned performance on the volume in the beginning itself.
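This behaviour is what the Fast Snapshot Restore feature provides; a minimal boto3 sketch, assuming a placeholder snapshot ID and Availability Zone:

import boto3

ec2 = boto3.client("ec2")

# Enable fast snapshot restore so volumes created from this snapshot
# deliver their full provisioned performance immediately
ec2.enable_fast_snapshot_restores(
    AvailabilityZones=["us-east-1a"],
    SourceSnapshotIds=["snap-0123456789abcdef0"],  # placeholder snapshot ID
)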
01 This service provides fully elastic file storage. Here the storage is managed for you.
02 Compute solutions such as Amazon EC2 can then connect to Amazon EFS.
02 Here the storage has native support for Windows File systems and the Server Message
Block protocol.
03 You can store data for a wide variety of purposes – Data for web applications, Big data applications, for backup operations.
Amazon S3
02 An object is a file and the metadata that is used to describe the file.
03 With versioning enabled, you can preserve, retrieve and restore every version of every object within the bucket.
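A minimal boto3 sketch of turning versioning on for a bucket; the bucket name is a placeholder:

import boto3

s3 = boto3.client("s3")

# Enable versioning so overwrites and deletes keep the older versions
s3.put_bucket_versioning(
    Bucket="example-bucket",
    VersioningConfiguration={"Status": "Enabled"},
)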
Amazon S3
03 An IAM role is created to ensure the Amazon S3 service has permission to replicate the objects.
Storage Classes
S3 Standard
Availability – Designed for 99.99% availability.
Performance – Provides low latency and high throughput performance.
Durability – Designed for 99.999999999% durability of objects across multiple Availability Zones.
Purpose – Can be used for common use cases when it comes to storage of data.
S3 Standard-IA
Availability – Designed for 99.9% availability.
Access – This is for data that is accessed less frequently. You get a lower price when it comes to per-GB storage.
Durability – Designed for 99.999999999% durability of objects across multiple Availability Zones.
Purpose – Ideal for data that is not accessed that frequently.
S3 One Zone-IA
Availability – Designed for 99.5% availability.
Access – This is for data that is accessed less frequently. But when you want to access the data, you need it immediately.
Durability – Designed for 99.999999999% durability of objects in a single Availability Zone.
Purpose – You want a low-cost option for storing data and don't mind the lower resiliency when it comes to data storage.
S3 Glacier Instant Retrieval
Availability – Designed for 99.9% availability.
Access – This is an archive solution that gives low-cost storage. This can be chosen if you want retrieval of data in milliseconds.
Durability – Designed for 99.999999999% durability of objects across multiple Availability Zones.
Purpose – Archive data that requires immediate access.
S3 Glacier Flexible Retrieval
Availability – Designed for 99.99% availability.
Access – This is an archive solution that gives low-cost storage. Here the data retrieval can range from minutes to hours.
Durability – Designed for 99.999999999% durability of objects across multiple Availability Zones.
Purpose – Archive data that needs to be accessed very rarely.
S3 Glacier Deep Archive
Access – This is an archive solution that gives low-cost storage. This is used when organizations want to store their data for the long term.
Retrieval – Here the retrieval time can be within 12 hours.
S3 Intelligent-Tiering
Availability – Designed for 99.9% availability.
Charge – Here there is a small charge when it comes to monitoring data to understand the tier to set for the object.
Amazon S3 Lifecycle rules
01 These are rules that define the actions that can be taken on objects in the S3 bucket.
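A minimal boto3 sketch of one such rule, assuming a placeholder bucket and prefix; the transition and expiry periods are illustrative:

import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-then-expire",
                "Status": "Enabled",
                "Filter": {"Prefix": "logs/"},
                # Move objects to Glacier Flexible Retrieval after 90 days
                "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
                # Delete objects after one year
                "Expiration": {"Days": 365},
            }
        ]
    },
)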
Amazon S3 Object Lock
01 This can be used to prevent objects from being overwritten or deleted. Here the objects can be in a read-only state.
02 You can either have objects locked for a certain duration of time, or you can have a legal hold for an indefinite amount of time.
03 You can have both retention locks and legal holds defined for objects.
Amazon S3 static web site
01 Transfer acceleration – This allows fast and easy transfer of objects over large distances
from the client.
02 Transfer acceleration – This takes advantage of Edge locations across the world.
03 Requester pays – Here the requester pays for the request and data download.
Amazon S3 Storage Lens
02 You can get insights into the current storage and look at cost optimization.
03 You can also see which buckets are not following best practices – for example, S3 versioning or S3 replication.
Amazon S3
02 Remember that you can't directly connect this storage to an EC2 Instance – use EBS volumes and EFS instead.
03 Remember – AWS manages the underlying storage capacity; you need to manage the data stored in S3.
Identity and Access Management
IAM Best Practices
IAM best practices
02 Don't generate root access keys. Ensure Multi-factor authentication is enabled for the root account. Use a strong password for the AWS root account.
03 There are some activities that require root credentials – changing account settings, activating IAM access to Billing and Cost Management.
IAM best practices
01 Generate users in IAM for your organization users that will use the AWS account.
02 Ensure Multi-factor authentication is enabled for IAM users especially those that carry out privileged
tasks.
01 If access keys have been generated for IAM users, ensure the keys are rotated on a regular basis.
02 Always apply the principle of least privilege when giving access to AWS resources.
02 You then deploy resources such as your EC2 Instances in the VPC.
01 Subnets can be used to further segregate your virtual private cloud network.
02 Here you also define a CIDR block; this is a range of IP addresses from the VPC.
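A minimal boto3 sketch of creating a VPC and carving a subnet out of its range; the CIDR blocks and Availability Zone are illustrative assumptions:

import boto3

ec2 = boto3.client("ec2")

# Create the VPC with its overall address range
vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")
vpc_id = vpc["Vpc"]["VpcId"]

# Carve a smaller range out of the VPC for one subnet in one Availability Zone
ec2.create_subnet(
    VpcId=vpc_id,
    CidrBlock="10.0.1.0/24",
    AvailabilityZone="us-east-1a",
)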
Internet Gateway
You create the Internet gateway and attach it to the VPC. This makes a subnet public in nature.
AWS Security Groups
01 This is used to control the Inbound and Outbound traffic to your resources in a VPC.
02 You define both Inbound and Outbound rules. Here these are Allow-based rules.
03 For example, if you want users on the Internet to reach a web application running on a
web server on an EC2 Instance, you create an Inbound rule for traffic from 0.0.0.0/0 into
port 80 (HTTP Traffic) or 443 (HTTPS Traffic).
AWS Security Groups
01 Security Groups are stateful in nature. If the Inbound request is allowed, the outbound
request is also allowed.
02 For Security group rules, you define the Protocol, the port range, the source or
destination.
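A minimal boto3 sketch of the inbound web rules described above; the security group ID is a placeholder:

import boto3

ec2 = boto3.client("ec2")

# Allow inbound HTTP and HTTPS from anywhere
ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",
    IpPermissions=[
        {"IpProtocol": "tcp", "FromPort": 80, "ToPort": 80,
         "IpRanges": [{"CidrIp": "0.0.0.0/0", "Description": "HTTP"}]},
        {"IpProtocol": "tcp", "FromPort": 443, "ToPort": 443,
         "IpRanges": [{"CidrIp": "0.0.0.0/0", "Description": "HTTPS"}]},
    ],
)

Because security groups are stateful, the response traffic does not need a separate outbound rule.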
NAT Gateway
This allows outbound communication from EC2 Instances in your private subnet. External services cannot initiate a connection. You define a NAT gateway in the public subnet. An Elastic IP address is assigned to the NAT gateway. You then define a route in the route table for the private subnet to direct Internet traffic via the NAT gateway.
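A minimal boto3 sketch of those steps, assuming placeholder subnet and route table IDs:

import boto3

ec2 = boto3.client("ec2")

# Allocate an Elastic IP and create the NAT gateway in the public subnet
eip = ec2.allocate_address(Domain="vpc")
nat = ec2.create_nat_gateway(
    SubnetId="subnet-0aaa11112222bbbb3",
    AllocationId=eip["AllocationId"],
)

# Route Internet-bound traffic from the private subnet through the NAT gateway
ec2.create_route(
    RouteTableId="rtb-0ccc44445555dddd6",
    DestinationCidrBlock="0.0.0.0/0",
    NatGatewayId=nat["NatGateway"]["NatGatewayId"],
)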
NAT Instance
You can also create an EC2 Instance based on a NAT-based AMI. This can provide the NAT service. But here you manage the EC2 Instance. You need to manage the security and availability of the NAT Instance. You define a route in the route table for the private subnet to direct Internet traffic via the NAT instance.
Network Access Control Lists
01 This is used to control inbound and outbound traffic at the subnet level.
03 NACLs are stateless in nature. This means you need to add both inbound and outbound rules for traffic.
Network Access Control Lists
01 Each rule has a number. The rules are evaluated in order, starting with the lowest rule number. If a rule matches a request, no further rules are processed.
02 The rules you define have a rule number, the type, protocol, port range, source or
destination and whether it’s an Allow or Deny rule.
03 You can create custom NACLs and associate them with subnets.
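A minimal boto3 sketch of a rule pair; because NACLs are stateless, the inbound rule needs a matching outbound rule for the return traffic (the NACL ID and ports are placeholders):

import boto3

ec2 = boto3.client("ec2")

# Inbound rule 100: allow HTTP into the subnet
ec2.create_network_acl_entry(
    NetworkAclId="acl-0123456789abcdef0",
    RuleNumber=100,
    Protocol="6",           # TCP
    RuleAction="allow",
    Egress=False,
    CidrBlock="0.0.0.0/0",
    PortRange={"From": 80, "To": 80},
)

# Outbound rule 100: allow the responses back out on ephemeral ports
ec2.create_network_acl_entry(
    NetworkAclId="acl-0123456789abcdef0",
    RuleNumber=100,
    Protocol="6",
    RuleAction="allow",
    Egress=True,
    CidrBlock="0.0.0.0/0",
    PortRange={"From": 1024, "To": 65535},
)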
Bastion Hosts
01 You can deploy an EC2 Instance that is primarily used to connect and log in to EC2 Instances in the VPC.
02 In the security groups you allow connectivity onto the Bastion Host from machines that
need to connect to EC2 Instances in the VPC.
03 You also define rules that allow connectivity for log-in purposes from the Bastion Host to
the EC2 Instances.
VPC Peering
03 Here the traffic between the VPCs can traverse the AWS backbone network.
Gateway Endpoints
01 Your VPC can operate in dual stack mode wherein there is support for IPv6 addressing
as well.
03 An egress-only Internet gateway allows outbound communication over IPv6 from instances in the VPC. But it prevents connections being initiated from the Internet onto the Instances.
AWS Databases
Automated
Backups
Automated Backups
01 You can also manually backup the database instance by creating a DB snapshot.
02 The backups for the database instance are stored in Amazon S3.
03 The first backup is the full database backup. Subsequent backups are the changes that
have occurred on the database.
AWS Database
Migration Service
AWS Database Migration Service
02 You can discover the source infrastructure via the use of the DMS Fleet Advisor.
03 You can migrate to a different database engine via the use of the DMS Schema Conversion tool.
AWS Database Migration Service
01 With this service you get a server that is used for the migration. This is known as the
replication instance.
02 The replication instance carries out replication tasks that replicate data from the source
onto a target endpoint.
03 You can carry out a one-time migration of your data. You can also replicate ongoing changes from the source onto the destination.
AWS Database Migration Service
01 You can create a task in the Database Migration Service to carry out an initial full load of
the data.
02 You can also create a task that replicates on-going changes from the source onto the
destination.
Summary
Points
Relational Database Service
Database – This service allows you to host a database on the AWS cloud. Here you don't need to manage the underlying infrastructure.
Features – You get features such as Automated backups, software patching, high availability.
Support – There is support for MariaDB, Microsoft SQL Server, MySQL, Oracle and PostgreSQL.
Network – When you create a database instance, it is launched in a VPC.
Amazon RDS – Multi-AZ Support
01 This helps provide high availability and failover support for the database instances.
02 If AWS detects a problem in the infrastructure for the primary database instance, AWS
switches to the standby replica.
03 You can have applications that need read throughput only to perform read requests
from the read replica.
Amazon RDS – Read Replica
01 You can combine Multi-AZ and Read Replicas. You can enable both Multi-AZ and a Read
Replica for a primary database instance.
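A minimal boto3 sketch of adding a read replica to an existing instance; the instance identifiers and instance class are placeholders:

import boto3

rds = boto3.client("rds")

# Create a read replica of an existing primary DB instance
rds.create_db_instance_read_replica(
    DBInstanceIdentifier="mydb-replica",
    SourceDBInstanceIdentifier="mydb-primary",
    DBInstanceClass="db.t3.medium",
)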
Amazon RDS
Automated backups
1. Here RDS creates a storage volume snapshot. This helps to create a backup of your database.
Amazon Aurora
Fully managed relational database that is compatible with MySQL and PostgreSQL. Here you get better throughput and performance. The underlying storage system is highly available and grows based on demand.
When you work with Aurora, you get a database cluster. There is the primary database instance that is used for the read and write operations. There are multiple read replicas that can be defined in the cluster.
Amazon Aurora High Availability
1. Copies of the underlying data are stored across multiple Availability Zones.
Amazon RDS Proxy
01 This can manage the connections to the RDS database. Instead of the database engine using compute to manage the connections, this can be done with RDS Proxy.
02 This also works if you have a standby replica in place. If there is a failure, the RDS Proxy
can start managing connections on the standby replica.
03 This service can also make use of AWS Secrets Manager for storage of credentials.
Amazon Aurora – Serverless
02 Here the database can automatically scale the compute capacity based on the load.
03 Here you are only charged for the resources consumed by the database.
Amazon Aurora – Global database
01 Here you can have the primary database run in one region. And have a read-replica in
another region.
02 This can help in an outage on an entire region. If there is an issue in the primary region,
the service can switch over to a secondary region.
01 You can quickly create a newer copy of an Aurora database by cloning the primary
database.
03 This is ideal when you want to quickly provision test environments based on data in
production environments.
Amazon RDS – Additional notes
01 You can stop an RDS instance to save on compute costs. You will still pay for the storage
costs.
02 If you need administrative rights to the database and the underlying operating system,
you can make use of the Amazon RDS Custom option.
Amazon DynamoDB
Items – Each item in the table is a set of attributes.
Sort key – Along with the partition key, you can also define one attribute to be the sort key.
Amazon DynamoDB
01 On-demand capacity – Here DynamoDB will manage the capacity for you. You only pay
based on how much you use.
02 Your DynamoDB database needs to have read and write capacity. In On-demand mode
this will be managed by the service.
03 This is ideal when you have unpredictable application traffic and unknown workloads.
Amazon DynamoDB
01 Provisioned capacity – Here you mention the amount of read and write capacity.
02 You pay for the capacity even if you don’t use it.
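A minimal boto3 sketch of creating a table in on-demand mode (the table and attribute names are placeholders); switching BillingMode to "PROVISIONED" together with a ProvisionedThroughput value gives the provisioned model instead:

import boto3

dynamodb = boto3.client("dynamodb")

# On-demand capacity: no read/write capacity units to provision up front
dynamodb.create_table(
    TableName="Orders",
    AttributeDefinitions=[{"AttributeName": "OrderId", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "OrderId", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
)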
01 The Time to Live (TTL) feature allows you to expire or automatically delete items from the table.
02 You first define the attribute in the table that DynamoDB will use when it comes to Time
to live.
03 For each item you specify the epoch time for the attribute.
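A minimal boto3 sketch of enabling TTL and writing an item that expires after a day; the table and attribute names are placeholders:

import time
import boto3

dynamodb = boto3.client("dynamodb")

# Tell DynamoDB which attribute holds the expiry time (epoch seconds)
dynamodb.update_time_to_live(
    TableName="Orders",
    TimeToLiveSpecification={"Enabled": True, "AttributeName": "ExpiresAt"},
)

# Each item carries its own expiry timestamp, here one day from now
dynamodb.put_item(
    TableName="Orders",
    Item={
        "OrderId": {"S": "order-123"},
        "ExpiresAt": {"N": str(int(time.time()) + 86400)},
    },
)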
Amazon DynamoDB
01 You can create backups in DynamoDB. You can create an on-demand backup and
perform a restore at any point in time.
02 The retention for the backups when it comes to a Point-in-time restore is 35 days.
03 You can also use AWS Backup for long term backup of Amazon DynamoDB.
Amazon DynamoDB
01 DynamoDB Accelerator (DAX) provides faster response times.
02 Here an in-memory cache is implemented. So, data can be retrieved from the cache.
01 This service allows you to host an in-memory data store based on Redis.
03 Here you can create a cluster of shards. Each shard consists of nodes.
Amazon ElastiCache for Redis
01 You can have one node that is the primary node used for read and write operations.
02 The other nodes are replicas or read-only nodes. If there is a failure in the primary
node, the service can switch over to a replica node.
03 You also can enable Multi-AZ wherein your nodes can reside in different Availability
Zones.
AWS
Advanced Networking
Summary
Points
Amazon Route 53
Failover – This allows you to define an endpoint to fail over to if the primary endpoint is down. You can define a health check to determine the resource health.
Latency – Here the endpoint that provides the least latency is returned to the client.
Amazon Route 53 – Routing policies
Geolocation – Here the routing is determined based on the location of the user.
Weighted
Multivalue answer
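As a sketch of the weighted policy, two records for the same name that split traffic 80/20; the hosted zone ID, record name and IP addresses are placeholders:

import boto3

route53 = boto3.client("route53")

route53.change_resource_record_sets(
    HostedZoneId="Z0000000000000",
    ChangeBatch={
        "Changes": [
            {
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": "app.example.com",
                    "Type": "A",
                    "SetIdentifier": "primary-fleet",
                    "Weight": 80,
                    "TTL": 60,
                    "ResourceRecords": [{"Value": "198.51.100.10"}],
                },
            },
            {
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": "app.example.com",
                    "Type": "A",
                    "SetIdentifier": "secondary-fleet",
                    "Weight": 20,
                    "TTL": 60,
                    "ResourceRecords": [{"Value": "198.51.100.20"}],
                },
            },
        ]
    },
)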
01 This service is used to distribute traffic across multiple targets such as EC2 Instances.
02 It can perform health checks and only route requests to backend targets that are
healthy.
Listener – This listens for client requests on a particular protocol and port number.
Health checks – This is used to monitor the health of the backend targets.
Application Load Balancer
Path conditions – Here requests can be forwarded based on the URL of the request.
IP addresses – You can also register IP addresses which are targets outside of the VPC.
Host conditions – You can configure rules that forward requests based on the host field in the HTTP header.
Targets – You can also have containerized applications and Lambda functions as targets.
Network Load Balancer
01 The application load balancer works at layer 7 of the OSI Model whereas the Network
Load Balancer works at Layer 4.
Gateway Load Balancer
01 This helps to route traffic via network appliances such as firewalls, intrusion detection and prevention systems.
03 Traffic can flow between VPCs and onto the VPC that contains the Gateway Load Balancer.
Amazon EC2 Auto Scaling
1. This allows you to scale your EC2 Instances based on the application load.
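A minimal boto3 sketch of one way to do this – a target-tracking scaling policy on an existing Auto Scaling group; the group name and target value are placeholders:

import boto3

autoscaling = boto3.client("autoscaling")

# Keep the group's average CPU utilization around 50%
autoscaling.put_scaling_policy(
    AutoScalingGroupName="web-asg",
    PolicyName="keep-cpu-at-50",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization"
        },
        "TargetValue": 50.0,
    },
)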
AWS Network Firewall
01 This is a fully managed network firewall with intrusion detection and prevention capabilities.
02 You can filter traffic coming into and going out of the VPC. This includes traffic coming over an Internet Gateway, a NAT Gateway, AWS Site-to-Site VPN, AWS Direct Connect.
AWS Site-to-Site VPN
Customer gateway – This allows you to define the details of your on-premises gateway device.
VPN Connection – This is the secure connection between the on-premises data center and the VPC.
AWS Direct Connect
01 This is used to connect your on-premises network to an AWS Direct Connect Location
over an Ethernet fiber-optic cable.
02 This is a dedicated connection. This allows for lower latency and higher reliability for data transfer.
03 You can access your VPC and also AWS public services via virtual interfaces.
More on AWS
Compute Services
Summary
Points
AWS Lambda
Trigger – You have various ways to trigger your Lambda functions. For example, when an object is uploaded to an S3 bucket.
Logging – Logging for your AWS Lambda function is available via CloudWatch Logs.
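As an illustration of the S3 trigger above, a minimal Python handler sketch; the event layout is the standard S3 notification format, and the processing here is just a print:

# Minimal Lambda handler for an S3 "object created" trigger
def handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        print(f"New object uploaded: s3://{bucket}/{key}")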
AWS Lambda
01 You can also attach a Lambda function to a VPC. This allows the Lambda function to
access resources in a private subnet in a VPC.
02 If you need resources in your private subnet to initiate communication with AWS
Lambda, you can make use of VPC endpoints.
03 AWS Lambda functions can also interact with a host of other AWS services.
AWS Step Functions
01 If you want to build a business workflow you can make use of AWS Step Functions.
02 This is a serverless orchestration service that integrates with AWS Lambda and other
AWS services.
03 In Step Functions, your workflow is represented as a state machine. The workflow then
has a series of steps in the form of tasks.
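A minimal sketch of creating such a state machine with boto3; the Lambda function ARN and IAM role ARN are placeholder values:

import json
import boto3

sfn = boto3.client("stepfunctions")

# A two-step workflow: run a Lambda task, then finish
definition = {
    "StartAt": "ProcessOrder",
    "States": {
        "ProcessOrder": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:111122223333:function:process-order",
            "Next": "Done",
        },
        "Done": {"Type": "Succeed"},
    },
}

sfn.create_state_machine(
    name="order-workflow",
    definition=json.dumps(definition),
    roleArn="arn:aws:iam::111122223333:role/StepFunctionsExecutionRole",
)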
Amazon API Gateway
02 You can deploy a new version of your API for testing. Traffic is separated between the production release and the canary release.
03 You can then promote your canary release to the production release.
Amazon API Gateway
01 Usage plans - This specifies who can access one or more of the deployed API stages.
2. It is used to decouple components of an application.
01 When a message is read from the queue, the message becomes invisible to other
consumers based on the visibility timeout.
02 The initial consumer has to delete the message after reading and processing the
message.
03 For messages that can’t be processed, you can configure a dead letter queue.
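A minimal boto3 sketch of that consume-then-delete pattern; the queue URL is a placeholder and process() stands in for your own handling logic:

import boto3

sqs = boto3.client("sqs")
queue_url = "https://sqs.us-east-1.amazonaws.com/111122223333/orders"  # placeholder

# Receive a message; it stays invisible to other consumers for 30 seconds
response = sqs.receive_message(
    QueueUrl=queue_url,
    MaxNumberOfMessages=1,
    VisibilityTimeout=30,
)

for message in response.get("Messages", []):
    process(message["Body"])  # hypothetical processing function
    # Delete the message once it has been processed successfully
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=message["ReceiptHandle"])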
Amazon Elastic Container Registry
03 You can then deploy your container-based applications onto AWS EKS.
Amazon Elastic Container Service
03 Your images can be pulled from the Amazon Elastic Container Registry.
AWS App
Runner
AWS App Runner
01 This service provides a simple and fast way to deploy an application based on source
code or a container image.
03 You can have your container images reside in the Elastic Container Registry service.
AWS App Runner
01 Here the underlying compute infrastructure for hosting the application is managed for
you.
02 You only pay for the resources being consumed by the service.
03 When deploying your code AWS App Runner provides several supported programming
platforms.
AWS Elastic
Beanstalk
AWS Elastic Beanstalk
01 This service allows you to easily deploy applications to AWS without the need to learn how to manage the underlying infrastructure required for hosting the application.
02 Here you can just deploy your application onto the AWS Elastic Beanstalk service.
03 There is support for Go, Java, .NET, Node.js, PHP, Python and Ruby.
AWS Elastic Beanstalk
01 When you upload your application AWS automatically launches an environment with the
required infrastructure for hosting the application.
02 You can have a web server environment in place that uses the Amazon Elastic Load
Balancer, Amazon EC2 Instances etc.
03 You can also host a worker environment that also makes use of the Simple Queue
Service.
AWS
Security
Amazon
Inspector
Amazon Inspector
01 This is a service that is used to continuously scan your AWS workloads such as
Amazon EC2 Instances for vulnerabilities.
03 You can also activate ECR scanning so that it scans the images in your repositories for any sort of vulnerabilities.
Amazon Inspector
01 For your EC2 Instances, it collects operating system package and programming
language package vulnerabilities.
02 To collect Common Vulnerabilities and Exposures (CVE) data, the EC2 Instance needs to have the AWS Systems Manager (SSM) agent installed.
03 The agent normally comes installed on many EC2 Instances, but it might need to be activated manually.
Amazon Inspector
01 When you activate Amazon Inspector, it will initiate a scan of all EC2 Instances.
02 New scans are initiated when you launch a new EC2 Instance or install new software on a Linux-based instance.
03 A new scan is also initiated when a new CVE item that is relevant to the Linux-based EC2 Instance is added to the database.
Amazon
Macie
Amazon Macie
01 This service evaluates the configuration of your S3 bucket. If the access becomes
too public, then this service will flag it accordingly.
02 Using the data in the logs it helps to identify any sort of malicious behaviour.
03 It uses threat intelligence feeds and tries to identify requests coming in from malicious IP addresses, any escalation of privileges etc.
AWS Systems
Manager
AWS Systems Manager – Application Management
01 This service is used to help DevOps engineers to investigate and remediate issues
with their AWS resources.
02 It has support for Amazon Elastic Kubernetes Service and Amazon Elastic Container Service.
03 All of the required information can be viewed from the AWS Management Console.
AWS Systems Manager – Change Management
01 You can add your machines as managed nodes in AWS Systems Manager.
03 You can use Patch Manager to scan the managed nodes for missing patches.
AWS Systems Manager – Incident Management
03 Provides customizable dashboards for information reporting about the various AWS
resources.
Amazon Cognito
Amazon Cognito
02 Here you can create user pools wherein users can be defined who need to be
authenticated and authorized to use an application.
03 You can define identity pools for authorizing authenticated or anonymous users to access AWS resources.
Summary
Points
AWS Secrets Manager
02 Here an AWS Lambda function can be used to change the value of the secret to keep it
more secure.
03 The secrets can also be encrypted with the use of the Key Management service.
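A minimal boto3 sketch of reading a secret at runtime instead of hard-coding it; the secret name is a placeholder:

import boto3

secrets = boto3.client("secretsmanager")

# Fetch the current version of the secret
response = secrets.get_secret_value(SecretId="prod/db-password")
db_password = response["SecretString"]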
AWS Shield
01 This helps to protect against Distributed Denial of Service attacks – DDoS attacks.
02 By default, you get the standard protection when it comes to AWS Shield.
03 But you can also get advanced protection for your resources with AWS Shield Advanced. This can provide aspects such as real-time monitoring of threats and extra protection for resources.
AWS Key Management Service
1. This service allows you to manage your encryption keys.
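A minimal boto3 sketch of encrypting and decrypting a small payload with a customer managed key; the key alias is a placeholder:

import boto3

kms = boto3.client("kms")

# Encrypt a small payload with the key
ciphertext = kms.encrypt(
    KeyId="alias/my-app-key",
    Plaintext=b"database password",
)["CiphertextBlob"]

# Decrypt it again; KMS works out the key from the ciphertext metadata
plaintext = kms.decrypt(CiphertextBlob=ciphertext)["Plaintext"]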
03 It can also scan for images stored in the Elastic Container Registry service.
Amazon Macie
03 This can be used if you want to detect sensitive information such as AWS credentials or personally identifiable information.
Amazon GuardDuty
01 This is a security analysis service. It can detect threats by monitoring log data.
02 It can monitor CloudTrail management events, VPC flow logs and DNS logs.
03 You can automate the patching of EC2 Instances via the use of Systems Manager.
Amazon Cognito
02 Here you can define user pools for authentication and authorization of users onto
applications.
03 You can also define identity pools to authorize access to AWS resources.
AWS X-Ray
01 With the help of this service, you can trace the various calls being made by the
application through various layers.
03 For example, you can trace the time taken for calls made to Amazon DynamoDB.
Amazon AppFlow
01 This is a service that allows you to securely exchange data between AWS services and
other software-as-a-service applications.
02 For example , you can securely exchange data between Salesforce and the Amazon
Simple Storage service.
Amazon Athena
02 You can make use of SQL to analyze data stored in Amazon S3.
03 You can also run data analytics via the use of Apache Spark without having the need to manage the underlying infrastructure.
AWS Snowball
Edge
AWS Snowball Edge
01 Use case – Let’s say that a company has terabytes of data that they want to move to AWS.
02 Transferring it over the Internet is unreliable and can take time. Creating an AWS Direct Connect connection just for this task is very expensive.
03 You can make use of AWS Snowball Edge to transfer the data to AWS.
AWS Snowball Edge
01 You create a job in the AWS Management console when it comes to Snow Family.
02 AWS then sends the device over to you. When you receive the device, you set up the device and copy data onto it.
03 You then send the device over to AWS. And they then start copying the data, let's say onto S3. You can also import data from S3 onto your data center.
Amazon
QuickSight
Amazon QuickSight
01 This is a fully managed service. You don’t need to provision any infrastructure.
02 You get data insights, use machine learning to automatically make reliable forecasts.
01 This is a secure transfer service that can be used to transfer files in and out of AWS storage services.
02 The storage services that are supported are Amazon S3 and Amazon Elastic File System.
03 The protocols supported are – Secure Shell File Transfer Protocol (SFTP), File Transfer Protocol Secure (FTPS), File Transfer Protocol (FTP).
AWS Transfer Family
01 A company might have the requirement to constantly upload data from their on-premises
environment using a File Transfer server.
02 With AWS Transfer Family, you get a fully managed server endpoint that has the benefits
of managed file transfer.
03 Here there are no upfront costs, you only pay based on how much you use.
Amazon Kinesis
Data Analytics
Amazon Kinesis Data Analytics
01 This service can be used to process and analyze streaming data using standard SQL .
02 Here you create a Kinesis Data Analytics application that can read and process
streaming data.
03 You can use Amazon Kinesis Streams and Data Firehose as the source.
Amazon EMR
Amazon EMR
01 This is a managed cluster platform that can be used to run applications that work with Apache Hadoop and Apache Spark.
03 In Amazon EMR, you create a cluster. The cluster consists of Amazon EC2 Instances.
Amazon EMR
01 You have the primary node that is used for distributing the tasks to the other nodes in the cluster.
02 Core node – This has software components that are used for running tasks and stores data in the Hadoop Distributed File System.
03 Task node – This has software that is just used to run tasks.
Amazon MQ
Amazon MQ
01 This is a managed message broker service. Currently Amazon MQ supports Apache ActiveMQ and RabbitMQ engine types.
02 With ActiveMQ, you can either have a single-instance broker in one Availability Zone.
03 Or you can have an active/standby broker that comprises two brokers in two different Availability Zones.
AWS Backup
AWS Backup
01 This is a fully managed service that allows you to take a backup of your data.
03 This helps in your recovery scenarios.
Machine Learning
Services
Amazon Rekognition
01 This service allows you to get aspects about your images and videos.
02 You can identify labels such as objects, people, scenes and text.
03 You can detect inappropriate content, get facial analysis. You can use these features from within your application.
Amazon Transcribe
01 This is a speech recognition service. This service can be used to convert audio to text.
02 You can transcribe in real time or media files located in an Amazon S3 bucket.
Amazon Translate
01 This is a text translation service. You can translate your text documents to a target language.
02 You can also analyze text that comes up in news and social media feeds in different
languages.
Amazon Comprehend
02 You can get entities, key phrases, detect sentiments from underlying text.
03 This is available for content in real time, or you can create jobs for existing data sets.
Amazon Comprehend Medical
01 This is a special type of service when it comes to Amazon Comprehend. It can be used to detect
and get information when it comes to text used in the medical industry.
02 You can use it to detect certain aspects in text such as medications or Protected Health
Information (PHI).
Amazon SageMaker
01 This is a fully managed machine learning service. Here you can build and train machine learning models.
03 You can deploy your machine learning models in your production environments.
Amazon Forecast
01 This service uses statistical and machine learning algorithms to forecast time-series based data.
02 This can be used in multiple fields such as retail, finance, logistics and healthcare.
03 Here you import your existing data sets, train a predictor and generate the forecast.
Amazon Textract
01 This service can be used to detect typed and handwritten text in documents that include
financial reports.
AWS Well-Architected Framework
01 This can be used as a base for designing, building and maintaining solutions on the AWS Cloud.
02 Here you can make use of well-known practices for building reliable, secure, efficient,
cost-effective and sustainable solutions on the cloud.
Democratize advanced technologies.
Experiment more often.
Maximize utilization.
Summary
Points
AWS Storage Gateway
01 Amazon S3 File Gateway – This provides a file interface that allows you to work with
objects in Amazon S3.
02 The files can be accessed either via the SMB (Server Message Block) or NFS (Network
File System) protocol.
03 You can also make use of lifecycle policies in S3 to save on storage costs.
AWS Storage Gateway
01 Amazon FSx File Gateway – Here you get access to a file share via the SMB protocol.
02 This has Windows-native compatibility which includes support for Access Control Lists.
03 Your on-premises workloads can store files on the file share via the gateway.
AWS Storage Gateway
01 Volume Gateway – Here you get block storage volumes that can be accessed via the
iSCSI protocol.
02 Cached Volume Gateway – Here frequently accessed files are cached locally.
03 Stored Volume Gateway – Here all files are stored locally. And backups are taken on
AWS.
Amazon QuickSight
01 This is a business intelligence service. You can use this service to get insights into your
data.
03 This is a completely managed service wherein you don’t need to manage the
underlying infrastructure.
AWS Glue
Develop – You need to develop the producer and consumer of the applications that will read and write the streams of data.
Data Analytics – Here you can process and analyze your data using standard SQL.
Amazon Redshift
03 You also get a lot of compute power that can be used to perform analysis on the data.
Amazon EMR
01 Elastic Map Reduce – This is a managed platform that allows you to run big data
frameworks such as Apache Hadoop and Apache Spark.
02 Here you can create a cluster which is nothing but a collection of nodes, or Amazon EC2 Instances.
03 You can then process data via the use of the cluster.
Amazon MQ
03 With the ActiveMQ engine, you can have an active/standby broker working in two different Availability Zones.
AWS Transfer Family
01 This is a secure service that can be used to transfer files to AWS Storage services.
02 This service has support for the standard protocols that includes File Transfer Protocol.
03 This service can be used to transfer files in and out of Amazon S3 and Amazon Elastic File System.
AWS DataSync
02 It has support for services that include Amazon S3, Amazon EFS and Amazon FSx.
03 AWS DataSync can also help identify the data stores that need to be transferred.
AWS Backup