
CLOUD COMPUTING

PART A
1. List out the principles of cloud computing?
1. Abstraction
2. Automation
3. Elasticity

2. What are the types of Cloud Computing?

There are four types of cloud computing:
1. Public 2. Private 3. Community 4. Hybrid.

3. List out the disadvantages of Virtualization.

There are four main disadvantages:
1. No off-line work capability
2. High-speed LAN recommended
3. Limited number of supported operating systems
4. Imaging disadvantages apply to this technique
4. Define virtualization.
Virtualization is the "creation of a virtual (rather than actual) version of something, such as a server, a
desktop, a storage device, an operating system or network resources".
5. List out the major design goals of a cloud computing platform.
1. Scalability, efficiency, virtualization, and reliability.

6. Who is a third-party provider?


Service providers, integrators, vendors, telecommunications, and infrastructure support that
are external to the organization that operates the manufacturing system.

7. State the advantages of distributed computing

• Distributed applications
• Data sharing
• Resource sharing
• Low cost
• Reliability
• Scalability
8. Define OpenNebula
OpenNebula is a hyper-converged infrastructure platform for managing heterogeneous
distributed data center infrastructures.

9. List some cloud software environments

Amazon Web Services
Microsoft Azure
Google Cloud Platform

10. Write down some of the issues in cloud security


1. Data loss 2. Data Privacy 3. Misconfiguration 4. Unauthorized Access
PART B

1. Elucidate on public and hybrid clouds.

PUBLIC CLOUD:

The public cloud refers to the cloud computing model in which IT services are delivered via the
internet. As the most popular model of cloud computing services, the public cloud offers vast choices
in terms of solutions and computing resources to address the growing needs of organizations of all
sizes and verticals.

The defining features of a public cloud solution include:

• High elasticity and scalability
• A low-cost, subscription-based pricing tier
Services on the public cloud may be free, freemium, or subscription-based, wherein you’re charged
based on the computing resources you consume.
The computing functionality may range from common services—email, apps, and storage—to the
enterprise-grade OS platform or infrastructure environments used for software development and
testing.
The cloud vendor is responsible for developing, managing, and maintaining the pool of computing
resources shared between multiple tenants from across the network.
When to use the public cloud
The public cloud is most suitable for these types of environments:

• Predictable computing needs, such as communication services for a specific number of users
• Apps and services necessary to perform IT and business operations
• Additional resource requirements to address varying peak demands
• Software development and test environments
Advantages of public cloud
People appreciate these public cloud benefits:

• No CapEx. No investments required to deploy and maintain the IT infrastructure.
• Technical agility. High scalability and flexibility to meet unpredictable workload demands.
• Business focus. Reduced complexity and minimal demands on in-house IT expertise, as the cloud
vendor is responsible for infrastructure management.
• Affordability. Flexible pricing options based on different SLA offerings.
• Cost agility. Cost agility allows organizations to follow lean growth strategies and focus their
investments on innovation projects.

Drawbacks of public cloud


The public cloud does come with limitations:

• Lack of cost control. The total cost of ownership (TCO) can rise exponentially for large-scale
usage, specifically for midsize to large enterprises.
• Lack of security. The public cloud is the least secure by nature, so it isn't best for sensitive,
mission-critical IT workloads.
• Minimal technical control. Low visibility and control over the infrastructure may not meet your
compliance needs.

HYBRID CLOUD
The hybrid cloud is any cloud infrastructure environment that combines both public and private cloud
solutions.
The resources are typically orchestrated as an integrated infrastructure environment. Apps and data
workloads can share the resources between public and private cloud deployment based on
organizational business and technical policies around aspects like:

• Security
• Performance
• Scalability
• Cost
• Efficiency
This is a common example of hybrid cloud: Organizations can use private cloud environments for
their IT workloads and complement the infrastructure with public cloud resources to accommodate
occasional spikes in network traffic.
Or, perhaps you use the public cloud for workloads and data that aren’t sensitive, saving cost, but opt
for the private cloud for sensitive data.
As a result, access to additional computing capacity does not require the high CapEx of a private
cloud environment but is delivered as a short-term IT service via a public cloud solution. The
environment itself is seamlessly integrated to ensure optimum performance and scalability to
changing business needs.
When to use the hybrid cloud
Here’s who the hybrid cloud might suit best:

• Organizations serving multiple verticals facing different IT security, regulatory, and
performance requirements
• Optimizing cloud investments without compromising on the value that public or private
cloud technologies can deliver
• Improving security on existing cloud solutions, such as SaaS offerings that must be
delivered via secure private networks
• Strategically approaching cloud investments to continuously switch and trade off between
the best cloud service delivery models available in the market

Advantages of hybrid cloud

• Policy-driven option. Flexible policy-driven deployment to distribute workloads across
public and private infrastructure environments based on security, performance, and cost
requirements.
• Scale with security. Scalability of public cloud environments is achieved without
exposing sensitive IT workloads to the inherent security risks.
• Reliability. Distributing services across multiple data centers, some public, some private,
results in maximum reliability.
• Cost control. Sensitive IT workloads run on dedicated resources in private clouds while
regular workloads are spread across inexpensive public cloud infrastructure, keeping
costs under control.

Drawbacks of hybrid cloud


Common drawbacks of the hybrid cloud include:

• Price. Toggling between public and private can be hard to track, resulting in wasteful
spending.
• Management. Strong compatibility and integration are required between cloud
infrastructure spanning different locations and categories. This is a limitation with public
cloud deployments, for which organizations lack direct control over the infrastructure.
• Added complexity. Additional infrastructure complexity is introduced as organizations
operate and manage an evolving mix of private and public cloud architecture.

2. Write the benefits of cloud provisioning?

Cloud provisioning has numerous benefits for an organization that cannot be achieved by
traditional provisioning approaches.
1. Scalability: Under the conventional IT provisioning model, a company makes a huge
investment in its on-site infrastructure, which requires extensive planning and
forecasting of infrastructure needs. In the cloud provisioning model, however, cloud
resources can scale up and scale down based entirely on short-term
usage. This is how scalability helps organizations.
2. Speed: Speed is another benefit of cloud provisioning. An organization's developers
can schedule and provision resources themselves, which removes the need for an
administrator to provision and manage resources.
3. Cost Savings: Cost savings are another potential benefit of cloud provisioning. Traditional
technology can incur huge costs for organizations, while cloud providers allow
customers to pay only for what they consume (a brief sketch follows below). This is
another major reason why cloud provisioning is preferred.
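
To make the pay-per-use and on-demand aspects concrete, here is a minimal, hypothetical sketch using the
AWS SDK for Python (boto3). The region, AMI ID, and instance type are illustrative assumptions, not
values taken from this text.

import boto3

# Minimal sketch of self-service provisioning: create a small instance when demand appears.
ec2 = boto3.client("ec2", region_name="us-east-1")   # region is an illustrative assumption

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",                 # hypothetical AMI ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print("Provisioned:", instance_id)

# When the short-term demand has passed, release the capacity so billing stops (pay-per-use).
ec2.terminate_instances(InstanceIds=[instance_id])

Because the whole cycle is just an API call, developers can script or schedule it themselves, which is
the speed benefit described above.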

3. Draw the different implementation levels of virtualization.

There are five levels of virtualization that are most commonly used in the industry. These
are as follows:

Instruction Set Architecture Level (ISA)

In ISA-level virtualization, virtualization works through ISA emulation. This is helpful for running large
amounts of legacy code that was originally written for different hardware configurations.
This code can be run on the virtual machine through ISA emulation.
Binary code that might need additional layers to run can now run on an x86 machine or, with some
tweaking, even on x64 machines. ISA emulation makes this a hardware-agnostic virtual machine.
Basic emulation, though, requires an interpreter. This interpreter reads the guest code and
converts it into a hardware-readable format for processing.
Hardware Abstraction Level (HAL)

As the name suggests, this level helps perform virtualization at the hardware level. It uses a bare-metal
hypervisor for its functioning.
This level helps form the virtual machine and manages the hardware through virtualization.
It enables virtualization of each hardware component such as I/O devices, processors, memory, etc.
This way multiple users can use the same hardware with numerous instances of virtualization at the
same time.
IBM first implemented this on the IBM VM/370 in the early 1970s. This approach is widely used for cloud-based
infrastructure.
Thus, it is no surprise that currently, Xen hypervisors are using HAL to run Linux and other OS on
x86 based machines.

Operating System Level

At the operating system level, the virtualization model creates an abstract layer between the
applications and the OS.
It is like an isolated container on the physical server and operating system that utilizes hardware and
software. Each of these containers functions like servers.
When the number of users is high, and no one is willing to share hardware, this level of virtualization
comes in handy.
Here, every user gets their own virtual environment with dedicated virtual hardware resources. This
way, no conflicts arise.
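
As a small illustration of the container idea described above, the sketch below drives a Docker container
from Python. It assumes Docker is installed; the image name and command are placeholders, not part of the
original text.

import subprocess

# Each container behaves like an isolated server while sharing the host kernel,
# which is the essence of operating-system-level virtualization.
result = subprocess.run(
    ["docker", "run", "--rm", "alpine", "uname", "-a"],
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout)   # kernel details reported from inside the isolated container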

Library Level

OS system calls are lengthy and cumbersome, which is why applications often opt for APIs from user-
level libraries.
Most of the APIs provided by systems are rather well documented. Hence, library level virtualization
is preferred in such scenarios.
Library interfacing virtualization is made possible by API hooks. These API hooks control the
communication link from the system to the applications.
Some tools available today, such as vCUDA and WINE, have successfully demonstrated this
technique.

Application Level
Application-level virtualization comes in handy when you wish to virtualize only an application. It does
not virtualize an entire platform or environment.
On an operating system, such an application runs as a single process, which is why this is also known as
process-level virtualization.
It is generally used when running programs written in high-level languages. Here, the application runs
on top of the virtualization layer (the application-level virtual machine), which in turn runs as a process
in the operating system.
Programs written in high-level languages and compiled for an application-level virtual machine can
run fluently here.

4. Describe the layered cloud architecture.

Layered Architecture of Cloud

Application Layer

1. The application layer, which is at the top of the stack, is where the actual cloud apps are
located. Cloud applications, as opposed to traditional applications, can take advantage of
the automatic-scaling functionality to gain greater performance, availability, and lower
operational costs.
2. This layer consists of different Cloud Services which are used by cloud users. Users can
access these applications according to their needs. Applications are divided
into Execution layers and Application layers.
3. In order for an application to transfer data, the application layer determines whether
communication partners are available. Whether enough cloud resources are accessible
for the required communication is decided at the application layer. Applications must
cooperate in order to communicate, and an application layer is in charge of this.
4. The application layer, in particular, is responsible for handling application protocols that run
over IP, such as Telnet and FTP. Other examples of application-layer systems include web
browsers, SNMP, HTTP, and HTTPS, the secure version of HTTP.
Platform Layer
1. The operating system and application software make up this layer.
2. Users should be able to rely on the platform to provide scalability, dependability, and
security protection, giving them a space to create their apps, test operational processes,
and keep track of execution outcomes and performance. This layer serves as the
foundation on which SaaS applications are implemented.
3. The objective of this layer is to deploy applications directly on virtual machines.
4. Operating systems and application frameworks make up the platform layer, which is
built on top of the infrastructure layer. The platform layer’s goal is to lessen the
difficulty of deploying programs directly into VM containers.
5. By way of illustration, Google App Engine functions at the platform layer to provide
API support for implementing storage, databases, and business logic of ordinary web
apps.

Infrastructure Layer

1. It is a layer of virtualization where physical resources are divided into a collection of
virtual resources using virtualization technologies like Xen, KVM, and VMware.
2. This layer serves as the Central Hub of the Cloud Environment, where resources are
constantly added utilizing a variety of virtualization techniques.
3. It is the base upon which the platform layer is created, constructed using virtualized
network, storage, and computing resources, and it gives users the flexibility they want.
4. Automated resource provisioning is made possible by virtualization, which also
improves infrastructure management.
5. The infrastructure layer sometimes referred to as the virtualization layer, partitions the
physical resources using virtualization technologies like Xen, KVM, Hyper-V, and
VMware to create a pool of compute and storage resources.
6. The infrastructure layer is crucial to cloud computing since virtualization technologies
are the only ones that can provide many vital capabilities, like dynamic resource
assignment.
Datacenter Layer

• In a cloud environment, this layer is responsible for managing physical resources such as
servers, switches, routers, power supplies, and cooling systems.
• Providing end users with services requires all resources to be available and managed in
data centers.
• Physical servers connect through high-speed devices such as routers and switches to the
data center.
• In software application designs, the division of business logic from the persistent data it
manipulates is well-established. This is due to the fact that the same data cannot be
incorporated into a single application because it can be used in numerous ways to
support numerous use cases. The requirement for this data to become a service has
arisen with the introduction of microservices.
• A single database used by many microservices creates a very close coupling. As a result,
it is hard to deploy new or emerging services separately if such services need database
modifications that may have an impact on other services. A data layer containing many
databases, each serving a single microservice or perhaps a few closely related
microservices, is needed to break complex service interdependencies.

5. Explain the data flow of the word count problem using the MapReduce function.

Hadoop applications can be developed in programming languages like Python and C++. Hadoop MapReduce is a
software framework that makes it easy to write applications that process huge amounts of data.
The MapReduce word count job splits the input data into chunks, sorts the map outputs, and feeds them as
input to the reduce tasks. A file system stores the input and output of jobs. Scheduling tasks, monitoring
them, and re-executing failed tasks are the responsibility of the framework.

THE OVERALL MAPREDUCE WORD COUNT PROCESS


Explanation: Standard input and standard output (STDIN and STDOUT) are used to pass data between
the Map and Reduce code. In Python, sys.stdin is used to read the input and sys.stdout to print the
output.

Splitting: The splitting parameter can be anything: a comma, a space, a new line, or a semicolon.

Mapping: This is done as explained below.

Shuffle / Intermediate splitting: The process is usually parallelized across the cluster by key. The output
of the map phase goes into the Reduce phase, and all records with the same key are grouped together.

Reduce: This is done as explained below. Final result: all the data is combined to produce the
aggregated result.

The given input is converted into a string and then tokenized into words. The mapper appends the count 1
to each word, producing the mapper output shown above. The mapper receives the byte offset of each line
as its input key and the line itself as the value, and it emits key-value pairs of the form (word, 1).

The mapper output then enters the sorting and shuffling phase. Sorting on the keys brings all occurrences
of a key to one place, and a single word goes to a single reducer. The input to the reducer is key-value
pairs, and the reducer sums up all the values for each key.

That is, the reducer groups all the similar keys, and the output is the combined key-value pairs. The
reducer writes its result to a temporary path, from which the final result is collected. When we execute a
MapReduce job, the input and output paths should be created in HDFS, and the driver code imports the
required Hadoop classes for the word count. A job client is used for configuration (in the Java API, the
driver class typically extends Configured and implements Tool).
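
For illustration, here is a minimal Python sketch of the mapper and reducer in the Hadoop Streaming style
implied by the sys.stdin/sys.stdout description above. The file names mapper.py and reducer.py are
illustrative.

# mapper.py -- Map step: read lines from STDIN, tokenize into words, emit (word, 1) pairs.
import sys

for line in sys.stdin:
    for word in line.strip().split():
        print(f"{word}\t1")

# reducer.py -- Reduce step: the framework delivers the mapper output sorted by key,
# so all pairs for a given word arrive together and their counts can be summed.
import sys

current_word, current_count = None, 0
for line in sys.stdin:
    word, count = line.strip().split("\t", 1)
    if word == current_word:
        current_count += int(count)
    else:
        if current_word is not None:
            print(f"{current_word}\t{current_count}")
        current_word, current_count = word, int(count)
if current_word is not None:
    print(f"{current_word}\t{current_count}")

The same pair can be tested locally, without a cluster, with a pipeline such as:
cat input.txt | python mapper.py | sort | python reducer.py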

6. Describe the cloud security challenges?

Misconfiguration
Cloud computing is a popular way to access resources remotely and save on costs. However, cloud
security threats can arise if your cloud resources are not configured correctly. Misconfiguration is the
top cloud security challenge, so users must appropriately protect their data and applications in the
cloud. To avoid this security threat, users must ensure that their data is protected and applications are
configured correctly. This can be accomplished using a cloud storage service that offers security features
such as encryption or access control.
Additionally, implementing security measures such as authentication and password requirements can
help protect sensitive data in the cloud. By taking these steps, users can increase the security of their
cloud computing infrastructure and stay protected from cyber threats.
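
As a minimal sketch of guarding against this kind of misconfiguration (assuming AWS S3 and the boto3 SDK;
the bucket name is a hypothetical placeholder), default encryption and a public access block can be
applied to a storage bucket:

import boto3

s3 = boto3.client("s3")
bucket = "example-sensitive-data-bucket"   # hypothetical bucket name

# Block all forms of public access to the bucket.
s3.put_public_access_block(
    Bucket=bucket,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)

# Turn on server-side encryption by default for all new objects.
s3.put_bucket_encryption(
    Bucket=bucket,
    ServerSideEncryptionConfiguration={
        "Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]
    },
)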

Unauthorized Access
Unauthorized access to data is one of the top cloud security challenges businesses face. The cloud
provides a convenient way for businesses to store and access data, which can make data vulnerable to
cyber threats. Cloud security breaches can include unauthorized access to user data, theft of data, and
malware attacks. To protect their data from these types of threats, businesses must ensure that only
authorized users have access to it.

Another security feature businesses can implement is encrypting sensitive data in the cloud. It will help
ensure that only authorized users can access it. By implementing security measures such as
encryption and backup procedures, businesses can safeguard their data from unauthorized access and
ensure its integrity.

Hijacking of Accounts
Hijacking of user accounts is one of the major cloud security risks. Using cloud-based applications
and services can increase the risk of account hijacking. As a result, users must be vigilant about
protecting their passwords and other confidential information to stay secure in the cloud.

Users can protect themselves using strong passwords, security questions, and two-factor
authentication to access their accounts. They can also monitor their account activity and take steps to
protect themselves from unauthorized access or usage. This will help ensure that hackers cannot
access their data or hijack their accounts. Overall, staying vigilant about security and updating your
security measures are vital to cloud computing security.

Lack of Visibility
Cloud computing has made it easier for businesses to access and store their data online, but this
convenience comes with risks. As a result, companies need to protect their data from unauthorized
access and theft. But cloud computing also poses security threats due to its reliance on remote servers.
In order to ensure that their systems are accessible only to authorized sources, businesses must
implement security measures such as strong authentication, data loss prevention (DLP), data breach
detection, and data breach response.

With cloud computing, visibility is vital, and businesses must regularly audit security operations and
procedures to detect vulnerabilities and threats before they become a real problem. By taking the
necessary precautions and implementing security best practices, organizations can ensure that their
data remains secure in this cloud-based environment.

Data Privacy/Confidentiality
Data privacy and confidentiality are critical issues when it comes to cloud computing. With cloud
computing, businesses can access their data from anywhere in the world, which raises security
concerns. Companies don’t have control over who has access to their data, so they must ensure that
only authorized users can access it. Data breaches can happen when hackers gain access to company
data. In coming years, there will be even more data privacy and confidentiality issues due to the rise
of big data and the increased use of cloud computing in business.

Data privacy and confidentiality issues will continue to be an essential concern for businesses in the
years ahead as data-intensive applications continue to grow in popularity. Proper security measures and
data handling practices help a cloud-ready organization avoid data breach risks.

External Sharing of Data


External data sharing is one of the leading cloud security challenges businesses face. This issue arises
when data is shared with third-party providers who have not been vetted and approved by the
organization. As a result, external data sharing can lead to the loss of critical business information and
theft and fraud. To prevent these risks, companies must implement robust security measures, such as
encryption, and data management practices. In addition, it will help ensure that sensitive data remains
secure and confidential.

By implementing appropriate security measures, companies can protect their data from unauthorized
access and ensure its reliability and integrity. Overall, external data sharing is a severe cloud security
challenge that businesses must address to stay ahead of the competition.

Legal and Regulatory Compliance


A cloud is a powerful tool that can help organizations reduce costs and improve the efficiency of their
operations. However, cloud computing presents new security challenges that must be addressed to
protect data and ensure compliance with legal and regulatory requirements.

Organizations must ensure data security and comply with legal and regulatory requirements to ensure
the safety and integrity of their cloud-based systems. Cyber threats such as malware, data breach, and
phishing are just a few of the challenges organizations face when using cloud computing.

To combat these cybersecurity threats, it’s vital to perform regular security audits, maintain up-to-
date security configurations, implement robust authentication procedures, use strong passwords, use
multi-factor authentication methods, and regularly update software and operating systems. While
cloud computing can increase the risk of cyberattacks, organizations that are diligent about their
security posture can stay ahead of their competitors in this rapidly changing market.

Unsecure Third-party Resources


Third-party resources are applications, websites, and services outside the cloud provider’s control.
These resources may have security vulnerabilities, and unauthorized access to your data is possible.
Additionally, unsecured third-party resources may allow hackers to access your cloud data. These
vulnerabilities can put your security at risk. Therefore, it is essential to ensure that only trusted and
secure resources are used for cloud computing. This will help ensure that only authorized
individuals access data and reduce the risk of unauthorized data loss or breach.

Unsecured third-party resources can pose a cybersecurity threat, especially when interacting with
sensitive data in cloud storage accounts. Hackers can access these resources to gain access to your
cloud data and systems. Implementing strong security controls such as multi-factor authentication and
enforcing strict password policies can help safeguard against this risk. In addition, by restricting
access to only trusted resources, you can ensure that only authorized individuals access data and
reduce the risk of unauthorized data loss or breach.

PART C

1. Explain in detail the different types of cloud computing services.

There are the following three types of cloud service models -

1. Infrastructure as a Service (IaaS)
2. Platform as a Service (PaaS)
3. Software as a Service (SaaS)

Infrastructure as a Service (IaaS)

IaaS is also known as Hardware as a Service (HaaS). It is a computing infrastructure managed over
the internet. The main advantage of using IaaS is that it helps users to avoid the cost and complexity
of purchasing and managing the physical servers.

Characteristics of IaaS

There are the following characteristics of IaaS -


o Resources are available as a service
o Services are highly scalable
o Dynamic and flexible
o GUI and API-based access
o Automated administrative tasks

Example: DigitalOcean, Linode, Amazon Web Services (AWS), Microsoft Azure, Google Compute
Engine (GCE), Rackspace, and Cisco Metacloud.

Platform as a Service (PaaS)

A PaaS cloud computing platform is created for programmers to develop, test, run, and manage
applications.

Characteristics of PaaS

There are the following characteristics of PaaS -

o Accessible to various users via the same development application.
o Integrates with web services and databases.
o Builds on virtualization technology, so resources can easily be scaled up or down as per the
organization's need.
o Supports multiple languages and frameworks.
o Provides the ability to "Auto-scale".

Example: AWS Elastic Beanstalk, Windows Azure, Heroku, Force.com, Google App Engine,
Apache Stratos, Magento Commerce Cloud, and OpenShift.

Software as a Service (SaaS)

SaaS is also known as "on-demand software". It is a software in which the applications are hosted
by a cloud service provider. Users can access these applications with the help of internet connection
and web browser.

Characteristics of SaaS

There are the following characteristics of SaaS -

o Managed from a central location
o Hosted on a remote server
o Accessible over the internet
o Users are not responsible for hardware and software updates; updates are applied
automatically.
o The services are purchased on a pay-as-per-use basis

Example: BigCommerce, Google Apps, Salesforce, Dropbox, ZenDesk, Cisco WebEx,
Slack, and GoToMeeting.

2. Briefly explain the global exchange of cloud resources.

Cloud Exchange (CEx) serves as a market maker, bringing service providers and users together. The
University of Melbourne proposed it under Intercloud architecture (Cloudbus). It supports brokering
and exchanging cloud resources for scaling applications across multiple clouds. It aggregates the
infrastructure demands from application brokers and evaluates them against the available supply. It
supports the trading of cloud services based on competitive economic models such as commodity
markets and auctions.

GLOBAL EXCHANGE OF CLOUD RESOURCES

Entities of the Global exchange of cloud resources

Market directory

A market directory is an extensive database of resources, providers, and participants using the
resources. Participants can use the market directory to find providers or customers with suitable
offers.

Auctioneers
Auctioneers periodically clear bids and asks from market participants. Auctioneers sit between providers
and customers and grant the resources available in the Global exchange of cloud resources to the
highest bidding customer.
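
As a toy sketch (not part of the original Cloudbus description), the clearing step an auctioneer performs
might look like the following, where resources go to the highest-bidding consumers whose bids meet a
provider's asking price:

def clear_auction(bids, asks):
    """bids/asks: lists of (participant, price per unit). Returns matched (consumer, provider, price) tuples."""
    matches = []
    bids = sorted(bids, key=lambda b: b[1], reverse=True)   # highest bid first
    asks = sorted(asks, key=lambda a: a[1])                 # lowest ask first
    while bids and asks and bids[0][1] >= asks[0][1]:
        consumer, bid_price = bids.pop(0)
        provider, ask_price = asks.pop(0)
        matches.append((consumer, provider, bid_price))     # grant the resource to the highest bidder
    return matches

# Example: one broker bids above the cheapest provider's ask, so one match clears.
print(clear_auction(
    bids=[("app-broker-A", 0.12), ("app-broker-B", 0.08)],
    asks=[("provider-X", 0.10), ("provider-Y", 0.15)],
))
# -> [('app-broker-A', 'provider-X', 0.12)]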

Brokers

Brokers mediate between consumers and providers by buying capacity from providers and sub-leasing
it to consumers. They must select consumers whose apps will provide the most utility.
Brokers may also communicate with resource providers and other brokers to acquire or trade resource
shares. To make decisions, these brokers are equipped with a negotiating module informed by the
present conditions of the resources and the current demand.

Service-level agreements (SLAs)

The service level agreement (SLA) highlights the details of the service to be provided in terms of
metrics that have been agreed upon by all parties, as well as penalties for failing to meet
the expectations.

The consumer participates in the utility market via a resource management proxy that chooses a set of
brokers based on their offering. SLAs are formed between the consumer and the brokers, which bind
the latter to offer the guaranteed resources. After that, the customer either runs their environment on
the leased resources or uses the provider's interfaces to scale their applications.

Providers

A provider has a price-setting mechanism that determines the current price for their resource based on
market conditions, user demand, and the current degree of utilization of the resource.

Based on an initial estimate of utility, an admission-control mechanism at a provider's end selects the
auctions to participate in or to negotiate with the brokers.

Resource management system

The resource management system provides functionalities such as advance reservations that enable
guaranteed provisioning of resource capacity.

3. Review Amazon AWS and its services.

Most functionality

AWS has significantly more services, and more features within those services, than any other cloud
provider, from infrastructure technologies like compute, storage, and databases to emerging
technologies, such as machine learning and artificial intelligence, data lakes and analytics, and
Internet of Things. This makes it faster, easier, and more cost effective to move your existing
applications to the cloud and build nearly anything you can imagine.

AWS also has the deepest functionality within those services. For example, AWS offers the widest
variety of databases that are purpose-built for different types of applications so you can choose the
right tool for the job to get the best cost and performance.

Largest community of customers and partners

AWS has the largest and most dynamic community, with millions of active customers and tens of
thousands of partners globally. Customers across virtually every industry and of every size, including
startups, enterprises, and public sector organizations, are running every imaginable use case on AWS.
The AWS Partner Network (APN) includes thousands of systems integrators who specialize in AWS
services and tens of thousands of independent software vendors (ISVs) who adapt their technology to
work on AWS.

Most secure

AWS is architected to be the most flexible and secure cloud computing environment available today.
Our core infrastructure is built to satisfy the security requirements for the military, global banks, and
other high-sensitivity organizations. This is backed by a deep set of cloud security tools, with over
300 security, compliance, and governance services and features. AWS supports 98 security standards
and compliance certifications, and all 117 AWS services that store customer data offer the ability to
encrypt that data.

Fastest pace of innovation

With AWS, you can leverage the latest technologies to experiment and innovate more quickly. We
are continually accelerating our pace of innovation to invent entirely new technologies you can use to
transform your business. For example, in 2014, AWS pioneered the serverless computing space with
the launch of AWS Lambda, which lets developers run their code without provisioning or managing
servers. And AWS built Amazon SageMaker, a fully managed machine learning service that
empowers everyday developers and scientists to use machine learning, without any previous
experience.
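
For example, a minimal AWS Lambda function in Python is just a handler; the event fields below are
illustrative, and AWS runs the handler without the developer provisioning or managing any servers:

import json

def lambda_handler(event, context):
    # 'event' carries the invocation payload (for example, an API Gateway request).
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }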

Most proven operational expertise

AWS has unmatched experience, maturity, reliability, security, and performance that you can depend
upon for your most important applications. For over 16 years, AWS has been delivering cloud
services to millions of customers around the world running a wide variety of use cases. AWS has the
most operational experience, at greater scale, of any cloud provider.
4. Discuss cloud application security.

Cloud application security is a system of policies, processes, and controls that enable enterprises to
protect applications and data in collaborative cloud environments.

Cloud solutions are ubiquitous in modern enterprises. As a result, cloud security is now front and
center for optimizing enterprise security posture. A survey of over 650 cybersecurity professionals
reinforced this truth, indicating that 94% are moderately or extremely concerned about cloud security.
Here, we’ll take a closer look at cloud-native application security, common threats facing modern
enterprises, and best practices and tooling that can help mitigate risk and improve cloud security
posture.

The Need for Cloud Application Security

Modern enterprise workloads are spread across a wide variety of cloud platforms ranging from suites
of SaaS products like Google Workspaces and Microsoft 365 to custom cloud-native applications
running across multiple hyper-scale cloud service providers.

As a result, network perimeters are more dynamic than ever and critical data and workloads face
threats that simply didn’t exist a decade ago. Enterprises must be able to ensure workloads are
protected wherever they run. Additionally, cloud computing adds a new wrinkle to data sovereignty
and data governance that can complicate compliance.

Individual cloud service providers often offer security solutions for their platforms, but in a world
where multi-cloud is the norm — a Gartner survey indicated over 80% of public cloud users use
multiple providers — solutions that can protect an enterprise end-to-end across all platforms are
needed.

Cloud Application Security Threats

• Account hijacking: Weak passwords and data breaches often lead to legitimate accounts being
compromised. If an attacker compromises an account, they can gain access to sensitive data
and completely control cloud assets.

• Credential exposure: A corollary to account hijacking is credential exposure. As the
SolarWinds security breach demonstrated, exposing credentials in the cloud (GitHub in this
case) can lead to account hijacking and a wide range of sophisticated long-term attacks.

• Bots and automated attacks: Bots and malicious scanners are an unfortunate reality of
exposing any service to the Internet. As a result, any cloud service or web-facing application
must account for the threats posed by automated attacks.

• Insecure APIs: APIs are one of the most common mechanisms for sharing data, both
internally and externally, in modern cloud environments. However, because APIs are often
both feature- and data-rich, they are a popular attack surface for hackers.

• Oversharing of data: Cloud data storage makes it trivial to share data using URLs. This
greatly streamlines enterprise collaboration. However, it also increases the likelihood of assets
being accessed by unauthorized or malicious users.

• DoS attacks: Denial of Service (DoS) attacks against large enterprises have been a
cybersecurity threat for a long time. With so many modern organizations dependent on public
cloud services, attacks against cloud service providers can now have an exponential impact.

• Misconfiguration: One of the most common reasons for data breaches is misconfiguration.
The frequency of misconfiguration in the cloud is due in large part to the complexity involved
in configuration management (which leads to disjointed manual processes) and access control
across cloud providers.

• Phishing and social engineering: Phishing and social engineering attacks that exploit the
human side of enterprise security are one of the most frequently exploited attack vectors.

• Complexity and lack of visibility: Because many enterprise environments are multi-cloud, the
complexity of configuration management, granular monitoring across platforms, and access
control often lead to disjointed workflows that involve manual configuration and limit
visibility, which further exacerbates cloud security challenges.

Types Of Cloud Application Security Solutions

There is no shortage of security solutions designed to help enterprises mitigate cloud application
security threats. For example, cloud access security brokers (CASBs) act as a gatekeeper to cloud
services and enforce granular security policies. Similarly, web application firewalls (WAFs) and
runtime application self-protection (RASP) are used to protect web apps, APIs, and individual applications.

Additionally, many enterprises continue to leverage point appliances to implement firewalling,
IPS/IDS, URL filtering, and threat detection. However, these solutions aren't ideal for the modern
cloud-native infrastructure as they are inherently inflexible and tied to specific locations.

Web Application & API Protection (WAAP) has emerged as a more holistic and cloud-native
solution that combines — and enhances — the functionality of WAFs, RASP, and traditional point
solutions in a holistic multi-cloud platform. With WAAP, enterprises can automate and scale modern
application security in a way legacy tooling simply cannot.

Cloud Application Security Best Practices


Enterprises must take a holistic approach to improve their cloud security posture. There’s no one-
size-fits-all approach that will work for every organization, but there are several cloud application
security best practices that all enterprises can apply.

Here are some of the most important cloud app security best practices enterprises should consider:

• Leverage MFA: Multi-factor authentication (MFA) is one of the most effective mechanisms
for limiting the risk of account compromise.

• Account for the human aspect: User error is one of the most common causes of data breaches.
Taking a two-pronged approach of user education and implementing security tooling such as
URL filters, anti-malware, and intelligent firewalls can significantly reduce the risk of social
engineering leading to a catastrophic security issue.

• Automate everything: Enterprises should automate cloud application monitoring, incident
response, and configuration as much as possible. Manual workflows are error-prone and a
common cause of oversight or leaked data.

• Enforce the principle of least privilege: User accounts and applications should be configured
to only access the assets required for their business function. Security policies should enforce
the principle of least privilege across all cloud platforms. Leveraging enterprise identity
management solutions and SSO (single sign-on) can help enterprises scale this cloud
application security best practice (a minimal sketch follows after this list).

• Use holistic multi-cloud solutions: Modern enterprise infrastructure is complex, and
enterprises need complete visibility to ensure a strong security posture across all platforms.
This means it is essential to choose visibility and security tooling that isn't inherently tied to a
given location (e.g. point appliances) or cloud vendor.

• Don't depend on signature matching alone: Many threat detection engines and anti-malware
solutions depend on signature matching and basic business logic to detect malicious behavior.
While detecting known threats is useful, in practice depending only on basic signature
matching for threat detection is a recipe for false positives that can lead to alert fatigue and
unnecessarily slow down operations. Additionally, reliance on signature matching alone means
enterprises have little to no protection against zero-day threats that don't already have a
known signature. Security tooling that can analyze behavior in context, for example by using
an AI engine, can both reduce false positives and decrease the odds of a zero-day threat being
exploited.
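
As a minimal sketch of the least-privilege practice above (assuming AWS and the boto3 SDK; the bucket and
policy names are hypothetical placeholders), a narrowly scoped IAM policy might be created like this:

import json
import boto3

iam = boto3.client("iam")

# Grant read-only access to a single bucket instead of broad s3:* permissions.
least_privilege_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": [
            "arn:aws:s3:::example-reports-bucket",        # hypothetical bucket
            "arn:aws:s3:::example-reports-bucket/*",
        ],
    }],
}

iam.create_policy(
    PolicyName="ReportsReadOnly",                          # hypothetical policy name
    PolicyDocument=json.dumps(least_privilege_policy),
)

Accounts that only need to read from this bucket are then attached to this policy rather than to anything
broader.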
