Serverless Computing
1. INTRODUCTION
Serverless computing is an innovative cloud computing model that
fundamentally changes how applications are developed and deployed. Unlike
traditional server-based architectures, serverless computing abstracts the underlying
infrastructure management, allowing developers to focus exclusively on writing and
deploying code. This paradigm shift is enabled by cloud providers, who handle the
provisioning, scaling, and maintenance of servers on behalf of users. Building a
serverless application therefore frees developers from infrastructure concerns, since
no servers need to be managed directly.
In a serverless architecture, developers write functions that are executed in
response to specific events, such as HTTP requests, database changes, or file uploads.
These functions, often referred to as "Function-as-a-Service" (FaaS), run in stateless
compute containers that are ephemeral and fully managed by the cloud provider.
Serverless computing thus abstracts server management, addressing key issues
like scalability, cost efficiency, and development speed, making it a valuable model
for modern application development.
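The event-driven FaaS model described above can be sketched as a single function. The sketch below is a minimal, provider-agnostic illustration; the event shape loosely follows what an API gateway passes to an AWS Lambda handler, and the field names are assumptions for the example, not an exact provider schema.

```python
import json

def handler(event, context=None):
    """A minimal FaaS-style handler: receives an event dict, returns a response.

    In production, the platform invokes this function for us whenever the
    configured trigger (here, an HTTP request) fires.
    """
    name = event.get("queryStringParameters", {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Simulated invocation — locally we call the handler ourselves.
response = handler({"queryStringParameters": {"name": "Azure"}})
print(response["body"])  # {"message": "Hello, Azure!"}
```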
Some well-known serverless computing platforms include:
1) Microsoft Azure
Fig 1.1 Microsoft Azure
Azure is Microsoft's cloud platform. Like other serverless offerings, it bills only for
the time its services are actually used, and it is known for improving developer
productivity, letting teams focus on business goals, and supporting intelligent
applications. Azure provides robust solutions for enterprise integration with a strong
focus on hybrid cloud scenarios, enabling seamless integration between on-premises
and cloud environments.
P.R.M.I.T. & R, BADNERA // B.E (IT) / /2024-25 1
2) Google Cloud Platform
Fig 1.2 Google Cloud Platform
Google Cloud Platform is Google's suite of cloud computing services, including
serverless offerings, that runs on the same infrastructure Google uses internally for
its end-user products.
1.1 Motivation
Modern application development is increasingly complex, requiring efficient
and scalable solutions to manage infrastructure. Traditional server-based architectures
often burden developers with the responsibility of provisioning, maintaining, and
scaling servers, detracting from their core focus on writing and deploying code. This
challenge is compounded by fluctuating workloads and the need for rapid
deployment, which can lead to over-provisioning and increased costs.
Serverless computing addresses these issues by abstracting infrastructure
management, allowing developers to concentrate solely on their application logic. In
this architecture, cloud providers handle the provisioning, scaling, and maintenance of
servers, enabling automatic scaling based on demand and ensuring optimal
performance without manual intervention. This approach not only reduces operational
overhead but also offers cost efficiency through a pay-as-you-go model, where users
are billed only for the actual compute time consumed by their functions.
Serverless computing supports the rapid development and deployment of
applications, making it an attractive option for organizations aiming to innovate
quickly and efficiently in a competitive landscape.
1.2 Objectives
The primary objective of serverless computing is to streamline the
development and deployment of applications by abstracting infrastructure
management. This enables developers to focus on writing and deploying code without
the burden of provisioning, maintaining, and scaling servers.
Serverless computing aims to provide a cost-efficient solution through a pay-
as-you-go model, where users are billed only for the actual compute time their
functions consume. This eliminates the need to pay for reserved server capacity,
leading to significant cost savings, especially for applications with variable
workloads. By reducing operational overhead and simplifying the development
process, serverless computing promotes agility and rapid iteration. This enables
organizations to innovate quickly, delivering high-quality applications to market
faster and more efficiently.
2. LITERATURE SURVEY
This chapter reports the survey of the research papers referred to during the
seminar preparation work.
In a paper by K. Luo, T. Ouyang, Z. Zhou, and X. Chen entitled "Behavior
Tree-based Workflow Modeling and Scheduling for Serverless Edge Computing," the
challenges and solutions in the field of Serverless computing, specifically focusing on
Serverless workflows and their orchestration, are discussed. Despite the growing
popularity of Serverless computing, there has been insufficient attention given to the
orchestration of Serverless functions, particularly in the context of Serverless edge
computing. The paper identifies the difficulties associated with deploying state-of-
the-art cloud-oriented Serverless workflow scheduling on resource-constrained edge
devices. The authors propose modeling Serverless workflows using behavior trees as
a solution to these challenges. They present key observations and preliminary results
demonstrating the effectiveness of behavior tree-based Serverless workflow
scheduling.
In a paper by C. Cicconetti et al. titled "A Prototype for QKD-secure
Serverless Computing with ETSI MEC," the implementation of a secure edge
computing network using Quantum Key Distribution (QKD) and the Function-as-a-
Service (FaaS) paradigm is explored. This paper focuses on a hospital use case where
digital health applications invoke remote functions provided by an Apache
OpenWhisk cluster deployed in the edge infrastructure. The prototype showcases an
edge computing network where both the client and edge domains host simulated QKD
devices. This setup aims to enhance security by encrypting the arguments and return
values of the invoked functions using keys generated through a simulated QKD point-
to-point network. The paper highlights the use of standard interfaces defined by the
ETSI MEC (Multi-access Edge Computing) and QKD industry study groups to handle
all interactions in the control and management planes.
In a paper by A. Kumar, R. Gupta, and R. Bhandari titled "WoS Bibliometric-
based Review on Serverless Computing Model," the authors provide a comprehensive
bibliometric analysis of the Serverless computing paradigm. The Serverless
architecture, although named so, still involves servers, but their management—
provisioning, upkeep, and scaling—is handled entirely by cloud providers.
Developers focus solely on deploying their code, often in containers, without dealing
with the underlying infrastructure. The paper utilizes a bibliometric approach to
analyze research on Serverless computing over the past six years (2017-2022),
leveraging the Web of Science (WoS) database. By employing the "bibliometrix"
library in RStudio, the authors conduct a detailed analysis of various factors,
including publication sources, citation counts, technology adoption, research impact,
and emerging research gaps. They also assess the methodological quality of their
review using AMSTAR and PRISMA checklists, ensuring a robust and systematic
approach.
In the paper by K. Govindarajan and A. D. Tienne, titled "Resource
Management in Serverless Computing - Review, Research Challenges, and
Prospects," the authors address the evolving trend of Serverless computing and the
associated challenges in resource management. Serverless computing has gained
prominence due to its ability to abstract infrastructure management away from users,
allowing them to focus solely on developing cloud applications composed of multiple
cloud functions. This shift relieves users from the complexities of infrastructure
management but introduces its own set of challenges, particularly in resource
management. The paper provides a comprehensive literature review on Serverless
computing, exploring various open-source frameworks that support this paradigm. It
delves into the complexities inherent in managing Serverless platforms and the impact
on Quality of Service (QoS). The authors emphasize that despite the infrastructure
being managed by cloud providers, effective resource management remains crucial to
meet both user and consumer service requirements.
In the paper by J. Cho and Y. Kim entitled "A Design of Serverless
Computing Service for Edge Clouds," the focus is on optimizing Serverless
computing for edge cloud environments. The paper highlights the growing interest in
Serverless computing at the edge due to its efficient resource utilization, and proposes
a design to enhance its performance in edge cloud scenarios. The authors address the
limitations of centralized architectures in Serverless computing at the edge. In such
architectures, the control plane cluster is responsible for monitoring resource status
and redeploying functions based on the conditions of various edge devices. However,
this process can be time-consuming, particularly if an edge device encounters issues
or anomalies, leading to delays in function deployment and execution. To overcome
these challenges, the paper introduces an innovative approach involving cross-
monitoring. This technique involves importing monitoring metrics directly from
nearby edge devices, enabling real-time assessment and response.
In the paper by Y. Li, J. Liu, B. Jiang, C. Yang, and Q. Wang titled "Cost
Minimization in Serverless Computing with Energy Harvesting SECs," the authors
address the challenges faced by Multi-access Edge Computing (MEC) due to an
increasing number of Mobile Users (MUs) and propose a novel solution to mitigate
these issues. As MEC systems struggle with resource limitations, Serverless Edge
Computing (SEC) emerges as a promising approach to alleviate these constraints.
However, existing research predominantly focuses on the operational aspects of SEC
servers while neglecting the interaction between SEC and MUs. To bridge this gap,
the authors introduce a Stackelberg game model aimed at maximizing the utility for
each MU. They propose an iterative algorithm to optimize the interactions between
SEC and MUs, considering the impact of the function resource pool and the use of
renewable energy. The proposed scheme involves downloading functions from the
cloud when they are not stored locally in the SEC, which incurs additional costs.
Conversely, using harvested renewable energy reduces the SEC's operational costs
compared to purchasing energy from the grid.
In the paper by T. P. Bac, M. N. Tran, and Y. Kim titled "Serverless
Computing Approach for Deploying Machine Learning Applications in Edge Layer,"
the authors address the integration of Serverless computing with machine learning
applications, focusing on the edge computing environment. Serverless computing,
known for its stateless model and cost efficiency, has demonstrated significant
advantages for event-driven applications in the cloud, including AI and machine
learning. The paper highlights that while Serverless computing simplifies the
management of machine learning systems, there are notable limitations when applied
to cloud environments, such as latency and data privacy concerns. To overcome these
challenges, the authors propose an architecture for deploying machine learning
workloads as Serverless functions specifically in edge environments, where
computing nodes are located closer to the end-users. The proposed architecture aims
to mitigate the issues associated with cloud-based Serverless AI applications by
leveraging local distributed edge computing nodes.
In the paper by R. Veuvolu, A. Suryadevar, T. Vignesh, and N. R. Avthu
entitled "Cloud Computing Based (Serverless Computing) using Serverless
Architecture for Dynamic Web Hosting and Cost Optimization," the focus is on
leveraging Serverless architecture for optimizing dynamic web hosting and cost
management. The authors discuss the Serverless computing model, which abstracts
the complexities of server management from developers. Instead of handling servers
and computing resources directly, cloud providers manage these aspects
automatically. This model eliminates the need for virtual machines or physical
servers, with cloud providers taking responsibility for provisioning, maintaining, and
scaling the infrastructure. The paper highlights the key benefits of Serverless
architecture, including the ability to launch applications only as needed, which leads
to cost savings by avoiding the expenses associated with constantly running servers.
3. SYSTEM ANALYSIS AND DESIGN
In this chapter, the system analysis and design of Serverless Computing are
described.
3.1 ANALYSIS:
Introduction to Serverless Computing:
Serverless computing is a cloud-computing execution model where the cloud provider
dynamically manages the allocation and provisioning of servers. Applications are
broken down into individual functions that can be executed in response to events or
triggers. This model allows developers to focus on writing code without worrying
about server management.
3.1.1. Problem Definition
Problems Addressed by Serverless Computing:
1) Infrastructure Management:
Traditional infrastructure management requires significant resources and expertise.
Serverless computing eliminates the need for manual server management, allowing
developers to focus on code rather than infrastructure.
2) Scalability Issues:
Scaling applications to handle varying loads is challenging and often inefficient.
Serverless platforms automatically scale up or down based on demand, optimizing
resource usage and cost.
3) Cost Management:
Traditional servers incur costs continuously, even during idle times, leading to
inefficiencies. In contrast, serverless computing charges only for actual execution
time, resulting in significant cost savings.
4) Development Speed:
Setting up and managing servers slows down the development process. Serverless
platforms speed up development by abstracting the infrastructure layer.
Specific Problems in Traditional Computing:
1) Over-provisioning and under-utilization:
Allocating more resources than needed results in unnecessary expenses and
inefficiency.
2) Maintenance overhead:
Regular updates, patches, and maintenance demand constant attention, increasing the
operational burden.
3) Deployment complexity:
Managing different environments (development, testing, production) can be
cumbersome.
4) Latency issues:
Inefficient resource management can lead to latency in processing requests.
3.1.2 Requirement Analysis
Functional Requirements:
1) Event-driven Execution:
The system must support triggering functions in response to events such as HTTP
requests, database changes, file uploads, etc.
2) Automatic Scaling:
Functions must automatically scale based on the number of incoming requests,
without manual intervention.
3) Resource Allocation:
Dynamically allocate resources to functions based on demand, ensuring efficient use
of resources.
4) Support for Multiple Languages:
The platform should support various programming languages to accommodate diverse
development teams.
5) Integrated Development Environment:
Provide tools and interfaces for coding, testing, and deploying functions directly from
the platform.
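Requirement 1 above (event-driven execution) can be illustrated with a small dispatcher that routes each incoming event to the function registered for its source. The event types and handler names below are hypothetical, chosen only for the sketch.

```python
def process_http(event):
    """Hypothetical handler for HTTP-triggered events."""
    return {"status": 200, "body": "handled HTTP request"}

def process_upload(event):
    """Hypothetical handler for file-upload events."""
    return {"status": 200, "body": f"processed file {event['key']}"}

# A toy trigger configuration: event type -> registered function.
ROUTES = {
    "http": process_http,
    "file_upload": process_upload,
}

def dispatch(event):
    """Route an incoming event to the function registered for its type."""
    fn = ROUTES.get(event["type"])
    if fn is None:
        return {"status": 400, "body": f"no handler for {event['type']}"}
    return fn(event)

print(dispatch({"type": "file_upload", "key": "report.pdf"})["body"])
# processed file report.pdf
```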
Technical Requirements:
1) API Gateway:
Deploy a robust API gateway to efficiently manage and route incoming requests to
the appropriate serverless functions. This ensures seamless communication between
clients and backend services, enhancing application performance.
2) Logging and Monitoring:
Implement comprehensive logging and monitoring tools to track function
performance, identify errors, and analyze usage patterns. These tools provide valuable
insights for optimizing application efficiency and reliability.
3) Versioning and Rollback:
Support for versioning functions and the ability to rollback to previous versions if
necessary.
4) Integration with other Services:
Seamless integration with other cloud services, such as databases, storage, and
messaging services.
5) Developer Tools:
Provide SDKs, CLIs, and other tools to facilitate development, testing, and
deployment of serverless functions.
3.2 SYSTEM DESIGN:
3.2.1 Architectural Diagram:
Fig 3.1 Architectural Diagram
1) Web Application:
A web application is a software program that runs on a web server and can be accessed
through a web browser over the Internet. It combines front-end technologies like
HTML, CSS, and JavaScript with back-end technologies such as databases and
server-side programming to deliver interactive and dynamic user experiences. Web
applications are widely used for a range of purposes, including e-commerce, social
networking, content management, and more.
2) API Gateway:
The API Gateway acts as the entry point for all client requests, managing routing,
authorization, throttling, and monitoring. It decouples client interactions from
backend logic, simplifying scalability and management.
3) Authorizer:
The Authorizer handles authentication and authorization, ensuring only authenticated
users with proper permissions can access resources. It integrates with identity
providers like AWS Cognito, Auth0, or custom services to validate tokens or
credentials.
4) Content Service:
The Content Service manages static and dynamic content operations, such as retrieval,
creation, updating, or deletion. It uses serverless functions (e.g., AWS Lambda) and
storage solutions like S3 or databases for handling content.
5) User Service:
The User Service handles user-related operations, including registration, login, profile
updates, and data retrieval. It ensures secure handling of user information, interacts
with the Authorizer, and often uses databases like DynamoDB or Firestore for data
storage.
6) External API:
The External API represents third-party services that the application relies on, such as
payment processing or geolocation. Serverless functions interact with these external
APIs, allowing the application to leverage additional functionalities without managing
infrastructure.
7) External Database
The External Database stores persistent data, such as user information and content
data, using scalable databases like AWS DynamoDB, Google Firestore, or Azure
Cosmos DB. These databases automatically handle high availability, backup, and
scaling.
3.2.2 Components of Serverless Computing:
API Gateway:
Manages and routes incoming HTTP requests to the appropriate serverless functions.
It handles authentication, request transformation, and response formatting.
Fig 3.2 Components of Serverless Computing
Function as a Service (FaaS):
Function as a Service (FaaS) allows developers to deploy individual functions that
execute in response to specific events. It abstracts the underlying infrastructure,
automatically managing scaling, execution, and resource allocation based on demand,
which simplifies deployment and reduces operational overhead.
Backend as a Service (BaaS):
Backend as a Service (BaaS) provides ready-to-use backend functionalities such as
databases, authentication, and file storage. It streamlines backend development by
offering pre-built services and integrations, allowing developers to focus on frontend
development and business logic without managing backend infrastructure.
4. SYSTEM IMPLEMENTATION
In this chapter, the system implementation of the seminar topic Serverless
Computing is described.
4.1 System Implementation:
Fig 4.1 Serverless Architecture
1) Authentication Service:
Function:
Authentication services handle user identity verification by validating credentials and
issuing authentication tokens. They also manage access control by enforcing
permissions and ensuring that users can only access authorized resources.
Components:
Identity Providers: Services like AWS Cognito, Google Identity Platform, or Azure
Active Directory handle user sign-ups, sign-ins, and federated identity management.
Authentication Methods: Supports various methods such as username/password,
multi-factor authentication (MFA), and social logins (e.g., Google, Facebook).
Authorization: Access control systems define and enforce permissions, ensuring
users can only access resources they are authorized to use.
How It Works:
Users authenticate via the authentication service, which verifies their credentials and
issues tokens (e.g., JWTs). These tokens are then used to securely access serverless
functions and other resources.
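The token flow above can be sketched with a deliberately simplified stand-in for JWTs: the payload is base64-encoded and HMAC-signed, and a function verifies the signature before trusting the claims. Real identity providers mint standardized JWTs, often with asymmetric keys; everything below is illustrative only.

```python
import base64, hmac, hashlib, json

SECRET = b"demo-secret"  # in practice, keys come from the identity provider

def sign(payload: dict) -> str:
    """Issue a toy token: base64(payload) + HMAC-SHA256 signature.
    A simplified stand-in for the JWTs an identity provider would mint."""
    body = base64.urlsafe_b64encode(json.dumps(payload).encode()).decode()
    sig = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    return f"{body}.{sig}"

def verify(token: str):
    """Return the payload if the signature checks out, else None."""
    body, sig = token.rsplit(".", 1)
    expected = hmac.new(SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None  # tampered or forged token — deny access
    return json.loads(base64.urlsafe_b64decode(body))

token = sign({"sub": "user-42", "role": "admin"})
print(verify(token))        # {'sub': 'user-42', 'role': 'admin'}
print(verify(token + "0"))  # None (invalid signature rejected)
```

A serverless function would run `verify` (or delegate to the API Gateway's authorizer) before executing any business logic.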
2. Database:
Function:
Provides data storage and retrieval capabilities.
Types:
Relational Databases: Services like Amazon RDS, Google Cloud SQL, and Azure
SQL Database manage structured data with support for SQL queries and ACID
transactions.
NoSQL Databases: Services like Amazon DynamoDB, Google Cloud Firestore, and
Azure Cosmos DB handle unstructured or semi-structured data, offering flexible
schemas and high scalability.
How It Works:
Data Storage: Stores data persistently and allows serverless functions to interact with
it for reading and writing operations.
Scaling: Databases automatically scale based on the workload, handling varying
amounts of data and user requests.
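The read/write interaction can be sketched using Python's built-in sqlite3 as a stand-in for a managed database; a real serverless function would call the provider's SDK (for example, boto3 for DynamoDB) instead. The table and event fields are invented for the example.

```python
import sqlite3

# sqlite3 stands in for a managed database (e.g. DynamoDB, Cloud SQL).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id TEXT PRIMARY KEY, name TEXT)")

def save_user(event):
    """Persist state externally — the function itself stays stateless
    between invocations."""
    conn.execute("INSERT OR REPLACE INTO users VALUES (?, ?)",
                 (event["id"], event["name"]))
    conn.commit()
    return {"saved": event["id"]}

def get_user(event):
    row = conn.execute("SELECT name FROM users WHERE id = ?",
                       (event["id"],)).fetchone()
    return {"name": row[0]} if row else {"error": "not found"}

save_user({"id": "u1", "name": "Asha"})
print(get_user({"id": "u1"}))  # {'name': 'Asha'}
```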
3. API:
Function:
Facilitates communication between different components and external systems.
Components:
API Gateway: Manages and routes API requests to appropriate serverless functions
or microservices. Handles tasks like request validation, throttling, and authentication.
API Endpoints: Define the various operations (e.g., GET, POST, PUT) that the
serverless functions or services expose to clients.
How It Works:
Clients send requests to the API Gateway, which forwards them to the serverless
functions. The functions process the requests and return responses via the API
Gateway.
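The request flow described above — the gateway matches a route, forwards to a function, and wraps the result — can be sketched as follows. The route table and handler names are hypothetical.

```python
def list_items(request):
    """Hypothetical backend function for GET /items."""
    return ["pen", "book"]

def create_item(request):
    """Hypothetical backend function for POST /items."""
    return {"created": request["body"]}

# A toy API gateway route table: (method, path) -> serverless function.
ROUTES = {
    ("GET", "/items"): list_items,
    ("POST", "/items"): create_item,
}

def gateway(request):
    """Match the route, invoke the function, wrap the result."""
    fn = ROUTES.get((request["method"], request["path"]))
    if fn is None:
        return {"status": 404, "body": None}
    return {"status": 200, "body": fn(request)}

print(gateway({"method": "GET", "path": "/items"}))
# {'status': 200, 'body': ['pen', 'book']}
```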
4. File Storage:
Function:
File storage services offer scalable solutions for managing large amounts of
unstructured data, such as documents, images, and videos.
Components:
Object Storage: Services like Amazon S3, Google Cloud Storage, and Azure Blob
Storage store and manage large amounts of unstructured data (e.g., images, videos,
backups).
File Systems: Managed file systems like Amazon EFS or Azure Files provide shared
file storage for applications that require file-based access.
How It Works:
File Upload/Download: Serverless functions can upload files to or download files
from the storage service, with automatic scaling to handle varying storage needs.
Access Control: Ensures secure access to files through authentication and
authorization mechanisms.
5. Reporting:
Function:
Generates and visualizes reports based on data analysis.
Components:
Data Analytics: Services like AWS QuickSight, Google Data Studio, and Azure
Power BI analyze and visualize data to create actionable insights and reports.
Event-Driven Reporting: Serverless functions can trigger reporting tasks based on
specific events (e.g., new data entry) and automate report generation.
How It Works:
Data Integration: Aggregates and processes data from various sources (e.g.,
databases, APIs) to generate reports.
Visualization: Presents data in graphical formats such as charts, graphs, and tables
for better understanding and decision-making.
4.2 System Execution Detail:
1. Event Triggering
Event Sources:
HTTP Requests: Triggered via API Gateway when users make HTTP requests to
defined endpoints.
Database Changes: Changes in a database (e.g., new records, updates) can invoke
functions through database triggers.
File Uploads: Uploads to storage services (e.g., S3, Google Cloud Storage) can
trigger functions to process files.
Messaging Queues: Messages in queues (e.g., AWS SQS, Google Cloud Pub/Sub)
can activate functions to handle message processing.
Example:
A user uploads an image to Amazon S3, which triggers an image processing function
via S3 event notifications.
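The S3 upload example can be sketched as a handler that unpacks the notification payload. The nested structure below follows the general shape of S3 event notifications, simplified for illustration.

```python
# A simplified S3-style event notification (not the full AWS schema).
s3_event = {
    "Records": [
        {"s3": {"bucket": {"name": "photo-bucket"},
                "object": {"key": "uploads/cat.jpg"}}}
    ]
}

def on_upload(event):
    """Invoked once per notification; extracts which objects to process."""
    results = []
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        results.append(f"processing s3://{bucket}/{key}")
    return results

print(on_upload(s3_event))
# ['processing s3://photo-bucket/uploads/cat.jpg']
```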
2. Function Invocation
Invocation Process:
Event Reception: The event source sends the event data to the API Gateway or
directly to the serverless platform.
Function Execution: The serverless platform receives the event, allocates the
necessary resources, and invokes the corresponding serverless function.
Execution Environment:
Isolation: Functions run in isolated environments (e.g., containers, VMs) to ensure
security and independence.
Statelessness: Functions are stateless and do not retain any information between
executions. They rely on external services for state management if needed.
Example:
Upon receiving an image upload event, the function processes the image and
generates a thumbnail.
3. Data Processing
Interaction with BaaS:
Database Access: Functions may read from or write to databases (e.g., DynamoDB,
Cloud Firestore) as part of their processing logic.
Authentication: Functions may interact with authentication services (e.g., AWS
Cognito) to verify user identity and access permissions.
External Services:
APIs: Functions can call external APIs or services to perform additional tasks, such
as sending notifications or integrating with third-party systems.
Example:
A function processes data from a message queue, interacts with a database to update
records, and calls an external API to send notifications.
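The example above can be sketched end to end with in-memory stand-ins for the database and the external notification API; a real function would use the provider's SDKs for these interactions.

```python
# In-memory stand-ins for a managed database and an external API.
database = {"order-7": {"status": "pending"}}
notifications = []

def notify(message):
    """Stand-in for calling an external notification API."""
    notifications.append(message)

def process_message(message):
    """Consume one queue message: update a record, then notify."""
    order = database[message["order_id"]]
    order["status"] = message["new_status"]  # update the record
    notify(f"order {message['order_id']} is now {order['status']}")
    return order

process_message({"order_id": "order-7", "new_status": "shipped"})
print(database["order-7"])  # {'status': 'shipped'}
print(notifications)        # ['order order-7 is now shipped']
```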
4. Result Handling
Result Processing:
Data Formatting: Functions may format the output data as needed (e.g., JSON,
XML) before sending it back to the client or triggering service.
Error Handling: Functions are designed with error-handling logic to manage
exceptions and unexpected conditions. This includes mechanisms for retrying
operations, logging errors, and providing meaningful error responses.
API Gateway Integration:
Response Management: The API Gateway formats and returns the response to the
client or initiates further processing based on the function’s output.
Example:
After processing the image, the serverless function stores the processed image in a
database or file storage service. It then generates a URL pointing to the processed
image. This URL is returned to the client via the API Gateway. The API Gateway
handles the response and communicates it back to the client.
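The result-handling steps can be sketched as a handler that wraps its output (or a meaningful error) in an API-Gateway-style response envelope. The URL scheme below is hypothetical.

```python
import json

def make_response(status, payload):
    """Wrap a function's output in the HTTP-style envelope the
    API Gateway returns to the client."""
    return {
        "statusCode": status,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(payload),
    }

def thumbnail_handler(event):
    try:
        key = event["key"]  # may raise KeyError on a malformed event
        url = f"https://cdn.example.com/thumbs/{key}"  # hypothetical URL
        return make_response(200, {"thumbnail_url": url})
    except KeyError:
        # Meaningful error response instead of an unhandled crash.
        return make_response(400, {"error": "missing 'key' in event"})

print(thumbnail_handler({"key": "cat.jpg"})["statusCode"])  # 200
print(thumbnail_handler({})["statusCode"])                  # 400
```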
5. Scaling and Resource Management
Automatic Scaling:
On-Demand Execution: Serverless platforms automatically scale resources based on
the number of incoming events. Functions are executed in parallel to handle high
loads.
Resource Allocation: The platform manages resource allocation and deallocation
dynamically, ensuring efficient utilization.
Cost Efficiency:
Pay-as-You-Go: You are charged based on the actual execution time and resources
used, rather than provisioning and paying for fixed resources.
Example:
During peak usage, the serverless platform scales up the number of function instances
to handle increased event traffic.
6. Monitoring and Logging
Monitoring:
Performance Metrics: Track key performance metrics such as execution time, error
rates, and invocation counts.
Alerts: Set up alerts for performance issues or abnormal activity to ensure timely
responses.
Logging:
Execution Logs: Capture logs related to function execution, including input
parameters, processing details, and error messages.
Centralized Logging: Use centralized logging services (e.g., AWS CloudWatch,
Google Stackdriver) to aggregate and analyze logs.
Example:
Monitoring tools provide insights into function performance and error rates, helping
identify and address issues proactively.
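A function can emit the execution logs and duration metric described above using ordinary structured logging; centralized services such as CloudWatch then aggregate these lines. The logger name and fields are illustrative.

```python
import logging, time

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
log = logging.getLogger("orders-fn")  # hypothetical function name

def order_handler(event):
    start = time.perf_counter()
    # Log input parameters at invocation start.
    log.info("invocation start: order_id=%s", event["order_id"])
    result = {"processed": event["order_id"]}
    elapsed_ms = (time.perf_counter() - start) * 1000
    # Execution time is a key metric centralized services aggregate.
    log.info("invocation end: duration_ms=%.3f", elapsed_ms)
    return result

print(order_handler({"order_id": "A-9"}))  # {'processed': 'A-9'}
```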
5. CONCLUSION
In this report, I have explored the transformative potential of serverless
computing. Serverless computing allows developers to build and deploy applications
without managing infrastructure, resulting in scalable, cost-efficient solutions. Key
aspects covered include the event-driven nature of serverless functions, their
integration with Backend as a Service (BaaS) components, and the benefits of
automatic scaling and pay-as-you-go pricing models.
I have also examined the merits of serverless computing, such as reduced
operational costs and simplified deployment, alongside its limitations, including
potential cold start latency and challenges in state management. Future enhancements
in serverless technology may focus on optimizing execution speed, improving state
management, and expanding integration capabilities, which could further enhance its
efficiency and applicability across various domains.
6. ADVANTAGES
In this chapter, the advantages of the seminar topic Serverless Computing are
described.
Advantages of Serverless Computing:
1) Cost Efficiency:
Pay-as-You-Go: Charges are based on actual execution time, eliminating costs for idle
resources.
2) Automatic Scaling:
Elasticity: Scales resources automatically in response to the number of incoming
events or workload demands.
3) Simplified Deployment:
No Server Management: Developers focus on code rather than infrastructure,
streamlining deployment and updates.
4) Enhanced Developer Productivity:
Focus on Application Logic: Developers can concentrate on writing business logic
instead of managing servers.
5) Improved Reliability:
Built-In Fault Tolerance: Platforms provide high availability and automatic fault
tolerance.
6) Integration with Cloud Services:
Seamless Integration: Easy integration with Backend as a Service (BaaS) components
and other cloud services.
7) Increased Flexibility:
Modular Architecture: Allows for independent scaling and development of functions,
adapting easily to changes.
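The pay-as-you-go advantage (point 1 above) can be made concrete with some billing arithmetic. The per-GB-second and per-request rates below are assumptions for the example, not any provider's actual prices.

```python
# Illustrative serverless billing rates (assumed for the example).
PRICE_PER_GB_SECOND = 0.0000166667
PRICE_PER_REQUEST = 0.0000002

def monthly_cost(invocations, avg_duration_s, memory_gb):
    """Cost = compute (GB-seconds) + request charges; no idle-time cost."""
    gb_seconds = invocations * avg_duration_s * memory_gb
    return gb_seconds * PRICE_PER_GB_SECOND + invocations * PRICE_PER_REQUEST

# 1M invocations/month, 200 ms each, 512 MB of memory:
cost = monthly_cost(1_000_000, 0.2, 0.5)
print(f"${cost:.2f}/month")  # $1.87/month
```

Under these assumed rates, a million short invocations cost a few dollars a month, versus paying continuously for an always-on server.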
DISADVANTAGES
In this chapter, the disadvantages of the seminar topic Serverless Computing are
described.
Disadvantages of Serverless Computing:
1) Startup Delay:
Functions can experience latency during initial execution after being idle, affecting
performance.
2) Limited Execution Time:
Functions often have maximum execution time limits, which can be restrictive for
long-running tasks.
3) Resource Limitations:
Functions may have limitations on memory, execution time, and concurrency,
impacting performance for certain applications.
4) Complex Debugging:
Troubleshooting issues can be challenging due to the distributed and abstracted nature
of serverless functions.
5) Vendor Lock-In:
Relying on specific cloud provider features can lead to difficulty migrating to other
platforms.
7. FUTURE SCOPE
In this chapter, the future scope of the seminar topic Serverless Computing is
described.
Future Scope of Serverless Computing:
1) Enhanced Performance:
Innovations will focus on minimizing delays during function startup to improve
response times and user experience.
2) Improved Stateful Solutions:
Development of better mechanisms for state management will address current
limitations and support more complex applications.
3) Broader Ecosystem Integration:
Increased integration with a wider range of services will enable more comprehensive
and interconnected serverless solutions.
4) Improved Security:
Advances in security protocols and isolation techniques will address vulnerabilities
and improve the overall safety of serverless environments.
5) Greater Customization:
Future developments will provide greater control over resource allocation and
execution environments, catering to specific application needs.
6) Cross-Platform Compatibility:
Enhanced support for multi-cloud and hybrid environments will facilitate seamless
integration and management across various cloud platforms.
REFERENCES
[1] K. Luo, T. Ouyang, Z. Zhou and X. Chen, "Behavior Tree-based Workflow
Modeling and Scheduling for Serverless Edge Computing," 2023 IEEE 43rd
International Conference on Distributed Computing Systems (ICDCS), Hong
Kong, Hong Kong, 2023, pp. 955-956.
[2] C. Cicconetti et al., "A Prototype for QKD-secure Serverless Computing with
ETSI MEC," 2023 IEEE International Conference on Smart Computing
(SMARTCOMP), Nashville, TN, USA, 2023, pp. 189-190.
[3] A. Kumar, R. Gupta and R. Bhandari, "WoS Bibliometric-based Review on
Serverless Computing model," 2022 Seventh International Conference on
Parallel, Distributed and Grid Computing (PDGC), Solan, Himachal Pradesh,
India, 2022.
[4] K. Govindarajan and A. D. Tienne, "Resource Management in Serverless
Computing - Review, Research Challenges, and Prospects," 2023 12th
International Conference on Advanced Computing (ICoAC), India, 2023.
[5] J. Cho and Y. Kim, "A Design of Serverless Computing Service for Edge
Clouds," 2021 International Conference on Information and Communication
Technology Convergence (ICTC), Jeju Island, Korea, Republic of, 2021.
[6] Y. Li, J. Liu, B. Jiang, C. Yang and Q. Wang, "Cost Minimization in Serverless
Computing with Energy Harvesting SECs," 2023 IEEE International Symposium
on Broadband Multimedia Systems and Broadcasting (BMSB), Beijing, China,
2023.
[7] T. P. Bac, M. N. Tran and Y. Kim, "Serverless Computing Approach for
Deploying Machine Learning Applications in Edge Layer," 2022 International
Conference on Information Networking (ICOIN), Jeju-si, Korea, Republic of,
2022.