
Knative-ly Serverless: Unlock the power of Serverless with Knative & Kubernetes

June 14, 2023 | Cloud Computing


Serverless computing has emerged as a popular approach for building and deploying applications in the cloud. By abstracting away the underlying infrastructure and focusing on the code, Serverless computing enables developers to focus on delivering value to their users, while also reducing operational overhead and cost. Knative is an open-source platform that provides a set of building blocks for building and running Serverless applications on Kubernetes. With Knative, developers can easily deploy and scale Serverless functions, containers, and applications, without having to worry about the underlying infrastructure. In this white paper, we will explore the benefits of Serverless computing with Knative, and how it can help organizations improve their agility, reduce costs, and increase developer productivity. In addition, we will highlight some real-world examples of companies using Knative for Serverless computing, and the results and benefits they have achieved.

What is serverless computing?

 

Serverless computing is a model of computing where the infrastructure and resources required to run an application are managed and allocated dynamically by a third-party service provider, without the need for the developer to manage or provision servers. The developer only needs to write and upload the application code to the Serverless platform, which then handles the execution of the code and scales the resources up or down as needed based on the application workload. Serverless computing allows businesses to concentrate on their core services instead of worrying about the underlying IT infrastructure, such as server operating systems and networks.

 

 

The significance of Serverless computing lies in its ability to reduce the operational overhead and cost of managing infrastructure while improving the agility and scalability of applications. By eliminating the need to provision and manage servers, developers can focus on delivering value to their users, rather than managing infrastructure. Serverless computing also enables organizations to scale their applications automatically, without having to worry about capacity planning or performance tuning.

In the mid-2000s, there were a few developments that laid the foundation for what would eventually become serverless computing. One key development was the emergence of cloud computing, which made it possible to provision computing resources on-demand and at scale, without the need for on-premises infrastructure. Another important development was the rise of event-driven computing, which allowed developers to trigger code execution in response to events such as changes to a database, a new file appearing in a storage bucket, or a new message being sent to a queue. This event-driven approach enabled developers to build highly responsive and scalable applications that could scale automatically based on demand.

Figure: Sample event-driven application
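The event-driven model described above can be sketched in a few lines of Python. This is an illustrative, in-process toy, not a real platform: actual serverless providers wire these triggers to managed services such as storage buckets and message queues, and the event type and handler names here are invented for the example.

```python
# Minimal in-process sketch of event-driven execution: handlers register
# for an event type, and emitting an event runs every registered handler.
from collections import defaultdict

handlers = defaultdict(list)

def on(event_type):
    """Register a function to run whenever an event of this type fires."""
    def register(fn):
        handlers[event_type].append(fn)
        return fn
    return register

def emit(event_type, payload):
    """Deliver an event to every handler registered for its type."""
    return [fn(payload) for fn in handlers[event_type]]

@on("object.created")
def make_thumbnail(event):
    # In a real platform this would run on demand, scaled by the provider.
    return f"thumbnail generated for {event['key']}"

print(emit("object.created", {"key": "photos/cat.jpg"}))
```

The developer writes only `make_thumbnail`; everything around it (routing, scaling, retries) is what the serverless platform takes off their hands.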

Businesses initially had to manage their own hardware to host web applications, but cloud computing models such as IaaS, PaaS, & SaaS enabled them to buy IT resources and servers from third-party providers, reducing the burden of managing application traffic. However, this resulted in companies paying for unused resources. To address these inefficiencies, Serverless Computing was introduced, which involves assembling code into functions and scaling individual components, similar to Microservices but with further breakdown of monolithic systems.

 

The benefits of Serverless computing include:

Simplified Application Development: With Serverless, developers can focus on writing code, without having to worry about the underlying infrastructure. Serverless computing provides a higher-level abstraction for developers, enabling them to build and deploy applications quickly and easily, while also taking advantage of the scalability and flexibility of the platform.

Reduced Time to Market: Serverless computing enables developers to deploy applications quickly and easily, without having to worry about the underlying infrastructure. This can lead to faster time-to-market and increased innovation.

Event-Driven Architecture: With Serverless computing, developers can create event-driven architectures that can process and respond to events in real time.

Cost savings: Serverless computing eliminates the need to pay for idle resources, allowing organizations to pay only for the computing resources they actually use. This can lead to significant cost savings, particularly for applications with unpredictable or variable workloads.

Improved scalability: Serverless computing enables applications to scale automatically in response to changes in workload, without requiring manual intervention or capacity planning.

Some common use cases for Serverless computing include:

Web and mobile applications: Serverless computing is well suited for web and mobile applications that have unpredictable or variable workloads, as it enables organizations to scale their applications automatically without having to worry about capacity planning.

Data processing and analytics: Serverless computing can be used for data processing and analytics tasks, such as ETL (extract, transform, and load) processes, batch processing, and real-time stream processing.

Internet of Things (IoT): Serverless computing can be used for IoT applications, such as processing sensor data or triggering actions based on events.

 

What is Knative?

Knative is a Serverless application layer designed specifically for developers, and it works well alongside the existing Kubernetes application constructs. It is a CNCF (Cloud Native Computing Foundation) incubating project, which means it is supported by a community of experts and organizations committed to advancing cloud-native technologies.

 

Knative provides an essential set of components for building and running Serverless applications on Kubernetes. Created with contributions from over 50 different companies, Knative includes features like scale-to-zero, auto scaling, in-cluster builds, and an eventing framework for cloud-native applications. Knative can be used on-premises, in the cloud, or in a third-party data center. Most importantly, Knative enables developers to focus on writing code, without needing to worry about the challenging and mundane aspects of deploying and managing their applications.

Knative is made up of two components: “Knative Serving,” which is a container runtime that can automatically scale based on HTTP requests, and “Knative Eventing,” which is a routing layer for Events asynchronously sent over HTTP.

Knative Serving

Knative Serving is responsible for handling incoming requests and auto-scaling applications based on traffic. When a request is received, Knative Serving routes the request to the appropriate service, which then processes the request and returns a response. Knative Serving uses a routing mechanism to direct requests to the appropriate service based on the request URL or other criteria.

How does Knative Serving work?

 

1. The developer deploys a Serverless workload on Knative Serving by creating a Kubernetes Custom Resource Definition (CRD) object.

2. Knative Serving automatically creates and manages a set of resources, including Services, Routes, Configurations, and Revisions.

3. The Service resource manages the entire lifecycle of the workload and creates other objects such as Routes and Configurations.

4. Routes define the network endpoint that will be used to access the workload, and Configurations define the desired state of the workload deployment.

5. When a Configuration is modified, it generates a new Revision, which represents a snapshot of the code and configuration at that specific time.

6. Revisions are immutable objects that can be scaled up or down automatically based on incoming traffic. By default, Knative Serving uses the Knative Pod Autoscaler (KPA), a request-based alternative to the Kubernetes Horizontal Pod Autoscaler (HPA), to manage the scaling of Revisions based on the configured scaling rules and observed request metrics such as throughput and concurrency.

7. KPA provides a more responsive and dynamic scaling experience by automatically scaling to zero when there is no incoming traffic, and quickly scaling up when there is a sudden increase in traffic.

8. The Serverless Workload is accessed by sending requests to the network endpoint defined by the Route resource.
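Putting the steps above together, a minimal Knative Service manifest looks like the sketch below. The container image is the official Knative "hello world" sample; the annotation values are illustrative, and the `autoscaling.knative.dev` annotations are how KPA scaling bounds are configured per Revision.

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello
spec:
  template:
    metadata:
      annotations:
        # KPA bounds (illustrative values): allow scale-to-zero,
        # cap at 10 replicas, target 50 concurrent requests each.
        autoscaling.knative.dev/min-scale: "0"
        autoscaling.knative.dev/max-scale: "10"
        autoscaling.knative.dev/target: "50"
    spec:
      containers:
        - image: gcr.io/knative-samples/helloworld-go
          env:
            - name: TARGET
              value: "World"
```

Applying this single resource with `kubectl apply` is enough: Knative creates the Route, Configuration, and first Revision automatically, as described in the steps above.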

 

Knative Eventing

Knative Eventing is a set of APIs that enable an event-driven architecture for applications. With these APIs, developers can build components that route events from event producers to event consumers, known as sinks. Sinks can receive events or respond to HTTP requests by sending a response event. Communication between event producers and sinks in Knative Eventing is based on standard HTTP POST requests, and the events themselves follow the CloudEvents specification, which allows events to be created, parsed, sent, and received in any programming language. Knative Eventing components are loosely coupled: they can be developed and deployed independently of each other. This flexibility allows event producers to generate events before active event consumers are listening for them, and event consumers to express interest in a particular class of events before any producers are creating those events.
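Because delivery is a plain HTTP POST with CloudEvents attributes carried in `ce-*` headers (the binding's binary content mode), the shape of what a sink receives can be sketched without any SDK. The parser below is a minimal illustration, not production code; the example event values are invented.

```python
# Sketch: extract a binary-mode CloudEvent from HTTP headers, roughly as a
# Knative Eventing sink would receive it (per the CloudEvents HTTP binding).
REQUIRED = ("ce-id", "ce-source", "ce-type", "ce-specversion")

def parse_cloudevent(headers, body):
    """Return (attributes, data), or raise if required attributes are missing."""
    lower = {k.lower(): v for k, v in headers.items()}
    missing = [h for h in REQUIRED if h not in lower]
    if missing:
        raise ValueError(f"not a CloudEvent, missing: {missing}")
    # CloudEvents context attributes arrive as headers prefixed with "ce-".
    attrs = {k[3:]: v for k, v in lower.items() if k.startswith("ce-")}
    return attrs, body

attrs, data = parse_cloudevent(
    {"Ce-Id": "42", "Ce-Source": "/orders", "Ce-Type": "order.created",
     "Ce-Specversion": "1.0", "Content-Type": "application/json"},
    b'{"order": 7}',
)
print(attrs["type"])  # prints "order.created"
```

Any language that can speak HTTP can therefore act as an event producer or sink, which is exactly what makes Knative Eventing components easy to develop independently.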

Benefits of serverless computing with Knative

Knative provides all the benefits of Serverless computing and offers additional benefits, including:

Cloud agnostic: Knative is a Kubernetes-based platform that is designed to be cloud agnostic. This means it can run on any infrastructure that supports Kubernetes, such as Google Cloud Platform (GCP), Amazon Web Services (AWS), Microsoft Azure, or an on-premises Kubernetes cluster, among others.

Progressive Rollouts: Knative provides Progressive Rollouts as a key feature that allows developers to deploy new versions of their applications to a subset of users or traffic, and gradually increase the rollout over time. This can help reduce the risk of introducing new bugs or issues in the application, and ensure a smooth transition for end-users. Knative provides several features to support Progressive Rollouts, including traffic splitting, automatic rollbacks, and version tracking.
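In Knative Serving, a progressive rollout is expressed directly in the Service's `traffic` block. The sketch below (revision names and image are illustrative) sends 90% of requests to the current revision and 10% to a canary:

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello
spec:
  template:
    metadata:
      name: hello-v2           # names the new Revision explicitly
    spec:
      containers:
        - image: example.registry/hello:v2   # illustrative image
  traffic:
    - revisionName: hello-v1
      percent: 90
    - revisionName: hello-v2
      percent: 10
      tag: canary              # also exposes a dedicated URL for testing
```

Increasing the rollout is then just editing the percentages and re-applying, and rolling back is returning the split to 100% on the previous revision.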

Knative is pluggable: Knative is designed as a pluggable, Kubernetes-native platform that can be easily integrated and extended to meet the needs of different applications and use cases. Its seamless integration with Kubernetes, together with its flexible building blocks and extension points, allows developers to tailor Knative to the unique requirements of their applications and environments.

Knative developers focus on code: Knative provides a simplified developer experience that abstracts away many of the complexities of deploying and managing cloud-native applications. Knative will automatically take care of deploying and managing the application, including scaling, autoscaling, and routing traffic to the appropriate version. In addition to this, Knative provides a set of tools and APIs that allow developers to manage and monitor their applications in a simple and intuitive way. Knative's focus on simplifying the developer experience can help reduce the learning curve for building and deploying cloud-native applications, and allow developers to focus on what they do best: writing code.

Case studies

Several companies have already started using Knative to build and deploy Serverless applications.

Here are a few examples:

Puppet: Puppet is a company that specializes in automating infrastructure for businesses. It was established in 2009 with the aim of solving complex operational challenges. In 2019, the company identified a problem where cloud operations teams were struggling to manage modern cloud-native applications due to reliance on manual workflows. To address this issue, Puppet developed a platform to connect events triggered by modern architectures. This platform ensures that cloud environments remain secure, compliant, and cost-effective.

PNC Bank: PNC, one of the largest banks in the United States with assets under administration of $367 billion, has a substantial IT presence and a development team that must deliver innovative code while meeting regulatory compliance obligations. To address the challenge of ensuring new code meets security standards and audit compliance requirements, PNC sought to develop an automated solution to replace the 30-day manual process that was in place. PNC leveraged the capabilities of Knative, a cloud-native framework for serverless computing and event-driven architecture, to build internal tools that automatically verify new and updated code. With Knative's eventing and serverless capabilities, PNC was able to connect Apache Kafka and CI/CD toolchain events, achieving an automated state. Additionally, PNC utilized TriggerMesh's declarative API to address the specific needs of their event-driven workflow. This process enables PNC to prevent code from going into production if any of the outlined requirements are missing, ensuring compliance with company-wide standards.

 

Outfit7: Outfit7, a mobile gaming company founded in 2009, has achieved tremendous growth, with over 17 billion downloads and 85 billion video views in the previous year alone. Outfit7 has consistently ranked among the top five game publishers on iOS and Google Play worldwide by the number of game downloads for six consecutive years (2015-2020). However, the company faced significant challenges due to the massive scale of its operations, which includes up to 470 million monthly active users, 20 thousand server requests per second, and terabytes of data generated daily. To address these challenges, they turned to Knative and Kubernetes to optimize real-time bidding ad sales in a way that can automatically scale up and down as needed. The resulting system is easy to maintain, freeing up software engineers to work on more critical tasks such as optimizing backend costs and adding new game features.

These case studies demonstrate the power and flexibility of Knative for building and deploying Serverless applications on Kubernetes. By leveraging Knative, companies can reduce costs, improve scalability, and increase the speed and efficiency of their application development processes.

 

Conclusion

Serverless computing has emerged as a popular model for building and deploying applications in the cloud. With Knative, developers can leverage the power of Kubernetes to build and deploy Serverless applications easily and efficiently. By using Knative, developers can focus on writing code and building applications, without worrying about infrastructure management.

Knative offers several benefits for building and deploying Serverless applications, including scalability, cost-effectiveness, and ease of use. The platform also provides several powerful features, such as automatic scaling, eventing, and build automation, that make it easier for developers to build and deploy Serverless applications. In conclusion, Knative is an excellent platform for building and deploying Serverless applications in the cloud. With its powerful features and ease of use, Knative can help developers build and deploy applications more quickly and efficiently, while also reducing costs and improving scalability. As more companies adopt Serverless computing, Knative is sure to play a key role in the future of cloud computing.

Key Takeaways

• Serverless computing is a popular model for building and deploying applications in the cloud.

• Knative is a powerful platform for building and deploying Serverless applications on Kubernetes.

• Knative offers several benefits, including scalability, cost-effectiveness, and ease of use.

• Knative provides several powerful features, such as automatic scaling, eventing, and build automation, that make it easier for developers to build and deploy Serverless applications.

• Several companies have adopted Knative across various industries, including PNC, Puppet, Outfit7, and DeepC.

• By using Knative, developers can focus on writing code and building applications, without worrying about infrastructure management.

• Knative is sure to play a key role in the future of cloud computing as more companies adopt Serverless computing.

 

Author

Dhanesh U

Senior Architect Cloud & DevOps




NeST Digital, the software arm of the NeST Group, has been transforming businesses, providing customized and innovative software solutions and services for customers across the globe. A leader in providing end-to-end solutions under one roof, covering contract manufacturing and product engineering services, NeST has 25 years of proven experience in delivering industry-specific engineering and technology solutions for customers, ranging from SMBs to Fortune 500 enterprises, focusing on Transportation, Aerospace, Defense, Healthcare, Power, Industrial, GIS, and BFSI domains.


