As applications, architectures, and business processes become more complex, understanding how all those moving parts are performing is absolutely critical. This blog post series is about Application Modernization and cloud migration, and how observability can help organizations keep their products running optimally for their customers. In part one, we define Application Modernization and explain how it has become the lever for user experience.
Application Modernization is happening. Fast. It’s being driven by new ways to improve application functionality, performance, and user experience.
Microservices and containers have vastly improved on the application performance and cost effectiveness that VMs achieved less than a decade ago. They also represent a step forward from Service Oriented Architectures (SOA), which drove the previous Application Modernization generation.
So what is Application Modernization?
Application Modernization is defined as 1) optimizing cloud applications using microservices and container capabilities and 2) moving applications to the cloud.
Why is Application Modernization so important? Let’s summarize the four key business promises driving Application Modernization:
- Business agility
- Responsiveness
- Scalability
- Cost effectiveness
Enterprises view business agility as being responsive to user needs and bringing new application capabilities online faster. Some organizations have gone from one or two releases a day to hundreds or even thousands. That velocity requires the agility of an Olympic decathlete!
New technologies and competitive pressures have driven the need for businesses to adapt quickly. In fact, this need led to the origin of Agile software methodology as the older models like Waterfall took way too long. Agility is critical both for tactical actions like feature releases and for high-level strategic initiatives. As business conditions and competitive threats change, organizations must adapt quickly.
Business agility and responsiveness are intertwined; responsiveness is the key impetus for business agility. It means immediately responding to user concerns by rapidly diagnosing and remediating complex issues. It is also essential for meeting SLIs (service level indicators), SLOs (service level objectives), and SLAs (service level agreements), the key performance measurements for modern services.
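To make the relationship between these measurements concrete, here is a minimal sketch in Python. The request counts and the 99.9% target are hypothetical, illustrative numbers: the SLI is the measured indicator, and the SLO is the objective it is checked against.

```python
# Illustrative sketch: compute an availability SLI from request counts
# and check it against an SLO target. All numbers are hypothetical.

def availability_sli(successful_requests: int, total_requests: int) -> float:
    """SLI: the fraction of requests served successfully."""
    if total_requests == 0:
        return 1.0  # no traffic, nothing failed
    return successful_requests / total_requests

def meets_slo(sli: float, slo_target: float = 0.999) -> bool:
    """SLO: the objective the SLI must meet (here, 99.9% availability)."""
    return sli >= slo_target

sli = availability_sli(successful_requests=99_950, total_requests=100_000)
print(f"SLI = {sli:.4f}, meets 99.9% SLO: {meets_slo(sli)}")
# prints: SLI = 0.9995, meets 99.9% SLO: True
```

An SLA would typically wrap an SLO like this one in a contractual commitment with consequences for missing it.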
Scalability launched the cloud industry. Scaling up and down instantly as application demand peaks and ebbs is indispensable for system availability, user experience, and cost effectiveness (if used strategically).
Scalability enables the effective use of cloud resource credits as requirements change. Reducing or eliminating application downtime with on-demand resources, especially for e-commerce sites, is important for mitigating lost revenue from incomplete or abandoned transactions. The long-term reputation loss and customer attrition can be even more damaging.
Current Application Modernization Technologies
- Cloud Computing (IaaS, PaaS, SaaS)
- Fast Network Technologies
- Reverse Proxy Servers
- Containers and Microservices
- Kubernetes Orchestration
Cloud computing is essentially a network of remote servers hosted on the internet to store, manage, and process data, as opposed to on-premise servers.
The global cloud computing market size was valued at USD 274.79 billion in 2020 and is expected to grow at a compound annual growth rate (CAGR) of 19.1% from 2021 to 2028, according to a 2021 Grand View Research report.
Cloud infrastructure is now the predominant computing style and is offered in many configurations and service options to meet a range of needs. Each as-a-Service platform provides its own unique set of benefits depending on the use case.
- IaaS (Infrastructure as a Service) is where enterprises rent or lease servers for cloud computing and storage. In other words, with IaaS, you bring the software.
- PaaS (Platform as a Service) is where a provider delivers hardware and software tools to users over the internet. The service provider packages both the infrastructure hardware and software (up to everything except the application).
- SaaS (Software as a Service) is centrally hosted software licensed on a subscription basis to provide complete functionality.
All of the XaaS services can be operated in public, private, and hybrid cloud configurations.
Fast network technologies
The underlying internet network technologies have remained primarily the same during the current Application Modernization epoch, with HTTPS, IPv4, and TLS most prominent. However, the speed and latency of networks have improved considerably, and 10Gbps wired and 5G wireless support highly distributed microservice applications.
Software-defined networks (SDN) have also simplified network bandwidth allocation, both within the cloud and on-premise, making on-demand network performance for scaling applications easier to achieve. In addition, reverse proxy servers and load balancers play an important role in ensuring that microservice interconnections perform at optimal levels.
Reverse Proxy Servers
Reverse proxy servers forward client requests to a server that can fulfill them and return the server’s response to the client. A load balancer distributes incoming client requests among a group of servers, returning the response from the selected server to the appropriate client.
Deploying a load balancer usually makes the most sense with multiple servers. A reverse proxy is valuable even with a single web or application server: it sits at the network's edge and accepts content requests from browsers and mobile apps.
Reverse proxies provide increased security because no server information is visible outside the internal network, making malicious access through vulnerabilities far harder. They also enable increased scalability and flexibility: clients only access the reverse proxy’s IP address, allowing the backend server configuration to change dynamically. This complements microservices implementations by enabling the number of servers to be scaled up and down to match fluctuations in traffic volume.
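The load-distribution behavior described above can be sketched with the simplest scheduling policy, round-robin: each incoming request goes to the next backend in rotation. The backend addresses below are hypothetical placeholders, and real load balancers add health checks, weighting, and session affinity.

```python
from itertools import cycle

# Minimal sketch of round-robin load balancing: each request is assigned
# to the next backend server in rotation. Backend addresses are
# hypothetical placeholders.
class RoundRobinBalancer:
    def __init__(self, backends):
        self._backends = cycle(backends)

    def pick(self) -> str:
        """Return the backend that should handle the next request."""
        return next(self._backends)

lb = RoundRobinBalancer(["app-1:8080", "app-2:8080", "app-3:8080"])
print([lb.pick() for _ in range(4)])
# prints: ['app-1:8080', 'app-2:8080', 'app-3:8080', 'app-1:8080']
```

Because clients only ever see the proxy's address, the backend list here could grow or shrink at runtime without any client-visible change.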
Container technology is the powerful successor to VM technology and is a key driver for microservice application architectures. Containers in many ways have reduced the need for VMs – not because VMs didn’t perform well, but because independent host operating systems within a VM often weren’t needed.
Containers provide the same ‘packaging’ concept as VMs, but only provide a thin Container Engine between the host OS and the containers, instead of a larger Hypervisor that can manage numerous Guest OS implementations. Both containers and VMs can coexist in an application environment, enabling choice by functional need.
Containers have led to microservices, which package application components, typically as a service. This differs from monolithic, all-in-one applications and more distributed SOA architectures. With SOA, most of the services are concentrated with some distributed connections. With microservices, services are highly distributed into a service mesh.
Focus on Container Technologies: Kubernetes Clusters, Nodes, and Pods
Container orchestration is the automated management of the container and service lifecycle within the service mesh: it handles the scheduling, deployment, scaling, load balancing, availability, and networking of containers.
Kubernetes (K8s) has rapidly become the dominant platform for automating deployment, scaling, and orchestrating containerized applications. K8s is open-source software and there are many enterprise-ready K8s platforms, such as IBM Red Hat OpenShift, that make implementing and managing containerized applications much easier.
K8s creates a cluster of worker nodes that run containerized applications. Worker node(s) host Pods, which are the application workload components. The control plane manages the worker nodes and the pods in the cluster. The overall value of this is that K8s provides a highly available and modularized platform for creating modern applications.
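A Pod, the workload unit described above, is declared to the control plane as a manifest. The sketch below is a minimal, hypothetical example; the name and image are placeholders, and production workloads are usually created through a Deployment, which manages replica Pods across worker nodes.

```yaml
# Minimal illustrative Pod manifest. Name and image are placeholders.
apiVersion: v1
kind: Pod
metadata:
  name: example-app
  labels:
    app: example-app
spec:
  containers:
    - name: web
      image: example-registry/example-app:1.0   # placeholder image
      ports:
        - containerPort: 8080
```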
Microservices are usually a collection of small, autonomous services using containers and pods. They are usually self-contained, implement a single capability, and communicate over a network using technology-agnostic protocols such as HTTP. Applications are comprised of a few or numerous microservices within a service mesh.
Microservices also lend themselves more readily to a continuous delivery software development process – more so than VM-based services. A change to a small part of the application requires rebuilding and redeploying only one or a few services. Microservices also provide fine-grained interfaces, such as endpoints, which simplify network interactions between services.
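A single-capability service with one HTTP endpoint can be sketched with nothing but the Python standard library. The `/health` path is a hypothetical example; a real service would add routing, structured logging, and graceful shutdown.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Minimal sketch of a single-capability microservice: one endpoint
# (the hypothetical /health path) returning JSON over HTTP, a
# technology-agnostic protocol any other service can consume.
class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, fmt, *args):
        pass  # silence per-request logging in this sketch

# To serve requests:
#   HTTPServer(("127.0.0.1", 8080), HealthHandler).serve_forever()
```

In a service mesh, many small services like this one would sit behind sidecar proxies that handle discovery, retries, and mutual TLS on their behalf.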
Focus on Microservices Security
For microservices security you should consider each element – container, pod, node and cluster – to determine what level of security is needed for each. Not all microservice elements have the same security needs, although all have at least access control and code integrity security at their foundation.
To accommodate these needs, popular techniques such as RBAC, sidecars and service mesh methodologies can provide a high degree of security when used properly. Other techniques, including an automated deployment pipeline and scheduler, let you simplify host and application patch management with rolling upgrades to keep code levels up to date to avoid vulnerabilities.
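As one concrete illustration of RBAC in Kubernetes, the sketch below grants read-only access to Pods in a single namespace and binds it to a service account. All names here are hypothetical placeholders.

```yaml
# Illustrative RBAC sketch: a Role granting read-only Pod access in one
# namespace, bound to a hypothetical service account.
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  namespace: example-ns
  name: pod-reader
rules:
  - apiGroups: [""]
    resources: ["pods"]
    verbs: ["get", "list", "watch"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  namespace: example-ns
  name: read-pods
subjects:
  - kind: ServiceAccount
    name: example-sa
    namespace: example-ns
roleRef:
  kind: Role
  name: pod-reader
  apiGroup: rbac.authorization.k8s.io
```

Scoping each workload's permissions this narrowly is what lets the differing security needs of containers, pods, nodes, and clusters be addressed element by element.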
Logging and monitoring techniques for performance and reliability can also be used to identify security-specific events. Enterprise Observability provides advanced capabilities for defining smart alerts and monitoring logs to identify potential security issues.
Microservices and containers have delivered scalability and rapid update and refresh. However, they have also multiplied the number of individual code entities that comprise an application. The challenge is managing and coordinating the integration and deployment of all those services.
Kubernetes and other platforms have filled that need by providing a large suite of capabilities that enable DevOps teams to orchestrate service instantiation, deprecation, and other application functions.
In my next post, I’ll discuss Application Modernization Options and the CI/CD Pipeline.