Tutorial: High Availability for API Gateways in Multi-Cloud and Hybrid Environments


Akash Ananthanarayanan of F5

Technical Marketing Manager

December 20, 2022

Multi-cloud deployments are here to stay. According to F5's State of Application Strategy in 2022 report, 77% of enterprises operate applications across multiple clouds. The adoption of multi-cloud and hybrid architectures unlocks important benefits, like improved efficiency, reduced risk of outages, and avoidance of vendor lock-in. But these complex architectures also present unique challenges.

The software and IT leaders surveyed by F5 named these as their top multi‑cloud challenges:

  • Visibility (45% of respondents)
  • Security (44%)
  • Migrating apps (41%)
  • Optimizing performance (40%)

Managing APIs for microservices in multi‑cloud environments is especially complex. Without a holistic API strategy in place, APIs proliferate across public cloud, on‑premises, and edge environments faster than Platform Ops teams can secure and manage them. We call this problem API sprawl and in an earlier post we explained why it’s such a significant threat.

You need a multi-cloud API strategy so you can implement a thoughtful approach to unifying your microservices, now distributed across multiple clouds, to ensure end-to-end connectivity. Two of the common scenarios for multi-cloud and hybrid deployments are:

  • Different services in multi-cloud/hybrid environments – You need to operate different applications and APIs in different locations, perhaps for cost efficiency or because different services are relevant to different groups of users.
  • Same services in multi-cloud/hybrid environments – You need to ensure high availability for the same applications deployed in different locations.

In the following tutorial we show, step by step, how to use API Connectivity Manager, part of F5 NGINX Management Suite, in the second scenario: deploying the same services in multiple environments for high availability. This eliminates a single point of failure in your multi-cloud or hybrid production environment: if one gateway instance fails, another gateway instance takes over and your customers don't experience an outage, even if one cloud goes down.

API Connectivity Manager is a cloud-native, runtime-agnostic solution for deploying, managing, and securing APIs. From a single pane of glass, you can manage all your API operations for NGINX Plus API gateways and developer portals deployed across public cloud, on-premises, and edge environments. This gives your Platform Ops teams full visibility into API traffic and makes it easy to apply consistent governance and security policies for every environment.

Enabling High Availability for API Gateways in a Multi-Cloud Deployment

As mentioned in the introduction, in this tutorial we're configuring API Connectivity Manager for high availability of services running in multiple deployment environments. Specifically, we're deploying NGINX Plus as an API gateway routing traffic to two services, Service A and Service B, which are running in two public clouds, Google Cloud Platform (GCP) and Amazon Web Services (AWS). (The setup applies equally to any mix of deployment environments, including Microsoft Azure and on-premises data centers.)

Figure 1 depicts the topology used in the tutorial.

[Figure 1: Topology used in the tutorial]

Follow the steps in these sections to complete the tutorial:

  • Install and Configure API Connectivity Manager
  • Deploy NGINX Plus Instances as API Gateways
  • Set Up an Infrastructure Workspace
  • Create an Environment and API Gateway Clusters
  • Apply Global Policies

Install and Configure API Connectivity Manager

  1. Obtain a trial or paid subscription for NGINX Management Suite, which includes Instance Manager and API Connectivity Manager, along with NGINX Plus as an API gateway and NGINX App Protect to secure your APIs. You can start with a free 30-day trial of NGINX Management Suite.
  2. Install NGINX Management Suite. In the Install Management Suite Modules section, follow the instructions for API Connectivity Manager (and optionally other modules).
  3. Add the license for each installed module.
  4. (Optional.) Set up TLS termination and mTLS to secure client connections to NGINX Management Suite and traffic between API Connectivity Manager and NGINX Plus instances on the data plane, respectively.
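If you only need certificates to try out this optional step in a lab, a self-signed pair is enough to experiment with; the file names below are arbitrary and <NMS_FQDN> is a placeholder for your NGINX Management Suite hostname. For production, use certificates from your own CA and follow the certificate guidance in the NGINX Management Suite documentation.

    # Lab-only sketch: generate a self-signed certificate and key for <NMS_FQDN>
    $ openssl req -x509 -nodes -newkey rsa:2048 -days 365 \
          -keyout nms-selfsigned.key -out nms-selfsigned.crt \
          -subj "/CN=<NMS_FQDN>"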

Deploy NGINX Plus Instances as API Gateways

Select the environments that make up your multi-cloud or hybrid infrastructure. For the tutorial we've chosen AWS and GCP and are installing one NGINX Plus instance in each cloud. In each environment, perform these steps on each data-plane host that will act as an API gateway:

  1. Install NGINX Plus on a supported operating system.
  2. Install the NGINX JavaScript module (njs).
  3. Add the following directives in the main (top‑level) context in /etc/nginx/nginx.conf:

    load_module modules/ngx_http_js_module.so;
    load_module modules/ngx_stream_js_module.so;
  4. Restart NGINX Plus, for example by running this command (see the optional validation sketch after this list):

    $ nginx -s reload
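Before restarting, you can optionally confirm that the edited configuration parses and that the njs modules are in place. The module path shown is the default for NGINX Plus packages and may differ on your system.

    # Check configuration syntax, including the new load_module directives
    $ sudo nginx -t

    # Verify the njs dynamic modules are installed (default package location)
    $ ls /etc/nginx/modules/ngx_http_js_module.so /etc/nginx/modules/ngx_stream_js_module.so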

Set Up an Infrastructure Workspace

You can create multiple Infrastructure Workspaces (up to 10 at the time of writing) in API Connectivity Manager. With segregated Workspaces you can apply policies and authentication/authorization requirements that are specific to different lines of business, teams of developers, external partners, clouds, and so on.

Working in the API Connectivity Manager GUI, create a new Workspace:

  1. Click Infrastructure in the left navigation column.
  2. Click the + Create button to create a new workspace, as shown in Figure 2.

    [Figure 2]
  3. In the Create Workspace panel that opens, fill in the Name field (demo in Figure 3). Optionally, fill in the Description field and the fields in the Workspace Contact Information section. The infrastructure admin (your Platform Ops team, for example) can use the contact information to provide updates about status or issues to the users of the Workspace.

    [Figure 3]
  4. Click the Create button.
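If you prefer automation over the GUI, API Connectivity Manager also exposes a REST API that can create the same Workspace. The sketch below assumes the /api/acm/v1 base path and basic authentication with an admin account; endpoint paths, payload fields, and authentication options can vary by release, so treat it as illustrative and check the API Connectivity Manager API reference for your version.

    # Illustrative only: create the "demo" Infrastructure Workspace via the REST API
    $ curl -k -u admin:<password> \
          -H "Content-Type: application/json" \
          -d '{"name": "demo"}' \
          https://<NMS_FQDN>/api/acm/v1/infrastructure/workspaces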

Create an Environment and API Gateway Clusters

In API Connectivity Manager, an Environment is a logical grouping of dedicated resources (such as API gateways or API developer portals). You can create multiple Environments per Workspace (up to 25 at the time of writing); they usually correspond to different stages of app development and deployment such as coding, testing, and production, but they can serve any purpose you want.

Within an Environment, an API Gateway Cluster is a logical grouping of NGINX Plus instances acting as API gateways. A single Environment can have multiple API Gateway Clusters which share the same hostname (for example, api.nginx.com, as in this tutorial). The NGINX Plus instances in an API Gateway Cluster can be located in more than one type of infrastructure, for example in multiple clouds.

There are two ways to configure an Environment in API Connectivity Manager for active‑active high availability of API gateways:

  • With one API Gateway Cluster
  • With multiple API Gateway Clusters

The primary reason to deploy multiple API Gateway Clusters is so that you can apply a different set of security policies to each cluster.

In Deploy NGINX Plus Instances as API Gateways, we deployed two NGINX Plus instances, one in AWS and the other in GCP. The tutorial uses the same instances to illustrate both Environment types (with a single API Gateway Cluster or with multiple API Gateway Clusters); if you want to deploy both Environment types in a single Workspace, you would need to create additional NGINX Plus instances for the second Environment.

Deploy an Environment with One API Gateway Cluster

For an Environment with one API Gateway Cluster, the same security policies apply to all NGINX Plus API gateway instances, as shown in Figure 4.

[Figure 4]
Create an Environment and API Gateway Cluster
  1. Navigate to your Workspace and click the Create Environment button, as shown in Figure 5.

    [Figure 5]
  2. In the Create Environment panel that opens, fill in the Name field (prod in Figure 6) and optionally the Description field, and select the Environment type (here we're choosing Prod).

    [Figure 6]
  3. In the API Gateway Clusters section, fill in the Name and Hostname fields (api-cluster and api.nginx.com in Figure 6).
  4. Click the Create button.

    The Environment Created panel opens to display the command you need to run on each NGINX Plus instance to assign it to the API Gateway Cluster. For convenience, the command is also shown in Step 3 of Assign API Gateway Instances to an API Gateway Cluster below.

Assign API Gateway Instances to an API Gateway Cluster

Repeat on each NGINX Plus instance:

  1. Use ssh to connect and log in to the instance.
  2. If NGINX Agent is already running, stop it:

    $ systemctl stop nginx-agent
  3. Run the command of your choice (either curl or wget) to download and install the NGINX Agent package:

    • If you didn’t enable mTLS in Install and Configure API Connectivity Manager, add:

      • The ‑k flag to the curl command
      • The --no-check-certificate flag to the wget command
    • For <NMS_FQDN>, substitute the IP address or fully qualified domain name of your NGINX Management Suite server.
    • For <cluster_name>, substitute the name of the API Gateway Cluster (api-cluster in this tutorial).
    $ curl [-k] https://<NMS_FQDN>/install/nginx-agent > install.sh && sudo sh install.sh -g <cluster_name> && sudo systemctl start nginx-agent

    or

    $ wget [--no-check-certificate] https://<NMS_FQDN>/install/nginx-agent -O install.sh && sudo sh install.sh -g <cluster_name> && sudo systemctl start nginx-agent

    The NGINX Plus instances now appear in the Instances section of the Cluster window for api-cluster, as shown in Figure 7. (If an instance doesn't appear, see the troubleshooting sketch after this list.)

    [Figure 7]
  4. Proceed to Apply Global Policies.
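If an instance does not show up in api-cluster, check NGINX Agent on the data-plane host. The log location shown is the NGINX Agent default and may differ in your installation.

    # Confirm NGINX Agent is running on the data-plane host
    $ sudo systemctl status nginx-agent

    # Look for errors connecting to the NGINX Management Suite host (default log path)
    $ sudo tail -n 50 /var/log/nginx-agent/agent.log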

Deploy an Environment with Multiple API Gateway Clusters

For an Environment with multiple API Gateway Clusters, different security policies can apply to different NGINX Plus API gateway instances, as shown in Figure 8.

[Figure 8]
Create an Environment and API Gateway Cluster
  1. Navigate to your Workspace and click the Create Environment button, as shown in Figure 9.

    [Figure 9]
  2. In the Create Environment panel that opens, fill in the Name field (prod in Figure 10) and optionally the Description field, and select the Environment type (here we're choosing Prod).

    [Figure 10]
  3. In the API Gateway Clusters section, fill in the Name and Hostname fields (in Figure 10, they are aws-cluster and api.nginx.com).
  4. Click the Create button.

    The Environment Created panel opens to display the command you need to run on each NGINX Plus instance to assign it to the API Gateway Cluster. For convenience, the command is also shown in Step 3 of Assign API Gateway Instances to an API Gateway Cluster below.

  5. Navigate back to the Environment tab and click the + Add button in the upper right corner of the API Gateway Clusters section, as shown in Figure 11.

    [Figure 11]
  6. On the Create API Gateway Cluster panel, fill in the Name field with the second cluster name (gcp-cluster in Figure 12) and the Hostname field with the same hostname as for the first cluster (api.nginx.com).

    [Figure 12]

The two API Gateway Clusters now appear in the API Gateway Clusters section for the Prod Environment, as shown in Figure 13.

[Figure 13]
Assign API Gateway Instances to an API Gateway Cluster

Repeat on each NGINX Plus instance:

  1. Use ssh to connect and log in to the instance.
  2. If NGINX Agent is already running, stop it:

    $ systemctl stop nginx-agent
  3. Run the command of your choice (either curl or wget) to download and install the NGINX Agent package:

    • If you didn’t enable mTLS in Install and Configure API Connectivity Manager, add:

      • The ‑k flag to the curl command
      • The --no-check-certificate flag to the wget command
    • For <NMS_FQDN>, substitute the IP address or fully qualified domain name of your NGINX Management Suite server.
    • For <cluster_name>, substitute the name of the appropriate API Gateway Cluster (in this tutorial, aws‑cluster for the instance deployed in AWS and gcp‑cluster for the instance deployed in GCP).
    $ curl [-k] https://<NMS_FQDN>/install/nginx-agent > install.sh && sudo sh install.sh -g <cluster_name> && sudo systemctl start nginx-agent

    or

    $ wget [--no-check-certificate] https://<NMS_FQDN>/install/nginx-agent -O install.sh && sudo sh install.sh -g <cluster_name> && sudo systemctl start nginx-agent

    The appropriate NGINX Plus instance now appears in the Instances section of the Cluster windows for aws-cluster (Figure 14) and gcp-cluster (Figure 15).

    [Figure 14]
    [Figure 15]

Apply Global Policies

Now you can add global policies that apply to all the NGINX Plus instances in an API Gateway Cluster. For example, to secure client access to your APIs you can apply the OpenID Connect Relying Party or TLS Inbound policy. To secure the connection between an API gateway and the backend service that exposes the API, apply the TLS Backend policy. For more information about TLS policies, see the API Connectivity Manager documentation. After you apply a policy, you can spot-check a gateway from the command line, as sketched after the steps below.

  1. Navigate to the Cluster tab for the API Gateway Cluster where you want to apply a policy (api-cluster in Figure 16). Click the Manage button above the right corner of the Policies table.

    [Figure 16]
  2. Click Global Policies in the left navigation column, and then click the icon in the rightmost column of the row for the policy (TLS Backend in Figure 17). Select + Add Policy from the drop-down menu.

    [Figure 17]
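After a policy is saved, API Connectivity Manager pushes the updated configuration to every instance in the cluster. One way to spot-check a specific gateway from a client machine is to pin the shared hostname to that instance's address with curl's --resolve option. The port and the /api/ path below are assumptions for illustration; use the listener port and an API proxy path that actually exist in your Environment.

    # Illustrative spot check: send a request for api.nginx.com to one specific gateway instance
    # (repeat with the address of each instance to confirm both clouds answer)
    $ curl -k --resolve api.nginx.com:443:<gateway_instance_ip> https://api.nginx.com/api/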

Conclusion

Managing multi-cloud and hybrid architectures is no easy task. They are complex environments with fast-changing applications that are often difficult to observe and secure.

With the right tools, however, you can avoid vendor lock-in while retaining the agility and flexibility you need for delivering new capabilities to market faster. As a cloud-native tool, API Connectivity Manager from NGINX gives you the scalability, visibility, governance, and security you need to manage APIs in multi-cloud and hybrid environments.

Start a 30-day free trial of NGINX Management Suite, which includes access to API Connectivity Manager, NGINX Plus as an API gateway, and NGINX App Protect to secure your APIs.


