MSBrett/azure-frontdoor-apim-ai

This sample demonstrates how to load balance requests between multiple Azure OpenAI Services using Azure API Management.

 
 

---
page_type: sample
languages:
- azurecli
- bicep
- powershell
- yaml
- json
products:
- azure
- azure-openai
- azure-api-management
- azure-resource-manager
- azure-key-vault
- azure-front-door
name: Azure OpenAI Service with Azure API Management and Azure Front Door
description: This sample demonstrates how to access Azure OpenAI Services using Azure API Management and Azure Front Door.
---

Azure OpenAI Service with Azure API Management and Azure Front Door

Summary

This sample demonstrates how to access Azure OpenAI Services using Azure API Management and Azure Front Door.

Components

  • Azure OpenAI Service, a managed service for OpenAI GPT models that exposes a REST API.
  • Azure API Management, a managed service that provides a gateway to the backend Azure OpenAI Service instances.
  • Azure Front Door, a modern cloud Content Delivery Network (CDN) that gives your users fast, reliable, and secure access to your applications’ static and dynamic web content across the globe.
  • Azure Key Vault, a managed service that stores the API keys for the Azure OpenAI Service instances as secrets used by Azure API Management.
  • Azure Managed Identity, a user-assigned managed identity that Azure API Management uses to access Azure Key Vault.
  • Azure Bicep, used to create a repeatable infrastructure deployment for the Azure resources.

Getting Started

To deploy the infrastructure and test load balancing using Azure API Management, complete the following steps:

Prerequisites

Run the sample notebook

The Sample.ipynb notebook contains all the necessary steps to deploy the infrastructure using Azure Bicep, and make requests to the deployed Azure API Management API to test load balancing between two Azure OpenAI Service instances.

Note: The sample uses the Azure CLI to deploy the infrastructure from the main.bicep file, and PowerShell commands to test the deployed APIs.
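The deployment step the note describes can be sketched with the Azure CLI. The subscription ID, resource group name, location, and template path below are illustrative placeholders, not values taken from this sample:

```shell
# Sign in and pick the subscription to deploy into.
az login
az account set --subscription "<subscription-id>"

# Create a resource group and deploy the Bicep template into it.
# The group name, location, and template path are hypothetical.
az group create --name rg-aoai-apim-sample --location eastus
az deployment group create \
  --resource-group rg-aoai-apim-sample \
  --template-file main.bicep
```

If the template instead deploys at subscription scope, the equivalent command would be `az deployment sub create --location <location> --template-file main.bicep`; the notebook itself contains the exact invocation.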

The notebook is split into multiple parts including:

  1. Log in to Azure and set the default subscription.
  2. Deploy the Azure resources using Azure Bicep.
  3. Test load balancing using Azure API Management.
  4. Clean up the Azure resources.

Each step is documented in the notebook with additional information and links to relevant documentation.
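The notebook drives the load-balancing test with PowerShell; an equivalent curl sketch looks like the following, where the gateway URL, deployment name, API version, and key header are assumptions rather than values from this repository:

```shell
# All values are placeholders; substitute your API Management gateway details.
GATEWAY="https://<your-apim>.azure-api.net/openai"
KEY="<your-subscription-key>"

# Send several identical requests; with load balancing configured,
# API Management distributes them across the backend OpenAI instances.
for i in 1 2 3 4 5; do
  curl -s -X POST \
    "$GATEWAY/deployments/<deployment-name>/chat/completions?api-version=2023-05-15" \
    -H "api-key: $KEY" \
    -H "Content-Type: application/json" \
    -d '{"messages":[{"role":"user","content":"Hello"}]}'
done
```

Cleanup (step 4) typically amounts to deleting the resource group created during deployment, for example `az group delete --name <resource-group> --yes`.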
