Private OpenAI Service in Azure with Web UI Front End!

Joey Brakefield
5 min read · Jul 19, 2023


Do you have security requirements that mandate all traffic, including OpenAI traffic, must remain within your private network? Do users need to authenticate to use your OpenAI service?

If you are looking to set up a new Azure OpenAI instance that is completely isolated on your own network and requires authentication, you can do that in a few steps by leveraging existing Azure private networking technology.

Why would you do this? Microsoft provides the ability to securely access the REST API for your organization’s own OpenAI ChatGPT/DALL-E et al.; however, a front end is not part of the normal Azure OpenAI deployment, nor is the service deployed with private-only networking by default.

Never fear! The Azure OpenAI service now supplies an “easy button” inside Azure OpenAI Studio to deploy an Azure Web App-based front end for you. One caveat with this “easy button” is that it does not deploy the Azure Web App front end in a privatized configuration. The Web App does, however, include Azure Active Directory authentication, so only the users you define can access it inside the private network.

In this tutorial you will find the steps to deploy both Azure OpenAI and the new Azure Web Apps-based front end for the OpenAI service in a private-networking-only manner.

Azure OpenAI & Web App Private Networking Diagram

Deploying the OpenAI Service Back End Privately

1. Set up the Azure OpenAI back end first using Private Endpoints as described here.

TIP: Make sure your DNS settings are correct and your endpoints/users can resolve the private IP address of the private endpoint that was just provisioned.

2. Next, you’ll deploy a model so that the REST endpoint is ready to be called over a private network. Follow the steps here. After you’ve deployed the model, you can verify that the OpenAI service resolves to the private IP address by pinging it or using nslookup. For a full test of both the private networking and the deployment with a JSON test payload, take a look here using Postman on your own laptop with connectivity to the private network.
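
If you prefer to script that test instead of using Postman, a minimal Python sketch like the one below sends a JSON payload to the chat completions endpoint. The resource name, deployment name, API version, and key are placeholders; substitute your own values and run it from a machine that can reach the private network.

```python
# Minimal sketch of a private-network test call to the Azure OpenAI REST endpoint.
# Placeholder values: replace the resource name, deployment name, API version, and key
# with your own. Run this from a machine that can resolve the private endpoint.
import requests

RESOURCE = "my-aoai-resource"          # placeholder Azure OpenAI resource name
DEPLOYMENT = "gpt-35-turbo"            # placeholder model deployment name
API_VERSION = "2023-05-15"             # adjust to the API version you deployed against
API_KEY = "<your-azure-openai-key>"    # from the resource's Keys and Endpoint blade

url = (
    f"https://{RESOURCE}.openai.azure.com/openai/deployments/"
    f"{DEPLOYMENT}/chat/completions?api-version={API_VERSION}"
)

payload = {"messages": [{"role": "user", "content": "Hello from the private network!"}]}

resp = requests.post(url, headers={"api-key": API_KEY}, json=payload, timeout=30)
print(resp.status_code)
print(resp.json())
```

A 200 response here confirms both that the private DNS/routing works and that the model deployment is live; a timeout usually means the caller cannot reach the private endpoint.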

Deploying the Azure Web App Front End as a ChatGPT Interface on a Private Network

3. Now, deploy the Azure App Services Web App front end by using the Deploy to… button at the top. This deploys:

a. an App Service registration,
b. the web app front end, and
c. authentication on the web app.

Deploy button in Azure AI Studio to deploy generic web front end

Once the Azure Web App is deployed, you can navigate to it and look at the Authentication blade to ensure authentication is set up.

Authentication blade from Azure Web App that was deployed from Azure AI Studio
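
One quick way to confirm authentication is enforced (a sketch, not part of the official deployment steps) is to send an unauthenticated request to the Web App and check that it redirects to the Azure AD sign-in page instead of serving the chat UI. The hostname below is a placeholder for your own Web App.

```python
# Sketch: confirm App Service authentication (Easy Auth) is enforced by checking that
# an unauthenticated request is redirected to the Azure AD login endpoint.
# "my-aoai-frontend" is a placeholder Web App name.
import requests

resp = requests.get(
    "https://my-aoai-frontend.azurewebsites.net/",
    allow_redirects=False,
    timeout=30,
)

print(resp.status_code)                      # typically 302 when Easy Auth is on
print(resp.headers.get("Location", ""))      # should point at login.microsoftonline.com
```

At this stage the Web App is still publicly reachable; once private ingress is enabled in the next step, run this from inside the private network.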

4. Now that we have a Web App front end, we need to ensure that both ingress and egress web traffic stays on the private network only, so that the Web App can communicate with the private-endpoint-enabled Azure OpenAI service and no traffic traverses the public network.

To do this, we’ll need to set up both a private endpoint for ingress and VNET integration/injection for egress.

Private Ingress of Web Traffic to the Front End

For ingress on the web app, follow these steps and make sure that the ingress subnet can talk to the Azure OpenAI private endpoint you set up in step 1, either via route tables or by being in the same logical VNET.

TIP: Be sure that your DNS zone information is correctly set up, and perform an nslookup on the FQDN of the Web App to ensure it resolves to a private IP address so you can be sure that traffic is routing privately.

nslookup or ping utilities can verify your private networking links are set up and resolving correctly
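
If you’d rather script this verification, the small sketch below resolves a hostname and reports whether it lands on a private (RFC 1918) address; it covers both the Web App FQDN and the Azure OpenAI private endpoint from the earlier tip. The hostnames are placeholders, and it should run from a machine inside (or connected to) the VNET so the private DNS zone answers.

```python
# Sketch: resolve a hostname and confirm it maps to a private (RFC 1918) IP address.
# Run from a machine inside the VNET (or connected via VPN/ExpressRoute) so the
# private DNS zone answers. Hostnames below are placeholders for your own resources.
import socket
import ipaddress

HOSTNAMES = [
    "my-aoai-resource.openai.azure.com",     # Azure OpenAI private endpoint
    "my-aoai-frontend.azurewebsites.net",    # Web App front end
]

for host in HOSTNAMES:
    try:
        infos = socket.getaddrinfo(host, 443, proto=socket.IPPROTO_TCP)
    except socket.gaierror as exc:
        print(f"{host}: resolution failed ({exc})")
        continue
    for info in infos:
        ip = ipaddress.ip_address(info[4][0])
        status = "private" if ip.is_private else "PUBLIC"
        print(f"{host} -> {ip} ({status})")
```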

Here’s something close to what you will see once private ingress is enabled:

Azure Web App networking properly configured for private traffic

Private Egress of Web Traffic from the Front End

For egress on the web app, follow these steps to make sure outbound traffic routes through private networking.

TIP: You can further enhance the solution by routing through a NextGen firewall if you want to capture more information about the traffic or for IPS/IDS purposes.

Here’s what it would look like when you enable VNET integration:

Egress properly set up from the Azure Web App front end

Now that you have both the OpenAI service and the Web App front end enabled for private networking, all data will be routed over your private network, and no one will be able to browse to the front end or authenticate to it unless they are both enabled for AAD access and on your private network.

Web Front End over Private Endpoints with Azure AD Authentication

Azure OpenAI “Easy Button” Deployment Web Front End

I hope this guide helps you, and please let me know if you run into issues with the documentation.

Cheers and Happy Azure-ing!

-Joey
