
Azure CloudMachine

Write Azure apps in 5 minutes

Getting started

Prerequisites

  • You must have an Azure subscription.
  • You must have .NET 8 (or higher) installed.
  • You must have the Azure CLI (az) installed.
  • You must have the Azure Developer CLI (azd) installed.
  • You must have npm installed.
  • You must be logged in to both the Azure CLI and the Azure Developer CLI.
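The tooling prerequisites above can be checked from a shell before you start; a minimal sketch (it only checks that each CLI is on PATH, not its version):

```shell
# Check that each required CLI is on PATH; reports found/MISSING per tool.
status=""
for tool in az azd dotnet npm; do
  if command -v "$tool" >/dev/null 2>&1; then
    status="${status}${tool}=found "
  else
    status="${status}${tool}=MISSING "
  fi
done
echo "$status"
```

Login state can then be confirmed with az account show and azd auth login --check-status.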

Walkthrough

Create Server Project

In a command-line window, type:

mkdir cmdemo
cd cmdemo
mkdir server
cd server
dotnet new web

Add Azure.CloudMachine.All package

dotnet add package Azure.CloudMachine.All --prerelease

Use Azure Developer CLI to provision CloudMachine

Open the Program.cs file and add the following two lines of code at the top of the file:

using Azure.CloudMachine;

if (CloudMachineInfrastructure.Configure(args)) return;

The CloudMachineInfrastructure.Configure call allows the app to be run with a -bicep switch, which generates the bicep files required to provision CloudMachine resources in Azure. Let's generate these bicep files now.

dotnet run -bicep

As you can see, a folder called infra was created with several bicep files in it. Let's now initialize the project.

azd init

When prompted, answer 'yes' to 'Continue initializing an app here?', select the 'minimal' template, and use 'cmserver' as the environment name.

Once the initialization completes, let's provision the resources. When prompted, select eastus as the region.

azd provision

When provisioning finishes, you should see something like the following in the console output

 (✓) Done: Resource group: cm125957681369428 (627ms)

If you go to the Azure portal, or run the following az command, you can see the resource group that was created. The resource group will contain resources such as Storage, Service Bus, and Event Grid.

az resource list --resource-group <resource_group_from_command_line> --output table

Use CDK to add resources to the CloudMachine

Since we are writing an AI application, we need to provision Azure OpenAI resources. To do this, add the following class to the end of the Program.cs file:

class AssistantService {
    internal static void Configure(CloudMachineInfrastructure cm) {
        cm.AddFeature(new OpenAIFeature() { Chat = new AIModel("gpt-4o-mini", "2024-07-18") });
    }
}

Then change the configuration call at the beginning of the file to:

if (CloudMachineInfrastructure.Configure(args, AssistantService.Configure)) return;

Now regenerate the bicep files and re-provision

dotnet run -bicep
azd provision

Call CloudMachine APIs

You are now ready to call the Azure OpenAI service from the app. To do this, add a CloudMachineClient field and a Chat method to AssistantService:

class AssistantService {
    CloudMachineClient cm = new CloudMachineClient();

    public async Task<string> Chat(string message) {
        var client = cm.GetOpenAIChatClient();
        ChatCompletion completion = await client.CompleteChatAsync(message);
        return completion.Content[0].Text;
    } 
}

Lastly, create an instance of the service and call the Chat method when the user navigates to the root URL:

var service = new AssistantService();
app.MapGet("/", async () => await service.Chat("List all noble gases"));

The full program should now look like the following:

using Azure.CloudMachine;
using Azure.CloudMachine.OpenAI;
using OpenAI.Chat;

if (CloudMachineInfrastructure.Configure(args, AssistantService.Configure)) return;

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

var service = new AssistantService();
app.MapGet("/", async () => await service.Chat("List all noble gases"));

app.Run();

class AssistantService {
    CloudMachineClient cm = new CloudMachineClient();

    public async Task<string> Chat(string message) {
        var client = cm.GetOpenAIChatClient();
        ChatCompletion completion = await client.CompleteChatAsync(message);
        return completion.Content[0].Text;
    } 

    internal static void Configure(CloudMachineInfrastructure cm) {
        cm.AddFeature(new OpenAIFeature() {
            Chat = new AIModel("gpt-4o-mini", "2024-07-18")
        });
    }
}

You can now start the application

dotnet run

and navigate to the URL printed in the console.
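If you prefer to stay in the terminal, you can also exercise the endpoint with curl. The port below matches the one used by the client app later in this walkthrough, but it is an assumption here — use whatever URL dotnet run actually prints:

```shell
# Query the app's root endpoint; replace 5121 with the port your server prints.
curl http://localhost:5121/
```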

Use TDK to expose Web APIs and generate TypeSpec

First, let's define an API we want to expose. We will do it using a C# interface. Add the following interface to the end of Program.cs:

interface IAssistantService {
    Task<string> Chat(string message);
}

Make sure that the AssistantService class implements the interface:

class AssistantService : IAssistantService

Expose the service methods as web APIs by adding the following line after the existing var service = new AssistantService(); line:

app.Map(service);

Lastly, add the ability to generate TypeSpec for the new API by adding a new statement to the Configure method

cm.AddEndpoints<IAssistantService>();

Your program should now look like the following:

using Azure.CloudMachine;
using Azure.CloudMachine.OpenAI;
using OpenAI.Chat;

if (CloudMachineInfrastructure.Configure(args, AssistantService.Configure)) return;

var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

var service = new AssistantService();
app.Map(service);
app.MapGet("/", async () => await service.Chat("List all noble gases"));

app.Run();

class AssistantService : IAssistantService {
    CloudMachineClient cm = new CloudMachineClient();

    public async Task<string> Chat(string message) {
        var client = cm.GetOpenAIChatClient();
        ChatCompletion completion = await client.CompleteChatAsync(message);
        return completion.Content[0].Text;
    } 

    internal static void Configure(CloudMachineInfrastructure cm) {
        cm.AddFeature(new OpenAIFeature() {
            Chat = new AIModel("gpt-4o-mini", "2024-07-18")
        });
        cm.AddEndpoints<IAssistantService>();
    }
}

interface IAssistantService {
    Task<string> Chat(string message);
}

You can now start the application and browse to the /chat endpoint [TBD on how to pass the parameter].

More interestingly, you can run the app with the -tsp switch to generate TypeSpec for the endpoint:

dotnet run -tsp

This creates a tsp directory containing an AssistantService.tsp file with the following contents:

import "@typespec/http";
import "@typespec/rest";
import "@azure-tools/typespec-client-generator-core";

@service({
  title: "AssistantService",
})

namespace AssistantService;

using TypeSpec.Http;
using TypeSpec.Rest;
using Azure.ClientGenerator.Core;

@client interface AssistantServiceClient {
  @get @route("chat") Chat(@body message: string) : {
    @statusCode statusCode: 200;
    @body response : string;
  };
}

Generate client libraries from TypeSpec

Let's now generate and build a C# client library from the AssistantService.tsp file:

cd ..
npm install @typespec/http-client-csharp
tsp compile .\server\tsp\AssistantService.tsp --emit "@typespec/http-client-csharp"
dotnet build tsp-output\@typespec\http-client-csharp\src\AssistantService.csproj

You can also generate client libraries for other languages, e.g. Python:

npm install @typespec/http-client-python
tsp compile .\server\tsp\AssistantService.tsp --emit "@typespec/http-client-python"

Create a command-line client app for the service

mkdir cmdclient
cd cmdclient
dotnet new console
dotnet add reference ..\tsp-output\@typespec\http-client-csharp\src\AssistantService.csproj

Then change the Program.cs file to the following, replacing the client URI with the URI from your server's launchSettings.json file (in the 'cmdemo\server\Properties' folder):

using AssistantService;

var client = new AssistantServiceClient(new Uri("http://localhost:5121/"));

while (true) {
    // Read a prompt from the console; exit on an empty line.
    string? message = Console.ReadLine();
    if (string.IsNullOrEmpty(message)) break;
    var completion = client.Chat(message);
    Console.WriteLine(completion);
}

Now start the server in a new command window:

start cmd
cd ..
cd server
dotnet run

Go back to the client project command window and start the client:

dotnet run

You can now use this simple command-line app to ask Azure OpenAI some questions.