
Commit 1925b9a

Merge pull request #285 from Azure-Samples/jimchou/chat-openai-sample
Chat with Azure OpenAI quickstart sample

2 parents 5698a4f + 125d472, commit 1925b9a

File tree

7 files changed, +384 -0 lines changed

chat-openai-sample/.env

Lines changed: 7 additions & 0 deletions

```
PORT=8080
CONNECTION_STRING="<YOUR-ACS-CONNECTION_STRING>"

ACS_URL_ENDPOINT="<ACS_URL_ENDPOINT>"
AZURE_OPENAI_SERVICE_KEY="<YOUR-AZURE_OPENAI_SERVICE_KEY>"
AZURE_OPENAI_SERVICE_ENDPOINT="<YOUR-AZURE_OPENAI_SERVICE_ENDPOINT>"
AZURE_OPENAI_DEPLOYMENT_MODEL_NAME="<YOUR-AZURE_OPENAI_DEPLOYMENT_MODEL_NAME>"
```

chat-openai-sample/.gitignore

Lines changed: 30 additions & 0 deletions

```gitignore
# Ignore node_modules directory
node_modules/

# Ignore environment variables file
.env

# Ignore build output directory
dist/
build/
public/assets/

# Ignore IDE/Editor-specific files
.vscode/
.vs
.idea/

# Ignore user-specific configuration files
.npmrc
.gitconfig

# Ignore log files
*.log

# Ignore OS-generated files
.DS_Store
Thumbs.db

# Ignore package lock files
package-lock.json
yarn.lock
```

chat-openai-sample/README.md

Lines changed: 58 additions & 0 deletions

|page_type|languages|products|
|---|---|---|
|sample|<table><tr><td>TypeScript</td></tr></table>|<table><tr><td>azure</td><td>azure-communication-services</td></tr></table>|

# Chat with Azure OpenAI - Quick Start Sample

This sample application demonstrates how to integrate the Azure Communication Services Chat SDK with the Azure OpenAI Service to enable intelligent message analysis. The application listens for a user message, processes the text through the Azure OpenAI Service, and generates the appropriate analysis. Optionally, developers can replace this logic with their own AI model for message analysis.

- app.ts - Node.js application providing HTTP endpoints for message analysis (including the EventGrid webhook endpoint)
- client.ts - script to set up chat messages and test the HTTP endpoints locally for message analysis

## Prerequisites

- Create an Azure account with an active subscription. For details, see [Create an account for free](https://azure.microsoft.com/free/).
- Install [Visual Studio Code](https://code.visualstudio.com/download).
- Install [Node.js](https://nodejs.org/en/download).
- Create an Azure Communication Services resource. For details, see [Create an Azure Communication Resource](https://docs.microsoft.com/azure/communication-services/quickstarts/create-communication-resource). You need to record your resource **connection string** for this sample.
- An Azure OpenAI resource and a deployed model. See [instructions](https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/create-resource?pivots=web-portal).

## Before running the sample for the first time

1. Open an instance of PowerShell, Windows Terminal, Command Prompt, or equivalent and navigate to the directory where you want to clone the sample.
2. Run `git clone https://github.com/Azure-Samples/communication-services-javascript-quickstarts.git`.
3. cd into the `chat-openai-sample` folder.
4. From the root of the `chat-openai-sample` folder, run `npm install`.

### Setup and host your Azure DevTunnel

[Azure DevTunnels](https://learn.microsoft.com/en-us/azure/developer/dev-tunnels/get-started?tabs=windows) is an Azure service that lets you expose local web services to the internet. Use the following commands to connect your local development environment to the public internet. This creates a tunnel with a persistent endpoint URL and enables anonymous access. We use this endpoint to notify your application of chat events from the Azure Communication Services Chat service.

```bash
# Only needs to be done the first time
devtunnel user login

devtunnel create --allow-anonymous
devtunnel port create -p 8080
devtunnel host
```

### Configuring the application

Open the `.env` file and configure the following settings:

1. `PORT`: Localhost port to run the server app on.
2. `CONNECTION_STRING`: Azure Communication Services resource connection string.
3. `ACS_URL_ENDPOINT`: Azure Communication Services resource URL endpoint.
4. `AZURE_OPENAI_SERVICE_KEY`: Azure OpenAI service key.
5. `AZURE_OPENAI_SERVICE_ENDPOINT`: Azure OpenAI endpoint.
6. `AZURE_OPENAI_DEPLOYMENT_MODEL_NAME`: Azure OpenAI deployment name.

### Run the app locally

1. Open a new PowerShell window, cd into the `chat-openai-sample` folder, and run `npm run dev`.
2. Navigate to `http://localhost:8080/` in a browser to confirm the server is running.
3. To test the AI analysis API endpoints on your local machine, open another PowerShell window in the same directory and run `npm run client` to observe how messages are generated and processed.
4. (Optional) To set up EventGrid, follow [Setup and host your Azure DevTunnel](#setup-and-host-your-azure-devtunnel) and register an EventGrid webhook for the `ChatMessageReceived` event that points to your DevTunnel URI at `<DevTunnelUri>/api/chatMessageReceived`.

Once that's completed, you should have a running application. The best way to test it is to send a message in a chat thread and let your intelligent agent analyze it.
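
If you want a quick manual check of one of the analysis routes without running the full `client.ts` flow, you can call a route directly with an ACS user access token. The following is a minimal sketch, assuming the server started with `npm run dev` is listening locally and that you already have a valid chat-scoped token and a thread ID that the token's user belongs to (for example, taken from the `client.ts` output); the route shape and `Bearer` header format come from `app.ts`:

```typescript
import axios from 'axios';

// Assumptions: the server is running locally on PORT, and the placeholders
// below are replaced with a real ACS user access token and chat thread ID.
const PORT = process.env.PORT ?? '8080';
const token = '<ACS_USER_ACCESS_TOKEN>'; // placeholder
const threadId = '<CHAT_THREAD_ID>';     // placeholder, e.g. from client.ts output

async function checkSummaryRoute() {
  // app.ts expects the header "Authorization: Bearer <ACS_TOKEN>"
  const response = await axios.get(
    `http://localhost:${PORT}/api/chat/${threadId}/summary`,
    { headers: { Authorization: `Bearer ${token}` } }
  );
  console.log('Summary:', response.data);
}

checkSummaryRoute().catch((err) => console.error('Request failed:', err));
```

Any HTTP client works here; axios is used only because it is already a dependency of the sample.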

chat-openai-sample/package.json

Lines changed: 30 additions & 0 deletions

```json
{
  "name": "chat_openai",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "build": "tsc",
    "dev": "nodemon ./src/app.ts",
    "client": "tsc ./src/client.ts --outDir ./dist && node ./dist/client.js"
  },
  "keywords": [],
  "author": "",
  "license": "ISC",
  "dependencies": {
    "@azure/communication-chat": "^1.5.0",
    "@azure/communication-common": "^2.3.1",
    "@azure/communication-identity": "^1.3.0",
    "@azure/openai": "1.0.0-beta.12",
    "@types/express": "^4.17.17",
    "@types/node": "^20.2.1",
    "axios": "^1.7.7",
    "dotenv": "^16.3.1",
    "express": "^4.18.2"
  },
  "devDependencies": {
    "nodemon": "^2.0.22",
    "ts-node": "^10.9.1",
    "typescript": "^5.0.4"
  }
}
```

chat-openai-sample/src/app.ts

Lines changed: 147 additions & 0 deletions

```typescript
import { config } from 'dotenv';
import express, { Application } from 'express';
import { AzureCommunicationTokenCredential } from '@azure/communication-common';
import { ChatClient } from '@azure/communication-chat';
import { AzureKeyCredential, OpenAIClient } from '@azure/openai';
config();

const PORT = process.env.PORT;
const ACS_URL_ENDPOINT = process.env.ACS_URL_ENDPOINT;
const app: Application = express();
app.use(express.json());
const TRANSLATION_LANGUAGE = 'spanish'; // Change this to the target language you want to translate to

let openAiClient: OpenAIClient;

const numMessagesToSummarize = 10;
const summarizationSystemPrompt = 'Act like you are an agent specialized in generating a summary of a chat conversation. You will be provided with a JSON list of messages from a conversation; generate a summary of the conversation based on the message content.';
const sentimentSystemPrompt = 'Act like you are an agent specialized in determining the sentiment of a chat message. Please provide the sentiment of the given message as POSITIVE, NEGATIVE or NEUTRAL.';
const translationSystemPrompt = 'Act like you are an agent specialized in translating chat messages. Please translate the given message to TARGET_LANGUAGE. If you do not understand the language or recognize the words, just echo back the original message.';

/* Azure OpenAI Service */
async function createOpenAiClient() {
  const openAiServiceEndpoint = process.env.AZURE_OPENAI_SERVICE_ENDPOINT || "";
  const openAiKey = process.env.AZURE_OPENAI_SERVICE_KEY || "";
  openAiClient = new OpenAIClient(
    openAiServiceEndpoint,
    new AzureKeyCredential(openAiKey)
  );
  console.log("Initialized Open AI Client.");
}

// Send a system prompt plus user prompt to the deployed model and return the completion text.
async function getChatCompletions(systemPrompt: string, userPrompt: string) {
  const deploymentName = process.env.AZURE_OPENAI_DEPLOYMENT_MODEL_NAME;
  const messages = [
    { role: "system", content: systemPrompt },
    { role: "user", content: userPrompt },
  ];

  const response = await openAiClient.getChatCompletions(deploymentName, messages);
  const responseContent = response.choices[0].message.content;
  console.log(responseContent);
  return responseContent;
}

/* Azure Communication Services */
// Fetch the most recent messages of a thread on behalf of the token's user.
async function getRecentMessages(token: string, threadId: string) {
  const chatClient = new ChatClient(ACS_URL_ENDPOINT, new AzureCommunicationTokenCredential(token));
  const threadClient = chatClient.getChatThreadClient(threadId);
  const messagesIterator = threadClient.listMessages({ maxPageSize: numMessagesToSummarize });
  const messages = [];
  for await (const message of messagesIterator) {
    messages.push(message);
  }
  return messages;
}

// Fetch a single message from a thread on behalf of the token's user.
async function getMessage(token: string, threadId: string, messageId: string) {
  const chatClient = new ChatClient(ACS_URL_ENDPOINT, new AzureCommunicationTokenCredential(token));
  const threadClient = chatClient.getChatThreadClient(threadId);
  const message = await threadClient.getMessage(messageId);
  return message;
}

/* API routes */
app.get('/api/chat/:threadId/summary', async (req: any, res: any) => {
  // Authorization header format: "Bearer <ACS_TOKEN>"
  const token = req.headers['authorization'].split(' ')[1];
  const { threadId } = req.params;
  try {
    const messages = await getRecentMessages(token, threadId);
    const result = await getChatCompletions(summarizationSystemPrompt, JSON.stringify(messages));
    res.json(result);
  }
  catch (error) {
    console.error("Error during get summary.", error);
    res.status(500).send("Error during get summary.");
  }
});

app.get('/api/chat/:threadId/message/:messageId/sentiment', async (req: any, res: any) => {
  // Authorization header format: "Bearer <ACS_TOKEN>"
  const token = req.headers['authorization'].split(' ')[1];
  const { threadId, messageId } = req.params;
  try {
    const message = await getMessage(token, threadId, messageId);
    const result = await getChatCompletions(sentimentSystemPrompt, message.content.message);
    res.json(result);
  }
  catch (error) {
    console.error("Error during get sentiment.", error);
    res.status(500).send("Error during get sentiment.");
  }
});

app.get('/api/chat/:threadId/message/:messageId/translation/:language', async (req: any, res: any) => {
  // Authorization header format: "Bearer <ACS_TOKEN>"
  const token = req.headers['authorization'].split(' ')[1];
  const { threadId, messageId, language } = req.params;
  try {
    const message = await getMessage(token, threadId, messageId);
    const systemPrompt = translationSystemPrompt.replace('TARGET_LANGUAGE', language);
    const result = await getChatCompletions(systemPrompt, message.content.message);
    res.json(result);
  }
  catch (error) {
    console.error("Error during get translation.", error);
    res.status(500).send("Error during get translation.");
  }
});

// EventGrid webhook: handles subscription validation and ChatMessageReceived events.
app.post("/api/chatMessageReceived", async (req: any, res: any) => {
  console.log(`Received chatMessageReceived event - data --> ${JSON.stringify(req.body)} `);
  const event = req.body[0];

  try {
    const eventData = event.data;
    if (event.eventType === "Microsoft.EventGrid.SubscriptionValidationEvent") {
      console.log("Received SubscriptionValidation event");
      res.status(200).json({
        validationResponse: eventData.validationCode
      });
      return;
    }

    const messageId = event.data.messageId;

    // Sentiment Analysis
    const sentimentResult = await getChatCompletions(sentimentSystemPrompt, event.data.messageBody);
    console.log(`Sentiment ${messageId}: ${sentimentResult}`);

    // Translation
    const translationPrompt = translationSystemPrompt.replace('TARGET_LANGUAGE', TRANSLATION_LANGUAGE);
    const translationResult = await getChatCompletions(translationPrompt, event.data.messageBody);
    console.log(`Translating ${messageId}: ${translationResult}`);

    // Acknowledge the event so EventGrid does not retry delivery.
    res.sendStatus(200);
  }
  catch (error) {
    console.error("Error during the message received event.", error);
    res.sendStatus(500);
  }
});

app.get('/', (req, res) => {
  res.send('Hello ACS Chat!');
});

// Start the server
app.listen(PORT, async () => {
  console.log(`Server is listening on port ${PORT}`);
  await createOpenAiClient();
});
```
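
To exercise the `/api/chatMessageReceived` handler locally without a real EventGrid subscription, you can POST a mock event to it. The sketch below is a test aid under stated assumptions, not part of the sample: the payload only carries the fields this handler actually reads (`eventType`, `data.messageId`, `data.messageBody`), whereas real `Microsoft.Communication.ChatMessageReceived` deliveries include additional metadata.

```typescript
import axios from 'axios';

// Assumption: the server started with `npm run dev` is listening on port 8080.
const endpoint = 'http://localhost:8080/api/chatMessageReceived';

// Mock event array shaped like the parts app.ts reads; real EventGrid
// deliveries include more fields (id, topic, subject, eventTime, threadId, ...).
const mockEvents = [
  {
    eventType: 'Microsoft.Communication.ChatMessageReceived',
    data: {
      messageId: '1700000000000', // hypothetical message id
      messageBody: 'I am very upset I have not received my package yet.'
    }
  }
];

axios.post(endpoint, mockEvents)
  .then(() => console.log('Mock event accepted; check the server logs for the sentiment and translation output.'))
  .catch((err) => console.error('Mock event failed:', err));
```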

chat-openai-sample/src/client.ts

Lines changed: 99 additions & 0 deletions

```typescript
import { config } from 'dotenv';
import axios from 'axios';
import { CommunicationIdentityClient } from '@azure/communication-identity';
import { AzureCommunicationTokenCredential } from '@azure/communication-common';
import { ChatClient, CreateChatThreadOptions } from '@azure/communication-chat';
config();

const PORT = process.env.PORT;
const ACS_CONNECTION_STRING = process.env.CONNECTION_STRING;
const ACS_URL_ENDPOINT = process.env.ACS_URL_ENDPOINT;

const messages = [
  // Alice says:
  'How can I help you today?',
  // Bob says:
  'I am very upset I have not received my package yet.',
  // Alice says:
  'I have looked it up for you, it has been dispatched already. Please follow this tracking number 123456789 to see when it will arrive.'
];

// Call one of the local analysis endpoints with an ACS user token and log the result.
async function fetchAIAnalysis(url: string, token: string) {
  try {
    const response = await axios.get(url, {
      headers: {
        'Authorization': `Bearer ${token}`,
        'Content-Type': 'application/json'
      }
    });
    console.log(url);
    console.log(response.data);
  } catch (error) {
    console.error('Error fetching data:', error);
  }
}

export async function main() {
  console.log("=== Analyze Sentiment Chat Sample ===");

  // Create identities and tokens for Alice and Bob
  const identityClient = new CommunicationIdentityClient(ACS_CONNECTION_STRING);
  const { token: token1, user: user1 } = await identityClient.createUserAndToken(["chat"]);
  const { token: token2, user: user2 } = await identityClient.createUserAndToken(["chat"]);
  let chatClient1 = new ChatClient(ACS_URL_ENDPOINT, new AzureCommunicationTokenCredential(token1));
  let chatClient2 = new ChatClient(ACS_URL_ENDPOINT, new AzureCommunicationTokenCredential(token2));

  // Create a chat thread with both users
  const createChatThreadResult = await chatClient1.createChatThread(
    { topic: 'Customer Service' },
    { participants: [
      { id: { communicationUserId: user1.communicationUserId }, displayName: 'Alice' },
      { id: { communicationUserId: user2.communicationUserId }, displayName: 'Bob' },
    ]}
  );
  const threadId = createChatThreadResult.chatThread.id;
  let chatThreadClient1 = chatClient1.getChatThreadClient(threadId);
  let chatThreadClient2 = chatClient2.getChatThreadClient(threadId);

  // Send messages from user1 and user2
  const sendChatMessageResult1 = await chatThreadClient1.sendMessage(
    { content: messages[0] }, { senderDisplayName: 'Alice', type: 'text' }
  );
  console.log(`Alice (${sendChatMessageResult1.id}) ${messages[0]}`);
  const sendChatMessageResult2 = await chatThreadClient2.sendMessage(
    { content: messages[1] }, { senderDisplayName: 'Bob', type: 'text' }
  );
  console.log(`Bob (${sendChatMessageResult2.id}) ${messages[1]}`);
  const sendChatMessageResult3 = await chatThreadClient1.sendMessage(
    { content: messages[2] }, { senderDisplayName: 'Alice', type: 'text' }
  );
  console.log(`Alice (${sendChatMessageResult3.id}) ${messages[2]}`);

  // Sentiment analysis
  console.log('\n=== Fetching Sentiment Analysis ===');
  await fetchAIAnalysis(
    `http://localhost:${PORT}/api/chat/${threadId}/message/${sendChatMessageResult1.id}/sentiment`, token1);
  await fetchAIAnalysis(
    `http://localhost:${PORT}/api/chat/${threadId}/message/${sendChatMessageResult2.id}/sentiment`, token1);
  await fetchAIAnalysis(
    `http://localhost:${PORT}/api/chat/${threadId}/message/${sendChatMessageResult3.id}/sentiment`, token1);

  // Translation
  console.log('\n=== Fetching Translation ===');
  await fetchAIAnalysis(
    `http://localhost:${PORT}/api/chat/${threadId}/message/${sendChatMessageResult1.id}/translation/chinese`, token1);
  await fetchAIAnalysis(
    `http://localhost:${PORT}/api/chat/${threadId}/message/${sendChatMessageResult2.id}/translation/spanish`, token1);
  await fetchAIAnalysis(
    `http://localhost:${PORT}/api/chat/${threadId}/message/${sendChatMessageResult3.id}/translation/german`, token1);

  // Summary
  console.log('\n=== Fetching Summary ===');
  await fetchAIAnalysis(
    `http://localhost:${PORT}/api/chat/${threadId}/summary`, token2);
}

main().catch((err) => {
  console.error("The sample encountered an error:", err);
});
```

chat-openai-sample/tsconfig.json

Lines changed: 13 additions & 0 deletions

```json
{
  "compilerOptions": {
    "target": "ES2015",
    "module": "commonjs",
    "outDir": "./dist",
    "rootDir": "./src",
    "moduleResolution": "node",
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true
  },
  "include": ["./src"]
}
```
