To use this solution accelerator, you will need access to an Azure subscription. While not required, a prior understanding of Azure Synapse Analytics, Azure Machine Learning, Azure Logic Apps, Power Apps and Azure Kubernetes Service will be helpful.
Start by deploying the required resources to Azure. The button below will deploy Azure Synapse Analytics, Azure Machine Learning and its related resources, Azure Cosmos DB, Function App, Logic App, Speech Service, Translator and Azure Kubernetes Service.
Clone or download this repository and navigate to the project's root directory.
More information on Cloning a repository
- We are using data provided by the UCI Machine Learning Repository. You will need to download the following dataset:
- diabetic_data.csv
Upload the following files from the /Analytics_Deployment/Data folder into the ADLS storage account.
Create a new folder `DatasetDiabetes` under the `raw` container
- diabetic_data.csv
- Names.csv
Upload the following files from the /Analytics_Deployment/Data folder into the Cosmos DB.
Each file matches the name of the container.
- AdmissionSource.json
- Appointments.json
- ColumnLookupValues.json
- ColumnNameMap.json
- DischargeDisposition.json
- ICD9Code.json
You can use the Data migration tool, or go to the Data Explorer of your Cosmos DB, select the container, and click `Upload Item`.
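Because each file name matches its target container, the upload can also be scripted. The sketch below only builds the file-to-container mapping with the standard library; the actual upload would use the Data migration tool or the `azure-cosmos` SDK (indicated in the comment, not executed here).

```python
from pathlib import Path

# Each JSON file is loaded into the Cosmos DB container of the same name,
# e.g. Appointments.json -> container "Appointments".
DATA_DIR = Path("Analytics_Deployment/Data")
FILES = [
    "AdmissionSource.json", "Appointments.json", "ColumnLookupValues.json",
    "ColumnNameMap.json", "DischargeDisposition.json", "ICD9Code.json",
]

def container_for(filename: str) -> str:
    """Derive the target container name from the file name."""
    return Path(filename).stem

upload_plan = {container_for(f): DATA_DIR / f for f in FILES}
# With the azure-cosmos SDK you would then, for each container:
#   for item in json.load(open(path)): container_client.upsert_item(item)
```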
In order to perform the necessary actions in the Synapse workspace, you will need to grant additional access.
- Go to the Azure Data Lake Storage account created above
- Go to `Access Control (IAM)` > `+ Add` > `Add role assignment`
- Click the `Role` dropdown and select `Storage Blob Data Contributor`
- Search for your username and add it
- Click `Save` at the bottom
- NOTE: If you did not allow all connections during deployment of your resources, follow the steps below.
Before you can upload any assets to the Synapse workspace, you will first need to add your IP address to the workspace firewall.
- Go to the Azure Synapse resource you created in Step 1.
- Navigate to `Firewalls` under `Security` on the left-hand side of the page.
- At the top, click `+ Add client IP`.
- Your IP address should now be visible in the IP list.
- Go to the Machine Learning service and click the `Launch Studio` button
- Click on `Compute`, select `Inference Clusters`, and click `New`
- Choose to use an existing Azure Kubernetes cluster and select the cluster that was created during the deployment of the resources
- Click `Next`
- Specify a compute name and click `Create`
In this step you're going to add the Cosmos DB as a linked service in the Synapse Workspace.
- Launch the Synapse workspace (via Azure portal > Synapse workspace > Workspace web URL)
- Click on `Manage` > `Linked services` > `New`
- Type "Cosmos" in the search box and select the service "Azure Cosmos DB (SQL API)"
- Click `Continue`
- Fill in the following data for the linked service:
| Field | Value |
|---|---|
| Name | "patientHubDB" |
| Connect via integration runtime | AutoResolveIntegrationRuntime |
| Authentication method | Connection String |
| Account selection method | Enter Manually |
| Azure Cosmos DB account URI | Copy the Cosmos DB URI from the Cosmos DB you have created |
| Azure Cosmos DB access key | Copy the Cosmos DB Primary Key from the Cosmos DB you have created |
| Database name | "patienthubdb" |
- Launch the Synapse workspace (via Azure portal > Synapse workspace > Workspace web URL)
- Go to `Develop`, click the `+`, and click `Import` to select all Spark notebooks from the repository's `/Analytics_Deployment/Synapse-Workspace/Notebooks` folder
- For each of the notebooks, select `Attach to > spark1` in the top dropdown
- Update the `data_lake_account_name` variable to your ADLS account in the 00_preparedata.ipynb notebook
- Update the `file_system_name` variable to your container in the 00_preparedata.ipynb notebook
- Update the `data_lake_account_name` variable to your ADLS account in the 01_train_diabetes_readmission_automl.ipynb notebook
- Update the `file_system_name` variable to your container in the 01_train_diabetes_readmission_automl.ipynb notebook
- Update the `subscription_id` variable to your Azure subscription ID in the 01_train_diabetes_readmission_automl.ipynb notebook
- Update the `resource_group` variable to your resource group in the 01_train_diabetes_readmission_automl.ipynb notebook
- Update the `workspace_name` variable to your Azure Machine Learning workspace in the 01_train_diabetes_readmission_automl.ipynb notebook
- Update the `workspace_region` variable to your Azure Machine Learning workspace region in the 01_train_diabetes_readmission_automl.ipynb notebook
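Collected in one place, the configuration cells of 01_train_diabetes_readmission_automl.ipynb amount to assignments like the following; the values shown are placeholders to replace with your own resource names, not real defaults.

```python
# Configuration for 01_train_diabetes_readmission_automl.ipynb
# (replace each placeholder with your own resource names)
data_lake_account_name = "<your-adls-account>"   # ADLS Gen2 account from Step 1
file_system_name = "<your-container>"            # e.g. "raw"
subscription_id = "<your-subscription-id>"       # Azure subscription ID
resource_group = "<your-resource-group>"         # resource group from Step 1
workspace_name = "<your-aml-workspace>"          # Azure ML workspace name
workspace_region = "<your-aml-region>"           # e.g. "westeurope"
```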
- Update the `data_lake_account_name` variable to your ADLS account in the 02_deploy_AKS_diabetes_readmission_model.ipynb notebook
- Update the `file_system_name` variable to your container in the 02_deploy_AKS_diabetes_readmission_model.ipynb notebook
- Update the `subscription_id` variable to your Azure subscription ID in the 02_deploy_AKS_diabetes_readmission_model.ipynb notebook
- Update the `resource_group` variable to your resource group in the 02_deploy_AKS_diabetes_readmission_model.ipynb notebook
- Update the `workspace_name` variable to your Azure Machine Learning workspace in the 02_deploy_AKS_diabetes_readmission_model.ipynb notebook
- Update the `workspace_region` variable to your Azure Machine Learning workspace region in the 02_deploy_AKS_diabetes_readmission_model.ipynb notebook
- Update the `autoMLRunId` variable to the Run ID that was returned by the 01_train_diabetes_readmission_automl notebook in the 02_deploy_AKS_diabetes_readmission_model.ipynb notebook
- Update the `aks_target_name` variable to the compute target you created in step 5 in the 02_deploy_AKS_diabetes_readmission_model.ipynb notebook
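Since the 02 notebook needs all eight variables set correctly before it will run, a small guard cell like the one below (purely illustrative, not part of the repository) can catch placeholders that were left untouched instead of failing mid-run.

```python
def unset_placeholders(settings: dict) -> list:
    """Return the names of variables still holding empty or '<...>' placeholder values."""
    return [k for k, v in settings.items()
            if not v or (v.startswith("<") and v.endswith(">"))]

# The eight variables the 02 notebook expects, still at their placeholders:
settings = {
    "data_lake_account_name": "<your-adls-account>",
    "file_system_name": "<your-container>",
    "subscription_id": "<your-subscription-id>",
    "resource_group": "<your-resource-group>",
    "workspace_name": "<your-aml-workspace>",
    "workspace_region": "<your-aml-region>",
    "autoMLRunId": "<run-id-from-01-notebook>",
    "aks_target_name": "<your-inference-cluster>",
}
problems = unset_placeholders(settings)
# An assertion at the top of the notebook fails fast instead of mid-run:
# assert not problems, f"Set these variables first: {problems}"
```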
When all the variables are modified, publish all imported notebooks so they are saved in your workspace. Run the following notebooks in order:
- 00_preparedata.ipynb
- 01_train_diabetes_readmission_automl.ipynb
- 02_deploy_AKS_diabetes_readmission_model.ipynb
Before you can configure the service endpoint, there are some prerequisites that need to be installed on your device.
- Azure CLI - Required for the deployment and configuration of the Azure resources and source code
- Docker Desktop - Required for local debugging and for containerizing code during the deployment process
- PowerShell 7.x - Required to execute the deployment script
Open the deployapplications.ps1 file and update the necessary variables.
| Field | Value |
|---|---|
| $subscriptionId | SubscriptionID where Azure resources will be deployed |
| $resourcegroupName | Resource group name where Azure resources will be deployed |
| $containerRegistryName | Container Registry Name already deployed in previous deployment by Azure ML |
| $kubernetesName | AKS Name already deployed in previous deployment by Azure ML |
| $azurecosmosaccountName | Azure Cosmos DB Name already deployed in previous deployment |
| $azurecosmosdbDataBaseName | "patientHubDB" |
| $ttsSubscriptionKey | Azure Speech Service subscription key from the previous deployment |
| $ttsServiceRegion | Azure Speech Service region from the previous deployment |
| $mlServiceURL | URL of the realtime inference service deployed in Azure ML Studio in the previous step |
| $mlServiceBearerToken | Bearer token of the realtime inference service deployed in Azure ML Studio in the previous step |
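For reference, the `$mlServiceURL` and `$mlServiceBearerToken` pair is what any client needs to call the deployed realtime inference service. A minimal Python sketch of such a call is shown below; the URL, token, and payload shape are placeholders and assumptions, not the actual model schema.

```python
import json
import urllib.request

def build_scoring_request(service_url: str, bearer_token: str, payload: dict):
    """Build an authenticated POST request for an Azure ML realtime endpoint."""
    body = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        service_url,
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {bearer_token}",
        },
        method="POST",
    )

# Placeholder URL/token; the real values come from the table above.
req = build_scoring_request("http://<service-ip>/score", "<token>", {"data": []})
# urllib.request.urlopen(req) would then return the model's prediction.
```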
Then run PowerShell as Administrator and launch the script deployapplications.ps1
Once the script is finished, note down the public IP addresses for each service.
You will need them in the next step.
Go back to your Synapse workspace and open the notebook 03_load_predictions
Fill in the variables:
- Update the `data_lake_account_name` variable to your ADLS account in the 03_load_predictions.ipynb notebook
- Update the `file_system_name` variable to your container in the 03_load_predictions.ipynb notebook
- Update the `subscription_id` variable to your Azure subscription ID in the 03_load_predictions.ipynb notebook
- Update the `resource_group` variable to your resource group in the 03_load_predictions.ipynb notebook
- Update the `workspace_name` variable to your Azure Machine Learning workspace in the 03_load_predictions.ipynb notebook
- Update the `workspace_region` variable to your Azure Machine Learning workspace region in the 03_load_predictions.ipynb notebook
When all the variables are modified, publish the notebook so it is saved in your workspace, and run it.
- Go to https://make.preview.powerapps.com/
- In the right upper corner, make sure you select the correct environment where you want to deploy the Power App.
- Click on `Apps` > `Import Canvas App`
- Click `Upload` and select the `Frontend_Deployment/PatientHubDoctorPortal.zip` file
- Review the package content. You should see the details as in the screenshot below.
- Under `Review Package Content`, click the little wrench next to the application name `Provider Portal` to change the name of the application. Make sure the name is unique for the environment.
- Click `Import` and wait until you see the message `All package resources were successfully imported.`
- Click on `Flows`. You will notice that all the flows are disabled.
- You need to turn them on before you can use them. Hover over each of the flows, select the `More Commands` button, and click `Turn on`.
- For each flow, you need to change the HTTP component so that the URI points to your API services. Edit each flow, open the HTTP component, and paste the public IP addresses you noted down in the previous step. Your URI should look similar to the screenshot below.
| API Service | Flow |
|---|---|
| appointment | PatientHub-GetNextAppointments |
| batchinference | PatientHub-InferenceExplanation |
| patient | PatientHub-GetAllPatients |
| realtimeinference | PatientHub-RealtimeInference |
| tts | PatientHub-GetSpeechFile |
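Each flow's URI is the public IP of its matching API service from the table above plus that service's route. Assuming the route equals the service name (an assumption — keep whatever path the HTTP component already shows), the edit amounts to:

```python
# Map each flow to its API service (from the table above) and build the URI
# from the public IP you noted down. The "http://<ip>/<service>" shape is an
# assumption; verify it against the existing URI in the HTTP component.
FLOW_TO_SERVICE = {
    "PatientHub-GetNextAppointments": "appointment",
    "PatientHub-InferenceExplanation": "batchinference",
    "PatientHub-GetAllPatients": "patient",
    "PatientHub-RealtimeInference": "realtimeinference",
    "PatientHub-GetSpeechFile": "tts",
}

def flow_uri(flow: str, public_ips: dict) -> str:
    """Return the URI to paste into the flow's HTTP component."""
    service = FLOW_TO_SERVICE[flow]
    return f"http://{public_ips[service]}/{service}"
```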
- After the modification, click the "Test" button in the upper right corner to test the flow. If all went well, you should receive "Your flow ran successfully".
- Once the flows are modified, open the Power App and everything should work like a charm.
By accessing this code, you acknowledge that the code is not designed, intended, or made available: (1) as a medical device(s); (2) for the diagnosis of disease or other conditions, or in the cure, mitigation, treatment or prevention of a disease or other conditions; or (3) as a substitute for professional medical advice, diagnosis, treatment, or judgment. Do not use this code to replace, substitute, or provide professional medical advice, diagnosis, treatment, or judgement. You are solely responsible for ensuring the regulatory, legal, and/or contractual compliance of any use of the code, including obtaining any authorizations or consents, and any solution you choose to build that incorporates this code in whole or in part.