In my previous post I described how to train an Azure AI Translator custom model. I tested it with a ready-made app, but now I want to create a self-help portal where Sitra personnel can test the translator. I decided to use a SharePoint folder (in a Teams channel's Files) where you can simply drop a PDF to be translated.
I created a Power Automate flow that triggers when new files are created in the SharePoint folder. Below is the flow, and after it I will guide you through each step of building it.
In the previous post I was wondering why I needed a storage account. Here is the point: the files must be moved from SharePoint to Azure Blob Storage before they can be translated. Create two containers there: one where you put the documents to be translated, and one where the custom translator will save the translated documents.
Connect to Blob Storage with the Power Automate Azure Blob Storage action, providing the endpoint and access key. Then get the file content from SharePoint and move it to Azure Blob Storage, which means creating a new blob (binary large object).
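For reference, the same "create blob" step that the Power Automate action performs corresponds to Azure's Put Blob REST operation. Here is a minimal sketch of how that request is shaped; the account name, container, file name, and SAS token below are placeholder assumptions, and actually sending the request would of course need valid credentials:

```python
# Sketch: the pieces of an Azure "Put Blob" REST call (HTTP PUT).
# All names and the SAS token are placeholders, not real values.

def build_put_blob_request(account: str, container: str,
                           blob_name: str, sas_token: str):
    """Return the URL and headers for uploading a file as a block blob."""
    url = f"https://{account}.blob.core.windows.net/{container}/{blob_name}?{sas_token}"
    headers = {
        "x-ms-blob-type": "BlockBlob",   # required by the Put Blob operation
        "Content-Type": "application/pdf",
    }
    return url, headers

url, headers = build_put_blob_request(
    "mystorageacct", "source-docs", "report.pdf", "sv=2022-11-02&sig=..."
)
print(url)
```

The request body would simply be the raw bytes of the PDF taken from the SharePoint "Get file content" step.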
Next I needed to figure out how to drive the custom translator through its REST API. Since I did not have Owner permission on the storage account, I used SAS (shared access signature) authentication to give the translator service permission to access the documents in the storage account containers.
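The SAS URLs the translator needs are nothing more than the container URLs with a SAS token appended as a query string. A minimal sketch with placeholder names; the token itself is generated in the Azure Portal (the container's "Shared access tokens" blade) or with the storage SDK:

```python
# Placeholder account, container, and token values (assumptions).

def container_sas_url(account: str, container: str, sas_token: str) -> str:
    """A container SAS URL is just the container URL plus the token."""
    return f"https://{account}.blob.core.windows.net/{container}?{sas_token}"

print(container_sas_url("mystorageacct", "source-docs", "sv=2022-11-02&sig=..."))
```

Remember to grant at least read/list permissions on the source container and write/list on the target container when generating the tokens.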
Follow the instructions to build the HTTP REST call, and then add the category parameter. The HTTP action looks like the picture below. Take your endpoint URI and append the /translator/text… rest of the URI. Then get the key for the headers and add the other headers as well. Format the body and fetch the container addresses with SAS tokens as instructed below. Add the category ID and you are good to go. Translation takes a couple of minutes, after which you can see the translated document in the target container in the Azure Portal.
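As a sketch of what the HTTP action is assembling, here is roughly how a Document Translation batch request is built, assuming the v1.0 batches endpoint. The endpoint, key, SAS URLs, target language, and category ID are all placeholder assumptions you would replace with your own values:

```python
import json

# Sketch of the Document Translation "start batch" request.
# Endpoint path assumes the v1.0 batches API; all values are placeholders.

def build_translation_request(endpoint, key, source_sas_url, target_sas_url,
                              target_language, category_id):
    url = f"{endpoint}/translator/text/batch/v1.0/batches"
    headers = {
        "Ocp-Apim-Subscription-Key": key,   # key from the Translator resource
        "Content-Type": "application/json",
    }
    body = {
        "inputs": [{
            "source": {"sourceUrl": source_sas_url},
            "targets": [{
                "targetUrl": target_sas_url,
                "language": target_language,
                "category": category_id,    # routes the job to the custom model
            }],
        }]
    }
    return url, headers, json.dumps(body)

url, headers, body = build_translation_request(
    "https://my-translator.cognitiveservices.azure.com", "<key>",
    "https://mystorageacct.blob.core.windows.net/source-docs?sv=...",
    "https://mystorageacct.blob.core.windows.net/translated-docs?sv=...",
    "fi", "<category-id>",
)
print(url)
```

The category ID in the body is what makes the service use your custom model instead of the generic one; without it you get a standard translation.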
I added a 5-minute delay there, and after it I get the file from Azure Blob Storage and save it (create it as a new file) back to SharePoint.
After moving the file back to SharePoint, remember to delete the files from both containers, since the translation processes the entire contents of the source container.
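If you want to do the cleanup over REST as well, it maps to Azure's Delete Blob operation (an HTTP DELETE per blob). A sketch with the same placeholder names as above:

```python
# Sketch: Delete Blob REST operation, one call per file, per container.
# Account, container, blob, and token names are placeholders.

def build_delete_blob_request(account: str, container: str,
                              blob_name: str, sas_token: str):
    """Return HTTP method and URL for deleting one blob."""
    url = f"https://{account}.blob.core.windows.net/{container}/{blob_name}?{sas_token}"
    return "DELETE", url

# Both containers should end up empty before the next flow run:
for container in ("source-docs", "translated-docs"):
    method, url = build_delete_blob_request("mystorageacct", container,
                                            "report.pdf", "sv=...&sig=...")
    print(method, url)
```

In the flow itself the equivalent is simply the "Delete blob" action pointed at each container.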