Azure Function
Azure Functions (Durable Functions) could play the role of the orchestrator instead of Azure Data Factory.
Power Platform, using the built-in export service, will synchronize data in near real time: it will continuously export data and metadata to Azure Data Lake. All create, update and delete operations will be exported from Dataverse (Power Platform) to the Data Lake. The export can be initial and incremental, for both table data and metadata.
An Azure Function is triggered by a change in the Data Lake (creation, update, delete): it is a blob-triggered starter function.
The starter function will launch the orchestrator.
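As an illustration, the blob-triggered starter could look like the minimal Durable Functions sketch below (Python programming model v1; the function, orchestrator and container names are assumptions, and the blob-trigger and durable-client bindings themselves would live in the function's function.json):

```python
# starter/__init__.py -- hypothetical project layout
import azure.durable_functions as df
import azure.functions as func

async def main(changedblob: func.InputStream, starter: str) -> None:
    """Fires for each blob created or updated in the export container
    and starts the orchestration, passing the blob path as input."""
    client = df.DurableOrchestrationClient(starter)
    await client.start_new(
        "sap_export_orchestrator",            # hypothetical orchestrator name
        None,                                 # let the runtime generate the instance id
        {"blob_name": changedblob.name},
    )
```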
The orchestrator will chain several activities, each of which returns the data it is responsible for.
At the end of the workflow, the orchestrator will call a final activity that pushes the new data into the Data Lake.
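A minimal sketch of such an orchestrator, again with Durable Functions in Python (the activity names are hypothetical):

```python
# orchestrator/__init__.py -- hypothetical project layout
import azure.durable_functions as df

def orchestrator_function(context: df.DurableOrchestrationContext):
    payload = context.get_input()
    # Each activity returns the piece of data it is responsible for.
    record = yield context.call_activity("read_changed_record", payload)
    mapped = yield context.call_activity("map_to_sap_format", record)
    # The last activity pushes the prepared data into the target Data Lake.
    result = yield context.call_activity("write_to_data_lake", mapped)
    return result

main = df.Orchestrator.create(orchestrator_function)
```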
The API Manager will read the new data from the Data Lake and push it to SAP.
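The hand-off itself could look like the following sketch, assuming the SAP interface is exposed through API Management at a hypothetical route and the prepared data sits at a known path in the target Data Lake (all names, paths and the authentication scheme below are assumptions):

```python
import requests
from azure.storage.filedatalake import DataLakeServiceClient

# Read the newly written data from the target Data Lake.
service = DataLakeServiceClient(
    account_url="https://<target-account>.dfs.core.windows.net",
    credential="<account-key-or-token>",        # assumption: key auth for brevity
)
fs = service.get_file_system_client("export")    # hypothetical container name
data = fs.get_file_client("new/record.json").download_file().readall()

# Push it to SAP through the API Management facade.
resp = requests.post(
    "https://<apim-instance>.azure-api.net/sap/records",   # hypothetical APIM route
    data=data,
    headers={
        "Ocp-Apim-Subscription-Key": "<subscription-key>",
        "Content-Type": "application/json",
    },
)
resp.raise_for_status()
```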
Azure Data Factory
Azure Data Factory is oversized for this scenario because the data volume is low; nevertheless, ADF could be implemented as the orchestrator.
Power Platform, using the built-in export service, will synchronize data in near real time: it will continuously export data and metadata to Azure Data Lake. All create, update and delete operations will be exported from Dataverse (Power Platform) to the Data Lake. The export can be initial and incremental, for both table data and metadata.
Azure Data Factory (ADF) is triggered by a change in the Data Lake (creation, update, delete) through a storage (blob) event trigger.
The event trigger will start the orchestrator pipeline.
The orchestrator (ADF) will chain several activities, each of which returns the data it is responsible for.
At the end of the workflow, the orchestrator (ADF) will call a final activity that pushes the new data into the Data Lake.
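As an illustration, the event trigger could be wired to the orchestration pipeline with the azure-mgmt-datafactory SDK as in the sketch below (the resource names, export path and pipeline name are assumptions, and the pipeline with its activities would be defined separately):

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobEventsTrigger, TriggerResource, TriggerPipelineReference, PipelineReference,
)

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Fire the orchestration pipeline whenever a blob is created or deleted
# under the Dataverse export path of the Data Lake.
trigger = BlobEventsTrigger(
    scope=("/subscriptions/<sub>/resourceGroups/<rg>/providers/"
           "Microsoft.Storage/storageAccounts/<datalake-account>"),
    events=["Microsoft.Storage.BlobCreated", "Microsoft.Storage.BlobDeleted"],
    blob_path_begins_with="/dataverse-export/blobs/",   # hypothetical export path
    ignore_empty_blobs=True,
    pipelines=[TriggerPipelineReference(
        pipeline_reference=PipelineReference(reference_name="OrchestrateSapExport"),
    )],
)

adf.triggers.create_or_update(
    "<resource-group>", "<factory-name>", "OnDataLakeChange",
    TriggerResource(properties=trigger),
)
```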
The API Manager will read the new data from the Data Lake and push it to SAP.
Azure Worker Role
Power Platform, using the built-in export service, will synchronize data in near real time: it will continuously export data and metadata to Azure Data Lake. All create, update and delete operations will be exported from Dataverse (Power Platform) to the Data Lake. The export can be initial and incremental, for both table data and metadata.
Packaged as a container image in Azure Container Registry and deployed as an Azure Container Instance, the Worker Role (WR), running as a background service, will extract the data from the Data Lake.
The WR will push the data into the other Data Lake.
The API Manager will extract the data from the Data Lake and push it to SAP.
We don’t need multiple services or a dedicated orchestrator: the code itself can handle the orchestration if needed, as in the sketch below.
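A minimal sketch of such a background worker, assuming a simple polling loop and illustrative container names (a real service would add proper credentials, error handling and durable state tracking):

```python
import time
from azure.storage.blob import BlobServiceClient

SOURCE_CONN = "<source-data-lake-connection-string>"
TARGET_CONN = "<target-data-lake-connection-string>"

source = BlobServiceClient.from_connection_string(SOURCE_CONN) \
    .get_container_client("dataverse-export")   # hypothetical source container
target = BlobServiceClient.from_connection_string(TARGET_CONN) \
    .get_container_client("sap-staging")        # hypothetical target container

seen: set[str] = set()

while True:
    # Poll the source lake for new or changed blobs and copy them to the target lake.
    for blob in source.list_blobs():
        key = f"{blob.name}:{blob.etag}"
        if key in seen:
            continue
        data = source.download_blob(blob.name).readall()
        target.upload_blob(blob.name, data, overwrite=True)
        seen.add(key)
    time.sleep(60)   # simple schedule; the code itself does the "orchestration"
```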