
Azure Function

Azure Function could play a role as an orchestrator instead of using Azure Data Factory.

(Diagram: image-20240304-210133.png)
  1. Power Platform, using its built-in export service, synchronizes data in near real time: it continuously exports data and metadata from Dataverse (Power Platform) to Azure Data Lake. All create, update, and delete operations are exported. The export can be initial (full) or incremental, for both table data and metadata.

  2. An Azure Function is triggered by a change in the Data Lake (create, update, or delete): this is the blob-triggered starter function.

  3. The blob starter calls the orchestrator.

  4. The orchestrator coordinates several activities; each activity returns the data it is responsible for.

  5. At the end of the workflow, the orchestrator calls a final activity that pushes the new data to the Data Lake.

  6. The API Manager accesses the Data Lake to retrieve the new data and pushes it to SAP.
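Steps 3-5 above can be sketched in plain Python. This is only an illustration of the control flow (starter → orchestrator → activities → final push), not Azure Durable Functions code; all function and field names (`fetch_customer`, `fetch_orders`, `push_to_lake`, the `lake` list standing in for the Data Lake) are hypothetical placeholders.

```python
# Sketch of the starter/orchestrator/activity flow (steps 3-5).
# A real implementation would use the azure-durable-functions SDK;
# every name here is a placeholder for illustration only.

def fetch_customer(change):
    # Activity: return the customer portion of the changed record.
    return {"customer": change["id"]}

def fetch_orders(change):
    # Activity: return the related order data.
    return {"orders": [change["id"] * 10]}

def push_to_lake(payload, lake):
    # Final activity (step 5): write the merged payload to the Data Lake.
    lake.append(payload)
    return payload

def orchestrator(change, lake):
    # Step 4: run each activity and merge the data it returns,
    # then call the final activity that pushes the result.
    merged = {}
    for activity in (fetch_customer, fetch_orders):
        merged.update(activity(change))
    return push_to_lake(merged, lake)

def blob_starter(change, lake):
    # Steps 2-3: the blob-triggered starter hands the change
    # to the orchestrator.
    return orchestrator(change, lake)

lake = []
result = blob_starter({"id": 7}, lake)
print(result)  # {'customer': 7, 'orders': [70]}
```

In the real Durable Functions pattern, the starter would call `start_new` on an orchestration client, and the orchestrator would invoke each activity through `call_activity`; the sequencing shown here is the same.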

Azure Data Factory

Azure Data Factory is oversized for this scenario, because the data volume is low. However, ADF could still be implemented as the orchestrator.

(Diagram: image-20240305-134947.png)

  1. Power Platform, using its built-in export service, synchronizes data in near real time: it continuously exports data and metadata from Dataverse (Power Platform) to Azure Data Lake. All create, update, and delete operations are exported. The export can be initial (full) or incremental, for both table data and metadata.

  2. Azure Data Factory (ADF) is triggered by a change in the Data Lake (create, update, or delete) through a storage (blob) event trigger.

  3. The event trigger starts the orchestrator pipeline.

  4. The orchestrator pipeline (ADF) coordinates several activities; each activity returns the data it is responsible for.

  5. At the end of the workflow, the orchestrator pipeline calls a final activity that pushes the new data to the Data Lake.

  6. The API Manager accesses the Data Lake to retrieve the new data and pushes it to SAP.
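Step 2 of the ADF variant corresponds to a storage event trigger attached to the orchestrator pipeline. The fragment below is a minimal sketch of such a trigger definition; the trigger name, pipeline name, container path, and scope are all placeholders, and a real definition would point at the actual storage account and pipeline.

```json
{
  "name": "LakeChangeTrigger",
  "properties": {
    "type": "BlobEventsTrigger",
    "typeProperties": {
      "scope": "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<lake-account>",
      "blobPathBeginsWith": "/dataverse-container/blobs/",
      "events": [
        "Microsoft.Storage.BlobCreated",
        "Microsoft.Storage.BlobDeleted"
      ]
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "OrchestratorPipeline",
          "type": "PipelineReference"
        }
      }
    ]
  }
}
```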

Azure Worker Role
