Context, Problem & Solution
Enable the integration between Power Platform and SAP so that changes in SAP are announced to Power Platform asynchronously (not in real time), without coupling the two systems. A good design will take into consideration the existing design related to SAP (see image below).
Context and Problem
The current architecture implies that a front door (micro-service) is mandatory. Direct access to SAP is not allowed.
There is currently no integration between Power Platform and SAP. Power Platform is a SaaS on the public Internet, even if it runs on the Microsoft backbone.
For now, there is only one-way communication, from SAP to Power Platform. Data transferred from SAP is not considered confidential.
Data volume is not substantial: about 2,000 records could be affected in Power Platform. This is not a case of intensive computing or long-running functions.
A hybrid scenario is highly recommended, given the current SAP architecture and the intention to upgrade the latter to HANA.
Solutions
Technology choices
The information below is established according to the context (data volume, etc.). Azure provides a multitude of services (both IaaS and PaaS).
Azure Data Factory: Even if ADF is a good choice, debugging a no-code/low-code pipeline in ADF could be very tough because of the multitude of business rules coming from SAP. If we decide that SAP pushes raw data, the no-code/low-code pipeline could be very hard to debug.
Azure Worker Role: The container image is published to Azure Container Registry (ACR) (fast, 24x7 reliability, no downtime, easy to install, …) and runs as an Azure Container Instance (ACI), which has limits that cannot be increased through a quota request (number of containers per container group, …). However, some limits can be increased (SKU, …). ACI is used when we do not need to provision or manage any underlying infrastructure and want to host simple applications, task automation, or build jobs. It is a perfect fit for a serverless container scenario in Azure.
ACI is not a good choice if the containers need more complex orchestration.
Azure Function: Even if Azure Functions could be a good choice because they are serverless, they might not be the right choice because of the multitude of business rules coming from SAP: an Azure Function method has to be short, stateless, and defensive, and long-running functions must be avoided.
Executable/Windows Service: A service deployed on an on-premises server could be a good choice. The issue could be related to which .NET runtime to target, because of the Power Platform (SaaS), precisely because of the Power Platform SDK (Software Development Kit): .NET Framework or .NET Core.
What we know
On one side, we have a SaaS platform (Power Platform) accessible through the Internet (even if it is reached through the Microsoft backbone/infrastructure). On the other side, we have an IaaS platform (SAP) which cannot be accessed directly; an API Manager is, and has to remain, the front door. Moreover, a communication between this API Manager and a file server is currently in use.
The 1st step of the integration is one-way communication (from SAP to Power Platform) and the 2nd step will be bidirectional communication (sending data from SAP to Power Platform and sending data from Power Platform to SAP).
Even though we are in the 1st step of the integration, different solutions and scenarios could include Azure Data Factory as an orchestrator and not only as a data transformer or mapper.
Solution Architecture (high-level)
The 1st step could be considered a temporary solution including the existing file server used by the API Manager deployed on a server in Azure (IaaS), the latter already connected to SAP.
The API Manager will extract the data from SAP and generate files (for example, CSV files).
The API Manager pushes the CSV files to a specific location on the file server.
Several options are considered:
Option 1 - On an on-premises server, a service packaged as an executable is deployed and triggered on a schedule. The service will extract the CSV files one by one and build a dataset from them.
Option 2 - Azure Data Factory, based on a schedule, will extract the CSV files one by one and build a dataset from them as well.
Option 3 - An Azure Function, based on a schedule, will extract the CSV files one by one and build a dataset from them.
Option 4 - An Azure Worker Role, once published in Azure Container Registry, will start running as a background service and will extract the CSV files one by one and build a dataset from them.
All services will be able to use the Power Platform SDK (Software Development Kit) to push the data into the Power Platform database and to log the execution process (failure and success). The dataset-building step shared by all four options is sketched after this list.
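As an illustration of the dataset-building step, here is a minimal sketch in C#. The CustomerRecord type, the ParseCsvFile helper, and the semicolon-separated two-column layout are assumptions for illustration, not the actual SAP extract format.

using System.Collections.Generic;
using System.IO;

// Hypothetical record type; the real columns depend on the SAP extract (KNA1, KNVV, ...).
public class CustomerRecord
{
    public string Key { get; set; }
    public string Name { get; set; }
}

// Builds an in-memory dataset from one CSV file produced by the API Manager.
public static List<CustomerRecord> ParseCsvFile(string path)
{
    var records = new List<CustomerRecord>();
    foreach (var line in File.ReadLines(path))
    {
        // Assumption: simple semicolon-separated values without quoted fields.
        var columns = line.Split(';');
        if (columns.Length < 2) { continue; }
        records.Add(new CustomerRecord { Key = columns[0], Name = columns[1] });
    }
    return records;
}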
Chosen Solution
Given that the Azure ecosystem is not ready, the preferred solution is the service deployed on an on-premises server.
A daily schedule will trigger the service (an executable with the .exe extension) on the Windows server.
The executable will initiate an FTPS connection (FTP over TLS) to the root folder of the file server (see the FTPS sketch after these steps).
The executable will extract the CSV files stored in a specific folder on the file server.
Using the app registration in Azure Active Directory (AAD), the executable will connect to Power Platform.
The executable will push the data coming from SAP to the Power Platform database using the SDK (Software Development Kit) client (see the SDK sketch after these steps).
During the process and at the end (failure or success), it will log the transaction in a file stored on the file server and on the server on which the executable is deployed (see the logging sketch below).
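A minimal sketch of the FTPS step (steps 2 and 3 above), assuming the classic FtpWebRequest API of the .NET Framework; the host name, folder, and credentials are placeholders.

using System.IO;
using System.Net;

// Downloads one CSV file from the file server over FTPS (FTP over TLS).
public static void DownloadCsvFile(string fileName, string localFolder)
{
    // Placeholder host and folder.
    var request = (FtpWebRequest)WebRequest.Create("ftp://fileserver.example.com/sap-extracts/" + fileName);
    request.Method = WebRequestMethods.Ftp.DownloadFile;
    request.EnableSsl = true; // switches the connection to TLS (FTPS)
    request.Credentials = new NetworkCredential("serviceAccount", "password"); // placeholder credentials
    using (var response = (FtpWebResponse)request.GetResponse())
    using (var ftpStream = response.GetResponseStream())
    using (var localFile = File.Create(Path.Combine(localFolder, fileName)))
    {
        ftpStream.CopyTo(localFile);
    }
}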
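A minimal sketch of the connection and push steps (steps 4 and 5 above), assuming the Microsoft.PowerPlatform.Dataverse.Client package; with the .NET Framework flavour of the SDK, CrmServiceClient plays the same role. The URL, client id, secret, and the account table are placeholders.

using System;
using Microsoft.PowerPlatform.Dataverse.Client;
using Microsoft.Xrm.Sdk;

// Connects to Power Platform with the AAD app registration (client credentials)
// and creates one row per record coming from SAP.
public static void PushRecord(string recordName)
{
    // Placeholders for the app registration created in Azure Active Directory.
    var connectionString =
        "AuthType=ClientSecret;" +
        "Url=https://yourorg.crm.dynamics.com;" +
        "ClientId=00000000-0000-0000-0000-000000000000;" +
        "ClientSecret=yourSecret";
    using (var serviceClient = new ServiceClient(connectionString))
    {
        var row = new Entity("account"); // hypothetical target table
        row["name"] = recordName;
        Guid createdId = serviceClient.Create(row);
    }
}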
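And a sketch of the dual logging of step 6; the log file locations are placeholders, and a production version would also have to handle the file server being unreachable.

using System;
using System.IO;

// Appends one log line both locally and on the file server (paths are placeholders).
public static void LogTransaction(string message, bool success)
{
    var line = string.Format("{0:u} | {1} | {2}", DateTime.UtcNow, success ? "SUCCESS" : "FAILURE", message);
    File.AppendAllText(@"C:\Logs\sap-powerplatform.log", line + Environment.NewLine);
    File.AppendAllText(@"\\fileserver\logs\sap-powerplatform.log", line + Environment.NewLine);
}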
Different scenarios for the 2nd step of the integration
The integration between the two systems is not really complex because system A (Power Platform), when sending data to system B (SAP), does not expect a response from the latter; and vice versa, system B (SAP) does not expect a response from system A (Power Platform).
Because the integration does not need to be real-time, it would be overkill to use Azure Event Hubs.
From Power Platform to SAP: see the pages below
From SAP to Power Platform: see the pages below
Software solution
Manager Layer
Instantiation of the execution rules (rule pattern)
// Registered validation rules (rule pattern); each rule knows one execution context.
private readonly List<IExecutionContext> _executionContexts = new List<IExecutionContext>();

// Constructor: register one rule per supported SAP use case.
public ManagerObjectValidation()
{
    _executionContexts.Add(new Validate_KNA1());
    _executionContexts.Add(new Validate_KNA1_KNVV());
    //etc...
}
Execution of the method to validate the execution context
public ResultValidation ProcessExecutionValidation(string key)
{
    var result = new ResultValidation();
    if (_executionContexts.Count > 0)
    {
        // Ask each registered rule in turn; the first rule that recognizes the key wins.
        foreach (var executionContext in _executionContexts)
        {
            result = executionContext.ValidateExecutionContext(key);
            if (result.validated) { break; }
        }
    }
    else
    {
        result.errorMessage = Constantes.MessageFromManagerValidation;
    }
    return result;
}
Execution of the validation method
public ResultValidation ValidateExecutionContext(string key)
{
    var result = new ResultValidation();
    // Return a non-validated result instead of null so the manager loop never dereferences null.
    if (string.IsNullOrEmpty(key)) { return result; }
    // This rule only recognizes the KNA1 use case.
    if (key.ToLower() != Constantes.KNA1.ToLower()) { return result; }
    result.executionContext = EnumExecutionContext.UseCase.KNA1;
    result.validated = true;
    result.key = key;
    return result;
}
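For illustration, a hypothetical call into the manager from the import flow (assuming the "KNA1" key matches Constantes.KNA1):

var manager = new ManagerObjectValidation();
var validation = manager.ProcessExecutionValidation("KNA1");
if (!validation.validated)
{
    // Stop the flow; validation.errorMessage explains why.
}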
Business Layer
Instantiation of the right process with the right execution context: instantiation in the Factory (sketched below)
Call the processes ProcessContextBuildingObject, ProcessMappingObject, etc.
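A hedged sketch of what the factory could look like; IProcess, Process_KNA1, Process_KNA1_KNVV, and the KNA1_KNVV enum member are assumed names, only EnumExecutionContext.UseCase.KNA1 appears in the code above.

using System;

// Hypothetical factory: resolves the right process from the validated execution context.
public static class ProcessFactory
{
    public static IProcess Create(EnumExecutionContext.UseCase context)
    {
        switch (context)
        {
            case EnumExecutionContext.UseCase.KNA1:
                return new Process_KNA1(); // assumed name
            case EnumExecutionContext.UseCase.KNA1_KNVV:
                return new Process_KNA1_KNVV(); // assumed name
            default:
                throw new ArgumentOutOfRangeException(nameof(context));
        }
    }
}

The Business Layer then calls ProcessContextBuildingObject, ProcessMappingObject, etc. on the instance returned by the factory.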
Repository Layer
Instantiation of the repository with the Power Platform context
Example of an overridden method in a specific repository (sketched below)
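For illustration, a sketch of a base repository holding the Power Platform context and of an overridden method in a specific repository; RepositoryBase, AccountRepository, and the tagging rule are assumptions built on the SDK's IOrganizationService interface.

using System;
using Microsoft.Xrm.Sdk;

// Hypothetical base repository holding the Power Platform context (IOrganizationService).
public abstract class RepositoryBase
{
    protected readonly IOrganizationService _service;

    protected RepositoryBase(IOrganizationService service)
    {
        _service = service;
    }

    public virtual Guid Create(Entity entity)
    {
        return _service.Create(entity);
    }
}

// Specific repository overriding the method to apply an entity-specific rule.
public class AccountRepository : RepositoryBase
{
    public AccountRepository(IOrganizationService service) : base(service) { }

    public override Guid Create(Entity entity)
    {
        // Assumption: tag every record pushed from SAP before creating it.
        entity["description"] = "Imported from SAP";
        return base.Create(entity);
    }
}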
Issues and Considerations
Complexity: the application design could become more complex because of the business rules related to SAP, especially if the API Manager pushes the raw SAP data.
Orchestration: in a 2-way communication, an orchestrator will be necessary to manage the eventual consistency of the data.
Resilience: logging the transaction is not enough. If the server crashes, alerting is the only way to deal with it.
Design: without decoupling through Azure (PaaS), the design is not ready for a 2-way communication.