
Enable the integration between Power Platform and SAP so that changes in SAP are announced to Power Platform asynchronously (not in real time), without coupling the two systems. A good design takes into consideration the existing architecture around SAP (see image below).

image-20240306-163222.png

Context and Problem

  1. The current architecture makes a front door (micro-service) mandatory. Direct access to SAP is not allowed.

  2. There is currently no integration between Power Platform and SAP. Power Platform is a SaaS offering on the public Internet, even though traffic stays on the Microsoft backbone.

  3. For now, there is only one-way communication, from SAP to Power Platform. The data transferred from SAP is not considered confidential.

  4. Data volume is not substantial: about two thousand records could be affected in Power Platform. This is not a case of intensive computing or long-running functions.

  5. A hybrid scenario is highly recommended given the current SAP-related architecture and the intention to upgrade SAP to HANA.

image-20240306-165016.png

Solutions

Technology choices

The information below is established according to the context (data volume, etc.). Azure provides a multitude of services (both IaaS and PaaS).

  1. Azure Data Factory: Even if ADF is a good choice, a no-code/low-code pipeline in ADF could be very hard to debug because of the multitude of business rules coming from SAP, especially if we decide that SAP pushes raw data.

  2. Azure Worker Role: Deployed via Azure Container Registry (ACR) (fast, 24x7 reliability, no downtime, easy to install, …) and running as an Azure Container Instance (ACI). ACI has limits that cannot be increased through a quota request (number of containers per container group, …) and others that can (SKU, …). ACI is used when we do not need to provision or manage any underlying infrastructure, and to host simple applications, task automation, or build jobs. It is a perfect fit for a serverless-container scenario in Azure.

ACI is not a good choice if the containers need more complicated orchestration.

  3. Azure Function: Even if Azure Functions could be a good choice because they are serverless, they might not be the right fit: the multitude of business rules coming from SAP conflicts with the guidance that an Azure Function should be short, stateless, and defensive, and that long-running functions must be avoided.

  4. Executable/Windows Service: A service deployed on an on-premise server could be a good choice. The open question is which .NET flavor to target, because of the Power Platform (SaaS) SDK (Software Development Kit): .NET Framework or .NET Core.

 

What do we know

On one side, we have a SaaS platform (Power Platform) accessible through the Internet (even if it is reached through the Microsoft backbone/infrastructure). On the other side, we have an IaaS platform (SAP) which cannot be accessed directly: an API Manager is, and has to remain, the front door. Moreover, a communication channel between this API Manager and a file server is already in use.

The 1st step of the integration is one-way communication (from SAP to Power Platform); the 2nd step will be bidirectional communication (sending data from SAP to Power Platform and data from Power Platform to SAP).

Even though we are in the 1st step of the integration, the different solutions and scenarios could include Azure Data Factory as an orchestrator, and not only as a data transformer or mapper.

image-20240306-165357.png

Solution Architecture (high-level)

The 1st step could be considered a temporary solution: it reuses the existing file server used by the API Manager, which is deployed on a server in Azure (IaaS) and already connected to SAP.

image-20240306-165510.png
  1. The API Manager extracts the data from SAP and generates files (for example, CSV files).

  2. The API Manager pushes the CSV files to a specific location on the file server.

  3. Several options are considered:

    1. Option 1 - A service (executable) deployed on an on-premise server and triggered on a schedule extracts the CSV files one by one and builds a dataset from them.

    2. Option 2 - Azure Data Factory, on a schedule, extracts the CSV files one by one and builds a dataset from them.

    3. Option 3 - An Azure Function, on a schedule, extracts the CSV files one by one and builds a dataset from them.

    4. Option 4 - An Azure Worker Role, once published in Azure Container Registry, runs as a background service, extracts the CSV files one by one, and builds a dataset from them.

  4. All of these services are able to use the Power Platform SDK (Software Development Kit) to push the data into the Power Platform database and to log the execution process (failures and successes).
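
Whichever option is chosen, the "build a dataset" step boils down to parsing the CSV extracts into typed rows. Below is a minimal sketch, assuming a semicolon-separated KNA1 extract with a header row and hypothetical field names (the real files carry the full SAP field list):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical shape of a KNA1 row; real extracts carry many more SAP fields.
public record Kna1Row(string CustomerNumber, string Name, string City);

public static class CsvDataset
{
    // Builds a dataset from raw CSV lines (header + data rows),
    // mirroring step 3: extract the CSV files and build a dataset from them.
    public static List<Kna1Row> FromCsvLines(IEnumerable<string> lines)
    {
        return lines
            .Skip(1) // skip the header row
            .Where(l => !string.IsNullOrWhiteSpace(l))
            .Select(l => l.Split(';'))
            .Select(f => new Kna1Row(f[0].Trim(), f[1].Trim(), f[2].Trim()))
            .ToList();
    }
}
```

In practice the parsing would also validate the field count per line and report malformed rows to the execution log rather than failing the whole run.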

 

Chosen Solution

Given that the Azure ecosystem is not ready, the preferred solution is the service deployed on an on-premise server.

image-20240306-165717.png
  1. A daily schedule triggers the service (an .exe executable) on the Windows Server.

  2. The executable initiates an FTPS connection to the root folder on the file server.

  3. The executable extracts the CSV files stored in a specific folder on the file server.

  4. Using an app registration in Azure Active Directory (AAD), the executable connects to Power Platform.

  5. The executable pushes the data coming from SAP into the Power Platform database using the SDK client (Software Development Kit).

  6. During the process and at the end (failure or success), it logs the transaction in a file stored on the file server and in a file on the server where it is deployed.
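
Step 6 can be sketched as a small helper that appends the same run record to both log locations. This is a minimal sketch: the paths, line format, and message are assumptions, since the real values would come from the executable's configuration.

```csharp
using System;
using System.IO;

public static class RunLogger
{
    // Writes one log line per run to both locations named in step 6:
    // a file on the file server share and a file next to the executable.
    public static void LogRun(string fileServerLogPath, string localLogPath,
                              bool success, string message)
    {
        var line = $"{DateTime.UtcNow:O}|{(success ? "SUCCESS" : "FAILURE")}|{message}{Environment.NewLine}";
        File.AppendAllText(fileServerLogPath, line);
        File.AppendAllText(localLogPath, line);
    }
}
```

Duplicating the log keeps a trace readable by the SAP side (file server) and one usable for local troubleshooting even when the file server is unreachable; in the latter case the first append would throw and should itself be caught and logged locally.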

 

Different scenarios for the 2nd step of the integration

The integration is not really complex, because system A (Power Platform), when sending data to system B (SAP), does not expect a response from the latter; and, vice versa, system B (SAP) does not expect a response from system A (Power Platform).

Because the integration does not need to happen in real time, it would be overkill to use Azure Event Hubs.

  1. From Power Platform to SAP: see the pages below.

  2. From SAP to Power Platform: see the pages below.

Software solution

image-20240306-170516.png

Manager Layer

Instantiation of the execution rules (rule pattern)

List<IExecutionContext> _executionContexts = new List<IExecutionContext>();

//Constructor: register one validation rule per SAP extract
public ManagerObjectValidation()
{
    _executionContexts.Add(new Validate_KNA1());
    _executionContexts.Add(new Validate_KNA1_KNVV());
    //etc...
}

Execution of the method to validate the execution context

public ResultValidation ProcessExecutionValidation(string key)
{
    var result = new ResultValidation();
    if (_executionContexts.Count > 0)
    {
        for (var i = 0; i < _executionContexts.Count; i++)
        {
            result = _executionContexts[i].ValidateExecutionContext(key);
            if (result.validated) { break; }
        }
    }
    else
    {
        result.errorMessage = Constantes.MessageFromManagerValidation;
    }
    return result;
}

Execution of the validation method

public ResultValidation ValidateExecutionContext(string key)
{
    if (string.IsNullOrEmpty(key)) return null;
    var result = new ResultValidation();

    if (key.ToLower() != Constantes.KNA1.ToLower()) { return result; }

    result.executionContext = EnumExecutionContext.UseCase.KNA1;
    result.validated = true;
    result.key = key;

    return result;
}
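
Putting the two snippets above together, here is a condensed, self-contained sketch of the rule pattern: one rule per SAP extract, and a manager that stops at the first rule that claims the key. The types are simplified for illustration (a record stands in for the real ResultValidation):

```csharp
using System;
using System.Collections.Generic;

public enum UseCase { None, KNA1, KNA1_KNVV }

// Simplified stand-in for the real ResultValidation class.
public record ResultValidation(bool Validated, UseCase ExecutionContext, string Key);

public interface IExecutionContext
{
    ResultValidation ValidateExecutionContext(string key);
}

// Each rule only claims the key it owns and declines everything else.
public class Validate_KNA1 : IExecutionContext
{
    public ResultValidation ValidateExecutionContext(string key) =>
        string.Equals(key, "KNA1", StringComparison.OrdinalIgnoreCase)
            ? new ResultValidation(true, UseCase.KNA1, key)
            : new ResultValidation(false, UseCase.None, key);
}

public class Validate_KNA1_KNVV : IExecutionContext
{
    public ResultValidation ValidateExecutionContext(string key) =>
        string.Equals(key, "KNA1_KNVV", StringComparison.OrdinalIgnoreCase)
            ? new ResultValidation(true, UseCase.KNA1_KNVV, key)
            : new ResultValidation(false, UseCase.None, key);
}

public class ManagerObjectValidation
{
    private readonly List<IExecutionContext> _executionContexts = new()
    {
        new Validate_KNA1(),
        new Validate_KNA1_KNVV(),
    };

    // Runs each rule in turn and stops at the first one that validates the key.
    public ResultValidation ProcessExecutionValidation(string key)
    {
        ResultValidation result = new(false, UseCase.None, key);
        foreach (var ctx in _executionContexts)
        {
            result = ctx.ValidateExecutionContext(key);
            if (result.Validated) break;
        }
        return result;
    }
}
```

Adding a new SAP extract then only means writing one new rule class and registering it in the manager's list.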

Business Layer

Instantiation of the right process for the right execution context: instantiation in the factory

public IAbstractFactory InstanciateBuildingObject(EnumExecutionContext.UseCase executionContext)
{
    IAbstractFactory instanciation = null;

    switch (executionContext)
    {
        case EnumExecutionContext.UseCase.KNA1:
            instanciation = new BuildingObjectKNA1();
            break;
        case EnumExecutionContext.UseCase.KNA1_KNVV:
            instanciation = new BuildingObjectKNA1_KNVV();
            break;
        //etc...
    }
    return instanciation;
}

Calling the processes ProcessContextBuildingObject, ProcessMappingObject, etc.

//ProcessContextBuildingObject
public ResultValidation ProcessContextBuildingObject(ref ResultValidation result, IContextServices contextServices, string key,
    string dirPath, string file, string FTPUser, string FTPPassword)
{
    var filesFtpInDirectory = dirPath.FtpDirectoryList(FTPUser, FTPPassword);
    if (filesFtpInDirectory == null) { return result; }

    var filesKNA1 = filesFtpInDirectory.Where(fileFtpInDirectory => fileFtpInDirectory.Contains(Constantes.KNA1)).ToList();
    if (!filesKNA1.Any()) { return result; }
    //etc... (the CSV content is read into a DataTable dt here; code elided)
    result.kNA1s = new List<KNA1>();
    if (dt != null) { result.kNA1s = Util.ConvertDataTableToKNA1(dt); }
    return result;
}

//ProcessMappingObject
public void ProcessMappingObject(IContextServices contextServices, ref ResultValidation result)
{
    result.accountsKNA1 = new List<Account>();
    result.accountsKNA1 = Util.MapKNA1toAccount(contextServices, result.optionSetMetadataBases, result.kNA1s,
        result.existingAccounts, result.industryCodes, result.industries, result.shippingTypes, result.customerSectors,
        result.customerActivities, result.customerProducts, result.transactionCurrencies, result.transportZones);
    //etc...
}
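
The heart of a Util.MapKNA1toAccount-style helper is a projection from raw KNA1 rows to Account rows. Below is a minimal sketch with hypothetical field shapes; the real mapping also resolves option sets, currencies, transport zones, etc. against Dataverse reference data.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical field shapes for illustration only.
public record Kna1(string Kunnr, string Name1, string Ort01);
public record Account(string AccountNumber, string Name, string City);

public static class Mapping
{
    // Maps raw KNA1 rows to Power Platform Account rows.
    public static List<Account> MapKna1ToAccounts(IEnumerable<Kna1> rows) =>
        rows.Select(r => new Account(
                AccountNumber: r.Kunnr.TrimStart('0'), // SAP pads customer numbers with leading zeros
                Name: r.Name1,
                City: r.Ort01))
            .ToList();
}
```

Keeping the mapping in one pure function like this is what makes the business rules testable outside of ADF or any Azure runtime, which is the main argument made above against the no-code/low-code options.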

Repository Layer

Instantiation of the repository with the Power Platform context

//Constructor
protected XrmRepository(IContextServices context)
{
    OrganizationContextXrm = context.OrganizationContextPowerPlatform;
    ServiceClientDataverse = context.ServiceClientDataverse;
    CrmServiceClientDataverse = context.CrmServiceClientDataverse;
}

//Virtual methods to override (overriding is optional)
public virtual IEnumerable<T> FindAll(QueryExpression qe)
{
    throw new NotImplementedException();
}
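
The "virtual methods to override, but not mandatory" idea can be illustrated with a simplified, self-contained sketch. The Query record below is a stand-in for the Dataverse QueryExpression: repositories that override FindAll get real behavior, while those that do not keep the fail-fast base.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Simplified stand-in for the Dataverse QueryExpression; illustration only.
public record Query(string EntityName);

public abstract class XrmRepository<T>
{
    // Base behavior: un-overridden queries fail fast instead of silently
    // returning nothing.
    public virtual IEnumerable<T> FindAll(Query qe) =>
        throw new NotImplementedException();
}

// A repository that overrides FindAll supplies real query behavior.
public class AccountRepository : XrmRepository<string>
{
    public override IEnumerable<string> FindAll(Query qe) =>
        qe.EntityName == "account"
            ? new[] { "Contoso", "Fabrikam" }
            : Enumerable.Empty<string>();
}

// A repository that has not (yet) overridden FindAll keeps the fail-fast base.
public class ContactRepository : XrmRepository<string> { }
```

The fail-fast base makes missing query support an explicit runtime error during testing, rather than an empty result set that could be mistaken for "no matching rows".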

Issues and Considerations
