We are all well aware of how fast technology evolves.
New tools and services keep emerging daily, offering us more content than we can possibly consume.
This “trend” is, in fact, not a trend at all but simply the way things evolve. As long as we stay focused on what we are trying to achieve, we can greatly benefit from those “revolutions” we keep running into.
Nonetheless, most of us (including yours truly) never completely abandon the old “Legacy Approach” that has, for years now, set the tone and laid the groundwork for many of the approaches we use today.
The only real question is how we can leverage progress and combine newer, more advanced approaches with legacy ones to simplify our organizational tasks, speed up processes, and become more efficient at our jobs.
The Scenario
For instance, imagine your company stores a huge number of part numbers in a warehouse, and your inventory controller has to run a weekly MRP Recommendation Report using the SAP Business One MRP Module.
I am not going to cover MRP concepts in this blog (maybe it is worthwhile that I do so sometime soon …), but those of you who actually use the MRP Module already know that getting a reliable MRP Recommendation Report requires regularly maintained MRP Forecasts in SAP Business One – especially when sales forecasts keep changing all the time.
Potentially, your Inventory Controller would need to go back to the MRP Forecast within SAP Business One and modify the forecasted values for each of your SKUs/Storage Locations.
Using some of the Expert Tools SAP Business One has to offer might help you get there a bit faster. Still, those tools require a deep understanding of the SAP Business One table schemas and can cause severe damage if mismanaged.
And to make things a little more interesting, I will add another complexity tier to the business scenario – let us assume that the forecasted values are NOT generated in SAP Business One but are extracted to a “Flat File” (*.CSV) from other SAP or non-SAP systems, such as SAP ECC, SAP S/4HANA, or perhaps a web interface you are using. Say your inventory controller just wishes to drop that output file into his local file system, knowing that an automated background process will push those values into the right place in SAP Business One – making sure his MRP Recommendation Report takes the latest changes to the forecasted sales values into consideration while he goes on with his day completing other tasks.
Sounds tricky? Well, it is not a “piece of cake”, but it is definitely doable with the tools Microsoft Azure and the amazing Power Platform have to offer us.
How This Could Be Achieved
The first thing we need to understand is that our local file system (On-Premises Data Source) needs to be able to “talk” to Azure and the latter needs to be able to “listen” to our local folder structure.
This could be achieved by installing and configuring the “On-Premises Data Gateway” on your local machine.
The on-premises data gateway acts as a “bridge”.
It provides quick and secure data transfer between on-premises data, which is data that isn’t in the cloud, and several Microsoft Cloud Services on Azure, for which Microsoft has pre-built connectors.
The following diagram illustrates how the On-Premises Data Gateway works and allows you to see the underlying architecture behind it:
So, the first step is to download and install the On-Premises Data Gateway. Please be aware that Microsoft releases a new update for data gateways every month.
Microsoft provides fairly neat documentation on how to install and configure the Gateway, so I won’t get into a step-by-step specification here. But we will cover the basics, along with some of my own best practices that I developed while working on my solutions.
You can find all the documentation here:
- What is an On-Premises Data Gateway?
- On-Premises Data Gateway Architecture
- Install an On-Premises Data Gateway
Some Best Practices while installing the Gateway Instance on your local machine:
1. Use a Recovery Key – you never know when the restore option might come in handy, and instead of recreating the instance, you can retrieve it quite easily.
2. Make sure to create a Gateway Instance on Azure as well.
3. If you set up the Gateway correctly, you should be able to see that the resource is available for you:
4. When the On-Premises Data Gateway finishes installing, it will have its own default service name (account):
We will want to change that account to point to a local user on the machine where the Gateway was installed, preferably one with administrative privileges.
Important note – don’t use the built-in local Administrator account. It should be disabled and replaced with an alternative account that is a member of the local admins group on the server; the local Administrator account is the first one hackers go after in any data breach.
Once you hit “Change Account”, you will be presented with a dialog that suggests you restart the Gateway, which will reset its credentials:
You now have two options:
- The first is to open the Command Prompt and type “WHOAMI” to get the local user on the machine (see the example after this list)
- The second (and safer) way is to navigate to the System Properties and grab the “Device name”, which is basically the computer’s name and serves as the “local domain name”:
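For example, running the command could return something like this (the machine and user names below are just placeholders):

```
C:\Users\InventoryController>whoami
desktop-b1gw01\inventorycontroller
```

The part before the backslash is the device (computer) name and the part after it is the local user – together they form the DeviceName\LocalUser account you would typically enter as the Gateway’s service account.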
If everything went well, you should be able to sign in to the new service name and see the changes applied:
5. Please note that the machine must stay connected to the Internet at all times if you intend to use it as a data source that constantly “talks” to Azure and exchanges information.
If there is any possibility that this Gateway instance becomes unavailable at some point, you might want to consider adding it to a “cluster” when high availability is required.
6. The user name you use to install the On-Premises Data Gateway must be a local machine user – meaning you cannot use cloud users (like Azure AD or M365 accounts) to trigger the flow, even if that user is the one you use to log in to your machine. I would recommend setting up a dedicated local user with administrative privileges.
Now that we have set up the connection between our local file system and Azure, we can move on to the next step – creating the required logic to enable the MRP forecast values to be updated in SAP Business One.
In this case, the best approach would be to create a Logic App that “listens” to your local mapped folder (we will be covering this soon) so that whenever a new file is dropped into that folder, our flow is triggered.
Reacting to events instead of polling the folder periodically means fewer executions, which leads to lower consumption-based fees (especially if you are using “Consumption” Logic Apps rather than “Standard” ones). Azure Logic Apps offer an entire set of connectors to handle our local file system.
For our scenario, we would need to start with two:
- The “trigger” – the connector that “listens” to our local folder: “When a file is created (properties only)”.
- The “action” – the connector that fetches the actual contents of our file (comma-separated content): “Get file content”.
Before we can use them, a connection to our On-Premises Data Gateway must be established:
Once we authenticate the connection, we can start using both the trigger and the action.
The trigger, as the name suggests, will be fired only when a new file is dropped into the local folder we mapped – this file will contain the forecast values that we would like to update in SAP Business One.
The file structure, capturing the forecast values, could potentially look like the following:
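For example (purely as an illustration – the column names below are placeholders, and the exact layout depends on the system exporting the forecast), the file could carry the item code, the storage location, the week number, and the forecasted quantity:

```
ItemCode,Warehouse,WeekNumber,ForecastQty
A00001,01,15,250
A00001,02,15,120
A00002,01,16,75
```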
For productive solutions, I would also suggest defining a naming convention for this file and adding logic that checks whether the naming convention is being adhered to. That will prevent the flow from being triggered each time someone drops “some file” into this folder. I will cover this logic later in this blog series.
Assuming we successfully signed into our Gateway and enabled it, we should be able to drop a CSV file into the folder and check whether our Logic App recognizes the new event.
If you set it up correctly, you should see the following under “Trigger History”:
And here is a glimpse of the Logic App run:
As you can see, our CSV file content is now fetched by the “Get_file_content” action.
If, for some reason, the new file creation is not being recognized by your Logic App, check the “API Connections” section of your Logic App to make sure you mapped the file path correctly and provided all the credentials needed to log in to your local file system via the Gateway:
Recap of Integrating Your Local File Systems with SAP Business One and Microsoft Azure – Part 1
- We covered some basic concepts of “Legacy” approaches and the need to integrate local file systems with our solutions to read/write data from/to our SAP Business One system.
- We suggested a business scenario in which a large number of items need to have their forecast values updated regularly in SAP Business One with minimal effort, while the source data could come from another SAP or non-SAP third-party system.
- We also covered the basics of how to install and configure the On-Premises Data Gateway to establish connectivity between any local machine and Azure Logic Apps.
- We tested the “file creation” event to see that our Logic App could identify it and fetch the file content.
What to expect in Integrating Your Local File Systems with SAP Business One and Microsoft Azure – Part 2
- Adding validation on the file name to filter out unwanted files that are dropped into our folder and would otherwise trigger our Logic App
- Converting the file content to a readable JSON body we can work with
- Converting week numbers to dates that represent the first work day of the week in which a forecast can be maintained for an item (see the short sketch after this list). This is because the SAP Business One Service Layer only accepts dates when invoking the Weekly Forecast update, as opposed to the B1 UI.
- Generating the payload for the forecast and incorporating the forecast generation within our solution
- Covering some Error-Handling concepts
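As a quick illustration of the week-number-to-date conversion mentioned above, here is a minimal Python sketch of the idea only – it assumes the first work day of the week is the ISO Monday, and the actual implementation in Part 2 may look different:

```python
from datetime import date

def first_workday_of_week(year: int, week: int) -> date:
    # ISO weekday 1 = Monday, assumed here to be the first work day of the week
    return date.fromisocalendar(year, week, 1)

# Example: ISO week 15 of 2023 starts on Monday, 2023-04-10
print(first_workday_of_week(2023, 15).isoformat())
```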
Watch the SAP Business One Community blog page for Part 2!