Now that we have established a connection between our local file system and Azure and retrieved the flat-file content with the forecast values, let us take this a step further and complete the entire solution.

We concluded Part One knowing that we had to add some additional steps: 

  • Adding validation to the file name to filter out unwanted files dropped into the folder that triggers our Logic App
  • Converting the file content to a readable JSON body we can work with
  • Converting week numbers to dates that represent the first working day of each week for which a forecast can be maintained for an item (unlike the B1 UI, the SAP Business One Service Layer only accepts dates when invoking the Weekly Forecast update)
  • Generating the payload for the forecast and incorporating the forecast generation within our solution
  • Covering some error-handling concepts

 

Let’s address each of those steps and finalize our solution. 

 

Adding validation to the file name

We first want to add a validation step that checks the name of the file dropped into our local folder and enforces a naming convention.

This will save us unnecessary runs for file names that do not follow the convention the company has decided to adhere to.

Say that we want our file name to at least contain the string “Forecast”. We would add a Condition (control) action that verifies it:
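The condition boils down to a single expression. As a minimal sketch, assuming the trigger exposes the file name under a “Name” property (check your trigger’s dynamic content for the exact token):

contains(triggerBody()?['Name'], 'Forecast')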

 

 

If the check returns ‘True’, our flow kicks in; otherwise, we cancel the run.

 

Converting the file content to a readable JSON

The approach here would be to create a JSON array that will hold both the CSV headers and items. 

Our first step will be to add a “Compose” action as follows: 

Important note regarding line endings: I used the decodeUriComponent function to split the CSV.

decodeUriComponent('%0A')

This represents a line feed character (LF), often displayed as \n. This is the Unix standard.

 

CSV files generated on Windows may use this format, but often use a carriage return plus line feed (CR+LF) instead. This is represented as \r\n.

The split expression above will still work with CR+LF, but you will be left with \r characters in your data. The correct expression to split on a CR+LF is: 

decodeUriComponent('%0D%0A')

 

If you load your CSV file into Notepad, you can easily see your file’s format in the bottom right-hand corner. It will show either “Unix (LF)” or “Windows (CR LF)”. 

The “split” function takes a string as its first argument and returns an array. We use it often in this solution, as it is quite powerful.

The unique structure of our CSV dictates that we separate the file headers and items into two arrays. 

The reason is that each item code has 12 forecast periods, and we need to maintain a different quantity for each period. 

Notice how I used the “first” function in my expression to fetch only the first array element created by the “split” function.

To get only the data rows, I used the “skip” function in my “Compose” action to ignore the first element (= the headers), leaving me with only the lines.
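Putting the two steps together, the two “Compose” actions could look roughly like this (a sketch: “ComposeHeaders” and “ComposeRows” are hypothetical action names, and body('Get_file_content') stands in for however your flow references the file content retrieved in Part One):

ComposeHeaders: first(split(body('Get_file_content'), decodeUriComponent('%0D%0A')))
ComposeRows:    skip(split(body('Get_file_content'), decodeUriComponent('%0D%0A')), 1)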

If we run our flow, we will see that the last line returns an empty string:

To clean the output, we would add a “Filter Array” action and use the “length” function to select only the items whose length is greater than zero (0):
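In the advanced (code view) mode of the “Filter Array” action, the condition can be sketched as:

@greater(length(item()), 0)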

Our next action is a “Select”, where the source is the “FilterRows” array.

Bear in mind that the “Select” action only accepts arrays as input and iterates them via an internal loop to map each element into a new shape:

The “Select” action above is where the real magic occurs. We need to rebuild the payload in such a way that for each item code, we get the forecast quantity for each week number, where we have 12 weeks in total in our scenario. 

This is precisely why I decoupled the header from the items – we need to count the position of each period in our header array, associate the corresponding quantity to it from the items array and, along the way, deal with some “data cleaning/conversions”:

  • Our week numbers have an undesired character (‘W’) we need to get rid of
  • Our week numbers cannot be used as they appear in the file, since for a weekly forecast SAP Business One expects the first working day of each week listed in our flat file.

 

Look at our “FilterRows” action output: 

 

We need to identify the position of each element, but not before we turn each row the “Select” action iterates into an array by splitting it on the comma.

The “Item Code” is the first one, so the position is 0. It will then look like the following: 

trim(split(item(), ',')[0])

Same for the quantities, which sit at positions 2 through 13:

split(item(), ',')[n]     (with n running from 2 to 13, one index per quantity column)
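To make the pattern concrete, the full “Select” map could look roughly like the sketch below. The key names (“ItemCode”, “Period1”…“Period12”, “Quantity1”…“Quantity12”) and the “ComposeHeaders” action name are illustrative assumptions; the period values come from the headers array, the quantities from the current row:

{
  "ItemCode":   trim(split(item(), ',')[0]),
  "Period1":    trim(split(outputs('ComposeHeaders'), ',')[2]),
  "Quantity1":  trim(split(item(), ',')[2]),
  ...
  "Period12":   trim(split(outputs('ComposeHeaders'), ',')[13]),
  "Quantity12": trim(split(item(), ',')[13])
}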

 

The output of our “Select” statement would look like this: 

 

We managed to create a meaningful JSON array we can work with successfully! 

 

If you read the SAP Business One Service Layer API Reference (go to https://<hostname>:50000, where <hostname> is the name of the server on which you installed your SAP Business One Server Components), you will find that to patch an MRP Weekly Forecast object successfully, you need to include the following payload in the request body:
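For orientation only (the exact property names must be taken from your own API Reference, as they can vary by Service Layer version), the call itself is a standard Service Layer PATCH against the Forecasts entity, sent within an authenticated session:

POST https://<hostname>:50000/b1s/v1/Login
{ "CompanyDB": "<company>", "UserName": "<user>", "Password": "<password>" }

PATCH https://<hostname>:50000/b1s/v1/Forecasts(<key>)
{ ...the forecast payload from the API Reference... }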

 

 

This is where we need to add more magic to our solution to make this happen. 

Our file contains 12 periods that represent the week numbers, so we need to loop through each of our items, but we also need a second, nested loop that runs 12 times to assign the forecast value for each period.

For that, let us create a global variable called varIDStep that will serve as a counter in our scenario and initialize it to 1:
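In the Logic App’s code view, the “Initialize variable” action would look roughly like this sketch:

{
  "type": "InitializeVariable",
  "inputs": {
    "variables": [
      { "name": "varIDStep", "type": "integer", "value": 1 }
    ]
  }
}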

Now, let’s add two nested loops – a “Do-Until” loop and a “For each” loop: 

 

This means that we would need to iterate 12 times for each item code to assign the values. 

Each time the inner loop (“For each”) finishes execution (remember, it iterates through each of our item codes!), our variable is incremented using the “Increment variable” action:

When the counter value reaches 13, the outer loop (“Do-Until”) exits as well.
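In code view, the “Do-Until” exit condition is a single expression matching the counter logic above:

@equals(variables('varIDStep'), 13)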

To further optimize our solution and avoid unnecessary executions, I would recommend adding a condition that skips iterations where the Quantity is null:

Note how I used the “concat” function to string together “Quantity” and the current value of our counter variable, so it corresponds to our “Select” definition above:

items('For_each')?[concat('Quantity', variables('varIDStep'))]

 

This will ensure that I always reference the current value of either “Quantity” or “Period”. 
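Combining the two, the skip condition can be sketched in advanced mode as:

@not(equals(items('For_each')?[concat('Quantity', variables('varIDStep'))], null))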

Using this methodology, the next action converts the week representation that was written to our file into a date SAP Business One can use to post the forecast values (remember, it is the first working day of the week).

I will not get too deep into how this conversion is done, but you are more than welcome to reach out to me if you wish to know more about this added logic.

I will briefly mention how to use “Relative Paths” to invoke another sub-workflow and pass a parameter on to that flow.

Note how I defined the relative path on the connector:

/Period/{Period} 

The parameter will be passed via an HTTP call from our main workflow in the following way:

Again, we use the “concat” function to reference the current period inside the loop that we wish to convert to a date.
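Mirroring the quantity expression above, and assuming the “Select” keys are named “Period1” through “Period12”, the value passed into the relative path would be:

items('For_each')?[concat('Period', variables('varIDStep'))]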

Once the date is converted, we can go ahead, prepare the output, and trigger the change:

The final result is that the forecast values are pushed into the SAP Business One client:

We could (mainly for a productive solution) add some user-interaction steps that notify the submitter of a success/failure at the end of the run and make our solution a bit more “user friendly”.

One way to do it is to use Array variables that get populated from within the loop at runtime whenever we make a call to the Service Layer:

We can then check the status code of the response we get and decide which array to populate:
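As a sketch (“Call_Service_Layer”, “varSuccess”, and “varErrors” are hypothetical names; the Service Layer typically answers a successful PATCH with 204 No Content):

Condition: @equals(outputs('Call_Service_Layer')['statusCode'], 204)
  True:  “Append to array variable” varSuccess, value: items('For_each')?['ItemCode']
  False: “Append to array variable” varErrors, value: items('For_each')?['ItemCode']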

 

We could then use the “Send an email (V2)” connector in combination with the “length” function, placed outside the outer loop, to measure the length of each array (which equals the number of successful/failed records we got):
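The two expressions inside the email body would then simply be (using the hypothetical variable names above):

length(variables('varSuccess'))
length(variables('varErrors'))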

 

Summary 

Now that we have concluded building our solution, we can build further extension scenarios that combine these “legacy” approaches of working with local file systems with newer ones, helping us build a winning integration with our SAP Business One systems.

You can use the template we created to respond to similar cases where a user triggers an event that is leveraged further and embedded within our ERP with little to no effort.

This setup enables you to develop decoupled business extensions using Logic Apps, with all the tools and services provided by the Microsoft Azure platform, in whichever language suits you or your use case best.

If you’re interested in more use cases that could benefit from such a decoupled setup, please reach out to me, and I will be happy to hear about your ideas as well.

 

Join us in the SAP Business One Community to add your thoughts, comments and ideas!