Note: This solution has been released as a Beta and is being trialled by select organisations. Please contact us if you wish to be part of the pilot.
This guide will take you through the process of using the iAuditor connector available in Microsoft Flow to automatically transfer key pieces of audit data into SQL Server tables. It assumes you have a working knowledge of SQL Server and an existing table structure you wish to populate.
Note: This approach may also work for other database technologies supported by Microsoft Flow that we have not yet tested, such as Oracle, DB2, MySQL, PostgreSQL, and Teradata.
Begin by adding the Schedule – Recurrence trigger in a new blank flow. We’re going to configure the flow to check iAuditor for audits to process every hour.
Go ahead and set up the trigger so that the Interval is 1 and the Frequency is Hour.
The Get past time action is also needed, so the flow knows how far back to look for modified audits. Add it to your flow and configure it to match the settings for the trigger.
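To make the pairing concrete: with an Interval of 1 and a Time unit of Hour, Get past time supplies a timestamp one hour in the past each time the trigger fires. A rough Python sketch of the value it produces (the function name and exact timestamp format are illustrative assumptions, not part of the Flow action):

```python
from datetime import datetime, timedelta, timezone

def get_past_time(interval: int = 1) -> str:
    """Illustrative stand-in for Flow's 'Get past time' action:
    returns the UTC time `interval` hours ago as an ISO 8601 string."""
    past = datetime.now(timezone.utc) - timedelta(hours=interval)
    return past.strftime("%Y-%m-%dT%H:%M:%SZ")
```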
The next step is to add the Search modified audits action listed under iAuditor and set the Modified after field to use Past time from the available dynamic content.
At this point, your flow is configured to perform an hourly check for audits.
Add the Get a specific audit action as it will allow us to access all the data for each audit that the flow processes. Set the Audit ID from the available dynamic content and the action will immediately become nested in an Apply to each loop.
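Conceptually, the iAuditor steps so far behave like the loop below. The endpoint paths and response field names are assumptions based on the public iAuditor API, and `http_get` is a placeholder for whatever HTTP client you use; the Flow connector handles all of this for you:

```python
BASE_URL = "https://api.safetyculture.io"  # assumed iAuditor API host

def process_recent_audits(http_get, past_time: str) -> list:
    """Search for audits modified after `past_time`, then fetch each in full.
    `http_get` is any callable that takes a URL and returns parsed JSON."""
    search = http_get(f"{BASE_URL}/audits/search?modified_after={past_time}")
    audits = []
    # This mirrors the 'Apply to each' loop that Flow nests the action in.
    for stub in search.get("audits", []):
        audits.append(http_get(f"{BASE_URL}/audits/{stub['audit_id']}"))
    return audits
```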
In our example scenario, we’re going to populate a table with the responses to key questions within our template. Each column will represent one of the questions and then each audit will result in a new row.
Our example template looks like this:
Go ahead and add a Data Operations – Compose action and click inside the Inputs field. You’ll see a flyout appear on the right. Click the Expression tab.
Enter this expression: body('Get_a_specific_audit')?['items']
Rename the action to Items.
Add a Data Operations – Filter Array action and set the From to Output which will be available under the heading Items in the available Dynamic content.
In the left-hand Choose a value input, enter this expression: item()?['item_id']
In the right-hand Choose a value input, enter the unique identifier for the item you wish to select.
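In Python terms, the Items compose plus the Filter Array action amount to the following (the item ID used in the test is a made-up example; yours will differ):

```python
def filter_items(audit: dict, item_id: str) -> list:
    """Equivalent of the Items compose (body(...)?['items']) followed by
    Filter Array on item_id: keep only the items with a matching ID."""
    items = audit.get("items", [])
    return [item for item in items if item.get("item_id") == item_id]
```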
Note: The unique identifiers for your template’s items will be entirely different to what is used in this example. Follow this guide to find out the unique identifiers for the items you wish to access.
Rename the action so it can be easily distinguished, as we will replicate these steps for each item whose data you want to access.
Add a Data Operations – Compose action. We’ll use it to access the selected response of the item. The item shown above is of type 'question', so the expression to access the selected response is:
Note: Other item types will require slight variations to this expression.
Now, whenever the response is needed, we can simply select it from the Dynamic content flyout.
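As a rough guide to what that Compose expression does, here is a Python equivalent. The JSON shape assumed here (responses, then selected, then label) is an assumption about how iAuditor represents a 'question' item; check your own audit JSON before relying on it:

```python
def selected_response(filtered: list):
    """Pull the selected answer's label from the first filtered item.
    Assumed structure: item['responses']['selected'][0]['label'].
    Returns None when the question was left unanswered."""
    if not filtered:
        return None
    selected = filtered[0].get("responses", {}).get("selected", [])
    return selected[0].get("label") if selected else None
```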
Repeat these steps for each of your target items, adjusting the item ID within the Filter Array action and the expression for the Compose action.
The flow in this example looks like this after adding the remaining items:
Finally, add the Insert a row action, which is available for SQL Server. Select your table, and input fields should appear for each of the available columns.
Work through each input and select the relevant response from the available Dynamic content:
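The Insert a row action effectively runs a parameterized INSERT for you. If you ever need to reproduce this step outside Flow, a minimal sketch might look like the following (the table and column names in the test are invented for this example):

```python
def build_insert(table: str, row: dict):
    """Build a parameterized INSERT statement, e.g. for use with pyodbc.
    Column order follows the dict's insertion order."""
    columns = ", ".join(row)
    placeholders = ", ".join("?" for _ in row)
    sql = f"INSERT INTO {table} ({columns}) VALUES ({placeholders})"
    return sql, tuple(row.values())
```

With a library such as pyodbc you would then pass the statement and parameters to `cursor.execute(sql, params)`; within Flow, the Insert a row action handles this wiring for you.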
You can now save the flow and turn it on to continually populate your table with item responses. Here’s the final result from this example:
Warning: Limits and Performance
This guide walks you through configuring a flow that will insert a single SQL row per audit. Heavy usage where many thousands of audits are created daily can result in performance bottlenecks and delayed row insertion.
The number of rows inserted per second depends largely on several factors, including (but not limited to) the number of loop iterations and HTTP requests that the Microsoft Flow platform has to handle.
Microsoft Flow provides some information on the platform’s limits here.