Dynamics 365 for Finance and Operations Inventory Cost Model, Part 4: Landed Cost

This is the fourth instalment in my series of articles explaining the inventory cost model in Dynamics 365 for Finance and Operations (D365FO).

In the previous posts, we looked at the core concepts underpinning the inventory cost model in D365FO and how some of the key areas should be configured. In this post, we take a closer look at how to work with landed cost.


I am not sure if there is a formal definition of what landed cost is, but for the purpose of this article landed cost is defined as:

Total cost of an inbound shipment of goods including product cost, freight, insurance and duties.

The concept of charges

In D365FO, landed costs are recorded and managed through so-called Charges. A charge can be any type of landed cost and can be set up for products and suppliers. A charge is a relatively simple concept based on the following business logic:

  • A Charge can be applied to a purchase order header or lines manually or automatically.
  • A charge cannot be split across multiple purchase orders.
  • A purchase order header charge can be allocated across the purchase order lines (manually).
  • Purchase order line charges can be included in the product cost and taken into the inventory.
  • A charge can only be applied to the supplier on the purchase order – not a 3PL supplier.
  • Charges on a purchase order are perceived to be estimated – realised charges are connected to the invoice.

Configuring charges

Before we can use Charges in D365FO, they need to be configured. Since we are dealing with inbound charges, the configuration takes place in the Procurement and sourcing module under Setup / Charges.

Firstly, we must configure the necessary charge codes as shown in the following example.

Inventory 1.PNG

The key decision to make here is how to post the charge to the general ledger. In this case, FREIGHT, the cost (debit) is posted to a nominal ledger account (600120) and the debt (credit) is included in the amount owed to the supplier and therefore in the accounts payable account set up for the supplier.

In the next example, FEE, the debit configuration is pointing to Item. This means that the charge will go to the inventory account for the item on the purchase order line and be included in the inventory value.

Inventory 2.PNG

Obviously, charge codes with debit posting set to Item are only relevant for purchase order line charges.

Once the charge codes have been defined, they can be used manually on a purchase order. However, most companies would like to have default charges applied automatically. This is set up as Automatic charges.

Firstly, I have set up a handling fee of $5.00 for vendor 1001 as shown in the following example. This is set up as a Header charge.

Inventory 3.PNG

This means that all purchase orders for vendor 1001 will have a handling charge applied to the purchase order header automatically.

Next, I have set up two line item charges for freight and fee respectively. The freight charge is multiplied by the quantity on the purchase order line. The fee charge is a fixed amount for each purchase order line.

Inventory 4.PNG

The charges are automatically applied to all purchase order lines for vendor 1001, item 1000.
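Put together, the estimated charges for a purchase order line follow a simple formula: per-unit charges scale with quantity while fixed charges do not. Here is a small Python sketch of that arithmetic (the $0.25 freight rate and $0.50 fee are illustrative figures, not taken from the setup above):

```python
def estimate_line_charges(qty, freight_per_unit, fixed_fee):
    """Estimated charges for one purchase order line: freight scales
    with the ordered quantity, the fee is a fixed amount per line."""
    return qty * freight_per_unit + fixed_fee

# Illustrative rates: $0.25 freight per unit plus a $0.50 fee per line.
print(estimate_line_charges(8, 0.25, 0.50))   # → 2.5
print(estimate_line_charges(16, 0.25, 0.50))  # → 4.5 (freight doubles, the fee does not)
```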

I could also have set up automatic charges for a group of vendors or a group of items. These groups are maintained in special charges groups.

Use of charges in the purchasing process

When I create a new purchase order for vendor 1001, the handling charge is automatically applied as shown in the following screenshot.

Inventory 5.PNG

If required, I can add, change or delete charges in the purchase order header. Charges in the purchase order header are maintained under Maintain charges.

The purchase order header charge is a general charge that will not be included in the inventory value. The charge can instead be allocated to the purchase order lines by using the Allocate charges menu item as shown below.

Inventory 6.PNG
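To see what the allocation does, here is a hedged Python sketch of a value-based allocation (the Allocate charges function offers several allocation methods; the line amounts below are illustrative):

```python
from decimal import Decimal, ROUND_HALF_UP

def allocate_by_value(charge, line_amounts):
    """Prorate a header charge across purchase order lines by line
    value; the last line absorbs any rounding remainder so the
    allocations always sum to the original charge."""
    charge = Decimal(str(charge))
    amounts = [Decimal(str(a)) for a in line_amounts]
    total = sum(amounts)
    cent = Decimal("0.01")
    allocations = [
        (charge * a / total).quantize(cent, rounding=ROUND_HALF_UP)
        for a in amounts[:-1]
    ]
    allocations.append(charge - sum(allocations))  # remainder to the last line
    return allocations

# A $5.00 header charge spread over two lines of $1,798.00 and $202.00:
print(allocate_by_value("5.00", ["1798.00", "202.00"]))  # → [Decimal('4.50'), Decimal('0.50')]
```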

Now, when I create a purchase order line for item 1000, the freight and fee charges are automatically applied as shown here.

Inventory 7.PNG

Since the fee charge is fixed, it does not change with the order quantity whereas the freight charge does.

Once the purchase order has been confirmed to the supplier, changes to charges cannot be applied until the invoicing stage.

Invoice matching

At the invoicing stage, the user cannot remove the estimated charges on the purchase order. The purchase order charges are automatically connected to the invoice charges, but the user can remove the connection and apply new, corrected charges. This way, the user can match the invoice but at the same time allow comparison between the estimated and the actual charges for that purchase order.

Charges postings

The following screenshot shows the inventory transaction details for the purchase order line after the supplier invoice has been applied.

Inventory 8.PNG

As you can see, the purchase order line amount of $1,798.00 has been increased to $1,798.50 because of the $0.50 fixed fee charge. This charge has been included in the inventory value because the charge code was set up to post the debit side to the item.

If we look at the financial voucher for the purchase order as a whole, we can see that the $2.00 freight charge has been taken to the “Freight/transportation in” account and the $5.00 handling charge has gone to “Other miscellaneous expenses” account. Lastly, it is worth noting that the accumulated charges have been added to the accounts payable account as well.
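As a quick sanity check, the figures in this example reconcile as follows (Python, ignoring tax):

```python
line_net = 1798.00   # purchase order line amount
fee      = 0.50      # line charge, capitalised into inventory (debit = Item)
freight  = 2.00      # line charge, posted to Freight/transportation in
handling = 5.00      # header charge, posted to Other miscellaneous expenses

inventory_value  = line_net + fee                       # only the Item-debit charge is capitalised
accounts_payable = line_net + fee + freight + handling  # everything is owed to the supplier

print(inventory_value)   # → 1798.5
print(accounts_payable)  # → 1805.5
```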


The above example pretty much sums up what can be achieved with charges in D365FO. If your requirements involve distributing freight charges across multiple purchase orders, the Transportation management module may be applicable, but it is beyond the scope of this article.

As mentioned, the system keeps both the estimated and the realised charges, but I have yet to find a report that shows a comparison or any statistics.



Using Derived Test Cases in RSAT

To provide the best possible test coverage and to test breaking scenarios, we need to test the same process with multiple variations on data, including data that will deliberately break the test case.

Luckily, in RSAT we do not need to record a new test case every time we would like to add a new test data scenario.

Instead, we can leverage the derived test case feature.

A derived test case in RSAT is essentially a copy of an existing test case with a separate parameters file holding the test data variation.
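Conceptually, a derived test case is just "same steps, different data". A loose Python analogy (this illustrates the idea only; it is not RSAT's actual parameter file format):

```python
base_params = {"Site": "1", "Warehouse": "11", "Quantity": 10}

def derive(params, **overrides):
    """A derived case reuses the recorded steps and only swaps the
    parameter values that should differ in the new scenario."""
    return {**params, **overrides}

# A negative scenario: an invalid warehouse for the site.
negative_case = derive(base_params, Warehouse="33")
print(negative_case)  # → {'Site': '1', 'Warehouse': '33', 'Quantity': 10}
```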

The following screenshot shows a test plan with three test cases loaded into RSAT. As you can see, test case #4027 has the Derived flag ticked since it is a derived test case from test case #4018.

Test plan with a derived test case in RSAT

Creating a Derived Test Case

To create a new derived test case, you select the original test case and click on New / Create derived test case in the menu at the top of the screen.

Test plan with newly added derived test case

As the above screenshot shows, a new test case (#4028) has been created from the original test case (#4018).

Clicking on Parameters in the menu opens the Excel file holding the test case parameters as shown below.

Derived test case parameters (before change)

The warehouse used in the original test is “11”. Instead, we would like to test how the system reacts if we select an illegal value (“33”) in the warehouse field, as shown in the new scenario below.

Derived test case parameters (after change)

In this scenario we are providing the system with an invalid warehouse (“33”) for site (“1”) and therefore the test case should fail.

The derived test case can now be executed by clicking on Run in the menu.

Test case failed – as expected

As you can see in the above screenshot, the test did indeed fail.

To investigate further, click on Open / Open recent test runs in the menu. This brings up the details of the test case run as shown below.

Test case results are captured in Azure DevOps

As you can see, the error message clearly indicates that the test case attempted to use an illegal value in the warehouse field.

When you close the RSAT tool, it will warn you that not all data has been saved to Azure DevOps. This is because a derived test case is created locally in RSAT, not in Azure DevOps, so you need to upload the artefacts to Azure DevOps.

Remember to upload new artefacts to Azure DevOps


Derived test cases are an efficient and powerful way to expand the test coverage for a given process across multiple data scenarios, and they help ensure that the system can cope with intended as well as unintended use cases.

Passing Legal Entity Context to an Embedded Power Apps App in Dynamics 365 Finance and Operations

With Platform Update 43 it is possible to pass the current legal entity context from Dynamics 365 Finance and Operations (D365F&O) to an embedded Power Apps app.

In this blog post I will look at an example where I display fixed assets for a given legal entity. The legal entity is passed from D365F&O when I open the app.

To capture the legal entity passed from D365F&O, I add the following code to the OnStart app event:

Code in OnStart event

This code captures the “cmp” parameter passed from D365F&O and puts it into a variable named “FinOpsLegalEntity”; in Power Fx terms, something like Set(FinOpsLegalEntity, Param("cmp")).

I have added two data sources to the app:

App data sources

The FixedAssetsV2 data source points to the Fixed Assets data entity in my instance of Dynamics 365 Finance and Operations (D365F&O) and the CachedAssets is a collection to hold the assets I fetch from D365F&O.

To fetch and cache the fixed assets in the app, I also add the following code to the OnStart event:

ClearCollect(CachedAssets,Filter(FixedAssetsV2,Company = FinOpsLegalEntity))

This code fetches all fixed assets from D365F&O where the Company field value is equal to the legal entity value that was passed from D365F&O when the app was started.
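Outside Power Apps, a similar query can be expressed directly against the OData endpoint of the data entity. A hedged Python sketch (the host name is a placeholder, authentication is omitted, and I am assuming the entity filters on dataAreaId for the company):

```python
from urllib.parse import quote

def fixed_assets_url(base_url, legal_entity):
    """Build an OData query for the FixedAssetsV2 entity filtered by
    legal entity; add an OAuth bearer token before actually calling it."""
    flt = quote(f"dataAreaId eq '{legal_entity}'")
    return f"{base_url}/data/FixedAssetsV2?$filter={flt}&cross-company=true"

print(fixed_assets_url("https://contoso.operations.dynamics.com", "usmf"))
```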

To display the fixed assets I have fetched and cached from D365F&O, I add a gallery (Gallery1) to my app as shown below.

Gallery added to app

As you can see, the Items property contains code to populate the gallery, along the lines of:

SortByColumns(CachedAssets, "FixedAssetNumber")

This code uses the CachedAssets collection to populate the gallery sorted by the FixedAssetNumber column.

Lastly, I add a text field to display the legal entity passed from D365F&O as shown in the below screenshot.

Text field with legal entity value

As you can see, the Text property of the field contains the FinOpsLegalEntity variable to display the content of the variable.

I save the app and copy the app ID GUID. Now we are ready to embed the app into D365F&O.

Personalise dashboard

In this example I would like to embed the app as a tile on the dashboard, so I open the personalisation dialogue and select Add a page from the dropdown menu.

Add a page dialogue

Next I choose Power Apps from the list.

Add Power Apps app dialogue

Lastly, I give the tile a name and paste the app ID GUID into the App ID field.

Asset list tile on the dashboard

As you can see, the dashboard now contains a new tile named Asset list.

When I click on the tile, the app starts and as you can see, the legal entity context (usmf) has been passed to the app.

Asset list filtered by legal entity

This was a very simple demonstration showing how easy it is to pass the legal entity context to a Power Apps app and use it to filter data.

Extend Dynamics 365 Finance and Operations Workflows with Business Events

Back when Business Events were introduced in Dynamics 365 Finance and Operations (D365F&O), a slightly overlooked use case was the ability to extend inbuilt workflows to external endpoints such as Power Automate.

In this blog post I explore how to use a Business Event to execute the approval process in Power Automate for a general ledger journal.

Designing the Inbuilt Workflow

To kick things off, I create a simple general ledger journal approval workflow using the inbuilt designer as shown below.

General ledger journal approval workflow

As you can see, the workflow is quite simple in this example and only contains a single approval activity. Once I save the workflow, Business Events representing the workflow elements are automatically created in the catalogue:

Workflow elements in the Business Events catalogue

Connecting the Business Event to Power Automate

Since we are going to make the actual approval happen in Power Automate, the Business Event we are interested in is in the category called “Workflow workitem” as this is the category used for the actual work items in the workflow.

To connect the business event to Power Automate, we need to set up an endpoint in the Business Event catalogue. Before you can set up an endpoint, you need to create the Power Automate flow that the endpoint will point to.

As shown below, I create a new flow with the “When a Business Event occurs” trigger on the D365F&O connector.

Power Automate flow trigger

Once the flow has been saved, you can use the flow URL to set up the Business Event endpoint.

When the endpoint has been created, you can activate the business event.

Power Automate Approval

Now that the Business Event is linked to the Power Automate flow, we need to design the actual flow to replace the inbuilt approval process.

Parse Business Event payload JSON

After the trigger, the first step in the flow is to parse the JSON supplied as a payload with the Business Event. This allows us to reference workflow data later in our flow.

Capture workflow instance ID as a string

Next we want to capture the workflow instance in a string variable to allow us to query and update the workflow item later in the flow.
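In plain terms, these two steps parse the JSON payload and lift one value out of it. A Python illustration (the field names and values below are illustrative; inspect the payload your business event actually delivers):

```python
import json

# Illustrative payload; the real business event carries more fields.
payload = json.loads("""
{
  "BusinessEventId": "WorkflowWorkItemBusinessEvent",
  "WorkflowInstanceId": "USMF-000000113",
  "WorkflowUserEmail": "julia@contoso.com",
  "WorkflowStepInstruction": "Please approve the journal."
}
""")

# Capture the workflow instance ID as a string for the later
# validate and complete actions.
workflow_instance = str(payload["WorkflowInstanceId"])
print(workflow_instance)  # → USMF-000000113
```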

Check the state of the workflow

Before we send the workflow item for approval with the user, we need to ensure that it is in a state that allows for approval. In this case we are using the “Execute action” action on the D365F&O connector in Power Automate calling an action named “WorkflowWorkflowItems-validate” with the workflow instance as a parameter.

Acting on workflow state

If the response from the check is “True”, the workflow is in a ready-state for approval actions to be applied.

Configuration of the approval action in Power Automate

In Power Automate I use the Approval action to start the approval process. In this example I use data captured in the payload (see above) to fill in the Approval action properties.

Firstly I use the “WorkflowUserEmail” value to assign the approval request to the user assigned to the workflow item.

Subsequently, I fill in the details of the message with “WorkItemOwner” (user) and “WorkflowStepInstruction” (instructions from the workflow item).

Lastly, I use the “LinkToWeb” value for the Item link property. This allows the user to click on the link in the approval request and jump straight to the ledger journal in D365F&O.

Once the user has approved or rejected the approval request, we need to update the workflow item with the result.

Update workflow item with approval result

As you can see from the above, I am using the “WorkflowWorkflowItems-complete” action for this. I am using the “Outcome” property from the Approval action to set the outcome in the workflow item.

The possible outcomes for the workflow item are available in the payload and can be used in the approval request.

I am using the “Response summary” from the approval request in the Comment field and the “WorkItemOwner” from the payload as the RunAsUser.


So now it is time to test our configuration. To get started I create a new general ledger journal and select Submit from the ribbon:

Submit general ledger journal for approval

Assuming that everything is configured correctly, the user now receives an approval request. In my case I am using the Approvals app in Teams (who would have guessed?).

This screenshot shows the approval I receive in Teams:

Teams approval request

The same request is also sent by e-mail to the user.

From the request I can approve or reject the item, but I can also click on the link to jump directly to the general ledger journal.

When looking at the workflow history from the general ledger journal in D365F&O, we can now see that the workflow item has been approved:

Workflow item history

As I have shown in this blog post, we can extend inbuilt workflows using Business Events and, in this case, Power Automate. Beyond external approval, this opens up a range of options for external data processing as part of a workflow.

How to Create a Quick Customer Profitability Analysis with Dynamics 365 Finance and Excel

How often do you get the question:

Can we produce a full customer P&L that allows us to analyze customer profitability?

Of course you can!

In Dynamics 365 Finance there are a number of ways to achieve this, but in this example we will leverage the new “customer account and name on finance transactions” feature to achieve our objective.

Firstly, I need to get the data I need for my analysis. A simple way to do this is to use General ledger / Inquiries and reports / Voucher transactions. The following screenshot shows my query:

Voucher Transactions Query

Since I would only like to see a customer P&L, I am only interested in Revenue and Expense accounts.

Once I have made my selection, I get the result as raw finance transactions.

Query Result

As you can see from the screen, the transactions now include fields for customer ID and customer name.

From here I want to export the raw data into Excel for further analysis. In the ribbon I click on the Office icon and select General journal entry.

Export to Excel

The data is now in Excel as shown below.

Data Exported to Excel

However, since the data is based on financial transactions, the sign is reversed for our purpose; revenue is, after all, positive. To fix this, I create a new column in the table named Reversed Amount where I multiply the Amount in reporting currency by -1.

Reversed Amount Column
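The sign flip, and the aggregation the PivotTable will do for us in a moment, can be mimicked in a few lines of Python (illustrative rows):

```python
from collections import defaultdict

# (customer, account, amount in reporting currency), illustrative rows
# with the raw ledger sign: revenue credits are negative.
rows = [
    ("US-017", "Sales revenue", -1000.00),
    ("US-017", "Cost of goods",   600.00),
    ("US-004", "Sales revenue",  -500.00),
]

pl = defaultdict(float)
for customer, account, amount in rows:
    pl[(customer, account)] += amount * -1  # reversed amount

print(pl[("US-017", "Sales revenue")])  # → 1000.0
print(pl[("US-017", "Cost of goods")])  # → -600.0
```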

Now we are ready to perform our analysis. First, I mark the table and click on PivotTable in the Insert menu.

Insert PivotTable

This creates a new PivotTable in a blank workbook.

New PivotTable in a Blank Workbook

Just to get us started, I add Customer name and Account name to the Rows section of the PivotTable designer. Also, I add the Reversed Amount field and rename it to Amount ($). Now the PivotTable has the basic ability to show the P&L by customer.

Basic Customer P&L

In this example, I only have two customers, but in reality I may have many. Therefore, I would like to add a slicer to allow me to filter the customers. The following screenshot shows how the slicer is added.

Adding Customer Slicer

There is now a customer slicer, and below I have selected customer “US-017”; the PivotTable is filtered accordingly.

Customer Slicer Applied

Also, it is often relevant to filter on periods. To do this, I add a Timeline slicer as shown below.

Adding Timeline Slicer

I now have the ability to filter the data on days, months, quarters and years.

Timeline Slicer Added

Lastly, I do a bit of formatting to make it look a bit nicer.

Final Formatting

This was just an example to show how easy it is to use the new customer and vendor fields in the finance transactions to perform ad hoc analysis. Obviously, using financial dimensions for customer and vendor instead could achieve something similar.

Also, the CFO Overview analytical workspace provides the ability to show a customer trial balance with budget variance and drill-down.

Adding a Power Apps app as a Tile in the Dynamics 365 Finance and Operations Dashboard

As you know, I am a big fan of Power Apps and the ability to embed them in the Dynamics 365 Finance and Operations (D365F&O) user interface.

In this blog post I am going to show how you can embed a Power Apps app as a tile in the dashboard.

The following screenshot is a fairly normal dashboard for a user working in production.

Production User Dashboard

In this example, I have created a Power Apps app that allows users to create asset maintenance requests from outside Dynamics 365 Supply Chain Management (D365SCM).

This is what the app looks like.

Asset Maintenance Request Landing Page
Asset Maintenance Request Create Form

The app is designed for a tablet, but in this example, the production user quite likes the app and would like to use it directly from the D365F&O user interface – specifically in the dashboard.

To start the process, the user right-clicks in a blank part of the dashboard and selects Personalize: TilePageContainer as shown below.

Personalize Tile Container in the Dashboard

This brings up the personalisation dialogue for the dashboard as shown here.

Personalization Dialogue

To add the app to the dashboard, the user must now select + Add a page.

This brings up the following dialogue where the user can select from three options.

Add a Page Options

In this case the user selects the Power Apps option.

A new dialogue allows the user to give the tile a name and enter the app ID.

Add an App Details

The App ID GUID can be found on the Details page for the Power Apps app.

When the user clicks OK, the app is added to the dashboard as shown below.

Dashboard with Added Power Apps Tile

When the user clicks on the tile, the Asset Maintenance Request app is opened within the D365F&O context.

Asset Maintenance Request App in Dynamics 365 Finance and Operations

I hope this small example shows how easy it is to enrich the D365F&O user experience with Power Apps apps.

Adding Routing Capabilities to Asset Management in Dynamics 365 Supply Chain Management

Ever thought it would be nice to know how long it takes to get from where you are to a functional location in Asset Management (AM)? With Power Apps and Bing Maps you can.

In this blog post, I will explore how we can enrich the AM experience with routing capabilities using Bing Maps. Bing Maps, as you know, provides various mapping capabilities and is a native Power Apps connector (in preview). It comes with an API that allows us to call various mapping functions.

Please note that Bing Maps is a paid service so you need to sign up and generate an API key before you can use the service.

In this small example, I have used Bing Maps in a Power Apps app to show routing information between my current location and the location of the selected functional location in AM. This is what the app looks like:

Functional Location Routing Information

As you can see, the app displays my current location and the location of the asset (functional location). Bing Maps calculates distance, duration and estimated arrival time between the two locations and displays a short route description. Optionally, we can select what to avoid on the route.

Once you have added Bing Maps to your app as a data source, it is relatively easy to call the service using the API.
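To give a flavour of the underlying service, here is a hedged Python sketch of the kind of Routes request the Bing Maps connector wraps (the coordinates and key are placeholders):

```python
from urllib.parse import urlencode

def bing_route_url(start, end, api_key, avoid=None):
    """Build a Bing Maps Routes REST request between two waypoints;
    'avoid' accepts values such as 'highways' or 'tolls'."""
    params = {"wp.0": start, "wp.1": end, "key": api_key}
    if avoid:
        params["avoid"] = avoid
    return "https://dev.virtualearth.net/REST/v1/Routes?" + urlencode(params)

# Route between two lat,long waypoints, avoiding motorways:
print(bing_route_url("55.6761,12.5683", "55.4038,10.4024",
                     "YOUR_BING_MAPS_KEY", avoid="highways"))
```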

The idea was to provide routing capabilities in AM. To achieve this, I have embedded the app in the Functional Locations details page. By providing the functional location ID to the app, I can look up the primary address and use this as the routing endpoint.

This is what it looks like:

Routing Tab Added to the Functional Locations Details Screen

As you can see from the above screenshot, I have added a new tab to the screen called Routing Information through personalisation.

When the tab is expanded you can see that the app is now embedded directly in the user interface and since the functional location can be passed to the app, it is also context sensitive.

Routing Capability Embedded in User Interface

When I toggle the Avoid motorways option, the app instantly recalculates the distance and duration as shown below.

Using Avoid Options in the App

This was just one example of how we can leverage online services to enrich the Dynamics 365 Finance and Operations experience using the Power Platform.

In a coming blog post, I will look at the mechanics behind the app and show how easy it is to interact with the Bing Maps API.

A Closer Look at How I Use AI Builder Models inside a Power Apps App

Previously, we looked at how to create and train AI models using AI Builder in Power Apps and how they can be embedded in an app.

In this blog post I am taking a closer look at how we can use Power Automate to take the data captured in the app and submit it to Dynamics 365 Finance.

As you may recall from the previous blog post, the Power Automate flow is called when the user clicks on the save icon in the app to submit the invoice data to Dynamics 365 Finance.

Power Automate flow is triggered by Power Apps

As you can see from the above screenshot, the flow is triggered by Power Apps as you would expect. Subsequently, I initialise a number of variables that I need further down the flow. Some variables, for instance Amount and MainAccount, are passed from the app.

Since the VendInvoiceJournalLines data entity in Dynamics 365 Finance has separate debit and credit fields, I create a condition to set the variables based on the amount received from the app as shown below.

Set debit / credit fields based on amount sign
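The condition boils down to a sign test. As a Python sketch (that a positive amount from the app becomes a debit is my assumption for this example):

```python
def split_debit_credit(amount):
    """Map a signed amount onto the separate Debit and Credit fields
    of the journal line entity; assumes positive means debit."""
    if amount >= 0:
        return {"Debit": amount, "Credit": 0.0}
    return {"Debit": 0.0, "Credit": -amount}

print(split_debit_credit(125.50))  # → {'Debit': 125.5, 'Credit': 0.0}
print(split_debit_credit(-80.00))  # → {'Debit': 0.0, 'Credit': 80.0}
```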

Since this flow is used both for creating invoice journals (this example) and general ledger journals, I need a condition to determine if it is one or the other. In this case, I make the distinction based on whether an off-set account was passed from the app.

Condition to determine the type of journal

In this example, we do have an off-set account, so the condition will go down the No path.

The first step we make to create an invoice journal in Dynamics 365 Finance is to create the journal header.

Create invoice journal header in Dynamics 365 Finance

As you can see, I am using the standard Dynamics 365 Finance and Operations connector (OData) in Power Automate to create the header. The Name attribute is set to “APInvoice”. This is a journal name I have set up under General ledger / Journal setup / Journal names as shown below.

Journal name set up in Dynamics 365 Finance

The data entity is designed in a way that it will automatically draw the next journal number from the number sequence.

Why did I use the invoice journal instead of the invoice register or pending invoices? In this example where the user is also the approver, I assume the invoices can be posted without any further review.

Next step is to create the invoice journal line.

Create the invoice journal line in Dynamics 365 Finance

As you can see, the Journal batch number field is filled in by a placeholder holding the journal number value generated when we created the journal header.

If I expand the connector by clicking on Show advanced options, I am able to fill in the remaining fields in the data entity. To make this example work, I fill in the following fields:

  • Credit – Credit variable
  • Document – Invoice number from app
  • Document date – utcnow() **
  • Account – Main account from app
  • Offset account – Main account from app.Cost centre from app *
  • Debit – Debit variable
  • Invoice – Invoice number from app

Additional fields required to create the invoice journal line in Dynamics 365 Finance

* Please note, the off-set account must respect the format set up in Dynamics 365 Finance under General ledger / Chart of accounts / Dimensions / Financial dimension configuration for integration applications.

** Expression used to set timestamp with current date and time.

Lastly, I capture the voucher number created for the line in a variable as we will need it later in the flow.

Set voucher variable

The last major step in the flow is to store the invoice image and alert the app user.

Store image and notify user

Firstly, I have inserted a condition to check if there is actually an image included. If not, we do nothing.

In the first step, I use a Compose action to convert the image format received from the app to a binary format.

Convert image string to a binary format
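In Python terms, the conversion the Compose step performs looks like this (Power Automate typically does this with a dataUriToBinary() expression; the sample string below is illustrative):

```python
import base64

def data_uri_to_binary(data_uri):
    """Strip the 'data:<mime>;base64,' prefix from a data URI and
    decode the remainder into raw bytes."""
    header, _, encoded = data_uri.partition(",")
    if ";base64" not in header:
        raise ValueError("expected a base64-encoded data URI")
    return base64.b64decode(encoded)

print(data_uri_to_binary("data:image/png;base64,aGVsbG8="))  # → b'hello'
```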

Subsequently, I store the binary image content in BLOB storage in Azure.

Store image in Azure BLOB storage

As you may have noticed, I am using the voucher number for the file name. This makes it easier for me to identify the image later on.

In the last step, I notify the user using Outlook 365.

Notify user through e-mail

So what does the result look like?

Invoice journal header created by the flow

The above screenshot shows the journal header created by the flow.

If we open the journal, we can see that a journal line has been created using the information we captured in the app.

Invoice journal line based on captured information

As you may have noticed, the screenshot contains a tab named Invoice image. This is not a standard tab; I have added it using personalisation.

When I click on the tab, the invoice image stored in Azure BLOB storage earlier on is displayed. This is achieved by embedding a Power Apps app in the user interface using personalisation.

Embedded Power Apps app displaying the invoice image

In my view, this is a great example of how we can extend the Dynamics 365 Finance user interface without writing customisations.

You can read more about user interface personalization in the official documentation.

This concludes this short series on how to leverage AI capabilities with Dynamics 365 Finance using the Power Platform. I hope you found the information useful.

A Closer Look at How I Use AI Builder Models inside a Power Apps App

Previously, we looked at how to create and train AI models using AI Builder in Power Apps.

In this post we will take a look at the mechanics behind how the AI models are used in the app.

As you may recall from my previous post, the user was guided to a screen where he or she could take a picture of an invoice for processing. The following screenshot shows this screen in edit mode in the Power Apps editor.

Figure 1: Upload Invoice Screen in Power Apps Editor

As you can see, at the top of the screen, I have inserted a Form processor AI Builder control. This is done simply by going to the Insert menu in the ribbon and selecting Form processor from the AI Builder drop-down. You are then presented with a selection box where you can select from existing AI Builder models (or create a new one).

In the properties pane on the right-hand side, you can see the various options available for the control. This is all we need to do to embed the form processor AI Builder model we created earlier into the app.

Now we need to react to the result once the user has uploaded an invoice image and the image has been processed. We do this by writing some code in the OnChange method.

Before we start looking at the code, here is some important information:

  1. The form processor control is named AIInvoice.
  2. You refer to a field on the invoice using AIInvoice.Fields.[Field].
  3. Instead of using the category classification AI Builder model I created earlier, I am actually using the underlying Azure cognitive service directly because when I built the app, it was not possible to access the model programmatically.
  4. The Azure text analysis service has been added to my data sources and is named TextAnalytics.
  5. In CDS I have created a table called TextToAccounts where I match key phrases with ledger accounts.

Now, let’s take a look at the code in the OnChange method. The statements below are an outline of the logic; names such as Description, KeyPhrase and VendorAccount are illustrative and may differ in your model:

Set(PostingText, AIInvoice.Fields.Description);

// This statement sets the value of the PostingText variable to the description found on the invoice.

ClearCollect(Tags, TextAnalytics.KeyPhrases({language: "en", text: PostingText}).keyPhrases);

// This statement creates a new collection and collects key phrases from the invoice description using text analysis.

Set(AccountTag, First(Tags).KeyPhrase);
Set(CostAccount,LookUp(TextToAccounts,Text = AccountTag).Account);

// The first statement sets the AccountTag variable to the first tag found in the collection. Based on this tag, the CostAccount variable is set to the ledger account value found by looking up in the TextToAccounts table in Common Data Service (CDS).

ClearCollect(InvoiceImage, AIInvoice.OriginalImage);

// The captured image is stored in a new collection.

Set(VendAccount, AIInvoice.Fields.VendorAccount);

// The VendAccount variable is set to the captured vendor account from the invoice.

As you can read from this short article, it is now possible (in preview) to use an AI Builder model programmatically within a Power App. This means that instead of using the Azure text analysis service, I would have been able to use a model built in AI Builder instead. A topic for a future blog post…

Now we have combined the form processor results with the text analysis service from Azure to provide us with the information we need to submit the invoice to Dynamics 365 Finance.

We will look at how this is done using Power Automate in the next and final post in this series.

Using a Power Apps app with AI Builder models to Capture a Vendor Invoice

In my previous posts, I used AI Builder in Power Apps to set up models for form processing and category classification respectively. In this post I will show the app I have built with Power Apps that brings those two capabilities together. In the next post, I will dissect the app to show how to leverage the AI Builder models inside an app.

In the last post of this series, we will take a quick look at how I use Power Automate to submit the invoice to Dynamics 365 Finance.

As you can see from the screenshots, my use case has a slightly agricultural flavour.

Screen 1: Landing Page

When I start the app, I am presented with a landing page. As you can see, the app is quite simple, but the landing page allows me to add more items to the menu if required.

From the landing page, I select Process Invoice.

Screen 2: Select Cost Centre

In my example, I am asking the user to select the cost centre where the cost on the invoice needs to go. In this case I select John Deere Tractor 5100R and click on the <NEXT> icon on the line.

Screen 3: Main Invoice Processing Screen

From this screen the user is able to see the selected cost centre and the captured invoice image before submitting it to Dynamics 365 Finance. However, first, the user must capture the invoice by clicking on the camera icon.

Screen 4: Capture Invoice Screen

I now come into the screen where I can capture the invoice. This is done by clicking on the Capture Invoice button. Since I am doing this from an iPhone, I am asked to take a photo or select an existing photo from the library as shown below.

Screen 5: Select Photo Source

In this case I take the following photo of the invoice.

Screen 6: Invoice Image

When I click on Choose, the image is submitted to the AI Builder form processor for interpretation. The result is shown below.

Screen 7: Invoice Interpretation Result

As you can see, the image is brought back into the app and the form processor has interpreted the invoice. The image has labels applied that show the captured fields and their accuracy. In the lower section of the screen, I am using this information to fill out the variables I need to submit the invoice to Dynamics 365 Finance.
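In Power Fx, pulling the captured values into variables can be sketched roughly as follows (AIInvoice is the name of my form processor control; Description and VendorId are illustrative field names that depend on the form processing model):

```powerfx
// Sketch: copy the fields captured by the form processor into app variables.
// Field names (Description, VendorId) must match those defined in the model.
Set(PostingText, AIInvoice.Fields.Description);
Set(VendAccount, AIInvoice.Fields.VendorId);
```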

Since I am using an English form processing model, the description has been interpreted as “110mm ror” instead of “110mm rør”. After the invoice has been interpreted, I take the description and process it through the category classification model I built earlier. This identifies the description as belonging to the category “ror”, and this category is tagged to the main account “2020”.

Since the invoice has been interpreted correctly, I can click on the accept icon to go back.

Screen 8: Submit Invoice to Dynamics 365 Finance

Coming back to the main processing screen, the captured image is now displayed. Also, the save icon becomes available and when I click on it, the invoice is submitted to Dynamics 365 Finance.
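Behind the save icon, the app hands the captured values over to a Power Automate flow. A minimal sketch of what that call might look like in Power Fx (the flow name SubmitInvoice and its parameter list are hypothetical; the actual flow is the subject of the final post):

```powerfx
// Sketch: trigger the Power Automate flow with the values captured earlier.
// SubmitInvoice is a hypothetical flow name added to the app as a data source.
SubmitInvoice.Run(VendAccount, CostAccount, PostingText);
```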

Once the Power Automate flow has finished processing, the user is told the invoice has been processed.

Obviously, in the real world I should have trained the model to read multiple lines into a table. However, I hope this simple example has shown the power of using AI Builder in apps.

In the next post, I will take a look at the mechanics of this app. Stay tuned…

Using Category Classification AI to Identify the General Ledger Account

In my previous post, I took a look at how to build a model to interpret vendor invoices using AI Builder in the Power Platform. In this blog post, I take it a step further and build a model that analyses the invoice (line) description and matches it to a general ledger account. This is, potentially, a way to fully automate the AP process end-to-end.

Before I can create the AI model I need some data to train it on. This data needs to reside in the Common Data Service (CDS). Therefore, I have created a new entity in CDS called “Text to Account” as shown below.

The entity is very simple and only contains two important fields:

  • Text.
  • Account.

When you look at the data I have populated the entity with:

you can see that for each text example, the Account field contains the corresponding general ledger account. This way, the model can learn the relationship between a text and the account. Obviously, the more data you feed into the model (entity), the more accurate it becomes.
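The same mapping can, of course, also be queried directly from a Power App. A hedged Power Fx example, assuming the entity and fields are named as above:

```powerfx
// Look up the ledger account stored for a given text phrase.
LookUp('Text to Account', Text = "110mm rør").Account
```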

Now it is time to create the AI Builder model. To do this, you must go to Power Apps and open the Build sub menu.

From here, select the Category Classification tile and give the new model a name as shown below.

When you click Create, a wizard takes you through the five model creation steps.

First, you must select the data entity (see above) where the text data is stored and within the data entity the exact field that contains the text phrases.

Next, you must select the field that contains the corresponding tag. In this case the tag is the Account field.

As the following screenshot shows, the AI Builder analyses the data and proposes a separator scheme. As you can see, this is mainly meant to deal with scenarios where data is stored in a delimited format. In my case there is no delimiter.

The next step is to review how the model has matched text phrases with tags (accounts).

As you can see from the above example, each text phrase has been matched with a tag. This is what we want.

The last step is to select the language used for the model. For more information on geographical availability, please see this link.

That is it. All I need to do now is click the Train button on the summary screen to train my model.

Once the model has been trained, all you need to do is publish it to make it available for use in Power Apps.

In the next blog post in the series, we will take a look at how we can use the two models in a Power App to add a layer of intelligence to the experience. This is where it all comes together.