Chaining Test Cases in Dynamics 365 for Finance and Operations

In a previous blog post, I wrote about how to automate regression testing using the Regression Suite Automation Tool (RSAT) with Dynamics 365 for Finance and Operations (D365F&O).

At the time, I promised to come back to how you link test cases using RSAT. Therefore, in this blog post I will explore this topic a bit further. In fact, linking test cases using RSAT is incredibly simple.

Capturing Values During Recording

To link test cases, you need to capture one or more values to pass between the test cases.

In this example I have two test cases:

  1. Create sales order.
  2. Generate delivery note.

When executing my test plan, I would like to generate a delivery note (test case #2) for the sales order created in test case #1. To do this, I need to pass the sales order number from test case #1 to test case #2.

For this to happen, I need to capture the sales order number during task recording.

Chain 1.png

In the above screenshot, I am recording the process for creating a new sales order. Once the sales order number has been generated, I right-click in the field and select Task recorder / Copy.

The function does not copy the value itself, but the field meta-data.

Passing Meta-Data Downstream

The spreadsheet containing the task recording meta-data now has a section in the “General” tab called “Saved variables”. In cell A15 in the spreadsheet below, you can see the reference to the sales order number we have just captured.

Chain 2.png

The reference value is: {{SalesTable_SalesTable_SalesId_3975_Copy}}

Now, if I open the task recording spreadsheet for the second test case (“Generate delivery note”), I can use this reference value to link the two test cases together.

Chain 3.png

As shown in the above screenshot, I have navigated to the “SalesTableListPage” tab in the spreadsheet for the second test case. In this tab there is a field reference called “Sales order”. This is because, in the second test case, I filter the sales order list page on sales order number to find the correct sales order. Instead of using a fixed value, I paste the value reference ({{SalesTable_SalesTable_SalesId_3975_Copy}}) from the first test case into the Value cell. This way I pass the sales order number between the two test cases (by reference).

It really is that simple!

So in my scenario, the automated test will always create a new sales order (test case #1) and in the second test case, it will look up this new sales order and generate a delivery note for it.

Obviously, you need to get the sequence of the test cases right for this to work.

Since the reference value uses the test case number, it is possible to link across any number of test cases and also reuse the value several times. If for instance, I had a third test case that generated an invoice, I could also pass the sales order number to this test case using the same value reference.
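Conceptually, the {{…}} references behave like a simple variable substitution: values captured in one test case are stored under a name that embeds the test case number, and any later test case can resolve that name. The following Python sketch is purely illustrative (the variable name and sales order number are examples, not values RSAT would necessarily produce).

```python
import re

# Saved variables captured during earlier test cases, keyed by reference name.
# The name embeds the originating test case number (here 3975), which is why
# any number of downstream test cases can reuse the same captured value.
saved_variables = {
    "SalesTable_SalesTable_SalesId_3975_Copy": "SO-000123",
}

def resolve(cell_value: str, variables: dict) -> str:
    """Replace every {{reference}} in a parameter cell with its captured value."""
    return re.sub(r"\{\{(\w+)\}\}", lambda m: variables[m.group(1)], cell_value)

# Test case #2 (and #3, and so on) can filter on the captured sales order number.
filter_value = resolve("{{SalesTable_SalesTable_SalesId_3975_Copy}}", saved_variables)
print(filter_value)  # SO-000123
```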

Additional Reading

There is a short article on how to chain test cases on Docs.

The Financial Period Close Workspace in Dynamics 365 Finance

For many organisations, month-end and year-end closing processes do not run as smoothly as they would like. To alleviate this problem, Dynamics 365 Finance (D365F) comes with the Period Close Workspace (PCW).

The PCW is a dedicated workspace designed to help finance managers and participants in the closing process to work in a structured way and maintain an overview of the status of the process.


The following figure shows a simplified overview of the components within PCW.

Period 1.png

A period close process is carried out based on a Closing Schedule. The closing schedule is generated from a Template and carried out within the time frame specified in the Calendar.

A template consists of a list of Tasks to be carried out. Each task can be associated with a task link that opens a D365F menu point. Also, on the task you can define the day offset and time deadline for the task.
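The relationship between the schedule start date, the day offset and the time deadline can be sketched as follows. This is a minimal illustration of the idea, assuming offset 0 means the first day of the process; it ignores the closing calendar's non-working days, which the real schedule generation takes into account.

```python
from datetime import date, datetime, time, timedelta

def task_deadline(schedule_start: date, day_offset: int, deadline: time) -> datetime:
    """Combine the closing schedule start date with a task's day offset and
    time deadline (offset 0 = first day of the process)."""
    return datetime.combine(schedule_start + timedelta(days=day_offset), deadline)

# A task with offset +1 and a 12:00 deadline, on a schedule starting 25 Nov 2019,
# is due at noon on the second day of the process.
print(task_deadline(date(2019, 11, 25), 1, time(12, 0)))  # 2019-11-26 12:00:00
```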

On the task, you can select the Closing Role accountable for the task and the Legal Entities in which the task must be performed.

Financial Period Close Configuration

In the following subsections, we will take a look at the configuration required.

Task Areas

Period 2.png

To keep track of the progress for each business area, it is necessary to configure task areas. In the above example, the task areas resemble the sub ledgers, but a company is free to define the task areas in any way they want.

Closing Roles

Period 3.png

In the same way, closing roles must be defined. A task is always associated with a closing role, so it is possible to track progress by closing role.

As the following example shows, for each closing role, it is possible to define the persons working in that role by legal entity. When a closing schedule is generated, the system will automatically assign tasks to resources in a role, but this can be overridden by the manager.

Period 4.png


Closing Calendars

Lastly, we must define closing calendars to use when we generate the closing schedule.

Period 5.png

In the above example, I have created the closing calendar for November 2019. Since I have decided that work is not done on Saturday and Sunday, the closing schedule can take up to five working days.

The Closing Template

With the basic configuration in place, it is time to create the closing template. As mentioned above, the template contains all the tasks that must be carried out as part of the period close process. The following example shows the template task list for a month end process. An organisation can have as many templates as they like and use them for specific purposes.

Period 6.png

If we take the “Verify customer aging” task as an example, we can see that it must be carried out on the second day of the process (offset+1) before 12:00pm. The task falls within the “AR Payments Clerk” closing role and is applicable to multiple legal entities. The task link points to the “Customer aging report”.

If we click on the Set dependency menu point, we can see that the task depends on two other tasks that must be completed before this task can start, as the following screenshot shows.

Period 7.png

Lastly, if we click on Attachments in the ribbon, we can see that a spreadsheet has been attached to the task.

Period 8.png

In this example, the spreadsheet is used by the person carrying out the task to report customer aging. This way, it is possible to attach guidelines, documents and spreadsheets to help the users carry out the task.

Generate Closing Schedule

Now it is time to generate the closing schedule to start the closing process.

Period 10.png

Clicking New in the ribbon brings up the dialogue to generate a new closing schedule. In this example, I am creating a closing schedule for multiple legal entities, but it is also possible to generate individual schedules instead.

Processing a Period Close

Once the closing schedule has been generated, it is possible to select it at the top of the workspace as shown below.

Period 11.png

As you can see, the workspace has now been populated with data based on the closing schedule.

In the Tasks and status section in the workspace, it is possible to view tasks grouped by legal entity, task area or person. It is also possible to click on the Task list and see all tasks.

In the following screenshot, I have filtered the list on Area = “Accounts receivable” and Task = “Verify customer aging”. Based on the template, the task has been generated in multiple legal entities.

Period 12.png

By hovering over the padlock on the line, the user can see the tasks that must be completed before this task can start (task dependencies), as shown in this screenshot.

Period 14.png

In this case, Arnie must first do the billing and then post the open payment journals before he can verify customer aging. If Arnie ticks the Completed field, he is told to complete the task dependencies first.

As the following screenshot shows, once the task dependencies have been completed, the padlock is removed and Arnie can complete the task.

Period 15.png

Before Arnie completes the task, he clicks on Template attachments and opens the attached spreadsheet template.

Period 16.png

Once he has filled in the template, he attaches the spreadsheet to the task for future reference. Arnie now completes the task by ticking the Completed field.

Correcting the Schedule

In real life, the closing schedule may require some adjustments as we go along.

Therefore, it is possible to add, remove and edit tasks in the schedule.

Closing Schedule Status

There are numerous ways to follow-up on the closing schedule. In the following example, I have selected the “Accounts receivable” closing area and as you can see, 42 of 45 tasks are still remaining. At the top of the list, I can see the (3) completed tasks. The system also tells me that the “Accounts receivable” area is 6.67% complete.

At the top, it is possible to filter on due and past due tasks as well as on legal entity.

Period 17.png

This way, the finance managers and individual users can follow progress. If a task becomes overdue, it will be marked with a red exclamation mark. Also, on the left-hand side of the workspace, four standard tiles show status information.

Under General ledger / Period close / All financial period close tasks, it is possible to create new tiles and lists and add them to the workspace as personalisation.

The following screenshot shows a list of my incomplete tasks with a due date before today’s date. I have created this query by using the Advanced filter or sort function in the ribbon.

Period 18

I have added this selection as a tile to the workspace as “My Due Tasks” as shown below.

Period 19.png


With PCW, organisations using D365F are now able to structure the entire period close process across the enterprise, making sure dependencies are maintained and avoiding errors and potential rework.


For more detailed information on the PCW, please visit the Docs site.

Working with Consignment Stock in Dynamics 365 Supply Chain Management

In Dynamics 365 Supply Chain Management (D365SCM) it is possible to work with inbound consignment stock. This means that the vendor owns the stock until it has been consumed by the customer. In this blog post I will explore how to set up and work with consignment stock in D365SCM.

Consignment Stock Set Up

As mentioned above, consignment stock is owned by the vendor until it is consumed by the customer. This means that when a product is set up in D365SCM, we must select a Tracking dimension group with the inventory dimension “Owner” activated as shown in the following screenshot.

Consignment 1.png

In addition, we must associate the Owner inventory dimension value with a vendor. This is done under Inventory management / Setup / Dimensions / Inventory owners. In the following example, I have associated vendor account “SAM-001” with inventory dimension value “SAM-001”.

Consignment 2.png

This, essentially, is all the setup required to use consignment stock.

Inventory On-Hand View in Vendor Collaboration Workspace

In this example, I have set up item “SAM-003 Glat uPVC afløbsrør m/muffe – EN 1401” to use the “SAM-001” Owner inventory dimension. Therefore, when a vendor collaboration user for vendor “SAM-001” logs on, they can see on-hand stock for this item under Vendor collaboration / Consignment inventory / On-hand consignment inventory as shown below.

Consignment 3

Replenishment Orders

Consignment stock is replenished through replenishment orders. These are found under Procurement and sourcing / Consignment / Consignment replenishment orders. In the below example, we have ordered 100 pieces from vendor “SAM-001”.

Consignment 4

The vendor does not receive an order, but can see the requirement in the on-hand consignment inventory view in the vendor collaboration portal instead.

Product Receipt

Based on the requirement generated by the replenishment order, the vendor can now send the required products. In the Arrival overview found under Inventory management / Inbound orders / Arrival overview, the warehouse worker can see that we expect 100 pieces to arrive from “SAM-001” on a replenishment order as shown below.

Consignment 5.png

The warehouse worker will perform a normal item arrival process from here.

This generates the below arrivals journal, which must be posted in the normal way.

Consignment 6

As you can see in the above screenshot, the system automatically fills in the Owner dimension based on information on the replenishment order.

When we look at on-hand stock for item “SAM-003”, we can see that 100 pieces are now available.

Consignment 7

The same information is also available in the vendor collaboration portal.

Since posting the arrival journal only recorded the goods as registered, we will now post a product receipt to formally receive the goods into stock. This is done from the replenishment order as shown below.

Consignment 8.png

At this stage, the status of the stock transaction for the 100 pieces is “Purchased” as this screenshot shows.

Consignment 9

However, since the stock is still owned by “SAM-001”, no ledger transactions have been generated at this point.

Consuming Consignment Stock through a Sales Order

In this example we will use a sales order to consume consignment stock. As the below screenshot shows, it is not possible to reserve against the consignment stock since it is not owned by Contoso.

Consignment 10.png

Please note, it is possible to reserve against consignment stock not owned by Contoso from a production order line.

Change of Ownership

Therefore, before I can consume the consignment stock, I need to change ownership. This is done in a journal under Procurement and sourcing / Consignment / Inventory ownership change.

Consignment 11.png

As the above example shows, I use the journal to change the ownership from “SAM-001” to “USMF” for 20 pieces. This results in the following two stock transactions.

Consignment 12.png

The system has now automatically created a new purchase order (00000700) and set the status for this order to “Received”. Contoso now formally owns 20 pieces of “SAM-003” and a goods-received-not-invoiced (GRNI) accrual has been posted.

In my system, I am using the business event for product receipt to automatically generate an e-mail to vendors set up for collaboration. This is the e-mail received by the vendor:

Consignment 14.png

Now the vendor knows that the ownership of the consignment stock has changed and they can invoice 20 pieces. This is the power of Power Automate.

In the vendor portal, the purchase order is now visible under Vendor collaboration / Consignment inventory / Purchase orders consuming consignment inventory as shown here.

Consignment 15.png

The vendor can now use the vendor collaboration invoicing workspace to generate an invoice based on the purchase order.

The Sales Order Revisited

When I go back to the sales order and try to reserve the same lot (20 pieces), the system now allows this and the sales order can be processed, as the following screenshot shows.

Consignment 16

Additional Information

For more information on how to work with consignment stock in D365SCM, please visit this link.





Revenue Recognition in Dynamics 365 Finance

With Platform Update 30 we finally got our hands on the new revenue recognition functionality in Dynamics 365 Finance. In this article I will briefly explore how we can use this functionality to recognise revenue in accordance with IFRS 15.

Basic Set Up

To start with, we need to do some basic setup to get the system working. You can find more details on setup here. However, I would like to pick up on a couple of setup topics in the following.

Revenue Schedules

Obviously, if you want to invoice an order you are still able to do this without invoking the revenue recognition functionality.

To invoke revenue recognition, you need to apply a revenue schedule to the sales order line.

Let’s first have a look at how to define a revenue schedule. You find the revenue schedules set up under Revenue recognition / Setup / Revenue schedules.

Revenue 1.png

As you can see from the above example, my “Subscription” schedule has 12 monthly occurrences, i.e. this is a yearly subscription where the customer is invoiced up-front and revenue is subsequently recognised in monthly arrears on the first day of each month.

If you click on the Revenue schedule details menu point in the ribbon, you can see that the system has automatically distributed 100% recognition across the 12 months:

Revenue 2.png

You can manually override the distribution, if required.
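The even distribution can be sketched in a few lines of Python. This is an illustration only: how D365F itself handles rounding (here the remainder is put on the last occurrence so the schedule always sums to 100%) is an assumption, which is exactly why the manual override exists.

```python
def distribute_evenly(occurrences: int, total: float = 100.0) -> list:
    """Spread the recognition percentage evenly across the occurrences,
    putting any rounding remainder on the last occurrence so the schedule
    always sums to exactly 100%."""
    share = round(total / occurrences, 2)
    shares = [share] * (occurrences - 1)
    shares.append(round(total - sum(shares), 2))
    return shares

# Twelve monthly occurrences: eleven at 8.33% and a final 8.37% to close the gap.
monthly = distribute_evenly(12)
print(monthly[0], monthly[-1], round(sum(monthly), 2))  # 8.33 8.37 100.0
```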

Deferred Accounts

As you can see from the below screenshot, five new posting types have been added to the inventory posting set up.

Revenue 3.png

These accounts are used to defer revenue, COGS and tax when posting and reversing deferred revenue.

Released Product

Lastly, on the released product (“M0013 Microphone”), I have set it up to use the “Subscriptions” revenue schedule on the Revenue Recognition fast tab. This set up will be used as a default when creating a new sales order line.

Order Invoicing

Now, when I create a new sales order line using the above product, the revenue schedule is automatically set to “Subs” as shown below.

Revenue 5.png

As the below invoice shows, there is no indication to the customer that Contoso will recognise revenue using a monthly schedule.

Revenue 6.png

If we take a look at the financial voucher for the invoice, we can see that the postings for revenue and COGS are now made to the deferred accounts using the new posting types.

Revenue 7.png

Recognising Revenue

So far, the generated revenue has been placed on deferred accounts. To recognise revenue we need to process the revenue schedule. To do this, we go to the new revenue recognition workspace found under Revenue recognition / Workspaces / Revenue management (or for a more detailed view, use Revenue recognition / Periodic tasks / Revenue recognition schedule instead).

In this screen we can see all unprocessed revenue schedule entries and use the filters at the top to make a more detailed selection. In my example, I have filtered specifically on the sales order created above.

Revenue 8.png

Revenue is recognised through a revenue recognition journal. To generate a journal, I will need to click on Create journal in the ribbon. In the pop-up dialogue, I must select the date on which to process the data and the to-date for selection. In my case, I would like to process deferred revenue until the end of 2019.

Revenue 9.png

The following screenshot shows the journal that has been generated.

Revenue 10.png

Since I used a specific date (31/12/2019) in the dialogue, all revenue is recognised on this date. If I had opted to use the revenue schedule date in the dialogue instead, the revenue would have been recognised at monthly intervals.

All I need to do now, to recognise revenue for this order, is to post the journal.

Generating a PDF Document using Microsoft Flow

In my previous blog post, I explained how Flow can be used to automate intercompany invoicing. As part of that flow, I submitted the invoice for approval with the receiving legal entity before posting it. In the approval notification, it is possible to embed a link, and I thought it would be nice if the user could see a rendering of the invoice using that link before approving.

Therefore, I decided to expand my flow to also generate a PDF rendering of the invoice and add it to the approval notification.

Firstly, I am using the Word Online (Business) connector in Flow to generate a Word document. The connector takes a Word template as input.

The connector currently has some limitations. You can read more about the connector here.

The following picture shows the template I am using for my example.

IC 16.png

Flow uses so-called content controls within the Word document to populate the template. You must add the Developer tab to the ribbon to be able to add content controls to the template.

Once the template has been designed, it is time to connect data in Flow with the bookmarks (content controls) in the template. The following screenshot shows how this is done.

IC 17

Firstly, you need to point to where the Word template is located. Once the template is selected, the bookmarks in the template should automatically become visible. Now it is only a question of selecting the right content for each bookmark.

The cryptically named bookmark “1602840370” is a so-called repeating section (table row) in the Word template. Since an invoice can contain multiple lines, I need the flow to dynamically expand the table when required. For the same reason, I am not able to statically assign content to the bookmarks. Instead, I use an array that I have generated using the Select Flow control as shown below.

IC 18

The input value to the select control is the output from fetching the invoice lines.

It is important that each key in the Select is named exactly like the bookmark in the Word template. Otherwise, it will not work.
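What the Select control produces can be illustrated in Python: it maps each invoice line to a dictionary whose keys must match the bookmark names in the repeating section. The field and bookmark names below are hypothetical, chosen only to show the shape of the mapping.

```python
# Stand-in for the output of fetching the invoice lines.
invoice_lines = [
    {"ItemId": "D0001", "Qty": 2, "Amount": 150.0},
    {"ItemId": "D0002", "Qty": 1, "Amount": 75.0},
]

# Equivalent of the Select control: one dict per line, keyed by the
# repeating-section bookmark names in the Word template.
select_output = [
    {"LineItem": line["ItemId"], "LineQty": line["Qty"], "LineAmount": line["Amount"]}
    for line in invoice_lines
]

# Each dict becomes one repeated table row; a key that does not match a
# bookmark name would simply leave that cell unpopulated.
print(select_output[0]["LineItem"])  # D0001
```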

The next step is to save the Word document. I save the document to OneDrive as shown below.

IC 19

Subsequently, I use the Word document to generate a PDF document:

IC 20

This document also needs to be stored to allow the user to view it later:

IC 21

We now have both a Word and a PDF representation of the invoice. Now, we need to get a URL that points to the location of the PDF file:

IC 22

Lastly, I need to add the link to the approval component in the Flow:

IC 23

I add the URL (contained in the variable PDFFileName) to the Item link attribute in the approval component.

Now, when the user receives the approval e-mail, a link is available.

IC 25.png

When the user clicks on the link, a rendering of the intercompany invoice is displayed.

IC 24


Again, I think the above example shows how easy it is to augment and automate business processes with Flow. Within a few hours, I have been able to create a solution that automates a cumbersome business process and gives the users a better experience.

And remember, all of this is driven out of business events from Dynamics 365 for Finance and Operations.

Using Flow to Generate Intercompany Invoices in Dynamics 365 for Finance and Operations

Intercompany invoicing in Dynamics 365 for Finance and Operations (D365F&O) is based on a sales order to purchase order relationship by design. In many cases this works fine, but in some scenarios all you want is a lightweight approach that allows a user to simply create an intercompany invoice without having to create sales orders and purchase orders.

Configuring a Business Event

To achieve this, I am using free text invoices in combination with Flow. As shown in the following screenshot I start by activating a business event for when a free text invoice is posted.

IC 1

Once the business event is in place, I need to configure a flow that takes the free text invoice data and converts it into an inbound invoice.

Configuring the Flow

As with any flow triggered by a business event from D365F&O, the flow needs to start with a “When a Business Event occurs” component as shown in the following picture.

IC 2

The flow is now triggered automatically by a free text invoice posted in the USMF legal entity. The next thing to do is to capture the free text invoice meta-data sent with the business event. This is done through a JSON-parser component as shown below.

IC 3

Now that I have access to the data sent with the business event, I would like to determine whether the customer is actually an intercompany customer. In my definition, any customer in customer group “90” is an intercompany customer. Firstly, I fetch the customer record as shown in the following screenshot.

IC 4

Once the customer is found, I simply apply a conditional statement:

IC 5

If the condition is true (the customer is an intercompany customer), I ask the receiving accounts payable team for approval before I create an invoice in their legal entity. The following picture shows the configuration of the approval.

IC 6

This results in an approval e-mail (or mobile notification if you are using the Flow mobile app) sent to the receiver as shown below.

IC 7.png

Once the request has been approved, the flow will continue to execute and in the next step, I fetch the free text invoice header:

IC 8

Subsequently, I use data from the free text invoice header to create a journal header in the invoice register in legal entity DEMF as the following screenshot shows.

IC 9

In the Description field in the invoice register header, I use the below expression to create a meaningful name for the journal.

concat('Intercompany invoice ', body('Get_Free_Text_Invoice_Header')?['InvoiceId'])

Since a free text invoice can contain many lines, I now traverse through the lines:

IC 10

For each invoice line I create a new line in the invoice register journal as the following picture shows.

IC 11

The Result

The following picture shows an example of a free text invoice.

IC 12

Once the free text invoice is posted, a business event is fired off and picked up by the above flow. The flow generates the following invoice register journal header:

IC 13

Within the invoice register journal, the following invoice line is created and populated with data from the free text invoice:

IC 14.png


With this example I have tried to show how simple it is to extend the functionality of D365F&O and automate business processes. Obviously, this is a prototype and in a real-world solution I would probably do more to parameterise the solution, for instance:

  • How is an intercompany customer defined?
  • How is the receiving legal entity determined?

As an alternative to using the invoice register, I could have used the pending invoices register. This would have allowed me to mirror the free text invoice lines with pending invoice lines, but this is a challenge for another day.

Business Events with Dynamics 365 for Finance and Operations

For a while I have been experimenting with the preview version of business events in Dynamics 365 for Finance and Operations (D365F&O), and now that it has been properly released, I have set up some examples to showcase how brilliant the whole thing is.

Business events in D365F&O are relatively well documented, and you can find the documentation here.

Customer Collection Example

Now, one of my examples is this:

when the collection status of a customer transaction changes, I would like to, as a courtesy to the customer, send an e-mail informing the customer of the status and perhaps the next step in the process.

Before I can set up the D365F&O side of this example, I need to create an endpoint, which subscribes to the business event. In this example I am using a Microsoft Flow as the endpoint.

The following screenshot shows the Flow in its entirety.

Event 1.PNG

As you can see, the Flow is triggered by a D365F&O When a Business Event occurs component. To configure the component, I need to enter the D365F&O URL and then select Category and Business Event respectively using the drop-downs. Lastly, I need to select the applicable legal entity. Now this flow is ready to trigger when this specific event happens in the selected instance of D365F&O. I will explain the rest of the Flow later in this post.

If you would like the event to be processed in an enterprise context without using specific user credentials, you would need to use Logic Apps instead of Flow. The configuration is the same, though.

Now back to D365F&O. I go to System Administration / Setup / Business Events / Business Events to activate the business event.

Event 2

After selecting the applicable business event in the library, I click on the + Activate function. This allows me to select the legal entity and bind the business event to the Flow I have just created above. As you can see from my example, I have two flows available, so I need to select the right one (the other one subscribes to a sales invoice business event).

Before going back to configure the Flow, download the JSON schema for this business event by clicking on Download schema. This downloads a text file containing the schema.





Now you have configured D365F&O to send business events. All you have to do is finish the Flow Configuration.

Since I would like my flow to send an e-mail to the customer, I start by initiating a text variable that holds the mail body as shown here.

Event 3.PNG

Next step is to take the content of the business event (a JSON message) and parse it. I use the Parse JSON component to do this. The schema used for this component is the one downloaded above (just paste it in).

Event 4.PNG

Depending on the status of the customer transaction, I would like to compose different e-mail bodies. For this, I use the Switch component in Flow.

Event 5

As you can see, I can use the Status field from the JSON message I just parsed above as the variable for my switch test.

Event 6.PNG

As you can see in the above screenshot, I now populate the MailBody variable by using a combination of HTML and the InvoiceId variable from the JSON message. The business event sends a limited amount of information in the JSON message, but if I need more customer or invoice information, I can always use OData to query D365F&O.

In my example I create a Case component for each customer transaction status and compose mail bodies accordingly.
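In code terms, the Switch with one Case per status amounts to a status-to-body lookup with a default branch. The Python sketch below shows the idea; the status values and wording are illustrative, not the exact D365F&O collection status enumeration.

```python
def compose_mail_body(status: str, invoice_id: str) -> str:
    """One mail body per collection status, mirroring the Switch/Case
    components in the flow. The HTML snippets are examples only."""
    bodies = {
        "Disputed": f"<p>We have registered invoice {invoice_id} as disputed "
                    f"and will contact you shortly.</p>",
        "Promise to pay": f"<p>Thank you for your payment promise on invoice "
                          f"{invoice_id}.</p>",
    }
    # Default case: a neutral notification for any other status.
    return bodies.get(status, f"<p>The status of invoice {invoice_id} has changed.</p>")

print(compose_mail_body("Disputed", "CIV-000667"))
```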

Lastly, I use the Outlook 365 component to send the e-mail to the customer as shown below.

Event 7.PNG

As you can see from the above, I use the MailBody variable to populate the e-mail body. I also set the Is HTML property to Yes to render the e-mail correctly.

The Result

So what happens? As the following screenshot shows, I change the status of a customer transaction from Not disputed to Disputed.

Event 8

Once the status has changed, it is picked up by the system and a business event is fired to the subscribing endpoints.

As the following screenshot shows, the flow is triggered by the business event and runs successfully.

Event 9.PNG

If we double-click on the Parse JSON component we can see the values passed from D365F&O as part of the business event payload.

Event 10.PNG

Lastly, let’s have a look at the e-mail the flow has generated.

Event 11.PNG

Extending the Business Event Library

D365F&O comes with a substantial number of predefined business events. However, if you need other business events to support your use cases, these can easily be built by following these instructions.


In the above, relatively simple example, I have tried to show how easy it is to subscribe to a D365F&O business event and use it to automate a process beyond the boundaries of D365F&O. Obviously, this functionality can also be used to send messages to other business applications and create a loosely coupled ecosystem based on events.



Using Recurring Integration to Import AP Invoices into Dynamics 365 for Finance and Operations

In a previous post, I explored the AP Automation framework used in Dynamics 365 for Finance and Operations (D365F&O) to import vendor invoices.

In this post, I am taking the concept one step further. I am invoking the AP automation framework through Microsoft Flow and the recurring integration framework.


Before you can import vendor invoices, there are a few things you need to do. These preparatory activities are described in the following subsections.

Data Package

The AP Automation framework is designed to import data packages – not individual files, so you need to be able to submit a data package to the service endpoint. A data package consists of the Manifest.xml file, the PackageHeader.xml file and data files for Vendor Invoice Header, Vendor Invoice Lines and Vendor Invoice Document Attachments respectively. The following screenshot shows my data package.


The easiest way to generate the XML files is to export a data package from the Data Management workspace in D365F&O. This generates a .ZIP file you can extract.

The files you want to submit must be zipped before they can be put on the job queue. Each .ZIP file represents a data package.

Data Management Import Job

The next step is to create an import job in the Data Management workspace in D365F&O. The following screenshot shows the import job I will be using to import vendor invoices.


The next step is to set up job recurrence. To do this, click on Create recurring data job in the ribbon. This governs how often the system should process messages put on the inbound message queue. This is shown in the following screenshot.


Some of the settings in this screen are very important, namely:

  • ID = The GUID used to bind the enqueue HTTP call to the job.
  • Application ID = The application ID registered for D365F&O in system administration.
  • Process messages in order = Must be set to Yes to ensure headers are processed before lines.
  • Supported data source type = Must be set to Data package to enable import of data packages.

Folder Structure

Lastly, I have set up two folders on my OneDrive:

  • Pending Invoices = This is where I place packages ready for import.
  • Processed Invoices = This is where packages are moved to after import.

Now we are set up to build the flow that puts data packages onto the job queue.

AP Automation Flow

Any third-party application that can make calls to a REST endpoint can put data packages on the message queue for D365F&O. In this example I am using Microsoft Flow.

The following screenshot shows the required steps in all their glory.


In the following subsections, I will be double-clicking on each box and explaining its configuration. As you can see, I am manually triggering this flow. In the real world, the flow would be triggered automatically, either on a schedule or when a file is created in the Pending Invoices folder, both of which Microsoft Flow supports.

Read Files From Pending Invoice Folder

The first action in the flow is to get all files in the Pending Invoices folder as shown here.


Get File Content

For each file in the folder I get the file content (binary data stream).


Put Data Package onto the Job Queue

The following screenshot shows how you make an HTTP POST call to the REST endpoint for the job queue set up earlier.

AP 5

In the HTTP call you need to configure, as a minimum, the following properties:

  • Method = POST
  • URI = https://[D365F&O URI]/api/connector/enqueue/[ID]; ID is the job ID generated in the recurring job
  • Body = Binary file content for the data package
  • Authentication = Active Directory OAuth
  • Tenant = [Your Tenant]
  • Audience = Mandatory, but content is not used. I use the D365F&O URI.
  • Client ID = The application you have registered for OAuth in Azure
  • Secret = The client secret issued for the application in Azure

This call to the REST endpoint with “enqueue” will put the data package onto the message queue associated with the recurring job. Depending on the frequency of the recurring job, the message will be processed automatically.
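Outside of Microsoft Flow, the same enqueue call can be made from any language that can issue HTTP requests. The following Python sketch, using only the standard library, shows the shape of the call; the tenant, application ID, secret, job ID and environment URI are placeholders you must substitute with your own values.

```python
import json
import urllib.parse
import urllib.request

# Placeholder values; substitute your own tenant, app registration and environment.
TENANT = "contoso.onmicrosoft.com"
CLIENT_ID = "<application-id>"
CLIENT_SECRET = "<client-secret>"
D365_URI = "https://contoso.operations.dynamics.com"
JOB_ID = "<recurring-job-id>"  # the ID (GUID) generated for the recurring data job

def get_token() -> str:
    """Acquire a bearer token via the OAuth client-credentials flow."""
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "resource": D365_URI,  # the audience; identifies the D365F&O environment
    }).encode()
    req = urllib.request.Request(
        f"https://login.microsoftonline.com/{TENANT}/oauth2/token", data=body)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]

def enqueue_package(zip_path: str) -> None:
    """POST the zipped data package to the recurring job's enqueue endpoint."""
    with open(zip_path, "rb") as f:
        req = urllib.request.Request(
            f"{D365_URI}/api/connector/enqueue/{JOB_ID}",
            data=f.read(),  # binary file content as the request body
            headers={"Authorization": f"Bearer {get_token()}"},
            method="POST",
        )
    urllib.request.urlopen(req)
```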

Move the File After Processing

Once the data package has successfully been put onto the queue, I move it to the Processed Invoices folder so it does not get picked up again.

Monitoring Messages in D365F&O

In D365F&O, when I go to the recurring job and click on Manage messages in the ribbon, I can see the messages put on the queue by the flow. In the following screenshot, you can see a queued message at the top of the list, ready for processing after I invoked the flow.


When I select the message and click on Execution details in the ribbon I get a status on how the import is going as shown below.


In this case, the job has successfully imported 3 headers, 2 lines and 3 images from the data package.

Import Result

Now, when I go to Accounts payable / Invoices / Pending invoices I see three invoices in the overview:


When I click on the AP.003 invoice, I am presented with the invoice data and the imported image as shown here.



What I have shown here is how easy it is to use recurring integration and the AP automation framework to import vendor invoices and related images into D365F&O. With Microsoft Flow the sting has been taken out of the mechanics of managing the files and calling the REST endpoint. What’s not to like?

Embedding a PowerApp in a Dynamics 365 for Finance and Operations Workspace

One of the best things about Dynamics 365 for Finance and Operations (D365FO) is the ability to embed PowerApps – especially in a workspace.

To embed a PowerApp in a workspace simply go to the Options menu point in the ribbon and select Personalize this form. Clicking on the … button allows you to add a PowerApp. You can place the PowerApp in an existing placeholder in the workspace.

The following screenshot shows an example where I have embedded a PowerApp called Transaction entry in the General journal processing workspace.

PowerApps 1.PNG

In the above example I have added a PowerApp that allows a user to quickly enter a general ledger journal transaction using a simple form. The Main account, Offset account and Currency fields are drop-downs bound to OData data sources (data entities) in D365FO, with some filtering to show the correct options. When the Save icon is pressed, the data captured in the app is stored in a SharePoint list using Microsoft Flow.

A second flow is kicked off automatically when a new item is created in the SharePoint list; it inserts a ledger journal transaction into D365FO using the Create record action in the Flow connector.

PowerApps 2.PNG

When inserting a ledger journal transaction, it is worth noting that the Account and Offset account fields require a syntax similar to the account picker in the regular form. The format is picked up from General ledger / Chart of accounts / Dimensions / Financial dimension configuration for integrating applications. The active ledger dimensions format is used for this type of integration. In this example the format is:

MainAccount-BusinessUnit-CostCenter-Department


Make sure to include the “-“.
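As a trivial illustration of this rule, a helper along these lines (a hypothetical function of my own, not part of any connector) builds the account string; the account and dimension values are made up.

```python
# Hypothetical helper; the dimension order must match the active ledger
# dimension format (here MainAccount-BusinessUnit-CostCenter-Department).
def ledger_dimension(main_account: str, *dimensions: str) -> str:
    """Join the main account and dimension values with the required '-'."""
    return "-".join((main_account,) + dimensions)

print(ledger_dimension("110110", "001", "007", "022"))
# → 110110-001-007-022
```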

Also, you need to implement the logic for creating a new ledger journal header to provide the ledger journal batch number if the PowerApp is not called from the ledger journal header screen.

That is about all you need to be able to provide users with simple guided data entry of ledger journal transactions using PowerApps. This is just a great and simple way to augment workspaces with even more relevant functionality.


Tools for Creating Rollout Templates in Dynamics 365 for Finance and Operations, Part 1

The holy grail of multi-national or enterprise ERP implementations has, for a long time, been the template-based approach. A template would ideally contain large chunks of pre-defined assets that can easily be deployed and validated within a business unit as part of a rollout.

The following figure shows four key elements of a rollout template.

Template Assets Overview

In this blog, I will be exploring the tools available in Dynamics 365 for Finance and Operations (D365FO) for each of these four areas. I have elected to split the blog into four pieces, one for each area, for the sake of readability.

This part explores how D365FO supports the Data area of the template.

Definition of Template Data

First, let us start with defining what template data is.

Distributed Template Data; To me, distributed template data is defined as:

Data that is relevant, but not necessarily common, across the legal entities in the instance. Distributed template data is used as a starting point in a legal entity and may or may not be redistributed at a later point when template data changes.

Shared Template Data; Shared template data on the other hand is defined as:

Data that is common and shared across the entire enterprise (or selected parts thereof).

Changes to shared template data will automatically affect all (selected) legal entities. More on this in the subsequent subsection Data Sharing.

The Shared Asset Library in Lifecycle Services

For the purpose of this blog, I assume a single instance deployment scenario. In this scenario, code is shared across all legal entities within the instance and is, as such, not part of my template definition.

A key repository for solution assets is the Shared Asset Library (SAL) in Lifecycle Services (LCS). Through the SAL it is possible for partners to share solution assets with customers and for customers to share solution assets across multiple projects.

In the following subsections, I will highlight the parts of the solution that can be used for managing and distributing template data.

Data Package

In the Data package section of the SAL you are able to upload data packages you have generated in D365FO Data Management. The following screenshot shows an example where I upload absence codes generated in my D365FO to the library:

Template 1.PNG

Once the data package has been uploaded to the SAL, it becomes available to all projects in the tenant. As shown below, the package is now available for import to a specific project (in this case the project called “TSP Demo (V8.1)”).

Template 2.PNG

Data packages in the shared asset library can be used to share pre-defined data across projects. In a template context it allows for easy distribution of the data relevant for deployment of the template.

Data Task Automation; Data packages from the SAL can be downloaded and imported automatically into D365FO using Data Task Automation (DTA), which is available in the Data management workspace. The automation tasks are configured through a manifest. The following figure shows an example of a DTA manifest file.

Test 31

The above manifest file can be loaded into Data management and results in the creation of a number of data automation tasks as shown below.

Template 3.PNG

The combination of data packages and data task automation will allow you to build a flexible framework that automates the generation of all relevant data in a new deployment from the template.

Configuration and Data Manager; The Configuration and Data Manager (CDM) is a simpler, but less flexible, alternative to the DTA. The CDM is available as a function in a project in LCS. Data packages imported to the project asset library automatically become available in the CDM.

In the following example, I am applying the absence codes data package to the D365FONOV18 environment.

Template 5.PNG

Subsequently, I select the legal entity to apply the selected data packages to within the instance (as shown below).

Template 6.PNG

In this case, applying absence codes to the “FRSI” legal entity fails because reference data is not yet available. Since data is being applied through a data project, I can analyse the failed import through the Data management workspace as shown in the following screenshot.

Template 7.PNG

Data Validation Checklist

Once template data has been successfully imported into a legal entity, it needs to be enriched and validated. For this task, D365FO offers the Data validation checklist (DVC) workspace (shown below).

Template 4.PNG

In the DVC, the test manager can configure any number of data validation checklists. A validation checklist consists of a number of tasks that can be grouped by area (configurable). The tasks are then sequenced and assigned to individual users. The user can open the menu point relating to the data directly from the workspace.

The DVC can be used to provide a structured approach to validating data imported from the template and enrich this data with data specific to the legal entity.

Data Sharing

As part of the template, a company may decide to share a section of data across the entire enterprise. For this purpose, D365FO offers the cross-company data sharing function. This function can be found under:

System administration / Setup / Configure cross-company data sharing

In the following example, I have created a new data sharing group called “Template Data Sharing” that shares absence codes and classroom groups across the “FRRT” and “ITCO” legal entities.

Template 8.PNG

Optimization Advisor

A lesser-known tool for validating data is the Optimization Advisor (OA). A while ago I wrote a blog post on the subject, so I will not spend time on how it works here. However, in a template context, the OA rules can be applied across all legal entities and used to validate data.


In the above, I have highlighted some D365FO tools that support the configuration and distribution of template data. These tools are:

  • Asset Library for managing template data packages.
  • Data Task Automation for flexible and automated distribution of template data.
  • The Configuration and Data Manager for simple distribution of template data across instances and legal entities.
  • Data Validation Checklist that enables the project manager or test manager to take a structured approach to validating and enriching data.
  • Data Sharing allowing (template) data to be automatically shared across selected legal entities.
  • Optimization Advisor providing automated rule-based data validation and self-healing capabilities.

In the next part of this blog post, I will take a closer look at how to work with D365FO processes in a template context.