Dynamics 365 for Finance and Operations Inventory Cost Model, Part 4: Landed Cost

This is the fourth instalment in my series of articles explaining the inventory cost model in Dynamics 365 for Finance and Operations (D365FO).

In the previous posts, we looked at the core concepts underpinning the inventory cost model in D365FO and how some of the key areas should be configured. In this post we will look more into how you work with landed cost.

 

I am not sure if there is a formal definition of what landed cost is, but for the purpose of this article landed cost is defined as:

Total cost of an inbound shipment of goods including product cost, freight, insurance and duties.
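
As a simple worked example: a shipment with a product cost of $1,000.00, freight of $60.00, insurance of $15.00 and duties of $25.00 has a landed cost of $1,000.00 + $60.00 + $15.00 + $25.00 = $1,100.00.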

The concept of charges

In D365FO, landed costs are recorded and managed through so-called Charges. A charge can be any type of landed cost and can be set up for products and suppliers. A charge is a relatively simple concept based on the following business logic:

  • A Charge can be applied to a purchase order header or lines manually or automatically.
  • A charge cannot be split across multiple purchase orders.
  • A purchase order header charge can be allocated across the purchase order lines (manually).
  • Purchase order line charges can be included in the product cost and taken into the inventory.
  • A charge can only be applied to the supplier on the purchase order – not a 3PL supplier.
  • Charges on a purchase order are perceived to be estimated – realised charges are connected to the invoice.

Configuring charges

Before we can use Charges in D365FO, they need to be configured. Since we are dealing with inbound charges, the configuration takes place in the Procurement and sourcing module under Setup / Charges.

Firstly, we must configure the necessary charge codes as shown in the following example.

Inventory 1.PNG

The key decision to make here is how to post the charge to the general ledger. In this case, FREIGHT, the cost (debit) is posted to a nominal ledger account (600120) and the debt (credit) is included in the amount owed to the supplier, and therefore posted to the accounts payable account set up for the supplier.

In the next example, FEE, the debit configuration is pointing to Item. This means that the charge will go to the inventory account for the item on the purchase order line and be included in the inventory value.

Inventory 2.PNG

Obviously, charge codes with debit posting set to Item are only relevant for purchase order line charges.

Once the charge codes have been defined, they can be used manually on a purchase order. However, most companies would like to have default charges applied automatically. This is set up as Automatic charges.

Firstly, I have set up a handling fee of $5.00 for vendor 1001 as shown in the following example. This is set up as a Header charge.

Inventory 3.PNG

This means that all purchase orders for vendor 1001 will automatically have a handling charge applied to the purchase order header.

Next, I have set up two line item charges for freight and fee respectively. The freight charge is multiplied by the quantity on the purchase order line. The fee charge is a fixed amount for each purchase order line.

Inventory 4.PNG

The charges are automatically applied to all purchase order lines for vendor 1001, item 1000.

I could also have set up automatic charges for a group of vendors or a group of items. These groups are maintained in special charges groups.

Use of charges in the purchasing process

When I create a new purchase order for vendor 1001, the handling charge is automatically applied as shown in the following screenshot.

Inventory 5.PNG

If required, I can add, change or delete charges in the purchase order header. Charges in the purchase order header are maintained under Maintain charges.

The purchase order header charge is a general charge that will not be included in the inventory value. The charge can be allocated to the purchase order lines instead by using the Allocate charges menu point as shown below.

Inventory 6.PNG

Now, when I create a purchase order line for item 1000, the freight and fee charges are automatically applied as shown here.

Inventory 7.PNG

Since the fee charge is fixed, it does not change with the order quantity whereas the freight charge does.

Once the purchase order has been confirmed to the supplier, changes to charges cannot be applied until the invoicing stage.

Invoice matching

At the invoicing stage, the user cannot remove the estimated charges on the purchase order. The purchase order charges are automatically connected to the invoice charges, but the user can remove the connection and apply new, corrected charges. This way, the user can match the invoice while still allowing comparison between estimated and actual charges for that purchase order.

Charges postings

The following screenshot shows the inventory transaction details for the purchase order line after the supplier invoice has been applied.

Inventory 8.PNG

As you can see, the purchase order line amount of $1,798.00 has been increased to $1,798.50 because of the $0.50 fixed fee charge. This charge has been included in the inventory value because the charge code was set up to post the debit side to the item.

If we look at the financial voucher for the purchase order as a whole, we can see that the $2.00 freight charge has been taken to the “Freight/transportation in” account and the $5.00 handling charge has gone to “Other miscellaneous expenses” account. Lastly, it is worth noting that the accumulated charges have been added to the accounts payable account as well.
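
Putting the numbers from this example together, the voucher balances as follows (assuming, as here, that all charges are payable to the supplier on the order):

  • Inventory (item): $1,798.00 + $0.50 = $1,798.50 (debit)
  • Freight/transportation in: $2.00 (debit)
  • Other miscellaneous expenses: $5.00 (debit)
  • Accounts payable: $1,805.50 (credit)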

Summary

The above example pretty much sums up what can be achieved with charges in D365FO. If your requirements involve distributing freight charges across multiple purchase orders, the Transportation management module may be applicable, but it is beyond the scope of this article.

As mentioned, the system keeps both the estimated and the realised charges, but I have yet to find a report that shows a comparison or any statistics.

 

 


Generating a PDF Document using Microsoft Flow

In my previous blog post, I explained how Flow can be used to automate intercompany invoicing. As part of that flow, I submitted the invoice for approval with the receiving legal entity before posting it. In the approval notification, it is possible to embed a link, and I thought it would be nice if the user could see a rendering of the invoice using that link before approving.

Therefore, I decided to expand my flow to also generate a PDF rendering of the invoice and add it to the approval notification.

Firstly, I am using the Word Online (Business) connector in Flow to generate a Word document. The connector takes a Word template as input.

The connector currently has some limitations. You can read more about the connector here.

The following picture shows the template I am using for my example.

IC 16.png

Flow uses so-called content controls within the Word document to populate the template. You must add the Developer tab to the ribbon to be able to add content controls to the template.

Once the template has been designed, it is time to connect data in Flow with the bookmarks (content controls) in the template. The following screenshot shows how this is done.

IC 17

Firstly, you need to point to where the Word template is located. Once the template is selected, the bookmarks in the template should automatically become visible. Now it is only a question of selecting the right content for each bookmark.

The cryptically named bookmark “1602840370” is a so-called repeating section (table row) in the Word template. Since an invoice can contain multiple lines, I need the Flow to dynamically expand the table when required. For the same reason, I am not able to statically assign content to the bookmarks. Instead, I use an array that I have generated using the Select Flow control as shown below.

IC 18

The input value to the select control is the output from fetching the invoice lines.

It is important that each key in the Select is named exactly like the bookmark in the Word template. Otherwise, it will not work.
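
To illustrate, if the repeating section contains three bookmarks named ItemId, ItemName and Amount (hypothetical names for this sketch), the Select control should produce an array along these lines:

[
  { "ItemId": "1000", "ItemName": "Surface Pro", "Amount": 1799.00 },
  { "ItemId": "1001", "ItemName": "Type Cover", "Amount": 129.00 }
]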

The next step is to save the Word document. I save the document to OneDrive as shown below.

IC 19

Subsequently, I use the Word document to generate a PDF document:

IC 20

This document also needs to be stored to allow the user to view it later:

IC 21

We now have both a Word and a PDF representation of the invoice. Now, we need to get a URL that points to the location of the PDF file:

IC 22

Lastly, I need to add the link to the approval component in the Flow:

IC 23

I add the URL (contained in the variable PDFFileName) to the Item link attribute in the approval component.

Now, when the user receives the approval e-mail, a link is available.

IC 25.png

When the user clicks on the link, a rendering of the intercompany invoice is displayed.

IC 24

Conclusion

Again, I think the above example shows how easy it is to augment and automate business processes with Flow. Within a few hours, I have been able to create a solution that automates a cumbersome business process and gives the users a better experience.

And remember, all of this is driven by business events from Dynamics 365 for Finance and Operations.

Using Flow to Generate Intercompany Invoices in Dynamics 365 for Finance and Operations

Intercompany invoicing in Dynamics 365 for Finance and Operations (D365F&O) is based on a sales order to purchase order relationship by design. In many cases this works fine, but in some scenarios all you want is a lightweight approach that allows a user to simply create an intercompany invoice without having to create sales orders and purchase orders.

Configuring a Business Event

To achieve this, I am using free text invoices in combination with Flow. As shown in the following screenshot I start by activating a business event for when a free text invoice is posted.

IC 1

Once the business event is in place, I need to configure a flow that takes the free text invoice data and converts it into an inbound invoice.

Configuring the Flow

As with any flow triggered by a business event from D365F&O, the flow needs to start with a “When a Business Event occurs” component as shown in the following picture.

IC 2

The flow is now triggered automatically by a free text invoice posted in the USMF legal entity. The next thing to do is to capture the free text invoice meta-data sent with the business event. This is done through a JSON-parser component as shown below.

IC 3

Now that I have access to the data sent with the business event, I would like to determine whether the customer is actually an intercompany customer. In my definition, any customer in customer group “90” is an intercompany customer. Firstly, I fetch the customer record as shown in the following screenshot.

IC 4

Once the customer is found, I simply apply a conditional statement:

IC 5

If the condition is true (the customer is an intercompany customer), I ask the receiving accounts payable team for approval before I create an invoice in their legal entity. The following picture shows the configuration of the approval.

IC 6

This results in an approval e-mail (or mobile notification if you are using the Flow mobile app) sent to the receiver as shown below.

IC 7.png

Once the request has been approved, the flow will continue to execute and in the next step, I fetch the free text invoice header:

IC 8

Subsequently, I use data from the free text invoice header to create a journal header in the invoice register in legal entity DEMF as the following screenshot shows.

IC 9

In the Description field in the invoice register header, I use the below expression to create a meaningful name for the journal.

concat('Intercompany invoice ', body('Get_Free_Text_Invoice_Header')?['InvoiceId'])
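
For a free text invoice with, say, InvoiceId FTI-000123 (a made-up number), the expression evaluates to “Intercompany invoice FTI-000123”, which makes the journal easy to trace back to its source.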

Since a free text invoice can contain many lines, I now traverse through the lines:

IC 10

For each invoice line I create a new line in the invoice register journal as the following picture shows.

IC 11

The Result

The following picture shows an example of a free text invoice.

IC 12

Once the free text invoice is posted, a business event is fired off and picked up by the above flow. The flow generates the following invoice register journal header:

IC 13

Within the invoice register journal, the following invoice line is created and populated with data from the free text invoice:

IC 14.png

Conclusion

With this example I have tried to show how simple it is to extend the functionality of D365F&O and automate business processes. Obviously, this is a prototype and in a real-world solution I would probably do more to parameterise the solution, for instance:

  • How is an intercompany customer defined?
  • How is the receiving legal entity determined?

As an alternative to using the invoice register, I could have used the pending invoices register. This would have allowed me to mirror the free text invoice lines with pending invoice lines, but this is a challenge for another day.

Business Events with Dynamics 365 for Finance and Operations

For a while I have been experimenting with the preview version of business events in Dynamics 365 for Finance and Operations (D365F&O), and now that it has been properly released, I have set up some examples to showcase how brilliant the whole thing is.

Business events in D365F&O are relatively well documented, and you can find the documentation here.

Customer Collection Example

Now, one of my examples is this:

When the collection status of a customer transaction changes, I would like to, as a courtesy to the customer, send an e-mail informing the customer of the status and perhaps the next step in the process.

Before I can set up the D365F&O side of this example, I need to create an endpoint, which subscribes to the business event. In this example I am using a Microsoft Flow as the endpoint.

The following screenshot shows the Flow in its entirety.

Event 1.PNG

As you can see, the Flow is triggered by a D365F&O When a Business Event occurs component. To configure the component, I need to enter the D365F&O URL and then select Category and Business Event respectively using the drop-downs. Lastly, I need to select the applicable legal entity. Now this flow is ready to trigger when this specific event happens in the selected instance of D365F&O. I will explain the rest of the Flow later in this post.

If you would like the event to be processed in an enterprise context without using specific user credentials, you would need to use Logic Apps instead of Flow. The configuration is the same, though.

Now back to D365F&O. I go to System Administration / Setup / Business Events / Business Events to activate the business event.

Event 2

After selecting the applicable business event in the library, I click on the + Activate function. This allows me to select the legal entity and bind the business event to the Flow I have just created above. As you can see from my example, I have two flows available, so I need to select the right one (the other one subscribes to a sales invoice business event).

Before going back to configure the Flow, download the JSON schema for this business event by clicking on Download schema. This downloads a text file that looks like this.

{"BusinessEventId":"", "ControlNumber":0, "CustAccount":"", "EventId":"", "EventTime":"/Date(-2208988800000)/", "FollowUpDate":"/Date(-2208988800000)/", "InvoiceId":"", "LegalEntity":"", "MajorVersion":0, "MinorVersion":0, "PreviousStatus":"", "Status":"", "TransactionAmount":0.0, "Voucher":""}

Now you have configured D365F&O to send business events. All you have to do is finish the Flow Configuration.

Since I would like my flow to send an e-mail to the customer, I start by initiating a text variable that holds the mail body as shown here.

Event 3.PNG

Next step is to take the content of the business event (a JSON message) and parse it. I use the Parse JSON component to do this. The schema used for this component is the one downloaded above (just paste it in).

Event 4.PNG

Depending on the status of the customer transaction, I would like to compose different e-mail bodies. For this, I use the Switch component in flow.

Event 5

As you can see, I can use the Status field from the JSON message I just parsed above as the variable for my switch test.

Event 6.PNG

As you can see in the above screenshot, I now populate the MailBody variable by using a combination of HTML and the InvoiceId variable from the JSON message. The business event sends a limited amount of information in the JSON message, but if I need more customer or invoice information, I can always use OData to query D365F&O.
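
As an illustration, the mail body for the Disputed case could be composed along these lines, where [InvoiceId] stands for the InvoiceId token from the parsed JSON (the wording and HTML are, of course, just an example):

<p>Dear customer,</p>
<p>We have registered invoice [InvoiceId] as disputed and have paused the collection process while we investigate.</p>
<p>Kind regards, the Collections team</p>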

In my example I create a Case component for each customer transaction status and compose mail bodies accordingly.

Lastly, I use the Outlook 365 component to send the e-mail to the customer as shown below.

Event 7.PNG

As you can see from the above, I use the MailBody variable to populate the e-mail body. I also set the Is HTML property to Yes to render the e-mail correctly.

The Result

So what happens? As the following screenshot shows, I change the status of a customer transaction from Not disputed to Disputed.

Event 8

Once the status has changed, it is picked up by the system and a business event is fired to the subscribing endpoints.

As the following screenshot shows, the flow is triggered by the business event and runs successfully.

Event 9.PNG

If we double-click on the Parse JSON component we can see the values passed from D365F&O as part of the business event payload.

Event 10.PNG

Lastly, let’s have a look at the e-mail the flow has generated.

Event 11.PNG

Extending the Business Event Library

D365F&O comes with a substantial number of predefined business events. However, if you need other business events to support your use cases, these can easily be built by following these instructions.

Summary

In the above, relatively simple example, I have tried to show how easy it is to subscribe to a D365F&O business event and use it to automate a process beyond the boundaries of D365F&O. Obviously, this functionality can also be used to send messages to other business applications and create a loosely coupled ecosystem based on events.

 

 

Using Recurring Integration to Import AP Invoices into Dynamics 365 for Finance and Operations

In a previous post, I explored the AP Automation framework used in Dynamics 365 for Finance and Operations (D365F&O) to import vendor invoices.

In this post, I am taking the concept one step further. I am invoking the AP automation framework through Microsoft Flow and the recurring integration framework.

Prerequisites

Before you can import vendor invoices there are a few things you need to do. These preparatory activities are described in the following subsections.

Data Package

The AP Automation framework is designed to import data packages – not individual files, so you need to be able to submit a data package to the service endpoint. A data package consists of the Manifest.xml file, the PackageHeader.xml file and data files for Vendor Invoice Header, Vendor Invoice Lines and Vendor Invoice Document Attachments respectively. The following screenshot shows my data package.

AP 12.PNG

The easiest way to generate the XML files is to export a data package from the Data Management workspace in D365F&O. This generates a .ZIP file you can extract.

The files you want to submit must be zipped before they can be put on the job queue. Each .ZIP file represents a data package.
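
If you want to script the zipping step, a minimal Python sketch could look like this (the file names match my example package above; yours may differ):

import zipfile

# The manifest, package header and data files that make up one data package
package_files = [
    "Manifest.xml",
    "PackageHeader.xml",
    "Vendor invoice header.xml",
    "Vendor invoice lines.xml",
    "Vendor invoice document attachments.xml",
]

# Each resulting .zip file represents one data package for the job queue
with zipfile.ZipFile("VendorInvoices.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for file_name in package_files:
        zf.write(file_name)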

Data Management Import Job

Next step is to create an import job in the Data Management workspace in D365F&O. The following screenshot shows the import job I will be using to import vendor invoices.

AP 1.PNG

Next step is to set up job recurrence. To do this, click on Create recurring data job in the ribbon. This governs how often the system should process messages put on the inbound message queue. This is shown in the following screenshot.

AP 2.PNG

Some of the settings in this screen are very important, namely:

  • ID = The GUID used to bind the enqueue http call to the job.
  • Application ID = The application ID registered for D365F&O in system administration.
  • Process messages in order = Must be set to Yes to ensure headers are processed before lines.
  • Supported data source type = Must be set to Data package to enable import of data packages.

Folder Structure

Lastly, I have set up two folders on my OneDrive:

  • Pending Invoices = This is where I place packages ready for import.
  • Processed Invoices = This is where packages are moved to after import.

Now we are set up to build the flow that puts data packages onto the job queue.

AP Automation Flow

Any third party application that can make calls to a REST endpoint can put data packages on the message queue for D365F&O. In this example I am using Microsoft Flow.

The following screenshot shows the required steps in all their glory.

AP 11.PNG

In the following subsections, I will be double-clicking on each box and explaining its configuration. As you can see, I am manually triggering this flow. In the real world, the flow would be triggered automatically, either through a schedule or when a file is created in the Pending Invoices folder. This can be automated using Microsoft Flow.

Read Files From Pending Invoice Folder

The first action in the flow is to get all files in the Pending Invoices folder as shown here.

AP 3.PNG

Get File Content

For each file in the folder I get the file content (binary data stream).

AP 4.PNG

Put Data Package onto the Job Queue

The following screenshot shows how you make an http POST call to the REST endpoint for the job queue set up earlier.

AP 5

In the http call you need, as a minimum, to configure the following properties:

  • Method = POST
  • URI = https://[D365F&O URI]/api/connector/enqueue/[ID]; ID is the job ID generated in the recurring job
  • Body = Binary file content for the data package
  • Authentication = Active Directory OAuth
  • Tenant = [Your Tenant]
  • Audience = Mandatory, but content is not used. I use the D365F&O URI.
  • Client ID = The application you have registered for OAuth in Azure
  • Secret = The client secret issued for the application in Azure

This call to the REST endpoint with “enqueue” will put the data package onto the message queue associated with the recurring job. Depending on the frequency of the recurring job, the message will be processed automatically.
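
For readers who prefer to see the call outside Flow, here is a minimal Python sketch of the same enqueue request, assuming an Azure AD app registration with a client secret (all names, IDs and file names below are placeholders):

import requests

tenant = "contoso.onmicrosoft.com"
client_id = "<application-id-registered-in-azure>"
client_secret = "<client-secret-for-the-application>"
d365fo_uri = "https://myenvironment.cloudax.dynamics.com"
job_id = "<recurring-data-job-id>"

# Acquire an OAuth token using the client credentials flow
token_response = requests.post(
    f"https://login.microsoftonline.com/{tenant}/oauth2/token",
    data={
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": d365fo_uri,
    },
)
access_token = token_response.json()["access_token"]

# POST the zipped data package to the enqueue endpoint of the recurring job
with open("VendorInvoices.zip", "rb") as package:
    response = requests.post(
        f"{d365fo_uri}/api/connector/enqueue/{job_id}",
        headers={"Authorization": f"Bearer {access_token}"},
        data=package.read(),
    )
print(response.status_code, response.text)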

Move the File After Processing

Once the data package has successfully been put onto the queue, I move it to the Processed Invoices folder so it does not get picked up again.

Monitoring Messages in D365F&O

In D365F&O when I go to the recurring job and click on Manage messages in the ribbon, I can see the messages put on the queue by the flow. In the following screenshot you can see that at the top of the list, I have a queued message ready for processing after I invoked the flow.

AP 7.PNG

When I select the message and click on Execution details in the ribbon I get a status on how the import is going as shown below.

AP 8.PNG

In this case, the job has successfully imported 3 headers, 2 lines and 3 images from the data package.

Import Result

Now, when I go to Accounts payable / Invoices / Pending invoices I see three invoices in the overview:

AP 9.PNG

When I click on the AP.003 invoice, I am presented with the invoice data and the imported image as shown here.

AP 10.PNG

Conclusion

What I have shown here is how easy it is to use recurring integration and the AP automation framework to import vendor invoices and related images into D365F&O. With Microsoft Flow the sting has been taken out of the mechanics of managing the files and calling the REST endpoint. What’s not to like?

Embedding a PowerApp in a Dynamics 365 for Finance and Operations Workspace

One of the best things about Dynamics 365 for Finance and Operations (D365FO) is the ability to embed PowerApps – especially in a workspace.

To embed a PowerApp in a workspace simply go to the Options menu point in the ribbon and select Personalize this form. Clicking on the … button allows you to add a PowerApp. You can place the PowerApp in an existing placeholder in the workspace.

The following screenshot shows an example where I have embedded a PowerApp called Transaction entry in the General journal processing workspace.

PowerApps 1.PNG

In the above example I have added a PowerApp that allows a user to quickly enter a general ledger journal transaction using a simple form. The Main account, Offset account and Currency fields are drop-downs bound to OData data sources (data entities) in D365FO with some filtering to show the correct options. When the Save icon is pressed, data captured in the app is stored in a SharePoint list using Microsoft Flow.

A second flow is automatically kicked off when a new item is created in the SharePoint list and a ledger journal transaction is inserted into D365FO using the Create record action in the Flow connector.

PowerApps 2.PNG

When inserting a ledger journal transaction, it is worth noting that the Account and Offset account fields require a syntax similar to the account picker in the normal form. The format is picked up from General ledger / Chart of accounts / Dimensions / Financial dimension configuration for integrating applications. The active ledger dimensions format is used for this type of integration. In this example the format is:

MainAccount-BusinessUnit-CostCenter-Department

for instance:

110101-010--2000

Make sure to include the “-” delimiter for every dimension, even when a dimension is left blank (as the cost centre is in the example above).

Also, you need to implement the logic for creating a new ledger journal header to provide the ledger journal batch number if the PowerApp is not called from the ledger journal header screen.

That is about all you need to be able to provide users with simple guided data entry of ledger journal transactions using PowerApps. This is just a great and simple way to augment workspaces with even more relevant functionality.

 

Tools for Creating Rollout Templates in Dynamics 365 for Finance and Operations, Part 1

The holy grail of multi-national or enterprise ERP implementations has, for a long time, been the template-based approach. A template would ideally contain large chunks of pre-defined assets that can easily be deployed and validated within a business unit as part of a rollout.

The following figure shows four key elements of a rollout template.

Template Assets Overview

In this blog, I will be exploring the tools available in Dynamics 365 for Finance and Operations (D365FO) for each of these four areas. I have elected to split the blog into four pieces, one for each area, for the sake of readability.

This part explores how D365FO supports the Data area of the template.

Definition of Template Data

First, let us start with defining what template data is.

Distributed Template Data; To me, distributed template data is defined as:

Data that is relevant, but not necessarily common, across the legal entities in the instance. Distributed template data is used as a starting point in a legal entity and may or may not be redistributed at a later point when template data changes.

Shared Template Data; Shared template data on the other hand is defined as:

Data that is common and shared across the entire enterprise (or selected parts thereof).

Changes to shared template data will automatically affect all (selected) legal entities. More on this in the subsequent subsection Data Sharing.

The Shared Asset Library in Lifecycle Services

For the purpose of this blog, I assume a single instance deployment scenario. In this scenario, code is shared across all legal entities within the instance and is, as such, not part of my template definition.

A key repository for solution assets is the Shared Asset Library (SAL) in Lifecycle Services (LCS). Through the SAL it is possible for partners to share solution assets with customers and for customers to share solution assets across multiple projects.

In the following subsections, I will be highlighting the parts of the solution, which can be used for managing and distributing template data.

Data Package

In the Data package section of the SAL you are able to upload data packages you have generated in D365FO Data Management. The following screenshot shows an example where I upload absence codes generated in my D365FO to the library:

Template 1.PNG

Once the data package has been uploaded to the SAL, it becomes available to all projects in the tenant. As shown below, the package is now available for import to a specific project (in this case the project called “TSP Demo (V8.1)”).

Template 2.PNG

Data packages in the shared asset library can be used to share pre-defined data across projects. In a template context it allows for easy distribution of the data relevant for deployment of the template.

Data Task Automation; Data packages from the SAL can be downloaded and imported automatically into D365FO using Data Task Automation (DTA), which is available in the Data management workspace. The automation tasks are configured through a manifest. The following figure shows an example of a DTA manifest file.

Test 31

The above manifest file can be loaded into Data management and results in the creation of a number of data automation tasks as shown below.

Template 3.PNG

The combination of data packages and data task automation will allow you to build a flexible framework that automates the generation of all relevant data in a new deployment from the template.

Configuration and Data Manager; The Configuration and Data Manager (CDM) is a simpler, but less flexible, alternative to the DTA. The CDM is available as a function in a project in LCS. Data packages imported to the project asset library automatically become available in the CDM.

In the following example, I am applying the absence codes data package to the D365FONOV18 environment.

Template 5.PNG

Subsequently, I select the legal entity to apply the selected data packages to within the instance (as shown below).

Template 6.PNG

In this case, applying absence codes to the “FRSI” legal entity fails because reference data is not yet available. Since data is being applied through a data project, I can analyse the failed import through the Data management workspace as shown in the following screenshot.

Template 7.PNG

Data Validation Checklist

Once template data has been successfully imported into a legal entity, it needs to be enriched and validated. For this task, D365FO offers the Data validation checklist (DVC) workspace (shown below).

Template 4.PNG

In the DVC, the test manager can configure any number of data validation checklists. A validation checklist consists of a number of tasks that can be grouped by area (configurable). The tasks are then sequenced and assigned to individual users. The user can open the menu point relating to the data directly from the workspace.

The DVC can be used to provide a structured approach to validating data imported from the template and enrich this data with data specific to the legal entity.

Data Sharing

As part of the template, a company may decide to share a section of data across the entire enterprise. For this purpose, D365FO offers the cross-company data sharing function. This function can be found under:

System administration / Setup / Configure cross-company data sharing

In the following example, I have created a new data sharing group called “Template Data Sharing” that shares absence codes and classroom groups across the “FRRT” and “ITCO” legal entities.

Template 8.PNG

Optimization Advisor

A lesser-known tool for validating data is the Optimization Advisor (OA). A while ago I wrote a blog post on the subject, so I will not spend time on how it works here. However, in a template context, the OA rules can be used across all legal entities to validate data.

Conclusion

In the above, I have highlighted some D365FO tools that support the configuration and distribution of template data. These tools are:

  • Asset Library for managing template data packages.
  • Data Task Automation for flexible and automated distribution of template data.
  • The Configuration and Data Manager for simple distribution of template data across instances and legal entities.
  • Data Validation Checklist that enables the project manager or test manager to take a structured approach to validating and enriching data.
  • Data Sharing allowing (template) data to be automatically shared across selected legal entities.
  • Optimization Advisor providing automated rule-based data validation and self-healing capabilities.

In the next part of this blog post I will take a closer look at how to work with D365FO processes in a template context.

Using OData to Count Records in a Table in Dynamics 365 for Finance and Operations

With no direct access to the Dynamics 365 for Finance and Operations (D365FO) database you are relying on OData to execute your queries. A common scenario is the need to count the number of records in a table. The following URL template offers a quick way to do this:

[d365fo-url]/data/[data-entity]/$count

If, for instance, I would like to count the number of customers in my D365FO instance, the URL would look like this:

d365fo384738737aos.cloudax.dynamics.com/data/CustomersV3/$count

A quick and easy way to count records in a table.

The $count function can obviously be combined with the $filter function if you only need to count a subset of records. By adding the following statement to the URL, the request returns the number of customers in group '10':

?$filter=CustomerGroupId eq '10'
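
If you are calling the endpoint from code rather than a browser, a minimal Python sketch could look like this (token acquisition is not shown; the bearer token is assumed to have been obtained from Azure AD already):

import requests

base_url = "https://d365fo384738737aos.cloudax.dynamics.com"
access_token = "<access-token-from-azure-ad>"

# $count returns the number of records matching the (optional) filter
response = requests.get(
    base_url + "/data/CustomersV3/$count?$filter=CustomerGroupId eq '10'",
    headers={"Authorization": "Bearer " + access_token},
)
print(response.text)  # the count is returned as plain text, e.g. 42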

Using PowerApps and Flow with Dynamics 365 for Finance and Operations, Part 1

With PowerApps and Flow we have a very strong toolbox that allows us to augment standard Dynamics 365 for Finance and Operations (D365FO) features and provide “last-mile” solutions. In this two-part series I explore how we can use PowerApps to augment standard functionality and Flow to provide sophisticated data processing capabilities.

The Scenario

In the following example I am trying to support the following scenario:

The company is keeping track of what equipment is on loan to employees through the Loaned equipment function in the Human resources module. An employee can open a PowerApp and scan the bar code on the equipment to start a new loan. This will close the existing loan record and create a new loan record. In addition, the HR-administrator is informed about the loan through an e-mail.

The following screenshot shows the screen in my PowerApp where the user can scan the bar code.

PowerApps 1.PNG

Data Flow

To support the above scenario I need data to flow from the PowerApp via Flow to D365FO. The data flow is described in the following steps:

  1. The PowerApp invokes a flow and sends parameter-data to the flow.
  2. The data is stored in a SharePoint list (optional step).
  3. This flow also updates any existing equipment loan record in D365FO.
  4. A second flow is triggered when a new item is created in the SharePoint list. The flow sends an e-mail to the HR-administrator informing of the loan with a link to the item in the SharePoint list and subsequently creates a new equipment loan record in D365FO.

In this part of the series I will look at points one and two. In the next part we will be looking at points three and four.

Invoking a Flow from the PowerApp

The first thing we need to do is to invoke a Flow from the PowerApp to get the process going. However, before I go into PowerApps to configure this, I need to set up the Flow to make it available to PowerApps. To make a Flow available in PowerApps, the first step in the Flow needs to be a PowerApps-trigger as shown in the below example.

PowerApps 2

Next step is to determine what data (parameters) I need to pass from PowerApps to drive my Flow. As the following screenshot shows, I need Barcode, Purpose, UserId, FromDate and ToDate as data from the PowerApp.

PowerApps 3

This data is used to create the SharePoint item. When linking a parameter in the PowerApp to a field in the SharePoint list, I need to use the dynamic content item called “Ask in PowerApps”. This connects the PowerApp parameter with the SharePoint list field when the Flow is called.

PowerApps 4.PNG

To invoke a flow from PowerApps you need to use the OnSelect method on a control (in my case the control is called “Icon5”). On the Action pane in the ribbon, use the Flows button to link the control to a flow.

As shown in the following screenshot, this brings up a dialogue that allows me to select between the flows available in my organisation.

PowerApps 5.PNG

In this case, I have selected the flow called “PowerApp->Createitem”. This immediately posts the following code into the OnSelect method on the Icon5 control:

'PowerApp->Createitem'.Run(

PowerApps is now linked to the Flow, and I must fill in the parameters I have requested in the Flow. This results in the following OnSelect method code:

'PowerApp->Createitem'.Run(Label3.Text; TextInput2.Text; Gallery1_1.Selected.'Partid'; DatePicker1.SelectedDate; DatePicker1_2.SelectedDate)

When clicking on the Icon5 control in the PowerApp, the Flow will now be invoked with the data defined in the parameter-string.

Updating a Record in D365FO

As mentioned above, all I need to do now is to update existing open equipment loan records in D365FO. This is due to a business rule on the LoanedEquipments data entity that does not allow new loans if a loan exists without a returned date.

To do this, I use the Flow connector called “Dynamics 365 for Finance and Operations”. This connector has an action called “Get records” and as the following screenshot shows, I use this action to select all open equipment loans.

PowerApps 6.PNG

Two things worth noting in the Filter Query I am using to find the correct records:

  1. An empty date in a D365FO record is defined as “1900-01-01T12:00:00Z”.
  2. I can use dynamic content in the query, but it must be enclosed in single quotes ('').

Please note: Depending on your locale, the statement separator can be either “,” or “;” and the system may expect single or double quotes.
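
Putting these together, my Filter Query ends up looking something along these lines (the field names are illustrative; check the LoanedEquipments data entity for the exact names):

ReturnedDate eq 1900-01-01T12:00:00Z and Barcode eq '[barcode passed from the PowerApp]'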

For each record found by the query, I use an Apply to each construct to update the returned date (sorry for the Danish) in the LoanedEquipments data entity.

Conclusion

As the above example shows, it is relatively simple to create functionality that combines a user interface (PowerApps) with data processing (Flow) using the Power Platform in combination with D365FO. In my example I also create an item in a SharePoint list. This is mainly to demonstrate this capability and could have been omitted.

In the next part in this series, I will show how to create a record in D365FO based on the data stored in the SharePoint list. Stay tuned…

 

 

 

Test Automation Suite for Dynamics 365 for Finance and Operations, Part 2

In the first part of this series, I took a look at how you configure the Test Automation Suite (TAS) for Dynamics 365 for Finance and Operations (D365FO). Now it is time to take a look at how you use TAS for test automation.

Test Automation Suite Example

As mentioned in the earlier post, the starting point for a test case in TAS is a task recording. In my example, I am using a task recording of how to create a new customer as shown in the following screenshot:

Test 20.PNG

As the following screenshot shows, I have uploaded the task recording to my process library in the Business Process Modeler (BPM) in Lifecycle Services (LCS).

Test 21.PNG

The task recording is now a process, and it becomes an epic or user story (depending on level) in Azure DevOps when the two tools synchronise, but it is not yet a test case. To make the process into a test case, you must create at least one requirement for it in BPM as shown below.

Test 22.PNG

In Azure DevOps, each requirement becomes a user story, but only one test case is created – for the entire process. This is shown in the below query from Azure DevOps.

Test 23

In the previous post, we created a test plan in Azure DevOps. Now it is time to add test cases to this test plan. In my example I only have one test case, so it is relatively easy to add it to the test plan as shown below.

Test 24.PNG

With the test plan in place, it is time to switch to the TAS. Before you are able to use the TAS, you must configure the settings as described in the previous blog post. Once this has been done, you can click on Load and the test plan is loaded from Azure DevOps. If the Parameters File field is empty, you must click on New to populate the file. This file contains the test data for the test case.

Test 25.PNG

When clicking on Edit, an Excel spreadsheet opens up with test data captured in the task recording, as the above screenshot shows. This spreadsheet is also available for editing in the folder configured under settings.

Please note: before clicking on Run to execute the automated test, you must make sure the display zoom factor on your PC (not the VM) has been set to 100%; otherwise, the Internet Explorer driver on the VM will fail.

Now, when clicking Run in the TAS, the system logs in to D365FO and starts to automatically simulate the test. The test will be marked “Passed” in the Result field if it succeeds, or “Failed” if it fails.

If I go back to Azure DevOps under Test plans / Runs, the query shows a list of all the test runs I have carried out and their status as shown below.

Test 26.PNG

You can obviously drill-down from here and use the Azure DevOps tools to investigate further and take appropriate action.

Test Case Meta-Data

One last thing worth exploring is what data is carried across to a test case in Azure DevOps. As you can see from the following screenshot, the test case in Azure DevOps automatically inherits the task recording steps from BPM.

Test 27.PNG

If you are not using automated testing, this would still allow the user to manually perform a test based on task recordings.

Also, as this screenshot shows, the test data used in TAS is stored with the test case.

Test 28.PNG

Conclusion

As I have shown in this blog post, we are now able to automate testing based on data from BPM and Azure DevOps without developer assistance. I am sure we will see the TAS develop further in future, but for now we have a strong tool to support our agile projects and regression testing during continuous delivery.

In my third, and last, post on the subject I will be looking at how to investigate failed test cases and chaining of test cases in TAS.

 

 

 

Test Automation Suite for Dynamics 365 for Finance and Operations, Part 1

The Test Automation Suite

As you may already know, with V8.1 of Dynamics 365 for Finance and Operations we now have a Test Automation Suite (TAS) that allows an administrator to define and orchestrate automated testing.

However, the TAS is not a standalone tool. It works in conjunction with Azure DevOps and the Business Process Modeler (BPM) in Lifecycle Services (LCS). The three individual tools are integrated and play the following roles:

  • BPM: The BPM holds the process model that contains the task recordings that are used in the test plan. For more information on how to create task recordings, please see this blog post.
  • Azure DevOps: Azure DevOps is where you configure the test plan and follow-up on the test status.
  • TAS: In the TAS you are able to load test plan data from Azure DevOps, configure the automated test and run it.

In this blog post we will be looking at how to configure the three tools to allow synchronisation and testing. In part 2 of this small series, we will look more closely at how to perform the actual test.

Setting up the Business Process Modeler

The first step to get things going is to configure the BPM and its integration with Azure DevOps.

LCS Project Settings

To start with, you need to configure the integration between LCS and Azure DevOps. This article explains how to set this up. As the following screenshot shows, I have connected my LCS project with Azure DevOps.

Test 2

As you can also see from the screenshot, LCS automatically determines how work item types in LCS are linked to work item types in Azure DevOps.

With this working, the next step is to set up a process model.

Test 3.PNG

You can obviously use an existing process model in BPM to synchronise with Azure DevOps, but in my example, I have created a new model called “Test Case Library”.

Please note that you need to use the new BPM experience in LCS to get access to this view.

Azure DevOps Test Plan

Assuming you have already managed to set up a project in Azure DevOps, it is now time to create a test plan.

Test 6.PNG

As the above screenshot shows, all you need to do initially is to create a test plan. Unless you have specifically designed a test process with multiple iterations, you can use the defaults.

Installing and Configuring the TAS

Now we come to the most difficult part of the process: installing the TAS on the machine where you intend to execute the automated test. In my case, I am using a sandbox environment, so I use RDP to access the virtual machine (VM).

The installation download and instructions are available here.

Follow the instructions closely and everything should be fine. However, one thing worth bearing in mind:

If you copy the certificate thumbprint from the certificate tool, please be aware that it contains invisible Unicode characters. I had to copy it into Notepad and then save it as a text file to clean it up.

Configuring the TAS

Once the TAS has been installed, it is time to configure your automated test. The above instructions should take you through this, but here are a couple of hints:

  • Azure DevOps URL: If you do not remember the URL, you can copy it from your LCS project settings.
  • Access Token: This is the token you generated when linking Azure DevOps to LCS.
  • Hostname: Simply the URL for your D365FO instance.
  • SOAP Hostname: The above hostname, but insert “.soap” between “aos” and “cloudax” in the URL. Example: “https://d365cbd45b49961q2970aos.soap.cloudax.dynamics.com”

With this last bit of configuration, your TAS should be ready to go.

In part 2 of this series, I will use the TAS to load test cases from Azure DevOps and execute an automated test. I will also take a look at how you can add additional test data to the test case to make it more comprehensive. Lastly, we will look at how the test results can be analysed in Azure DevOps.

In part 3, I expect to take a look at how we can chain test cases together and pass variables between them.

I would like to say a BIG thank you to Palle Agermark, who has helped me through a couple of sticky points in the installation process. Thanks, Palle, you have been a great help – as always.