Dynamics 365 for Finance and Operations Inventory Cost Model, Part 4: Landed Cost

This is the fourth instalment in my series of articles explaining the inventory cost model in Dynamics 365 for Finance and Operations (D365FO).

In the previous posts, we looked at the core concepts underpinning the inventory cost model in D365FO and how some of the key areas should be configured. In this post we will look more into how you work with landed cost.

 

I am not sure if there is a formal definition of what landed cost is, but for the purpose of this article landed cost is defined as:

Total cost of an inbound shipment of goods including product cost, freight, insurance and duties.

The concept of charges

In D365FO, landed costs are recorded and managed through so-called Charges. A charge can be any type of landed cost and can be set up for products and suppliers. A charge is a relatively simple concept based on the following business logic:

  • A Charge can be applied to a purchase order header or lines manually or automatically.
  • A charge cannot be split across multiple purchase orders.
  • A purchase order header charge can be allocated across the purchase order lines (manually).
  • Purchase order line charges can be included in the product cost and taken into the inventory.
  • A charge can only be applied to the supplier on the purchase order – not a 3PL supplier.
  • Charges on a purchase order are considered estimates – realised charges are connected to the invoice.

Configuring charges

Before we can use Charges in D365FO, they need to be configured. Since we are dealing with inbound charges, the configuration takes place in the Procurement and sourcing module under Setup / Charges.

Firstly, we must configure the necessary charge codes as shown in the following example.

Inventory 1.PNG

The key decision to make here is how to post the charge to the general ledger. In this case, FREIGHT, the cost (debit) is posted to a nominal ledger account (600120), and the credit is included in the amount owed to the supplier and therefore posts to the accounts payable account set up for the supplier.

In the next example, FEE, the debit configuration is pointing to Item. This means that the charge will go to the inventory account for the item on the purchase order line and be included in the inventory value.

Inventory 2.PNG

Obviously, charge codes with debit posting set to Item are only relevant for purchase order line charges.

Once the charge codes have been defined, they can be used manually on a purchase order. However, most companies would like to have default charges applied automatically. This is set up as Automatic charges.

Firstly, I have set up a handling fee of $5.00 for vendor 1001 as shown in the following example. This is set up as a Header charge.

Inventory 3.PNG

This means that all purchase orders for vendor 1001 will automatically have a handling charge applied to the purchase order header.

Next, I have set up two line item charges for freight and fee respectively. The freight charge is multiplied by the quantity on the purchase order line. The fee charge is a fixed amount for each purchase order line.

Inventory 4.PNG

The charges are automatically applied to all purchase order lines for vendor 1001, item 1000.

I could also have set up automatic charges for a group of vendors or a group of items. These groups are maintained in special charges groups.

Use of charges in the purchasing process

When I create a new purchase order for vendor 1001, the handling charge is automatically applied as shown in the following screenshot.

Inventory 5.PNG

If required, I can add, change or delete charges in the purchase order header. Charges in the purchase order header are maintained under Maintain charges.

The purchase order header charge is a general charge that will not be included in the inventory value. The charge can be allocated to the purchase order lines instead by using the Allocate charges menu point as shown below.

Inventory 6.PNG

Now, when I create a purchase order line for item 1000, the freight and fee charges are automatically applied as shown here.

Inventory 7.PNG

Since the fee charge is fixed, it does not change with the order quantity whereas the freight charge does.
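
To illustrate with assumed numbers (the screenshots do not show the per-unit freight rate), a freight charge of $0.20 per unit adds $2.00 to a 10-piece line and $4.00 to a 20-piece line, while the $0.50 fee stays the same on both lines.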

Once the purchase order has been confirmed to the supplier, changes to charges cannot be applied until the invoicing stage.

Invoice matching

At the invoicing stage, the user cannot remove the estimated charges on the purchase order. The purchase order charges are automatically connected to the invoice charges, but the user can remove the connection and apply new, corrected charges. This way, the user can match the invoice while still allowing a comparison between estimated and actual charges for that purchase order.

Charge postings

The following screenshot shows the inventory transaction details for the purchase order line after the supplier invoice has been applied.

Inventory 8.PNG

As you can see, the purchase order line amount of $1,798.00 has been increased to $1,798.50 because of the $0.50 fixed fee charge. This charge has been included in the inventory value because the charge code was set up to post the debit side to the item.

If we look at the financial voucher for the purchase order as a whole, we can see that the $2.00 freight charge has been taken to the “Freight/transportation in” account and the $5.00 handling charge has gone to the “Other miscellaneous expenses” account. Lastly, it is worth noting that the accumulated charges have been added to the accounts payable account as well.
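
Pieced together, and assuming all three charge codes credit the supplier's accounts payable account (the credit side is not visible in the screenshots), the voucher breaks down roughly as follows:

  • Inventory: $1,798.00 purchase order line plus the $0.50 fee = $1,798.50 debit.
  • Freight/transportation in: $2.00 debit.
  • Other miscellaneous expenses: $5.00 debit.
  • Accounts payable: $1,805.50 credit ($1,798.00 for the goods plus $7.50 of accumulated charges).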

Summary

The above example pretty much sums up what can be achieved with charges in D365FO. If your requirements involve distributing freight charges across multiple purchase orders, the Transportation management module may be applicable, but it is beyond the scope of this article.

As mentioned, the system keeps both the estimated and the realised charges, but I have yet to find a report that shows a comparison or any statistics.

 

 


Embedding a PowerApp in a Dynamics 365 for Finance and Operations Workspace

One of the best things about Dynamics 365 for Finance and Operations (D365FO) is the ability to embed PowerApps – especially in a workspace.

To embed a PowerApp in a workspace simply go to the Options menu point in the ribbon and select Personalize this form. Clicking on the … button allows you to add a PowerApp. You can place the PowerApp in an existing placeholder in the workspace.

The following screenshot shows an example where I have embedded a PowerApp called Transaction entry in the General journal processing workspace.

PowerApps 1.PNG

In the above example I have added a PowerApp that allows a user to quickly enter a general ledger journal transaction using a simple form. The Main account, Offset account and Currency fields are drop-downs bound to OData data sources (data entities) in D365FO with some filtering to show the correct options. When the Save icon is pressed, data captured in the app is stored in a SharePoint list using Microsoft Flow.

A second flow is automatically kicked off when a new item is created in the SharePoint list and a ledger journal transaction is inserted into D365FO using the Create record action in the Flow connector.

PowerApps 2.PNG

When inserting into a ledger journal transaction, it is worth noting that the Account and Offset account fields require a syntax similar to the account-picker in the normal form. The format is picked up from General ledger / Chart of accounts / Dimensions / Financial dimension configuration for integrating applications. The active ledger dimensions format is used for this type of integration. In this example the format is:

MainAccount-BusinessUnit-CostCenter-Department

For instance:

110101-010--2000

Make sure to include each "-" delimiter, even when a segment is blank.
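
As a minimal sketch (the helper below is hypothetical and not part of D365FO or the Flow connector; it only shows how the segments join up), the account string could be built like this:

def build_ledger_dimension(main_account, business_unit="", cost_center="", department=""):
    # Join the segments in the active ledger dimensions format
    # MainAccount-BusinessUnit-CostCenter-Department; blank segments
    # must keep their "-" delimiter.
    return "-".join([main_account, business_unit, cost_center, department])

# Prints "110101-010--2000" - the empty cost centre keeps its delimiter.
print(build_ledger_dimension("110101", "010", "", "2000"))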

Also, you need to implement the logic for creating a new ledger journal header to provide the ledger journal batch number if the PowerApp is not called from the ledger journal header screen.

That is about all you need to be able to provide users with simple guided data entry of ledger journal transactions using PowerApps. This is just a great and simple way to augment workspaces with even more relevant functionality.

 

Tools for Creating Rollout Templates in Dynamics 365 for Finance and Operations, Part 1

The holy grail of multi-national or enterprise ERP implementations has, for a long time, been the template-based approach. A template would ideally contain large chunks of pre-defined assets that can easily be deployed and validated within a business unit as part of a rollout.

The following figure shows four key elements of a rollout template.

Template Assets Overview

In this blog, I will be exploring the tools available in Dynamics 365 for Finance and Operations (D365FO) for each of these four areas. I have elected to split the blog into four pieces, one for each area, for the sake of readability.

This part explores how D365FO supports the Data area of the template.

Definition of Template Data

First, let us start with defining what template data is.

Distributed Template Data: To me, distributed template data is defined as:

Data that is relevant, but not necessarily common, across the legal entities in the instance. Distributed template data is used as a starting point in a legal entity and may or may not be redistributed at a later point when template data changes.

Shared Template Data: Shared template data, on the other hand, is defined as:

Data that is common and shared across the entire enterprise (or selected parts thereof).

Changes to shared template data will automatically affect all (selected) legal entities. More on this in the subsequent subsection Data Sharing.

The Shared Asset Library in Lifecycle Services

For the purpose of this blog, I assume a single instance deployment scenario. In this scenario, code is shared across all legal entities within the instance and is, as such, not part of my template definition.

A key repository for solution assets is the Shared Asset Library (SAL) in Lifecycle Services (LCS). Through the SAL it is possible for partners to share solution assets with customers and for customers to share solution assets across multiple projects.

In the following subsections, I will highlight the parts of the solution that can be used for managing and distributing template data.

Data Package

In the Data package section of the SAL you are able to upload data packages you have generated in D365FO Data Management. The following screenshot shows an example where I upload absence codes generated in my D365FO to the library:

Template 1.PNG

Once the data package has been uploaded to the SAL, it becomes available to all projects in the tenant. As shown below, the package is now available for import to a specific project (in this case the project called “TSP Demo (V8.1)”).

Template 2.PNG

Data packages in the shared asset library can be used to share pre-defined data across projects. In a template context it allows for easy distribution of the data relevant for deployment of the template.

Data Task Automation: Data packages from the SAL can be downloaded and imported automatically into D365FO using Data Task Automation (DTA), which is available in the Data management workspace. The automation tasks are configured through a manifest. The following figure shows an example of a DTA manifest file.

Test 31

The above manifest file can be loaded into Data management and results in the creation of a number of data automation tasks as shown below.

Template 3.PNG

 The combination of data packages and data task automation will allow you to build a flexible framework that automates the generation of all relevant data in a new deployment from the template.

Configuration and Data Manager: The Configuration and Data Manager (CDM) is a simpler, but less flexible, alternative to the DTA. The CDM is available as a function in a project in LCS. Data packages imported to the project asset library automatically become available in the CDM.

In the following example, I am applying the absence codes data package to the D365FONOV18 environment.

Template 5.PNG

Subsequently, I select the legal entity to apply the selected data packages to within the instance (as shown below).

Template 6.PNG

In this case, applying absence codes to the “FRSI” legal entity fails because reference data is not yet available. Since data is being applied through a data project, I can analyse the failed import through the Data management workspace as shown in the following screenshot.

Template 7.PNG

Data Validation Checklist

Once template data has been successfully imported into a legal entity, it needs to be enriched and validated. For this task, D365FO offers the Data validation checklist (DVC) workspace (shown below).

Template 4.PNG

In the DVC, the test manager can configure any number of data validation checklists. A validation checklist consists of a number of tasks that can be grouped by area (configurable). The tasks are then sequenced and assigned to individual users. The user can open the menu point relating to the data directly from the workspace.

The DVC can be used to provide a structured approach to validating data imported from the template and enrich this data with data specific to the legal entity.

Data Sharing

As part of the template, a company may decide to share a section of data across the entire enterprise. For this purpose, D365FO offers the cross-company data sharing function. This function can be found under:

System administration / Setup / Configure cross-company data sharing

In the following example, I have created a new data sharing group called “Template Data Sharing” that shares absence codes and classroom groups across the “FRRT” and “ITCO” legal entities.

Template 8.PNG

Optimization Advisor

A lesser-known tool for validating data is the Optimization Advisor (OA). A while ago I wrote a blog post on the subject, so I will not spend time on how it works here. However, in a template context, OA rules can be applied across all legal entities to validate data.

Conclusion

In the above, I have highlighted some D365FO tools that support the configuration and distribution of template data. These tools are:

  • Asset Library for managing template data packages.
  • Data Task Automation for flexible and automated distribution of template data.
  • The Configuration and Data Manager for simple distribution of template data across instances and legal entities.
  • Data Validation Checklist that enables the project manager or test manager to take a structured approach to validating and enriching data.
  • Data Sharing allowing (template) data to be automatically shared across selected legal entities.
  • Optimization Advisor providing automated rule-based data validation and self-healing capabilities.

In the next part of this blog post, I will take a closer look at how to work with D365FO processes in a template context.

Using OData to Count Records in a Table in Dynamics 365 for Finance and Operations

With no direct access to the Dynamics 365 for Finance and Operations (D365FO) database you are relying on OData to execute your queries. A common scenario is the need to count the number of records in a table. The following URL template offers a quick way to do this:

[d365fo-url]/data/[data-entity]/$count

If, for instance, I would like to count the number of customers in my D365FO instance, the URL would look like this:

d365fo384738737aos.cloudax.dynamics.com/data/CustomersV3/$count

A quick and easy way to count records in a table.

The $count function can obviously be combined with the $filter function if you only need to count a subset of records. By adding the following statement to the URL, the request returns the number of customers in group '10':

?$filter=CustomerGroupId eq '10'
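
As a rough sketch of how this can be called from code (the environment URL is the example above, and obtaining the Azure AD bearer token is assumed to have happened elsewhere), the request could look like this in Python:

import requests

# Assumed values: your own environment URL and a valid Azure AD bearer token.
base_url = "https://d365fo384738737aos.cloudax.dynamics.com"
headers = {"Authorization": "Bearer <access token>"}

# Count all customers.
total = requests.get(f"{base_url}/data/CustomersV3/$count", headers=headers)

# Count only the customers in group '10'.
group_10 = requests.get(
    f"{base_url}/data/CustomersV3/$count",
    params={"$filter": "CustomerGroupId eq '10'"},
    headers=headers,
)

# The $count endpoint returns the number as plain text, not JSON.
print(total.text, group_10.text)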

Using PowerApps and Flow with Dynamics 365 for Finance and Operations, Part 1

With PowerApps and Flow we have a very strong toolbox that allows us to augment standard Dynamics 365 for Finance and Operations (D365FO) features and provide “last-mile” solutions. In this two-part series I explore how we can use PowerApps to augment standard functionality and Flow to provide sophisticated data processing capabilities.

The Scenario

In the following example I am trying to support the following scenario:

The company is keeping track of what equipment is on loan to employees through the Loaned equipment function in the Human resources module. An employee can open a PowerApp and scan the bar code on the equipment to start a new loan. This will close the existing loan record and create a new loan record. In addition, the HR-administrator is informed about the loan through an e-mail.

The following screenshot shows the screen in my PowerApp where the user can scan the bar code.

PowerApps 1.PNG

Data Flow

To support the above scenario I need data to flow from the PowerApp via Flow to D365FO. The data flow is described in the following steps:

  1. The PowerApp invokes a flow and sends parameter-data to the flow.
  2. The data is stored in a SharePoint list (optional step).
  3. This flow also updates any existing equipment loan record in D365FO.
  4. A second flow is triggered when a new item is created in the SharePoint list. The flow sends an e-mail to the HR-administrator informing of the loan with a link to the item in the SharePoint list and subsequently creates a new equipment loan record in D365FO.

In this part of the series, I will look at the first three points. In the next part, we will look at point four.

Invoking a Flow from the PowerApp

The first thing we need to do is to invoke a Flow from the PowerApp to get the process going. However, before I go into PowerApps to configure this, I need to set up the Flow to make it available to PowerApps. To make a Flow available in PowerApps, the first step in the Flow needs to be a PowerApps-trigger as shown in the below example.

PowerApps 2

The next step is to determine what data (parameters) I need to pass from PowerApps to drive my Flow. As the following screenshot shows, I need Barcode, Purpose, UserId, FromDate and ToDate as data from the PowerApp.

PowerApps 3

This data is used to create the SharePoint item. When linking a parameter from the PowerApp to a field in the SharePoint list, I need to use the dynamic content option called “Ask in PowerApps”. This connects the PowerApp parameter with the SharePoint list field when the Flow is called.

PowerApps 4.PNG

To invoke a flow from PowerApps you need to use the OnSelect method on a control (in my case the control is called “Icon5”). On the Action pane in the ribbon, use the Flows button to link the control to a flow.

As shown in the following screenshot, this brings up a dialogue that allows me to select between the flows available in my organisation.

PowerApps 5.PNG

In this case, I have selected the flow called “PowerApp->Createitem”. This immediately posts the following code into the OnSelect method on the Icon5 control:

'PowerApp->Createitem'.Run(

PowerApps is now linked to the Flow, and I must fill in the parameters I have requested in the Flow. This results in the following OnSelect method code:

'PowerApp->Createitem'.Run(Label3.Text; TextInput2.Text; Gallery1_1.Selected.'Partid'; DatePicker1.SelectedDate; DatePicker1_2.SelectedDate)

When clicking on the Icon5 control in the PowerApp, the Flow will now be invoked with the data defined in the parameter-string.

Updating a Record in D365FO

As mentioned above, all I need to do now is to update existing open equipment loan records in D365FO. This is due to a business rule on the LoanedEquipments data entity that does not allow new loans if a loan exists without a returned date.

To do this, I use the Flow connector called “Dynamics 365 for Finance and Operations”. This connector has an action called “Get records” and as the following screenshot shows, I use this action to select all open equipment loans.

PowerApps 6.PNG

Two things are worth noting about the Filter Query I am using to find the correct records:

  1. An empty date in a D365FO record is defined as “1900-01-01T12:00:00Z”.
  2. I can use dynamic content in the query, but it must be enclosed in quotes.

Please note: Depending on your locale, the statement separator can be either “,” or “;” and the system may expect single or double quotes.
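
Purely as an illustration (the field names below are hypothetical and depend on how the LoanedEquipments entity is exposed in your environment), the Filter Query could look something like this:

ReturnedDate eq 1900-01-01T12:00:00Z and Worker eq 'dynamic-content-value'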

For each record found by the query, I use an Apply to each construct to update the returned date (sorry for the Danish) in the LoanedEquipments data entity.

Conclusion

As the above example shows, it is relatively simple to create functionality that combines a user interface (PowerApps) with data processing (Flow) using the Power Platform in combination with D365FO. In my example I also create an item in a SharePoint list. This is mainly to demonstrate this capability and could have been omitted.

In the next part in this series, I will show how to create a record in D365FO based on the data stored in the SharePoint list. Stay tuned…

 

 

 

Test Automation Suite for Dynamics 365 for Finance and Operations, Part 2

In the first part of this series, I took a look at how you configure the Test Automation Suite (TAS) for Dynamics 365 for Finance and Operations (D365FO). Now it is time to take a look at how you use the TAS for test automation.

Test Automation Suite Example

As mentioned in the earlier post, the starting point for a test case in TAS is a task recording. In my example, I am using a task recording of how to create a new customer as shown in the following screenshot:

Test 20.PNG

As the following screenshot shows, I have uploaded the task recording to my process library in the Business Process Modeler (BPM) in Lifecycle Services (LCS).

Test 21.PNG

The task recording is now a process, and it becomes an epic or user story (depending on level) in Azure DevOps when the two tools synchronise, but it is not yet a test case. To make the process into a test case, you must create at least one requirement for it in BPM as shown below.

Test 22.PNG

In Azure DevOps, each requirement becomes a user story, but only one test case is created – for the entire process. This is shown in the below query from Azure DevOps.

Test 23

In the previous post, we created a test plan in Azure DevOps. Now it is time to add test cases to this test plan. In my example I only have one test case, so it is easy to add it to the test plan as shown below.

Test 24.PNG

With the test plan in place, it is time to switch to the TAS. Before you are able to use the TAS, you must configure the settings as described in the previous blog post. Once this has been done, you can click on Load and the test plan is loaded from Azure DevOps. If the Parameters File field is empty, you must click on New to populate the file. This file contains the test data for the test case.

Test 25.PNG

When clicking on Edit, an Excel spreadsheet opens up with test data captured in the task recording, as the above screenshot shows. This spreadsheet is also available for editing in the folder configured under settings.

Please note: before clicking on Run to execute the automated test, you must make sure the display zoom factor on your PC (not the VM) has been set to 100%; otherwise, the Internet Explorer driver on the VM will fail.

Now, when clicking Run in the TAS, the system logs in to D365FO and starts to simulate the test automatically. The test is then marked as “Passed” or “Failed” in the Result field depending on the outcome.

If I go back to Azure DevOps under Test plans / Runs, the query shows a list of all the test runs I have carried out and their status as shown below.

Test 26.PNG

You can obviously drill-down from here and use the Azure DevOps tools to investigate further and take appropriate action.

Test Case Meta-Data

One last thing worth exploring is what data is carried across to a test case in Azure DevOps. As you can see from the following screenshot, the test case in Azure DevOps automatically inherits the task recording steps from BPM.

Test 27.PNG

If you are not using automated testing, this would still allow the user to manually perform a test based on task recordings.

Also, as this screenshot shows, the test data used in TAS is stored with the test case.

Test 28.PNG

Conclusion

As I have shown in this blog post, we are now able to automate testing based on data from BPM and Azure DevOps without developer assistance. I am sure we will see the TAS develop further in future, but for now we have a strong tool to support our agile projects and regression testing during continuous delivery.

In my third, and last, post on the subject I will be looking at how to investigate failed test cases and chaining of test cases in TAS.

 

 

 

Test Automation Suite for Dynamics 365 for Finance and Operations, Part 1

The Test Automation Suite

As you may already know, with V8.1 of Dynamics 365 for Finance and Operations we now have a Test Automation Suite (TAS) that allows an administrator to define and orchestrate automated testing.

However, the TAS is not a standalone tool. It works in conjunction with Azure DevOps and the Business Process Modeler (BPM) in Lifecycle Services (LCS). The three individual tools are integrated and play the following roles:

  • BPM: The BPM holds the process model that contains the task recordings that are used in the test plan. For more information on how to create task recordings, please see this blog post.
  • Azure DevOps: Azure DevOps is where you configure the test plan and follow-up on the test status.
  • TAS: In the TAS you are able to load test plan data from Azure DevOps, configure the automated test and run it.

In this blog post we will be looking at how to configure the three tools to allow synchronisation and testing. In part 2 of this small series, we will look more closely at how to perform the actual test.

Setting up the Business Process Modeler

The first step to get things going is to configure the BPM and its integration with Azure DevOps.

LCS Project Settings

To start with, you need to configure the integration between LCS and Azure DevOps. This article explains how to set this up. As the following screenshot shows, I have connected my LCS project with Azure DevOps.

Test 2

As you can also see from the screenshot, LCS automatically determines how work item types in LCS are linked to work item types in Azure DevOps.

With this working, the next step is to set up a process model.

Test 3.PNG

You can obviously use an existing process model in BPM to synchronise with Azure DevOps, but in my example, I have created a new model called “Test Case Library”.

Please note that you need to use the new BPM experience in LCS to get access to this view.

Azure DevOps Test Plan

Assuming you have already managed to set up a project in Azure DevOps, it is now time to create a test plan.

Test 6.PNG

As the above screenshot shows, all you need to do initially is to create a test plan. Unless you have specifically designed a test process with multiple iterations, you can use the defaults.

Installing and Configuring the TAS

Now we come to the most difficult part of the process: installing the TAS on the machine where you intend to execute the automated test. In my case, I am using a sandbox environment, so I use RDP to access the virtual machine (VM).

The installation download and instructions are available here.

Follow the instructions closely and everything should be fine. However, one thing worth bearing in mind:

If you copy the certificate thumbprint from the certificate tool, please be aware that it contains invisible Unicode characters. I had to copy it into Notepad and then save it as a text file to clean it up.
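
As an alternative to the Notepad round-trip, anything that is not a hexadecimal character can be stripped with a small script; the following Python snippet is just a sketch of that idea:

import re

# Paste the thumbprint copied from the certificate tool; the regex removes
# spaces and any invisible Unicode characters, leaving only the hex digits.
raw_thumbprint = input("Paste the certificate thumbprint: ")
print(re.sub(r"[^0-9A-Fa-f]", "", raw_thumbprint))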

Configuring the TAS

Once the TAS has been installed, it is time to configure your automated test. The above instructions should take you through this, but here are a couple of hints:

  • Azure DevOps URL: If you do not remember the URL, you can copy it from your LCS project settings.
  • Access Token: This is the token you generated when linking Azure DevOps to LCS.
  • Hostname: Simply the URL for your D365FO instance.
  • SOAP Hostname: The above hostname, but insert “.soap” between “aos” and “cloudax” in the URL. Example: “https://d365cbd45b49961q2970aos.soap.cloudax.dynamics.com”
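
As a small sketch (assuming the standard *.cloudax.dynamics.com URL format), the SOAP hostname can be derived from the normal hostname like this:

hostname = "https://d365cbd45b49961q2970aos.cloudax.dynamics.com"

# Insert ".soap" between the AOS host name and "cloudax".
soap_hostname = hostname.replace(".cloudax.", ".soap.cloudax.")
print(soap_hostname)  # https://d365cbd45b49961q2970aos.soap.cloudax.dynamics.com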

With this last bit of configuration, your TAS should be ready to go.

In part 2 of this series, I will use the TAS to load test cases from Azure DevOps and execute an automated test. I will also take a look at how you can add additional test data to the test case to make it more comprehensive. Lastly, we will look at how the test results can be analysed in Azure DevOps.

In part 3, I expect to take a look at how we can chain test cases together and pass variables between them.

I would like to say a BIG thank you to Palle Agermark, who has helped me through a couple of sticky points in the installation process. Thanks, Palle, you have been a great help – as always.

 

 

Insight as a Planned Outcome – Not an Afterthought in ERP Projects

Many clients I speak to have “insight into data” as one of the key drivers for their ERP project – up there with “operational efficiency” and “business agility”. However, in my experience, many projects fail to keep focus on insight as an important objective and plan for tangible outcomes.

Why is Insight not in Focus?

As mentioned, data insight is important in an ERP project, and many businesses seek to achieve insight into:

  • Customer behaviour.
  • Business performance.
  • Process efficiency.
  • Costs.

as a key objective. In some cases this objective is met, more as a function of a reasonable implementation by experienced professionals than as a planned outcome. In my view, there are two reasons for this:

  1. Approach and methodology.
  2. Costs.

Let’s deal with these reasons individually.

Approach and Methodology

Most of the methodologies I have worked with across the systems integrator landscape in the last decade or two have been predominantly biased toward implementing efficient processes. The objective, and in many cases the business case, was to make processes more efficient.

Rarely did we start our projects by defining a data model that would establish the rules and structures at a logical level to guide us during the implementation. Neither did we use a dimensional design framework such as Kimball’s to define how we would connect processes with the measures, KPIs and dimensions we would need to generate insight into data.

Instead, the focus was on the data we needed to drive the process efficiently – and in some cases, as luck would have it, this was enough to deliver reasonable data insights. However, this was not a planned outcome.

In some cases, striving for the required insight may actually mean compromising process efficiency to capture more data. Not what we usually seek to achieve…

Costs

Often, ERP projects are perceived to be expensive and it is assumed that the inbuilt data analytics tool will deliver on the promise of data insight. In the case of Dynamics 365 for Finance and Operations (D365FO), the system ships with a comprehensive suite of embedded analytics and sophisticated tools to build even more. So far so good.

However, these analytics are based on a certain process behaviour and data capture. In other words, to take full advantage of the inbuilt analytics, the implementation team need to fully understand the underlying data model and design dimensions (and other attributes) in a way that feeds into the analytics framework.

It will take time during an implementation to bring this knowledge to the team and time to ensure that the process model is augmented by a dimensional model. This will, in the end, increase costs.

In my experience, trying to correct the dimensional model post-implementation is significantly more expensive than getting it right in the first place; doing it right from the start also ensures a better return on the investment made in the standard solution.

Plan for Insight

So what can we do to ensure that insight is a planned outcome of an ERP implementation instead of the “usual” elusive phase II?

Firstly, it is important to accept that there is a direct relationship between how a process is configured, the data captured during the process and the insights you can derive. Processes in D365FO are designed to work in a certain way, and if you follow that pattern you are automatically able to leverage the inbuilt analytics provided.

So understanding and following the standard process pattern is important.

Secondly, a user story should always (there will be exceptions) contain a definition of the insights resulting from the user story or the overarching (epic-level) insights the user story supports. A user story cannot be done until the expected insight is achieved.

An insight can be a user story in its own right.

My recommendation is to ensure that expected insight is documented and treated as any other requirement in a project (no this is not just a report!) and that the project plan reflects the time required to also deliver on these requirements.

This way, insight becomes a planned outcome!


 

Dynamics 365 for Finance and Operations Sales Agreements, Part 1: Setting up an agreement

With Dynamics AX 2012, Microsoft introduced Sales and Purchase agreements. In previous versions we were only able to work with simple trade agreements (prices and discounts), but we are now able to work with specific contracts based on volume or a financial commitment.

In this three-part series, I will be taking a look at how to work with Sales agreements. The series is divided into three parts:

  • Part 1: Setting up agreements (this post).
  • Part 2: Working with agreements in the sales order process.
  • Part 3: Following-up on agreements.

Setting up a Sales agreement

To set up a new Sales agreement go to Sales and marketing / Sales agreements / Sales agreements. Click on New in the ribbon to create a new agreement.

Firstly, you must select a customer for the agreement and classify the agreement in the Sales agreement classification field. This field is used to group agreements together.

Agreement 1.PNG

Now, when you open the General fast tab, a number of new fields become available.

In the Customer reference field you can enter the customer’s agreement reference number, if applicable.

The system automatically fills in the Currency field with the default currency associated with the selected customer. However, this can be overridden for the specific agreement.

In the Default validity period you can enter the period in which the contract is active.

You can enter a title for the agreement in the Document title field and a reference to an external document (the contract) in the External document reference field.

Agreement 2.PNG

In the Default commitment field, you select what the agreement is for. You have the following options:

  • Product quantity = The agreement is for a specific quantity of a specific product.
  • Product value = The agreement is for a financial value of a specific product.
  • Product category value = The agreement is for a financial value of products within a specific category.
  • Value = The agreement is for a financial value across all products.

In my example, I create an agreement with a commitment for a specific quantity of a specific product (Product quantity commitment).

Lastly, if the agreement relates to a specific project, you can select the project in the Project ID field.

Once the agreement header has been created, agreement lines can be added by clicking on the button shown below on the Sales agreement lines fast tab.

Agreement 3.PNG

As the below example shows, I have now added a line to the agreement for 500 pieces of product number 1000, The Surface Pro 128 GB. In this case, the agreement is set up for Site 1. You can use the inventory dimensions to create specific agreements across, for instance, Site and Warehouse to allow for high granularity in how you manage the supply chain.

Agreement 4.PNG

In the Line details fast tab, it is possible to add further details to the line. An important option is Max is enforced. This determines whether a user can exceed the agreed volume when creating a sales order. If set, as in the below example, the agreed volume cannot be exceeded.

Agreement 5.PNG

As you can see from the above screenshot, the agreement is still on hold and cannot be used for transactions until it has been confirmed using the Confirmation option in the ribbon. Once the agreement has been confirmed it changes status to Effective and becomes available for transactions.

In the below example, I have used Value commitment for the agreement instead. As you can see, when a line is added to the agreement, I am only able to enter a Net amount, which represents the financial commitment across all products.

Agreement 6.PNG

This completes the setup of a Sales agreement in D365FO. In my next post, I will be looking at how to create sales orders based on the Sales agreement.

Please note, in this post I have only been showing Sales agreements. The Purchase agreement is a mirror of the Sales agreement, so I will not be showing any Purchase agreement examples.

Microsoft’s documentation for Sales agreements is available here.

 

The Person Search report in Dynamics 365 for Finance and Operations

Microsoft has published its guide to GDPR compliance for Dynamics 365 for Finance and Operations (D365FO). You can read the guide here.

A key tool in identifying and reporting on a person’s data is the Person search report. This tool allows an administrator to identify one or more persons through a specialised search engine. The Person search report can be found under System administration / Inquiries / Person search report.

Based on a request, a new search project is created by clicking on the New button in the ribbon. This brings up the search definition dialogue.

PS 1.PNG

Firstly, you must give the search project a name; in this case it is “Andrew Dixon”. Also, you must enter a date for the search request.

Secondly, the search criteria must be defined. In the above example, I am using the name search, but it is also possible to search on:

  • E-mail address.
  • User ID.
  • Personnel number.
  • Customer account number.
  • Address.
  • Etc.

Once the search criteria have been defined, press Execute search to see the results. The results are presented across the person data categories available in D365FO.

In this example, Andrew Dixon is found in the following data entities:

  • Address.
  • System user.
  • Worker.

If I expand the Worker section, I am able to see the reference to Andrew Dixon’s personnel record as well as his party record as shown below.

PS 2

If the person has requested an export of the information we hold on him or her, the administrator is able to export the data to Microsoft Excel by using the Process report function in the ribbon as the following screen shows.

PS 3.PNG

As you can see from the above screen, I have selected “Person search” as the report template. If the drop-down is empty it is because you have not imported the default templates into the Data management module.

You may have to tweak the template slightly as not all data entities are enabled for all countries. If a data entity is not enabled for your country, just remove it from the template.

When I click OK, a data management project is created based on the “Person search” template and since I said yes to run the data project immediately, it starts executing as shown below.

PS 4.PNG

Once the project has executed successfully, you can export the result to a data package by clicking on Download package. This creates a .ZIP file containing a spreadsheet for each data entity. As you can see from the following screenshot, the list is quite long.

PS 5.PNG

When I open the “Worker” spreadsheet I can see it contains a record of all the personnel master data we hold on Andrew Dixon as shown below.

PS 6.PNG

Please note: When a search request has been processed, the Date processed and Project name fields are populated on the search project. This gives the administrator an audit trail of when a request was processed and what data project was used to process the request. Nice.

The Person search report function is definitely a strong tool to help companies identify personal data across D365FO without having to visit each module individually.