The Last Suit You’ll Ever Wear…

I have heard it said (and written) a few times recently that switching to a cloud-based ERP solution is the equivalent of MiB’s “Last Suit You’ll Ever Wear”. As much as I sympathise with the sentiment, I do not necessarily agree with the underlying arguments: that ERP replacement is predominantly driven by obsolete technology, and that in this future Nirvana, replacing a granular part of ERP is like downloading an app for your new iPhone.

Judging by all the Dynamics AX 2009 and various old Axapta systems (not to mention Lawson M3, SAP R/something etc.) still running out there, I would argue that obsolete technology is low down the list of priorities for many companies. Although I agree the future holds a more granular approach to ERP, the main reason companies have ERP systems is to support complex processes that span across this granularity. So, even though the technology will allow a more granular approach to implementation, the process complexity is not going to go away. It may actually increase as digital transformation extends traditional operational processes closer to the end-customer.

So no, I am sorry, a company will still re-implement its solution in 8 or 10 years’ time. And no, the decision will not predominantly be driven by obsolete technology, nor is it today. Here are some of the reasons I believe this to be the case:

  • In the digital era, a company’s business model will change more rapidly.
  • Acquisitions will lead to inefficient processes.
  • New management may see a re-implementation as a way to re-energise internal performance.
  • Cost pressures will lead us to think in new ways.
  • Organisations and processes change; the ERP system may not be able to support these changes without a re-think.

However, I think there are a lot of good reasons to move a company’s ERP to the cloud. Just to name a few:

  • Access to new and advanced technologies.
  • Increased security.
  • Predictable operational costs.
  • Scalability.
  • ERP becomes a technology commodity.
  • Technological complexity is abstracted away.

I agree that most companies are likely to stay on the same technology platform (including primary cloud service provider) once they have made the switch, but as much as I would love to think that in future we will only be making incremental changes to a company’s (everlasting) ERP solution, I think that business dynamics dictate a different reality.

Posted in Uncategorized | Leave a comment

Dynamics 365 for Finance and Operations Inventory Cost Model, Part 1: Core Concepts

This week I was encouraged by fellow networker Wali Ullah Khan to share my insights on the inventory cost model in Dynamics 365 for Finance and Operations. On reflection, I agree that this topic is probably of interest to quite a few people. Since the topic is relatively broad and complex, I have decided to divide my post into five parts, namely:

  • Part 1: Core concepts.
  • Part 2: Configuration.
  • Part 3: Cost management.
  • Part 4: Landed cost.
  • Part 5: Cost controlling.

The core inventory cost model has actually changed surprisingly little since the days of Damgaard Axapta 1.0. In fact, some of the core concepts of the inventory cost model go all the way back to Damgaard XAL. I have worked with this core model since 1995 (in Concorde XAL), but even so, I will not pretend to know everything, so if I get something wrong, please feel free to correct me. But this I know: we still have an inventory cost model that is tightly coupled to the application logic and data model.

In the following, I have tried to explain some of the core concepts in the D365FO inventory cost model.

Inventory Model

A released product in D365FO is associated with an inventory model through the Item model group. The inventory model determines how the inventory value for that product is calculated. D365FO supports a number of different inventory models, but some common scenarios are:

  • Traded products = FIFO.
  • Manufactured products = Standard cost.
  • Retail merchandise = Moving average.

Normally, the inventory model for a product is determined by the company’s accounting policy.

The inventory model for a released product is fixed for the entire legal entity, so it is not possible to change it by site or warehouse.

Inventory Transactions

The whole inventory costing system in D365FO revolves around the inventory transactions. Every time a product is bought, sold, manufactured, shipped etc., the system creates an inventory transaction (or updates an existing one). The same transaction may have different status values depending on where it is in its life-cycle. Some of these status values do not influence inventory costing, but some do. These are:

  • Received = The goods have been received and put away in the warehouse. Normally, this is referred to as goods-received-not-invoiced (GRNI).
  • Purchased = An invoice for the goods has been received and posted.
  • Deducted = The goods have been shipped to the customer, but not yet invoiced. Normally, this is referred to as goods-shipped-not-invoiced (GSNI).
  • Sold = The goods have been invoiced to the customer.

All other inventory transaction status values do not influence the cost.
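To make the distinction concrete, here is a minimal sketch in Python. It is purely illustrative: the four cost-relevant status names mirror the D365FO statuses described above, while the helper function and the examples of non-costing statuses are my own.

```python
from enum import Enum

class InventoryStatus(Enum):
    # Statuses that influence inventory costing
    RECEIVED = "Received"      # goods-received-not-invoiced (GRNI), physical value
    PURCHASED = "Purchased"    # purchase invoice posted, financial value
    DEDUCTED = "Deducted"      # goods-shipped-not-invoiced (GSNI), physical value
    SOLD = "Sold"              # sales invoice posted, financial value
    # Examples of statuses that do not influence costing
    ORDERED = "Ordered"
    ON_ORDER = "On order"
    RESERVED_PHYSICAL = "Reserved physical"

COST_RELEVANT = {InventoryStatus.RECEIVED, InventoryStatus.PURCHASED,
                 InventoryStatus.DEDUCTED, InventoryStatus.SOLD}

def influences_cost(status: InventoryStatus) -> bool:
    """Return True if a transaction in this status carries inventory value."""
    return status in COST_RELEVANT
```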

Physical and Financial Value

Another important aspect of the inventory cost model in D365FO is the distinction between physical and financial value. Physical value is the value of goods received (status = Received) or goods shipped (status = Deducted). Correspondingly, financial value relates to the statuses Purchased and Sold respectively. Historically, physical value was sometimes referred to as “floating value” in earlier versions (maybe it still is on some reports).

As the following screenshot shows, physical and financial value is recorded directly on the inventory transaction together with the physical and financial dates.

[Screenshot: Physical-financial value]

Issue at Average Value

As the above example shows, the financial cost value for these goods is 188,790.00. However, depending on the product’s inventory model (see above), this may change when the inventory is closed (see below) on a periodic basis. If the product’s inventory model is set to Standard cost, the cost value is fixed and will not change when the inventory is closed. However, if a dynamic principle such as FIFO or Weighted average is chosen, the financial value may change. The change will be recorded in the Adjustment field on the transaction.


Ah yes, this lovely feature has been with us since Concorde XAL, so I guess it must be celebrating its silver jubilee. To calculate the true inventory cost value of products, you must close the inventory on a regular basis.

This also brings us to a core concept unique to D365FO (and all its predecessors): issue at average value. When creating an outbound inventory transaction, for instance by creating a sales order, the transaction will not get a financial cost value until you invoice the sales order line. The invoice routine finds the cost value for the transaction by looking up the average cost value for the set of dimensions associated with the transaction. Let’s look at an example:

Product 51B66 has a cost price of €100 per piece. The storage dimensions for 51B66 are set to site and warehouse.

On warehouse WEST the company holds 50 pieces and the total cost value is €5,100. Therefore the average cost value for warehouse WEST is €102.

On warehouse EAST the company holds 25 pieces and the total cost value is €2,450. Therefore the average cost value for warehouse EAST is €98.

If the user invoices a transaction on warehouse WEST, the system will use €102 per item as cost of goods sold (COGS). If instead the user invoices a transaction on warehouse EAST, the COGS is €98. In part 2 of this series, we will explore how to set up storage dimensions for inventory costing in more detail.
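The arithmetic above can be sketched in a few lines of illustrative Python (not D365FO logic, just the calculation):

```python
def average_cost(on_hand_qty: float, on_hand_value: float) -> float:
    """Average cost per unit for one costing dimension set (e.g. site + warehouse)."""
    return on_hand_value / on_hand_qty

# On-hand per warehouse, as in the example above
west_avg = average_cost(50, 5100.00)   # €102 per piece
east_avg = average_cost(25, 2450.00)   # €98 per piece

# Invoicing 10 pieces from each warehouse books COGS at the respective average
cogs_west = 10 * west_avg
cogs_east = 10 * east_avg
```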

As the following screenshot shows, you can always find the average cost price in the Cost price field on the On-hand inventory screen:


When closing the inventory, the system will recalculate the value for all open inventory transactions. Open inventory means (financially) outbound inventory which has not yet been matched with (financially) inbound inventory. If the product’s inventory model is FIFO, it will, as it says on the tin, calculate the cost value using a FIFO algorithm.

This calculation creates settlements. These settlements are in effect an audit trail for each inventory transaction that shows, in detail, how the inventory cost value is calculated. The accumulated adjustment in cost value is stored in the Adjustment field on the inventory transaction itself. We will be looking more into settlements and cost exploration in part 3.

Once an outbound inventory transaction has been fully settled, it is closed and the cost value is fixed (at last).
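As a rough illustration of the idea (a simplified sketch, not the actual D365FO algorithm), a FIFO settlement that matches each issue against the oldest open receipts and records the difference against the cost posted at invoicing could look like this:

```python
def fifo_close(receipts, issues):
    """
    Settle each issue against the oldest open receipts (FIFO) and return,
    per issue, the settled cost value and the adjustment vs. the cost
    originally posted at invoicing. receipts: list of (qty, unit_cost);
    issues: list of (qty, posted_unit_cost). Quantities assumed to balance.
    """
    open_receipts = [list(r) for r in receipts]  # mutable copies: [remaining_qty, unit_cost]
    settlements = []
    for qty, posted_cost in issues:
        remaining, settled_value = qty, 0.0
        for r in open_receipts:
            if remaining == 0:
                break
            take = min(r[0], remaining)       # consume the oldest receipt first
            settled_value += take * r[1]
            r[0] -= take
            remaining -= take
        # The accumulated difference would land in the Adjustment field
        adjustment = settled_value - qty * posted_cost
        settlements.append((settled_value, adjustment))
    return settlements

# 10 pcs received at 100, 10 pcs at 110; 15 pcs issued at a running average of 105
result = fifo_close([(10, 100.0), (10, 110.0)], [(15, 105.0)])
# FIFO settles 10 × 100 + 5 × 110 = 1,550, so the adjustment is -25
```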


All inventory transactions with financial value will have a corresponding entry in the general ledger. In part 2 we will be exploring the relationship between the inventory module and the general ledger. However, a core concept worth highlighting at this stage is this: if you would like to adjust the inventory value, it is done through the inventory module – not by making manual postings to the general ledger accounts. If you do need to make manual adjustments, post them to a separate adjustment account and keep the automated postings on accounts where manual postings are not allowed.

As a principle, you can only influence the cost value on an outbound inventory transaction by changing the cost value of the matching inbound inventory transaction.

So how do you do that? Mainly by applying charges to the inbound transaction. Charges can be applied in a number of ways, but normally they are applied on the purchase order and subsequently on the supplier invoice. Charges are fully configurable and can be applied automatically or manually. Examples of common charges are freight and duty. However, an automated (percentage) charge for warehouse overheads can also be applied and included in the inventory cost value.
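As a simple illustration of how charges end up in the unit cost (the function and its parameters are hypothetical, not actual D365FO charge configuration): fixed charges are spread over the received quantity, while a percentage charge is applied on the purchase price.

```python
def landed_unit_cost(purchase_price: float, qty: float,
                     fixed_charges: float = 0.0, overhead_pct: float = 0.0) -> float:
    """
    Unit cost of an inbound transaction including charges: fixed charges
    (e.g. freight, duty) are spread over the quantity, and a percentage
    charge (e.g. warehouse overheads) is applied on the purchase price.
    """
    total = qty * purchase_price * (1 + overhead_pct / 100) + fixed_charges
    return total / qty

# 100 pcs at €10 with €50 freight and a 5% overhead charge:
# (100 * 10 * 1.05 + 50) / 100 = €11 per piece
unit_cost = landed_unit_cost(10.0, 100, fixed_charges=50.0, overhead_pct=5.0)
```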

It is worth noting that the cost value on the inventory transaction is always in the base currency of the legal entity. Therefore, exchange rate fluctuations can be quite hard to carry over into the inventory value.

Standard Costing

Since standard costing is probably the most straightforward inventory cost concept in D365FO, I will not be dealing with it in this series. Instead, I will dedicate a separate series to the topic in future. Stay tuned…

Closing remarks

In the above, I have tried to introduce some core inventory cost model concepts that apply to D365FO. In the many projects I have been involved with, the concept most people struggle with is this: the fact that COGS is not fixed when you invoice the customer. In part 3 we will discuss how to mitigate this issue as far as possible. Also, many people find reconciling inventory and the general ledger difficult. Again, if best practice is applied when setting up the solution, it should not be too bad. We will be taking a look at this in parts 2 and 5 of this series.

A word of caution to end this first post in the series: if this goes wrong, it is usually very difficult to reconcile the ledgers and get back on the right track due to the sheer volume of transactions generated by the system. In past projects, I have always tried to make sure we would simulate an inventory closing procedure with reconciliation to iron out any issues before going live. It is definitely worth the effort.

Posted in Dynamics 365 for Finance and Operations | 2 Comments

Dispelling a few myths about Dynamics 365 for Finance and Operations

In my conversations with clients, existing users of Dynamics AX and partners, I have come across a few myths (or statements, if you like) that I think are worth discussing. In this post, I have picked a few of them and done my best to explain what I believe to be the reality.

Myth #1: It is not possible to customise Dynamics 365 for Finance and Operations because it is a cloud solution.

Reality: Although Dynamics 365 for Finance and Operations is managed and operated in Microsoft Azure, it is, to all intents and purposes, a private deployment with its own configuration and customisations.

Myth #2: Microsoft is sealing the application, so the system can no longer be customised.

Reality: Far from it. With extensions, the eventing model, the Common Data Service and custom services, Dynamics is probably more open than at any time in the past. Yes, you need to follow certain patterns, and no, you will probably not be able to directly mess about with the code that calculates VAT, but is that a bad thing? Customisations of a certain nature may also become more expensive, but let’s be honest: some pretty shabby customisations have been developed in the past because they could be done on the cheap (overlaying).

Myth #3: Running in the cloud will be more expensive.

Reality: For some, maybe that is the case. But often we are comparing apples to pears. The security and compliance measures surrounding Dynamics 365 for Finance and Operations are second to none and probably far better than anything your current hosting partner can provide (with all due respect). Moving the licence from CAPEX to an OPEX subscription model will make the cost more visible, especially for customers who have given up their BREP, but I doubt it will make it more expensive. Obviously, each customer is different.

Myth #4: My data is not safe.

Reality: I will not pretend to be an expert on the regulatory issues relating to whether a company can let Microsoft store and process its data in a certain geography, but for most customers this question is irrelevant. I work mainly with European clients, and when deploying Dynamics 365 for Finance and Operations under Microsoft’s standard online services terms, the primary and secondary data centres are both located in Europe (Ireland and the Netherlands, to be precise). You can read more on the subject here. You can also read more on how Microsoft protects your data in the Azure Trust Centre.

Myth #5: It will be hard to get my data out.

Reality: If you decide to terminate your subscription, Microsoft’s standard terms guarantee that your data remains available for a 90-day period after the subscription is terminated, during which you can download it to your own data centre or a new cloud partner.

There are probably a lot more myths, uncertainties and questions out there, but these five I have come across a lot recently. Feel free to comment if you disagree with some of my conclusions. Only through dialogue and fact-based information can we dispel myths and untruths.

Posted in Dynamics 365 for Finance and Operations | Leave a comment

Creating a Personal Workspace in Dynamics 365 for Finance and Operations

Workspaces are a pretty cool feature in Dynamics 365 for Finance and Operations (D365FO). As you probably know, D365FO comes with a number of pre-built workspaces, but it is so much nicer to create your own personal workspace, isn’t it? Yes it is! So here goes…

To get cracking we need to go to the dashboard, right-click and select Personalize. This brings up this form:

Create new workspace

Click on the + Add workspace option. This automatically creates a new workspace tile on the dashboard:

My workspace 1

You can now right-click on the tile and give the workspace tile a more appropriate name. In my example, I name the workspace “My customer workspace”.

For now, the workspace is totally blank, so I need to go and add some content.

Firstly, I go to the All customers screen and filter on wholesale customers (customer group = 10) as shown in the following screenshot:

Wholesales customers filter

By clicking on Options in the ribbon and selecting Add to workspace, I can now select the “My customer workspace” I created earlier. I now have the option to add the customer query to the workspace as a tile, a list or a hyperlink. In this case, it makes sense to add it as a list.

As the following screen shows, I now get an opportunity to customise the list before it is created in the workspace:

[Screenshot: Add as list]

I can choose to have a simple list (3 columns) or a tabular list (8 columns) and select the fields I would like to display.

Opening “My customer workspace” from the dashboard, the list has now been added:

Workspace with list

Similarly, I would like to see customers with debt past due as a tile in my workspace, so I go to the Customers past due menu item in the Accounts receivable module. When I add this query to the workspace as a tile, I am given the option of showing a count of customers in the tile:

Customers past due

The resulting workspace now looks like this:

[Screenshot: Workspace with list and tile]

If you choose, you can now add Power BI components to the workspace using the Open report catalog menu item under Options on the ribbon.

So far, I have not found a way to add normal (SSRS) reports to the workspace as a hyperlink.

This was a very short introduction to how you can create your own personal workspace in D365FO. And yes, it really is that simple.


Posted in Dynamics 365 for Finance and Operations, Uncategorized | Leave a comment

My blog is now syndicated on the Dynamics 365 Enterprise edition community site

Dear Reader,

I am very happy to announce that my blog is now also available on the Microsoft Dynamics 365 Enterprise edition community site. You can find it here:

A practitioner’s views on Dynamics 365 for Operations and Finance

You can also explore the Dynamics 365 Enterprise edition community site here:


Posted in Uncategorized | Leave a comment

Dynamics 365 for Operations and Finance application sealing is the best thing…

Blocks 1

I will start by being straight about where this blog post is heading:


And why is that?

  1. Firstly, from a purist developer’s point of view, I have never liked overlaying. Overlaying code was never truly object-oriented and introduced all sorts of issues. I will admit that the way MorphX and X++ worked, overlaying was the easiest way to get things done and, in some cases where the class model was particularly bad, probably the only way to support certain scenarios.
  2. In the past, instead of saying no to a client, we have developed overlays, which we probably shouldn’t have. Customisations to tax calculations, complex rules introduced into the MRP engine and significant breaks to the natural flow of invoicing. Ring a bell?
  3. Regression testing overlayed code was nigh-on impossible sometimes and the quality suffered badly in some projects.
  4. Overlaying was never truly based on best-practice patterns but very much based on the individual developer’s style.
  5. With massive customisations based on overlaying, upgrading to a new major release was probably always an illusion.

So, I am not mourning the “death” of overlaying.

However, I am not blind to the challenges presented by moving to extensions. Just to mention a few:

  1. There will be customisation scenarios we are unable to deliver.
  2. The pre/post eventing model may seem a bit restrictive.
  3. The APIs we need may not be ready or may not quite support the scenario we are working with.

However, all things considered, I think the ISV and VAR communities will benefit from extensions.

  • Projects will increasingly be based on standard functionality.
  • Customisation quality will improve.
  • Projects will become more predictable and therefore more profitable.
  • New features introduced by Microsoft can be leveraged by clients (potentially increasing revenue streams).
  • ISV solutions will be easier to leverage.

Only time will tell whether my optimism is well-founded or extensions will become a menace to the ISV / VAR community, but I remain optimistic and believe that those of us who are passionate about ERP and Dynamics 365 for Operations and Finance still have some great times to come… Bring it on!


Posted in Uncategorized | 2 Comments

What’s in a cutover plan?

In the past I have seen a lot of ERP project plans with a single-line entry called “go-live”, “cutover” or “final migration”. As I have argued in a previous post, this phase includes the cutover activity, which must be planned in great detail and executed with precision.

Since I wrote that post, quite a few people have asked me:

“What should be in a cutover plan?”

In the following, I have tried to answer this question.

Plan details

First of all, the cutover plan must contain a detailed list of all activities and their dependencies.

For each item in the cutover list I would normally record the following data:

  • Status. In my experience the following status values cover most scenarios:
    • Pending.
    • Started.
    • Success.
    • Parked.
    • Failed, cutover continues.
    • Failed, cutover terminated.
  • Type of activity. In my experience the following types cover most scenarios:
    • Automated script.
    • Manual data entry.
    • Application configuration.
    • Infrastructure configuration.
    • Physical work (e.g. change barcode labels on bins in the warehouse).
    • Random control.
    • Reconciliation.
    • Checkpoint.
  • Owner (who owns the activity and must report back to the cutover manager?).
  • Planned start time.
  • Estimated duration.
  • Actual start time.
  • Actual duration.
  • Decision-maker (the person who can decide whether to proceed if an activity fails).
  • Informed (a list of people who must be informed about this specific activity).
  • Undo / Redo (a description of how to undo or redo an activity).
  • Instruction (if an activity requires the owner to test a specific outcome or reconcile some data, a link to the instruction for this activity should be included).
  • Dependencies (a list of predecessors to this activity – especially if activities are being executed in parallel).

Depending on your specific needs you may need other fields in the list, but these have served me well in the past.
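If you keep the plan somewhere other than a spreadsheet, the fields above map naturally onto a simple record structure. A minimal sketch in Python (illustrative only; the field and type names are my own, not a prescribed format):

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class CutoverActivity:
    """One line in the cutover plan, mirroring the fields listed above."""
    name: str
    activity_type: str                 # e.g. "Automated script", "Checkpoint"
    owner: str                         # reports back to the cutover manager
    planned_start: datetime
    estimated_duration_min: int
    decision_maker: str                # decides whether to proceed on failure
    status: str = "Pending"            # Pending / Started / Success / Parked / Failed...
    actual_start: Optional[datetime] = None
    actual_duration_min: Optional[int] = None
    informed: list = field(default_factory=list)       # people to keep informed
    undo_redo: str = ""                # how to undo or redo the activity
    instruction_link: str = ""         # link to test/reconciliation instruction
    dependencies: list = field(default_factory=list)   # names of predecessor activities

    def ready(self, completed: set) -> bool:
        """An activity can start once all its predecessors have completed."""
        return all(dep in completed for dep in self.dependencies)
```

A structure like this also makes it straightforward to drive a progress dashboard or validate the dependency chain before the night of the cutover.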


Checkpoints

In my experience it can be useful to place checkpoints at appropriate places throughout the schedule. If your list of activities is very extensive, some activities may fail but the decision-maker decides to move forward anyway, some activities may take longer to run than expected, or some unforeseen event may challenge the overall plan. The checkpoints can be used to gather key stakeholders, give them an overview and discuss whether and how to proceed. If your activity list is short and the process well-proven, you probably do not need checkpoints.

Fallback plan

Ideally, the cutover plan should be accompanied by a plan that describes how you will fall back in the event of failure. It is difficult and cumbersome to align the fallback plan with each activity in the cutover plan, so I normally align the two plans at checkpoint level. That also allows me to discuss more thoroughly with key stakeholders before deciding to fall back.


Communication

Key to a successful cutover is the ability to keep everyone involved informed at all times. The simplest way is to give everyone access to the activity list (spreadsheet), which they can open periodically to have a look. Alternatively, the cutover manager may send out the updated list at each checkpoint. On larger projects I have been involved with, the cutover manager has managed the list in SharePoint with a BI tool on top, allowing users to see progress in a graphical format.

Using the cutover plan for trial migrations

The last topic I would like to touch on is this: developing the cutover plan as part of the ongoing trial migrations can be extremely useful. By basing each trial migration on the provisional cutover plan, you also start testing and proving the plan. Very often I have seen trial migrations focus on the technical side of migrating data without ensuring that the migration is aligned with the manual and reconciliation activities. By going through the cutover plan each time you perform a trial migration, you prove the plan and start building more accurate timings.

All in all, this is not an out-of-the-box template for a cutover plan, but I hope it gives you enough inspiration to build your own plan based on the specific needs of your projects.

Good luck cutting over…

Posted in Data Migration, Dynamics 365 for Operations and Finance, Project Management | Leave a comment