Is APQC the Right Way to go in a Dynamics AX Implementation?


Ever since Microsoft decided to embed the APQC process classification framework into the Business Process Modeller in Lifecycle Services (see previous post), the question for anyone responsible for a Dynamics AX implementation has been: is APQC the way we should go with our implementation?

When we started out on the Dynamics AX programme I am currently responsible for, we had to decide on a number of things:

  1. Did we, as a company, have a common global process language or classification framework we could leverage?
  2. How should we structure our workshops?
  3. Did our implementation partner have a process classification model we could use as a starting point?

Since the answer to number 1 was a definite NO – and this is probably the case for many (most) companies – we could go one of two ways:

  1. Spend a lot of time (and consultancy fees) mapping our processes, classifying them and building a common language from that, or
  2. Use an existing process classification framework and map our processes back to that.

We chose option 2!

When you are at the formative stage of an implementation and need to decide how to structure your analysis and design workshops, you mainly decide between a functional approach and a swim lane approach. With a functional approach you often end up replicating the processes you have today, because you analyse them from the same functional view you are already used to. With a swim lane approach you tend to challenge the processes more because of the holistic view, but a classic swim lane such as order-to-cash (including fulfilment) may prove too all-encompassing, and you end up trying to handle all the company’s processes in one swim lane.

Instead of these more traditional approaches we chose to use the APQC framework to structure our workshops. This meant that we divided the project into a number of work streams based on APQC Level 1 (L1). Subsequently we structured workshops based on L2 and discussed processes at L3. It did not work for everything, so some creativity and improvisation may be required, but it gave us a good starting point for our planning.
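To make the idea concrete, the mapping from APQC levels to project planning can be sketched as a simple hierarchy. Note that the category names below are only illustrative of the APQC style – they are not an exact copy of the framework, which should be taken from APQC itself:

```python
# Illustrative sketch: APQC L1 categories become work streams, L2 items
# become workshops, and L3 processes become workshop discussion topics.
# The names below are representative, not the actual APQC framework.
apqc = {
    "Deliver Products and Services": {                      # L1 -> work stream
        "Plan for and align supply chain resources": [      # L2 -> workshop
            "Develop production and materials strategies",  # L3 -> topics
            "Manage demand for products and services",
        ],
        "Procure materials and services": [
            "Develop sourcing strategies",
            "Order materials and services",
        ],
    },
}

def workshops(framework):
    """Flatten the hierarchy into (work stream, workshop, topics) tuples."""
    return [(l1, l2, topics)
            for l1, l2s in framework.items()
            for l2, topics in l2s.items()]

for stream, workshop, topics in workshops(apqc):
    print(f"{stream} / {workshop}: {len(topics)} discussion topics")
```

The point of the flattening step is simply that each L2 row becomes one plannable workshop with a ready-made agenda drawn from its L3 processes.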

Lastly, our implementation partner did not have a process classification model of its own, so choosing one ourselves was both necessary and easy. Choosing an industry standard over a proprietary model from our partner – which would likely have tied us to that partner – is, I think, the sensible way to go.

Will the choice ultimately make a significant and positive difference to our Dynamics AX implementation? The jury is still out on that one since Microsoft has not actually changed the product to align with the model as far as I can tell, but maybe that is not the point.

The point, for us at least, is that using APQC in the formative stage of an implementation may save you time and frustration and give you a structure on which to build your programme organisation and processes. Whether this is a sustainable advantage or not, I will report on in a future blog post.

Could we have gone with a different framework? Yes, we could for instance have gone with APICS SCOR®, but that model is heavily biased toward supply chain processes, and Microsoft chose APQC, so going with something else did not seem to make a lot of sense.

So far APQC seems the way to go…

Posted in Dynamics AX, Project Management | 2 Comments

2014 in review

The stats helper monkeys prepared a 2014 annual report for this blog.

Here’s an excerpt:

The concert hall at the Sydney Opera House holds 2,700 people. This blog was viewed about 11,000 times in 2014. If it were a concert at Sydney Opera House, it would take about 4 sold-out performances for that many people to see it.


Posted in Uncategorized

Life Left in ERP?


Currently, I am on a tour of the world “promoting” our global ERP programme, and speaking with my colleagues across the world got me thinking:

Why do we actually need an ERP system?

Some while ago I wrote a blog post on best-of-breed applications versus the integrated package, in which I concluded that maybe it is time to call it a day for the integrated package and instead leverage best-of-breed applications. I wrote it to provoke some thoughts on why we all spend gazillions on implementing complex ERP systems when what we often need is agile applications we can easily replace when something new and better comes along.

Now, in the last few months I have had to reflect more deeply on why we still need our ERP suite and why it still makes sense to throw money at it. Here are some of my thoughts…


Master Data Management

Stibo Systems and Informatica will tell you that all you need to mitigate the problem of managing master data across multiple systems is a Master Data Management (MDM) solution. Clearly, these solutions are great at managing information and master data, but at the end of the day, maintaining master data in only one (key) application seems to me to be the better choice. Firstly, users only need to work in and understand one application. Secondly, there is never any doubt what the system-of-record is, and there is no need to reconcile data. Also, in a time-lagged MDM implementation, there is always a risk of operating on out-of-date data. Certainly MDM systems can drive master data processes in a multi-application landscape, but not using the ERP system for most master data seems like more of a hassle to me.


Performance and Transactional Integrity

ERP systems are generally built for a high transactional load. So are many best-of-breed applications such as Warehouse Management Systems (WMS). However, what we often see is that the integration between the systems becomes the bottleneck and requires significant investment to scale out. Also, ERP systems guarantee two-phase commit across transactions. This can be achieved through integration, but it is certainly more difficult. Lastly, to achieve adequate performance through integration, the integration is often implemented using an asynchronous technology, causing a potential time lag.
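The two-phase commit guarantee mentioned above can be sketched as follows. This is a minimal, in-memory illustration of the protocol – a prepare (voting) phase followed by a commit or rollback phase – not how any particular ERP or middleware product implements it:

```python
class Participant:
    """A resource taking part in a distributed transaction,
    e.g. an ERP ledger or a WMS inventory store (illustrative only)."""

    def __init__(self, name, will_succeed=True):
        self.name = name
        self.will_succeed = will_succeed  # simulates whether prepare passes
        self.committed = False

    def prepare(self):
        # Phase 1: do all the work, hold locks, and vote yes/no.
        return self.will_succeed

    def commit(self):
        # Phase 2: make the change durable.
        self.committed = True

    def rollback(self):
        # Phase 2 (failure path): undo any prepared work.
        self.committed = False


def two_phase_commit(participants):
    """Commit only if every participant votes yes in the prepare phase;
    otherwise roll everyone back. Returns True on a successful commit."""
    if all(p.prepare() for p in participants):
        for p in participants:
            p.commit()
        return True
    for p in participants:
        p.rollback()
    return False
```

The design point is the one made in the text: inside one integrated ERP this coordination comes for free from the shared database, whereas across separately integrated applications someone has to build and operate the coordinator.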


Reconciliation

Clearly, the biggest issue with integrating multiple systems is reconciliation. Keeping both master data and transactional balances in sync is a big task and requires quite some manpower and insight.


End-to-End Processes

Key to achieving efficient business processes is the ability to look holistically across the entire process or swim lane, e.g. Order-to-Cash. In an integrated package this is not easy, but at least it is doable. Doing it across multiple applications from different vendors with different service providers is virtually impossible. In a new era where omni-channel is the new “black”, trying to provide the customer with a unified buying experience across separate e-commerce, CRM, ERP and WMS applications might prove more than a little challenging.


Business Intelligence

The fewer your master data repositories and transactional data sources, the easier it is to build and maintain your data warehouse. Simply put, an integrated ERP solution is the data warehouse architect’s best friend.


Audit and Security

When the auditors pay you a visit to audit the accounts, it is clearly a lot easier to have them audit only one system. Also, in terms of application security, it is significantly easier (relatively speaking) to maintain only one set of roles. Often, users are left active in an application long after they have left the company because user deactivation is not automated. This is less of a problem with an integrated solution.


Licensing

Spreading your licence budget thinly across multiple applications and vendors does not give you a lot of leverage when it comes to negotiating licence fees.


Application Management

If you are building your own application management services (AMS) team, most times you would like to reduce the number of applications, because you need to achieve critical mass in relation to skills. The same goes for service providers. Having multiple service providers working together can often be a challenge, and having multiple applications increases this challenge, since a single service provider is rarely able to handle AMS across all applications.


Real-Time Updates

Let’s go back to the ’90s and remember the key selling point for the integrated ERP solution: real-time updates across modules. No need for daily cross-module reconciliation.


Monitoring

Lastly, setting up and maintaining the necessary monitoring for one solution can be quite a task. Doing this for multiple applications and across all the integration points is bordering on a nightmare.

As you can see from the above, there is still something to be said for the good old integrated ERP solution.

On this happy note, I would like to thank you all for reading my thoughts on this blog, and I would like to send my seasonal greetings to you and your families. Due to work commitments, I have not quite managed to put down all my thoughts on this blog in 2014 as I would have liked, but I hope 2015 will allow me more time to share my thoughts on ERP in general and Microsoft Dynamics AX in particular. I would also encourage you to chip in with your views and comments.

Posted in Dynamics AX, Integration, Solution Architecture

Why Upgrade? Time for Continuous Investment Cycles…

This being the summer holiday season and with some time on my hands, I have started reflecting on why we continue to look at ERP in upgrade cycles. Firstly, it is worth considering the word UPGRADE. How many times have I actually participated in a like-for-like upgrade of an ERP system? Probably a handful of times in 20 years; all the other projects have been re-implementations or replacements of obsolete technologies.

So why do we persist with talking about “using standard”, “avoiding modifications” etc. when, at the end of the day, we are highly unlikely to ever upgrade our solutions as such?

Obviously, a key reason is that it is difficult to persuade senior management to invest a high proportion of the company’s profits in a solution that is likely to be replaced after 6-8 years. Talking about upgrades when building the business case seems more appropriate.

Also, by using the upgrade argument to limit modifications, we are able to contain the project scope and therefore speed up the implementation. By avoiding modifications, the chance of success is also higher, because we do not have to regression test added functionality – a highly time-consuming, high-risk activity.

Anyway, back to the main question: Why upgrade?

Very often the key drivers behind an upgrade project are:

  • Obsolete technology.
  • Skills unavailability.
  • Functional shortfalls.

The above challenges seem to drive most replacement projects: the solution has been in use for a long time and probably reached its expiry date a while back, but the organisation has been unable to find the necessary resources to carry out the project.

There are a number of other reasons, but in the main, businesses seem only to invest in ERP upgrades when forced to. This would indicate that vendors and service providers are not able to sell upgrade projects on added business benefits – which also ties in with my admittedly anecdotal experience.

So where does this leave an IT leader looking to keep this core part of the IT application landscape up-to-date?

Clearly, in many companies ERP is no longer considered to contribute significantly to generating a competitive advantage (unlike in the golden ’90s era), so IT investment is channelled toward other technologies. Whether this is fair or not, I will leave to others to judge. Also, most mainstream ERP packages today provide sufficient functionality to adequately support most key business processes, so businesses are unlikely to require additional functionality.

ERP vendors are then seeking to add non-core features such as Business Intelligence suites, eCommerce sites, retail capabilities etc. to tempt customers to upgrade, but often businesses already have solutions in these areas and the benefits are not seen to outweigh the massive upgrade investment.

With this in mind, we should ask the question: with ERP in the cloud, will we be entering an era of buying into a mainstream vendor technology stack and staying with it for 20+ years? In a recent blog post, I shared some of my thoughts on the current state of cloud ERP. Although I was slightly hesitant to buy into the cloud idea wholesale, I can definitely see us moving into the cloud within the next few years. Whether this will, as some argue, make the issue of upgrading redundant because the software will be continuously updated, remains to be seen. As most enterprise cloud deployments will be customer-specific private-cloud affairs, I still think we will see major upgrades with the need for full regression testing.

Some while ago, in another blog post, I argued for using best-of-breed packaged software to augment the core ERP solution. In many ways, using additional software packages allows you to upgrade parts of the application landscape without having to upgrade the core ERP solution. In future, hopefully, we will see Dynamics AX becoming increasingly loosely coupled, so that we will be able to upgrade individual modules and run different versions of modules side by side.

So, at the end of the day, I would argue that instead of talking about ERP projects and upgrades, we should talk about investment cycles, where the business continuously invests in the application backbone running the business, namely the ERP solution. That would also mean the business maintaining a sustainable investment in skills, capabilities and vendor relationships, to ensure that the ERP solution is maintained to a high standard and kept up-to-date. This way, I would argue, the business is able to save money in the long run, and the outcome is a more robust solution with more predictable costs. It is time to acknowledge that without ERP a business would struggle to process key business transactions, and that an ERP solution cannot be maintained on the cheap.

These were my summer musings on the theme of upgrades – please enjoy the summer…

Posted in Cloud Computing, Deployment, Dynamics AX

Dynamics AX on iPad

On an interesting note: having deployed Dynamics AX 2012 R3 through Azure, I managed to connect to the server and use the Dynamics AX client on my iPad. To do this I used the Remote Desktop client for iPad. Actually, it worked surprisingly well, even though Dynamics AX certainly isn’t optimised for touch. It is really quite surreal to run Windows 8 on an iPad, albeit indirectly.

Posted in Uncategorized

My First Adventure With Lifecycle Services for Microsoft Dynamics

Microsoft Lifecycle Services for Dynamics

This blog post is a short essay describing my first adventure with Lifecycle Services for Microsoft Dynamics (LCS), as I believe the solution is now called. Before I go into detail with my first experiences, I would like to give Microsoft some credit for:

  • Investing so heavily in what has been an obvious missing link when it comes to implementing Microsoft Dynamics – especially in the enterprise segment where competitors have had these tools for a while.
  • Recognising that the tool should be customer-centric rather than partner-centric.
  • Having the courage to focus on some of the hard stuff e.g. business process modelling.

Start Using LCS

Before you can start using LCS, you will need a current Dynamics maintenance plan and a Microsoft Live account associated with your Dynamics account. The first time you log in, you are presented with an Online Services agreement. Obviously, you are supposed to read this agreement in detail, but here are a few important points I have picked up:

  • The agreement grants Microsoft a non-exclusive right to use and modify the data you upload.
  • The customer (you) retains all ownership rights to the data.

With the button pressed and the agreement out of the way, we are ready to start using the solution.


Creating a Project

The whole solution is based around the idea of working with one or more projects. Apart from filling out basic details such as project name and description, you must then select the type of project you are about to create. You can select the following project types:

  • Implementation.
  • Upgrade.
  • Maintenance.

The difference between the three project types lies in the functions available. For instance, the only project phase available in a Maintenance project is Operate, which makes sense.

Once the project has been created, you are presented with this main screen:

LCS Project Overview

Before starting the project definition, some basic settings must be configured using the Settings button:


A key decision under Settings is to decide if your documents should be stored in LCS or on an external site. In case you use an external site, you must provide a valid URL.

The first step after the basic settings have been configured is to set up your project team. To do this, you click on the Team button

LCS Team

and start adding team members. You can give each team member a distinct role, allowing them different levels of permission to change the project structure and contents.

If required, you can use the Role management button

LCS Role Management

to create new roles with unique permissions.

With the basic configuration of the tool out of the way, it is time to start setting up your project.

Project Phases

Within the Phases section of the tool, the project has been divided into the five Sure Step phases.

Analysis Phase

When clicking on one of the phases, you are presented with a checklist outlining the key activities of the Sure Step phase, a set of tools relevant to the phase and a project status indicator. The checklist content can be edited to suit your specific project.

From within each phase (or from the main menu) you are able to edit the project plan:

Project Plan

As the above picture shows, the project plan is a simple start-end date for each phase.

From the main menu you are able to produce a simple project summary:

Project Summary

This summary gives you a basic overview of the project including:

  • Overall project status.
  • Work item statistics.
  • Key project wins.
  • Outstanding work (checklist items).
  • Risk register.

You are also able to create issues of type Project Bug, Work Item or Risk. You can perform some simple tracking on the issues, but as far as I can tell, you are not able to assign them to any project activities, so really it is just a simple list.

The Business Modeller

The Business Modeller is the heart of LCS. With the Business Modeller, you are able to model your business processes all the way down to task level. Microsoft provides 140 standard processes out of the box (as of June 2014) based on the APQC framework.

Business Modeller - Models

When you open the Business Modeller, you are presented with your business process libraries. The libraries provided by Microsoft appear on the right-hand side, while on the left-hand side you can create your own libraries by uploading the structure from a spreadsheet.
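To give an idea of what such an upload involves, here is a hypothetical library hierarchy laid out as flat, spreadsheet-style rows and parsed back into a nested structure. The column layout below (one column per level) is an assumption for illustration – the actual template LCS expects may differ:

```python
import csv
import io

# Hypothetical spreadsheet rows: one row per lowest-level process,
# one column per hierarchy level. Names are illustrative only.
rows = """\
L1,L2,L3
Deliver Products and Services,Procure Materials and Services,Order Materials
Deliver Products and Services,Procure Materials and Services,Receive Materials
Manage Financial Resources,Process Accounts Payable,Approve Invoices
"""

def load_library(text):
    """Build a nested dict {L1: {L2: [L3, ...]}} from the flat rows."""
    library = {}
    for row in csv.DictReader(io.StringIO(text)):
        library.setdefault(row["L1"], {}) \
               .setdefault(row["L2"], []) \
               .append(row["L3"])
    return library

library = load_library(rows)
```

The flat-rows-with-repeated-parents layout is the usual way a hierarchy survives a round trip through a spreadsheet, which is presumably why LCS takes its custom libraries that way.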

As for the APQC out-of-the-box processes: as a starting point I guess they are helpful, but they do not fully reflect the complexities and quirks of most enterprises.

As the following picture shows, the Business Modeller contains a process diagram tool, which allows you to draw business processes and attach certain meta-data to the process.

Business Modeller Diagram

The tool is very basic compared to Microsoft Visio and is unlikely to support the requirements of most organisations. Unfortunately, it is not possible to link process diagrams at present.

From within the diagram you are able to open the associated Dynamics AX menu item. Also, you are able to link the process element to a gap in the gap list. However, so far I have not been able to find where the gap list is kept.

Another way to augment the process model is to record a certain process or activity using the task recorder:

Task Recorder

The Task Recorder allows you to add sub-processes to an existing process framework and record the process for subsequent upload to the Business Modeller. This is a very effective and intuitive way to build process diagram elements. Also, the Task Recorder will allow you to generate Microsoft Word documents detailing the recorded process as shown in the following picture:

Task Recorder Document

It is unlikely that the document can be used straightaway for training or test documentation, but it is indeed an easy way to generate a good starting point.

Once you have finished working on the processes in the Task Recorder, you will need to generate a process package and upload it to the Business Modeller in LCS for it to take effect. This is a straightforward, albeit slightly cumbersome and time-consuming, process.

The task recorded process is added to the process model as a video.

To me, one of the key challenges with the Business Modeller is the lack of a proper draft/publish workflow. The tool is version-controlled in as much as it records a new version every time you change a process, but any change to a process diagram still takes effect immediately. You can mitigate this problem, to a degree, by keeping two copies of the same process framework and copying from the “development” framework to the “live” framework when you want to deploy a revision.

Upgrade Analysis

Upgrade Analysis

With the Upgrade Analysis tool, you are able to upload multiple .AOD files from a Dynamics AX 4.0 or 2009 environment and let LCS analyse the modifications made to the system. The solution is straightforward, and to me it is preferable to the equivalent tool embedded in the Dynamics AX software, because it always contains the latest revisions to the metadata model.

Provisioning a Dynamics AX 2012 R3 Azure Environment

One of the key features of Dynamics AX 2012 R3 is the ability to execute on Azure. It is worth noting that at present this feature is not available for production environment deployment, but you are still able to deploy full demo environments.

Before you start provisioning a new environment, you need to sign up and create an account in Azure. It is beyond the scope of this blog post to go into the mechanics of creating an Azure account, but you will need your account ID to be able to provision a Dynamics AX 2012 R3 environment on Azure.

To start the provisioning process, you click on the

Cloud Hosted Environments

icon. This kicks off a very simple and self-explanatory process, which, if all goes well, ends in a new environment. This picture shows an overview of my environments.

Environments Overview

You simply provision a new environment by clicking <+>.

For me, all went well straightaway, and the simplicity and speed of the process is impressive. As the following picture from the Azure administration portal shows, the environment is now deployed:

Azure Administration Portal

You can connect to the environment using Remote Desktop Services (RDS). In the limited tests I have been able to do so far, the environment works well and is stable, but fast it isn’t. Obviously, one of the key points going forward with using Dynamics AX on Azure will be the ability to scale the performance of the solution from the Azure administration portal.

We are strongly considering using Dynamics AX on Azure for development and test purposes. However, to do this we still need to figure out exactly how it will work with AD federation and integration to an on-premise Team Foundation Server (TFS) – so more about this in a future blog post.

Issue Search

Issue Search

The Issue Search feature allows you to search for a solution to a known issue and download the hotfix.

Usage Profiler

Usage Profiler

In the Usage Profiler you are able to enter or upload data relating to the use and complexity of your installation. Based on this data, LCS can generate an estimated sizing as shown in the following picture:

Sizing Estimate

As far as I can tell, the estimates seem quite sensible, but I will need to gain more experience with this feature before making a final judgement. Also, I have not been able to associate the analysis with the Business Modeller. More on this in a future blog post.



RapidStart

RapidStart is not a direct feature of LCS as such, but a separate tool that allows you to configure Dynamics AX using a questionnaire-style tool set. You are able to link your LCS project with a project in RapidStart. I will go into detail with RapidStart in a future blog post, but my impression so far is that Microsoft’s intention with RapidStart is good, but the product has been neglected for a while.

License Sizing Estimator

License Sizing Estimator

The license sizing estimator allows you to enter the number of employees by role by department as shown in the following picture:


Once all employees have been recorded, it is possible to generate the following report, which shows the necessary license CALs:

License Report
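The arithmetic behind such a report can be sketched as a simple aggregation from headcount per role to counts per CAL type. The role-to-CAL mapping below is assumed purely for illustration – the actual mapping comes from the Microsoft Dynamics AX licensing guide, not from me:

```python
# Assumed, illustrative mapping from user roles to Dynamics AX 2012
# CAL types; the real mapping is defined in Microsoft's licensing guide.
ROLE_TO_CAL = {
    "Accountant": "Functional",
    "Warehouse worker": "Task",
    "Sales manager": "Enterprise",
    "Employee (expenses only)": "Self Serve",
}

def estimate_cals(headcount_by_role):
    """Aggregate employees per role into a count per CAL type.
    Unknown roles default to the highest (Enterprise) CAL to stay safe."""
    totals = {}
    for role, count in headcount_by_role.items():
        cal = ROLE_TO_CAL.get(role, "Enterprise")
        totals[cal] = totals.get(cal, 0) + count
    return totals
```

Run per department and summed up, this is essentially what the License Sizing Estimator appears to do with the numbers you enter.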

LCS in the Cloud

LCS is only provided as a cloud service. Obviously, the key advantage of this approach is that it is easy to deploy and does not require any local installation apart from a connector between LCS and the Dynamics AX environment. However, with LCS being provided in the cloud, there are also some drawbacks in my view.

Firstly, Microsoft deploys improvements to the solution on a regular basis. The Dynamics LCS blog is used to communicate the changes, but in my view this agile approach will not work well with ongoing projects. Changes in solution behaviour during a critical phase of your project are probably not what you need, so I would not be encouraged to use LCS for any mission-critical projects.

Secondly, you must be online to use LCS. Mostly, we are all online these days, but when you are not, it can be a right pain.

Currently, LCS only supports the Internet Explorer (IE) browser. It will run on IE9, but if you would like to provision Dynamics AX through Azure or upload an .AOD file larger than 4 MB (Upgrade Analysis tool), you will need IE10. It would appear that Microsoft still hasn’t recognised that not all of us are using IE all of the time.

Would We Go With LCS Right Now?

Certainly, we will be using the tools in LCS. The Upgrade Analysis, the License Sizing Estimator and the Usage Profiler are fine. However, the solution has a pretty strong whiff of BETA about it, so we are unlikely to go all in – just yet… In future, definitely!

However, for me the biggest challenge is with the Business Modeller. Although I can see the intention behind the tool, it is still too simplistic to be used as a general business process modelling tool. When Microsoft has addressed the shortcomings and the missing features, I am sure it will become the de facto tool for all Dynamics projects.

The Projects feature of LCS is nowhere near mature enough to provide adequate project management support – at least not for enterprise-level projects. There are many other tools out there today that provide this type of functionality, so until Microsoft provides a full-featured tool set, I will be staying with my other products.

What Should the Future hold for LCS?

If Microsoft persists with the Projects feature, they should make sure it becomes a whole lot more operational. This would mean adding things like:

  • Integration with Microsoft Project.
  • Proper activity management including SCRUM-like artefacts such as backlogs.
  • Better reporting features.
  • Ability to export lists, reports and artefacts to PDF and Excel.
  • Change request management tool.

We will continue to monitor the progress of the Cloud Hosted Environments tool and leverage that as new features become available.

Our conclusion is this: LCS shows promise and, yes, it is the future of Dynamics AX implementation and operation, but right now it is not quite mature enough to base an entire implementation on.

Posted in Cloud Computing, Deployment, Dynamics AX, Project Management | 7 Comments

Merry Christmas

Christmas 1

Dear Reader,

For those of you who celebrate Christmas, I would like to wish you a merry Christmas and a very Happy New Year. If you do not celebrate Christmas, please accept my seasonal greetings.

I hope you have found some of my postings in 2013 helpful or at least amusing. The blog reached 10,000 views in 2013 and I hope we can keep it up in 2014. Please leave comments on my posts for the benefit of the community reading this blog.

Thanks for your support in 2013!

Kind Regards,

Henrik Marx Larsen

Posted in Uncategorized