My blog is now syndicated on the Dynamics 365 Enterprise edition community site

Dear Reader,

I am very happy to announce that my blog is now also available on the Microsoft Dynamics 365 Enterprise edition community site. You can find it here:

A practitioner’s views on Dynamics 365 for Operations and Finance

You can also explore the Dynamics 365 Enterprise edition community site here:


Posted in Uncategorized | Leave a comment

Dynamics 365 for Operations and Finance application sealing is the best thing…

I will start by being straight about where this blog post is heading:


And why is that?

  1. Firstly, from a purist developer’s point-of-view I have never liked overlaying. Overlaying code was never truly object-oriented and introduced all sorts of issues. I will admit that the way MorphX and X++ worked, overlaying was the easiest way to get things done and, in some cases where the class model was particularly bad, probably the only way to support certain scenarios.
  2. In the past, instead of saying no to a client, we have developed overlays, which we probably shouldn’t have. Customisations to tax calculations, complex rules introduced into the MRP engine and significant breaks to the natural flow of invoicing. Ring a bell?
  3. Regression testing overlayed code was nigh-on impossible sometimes and the quality suffered badly in some projects.
  4. Overlaying was never truly based on best-practice patterns but very much based on the individual developer’s style.
  5. With massive customisations based on overlaying, upgrading to a new major release was probably always an illusion.

So, I am not mourning the “death” of overlaying.

However, I am not blind to the challenges presented by moving to extensions. Just to mention a few:

  1. There will be customisation scenarios we are unable to deliver.
  2. The pre-post eventing model may seem a bit restrictive.
  3. The APIs we need may not be ready or may not quite support the scenario we are working with.
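
To illustrate, the pre/post eventing model can be sketched conceptually, here in Python rather than X++ and with purely illustrative names, as a broker that runs registered handlers around a sealed standard method:

```python
# Conceptual sketch of pre/post eventing: instead of overlaying (editing)
# standard code, an extension registers handlers that run before or after it.
# All names here are illustrative, not actual D365O APIs.

class EventBroker:
    def __init__(self):
        self.pre, self.post = {}, {}

    def subscribe_pre(self, name, handler):
        self.pre.setdefault(name, []).append(handler)

    def subscribe_post(self, name, handler):
        self.post.setdefault(name, []).append(handler)

    def run(self, name, func, *args):
        for h in self.pre.get(name, []):   # pre-handlers may inspect the arguments
            h(*args)
        result = func(*args)               # the sealed, standard implementation
        for h in self.post.get(name, []):  # post-handlers may react to the result
            h(result, *args)
        return result

broker = EventBroker()

def post_invoice(amount):
    """Stands in for sealed standard application logic."""
    return {"amount": amount, "status": "posted"}

# An "extension" adds audit logging without touching the standard code:
audit_log = []
broker.subscribe_post("post_invoice",
                      lambda result, amount: audit_log.append(result["status"]))

invoice = broker.run("post_invoice", post_invoice, 100.0)
```

The restriction is visible in the sketch: the extension can act around the standard method, but it cannot rewrite what happens inside it, which is exactly the trade-off discussed above.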

However, all things considered, I think the ISV and VAR communities will benefit from extensions.

  • Projects will increasingly be based on standard functionality.
  • Customisation quality will improve.
  • Projects will become more predictable and therefore more profitable.
  • New features introduced by Microsoft can be leveraged by clients (potentially increasing revenue streams).
  • ISV solutions will be easier to leverage.

Only time will tell whether my optimism is well-founded or extensions will become a menace to the ISV / VAR community, but I remain optimistic and believe those of us who are passionate about ERP and Dynamics 365 for Operations and Finance still have some great times to come… Bring it on!


Posted in Uncategorized | 2 Comments

What’s in a cutover plan?

In the past I have seen a lot of ERP project plans with a single-line entry called “go-live”, “cutover” or “final migration”. As I have argued in a previous post, this phase includes the cutover activity, which must be planned in great detail and executed with precision.

Since I wrote that post, quite a few people have asked me:

“What should be in a cutover plan?”

In the following, I have tried to answer this question.

Plan details

First of all, the cutover plan must contain a detailed list of all activities and their dependencies.

For each item in the cutover list I would normally record the following data:

  • Status. In my experience the following statuses cover most scenarios:
    • Pending.
    • Started.
    • Success.
    • Parked.
    • Failed, cutover continues.
    • Failed, cutover terminated.
  • Type of activity. In my experience the following types cover most scenarios:
    • Automated script.
    • Manual data entry.
    • Application configuration.
    • Infrastructure configuration.
    • Physical work (e.g. change barcode labels on bins in the warehouse).
    • Random control.
    • Reconciliation.
    • Checkpoint.
  • Owner (who owns the activity and must report back to the cutover manager?).
  • Planned start time.
  • Estimated duration.
  • Actual start time.
  • Actual duration.
  • Decision-maker (the person who can make the decision to proceed or not if an activity fails).
  • Informed (the list of people who must be informed about this specific activity).
  • Undo / Redo (a description of how to undo or redo an activity).
  • Instruction (if an activity requires the owner to test a specific outcome or reconcile some data, a link to the instruction for this activity should be included).
  • Dependencies (a list of predecessors to this activity – especially if activities are being executed in parallel).

Depending on your specific needs you may need other fields in the list, but these have served me well in the past.
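
To make the structure concrete, here is a minimal sketch in Python of what a record in such an activity list could look like. The field and status names simply mirror the list above, and the sample identifiers are hypothetical:

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional

class Status(Enum):
    PENDING = "Pending"
    STARTED = "Started"
    SUCCESS = "Success"
    PARKED = "Parked"
    FAILED_CONTINUE = "Failed, cutover continues"
    FAILED_TERMINATED = "Failed, cutover terminated"

class ActivityType(Enum):
    AUTOMATED_SCRIPT = "Automated script"
    MANUAL_DATA_ENTRY = "Manual data entry"
    APPLICATION_CONFIG = "Application configuration"
    INFRASTRUCTURE_CONFIG = "Infrastructure configuration"
    PHYSICAL_WORK = "Physical work"
    RANDOM_CONTROL = "Random control"
    RECONCILIATION = "Reconciliation"
    CHECKPOINT = "Checkpoint"

@dataclass
class CutoverActivity:
    id: str
    description: str
    activity_type: ActivityType
    owner: str
    decision_maker: str
    planned_start: str                       # e.g. an ISO-8601 timestamp
    estimated_duration_min: int
    status: Status = Status.PENDING
    actual_start: Optional[str] = None
    actual_duration_min: Optional[int] = None
    informed: List[str] = field(default_factory=list)
    undo_redo: str = ""
    instruction_link: str = ""
    dependencies: List[str] = field(default_factory=list)

    def ready_to_start(self, completed: set) -> bool:
        """An activity may start once all its predecessors have succeeded."""
        return self.status is Status.PENDING and set(self.dependencies) <= completed

# A hypothetical reconciliation activity that depends on activity "A1":
reconcile = CutoverActivity(
    id="A2",
    description="Reconcile opening balances",
    activity_type=ActivityType.RECONCILIATION,
    owner="Finance lead",
    decision_maker="Cutover manager",
    planned_start="2017-06-30T22:00",
    estimated_duration_min=45,
    dependencies=["A1"],
)
```

In practice a spreadsheet serves the same purpose; the point is simply that each activity carries its own status, ownership and dependency information so the cutover manager can see at a glance what may start next.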


In my experience it can be useful to place checkpoints at appropriate points throughout the schedule. If your list of activities is very extensive, some activities may fail, but the decision-maker decides to move forward anyway. Some activities may take longer to run, or some unforeseen event may challenge the overall plan. The checkpoints can be used to gather key stakeholders, give them an overview and discuss whether and how to proceed. If your activity list is short and the process well-proven, you probably do not need checkpoints.

Fallback plan

Ideally, the cutover plan should be accompanied by a plan that describes how you will fall back in the event of failure. It is difficult and cumbersome to align the fallback plan to each activity in the cutover plan, so normally I align the two plans at checkpoint level. That also allows me to have a more thorough discussion with key stakeholders before deciding to fall back.


Key to a successful cutover is the ability to keep everyone involved informed at all times. The simple way is to give everyone access to the activity list (spreadsheet) so they can open it periodically to have a look. Alternatively, the cutover manager may send out the updated list at each checkpoint. On larger projects I have been involved with, the cutover manager has managed the list in SharePoint with a BI tool on top to allow users to see progress in a graphical format.

Using the cutover plan for trial migrations

The last topic I would like to touch on is this: in the past I have found that developing the cutover plan as part of the ongoing trial migrations can be extremely useful. By basing each trial migration on the provisional cutover plan, you also start testing and proving your plan. Very often I have seen trial migrations focus on the technical side of migrating data without ensuring that this migration is aligned with manual and reconciliation activities. By going through the cutover plan each time you perform a trial migration, you prove the plan and start building more accurate timings.

All in all, this is not an out-of-the-box template for a cutover plan, but I hope this gives you enough inspiration to build your own plan based on the specific needs of your projects.

Good luck cutting over…

Posted in Data Migration, Dynamics 365 for Operations and Finance, Project Management | Leave a comment

Is Digital Transformation the End for ERP?


For many years, ERP has been the glue that has connected business processes within a company and supported business process extension to external users. In recent years, ERP’s hegemony as the “system-to-bind-them-all” has come under pressure and digital transformation seems to be accelerating this change.

Since there is no formal definition of digital transformation, for the purpose of this blog post, I would like to define digital transformation as:

  1. Pervasive use of technology across business processes.
  2. Use of technology to connect external users directly to the company’s business processes.
  3. Implementation of business and operating models based on the use of technology.
  4. Leveraging technologies that go beyond simple process optimisation such as predictive analytics, APIs, machine learning etc.

For many companies, digital transformation may mean implementing some of the above, but not all. However, it is unlikely that a business should not do any of these in some form.

ERP as the Pervasive Technology

As long as I can remember, people have argued that efficiencies were found by implementing all business processes in a single, integrated software solution, namely ERP. Maybe this was true before cloud and the democratisation of integration technologies, but nowadays connecting Salesforce CRM to your ERP system is hardly rocket science, nor does it take months – merely days or weeks. Also, people are increasingly weighing the advantages of fully integrated processes against the need for speed and agility. Going into the era of digital transformation, I think we need to look at ERP as a landscape of connected business applications rather than a monolithic one-stop shop. This is likely to challenge both IT departments and application management outsourcers because, instead of focusing core skills on a single solution, they now need to understand and support multiple solutions – or leave it to “digital citizens” in the business to pick up this role. Certainly, digital transformation is likely to mean change in the way IT and outsourcers relate to business stakeholders.

Connecting External Stakeholders

Until now, ERP has predominantly been about supporting and optimising internal processes. Anything to do with external users has been handled at arm’s length through portals or similar technologies. This is not going to work in future. To become fully digital we need to connect all users through relevant technologies, apps, that are similar to what they use in other aspects of their digital life – or they will take their business elsewhere. ERP people have traditionally not understood or appreciated the need to leverage apps and social media as part of the business processes, so here we are facing a major hurdle if we are to stay relevant in the digital transformation. Actually, the technology is there. What we need now is consultants and service providers who can deliver on the opportunities.

New Business Models

Transforming ERP is always difficult and filled with risk. Therefore, the ERP area is inherently conservative. We have traditionally been very good at optimising existing processes or, on a good day, suggesting a new process, but expecting the ERP community to invent and drive through new business models may be “a bridge too far”. However, as new business models are being trialled and matured, ERP needs to become more responsive to earn a place in the future digital reality – or run the risk of becoming obsolete.

Leveraging New Technologies

It is fair to say that ERP vendors are currently doing their level best to promote new technologies such as machine learning as complementary to their ERP packages, but so far it seems more tentative than pervasive. However, I believe that this is where the biggest opportunity lies for ERP in the digital era. If, for example, we can leverage machine learning to truly optimise internal business processes such as AP automation, maybe ERP as we know it still has a future, but it will require a new set of thinking for ISVs. I am pretty sure that the ERP vendors will offer these technologies as-a-platform, but leave the actual implementation to ISVs and VARs. In many ways this is an understandable approach, but the risk is that independent software providers will quickly seize the initiative and offer this as an API in the cloud.

We Need to be Agile

With the technologies for digital transformation coming on-stream and maturing, the emphasis now is on how we deliver. The traditional approach with year-long, risk-filled transformation programmes may still be relevant for core ERP, but to stay relevant in future we need to find a way to trial and mature new processes, business models and approaches on a smaller scale and in a more agile fashion. Success, I think, relies on better change and risk management, not on technology-focused methods. I am sure I will come back to this theme in a future post.

In the above I have shared my current thoughts on where I see ERP’s future in the digital era and how ERP can be part of the digital transformation. Beyond core business processes, I still think ERP can play an important role in future, but now is the time for service providers (ISVs, VARs and outsourcers) to step up and provide customers with forward-thinking solutions that align with the new business models delivered through fast and agile methods. Or risk becoming obsolete!


Posted in Uncategorized | Leave a comment

Integration With Dynamics 365


In recent weeks I have been in a number of discussions on how to support various integration scenarios through Dynamics 365 for Operations (formerly known as Dynamics AX and hereafter referred to as D365O). Let’s be clear from the outset: I consider D365O a pure cloud-delivered solution, so I am not considering any potential on-premise scenarios in this blog post.

With D365O being a cloud-delivered solution, and with (virtually) all customers currently running most of their other applications on-premise, a key challenge is how we move data to and from Azure in this hybrid scenario. The Help wiki for D365O is not flush with information on how to do this. However, a couple of things can now be ascertained in relation to how we integrate:

  • Peer-to-peer integration (or system integration) is supposed to be performed using web services, either through the SOAP or the REST protocol.
  • Recurring integration scenarios are supported through processing of Data Packages configured in Data Projects.
  • All integration revolves around the data entity concept.


Let’s deal with the simpler form of integration first. Already in Dynamics AX 2012 we were using SOAP-based web services, and this does not appear to have changed much in D365O. In addition to SOAP-based web services, D365O now also supports REST, allowing us to exchange JSON messages and perform CRUD operations using OData (you can read more about the OData protocol here).
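
As a rough illustration of what an OData call against such a service involves, this sketch builds a query URL with `$select`, `$filter` and `$top` options. The host name and entity name are invented for the example, and authentication (OAuth bearer tokens in the real service) is omitted:

```python
from urllib.parse import urlencode

def odata_query_url(base_url, entity, select=None, filter_expr=None, top=None):
    """Build an OData query URL for a data entity.

    The base URL and entity name passed in by callers are assumptions for
    illustration, not a documented D365O endpoint.
    """
    params = {}
    if select:
        params["$select"] = ",".join(select)
    if filter_expr:
        params["$filter"] = filter_expr
    if top is not None:
        params["$top"] = str(top)
    # Keep the "$" of the OData system query options unencoded for readability.
    query = urlencode(params, safe="$")
    return f"{base_url}/data/{entity}" + (f"?{query}" if query else "")

url = odata_query_url(
    "https://example.operations.dynamics.com",
    "Customers",
    select=["CustomerAccount", "Name"],
    filter_expr="dataAreaId eq 'usmf'",
    top=10,
)
```

A GET against a URL of this shape would return the matching entity records as JSON; POST, PATCH and DELETE against the same entity path cover the rest of the CRUD operations.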

For more information on supported services, see this Help wiki article.

Conceptually, integration with D365O revolves around the data entity. The normalised database schema is aggregated into a number of high-level logical units called data entities to hide the physical implementation and make life easier for developers, who then do not need to understand the underlying schema. An example is the Customers data entity, which maps the simple concept of a customer onto multiple tables in D365O.
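
Conceptually, a data entity is a denormalising view over the schema. The sketch below mimics this with invented, heavily simplified tables (the real D365O schema is far larger) to show how one flat Customers record can be assembled from several underlying tables:

```python
# Toy stand-ins for normalised tables; names and fields are illustrative only.
cust_table = {"C001": {"account": "C001", "party_id": "P10", "group": "30"}}
dir_party_table = {"P10": {"name": "Contoso Ltd", "address_id": "A77"}}
address_table = {"A77": {"city": "Seattle", "country": "USA"}}

def customer_entity(account):
    """Join the underlying tables into a single flat 'Customers' record,
    which is essentially what a data entity exposes to the outside world."""
    cust = cust_table[account]
    party = dir_party_table[cust["party_id"]]
    address = address_table[party["address_id"]]
    return {
        "CustomerAccount": cust["account"],
        "CustomerGroup": cust["group"],
        "Name": party["name"],
        "City": address["city"],
        "Country": address["country"],
    }

record = customer_entity("C001")
```

An integrating system only ever sees the flat record; where the name or the address physically lives is the entity's problem, not the caller's.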

This type of integration makes it relatively simple to integrate synchronously with D365O across a wide range of apps and systems. I have not been able to find a description of how Microsoft intends to scale out the service, so that remains to be seen.

To enable easy integration across the Dynamics 365 stack and Office 365, Microsoft is developing the Common Data Model (CDM), which is similar to data entities but spans all the applications in the ecosystem. This article by Jukka Niiranen is a pretty good introduction to CDM. As the article indicates, CDM is intrinsically linked with Flow.


With synchronous integration taken care of through web services, let’s turn to asynchronous integration. Data import and export is configured through the Data Management Workspace:


In the Data Management Workspace you are able to configure multiple data sources. Out-of-the-box data sources include D365O itself, Excel and CSV. You can reconfigure these or you can add new ones (also based on ODBC and XML). In this example I am creating a Data Project that exports all exchange rates to an Excel file:


In the data project, I can select one or more data entities to export and determine the sequencing. This means I can export all currencies before I export all exchange rates, which makes sense.
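
This sequencing requirement is essentially a dependency-ordering problem: an entity must be processed after the entities it references. Here is a small sketch, with entity names assumed for illustration, that orders entities so that prerequisites always come first:

```python
def sequence_entities(dependencies):
    """Return entities ordered so each comes after its prerequisites.

    `dependencies` maps entity -> list of entities it depends on.
    A depth-first topological sort; raises on circular dependencies.
    """
    ordered, visiting, done = [], set(), set()

    def visit(entity):
        if entity in done:
            return
        if entity in visiting:
            raise ValueError(f"Circular dependency involving {entity}")
        visiting.add(entity)
        for dep in dependencies.get(entity, []):
            visit(dep)           # emit prerequisites first
        visiting.discard(entity)
        done.add(entity)
        ordered.append(entity)

    for entity in dependencies:
        visit(entity)
    return ordered

# Exchange rates reference currencies, so currencies must be exported first:
order = sequence_entities({
    "Exchange rates": ["Currencies"],
    "Currencies": [],
})
```

The Data Management framework lets you set this sequencing by hand in the data project; the sketch just shows why the order is not arbitrary.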

Once I have saved the data project, I am able to set up a recurring data job:

The following screenshot shows the exchange rates Excel spreadsheet generated by the export:


Once I am done with the Data Project I download the Data Package file, which defines the structure of my Data Project.

Now I am able to import the data into an instance of D365O by creating an import Data Project and using the Package data source format. This can also be set up to happen on a recurring basis. Pretty simple stuff, really.


This simple example shows how easy it is to configure and execute recurring import / export jobs using the Data Management framework and data entities. However, since D365O runs in the cloud (Azure), we need to get data into or out of Azure to be able to process it. Probably the simplest way is to use the Azure Service Bus. Transferring data to and from the Azure Service Bus, however, will require some sort of broker service such as BizTalk, but that discussion is for another day… I think, at this stage, it is fair to say that D365O offers good tools for integration, but operating in a hybrid world will require some extra work for the foreseeable future.

Posted in Uncategorized | Leave a comment

ERP – What is the Right Approach?

One of the key questions asked at the beginning of every ERP project I have taken part in is:


Since ERP (or MRP) implementations have been around at least since the early ’70s (and probably before), you would have thought that a best-practice approach would have been identified by now and thoroughly documented for all to use, but in my experience this is not the case.

In the last 20+ years, I have seen widely varying approaches to ERP implementation ranging from “make-it-up-as-we-go-along” to minutely specified implementations with detailed plans and rigid change control. The question is: what is the right approach?

In my experience, the right approach has a lot to do with getting the balance right between:

  • Documentation.
  • Planning granularity.
  • Effective and efficient communication.
  • Risk and flexibility.

In the following, I have shared some of my thoughts on the different approaches I am familiar with.


However, before I start getting into the meat of my musings, I would like to say a few words about the “We Want Standard” statement that is being uttered at every kick-off on every ERP project ever undertaken. Obviously, using the standard solution as far as possible makes sense for the following reasons:

  • Allows future upgrades.
  • Reduces number of defects.
  • Reduces implementation time.

So, if everyone agrees that a company should take the standard solution and this approach obviously is the most efficient, why do we need a (sometimes) very lengthy analysis and design activity? Would it not make more sense to get straight into the build phase and start configuring the solution and migrating data?

Firstly, not all companies are the same, so some customisation is required to make any solution fit the business. Period! Secondly, the ERP solution often needs to be integrated with a number of other software packages, and this requires bespoke development.

Lastly, analysis and design is not only about customisation and development. Analysis and design, conducted properly, should also bring these additional benefits:

  • Give key users a thorough understanding of the solution.
  • Allow the company to revisit current practices and processes and design future (and hopefully improved) processes based on the new solution.
  • Thoroughly plan for and design data migration and cutover.

In my experience, the value of good analysis and design should not be underestimated.


This is, as many of you know, the classic approach to software implementation and dates back to the ’50s. You can read more about it in this Wikipedia article. I would wager that this is how the majority of ERP implementations are being carried out today. Not necessarily in its purest form, but through some variation on the theme.

The cornerstone of the waterfall methodology is the Requirements Specification (The Spec). The Spec is a document that is supposed to capture all functional and non-functional requirements for the solution. With The Spec, the customer and supplier are able to perform a User Acceptance Test (UAT) at the end of the implementation and clearly ascertain whether each requirement has been delivered – to spec.

In theory this is a pure and simple way of implementing software. It lends itself nicely to rigid contracts, even fixed price, because all requirements are there for all to see and evaluate. However, requirements are in reality often vague and ambiguous and difficult to test.

The main problems with ERP implementations and the waterfall method, in my experience, are:

  • In its purest form, the waterfall approach requires very thorough documentation of requirements.
  • The customer is often not able to clearly express all requirements leaving room for gaps and interpretation in the contracts.
  • Since 90% of the ERP solution is a standard package, The Spec will inevitably focus on the gaps and not document fit requirements well.
  • If The Spec remains unchanged throughout the implementation, the customer is unlikely to end up with the system they want.

Historically, The Spec has been used as the key document for tenders (ITTs, RFPs and RFQs) allowing potential suppliers to tender for the “same” solution. However, experience tells us that tendering for the same solution based on The Spec may still throw up wildly different proposals and offers from suppliers. This, in my experience, is because any written requirement allows for interpretation and assumptions to be made.

Those familiar with the Dynamics Sure Step methodology used by partners to implement Microsoft Dynamics AX will probably recognise that this methodology is a (modified) waterfall approach. Many partners have adapted the model to allow for some flexibility to mitigate the weaknesses of the pure waterfall approach.

So, is the waterfall approach still relevant with all its shortcomings? In my view, yes! If applied sensibly with judicious application of change control and some agile elements (during build), it still makes sense and can be a strong tool to govern the relationship between customer and supplier.


The strength of the waterfall approach is its rigid approach to stage-gating. Basically, you cannot move forward in the implementation cycle without agreement on key deliverables. This is sound practice in an ERP implementation. Building on this strength, the waterfall approach can be improved by making it iterative. With an iterative approach, the customer and supplier can agree to go back and revisit a requirement, if it is deemed to be wrong or no longer relevant. This allows some flexibility and gives the customer a better solution at the end of the day.

Using an iterative approach, however, requires strong change management. Revisiting a requirement should be done under change control to ascertain impact on time and economy. Also, a change to a requirement may impact other requirements and design, so cross-functional coordination is necessary.

Poorly managed iterative implementations tend to become chaotic because anything can be questioned at any time. This should not be allowed.


Prototyping, often referred to as a Conference Room Pilot (CRP), is an approach where the analysis and design phases are replaced by hands-on sessions where users and consultants sit together and walk through the business processes using a mock-up of the solution. The solution mock-up often contains partially migrated (master) data to give the user a more comprehensive experience of the future solution.

In my experience, prototyping can be really efficient and beneficial for small-scale implementations where complexity is low and cross-functional coordination is minimal. However, as complexity grows the need to communicate through written documentation becomes more pronounced and forces prototyping to slip back toward an iterative waterfall approach but without the benefits of genuine stage-gating and change control.

In some implementations, I have seen prototyping used for parts of the analysis and design, but within a waterfall framework. This combination can be very powerful.

To make prototyping successful, though, requires that the consultants have a thorough understanding of the company’s business processes upfront, so they can configure the CRP in a meaningful way. Without this thorough process knowledge, prototyping, in my view, carries significant risks.


I will not pretend to be an expert on agile software development. My experience with agile as an approach is based on a number of ERP implementations where agile methods and tools were used. Firstly, I would say that applying agile to the customisation and development processes in an implementation works nicely and should be done unless there is some compelling reason not to.

However, applying agile to analysis, design and configuration is a slightly different matter. For agile to work, a comprehensive product backlog is required, which can be divided into sprints. For analysis, it is often difficult to produce a comprehensive product backlog upfront because, by definition, the analysis should be used to understand what the implementation must deliver, i.e. create the product backlog.

During design, as the (build phase) product backlog starts building, it becomes clear that with the complexity of ERP, the cross-functional dependencies are significant. As items in one backlog may depend heavily on items in other backlogs, it becomes increasingly difficult to prioritise the sprint backlogs. If items start falling behind schedule, prioritisation in other work streams may grind to a halt.

One compelling reason for applying an agile approach is to be able to deliver tangible results more quickly. ERP with its interwoven processes and dependencies is often difficult to deliver in discrete chunks of functionality. Therefore, agile may just become another way of performing ordinary project activity planning.

Another compelling reason for applying an agile approach to an ERP implementation is the ability for the customer to de-prioritise unimportant features. I have seen this work in practice, but in my view there are some prerequisites that need to be in place:

  • The customer is willing to accept the financial risks. They pay for the features they prioritise.
  • Strong cross-functional communication and coordination is required.
  • Users in the work streams need clear guidance on how items in the backlogs can be prioritised.


Apart from selecting the right methodology and approach, organising the work streams can be another challenge and it is important to get it right. The challenge is to create a work stream organisation with the right level of granularity. Often, work streams are constructed using a traditional swim lane approach such as:

  • Order-to-Cash.
  • Procure-to-Pay.
  • Hire-to-Retire.
  • Etc.

However, this may result in some massive work streams, with Order-to-Cash basically encompassing everything the company does except finance. If the work streams become too granular and focus on smaller process areas, the cross-functional coordination may become too cumbersome. Striking the right balance is essential and very company-specific, so before constructing the work streams, thorough assessment and dialogue is required. In a previous blog post, I wrote about using APQC as a framework for organising work streams.


Regardless of which approach is taken, I have heard many project owners say over time that they do not care as long as the price is fixed. Personally, I strongly disagree with this viewpoint. What is the benefit of having a fixed price if, at the end of the day, the system does not work as intended? With a fixed price comes a fixed scope and potentially endless quibbling over scope changes.


All this leads me to the question:

If I was to start a new ERP implementation tomorrow, which approach would I take?

Firstly, I would assess the size and complexity of the implementation. Also, I would ascertain my company’s risk appetite and the maturity of my own organisation.

If the implementation was small with relatively little complexity and my company was happy to accept a certain financial risk, I would choose to conduct a scaled-down analysis and design phase (potentially using prototyping) to fix the scope and build a sensible product backlog. Then I would execute the build and deployment phases with an agile approach.

If, on the other hand, the implementation was large and complex, I would frame an iterative waterfall approach to ensure the proper stage-gating with strong governance. With many stakeholders in a project, I believe this is still the most prudent and effective approach. However, if possible, I would certainly want the development carried out using an agile approach.

Lastly, I would like to emphasise that, in my experience, it is important to have a close dialogue with potential suppliers on this. Suppliers have different levels of experience with these approaches and their contractual frameworks may not be well-aligned to them all. It would not be advisable to force a supplier to use an approach they are not familiar with leveraging a misaligned contractual framework.

So, in my view, it all comes down to striking the right balance and combining the right elements.

Posted in Uncategorized | Leave a comment

Cloud ERP… Is It the Future?

In recent years, a lot has been said and written about “The Cloud” and why we all need it. Obviously, we all use “The Cloud” in our daily lives through our smartphones (e.g. Garmin Connect, Apple iCloud Photostream etc.) or through Microsoft Office 365, OneDrive or Dropbox on our desktop computers. Some even work in genuine “Cloud” applications such as Salesforce, WorkDay or Microsoft Dynamics CRM Online.

However, in this blog post, I would like to discuss what “The Cloud” means to Enterprise Resource Planning (ERP) and why it is important for a company to have a strategy going forward.

“The Cloud” obviously comes in a number of flavours, namely:

  • Infrastructure-as-a-Service (IaaS)
  • Platform-as-a-Service (PaaS)
  • Software-as-a-Service (SaaS)

with SaaS being the highest abstraction layer in the stack. Also, “Cloud” services are offered by a number of different vendors including Amazon (AWS), Microsoft (Azure), Salesforce, Oracle and IBM (SoftLayer). Each “Cloud” vendor delivers different layers in the stack or a combination thereof. However, it is fair to say that most vendors, apart from Microsoft, IBM and Oracle, focus on either IaaS or SaaS.

This blog post will not go into details regarding vendors or market leadership, but this article in Business Insider gives some basic insights.

When designing a “Cloud” strategy, I believe it is important to ask some key questions:

  1. Are we only looking for a flexible, OPEX-based approach to provisioning infrastructure?
  2. Is cost the key driver for the transformation?
  3. Are we looking to convert legacy applications to “The Cloud” or will they stay on-premise?
  4. Are we willing to put mission critical workloads into “The Cloud”?
  5. Does the nature of our data require special attention?
  6. Will we be looking to leverage platform services such as databases and integration middleware in “The Cloud”?
  7. How will we want to deal with security in this new scenario?

In the following, I am sharing some of my experience and views on “Cloud” strategy.


Some wise person said that IaaS is just running your applications on someone else’s hardware, and to a degree this is true. However, in my view the main benefits of basing your strategy on at least some IaaS component are:

  • Ability to scale quickly and unconstrained by the limitations of your data centre.
  • Simple and efficient hardware provisioning.
  • Spike management.

For ERP, starting out by using IaaS for development and test environments can be an effective strategy that allows you to quickly scale. Also, this approach gives you the opportunity to iron out issues before deploying production workloads to “The Cloud”.


Provisioning your platform products as services is a fairly new concept to most of us and takes some real consideration when designing your architecture. With Microsoft Azure you can for instance provision SQL Server as a service, but what if your transactional application resides on-premise? Will the network be able to deal with the load?
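The network-load question can be made concrete with a quick back-of-the-envelope calculation. The sketch below uses purely illustrative figures (round-trip counts and latencies are assumptions, not measurements) to show how a chatty on-premise application suffers when its database moves to “The Cloud”:

```python
# Back-of-the-envelope estimate of the latency cost of keeping a chatty
# on-premise application while moving its database to a cloud service.
# All figures below are illustrative assumptions, not measurements.

def added_latency_per_transaction(round_trips, wan_rtt_ms, lan_rtt_ms=0.5):
    """Extra seconds per business transaction caused by WAN round trips."""
    return round_trips * (wan_rtt_ms - lan_rtt_ms) / 1000.0

# A chatty ERP form issuing 200 small queries per save, with ~30 ms to the
# nearest cloud region instead of ~0.5 ms on the local network:
extra = added_latency_per_transaction(round_trips=200, wan_rtt_ms=30)
print(f"~{extra:.1f} s added per transaction")  # roughly 5.9 s
```

Even modest per-query latency adds up quickly, which is why chatty legacy applications often need re-architecting before their database tier moves to PaaS.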

A number of vendors including Microsoft (Azure) and Oracle (Cloud Platform) offer a comprehensive suite of platform services that allow developers to leverage sophisticated resources such as database services, integration services and analytics services. Also, in the case of Azure, you can leverage complex services including big data analytics, machine learning and Internet-of-Things (IoT) connectivity as PaaS.

PaaS vendors generally have a service, such as Azure Service Bus, that allows you to connect your on premise applications with your platform or application in “The Cloud”, but it will require some re-factoring of legacy applications to support this approach. So make sure to plan for this transitioning and code re-factoring if you consider leveraging PaaS.
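The kind of re-factoring mentioned above often amounts to having the legacy application publish self-describing messages instead of writing directly to the remote system. The sketch below is a minimal illustration; the field names, queue name and connection details are all hypothetical, and the actual send (shown in comments) would use the vendor’s SDK, such as the azure-servicebus package:

```python
import json

# Minimal sketch of re-factoring an on-premise integration for a cloud
# message bus: the legacy application wraps its record in a versioned,
# self-describing envelope before publishing. Field names are illustrative.

def to_bus_message(invoice: dict) -> str:
    """Wrap an on-premise invoice record in a versioned message envelope."""
    envelope = {
        "messageType": "Invoice",
        "schemaVersion": "1.0",
        "payload": invoice,
    }
    return json.dumps(envelope)

body = to_bus_message({"invoiceId": "INV-1001", "amount": 250.0})

# Sending would then use the vendor's SDK, e.g. with azure-servicebus:
#   from azure.servicebus import ServiceBusClient, ServiceBusMessage
#   with ServiceBusClient.from_connection_string(CONN_STR) as client:
#       with client.get_queue_sender("invoices") as sender:
#           sender.send_messages(ServiceBusMessage(body))
print(body)
```

Decoupling the payload format from the transport this way also makes it easier to switch buses or add subscribers later.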


Most people will be familiar with applications such as Salesforce, Microsoft Office 365 and ServiceNow. All of these are applications delivered through “The Cloud” and paid for through subscription in some form or other. In the ERP space the key “Cloud” player in recent years has been NetSuite, but now other vendors such as Microsoft are muscling in with Dynamics AX and Project Madeira, which is a scaled-down ERP solution tightly integrated with Office 365.

Obviously, SaaS provisioned applications rely on PaaS and IaaS to function so subscribing to a SaaS application means you implicitly leverage PaaS and IaaS services.

In my experience, the key architectural considerations when subscribing to a SaaS application are:

  1. How will the application integrate with my existing security setup, and who is ultimately accountable for security?
  2. Does the vendor’s application life-cycle management process align with my existing process and do I have any control over patching?
  3. If the application is mission-critical, how will I manage major incidents?
  4. Does the application contain personal or highly confidential data, which must be stored in a specific geographic location or may not be geo-replicated?
  5. Does my subscription or contract provide mechanisms for exiting the relationship?
  6. Will deploying mission critical applications in “The Cloud” require special network connections?
  7. Am I able to get to my data for integration and analytics purposes?


So far, uptake of ERP in “The Cloud” has been fairly limited, but I believe this is about to change. Some ERP-related workloads such as CRM (Salesforce and Dynamics CRM), HRM (Workday), Service Management (ServiceNow) and Expense Management (Concur) have for quite a while now been increasingly provisioned through “Cloud” subscription, and I believe we will see more ERP vendors moving quickly to offering “Cloud” solutions in the near future.

For a small business, already using Salesforce or Dynamics CRM and Office 365, the next logical step is to subscribe to a cloud ERP solution such as Microsoft Project Madeira or Intuit QuickBooks Online. These solutions are basically “vanilla” software packages offering fairly comprehensive functionality and integration to other popular small-business applications. For small businesses, subscribing to “Cloud” ERP is a compelling case with low initial costs and rapid implementation.

However, for medium-sized businesses and enterprises the switch to “Cloud” ERP may prove more complicated. As mentioned above, transitioning from on-premise ERP to “Cloud” ERP raises a number of architectural and security questions that must be dealt with through careful architectural planning.

Relationships with existing hosting and Application Management Services (AMS) partners may also need to be revisited to address scenarios that combine public and private “Cloud” applications with on-premise legacy applications and platform products. Staying in control of such an environment requires strong governance and change management skills.

In a transitional phase, we may in reality see “Cloud” ERP in the enterprise being deployed on-premise with some resources provisioned and consumed through cloud services. For Dynamics AX this is what is being promised with Azure Stack when released.

Regardless, implementing “Cloud” ERP in the enterprise will probably never be “vanilla” and should be approached with thoroughness and due consideration.


With Dynamics CRM we are seeing some features first becoming available in the “Cloud” edition and then subsequently released to on-premise. I am not sure this is a relevant reason to go for “Cloud” ERP. However, being able to offload accountability for performance, stability, security and patching to the vendor may be. Clearly, for small businesses unable to hire skilled IT staff this is a compelling reason, but for medium-sized businesses and enterprises with in-house skills or outsourcing agreements this may not be the case. At the end of the day, it probably comes down to each company’s individual choice, business case and costs.

As the workforce becomes more mobile, being able to access your ERP solution from anywhere, anytime becomes increasingly important. Scaling and deploying mobile on a global scale is very difficult and costly, so maybe mobile scenarios will be the compelling reason for some to move to “The Cloud”.


Certainly, there is the potential for disruption as ERP moves to “The Cloud”. Specialist vendors seem able to provide comprehensive features across some traditional ERP domains including:

  • HRM
  • S&OP
  • Expense Management
  • Service Management
  • Project Management
  • Accounts Payable
  • Procurement and Sourcing
  • Document Management

However, these offerings do not yet strike at the core of ERP, namely financials, supply chain management, order management and manufacturing, but who knows…

In my view, the core ERP solution for mid-sized businesses and enterprises will remain a force in the future “Cloud” world as well, but we are likely to see more non-core workloads being deployed through alternative “Cloud” offerings.

As you can see from my musings above, I believe we are still in the very early days of “Cloud” ERP and it will be exciting to see what the future holds. I encourage you to use this blog post to share your thoughts and views on “Cloud” in general and more specifically, “Cloud” ERP.
