I am very happy to announce that my blog is now also available on the Microsoft Dynamics 365 Enterprise edition community site. You can find it here:
You can also explore the Dynamics 365 Enterprise edition community site here:
I will start by being straight about where this blog post is heading:
I LOVE EXTENSIONS AND THE FACT THAT THE APPLICATION IS BEING SEALED AND I BELIEVE THIS IS THE BEST THING THAT HAS HAPPENED TO THE DYNAMICS AX / 365 (FOR OPERATIONS AND FINANCE) COMMUNITY FOR DECADES…
And why is that?
So, I am not mourning the “death” of overlaying.
However, I am not blind to the challenges presented by moving to extensions. Just to mention a few:
However, all things considered, I think the ISV and VAR communities will benefit from extensions.
Only time will tell if my optimism is well-founded or whether extensions will become a menace to the ISV / VAR community, but I remain optimistic and believe those of us who are passionate about ERP and Dynamics 365 for Operations and Finance still have some great times to come… Bring it on!
In the past I have seen a lot of ERP project plans with a single-line entry called “go-live”, “cutover” or “final migration”. As I have argued in a previous post, this phase includes the cutover activity, which must be planned carefully in great detail and executed with precision.
Since I wrote that post, quite a few people have asked me:
“What should be in a cutover plan?”
In the following, I have tried to answer this question.
First of all, the cutover plan must contain a detailed list of all activities and their dependencies.
For each item in the cutover list I would normally record the following data:
Depending on your specific needs you may need other fields in the list, but these have served me well in the past.
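Purely as a hypothetical illustration (the field names below are my own suggestions, not a canonical standard), one entry in such an activity list could be structured like this:

```python
# Hypothetical structure for one entry in a cutover activity list.
# All field names are illustrative suggestions, not a canonical standard.
activity = {
    "id": 42,
    "description": "Import open sales orders",
    "responsible": "Data migration lead",
    "depends_on": [40, 41],            # activities that must finish first
    "planned_start": "2017-05-12 22:00",
    "planned_duration_minutes": 90,    # timing proven during trial migrations
    "actual_start": None,              # filled in during execution
    "status": "not started",           # not started / running / done / failed
}
```

In practice this usually lives in a spreadsheet, but the shape is the same: one row per activity with planned and actual values side by side.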
In my experience it can be useful to place checkpoints at appropriate points throughout the schedule. If your list of activities is very extensive, some activities may fail but the decision-maker may decide to move forward anyway, some activities may take longer to run than planned, or some unforeseen event may challenge the overall plan. The checkpoints can be used to gather key stakeholders, give them an overview and discuss whether and how to proceed. If your activity list is short and the process well-proven, you probably do not need checkpoints.
Ideally, the cutover plan should be accompanied by a plan that describes how you will fall back in the event of failure. It is difficult and cumbersome to align the fallback plan with each activity in the cutover plan, so I normally align the two plans at checkpoint level. That also allows for a more thorough discussion with key stakeholders before deciding to fall back.
Key to a successful cutover is the ability to keep everyone involved informed at all times. The simple way is to give everyone access to the activity list (spreadsheet) so they can open it periodically to check progress. Alternatively, the cutover manager may send out the updated list at each checkpoint. On larger projects I have been involved with, the cutover manager has managed the list in SharePoint with a BI tool on top to allow users to see progress in a graphical format.
The last topic I would like to touch on is this: in the past I have found that developing the cutover plan as part of the ongoing trial migrations can be extremely useful. By basing each trial migration on the provisional cutover plan, you also start testing and proving your plan. Very often I have seen trial migrations focus on the technical side of migrating data without ensuring that the migration is aligned with manual activities and reconciliation activities. By going through the cutover plan each time you perform a trial migration, you prove the plan and start building more accurate timings.
All in all, this is not an out-of-the-box template for a cutover plan, but I hope it gives you enough inspiration to build your own plan based on the specific needs of your projects.
Good luck cutting over…
For many years, ERP has been the glue that has connected business processes within a company and supported business process extension to external users. In recent years, ERP’s hegemony as the “system-to-bind-them-all” has come under pressure and digital transformation seems to be accelerating this change.
Since there is no formal definition of digital transformation, for the purpose of this blog post, I would like to define digital transformation as:
For many companies, digital transformation may mean implementing some of the above, but not all. However, it is unlikely that a business can avoid doing any of these in some form.
ERP as the Pervasive Technology
For as long as I can remember, people have argued that efficiencies were found by implementing all business processes in a single, integrated software solution, namely ERP. Maybe this was true before cloud and the democratisation of integration technologies, but nowadays connecting Salesforce CRM to your ERP system is hardly rocket science nor does it take months – merely days or weeks. Also, people are increasingly weighing up the advantages of fully integrated processes against the need for speed and agility. Going into the era of digital transformation, I think we need to look at ERP as a landscape of connected business applications rather than a monolithic one-stop-shop. This is likely to challenge both IT departments and application management outsourcers because instead of focusing core skills on a single solution, they now need to understand and support multiple solutions – or leave it to “digital citizens” in the business to pick up this role. Certainly, digital transformation is likely to mean change in the way IT and outsourcers relate to business stakeholders.
Connecting External Stakeholders
Until now, ERP has predominantly been about supporting and optimising internal processes. Anything to do with external users has been handled on an arm’s-length principle through portals or similar technologies. This is not going to work in future. To become fully digital we need to connect all users through relevant technologies, apps that are similar to what they use in other aspects of their digital life – or they will take their business elsewhere. ERP people have traditionally not understood or appreciated the need to leverage apps and social media as part of the business processes, so here we are facing a major hurdle if we are to stay relevant in the digital transformation. Actually, the technology is there. What we need now is consultants and service providers who can deliver on the opportunities.
New Business Models
Transforming ERP is always difficult and filled with risk. Therefore, the ERP area is inherently conservative. We have traditionally been very good at optimising existing processes or, on a good day, suggesting a new one, but expecting the ERP community to invent and drive through new business models may be “a bridge too far”. However, as new business models are being trialled and matured, ERP needs to become more responsive to earn a place in the future digital reality – or run the risk of becoming obsolete.
Leveraging New Technologies
It is fair to say that ERP vendors are currently doing their level best to promote new technologies such as machine learning as complementary to their ERP packages, but so far it seems more tentative than pervasive. However, I believe that this is where the biggest opportunity lies for ERP in the digital era. If, for example, we can leverage machine learning to truly optimise internal business processes such as AP automation, maybe ERP as we know it still has a future, but it will require a new way of thinking for ISVs. I am pretty sure that the ERP vendors will offer these technologies as-a-platform but leave the actual implementation to ISVs and VARs. In many ways this is an understandable approach, but the risk is that independent software providers will quickly seize the initiative and offer this as an API in the cloud.
We Need to be Agile
With the technologies for digital transformation coming on-stream and maturing, the emphasis now is on how we deliver. The traditional approach with year-long, risk-filled transformation programmes may still be relevant for core ERP, but to stay relevant in future we need to find a way to trial and mature new processes, business models and approaches on a smaller scale and in a more agile fashion. The success, I think, relies on better change and risk management, not on technology-focused methods. I am sure I will come back to this theme in a future post.
In the above I have shared my current thoughts on where I see ERP’s future in the digital era and how ERP can be part of the digital transformation. Beyond core business processes, I still think ERP can play an important role in future, but now is the time for service providers (ISVs, VARs and outsourcers) to step up and provide customers with forward-thinking solutions that align with the new business models delivered through fast and agile methods. Or risk becoming obsolete!
In recent weeks I have been in a number of discussions on how to support various integration scenarios through Dynamics 365 for Operations (formerly known as Dynamics AX and hereafter referred to as D365O). Let’s be clear from the outset: I consider D365O a pure cloud-delivered solution, so I am not considering any potential on-premise scenarios in this blog post.
With D365O being a cloud-delivered solution and with (virtually) all customers currently having most of their other applications running on-premise, a key challenge is how we move data to and from Azure in this hybrid scenario. The Help Wiki for D365O is not flush with information on how to do this. However, a couple of things can now be ascertained about how we integrate.
Let’s deal with the simpler form of integration first. Already in Dynamics AX 2012 we were using SOAP-based web services, and this does not appear to have changed much in D365O. In addition to SOAP-based web services, D365O now also supports REST, allowing us to exchange JSON messages and perform CRUD operations using OData (you can read more about the OData protocol here).
For more information on supported services, see this Help Wiki article.
Conceptually, integration with D365O revolves around the data entity. The normalised database schema is aggregated into a number of high-level logical units called data entities, which hide the physical implementation and make life easier for developers, who do not need to understand the underlying schema. An example is the Customers data entity, which maps the simple concept of a customer onto multiple tables in D365O.
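To make this concrete, here is a minimal sketch of what an OData read against a data entity could look like from the outside. The environment URL and bearer token are placeholders, and the exact public entity name (CustomersV3 is used here) may differ between application versions, so treat those details as assumptions:

```python
import requests

# Placeholder environment URL and OAuth 2.0 bearer token (acquired from Azure AD).
BASE_URL = "https://myenvironment.cloudax.dynamics.com"
TOKEN = "<bearer token>"

headers = {
    "Authorization": f"Bearer {TOKEN}",
    "Accept": "application/json",
}

# Query the Customers data entity via the OData endpoint, selecting two fields.
# The public entity name (CustomersV3) is an assumption and may vary by version.
response = requests.get(
    f"{BASE_URL}/data/CustomersV3",
    headers=headers,
    params={"$select": "CustomerAccount,OrganizationName", "$top": "10"},
)
response.raise_for_status()

for customer in response.json()["value"]:
    print(customer["CustomerAccount"], customer["OrganizationName"])
```

The same endpoint supports the other CRUD verbs (POST, PATCH, DELETE), which is what makes it attractive for synchronous app-to-app scenarios.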
This type of integration makes it relatively simple to integrate synchronously with D365O across a wide range of apps and systems. I have not been able to find a description of how Microsoft intends to scale out the service, so that remains to be seen.
To enable easy integration across the Dynamics 365 stack and Office 365, Microsoft is developing the Common Data Model (CDM), which is similar to data entities but spans all the applications in the ecosystem. This article by Jukka Niiranen is a pretty good introduction to CDM. As the article indicates, CDM is intrinsically linked with Flow.
With synchronous integration taken care of through web services, let’s turn to asynchronous integration. Data import and export is configured through the Data Management Workspace:
In the Data Management Workspace you are able to configure multiple data sources. Out-of-the-box data sources include D365O itself, Excel and CSV. You can reconfigure these or add new ones (also based on ODBC and XML). In this example I am creating a Data Project that exports all exchange rates to an Excel file:
In the data project, I can select one or more data entities to export and determine the sequencing. This means I can export all currencies before I export all exchange rates, which makes sense.
Once I have saved the data project, I am able to set up a recurring data job:
The following screenshot shows the exchange rates Excel spreadsheet generated by the export:
Once I am done with the Data Project I download the Data Package file, which defines the structure of my Data Project.
Now I am able to import the data into an instance of D365O by creating an import Data Project and using the Package data source format. This can also be set up to happen on a recurring basis. Pretty simple stuff, really.
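For the programmatic side of a recurring import, the recurring integrations endpoint can be fed over plain HTTP. The sketch below assumes a placeholder environment URL, a bearer token and the activity ID shown on the recurring data job; the exact endpoint shape should be verified against the Help Wiki before relying on it:

```python
import requests

# Placeholder values: environment URL, OAuth 2.0 bearer token and the
# activity ID displayed on the recurring data job in D365O.
BASE_URL = "https://myenvironment.cloudax.dynamics.com"
TOKEN = "<bearer token>"
ACTIVITY_ID = "<recurring data job activity ID>"

# Enqueue a data package (zip) for processing by the recurring import job.
with open("ExchangeRates.zip", "rb") as package:
    response = requests.post(
        f"{BASE_URL}/api/connector/enqueue/{ACTIVITY_ID}",
        headers={"Authorization": f"Bearer {TOKEN}"},
        data=package,
    )

response.raise_for_status()
print("Enqueued, message ID:", response.text)
```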
THE TRICK IS GETTING DATA TO THE CLOUD…
This simple example shows how easy it is to configure and execute recurring import / export jobs using the Data Management framework and data entities. However, since D365O runs in the cloud (Azure), we need to get data into or out of Azure before we can process it. Probably the simplest way is to use the Azure Service Bus. Transferring data to and from the Azure Service Bus, however, will require some sort of broker service such as BizTalk, but that discussion is for another day… I think, at this stage, it is fair to say that D365O offers good tools for integration, but operating in a hybrid world will require some extra work for the foreseeable future.
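As a minimal sketch of that hybrid pattern, an on-premise application could push its payload onto an Azure Service Bus queue like this (the connection string and queue name are placeholders, and the broker picking messages up on the Azure side is out of scope here):

```python
from azure.servicebus import ServiceBusClient, ServiceBusMessage

# Placeholder connection string and queue name for the Service Bus namespace.
CONNECTION_STRING = "<Service Bus namespace connection string>"
QUEUE_NAME = "erp-outbound"

# Send a small JSON payload from an on-premise application to the queue.
with ServiceBusClient.from_connection_string(CONNECTION_STRING) as client:
    with client.get_queue_sender(QUEUE_NAME) as sender:
        sender.send_messages(ServiceBusMessage('{"entity": "ExchangeRates"}'))
```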
One of the key questions asked at the beginning of every ERP project I have taken part in is:
HOW SHALL WE GO ABOUT IT?
Since ERP (or MRP) implementations have been around since at least the early ’70s (and probably before), you would have thought that a best-practice approach would have been identified and thoroughly documented by now for all to use, but in my experience this is not the case.
In the last 20+ years, I have seen widely varying approaches to ERP implementation ranging from “make-it-up-as-we-go-along” to minutely specified implementations with detailed plans and rigid change control. The question is: what is the right approach?
In my experience, the right approach has a lot to do with getting the balance right between:
In the following, I have shared some of my thoughts on the different approaches I am familiar with.
“WE WANT STANDARD!”
However, before I start getting into the meat of my musings, I would like to say a few words about the “We Want Standard” statement that is being uttered at every kick-off on every ERP project ever undertaken. Obviously, using the standard solution as far as possible makes sense for the following reasons:
So, if everyone agrees that a company should take the standard solution and this approach obviously is the most efficient, why do we need a (sometimes) very lengthy analysis and design activity? Would it not make more sense to get straight into the build phase and start configuring the solution and migrating data?
Firstly, not all companies are the same, so some customisation is required to make any solution fit the business. Period! Secondly, the ERP solution often needs to be integrated with a number of other software packages, and this requires bespoke development.
Lastly, analysis and design is not only about customisation and development. Analysis and design, conducted properly, should also bring these additional benefits:
In my experience, the value of good analysis and design should not be underestimated.
The waterfall approach is, as many of you know, the classic way of implementing software and dates back to the ’50s. You can read more about it in this Wikipedia article. I would wager that this is how the majority of ERP implementations are being carried out today – not necessarily in its purest form, but through some variation on the theme.
The cornerstone of the waterfall methodology is the Requirements Specification (The Spec). The Spec is a document that is supposed to document all functional and non-functional requirements for the solution. With The Spec, the customer and supplier are able to perform a User Acceptance Test (UAT) at the end of the implementation and clearly ascertain whether each requirement has been delivered – to spec.
In theory this is a pure and simple way of implementing software. It lends itself nicely to rigid contracts, even fixed price, because all requirements are there for all to see and evaluate. However, requirements are in reality often vague and ambiguous and difficult to test.
The main problems with ERP implementations and the waterfall method, in my experience, are:
Historically, The Spec has been used as the key document for tenders (ITTs, RFPs and RFQs) allowing potential suppliers to tender for the “same” solution. However, experience tells us that tendering for the same solution based on The Spec may still throw up wildly different proposals and offers from suppliers. This, in my experience, is because any written requirement allows for interpretation and assumptions to be made.
Those familiar with the Dynamics SureStep methodology used by partners to implement Microsoft Dynamics AX will probably recognise that this methodology is a (modified) waterfall approach. Many partners have adapted the model to allow for some flexibility to mitigate the weaknesses of the pure waterfall approach.
So, is the waterfall approach still relevant with all its shortcomings? In my view, yes! If applied sensibly with judicious application of change control and some agile elements (during build), it still makes sense and can be a strong tool to govern the relationship between customer and supplier.
The strength of the waterfall approach is its rigid approach to stage-gating. Basically, you cannot move forward in the implementation cycle without agreement on key deliverables. This is sound practice in an ERP implementation. Building on this strength, the waterfall approach can be improved by making it iterative. With an iterative approach, the customer and supplier can agree to go back and revisit a requirement, if it is deemed to be wrong or no longer relevant. This allows some flexibility and gives the customer a better solution at the end of the day.
Using an iterative approach, however, requires strong change management. Revisiting a requirement should be done under change control to ascertain impact on time and economy. Also, a change to a requirement may impact other requirements and design, so cross-functional coordination is necessary.
Poorly managed iterative implementations tend to become chaotic because anything can be questioned at any time. This should not be allowed.
Prototyping, often referred to as a Conference Room Pilot (CRP), is an approach where the analysis and design phases are replaced by hands-on sessions where users and consultants sit together and walk through the business processes using a mock-up of the solution. The solution mock-up often contains partially migrated (master) data to give the user a more comprehensive experience of the future solution.
In my experience, prototyping can be really efficient and beneficial for small-scale implementations where complexity is low and cross-functional coordination is minimal. However, as complexity grows the need to communicate through written documentation becomes more pronounced and forces prototyping to slip back toward an iterative waterfall approach but without the benefits of genuine stage-gating and change control.
In some implementations, I have seen prototyping used for parts of the analysis and design, but within a waterfall framework. This combination can be very powerful.
To make prototyping successful, though, requires a thorough understanding of the company’s business processes upfront by the consultants to allow them to configure the CRP in a meaningful way. Without this thorough process knowledge, prototyping in my view, carries significant risks.
I will not pretend to be an expert on agile software development. My experience with agile as an approach is based on a number of ERP implementations where agile methods and tools were used. Firstly, I would say that applying agile to the customisation and development processes in an implementation works nicely and should be done unless there is some compelling reason not to.
However, applying agile to analysis, design and configuration is a slightly different matter. For agile to work, a comprehensive product backlog is required, which can be divided into sprints. For analysis, it is often difficult to produce a comprehensive product backlog upfront because, by definition, the analysis is what should establish what the implementation must deliver, i.e. create the product backlog.
During design, as the (build phase) product backlog starts building, it becomes clear that with the complexity of ERP, the cross-functional dependencies are significant. As items in one backlog may depend heavily on items in other backlogs, it becomes increasingly difficult to prioritise the sprint backlogs. If items start falling behind schedule, prioritisation in other work streams may grind to a halt.
One compelling reason for applying an agile approach is to be able to deliver tangible results more quickly. ERP with its interwoven processes and dependencies is often difficult to deliver in discrete chunks of functionality. Therefore, agile may just become another way of performing ordinary project activity planning.
Another compelling reason for applying an agile approach to an ERP implementation is the ability for the customer to de-prioritise unimportant features. I have seen this work in practice, but in my view there are some prerequisites that need to be in place:
Apart from selecting the right methodology and approach, organising the work streams can be another challenge and it is important to get it right. The challenge is to create a work stream organisation with the right level of granularity. Often, work streams are constructed using a traditional swim lane approach such as:
However, this may result in some massive work streams, such as Order-to-Cash basically encompassing everything the company does except finance. If the work streams become too granular and focus on smaller process areas, the cross-functional coordination may become too cumbersome. Striking the right balance is essential and very company-specific, so thorough assessment and dialogue is required before constructing the work streams. In a previous blog post, I wrote about using APQC as a framework for organising work streams.
“NO WORRIES, I HAVE A FIXED PRICE CONTRACT!”
Regardless of which approach is being taken, I have heard many project owners over time say that they do not care as long as the price is fixed. Personally, I strongly disagree with this viewpoint. What is the benefit of having a fixed price if, at the end of the day, the system does not work as intended? With a fixed price comes a fixed scope and potentially endless quibbling over scope changes.
All this leads me to the question:
If I were to start a new ERP implementation tomorrow, which approach would I take?
Firstly, I would assess the size and complexity of the implementation. Also, I would ascertain my company’s risk appetite and the maturity of my own organisation.
If the implementation was small with relatively little complexity and my company was happy to accept a certain financial risk, I would choose to conduct a scaled-down analysis and design phase (potentially using prototyping) to fix the scope and build a sensible product backlog. Then I would execute the build and deployment phases with an agile approach.
If, on the other hand, the implementation was large and complex, I would frame an iterative waterfall approach to ensure the proper stage-gating with strong governance. With many stakeholders in a project, I believe this is still the most prudent and effective approach. However, if possible, I would certainly want the development carried out using an agile approach.
Lastly, I would like to emphasise that, in my experience, it is important to have a close dialogue with potential suppliers on this. Suppliers have different levels of experience with these approaches, and their contractual frameworks may not be well-aligned to them all. It would not be advisable to force a supplier to use an unfamiliar approach under a misaligned contractual framework.
So, in my view, it all comes down to striking the right balance and combining the right elements.
In recent years, a lot has been said and written about “The Cloud” and why we all need it. Obviously, we all use “The Cloud” in our daily lives through our smartphones (e.g. Garmin Connect, Apple iCloud Photostream etc.) or through Microsoft Office 365, OneDrive or Dropbox on our desktop computers. Some even work in genuine “Cloud” applications such as Salesforce, WorkDay or Microsoft Dynamics CRM Online.
However, in this blog post, I would like to discuss what “The Cloud” means to Enterprise-Resource-Planning (ERP) and why it is important for a company to have a strategy going forward.
“The Cloud” obviously comes in a number of flavours, namely Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS),
with SaaS being the highest abstraction layer in the stack. Also, “Cloud” services are offered by a number of different vendors including Amazon (AWS), Microsoft (Azure), Salesforce, Oracle and IBM (Softlayer). Each “Cloud” vendor delivers different layers in the stack or a combination thereof. However, it is fair to say that most vendors, apart from Microsoft, IBM and Oracle, focus on either IaaS or SaaS.
This blog post will not go into details regarding vendors or market leadership, but this article in Business Insider gives some basic insights.
When designing a “Cloud” strategy, I believe it is important to ask some important questions:
In the following, I am sharing some of my experience and views on “Cloud” strategy.
Some wise person said that IaaS is just running your applications on someone else’s hardware, and to a degree this is true. However, in my view the main benefits of basing your strategy on at least some IaaS component are:
For ERP, starting out by using IaaS for development and test environments can be an effective strategy that allows you to quickly scale. Also, this approach gives you the opportunity to iron out issues before deploying production workloads to “The Cloud”.
Provisioning your platform products as services is a fairly new concept to most of us and takes some real consideration when designing your architecture. With Microsoft Azure you can, for instance, provision SQL Server as a service, but what if your transactional application resides on-premise? Will the network be able to deal with the load?
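To give a feel for how thin the seam is, here is a minimal sketch of an on-premise Python application talking to an Azure SQL database provisioned as a service; the server name, database and credentials are placeholders:

```python
import pyodbc

# Placeholder connection details for an Azure SQL database (PaaS).
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;"
    "DATABASE=erp_reporting;"
    "UID=app_user;PWD=<password>;"
    "Encrypt=yes;"
)

# The database behaves like any other SQL Server from the application's
# point of view; the difference is the network path every query travels.
cursor = conn.cursor()
cursor.execute("SELECT TOP 5 name FROM sys.tables")
for row in cursor.fetchall():
    print(row.name)
conn.close()
```

Every round trip now crosses the wide-area network, which is exactly why the load question above matters.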
A number of vendors including Microsoft (Azure) and Oracle (Cloud Platform) offer a comprehensive suite of platform services that allow developers to leverage sophisticated resources such as database services, integration services and analytics services. Also, in the case of Azure, you can leverage complex services including big data analytics, machine learning and Internet-of-Things (IoT) connectivity as PaaS.
PaaS vendors generally have a service, such as the Azure Service Bus, that allows you to connect your on-premise applications with your platform or application in “The Cloud”, but it will require some re-factoring of legacy applications to support this approach. So make sure to plan for this transition and code re-factoring if you are considering leveraging PaaS.
Most people will be familiar with applications such as Salesforce, Microsoft Office 365 and ServiceNow. All of these are applications delivered through “The Cloud” and paid for through subscription in some form or other. In the ERP space the key “Cloud” player in recent years has been NetSuite, but now other vendors such as Microsoft are muscling in with Dynamics AX and Project Madeira, which is a scaled-down ERP solution tightly integrated with Office 365.
Obviously, SaaS-provisioned applications rely on PaaS and IaaS to function, so subscribing to a SaaS application means you implicitly leverage PaaS and IaaS services.
In my experience, the key architectural considerations when subscribing to a SaaS application are:
“THE CLOUD” AND ERP
So far, uptake of ERP in “The Cloud” has been fairly limited, but I believe this is about to change. Some ERP-related workloads such as CRM (Salesforce and Dynamics CRM), HRM (Workday), Service Management (ServiceNow) and Expense Management (Concur) have for quite a while now been increasingly provisioned through “Cloud” subscription and I believe we will see more ERP vendors moving quickly to offering “Cloud” solutions in the near future.
For a small business already using Salesforce or Dynamics CRM and Office 365, the next logical step is to subscribe to a cloud ERP solution such as Microsoft Project Madeira or Intuit QuickBooks Online. These solutions are basically “vanilla” software packages offering fairly comprehensive functionality and integration to other popular small-business applications. For small businesses, subscribing to “Cloud” ERP is a compelling case with low initial costs and rapid implementation.
However, for medium-sized businesses and enterprises the switch to “Cloud” ERP may prove more complicated. As mentioned above, transitioning from on-premise ERP to “Cloud” ERP raises a number of architectural and security questions that must be dealt with through careful architectural planning.
Relationships with existing hosting and Application Management Services (AMS) partners may also need to be revisited to address scenarios that combine public and private “Cloud” applications with on-premise legacy applications and platform products. Staying in control of such an environment requires strong governance and change management skills.
In a transitional phase, we may in reality see “Cloud” ERP in the enterprise being deployed on-premise with some resources provisioned and consumed through cloud services. For Dynamics AX, this is what is being promised with Azure Stack when it is released.
Regardless, implementing “Cloud” ERP in the enterprise will probably never be “vanilla” and should be approached with thoroughness and due consideration.
WILL THE BENEFITS OUTWEIGH THE CHALLENGES?
With Dynamics CRM we are seeing some features become available in the “Cloud” edition first and only subsequently released on-premise. I am not sure this is a relevant reason to go for “Cloud” ERP. However, being able to offload accountability for performance, stability, security and patching to the vendor may be. Clearly, for small businesses unable to hire skilled IT staff this is a compelling reason, but for medium-sized businesses and enterprises with in-house skills or outsourcing agreements this may not be the case. At the end of the day, it probably comes down to an individual business case for each company.
As the workforce becomes more mobile, being able to access your ERP solution from anywhere, anytime becomes increasingly important. Scaling and deploying mobile on a global scale is very difficult and costly, so maybe mobile scenarios will be the compelling reason for some to move to “The Cloud”.
DOES “THE CLOUD” SPELL THE END FOR MONOLITHIC ERP?
Certainly, there is the potential for disruption as ERP moves to “The Cloud”. Specialist vendors seem able to provide comprehensive features across some traditional ERP domains including:
However, these offerings do not yet strike at the core of ERP, namely financials, supply chain management, order management and manufacturing, but who knows…
In my view, the core ERP solution for mid-sized businesses and enterprises will remain a force also in the future “Cloud” world, but we are likely to see more non-core workloads being deployed through alternative “Cloud” offerings.
As you can see from my musings above, I believe we are still in the very early days of “Cloud” ERP and it will be exciting to see what the future holds. I encourage you to use this blog post to share your thoughts and views on “Cloud” in general and more specifically, “Cloud” ERP.