Is Digital Transformation the End for ERP?


For many years, ERP has been the glue that has connected business processes within a company and supported business process extension to external users. In recent years, ERP’s hegemony as the “system-to-bind-them-all” has come under pressure and digital transformation seems to be accelerating this change.

Since there is no formal definition of digital transformation, for the purpose of this blog post, I would like to define digital transformation as:

  1. Pervasive use of technology across business processes.
  2. Use of technology to connect external users directly to the company’s business processes.
  3. Implementation of business and operating models based on the use of technology.
  4. Leveraging technologies that go beyond simple process optimisation, such as predictive analytics, APIs and machine learning.

For many companies, digital transformation may mean implementing some of the above, but not all. However, it is unlikely that a business should do none of these in some form.

ERP as the Pervasive Technology

As long as I can remember, people have argued that efficiencies were found by implementing all business processes in a single, integrated software solution, namely ERP. Maybe this was true before cloud and the democratisation of integration technologies, but nowadays connecting Salesforce CRM to your ERP system is hardly rocket science, nor does it take months – merely days or weeks. Also, people are increasingly weighing the advantages of fully integrated processes against the need for speed and agility. Going into the era of digital transformation, I think we need to look at ERP as a landscape of connected business applications rather than a monolithic one-stop shop. This is likely to challenge both IT departments and application management outsourcers because instead of focusing core skills on a single solution, they now need to understand and support multiple solutions – or leave it to “digital citizens” in the business to pick up this role. Certainly, digital transformation is likely to mean change in the way IT and outsourcers relate to business stakeholders.

Connecting External Stakeholders

Until now, ERP has predominantly been about supporting and optimising internal processes. Anything to do with external users has been handled on an arm’s-length principle through portals or similar technologies. This is not going to work in future. To become fully digital we need to connect all users through relevant technologies (apps) that are similar to what they use in other aspects of their digital life – or they will take their business elsewhere. ERP people have traditionally not understood or appreciated the need to leverage apps and social media (SoMe) as part of the business processes, so here we are facing a major hurdle if we are to stay relevant in the digital transformation. Actually, the technology is there. What we need now is consultants and service providers who can deliver on the opportunities.

New Business Models

Transforming ERP is always difficult and filled with risk. Therefore, the ERP area is inherently conservative. We have traditionally been very good at optimising existing processes or, on a good day, suggesting a new process, but expecting the ERP community to invent and drive through new business models may be “a bridge too far”. However, as new business models are being trialled and matured, ERP needs to become more responsive to earn a place in the future digital reality – or run the risk of becoming obsolete.

Leveraging New Technologies

It is fair to say that ERP vendors are currently doing their level best to promote new technologies such as machine learning as complementary to their ERP package, but so far it seems more tentative than pervasive. However, I believe that this is where the biggest opportunity lies for ERP in the digital era. If, for example, we can leverage machine learning to truly optimise internal business processes such as AP automation, maybe ERP as we know it still has a future, but it will require a new way of thinking for ISVs. I am pretty sure that the ERP vendors will offer these technologies as-a-platform, but leave the actual implementation to ISVs and VARs. In many ways this is an understandable approach, but the risk is that independent software providers will quickly seize the initiative and offer this as an API in the cloud.

We Need to be Agile

With the technologies for digital transformation coming on-stream and maturing, the emphasis now is on how we deliver. The traditional approach with year-long, risk-filled transformation programmes may still be relevant for core ERP, but to stay relevant in future we need to find a way to trial and mature new processes, business models and approaches on a smaller scale and in a more agile fashion. Success, I think, relies on better change and risk management, not on technology-focused methods. I am sure I will come back to this theme in a future post.

In the above I have shared my current thoughts on where I see ERP’s future in the digital era and how ERP can be part of the digital transformation. Beyond core business processes, I still think ERP can play an important role in future, but now is the time for service providers (ISVs, VARs and outsourcers) to step up and provide customers with forward-thinking solutions that align with the new business models, delivered through fast and agile methods. Or risk becoming obsolete!

 


Integration With Dynamics 365


In recent weeks I have been in a number of discussions on how to support various integration scenarios through Dynamics 365 Operations (formerly known as Dynamics AX and hereafter referred to as D365O). Let’s be clear from the outset: I consider D365O a pure cloud-delivered solution, so I am not considering any potential on-premise scenarios in this blog post.

With D365O being a cloud delivered solution and with (virtually) all customers having most of their other applications running on-premise at present, a key challenge is how we move data to and from Azure in this hybrid scenario. The Help WiKi for D365O is not flush with information on how to do this. However, a couple of things can now be ascertained in relation to how we integrate:

  • peer-to-peer integration (or system integration) is supposed to be performed using web services, either through SOAP or REST.
  • recurring integration scenarios are supported through processing of Data Packages configured in Data Projects.
  • all integration revolves around the data entity concept.

PEER-TO-PEER INTEGRATIONS

Let’s deal with the simpler form of integration first. Already in Dynamics AX 2012 we were using SOAP-based web services. This does not appear to have changed much in D365O. In addition to SOAP-based web services, D365O now also supports REST, allowing us to exchange JSON messages and perform CRUD operations using OData (you can read more about the OData protocol here).
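To make this a little more concrete, here is a minimal sketch of what an OData query against a data entity could look like. The host name and field names are illustrative assumptions, and a real call would also need an Azure AD bearer token:

```python
from urllib.parse import urlencode

# Hypothetical D365O instance URL -- substitute your own environment.
BASE_URL = "https://myinstance.operations.dynamics.com/data"

def odata_query_url(entity, select=None, filter_=None, top=None):
    """Build an OData query URL for a data entity such as Customers."""
    params = {}
    if select:
        params["$select"] = ",".join(select)
    if filter_:
        params["$filter"] = filter_
    if top is not None:
        params["$top"] = str(top)
    # Keep "$" and "," literal so the OData system query options stay readable.
    query = "?" + urlencode(params, safe="$,") if params else ""
    return f"{BASE_URL}/{entity}{query}"

# Fetch the first ten customers, two fields only (field names are assumed):
customers_url = odata_query_url("Customers",
                                select=["CustomerAccount", "Name"], top=10)
# An actual request would add authentication, e.g. with the requests library:
# requests.get(customers_url, headers={"Authorization": f"Bearer {token}"})
```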

For more information on supported services, see this Help WiKi article.

Conceptually, integration to D365O revolves around the data entity. The normalised database schema is aggregated into a number of high-level logical units called data entities, which hide the physical implementation so that developers do not need to understand the underlying schema. An example is the Customers data entity, which maps the simple concept of a customer onto multiple tables in D365O.

This type of integration makes it relatively simple to integrate synchronously with D365O across a wide range of apps and systems. I have not been able to find a description of how Microsoft intends to scale out the service, so that remains to be seen.

To enable easy integration across the Dynamics 365 stack and Office 365, Microsoft is developing the Common Data Model (CDM), which is similar to data entities but spans all the applications in the ecosystem. This article by Jukka Niiranen is a pretty good introduction to CDM. As the article indicates, CDM is intrinsically linked with Flow.

RECURRING INTEGRATIONS

With synchronous integration taken care of through web services, let’s turn to asynchronous integration. Data import and export is configured through the Data Management Workspace:

[Screenshot: Data Management Workspace]

In the Data Management Workspace you are able to configure multiple data sources. Out-of-the-box data sources include D365O itself, Excel and CSV. You can reconfigure these or add new ones (also based on ODBC and XML). In this example I am creating a Data Project that exports all exchange rates to an Excel file:

[Screenshot: export Data Project]

In the data project, I can select one or more data entities to export and determine the sequencing. This means I can export all currencies before I export the exchange rates that reference them, which makes sense.

Once I have saved the data project, I am able to set up a recurring data job:

[Screenshot: recurring export job]

The following screenshot shows the exchange rates Excel spreadsheet generated by the export:

[Screenshot: exchange rates spreadsheet]

Once I am done with the Data Project I download the Data Package file, which defines the structure of my Data Project.

Now I am able to import the data into an instance of D365O by creating an import Data Project and using the Package data source format. This can also be set up to happen on a recurring basis. Pretty simple stuff, really.
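For the programmatic side of this, the recurring integrations API exposes enqueue endpoints keyed by the data job’s activity ID. The sketch below only composes the enqueue URL; the base URL and activity ID are placeholders, and you should verify the exact endpoint contract against the Help WiKi for your version:

```python
from urllib.parse import quote

# Placeholders: use your own instance URL and the activity ID shown on
# the recurring data job once it has been created.
BASE_URL = "https://myinstance.operations.dynamics.com"
ACTIVITY_ID = "00000000-0000-0000-0000-000000000000"

def enqueue_url(base, activity_id, entity):
    """Compose the URL used to push a file into a recurring import job."""
    return f"{base}/api/connector/enqueue/{activity_id}?entity={quote(entity)}"

push_url = enqueue_url(BASE_URL, ACTIVITY_ID, "Exchange rates")
# The data package file itself would be POSTed to push_url together with
# an OAuth bearer token obtained from Azure AD.
```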

THE TRICK IS GETTING DATA TO THE CLOUD…

This simple example shows how easy it is to configure and execute recurring import/export jobs using the Data Management framework and data entities. However, since D365O runs in the cloud (Azure), we need to get data into or out of Azure to be able to process it. Probably the simplest way is to use the Azure Service Bus. Transferring data to and from Azure Service Bus, however, will require some sort of broker service such as BizTalk, but that discussion is for another day… I think, at this stage, it is fair to say that D365O offers good tools for integration, but operating in a hybrid world will require some extra work for the foreseeable future.
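To sketch what that hand-off could look like, an on-premise broker might wrap exported records in a JSON envelope and push the message to a Service Bus queue. The envelope shape and queue name below are purely illustrative, not a D365O contract, and the send itself is shown as a comment because it needs the azure-servicebus package and a real namespace:

```python
import json

def build_message(entity, records):
    """Wrap exported records in a simple JSON envelope for the queue.
    The envelope shape here is an assumption, not a D365O contract."""
    return json.dumps({"entity": entity, "records": records})

body = build_message("ExchangeRates",
                     [{"from": "EUR", "to": "DKK", "rate": 7.44}])

# Sending the message (requires `pip install azure-servicebus`):
# from azure.servicebus import ServiceBusClient, ServiceBusMessage
# with ServiceBusClient.from_connection_string(CONN_STR) as client:
#     with client.get_queue_sender("erp-outbound") as sender:
#         sender.send_messages(ServiceBusMessage(body))
```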


ERP – What is the Right Approach?

One of the key questions asked at the beginning of every ERP project I have taken part in is:

HOW SHALL WE GO ABOUT IT?

Since ERP (or MRP) implementations have been around at least since the early ’70s (and probably before), you would have thought that a best-practice approach would have been identified and thoroughly documented by now for all to use, but in my experience this is not the case.

In the last 20+ years, I have seen widely varying approaches to ERP implementation ranging from “make-it-up-as-we-go-along” to minutely specified implementations with detailed plans and rigid change control. The question is: what is the right approach?

In my experience, the right approach has a lot to do with getting the balance right between:

  • Documentation.
  • Planning granularity.
  • Effective and efficient communication.
  • Risk and flexibility.

In the following, I have shared some of my thoughts on the different approaches I am familiar with.

“WE WANT STANDARD!”

However, before I start getting into the meat of my musings, I would like to say a few words about the “We Want Standard” statement that is being uttered at every kick-off on every ERP project ever undertaken. Obviously, using the standard solution as far as possible makes sense for the following reasons:

  • Allows future upgrades.
  • Reduces number of defects.
  • Reduces implementation time.

So, if everyone agrees that a company should take the standard solution and this approach obviously is the most efficient, why do we need a (sometimes) very lengthy analysis and design activity? Would it not make more sense to get straight into the build phase and start configuring the solution and migrating data?

Firstly, no two companies are the same, so some customisation is required to make any solution fit the business. Period! Secondly, the ERP solution often needs to be integrated with a number of other software packages, and this requires bespoke development.

Lastly, analysis and design is not only about customisation and development. Analysis and design, conducted properly, should also bring these additional benefits:

  • Give key users a thorough understanding of the solution.
  • Allow the company to revisit current practices and processes and design future (and hopefully improved) processes based on the new solution.
  • Thoroughly plan for and design data migration and cut over.

In my experience, the value of good analysis and design should not be underestimated.

WATERFALL

This is, as many of you know, the classic approach to software implementation and dates back to the ’50s. You can read more about it in this Wikipedia article. I would wager that this is how the majority of ERP implementations are being carried out today. Not necessarily in its purest form, but through some variation on the theme.

The cornerstone of the waterfall methodology is the Requirements Specification (The Spec). The Spec is a document that is supposed to document all functional and non-functional requirements for the solution. With The Spec, the customer and supplier are able to perform a User Acceptance Test (UAT) at the end of the implementation and clearly ascertain whether each requirement has been delivered – to spec.

In theory this is a pure and simple way of implementing software. It lends itself nicely to rigid contracts, even fixed price, because all requirements are there for all to see and evaluate. However, requirements are in reality often vague and ambiguous and difficult to test.

The main problems with ERP implementations and the waterfall method, in my experience, are:

  • In its purest form, the waterfall approach requires very thorough documentation of requirements.
  • The customer is often not able to clearly express all requirements leaving room for gaps and interpretation in the contracts.
  • Since 90% of the ERP solution is a standard package, The Spec will inevitably focus on the gaps and not document fit requirements well.
  • If The Spec remains unchanged throughout the implementation, the customer is unlikely to end up with the system they want.

Historically, The Spec has been used as the key document for tenders (ITTs, RFPs and RFQs) allowing potential suppliers to tender for the “same” solution. However, experience tells us that tendering for the same solution based on The Spec may still throw up wildly different proposals and offers from suppliers. This, in my experience, is because any written requirement allows for interpretation and assumptions to be made.

Those familiar with the Dynamics SureStep methodology used by partners to implement Microsoft Dynamics AX will probably recognise that this methodology is a (modified) waterfall approach. Many partners have adapted the model to allow for some flexibility to mitigate the weaknesses of the pure waterfall approach.

So, is the waterfall approach still relevant with all its shortcomings? In my view, yes! If applied sensibly with judicious application of change control and some agile elements (during build), it still makes sense and can be a strong tool to govern the relationship between customer and supplier.

ITERATIVE WATERFALL

The strength of the waterfall approach is its rigid approach to stage-gating. Basically, you cannot move forward in the implementation cycle without agreement on key deliverables. This is sound practice in an ERP implementation. Building on this strength, the waterfall approach can be improved by making it iterative. With an iterative approach, the customer and supplier can agree to go back and revisit a requirement, if it is deemed to be wrong or no longer relevant. This allows some flexibility and gives the customer a better solution at the end of the day.

Using an iterative approach, however, requires strong change management. Revisiting a requirement should be done under change control to ascertain the impact on time and cost. Also, a change to a requirement may impact other requirements and design, so cross-functional coordination is necessary.

Poorly managed iterative implementations tend to become chaotic because anything can be questioned at any time. This should not be allowed.

PROTOTYPING

Prototyping, often referred to as a Conference Room Pilot (CRP), is an approach where the analysis and design phases are replaced by hands-on sessions where users and consultants sit together and walk through the business processes using a mock-up of the solution. The solution mock-up often contains partially migrated (master) data to give the user a more comprehensive experience of the future solution.

In my experience, prototyping can be really efficient and beneficial for small-scale implementations where complexity is low and cross-functional coordination is minimal. However, as complexity grows the need to communicate through written documentation becomes more pronounced and forces prototyping to slip back toward an iterative waterfall approach but without the benefits of genuine stage-gating and change control.

In some implementations, I have seen prototyping used for parts of the analysis and design, but within a waterfall framework. This combination can be very powerful.

To make prototyping successful, though, requires a thorough understanding of the company’s business processes upfront by the consultants to allow them to configure the CRP in a meaningful way. Without this thorough process knowledge, prototyping in my view, carries significant risks.

AGILE

I will not pretend to be an expert on agile software development. My experience with agile as an approach is based on a number of ERP implementations where agile methods and tools were used. Firstly, I would say that applying agile to the customisation and development processes in an implementation works nicely and should be done unless there is some compelling reason not to.

However, applying agile to analysis, design and configuration is a slightly different matter. For agile to work, a comprehensive product backlog is required, which can be divided into sprints. For analysis, it is often difficult to produce a comprehensive product backlog upfront because, by definition, the analysis should be used to understand what the implementation must deliver, i.e. to create the product backlog.

During design, as the (build phase) product backlog starts building, it becomes clear that with the complexity of ERP, the cross-functional dependencies are significant. As items in one backlog may depend heavily on items in other backlogs, it becomes increasingly difficult to prioritise the sprint backlogs. If items start falling behind schedule, prioritisation in other work streams may grind to a halt.

One compelling reason for applying an agile approach is to be able to deliver tangible results more quickly. ERP with its interwoven processes and dependencies is often difficult to deliver in discrete chunks of functionality. Therefore, agile may just become another way of performing ordinary project activity planning.

Another compelling reason for applying an agile approach to an ERP implementation is the ability for the customer to de-prioritise unimportant features. I have seen this work in practice, but in my view there are some prerequisites that need to be in place:

  • The customer is willing to accept the financial risks. They pay for the features they prioritise.
  • Strong cross-functional communication and coordination is required.
  • Users in the work streams need clear guidance on how items in the backlogs can be prioritised.

ORGANISATION

Apart from selecting the right methodology and approach, organising the work streams can be another challenge and it is important to get it right. The challenge is to create a work stream organisation with the right level of granularity. Often, work streams are constructed using a traditional swim lane approach such as:

  • Order-to-Cash.
  • Procure-to-Pay.
  • Hire-to-Retire.
  • Etc.

However, this may result in some massive work streams, such as Order-to-Cash basically encompassing everything the company does except finance. If the work streams become too granular and focus on smaller process areas, the cross-functional coordination may become too cumbersome. Striking the right balance is essential and very company-specific, so before constructing the work streams thorough assessment and dialogue is required. In a previous blog post, I wrote about using APQC as a framework for organising work streams.

“NO WORRIES, I HAVE A FIXED PRICE CONTRACT!”

Regardless of which approach is taken, I have heard many project owners say over time that they do not care as long as the price is fixed. Personally, I strongly disagree with this viewpoint. What is the benefit of having a fixed price if, at the end of the day, the system does not work as intended? With a fixed price comes a fixed scope and potentially endless quibbling over scope changes.

RECOMMENDATION

All this leads me to the question:

If I was to start a new ERP implementation tomorrow, which approach would I take?

Firstly, I would assess the size and complexity of the implementation. Also, I would ascertain my company’s risk appetite and the maturity of my own organisation.

If the implementation was small with relatively little complexity and my company was happy to accept a certain financial risk, I would choose to conduct a scaled-down analysis and design phase (potentially using prototyping) to fix the scope and build a sensible product backlog. Then I would execute the build and deployment phases with an agile approach.

If, on the other hand, the implementation was large and complex, I would frame an iterative waterfall approach to ensure the proper stage-gating with strong governance. With many stakeholders in a project, I believe this is still the most prudent and effective approach. However, if possible, I would certainly want the development carried out using an agile approach.

Lastly, I would like to emphasise that, in my experience, it is important to have a close dialogue with potential suppliers on this. Suppliers have different levels of experience with these approaches and their contractual frameworks may not be well-aligned to them all. It would not be advisable to force a supplier to use an unfamiliar approach through a misaligned contractual framework.

So, in my view, it all comes down to striking the right balance and combining the right elements.


Cloud ERP… Is It the Future?

In recent years, a lot has been said and written about “The Cloud” and why we all need it. Obviously, we all use “The Cloud” in our daily lives through our smartphones (e.g. Garmin Connect, Apple iCloud Photostream etc.) or through Microsoft Office 365, OneDrive or Dropbox on our desktop computers. Some even work in genuine “Cloud” applications such as Salesforce, WorkDay or Microsoft Dynamics CRM Online.

However, in this blog post, I would like to discuss what “The Cloud” means for Enterprise Resource Planning (ERP) and why it is important for a company to have a strategy going forward.

“The Cloud” obviously comes in a number of flavours, namely:

  • Infrastructure-as-a-Service (IaaS)
  • Platform-as-a-Service (PaaS)
  • Software-as-a-Service (SaaS)

with SaaS being the highest abstraction layer in the stack. Also, “Cloud” services are offered by a number of different vendors including Amazon (AWS), Microsoft (Azure), Salesforce, Oracle and IBM (SoftLayer). Each “Cloud” vendor delivers different layers in the stack or a combination thereof. However, it is fair to say that most vendors, apart from Microsoft, IBM and Oracle, focus on either IaaS or SaaS.

This blog post will not go into details regarding vendors or market leadership, but this article in Business Insider gives some basic insights.

When designing a “Cloud” strategy, I believe it is important to ask some important questions:

  1. Are we only looking for a flexible, OPEX-based approach to provisioning infrastructure?
  2. Is cost the key driver for the transformation?
  3. Are we looking to convert legacy applications to “The Cloud” or will they stay on-premise?
  4. Are we willing to put mission critical workloads into “The Cloud”?
  5. Does the nature of our data require special attention?
  6. Will we be looking to leverage platform services such as databases and integration middleware in “The Cloud”?
  7. How will we want to deal with security in this new scenario?

In the following, I am sharing some of my experience and views on “Cloud” strategy.

INFRASTRUCTURE (IaaS)

Some wise person said that IaaS is just running your applications on someone else’s hardware, and to a degree this is true. However, in my view the main benefits of basing your strategy on at least some IaaS component are:

  • Ability to scale quickly and unconstrained by the limitations of your data centre.
  • Simple and efficient hardware provisioning.
  • Spike management.

For ERP, starting out by using IaaS for development and test environments can be an effective strategy that allows you to quickly scale. Also, this approach gives you the opportunity to iron out issues before deploying production workloads to “The Cloud”.

PLATFORM (PaaS)

Provisioning your platform products as services is a fairly new concept to most of us and takes some real consideration when designing your architecture. With Microsoft Azure you can, for instance, provision SQL Server as a service, but what if your transactional application resides on-premise? Will the network be able to deal with the load?

A number of vendors including Microsoft (Azure) and Oracle (Cloud Platform) offer a comprehensive suite of platform services that allow developers to leverage sophisticated resources such as database services, integration services and analytics services. Also, in the case of Azure, you can leverage complex services including big data analytics, machine learning and Internet-of-Things (IoT) connectivity as PaaS.

PaaS vendors generally have a service, such as Azure Service Bus, that allows you to connect your on-premise applications with your platform or application in “The Cloud”, but it will require some re-factoring of legacy applications to support this approach. So make sure to plan for this transition and code re-factoring if you are considering leveraging PaaS.

APPLICATIONS (SaaS)

Most people will be familiar with applications such as Salesforce, Microsoft Office 365 and ServiceNow. All of these are applications delivered through “The Cloud” and paid for through subscription in some form or other. In the ERP space the key “Cloud” player in recent years has been NetSuite, but now other vendors such as Microsoft are muscling in with Dynamics AX and Project Madeira, which is a scaled-down ERP solution tightly integrated with Office 365.

Obviously, SaaS provisioned applications rely on PaaS and IaaS to function so subscribing to a SaaS application means you implicitly leverage PaaS and IaaS services.

In my experience, the key architectural considerations when subscribing to a SaaS application are:

  1. How will the application integrate with my existing security setup, and who is overall accountable for security?
  2. Does the vendor’s application life-cycle management process align with my existing process and do I have any control over patching?
  3. If the application is mission-critical, how will I manage major incidents?
  4. Does the application contain personal or highly confidential data, which must be stored in a specific geographic location or may not be geo-replicated?
  5. Does my subscription or contract provide mechanisms for exiting the relationship?
  6. Will deploying mission critical applications in “The Cloud” require special network connections?
  7. Am I able to get to my data for integration and analytics purposes?

“THE CLOUD” AND ERP

So far, uptake of ERP in “The Cloud” has been fairly limited, but I believe this is about to change. Some ERP-related workloads such as CRM (Salesforce and Dynamics CRM), HRM (Workday), Service Management (ServiceNow) and Expense Management (Concur) have for quite a while now been increasingly provisioned through “Cloud” subscription, and I believe we will see more ERP vendors moving quickly to offer “Cloud” solutions in the near future.

For a small business already using Salesforce or Dynamics CRM and Office 365, the next logical step is to subscribe to a cloud ERP solution such as Microsoft Project Madeira or Intuit QuickBooks Online. These solutions are basically “vanilla” software packages offering fairly comprehensive functionality and integration with other popular small-business applications. For small businesses, subscribing to “Cloud” ERP is a compelling case with low initial costs and rapid implementation.

However, for medium-sized businesses and enterprises the switch to “Cloud” ERP may prove more complicated. As mentioned above, transitioning from on-premise ERP to “Cloud” ERP raises a number of architectural and security questions that must be dealt with through careful architectural planning.

Relationships with existing hosting and Application Management Services (AMS) partners may also need to be revisited to address scenarios that combine public and private “Cloud” applications with on-premise legacy applications and platform products. Staying in control of such an environment requires strong governance and change management skills.

In a transitional phase, we may in reality see “Cloud” ERP in the enterprise being deployed on-premise with some resources provisioned and consumed through cloud services. For Dynamics AX this is what is being promised with Azure Stack when released.

Regardless, implementing “Cloud” ERP in the enterprise will probably never be “vanilla” and should be approached with thoroughness and due consideration.

WILL THE BENEFITS OUTWEIGH THE CHALLENGES?

With Dynamics CRM we are seeing some features becoming available first in the “Cloud” edition and only subsequently being released on-premise. I am not sure this is a relevant reason to go for “Cloud” ERP. However, being able to offload accountability for performance, stability, security and patching to the vendor may be. Clearly, for small businesses unable to hire skilled IT staff this is a compelling reason, but for medium-sized businesses and enterprises with in-house skills or outsourcing agreements this may not be the case. At the end of the day, it probably comes down to an individual choice, business case and costs for each company.

As the workforce becomes more mobile, being able to access your ERP solution from anywhere, at any time, becomes increasingly important. Deploying and scaling mobile solutions globally is very difficult and costly, so maybe mobile scenarios will be the compelling reason for some to move to “The Cloud”.

DOES “THE CLOUD” SPELL THE END FOR MONOLITHIC ERP?

Certainly, there is the potential for disruption as ERP moves to “The Cloud”. Specialist vendors seem able to provide comprehensive features across some traditional ERP domains including:

  • HRM
  • S&OP
  • Expense Management
  • Service Management
  • Project Management
  • Accounts Payable
  • Procurement and Sourcing
  • Document management

However, these offerings do not yet strike at the core of ERP namely financials, supply chain management, order management and manufacturing, but who knows…

In my view, the core ERP solution for mid-sized businesses and enterprises will remain a force also in the future “Cloud” world, but we are likely to see more non-core workloads being deployed through alternative “Cloud” offerings.

As you can see from my musings above, I believe we are still in the very early days of “Cloud” ERP and it will be exciting to see what the future holds. I encourage you to use this blog post to share your thoughts and views on “Cloud” in general and more specifically, “Cloud” ERP.

Posted in Uncategorized

ADKAR as a Framework for Change?

HAVE YOU EVER BEEN PART OF A PROJECT THAT FAILED?

If you have worked with systems implementation for a while, there is a good chance that you have come across a few. Some while ago, I wrote a blog post on the subject.

I bet that the application or system you implemented was pretty much okay, but that the organisation and its people failed to change and adapt to the new ways of working.

I am sure the above rings a bell with many of you who have worked with systems implementation for some years.

Also, it is fair to say that most of us recognise that a key reason for failure is the lack of genuinely good change management. Very often, the system is implemented as a technical exercise by a systems integrator, leaving change management to the customer, who may be very good at their business but has no explicit change management skills.

In some cases this actually works and everybody is happy, but too often change management equates to a bit of training and then we are done with it.

Maybe a more structured approach to change management could help, and this is where ADKAR comes in. Obviously, there are a number of change management frameworks out there, and every major consultancy probably has its own, but to me ADKAR is simple and accessible.

In the following, I will go through the five key building blocks of ADKAR and why they are necessary to facilitate successful change. At the core of ADKAR is the idea that change starts with each individual, and I quite like that approach. In a project, you need to convince each individual that change is necessary, explain how to change and show how the change will affect them.

Awareness of the Need for Change: Firstly, getting people to understand why change is necessary, and the consequences if it is not made, is pivotal. Most people are naturally averse to change if they do not understand why it is needed. People who understand the underlying need for change are motivated to make it and will actively support the process.

Desire to Support the Change: Individuals may have different reasons for supporting change, but they need motivation. Fear is a powerful motivator, so if the consequence of not changing is job loss or a similar catastrophic event, some people may feel the necessary urgency. For others, pecuniary incentives or a potential promotion may be necessary. Again, motivation is individual and should be treated as such if change is to succeed.

Knowledge of How to Change: This is often the core component of any change management programme I have come across: training, new work instructions, process diagrams and other tangible deliverables that give everyone on the project the feeling that “we are certainly doing change management”. And yes, this needs to be prioritised and not become a side issue as we get pushed for time or resources during the project.

Ability to Demonstrate New Skills and Behaviours: Giving people the knowledge to change does not necessarily make them able to change. As part of the project, we need to ensure that change is actually happening to the appropriate standard. Embedding change management into the overall plan is key, and introducing genuine stage-gating that includes change management checkpoints is good practice.

Reinforcement to Make the Change Stick: In my experience, management must, throughout the project, reinforce the change management effort and constantly ensure that the first four components (Awareness, Desire, Knowledge, Ability) are being implemented and pushed.
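Because the five building blocks above are sequential, Prosci's guidance is to focus effort on the first element where an individual scores low (the so-called "barrier point"), since later elements cannot compensate for an earlier gap. The following is a minimal sketch of that idea in Python; the 1-to-5 scoring scale follows Prosci's convention, but the threshold value and the data shapes are my own illustrative assumptions, not part of the framework itself:

```python
# The five ADKAR elements, in the order change must progress through them.
ADKAR = ["Awareness", "Desire", "Knowledge", "Ability", "Reinforcement"]

def barrier_point(scores, threshold=3):
    """Return the first ADKAR element scoring below the threshold.

    Prosci calls this the 'barrier point': change effort should focus
    there, because strength in a later element cannot make up for a gap
    in an earlier one. Returns None when no element is below threshold.
    """
    for element in ADKAR:
        if scores[element] < threshold:
            return element
    return None

# Example: this person understands why change is needed (Awareness: 4)
# but is not yet motivated (Desire: 2), so Desire is where to intervene,
# regardless of the even lower scores further down the sequence.
scores = {"Awareness": 4, "Desire": 2, "Knowledge": 1,
          "Ability": 1, "Reinforcement": 1}
print(barrier_point(scores))  # Desire
```

In practice you would assess each individual (or group) separately, which is exactly the point of ADKAR: change happens one person at a time.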

On paper, change management seems extremely simple. However, in our eagerness to deliver the system configurations, integrations, data migration etc. in a project, we tend to overlook the need to motivate people to change and to give them the tools to make the change happen. Last, but not least, we often fail to build into the plan the controls that ensure synchronisation between the technical implementation and the progress on change.

In my experience, before commencing a major systems implementation project, any organisation should ask themselves: DO WE NEED TO CHANGE?, ARE WE REALLY READY FOR CHANGE? and WHAT WILL HAPPEN IF WE DO NOT CHANGE? You can certainly use ADKAR as an inspiration for answering these questions.

Further reading: If you are interested in learning more about ADKAR, I can recommend this book on the subject.

Source: The source of this blog is the ADKAR framework developed and marketed by Prosci. ADKAR is the copyright of Prosci.

Posted in Change Management, Project Management

Dynamics AX Workspaces

With the new version of Microsoft Dynamics AX (previously known as AX7), Microsoft has introduced the concept of a Workspace, which replaces the Role Centre known from previous versions of Dynamics AX.

In my experience, the Role Centre never caught on because it was perceived as difficult to customise and a “nice-to-have” from a customer perspective. With Workspaces, I believe this is about to change for two reasons:

  • A Workspace is a “native” Dynamics AX component.
  • The Workspace is simpler and more intuitive for the user.

When a user opens the new Dynamics AX browser-based client, they are immediately presented with their Dashboard showing the available Workspaces, as shown in the following screenshot:

[Screenshot: Workspace_InvoiceEntry 2.PNG]

If, for instance, you select the Sales order processing and inquiry Workspace, you get the following view as standard:

[Screenshot: Workspace_InvoiceEntry 3]

Clicking on one of the tiles automatically brings up the related screen or view. If I click on the Unconfirmed tile, I get an overview of all unconfirmed sales orders as shown here:

[Screenshot: Workspace_InvoiceEntry 4.PNG]

As you can see, the view is presented in the sales order entry screen allowing me to quickly action one or more unconfirmed orders by using the buttons in the ribbon.

By right-clicking on a tile, I am able to personalise it by changing its name, hiding it and so on. In this example, I chose to pin it to the Dashboard.

The Sales order processing and inquiry tile on the Dashboard now displays a link to unconfirmed orders and, in addition, the number of unconfirmed orders as shown below:

[Screenshot: Workspace_InvoiceEntry 6]

In this way, the user is able to rename, hide, remove or re-order the elements in their Dashboard and Workspaces.

As the following example shows, Workspaces can also contain Power BI reports:

[Screenshot: Workspace_InvoiceEntry 7.PNG]

The above is an example from the Cost analysis Workspace.

In my view, many more Dynamics AX users will take up these new Dashboard and Workspace features because they are much simpler and more intuitive than the Role Centres that went before.

If you go to the Microsoft Dynamics Help Wiki and search for “Workspaces”, you get a lot more insight into how Dashboards and Workspaces can be designed, built and customised.

Posted in Uncategorized

Is APQC the Right Way to go in a Dynamics AX Implementation?

Meeting

Ever since Microsoft decided to embed the APQC process classification framework into the Business Process Modeller in Lifecycle Services (see previous post), the question to anyone responsible for a Dynamics AX implementation has been: is APQC the way we should go with our implementation?

When we started out on the Dynamics AX programme I am currently responsible for, we had to decide on a number of things:

  1. Did we, as a company, have a common global process language or classification framework we could leverage?
  2. How should we structure our workshops?
  3. Did our implementation partner have a process classification model we could use as a starting point?

Since the answer to number 1 was a definite NO! – and this is probably the case for many (most) companies, we could go two ways:

  1. Spend a lot of time (and consultancy fees) mapping our processes, classifying them and building a common language from that, or
  2. Use an existing process classification framework and map our processes back to it.

We chose option 2!

When you are at the formative stage of an implementation and need to decide how to structure your analysis and design workshops, in the main, you decide between a functional approach and a swim lane approach. With a functional approach you often end up replicating the processes you have today, because you analyse them from the same functional view you are already used to. With a swim lane approach you tend to challenge the processes more because of the holistic view, but a classic swim lane such as order-to-cash (including fulfilment) may prove too all-encompassing, and you end up trying to handle all the company’s processes in one swim lane.

Instead of these more traditional approaches, we chose to use the APQC framework to structure our workshops. This meant dividing the project into a number of work streams based on APQC L1. Subsequently, we structured workshops based on L2 and discussed processes at L3. It did not work for everything, so some creativity and improvisation may be required, but it gave us a good starting point for our planning.
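The L1/L2/L3 structuring above is essentially a three-level hierarchy flattened into a workshop plan. As a rough illustration, here is a small Python sketch of that idea; the category and process names are placeholders I made up for the example, not actual APQC PCF entries:

```python
# Illustrative three-level process hierarchy:
# L1 categories become work streams, L2 process groups become
# workshops, and L3 processes become the discussion topics.
pcf = {
    "Order to Cash": {                      # L1 -> work stream
        "Manage sales orders": [            # L2 -> workshop
            "Enter sales orders",           # L3 -> discussion topic
            "Confirm sales orders",
        ],
        "Invoice customers": [
            "Generate invoices",
            "Post invoices",
        ],
    },
}

def workshop_plan(framework):
    """Flatten the hierarchy into (work stream, workshop, topic) rows."""
    return [
        (stream, workshop, topic)
        for stream, workshops in framework.items()
        for workshop, topics in workshops.items()
        for topic in topics
    ]

for row in workshop_plan(pcf):
    print(row)
```

The value of this kind of flattening is that the same hierarchy drives both the programme organisation (work streams) and the workshop agendas, so nothing at L3 gets lost between workshops.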

Lastly, our implementation partner did not have a process classification model of its own, so choosing one ourselves was both necessary and easy. I think choosing an industry standard, rather than a proprietary model from our partner that was likely to tie us to that partner, is the sensible way to go.

Will the choice ultimately make a significant and positive difference to our Dynamics AX implementation? The jury is still out on that one since Microsoft has not actually changed the product to align with the model as far as I can tell, but maybe that is not the point.

The point, for us at least, is that using APQC in the formative stage of an implementation may save you time and frustration and give you a structure on which to build your programme organisation and processes. Whether this is a sustainable advantage or not, I will report on in a future blog post.

Could we have gone with a different framework? Yes, we could for instance have gone with APICS SCOR®, but that model is heavily biased toward supply chain processes, and Microsoft chose APQC, so going with something else did not seem to make a lot of sense.

So far APQC seems the way to go…

Posted in Dynamics AX, Project Management