
SAP Analytics
Thought Leadership

3 Reasons to Be Skeptical of Best of Breed Cloud EPM Vendors' Claims to Support xP&A

Posted by David Den Boer on Tue, Aug 16, 2022

EPM as a solution category has arguably been around for more than two decades. Before the turn of the century, a long line of solutions starting with Hyperion was developed to alleviate specific pain points caused by Excel-based financial planning and analysis (FP&A) processes. 

In the first significant redefinition of EPM technology requisites, Gartner has recently coined xP&A, for Extended Planning & Analysis. While Gartner did not necessarily define an advanced performance management process to go along with these new platform technology requirements, xP&A does make possible some highly valuable advancements that are, in our humble opinion, worth serious consideration. 

Two key themes underpin xP&A:

  • Alignment of Operational and Financial planning and analysis
  • Incorporation of AI features such as Predictive Analytics

Of course, astute users of EPM might observe that using EPM tools to cover operational planning and analysis is not new, nor is predictive analytics. What is new is aligning these capabilities as a unified platform and assigning it a label. 

Let's face it, Gartner is influential. Redefining this category had a ripple effect on the market almost immediately. Vendors formerly identifying as EPM technology developers quickly fell into line, rebranding their product portfolios as supporting xP&A. Customers are sure to follow Gartner's classification and articulate requirements for new analytics solutions using the fresh xP&A framework. 

How do longstanding vendors support these newly defined requirements? Prospective customers seeking the benefits of xP&A are right to be skeptical. Let's discuss three key ways technology vendors offering cloud EPM solutions may be overplaying their capabilities when they claim their solutions support xP&A. 

As a reminder, Gartner has identified a list of required technical capabilities to be deemed as supporting xP&A. However, Gartner has not specified a new process that leverages this enhanced feature set. Hence, the default position is that supporting your existing operational and financial planning and analysis activity is the process support requirement. 

Bottom line: Should you rip out your existing EPM tool to reimplement your current process on a rebranded xP&A tool? Definitely NOT! As discussed here often, applying new technology to your existing process is unlikely to yield significant value. 

Here are the risk points of these Best of Breed Cloud EPM solutions, now rebranded as supporting xP&A: 

Data Integration Limits 

Per our reference framework for advanced EPM (and xP&A, for that matter), there are two types of integration more mature solutions attempt to manage and support: horizontal and vertical. Horizontal integration aligns across strategic, operational, and financial processes. Vertical integration aligns from summarized reporting useful in dashboards and aggregated financial statements, through intermediate levels of detail typically found at a planning level, down to the most granular data found in transactional systems.

Whichever level of detail you're contemplating, more expansive coverage horizontally or lower levels of data vertically means larger data volumes and more data objects to manage. Larger data volumes require more storage, introduce significant latency shuffling data from one place to the next (moving transactions to the cloud, moving operational models to financial models, etc.), and demand more processing and memory horsepower to manipulate. The more data movement, the more likely conversions are required, and each conversion adds maintenance points, risk of translation errors, and processing steps. Ultimately, all of these gyrations increase overall process latency. 

Best of breed cloud EPM/xP&A vendors rely on a "land and expand" sales and solution strategy. They say, "Buy our product to solve one functional domain, and as long as you have that in-house, might as well solve the next functional challenge too!" It makes sense on a certain level, and clients are willing to assume that if one area is satisfied, the next one will be too. However, if you fast forward through these iterations to using the same platform for most or all of your functional domains required for comprehensive planning (which can easily number in the 10-25 models across operations and finance as contemplated by xP&A), that could form a tipping point where you encounter hard limits. 

Multi-tenant cloud architecture imposes a limit on workspace capacity. Understanding how prospective vendors address "large" capacity requirements is essential if clients target xP&A benefits, because clients will add models, data, and complexity over time. We have found some cloud EPM vendors with smallish workspace limitations to begin with. We can foresee this limitation, which starts as a non-issue with a single model or a handful of models, rapidly becoming a brick-wall barrier when applied to the full spectrum of functional domains. In response to this hard limitation that clients have encountered, some vendors have developed a way to stitch together multiple workspaces, with a "super-workspace" drawing on data from sub-workspaces to cobble together support for larger capacity. 

Clients can easily imagine the perils of having transaction data from various sources across their enterprise feed models arrayed across multiple workspaces: coordinating data flows and then reporting across these distributed objects becomes a complex, fragile, high-latency endeavor before too long.

If your requirements call for advancing vertical data integration, note that some vendors lack real-time integration with transactional data sources. Such limited solutions require loading these transactions to their cloud platform in order to access the detailed records that support summarized groupings. Not only is this time-consuming (forget aspirations of real-time reporting), it also rapidly consumes finite storage space within the workspace. That space is not only limited; it can become exponentially more expensive, since data storage is often a metric on which cloud solution pricing is metered. 

If your organization is interested in xP&A requirements (and at this point, we encourage all clients to set their sights on xP&A), play out what your data storage requirements are likely to be now and over the next few years. Will the convenience of using the same platform for multiple business domains ultimately become an insurmountable limitation preventing you from achieving the vision? 
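To make "play out your data storage requirements" concrete, here is a minimal back-of-the-envelope sketch. All of the figures (row counts, bytes per row, growth rate) are invented assumptions for illustration, not vendor benchmarks; substitute your own estimates.

```python
# Hypothetical sketch: projecting workspace storage as an xP&A footprint
# grows. Every number here is an illustrative assumption.

def projected_storage_gb(models: int, rows_per_model: int,
                         bytes_per_row: int = 200,
                         annual_growth: float = 0.25,
                         years: int = 3) -> float:
    """Rough storage estimate, in GB, after `years` of compounding data growth."""
    base_gb = models * rows_per_model * bytes_per_row / 1e9  # storage today
    return base_gb * (1 + annual_growth) ** years

# A single finance model vs. a 20-model xP&A landscape at finer grain:
single = projected_storage_gb(models=1, rows_per_model=5_000_000)
full = projected_storage_gb(models=20, rows_per_model=50_000_000)
print(f"single model after 3 years: {single:.1f} GB")
print(f"20-model xP&A after 3 years: {full:.1f} GB")
```

Even with modest assumptions, the jump from one model to a full xP&A landscape is a couple of orders of magnitude; comparing that projection against a vendor's workspace ceiling (and metered storage pricing) is the point of the exercise.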

As an alternative, some cloud solutions are built on hyperscale cloud platforms, while others are built for on-premises deployment; either approach, with the correct configuration, can support any level of horizontal and vertical integration. Knowing how any prospective solution is architected is critical.

Inter-Model Alignment

The cornerstone of xP&A solutions is aligning operational and financial planning processes for comprehensive analytics. Opportunities for alignment across the broad operations and finance functions include subordinate objects handling transactional data, master data, workflow, data validation, report navigation, and, importantly, business rules. The benefit of this alignment is to build a synchronized bridge between operational activity (plan and actual) and financial outcomes to enhance the coordination between these two historically isolated domains. 

The inner workings of this high-level objective entail the alignment of many formerly independent models as well. Under operations, demand plans, sales & operations plans (S&OP), supply chain, inventory, production labor, and other specific topics warrant domain-specific models. Under finance, revenue, margins, labor & operating expenses, capital expenses, tax, treasury, and other particular domains apply. Before long, the task of coordinating these disparate models can grow quite complex. 

How a prospective solution handles aligning these functions across several (perhaps many) models is critical in determining how well equipped a future solution is to enable advanced requirements like those contemplated in xP&A. Is there an efficient dimension library where master data can be defined once and reused across multiple models? Can transaction data be loaded once and read in situ, minimizing redundant, high-latency data movements? Can business rules be defined in a central mechanism and executed by any model requiring updated data? Can users navigate seamlessly from one model to the next without having to master a complex landscape of objects and steps?
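The "define once, reuse everywhere" idea behind a dimension library can be sketched in a few lines. This is a toy illustration, not any vendor's actual architecture; all class and member names are hypothetical.

```python
# Hypothetical sketch of a shared dimension library: master data is
# defined once and referenced (not copied) by every model, so a change
# propagates everywhere. Names and structures are illustrative only.

class Dimension:
    def __init__(self, name, members):
        self.name = name
        self.members = list(members)

class DimensionLibrary:
    """Central registry of dimensions shared across models."""
    def __init__(self):
        self._dims = {}

    def define(self, name, members):
        self._dims[name] = Dimension(name, members)

    def get(self, name):
        return self._dims[name]  # shared reference, not a copy

class Model:
    """A planning model that references centrally defined dimensions."""
    def __init__(self, name, library, dim_names):
        self.name = name
        self.dims = {d: library.get(d) for d in dim_names}

lib = DimensionLibrary()
lib.define("CostCenter", ["CC100", "CC200"])

sales = Model("SalesPlan", lib, ["CostCenter"])
finance = Model("FinPlan", lib, ["CostCenter"])

# Updating the master data once is immediately visible to both models:
lib.get("CostCenter").members.append("CC300")
assert sales.dims["CostCenter"].members == finance.dims["CostCenter"].members
```

The alternative, copying master data into each model, multiplies maintenance points and invites the drift and translation errors described above.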

Some products were designed to meet legacy use cases (pre-xP&A) and align to a landscape of one or, at most, a handful of models. Imagining how xP&A could be embraced while attempting to align, administer, and use 20 or so independent models may expose some very undesirable consequences and limitations. 

Artificial Intelligence & Predictive Analytics

In addition to the benefits (and risks) of creating an integrated analytics platform (horizontally and vertically), a fundamental tenet of xP&A is leveraging the transformative potential of Artificial Intelligence (AI) and Machine Learning (ML) with your finance and operational planning and analysis processes. 

Clients agree that the future of finance includes heavy involvement of AI/ML to enable advanced analytics; however, few clients or vendors have identified specific business processes in which to implement or integrate AI/ML at any meaningful resolution. Absent this process definition, the default position again becomes "do your current process, but add this to it," an ambiguous notion that results in no net gain, as we have said many times. 

Setting aside the process-derived risks, we can be skeptical of technology risks depending on how EPM/xP&A vendors have integrated AI/ML into their solutions. In reality, very few have gone further than layering on surface-level AI/ML capabilities. For most products, enabling an AI/ML engine to process data is done model by model, on sub-datasets within each model. In other words, AI/ML, usually in the form of predictive algorithms, can be employed only on an individual subject matter within a single model, in isolation.

The paradox becomes that the more the client organization embraces a broad xP&A landscape, the more specific and purpose-built their models become. Instead of having one financial model to support all the planning and reporting needs at a summary level of detail, a more xP&A style of design would call for more focused models capable of capturing detailed assumptions relevant to specific areas. These clear assumptions, though, often have very intuitive relationships to outcomes. 

For example, the detailed assumptions supporting revenue and salesperson compensation typically live in different models. If revenue planning were the subject matter, there is not much need for predictive modeling to identify that selling a greater volume of products at a higher average selling price (ASP) will yield more revenue. A more valuable insight might be the relationship between salespeople's compensation and their ability to sell an increased volume of high-ASP products. This type of interaction is called "cross-functional analysis" for obvious reasons. 
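The compensation-versus-high-ASP-volume example can be illustrated with a tiny cross-functional calculation: join data that would normally live in two separate models and fit a simple trend across the joined set. All reps, rates, and volumes below are invented for illustration.

```python
# Illustrative sketch of cross-functional analysis: data from a
# hypothetical compensation model and a hypothetical revenue model are
# joined by salesperson, then an ordinary least-squares slope measures
# how high-ASP volume moves with commission rate. All data is made up.

comp_model = {"A": 0.08, "B": 0.10, "C": 0.12, "D": 0.15}  # rep -> commission rate
rev_model  = {"A": 120,  "B": 150,  "C": 210,  "D": 260}   # rep -> high-ASP units sold

reps = sorted(comp_model)                 # the "join" across models
x = [comp_model[r] for r in reps]
y = [rev_model[r] for r in reps]

# Ordinary least-squares slope of units sold vs. commission rate.
n = len(x)
mx, my = sum(x) / n, sum(y) / n
slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
print(f"estimated slope: {slope:.0f} units per unit of commission rate")
```

If the data for each model sits in a separate workspace or a separate third-party analytics engine, even this trivial join becomes an extract-transform-load exercise, which is precisely the friction described above.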

Many best-of-breed cloud EPM solutions outsource predictive capabilities. Such products enable this functionality by extracting data from their product and loading it to Google, Amazon, or Azure's data engine. The value of such analytics should be intuitive, but how seamlessly these tools integrate across models will dictate what types of analytics are possible.

Here is one “killer question” to ask if your organization is considering a new EPM solution to gain advanced functionality like predictive analytics, only to discover the prospective product requires extracting data from your EPM product and uploading it to a third-party platform: “Do I even need this new product to enable predictive?” The answer is no. You could add predictive to your existing EPM solution (regardless of age) by aligning the third-party platform with the incumbent one. This alone could save millions!

What happens if you have 20 models, as in the case of a more mature and complete xP&A deployment? How exactly would you load 20 models to a third-party platform to access this functionality? What kind of translation is required? What latency would this invoke? We can take real-time modeling off the table for starters, but what other constraints might be encountered? 

There are many considerations when purchasing a new platform in the hopes of accessing advanced features.


I have gone to great lengths not to mention any products by name. Still, generally available EPM products can be divided into two camps with respect to meeting highly valuable and coveted xP&A-grade requirements: Best of Breed (BoB) and Megavendor Platform solutions. I will leave it to the readers to decide which camp their incumbent product and prospective products belong to. 

Most products share the hope for client xP&A deployments based on the sweeping assumption that the value of an xP&A solution is predicated on using the same technology platform across an increasing number of models. Vendors of this mindset celebrate xP&A because they want to convince customers that adopting their chosen solution in every possible functional model across the xP&A landscape is the most efficient and best possible configuration. This perspective fails to contemplate any advanced process, where BoB EPM clients may particularly encounter the risks I describe above.

This risk is maximized when choosing a product before choosing a process and succumbing to the vendor’s suggested expansion using the same product across all business domains. Ironically, this focus on covering existing functional silos (but not enabling a larger process across silos) risks hobbling process effectiveness to be not much better than the status quo.

This legacy understanding of EPM processes may be difficult to detect through the veneer of cloud-based solutions touting xP&A and equipped with some shallow predictive capabilities. But make no mistake: tools and projects that do not explicitly drive a more robust process are ultimately regressive. Clients falling for this pitch will spend more money and be more limited than they ever hoped. 

I would describe the second product grouping as more of a mindset, as I do not believe a single product embodies all of the functionality required to transform processes. Fulfilling the vision of this xP&A mindset requires a product purpose-built to avoid the pitfalls I describe above: intentionally scalable to support horizontal and vertical integration at the level of detail required to fuel powerful AI algorithms. No EPM/xP&A product supports this requirement alone today. No product is purpose-built to support 20 or so models; their scalability, interconnectivity, and navigation were not designed with this use case in mind. 

Finally, predictive analytics is a bolt-on afterthought to most of the available products. SAP alone is somewhat novel in having an integrated predictive engine from the beginning in SAC, but this needs maturing to fulfill an advanced process. At least the potential is there, and this is more than every other product can claim. 

Why am I excited? First, whoever has read this far has confirmed what I believe: xP&A is valuable and worth targeting as a vision for every company's performance management configuration. I also know I have worked for the better part of two decades to define a successor performance management process to the venerable Excel-based data silo and "serial analytics" process. I have it defined – "Dynamic Networked Analytics" is the name for this new process. We will be discussing this in great detail in the coming months.

The final reason for my eager anticipation of the mainstreaming of xP&A is that the talented team at Column5 and Darwin EPM has been working on an off-the-shelf product that enables this advanced business process. Implementing this process and methodology will automatically advance organizational performance management capabilities to a place where Excel will be left far behind. 

This innovation is here, and visionary clients can adopt it today. In a few months, this will become even easier to adopt. 

Are you interested in learning more? 


Schedule a FREE 30-min Meeting

Topics: EPM ROI, EPM, xP&A

