Why Planning and Analytics are like PB&J

So this post is for the planning and analytics geeks out there. I enjoy watching software application categories undergo fundamental change, where real innovation starts to appear. And the nexus of planning and analytics is where this is happening.

Often the drivers behind these kinds of changes are technology-based, such as the rise of mobile, or social and economic, such as the rise of the self-employed economy. But when these external drivers gather momentum they often disrupt software categories. Some people go with the flow; others try to fight it.

With that in mind, I recently read with interest an article that made the case that Analytics (or for the old school among us: Business Intelligence) and Budgeting/Planning applications are two separate worlds that don’t really need to be unified in a single application – that putting them together is just hype, not useful.

It was on the heels of a set of announcements from SAP, with SAP Cloud for Planning bringing together both analytics and planning – large scale analytics, data visualization, modeling and planning under the same unified hood, underpinned by SAP HANA. I personally think there is real innovation to be had at this nexus of analytics and planning, but more on that a little later.

The crux of the article’s case was that Analytics is for the tech guys who get big data, data prep, data warehousing, SQL, and unstructured and structured data, while Planning is for Finance, who worry about drivers, financial allocations, forecasts, and the like.

Different Disciplines, Different People?

I get it; my background is in data warehousing, Business Intelligence, and Online Analytical Processing (OLAP). And to be honest, financial planning was a different world. When I built dashboards and analytics for organizations (typically for the crew in IT), there was often a separate planning implementation going on in the room next door for Finance. Each side looked with some disdain at the other (I preferred writing SQL to thinking about cost allocations).

When Business Intelligence first emerged in the mid-90s it was built by tech, for IT – we’re talking star schemas, semantic layers, and all that good stuff – distant from the world of finance. The first packaged planning applications for finance, by contrast, were built as apps. New technology of the time, like OLAP databases, was optimized for modeling and what-if analysis for finance, but supported fewer dimensions and less detail and was weak at ad hoc analysis, while big-iron data warehouses were optimized for large-scale analysis but couldn’t handle the changes in assumptions inherent in the modeling and planning process.

So when the rounds of vendor consolidation around Business Objects, Cognos, and Hyperion happened in the mid-2000s, the result was two (often more) stacks: Business Intelligence (BI) and Corporate Performance Management (CPM). Two different categories, different skill sets, different code-bases. Vendors glued these stacks together with a veneer of branding and single sign-on to make them look like a suite, but beneath the thin integration they remained different code-bases and experiences.

Change is Underway.

But just as NetSuite and Workday are reimagining their respective categories of ERP and HCM for the new economy, the same shift is beginning to gather pace in CPM. In ERP, for example, eCommerce capabilities increasingly need to work seamlessly with the ERP, from web storefront to order, because a digital storefront is often strategic. And HCM apps need to be mobile-first in an increasingly self-service world. CPM is undergoing a similar transformation, just in its own way.

CPM is changing because planning itself has to be more responsive, more in tune than ever before with the operating environment. And that requires analytics.

A recent Hackett Group survey showed that about a third of companies intend to implement rolling forecasting over the next few years; in total, Hackett saw over half of companies building some kind of rolling forecast process. Hackett attributed this to increased competitive pressure on companies and faster-moving markets. Companies want to see not just further out; they want their forecasts adjusted to a continually changing environment.

So doing a yearly plan/budget isn’t good enough anymore. And because organizations are increasingly moving to rolling forecasts, they need to ingest ERP, HCM, and CRM data more and more frequently. More frequent planning and the push for more accurate forecasting mean responding to external data too. Not all of this data needs to be in the plan itself, but planning professionals must be able to update planning drivers, change assumptions, and make course corrections in the face of the larger data landscape they are expected to respond to – and they need to see that environment clearly.

The data landscape they're making decisions on is larger than before, and they’re being asked to re-plan and respond to that landscape faster. Planning no longer takes place in a vacuum, and it takes place more frequently, and closer to the business.

The dashboard vendors don’t have it easy either, because standalone dashboards aren’t really good enough anymore: they have no call to action. Just seeing a chart isn’t enough – the expectation is that you’ll do something about it. You either take action in your system of record (which is why providers like NetSuite, Workday, and Salesforce offer embedded analytics), or you plan and adjust based on those insights, using engines that combine analytics and planning, like SAP Cloud for Planning, Anaplan, and Adaptive Insights. Either way, a standalone, run-of-the-mill web-based dashboard environment (or a standalone planning environment) is deteriorating in value.

But really reimagining planning and analytics as a single unified solution means starting with a clean sheet of paper. Providers like SAP are taking the lead. Remember those data stores I mentioned earlier, one optimized for planning and the other optimized for large scale analysis? Well in-memory columnar databases like SAP HANA offer the opportunity to do both in the same database and data model, which makes it easier to model and plan in the context of large scale analytics. With data visualization operating on the same data store that's being used for analysis and planning, it's a potentially potent combination, blurring the lines between analysis and modeling.

So to do this right, it really helps to have a unified system – one database engine and model, the same engine serving both the analytics and the planning, one set of common definitions, one unified user experience, one business dictionary across both. It’s no longer enough to glue these systems together, as happened over a decade ago; they have to be rethought in the context of where planning and analytics are headed, and designed together.

For once, this isn’t just vendor hype. As the nature of planning changes, a new opportunity opens up to rethink the systems that enable it.

Now time for that PB&J.

The Fake Cloud Comes to Budgeting and Planning Applications

Legacy on-premises providers are feeling the heat, as more and more businesses worldwide continue to migrate to the cloud for added agility, greater collaboration, and faster data analysis.

This cloud momentum has left many legacy players playing catch-up. They’re frantically moving their products to the “cloud” – but it’s really just the “hosting” of old. The truth is that they’re bolting old products onto a delivery model they were never designed for. Worse still, they’re marketing the result as if it were a real cloud solution. The “cloud-washing” phenomenon has now come to the budgeting, planning, consolidation, and business intelligence space, where legacy providers are warming up two-decade-old software, painting puffy cloud pictures in brochures and presentations, and hoping their prospective customers can’t spot the difference. The truth is, you can’t just move on-premises software to a datacenter and call it “cloud.” Ultimately, the customer is the loser in this scenario.

Even the media is fed up with fake cloud providers that try to pass as SaaS vendors.

Why? Because there are real, meaningful differences between solutions born and bred in the cloud, and those that were forced into the cloud to try to keep up with today’s business needs. Customers who are unable to navigate through the sea of SaaS-queraders and who are fooled by the fakers are destined to be stuck with expensive, antiquated solutions to run their businesses.

So the question is this:

Can You Spot Fake Cloud Budgeting and Planning Applications?

Here are four warning signs to look out for:

Fake cloud budgeting and planning solutions are much more difficult to use.

For a budgeting and planning solution to be successful, finance needs to be able to make changes independently. That means creating new plans, allocations, or dashboards without IT or a busload of consultants. Fake cloud solutions still carry their complex heritage; running them in the vendor’s data center still means a complex, IT-intensive user experience for you in these areas:

  • Building financial plans
  • Updating security settings
  • Creating reports
  • Writing allocations and formulas
  • Making changes to business structures
  • Tuning the application for performance

One easy way to sniff out a fake cloud? Look for multiple administration consoles, non-browser-based tools to administer the app, and large amounts of IT-facing, technical administration functionality. Even better – ask to take a free trial, and watch the SaaS-queraders scratch their heads, wondering how it’s even possible given their solution’s complexity.

In contrast, a true cloud solution is designed from the ground up for business users to manage and change the application themselves – it had to be. If you can take a free trial and be using the application with your own data within just a few minutes, you can be pretty confident it’s a real cloud solution.

A conversation with a reference customer starts with “What version are you running?”

With fake cloud solutions, all on-premises/hosted customers are on different versions. It’s much harder to share knowledge and best practices when your peer is running a different version of the software – one that might be five years old. In fact, when fake cloud providers “release” new software, their on-premises customers wait years to upgrade, and each hosted instance must be upgraded separately – often an onerous and risky process. With cloud-native solutions, 100% of customers are always on the latest release. Everyone is speaking the same language, creating a strong community for sharing tips and tricks and adopting the latest functionality.

Fake cloud solutions are often an “operations horror” behind the scenes.

With fake cloud solutions, you don’t want to see what’s going on behind the curtain – it’s often ugly. All of that IT “ops” complexity is kept away from you. Good for you, but bad for the vendor, because those old on-premises solutions were never designed to run as-a-service or to be easy on IT. Fake cloud solutions often need a separate instance for each customer, and each instance requires its own “care and feeding”: individual patching, fixing, maintenance, and upgrades. The fake cloud vendor quickly ends up with hundreds, or even thousands, of instances. And it’s incredibly easy for customizations and optimizations to break during an upgrade, because the applications weren’t designed with easy upgrades in mind.

In contrast, true cloud solutions are multi-tenant with a single code-base that’s designed to automatically migrate customizations with each new release.

A slower pace of innovation.

The best cloud companies innovate faster than fake cloud providers. Why? Because they can focus on one codebase and one platform. Imagine a world where your development team has to maintain four or five different versions of the software to support all the customers running those versions. Now imagine that all those versions run on different operating systems and databases – Windows, Linux, Solaris, Oracle, DB2, SQL Server. It’s a matrix of complexity, which saps innovation and resources.

Contrast that with a true cloud vendor, where all customers are on the same version and platform. It means 100% of the R&D team focuses on improving the application that YOU are running, not someone else’s code-base.

Use these four tips as a starting point to avoid getting burned by the fake cloud – and do your own research, as they’re merely the tip of the iceberg when it comes to warning signs to be wary of.