Reinventing the Analytics Process

By Nilly Essaides
September 8, 2020

No one could have predicted the massive disruption brought on by the coronavirus. But some companies were better prepared than others to withstand the turbulence. Finance organizations that had acquired more-advanced analytics and developed flexible planning processes had a head start. Those that had not now face an urgent need to upgrade their capabilities. Pre-pandemic, 79% of respondents to our 2020 Key Issues Study reported that improving analytics, modeling and reporting was their No. 1 transformation initiative. As we continue to face extreme uncertainty, the pressure to deliver real-time, mission-critical insight has only intensified.

Finance transformation initiatives planned for 2020

Intensified Pressure to Improve Analytics

To say the crisis had a big impact on planning activities is an understatement. Fifty-two percent of respondents to our COVID-19 Finance Response Poll (2020) said the pandemic has had a major effect on their forecasting horizon, and 9% were forced to completely overhaul their approach. In addition, 55% saw a significant impact on their use of modeling (or lack thereof), and 13% experienced a complete overhaul. Rising infection rates are triggering another wave of volatility. Finance will have to dial up forecasting frequency and its use of advanced analytics techniques in order to provide leadership with the insight it needs to make critical business decisions.

One clear sign of the rising demand for planning and analytics solutions is the growing appetite for cloud-based EPM applications. It’s something we see across our client base. FP&A teams are looking to develop and run predictive analytics that make it much easier to produce range forecasts – rather than single-point forecasts – anchored in statistical modeling. For example, we recently worked with a company to develop a predictive P&L forecasting model. After running the new model alongside its traditional process, the company found that the statistical model produced far more accurate forecasts, even though – or because – it relied on a smaller number of KPIs.
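To make the idea concrete, here is a minimal sketch of what a driver-based range forecast can look like. The KPIs (orders, headcount), the synthetic history and the 80% prediction interval are all assumptions chosen for illustration; the sketch simply shows how an ordinary least squares model from the statsmodels library can return a range rather than a single number.

```python
# Illustrative sketch only: a driver-based revenue range forecast using two
# invented KPIs and synthetic history. Not the model described in the article.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 36  # three years of monthly history

history = pd.DataFrame({
    "orders": rng.normal(1_000, 100, n),
    "headcount": rng.normal(250, 10, n),
})
# Synthetic monthly revenue driven by the two KPIs plus noise
history["revenue"] = (
    50 * history["orders"] + 200 * history["headcount"] + rng.normal(0, 5_000, n)
)

X = sm.add_constant(history[["orders", "headcount"]])
model = sm.OLS(history["revenue"], X).fit()

# Forecast next month as a range, not a single point (80% prediction interval)
next_month = pd.DataFrame({"const": [1.0], "orders": [1_050.0], "headcount": [255.0]})
pred = model.get_prediction(next_month).summary_frame(alpha=0.2)
print(pred[["mean", "obs_ci_lower", "obs_ci_upper"]])
```

The point is the shape of the output: an expected value bracketed by a lower and upper bound, which is what distinguishes a range forecast from a single-point estimate.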

Architecting the End-to-End Data Analytics Process

The pressure to look forward and take full advantage of organizational data is not only forcing finance functions to up their analytics IQ, it’s also instigating a whole new way of thinking about the data analytics process. In the past, information was collected haphazardly, kept in multiple places and used inconsistently by different parts of the organization. That setup torpedoed FP&A’s ability to come up with meaningful insights. With data velocity and volume rising at unprecedented rates, finance needs a way to collect, prepare and analyze information and produce actionable insights. Even though the crisis has placed transformation projects under the microscope, most finance organizations are powering forward with data-related initiatives (68%), and some are even accelerating them (9%).

Once it establishes a strong data foundation, finance needs to adopt an end-to-end view of the data analytics process, one that begins with data curation and ends with actionable insight.

The end-to-end data analytics process

Phase 1: The big-data engine room

The data-flow redesign should begin with building a mechanism to capture internal and external information and filter it based on relevance (for example, what’s needed to support an upcoming planning event). Given the volume of data, determining what is pertinent is critical to avoiding information overload. Decisions should involve correlation analysis between a “candidate data set” and the business outcomes sought from the data being acquired.
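As a simple illustration of that kind of relevance screen, the sketch below correlates a set of hypothetical candidate columns with a business outcome and keeps only those above an assumed threshold. The column names, data and 0.3 cutoff are placeholders, not a recommendation.

```python
# Illustrative sketch: screen "candidate" data columns by their correlation
# with a business outcome (a hypothetical weekly sales series) and keep only
# those above a relevance threshold. All names and data are invented.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 104  # two years of weekly observations

outcome = pd.Series(rng.normal(100, 10, n), name="weekly_sales")
candidates = pd.DataFrame({
    "web_traffic": outcome * 0.8 + rng.normal(0, 5, n),   # related driver
    "promo_spend": outcome * 0.5 + rng.normal(0, 20, n),  # weaker driver
    "office_temp": rng.normal(21, 2, n),                  # irrelevant noise
})

correlations = candidates.corrwith(outcome).abs().sort_values(ascending=False)
relevant = correlations[correlations >= 0.3].index.tolist()

print(correlations)
print("Columns to acquire:", relevant)
```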

Once a data set is defined, it should be “virtualized” by using modern data management platforms. This will make it more readily accessible to different users. Plus, it’s important to design a robust governance framework to ensure data is consistent and validated before it becomes available to a broader audience. Part of that process involves creating and coordinating data-access policies with the IT organization.
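The sketch below illustrates the spirit of such pre-publication checks with a few assumed validation rules (required columns, null keys, duplicates, implausible amounts). A real governance framework would be agreed with IT and be far more extensive; the rules and column names here are purely hypothetical.

```python
# Illustrative sketch: simple checks a finance team might run before a curated
# data set is published to a broader audience. Rules and columns are assumed.
import pandas as pd

def validate(df: pd.DataFrame) -> list:
    """Return a list of human-readable issues; an empty list means 'publishable'."""
    issues = []
    required = {"entity", "period", "account", "amount"}
    missing = required - set(df.columns)
    if missing:
        issues.append(f"missing required columns: {sorted(missing)}")
        return issues
    if df[["entity", "period", "account"]].isna().any().any():
        issues.append("null keys found in entity/period/account")
    if df.duplicated(subset=["entity", "period", "account"]).any():
        issues.append("duplicate entity/period/account rows")
    if (df["amount"].abs() > 1e9).any():
        issues.append("amounts outside plausible range")
    return issues

sample = pd.DataFrame({
    "entity": ["US01", "US01"],
    "period": ["2020-08", "2020-08"],
    "account": ["4000", "4000"],   # duplicate key on purpose
    "amount": [125_000.0, 125_000.0],
})
print(validate(sample))  # -> ['duplicate entity/period/account rows']
```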

By getting the data in order, finance teams can spot capability gaps and decide how to close them. For example, they can embed data engineering or data science experts in the function, work with IT business architects, and/or collaborate with outside data and analytics providers. The overarching goal is to systematize the production of data sets so finance can support the full scope of business decisions and performance evaluation.

Phase 2: Data democratization

Gone are the days of reserving data for the use of a select few. But funneling data from different sources through a single control point is not really scalable. There’s just too much data around, and in many cases it’s still collected manually. According to our research and conversations with clients, leading organizations are increasingly adopting a hybrid model that combines a BI/analytics center of excellence (COE) with cloud-based self-service tools. The COE can focus on sophisticated calculations and building algorithms, while business users are empowered to run their own queries, which speeds up the discovery process and gets them quick answers to pressing business problems. To pull this off, finance needs to prioritize the adoption of automation solutions that blend data from different sources and build new ways to query “blessed” data more methodically.
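To show what “blending” can mean in practice, the hypothetical sketch below joins an ERP actuals extract with a CRM pipeline extract into a single blessed data set and then runs the kind of ad hoc query a business user might issue against it. All source names, columns and figures are invented.

```python
# Illustrative sketch: blend two hypothetical sources into one "blessed" data
# set that business users can query themselves. Figures are invented.
import pandas as pd

erp_actuals = pd.DataFrame({
    "region": ["NA", "EMEA", "APAC"],
    "period": ["2020-08"] * 3,
    "revenue": [1_200_000, 950_000, 610_000],
})
crm_pipeline = pd.DataFrame({
    "region": ["NA", "EMEA", "APAC"],
    "period": ["2020-08"] * 3,
    "open_pipeline": [2_400_000, 1_700_000, 900_000],
})

blessed = erp_actuals.merge(crm_pipeline, on=["region", "period"], how="left")
blessed["coverage_ratio"] = blessed["open_pipeline"] / blessed["revenue"]

# A self-service style query a business user might run against the blended set
print(blessed.query("coverage_ratio < 2.0")[["region", "coverage_ratio"]])
```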

Phase 3: Value extraction

Next, finance needs to set the standards for the automated solution that will leverage the data and feed it into advanced analytics models. This could happen at the COE and/or the business-user level. For example, we see clients using standardized data to complement or replace traditional forecasting approaches with statistical models that predict financial outcomes based on different underlying assumptions.
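As one illustration of that pattern, the sketch below fits a single-driver model to invented history and applies it to three assumed demand scenarios to produce scenario-level outcomes. Real models would use more drivers and richer statistics, but the mechanics are the same.

```python
# Illustrative sketch: apply one fitted driver model to several sets of
# underlying assumptions to produce scenario-level financial outcomes.
# Drivers, coefficients and scenarios are invented for illustration.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
n = 48  # four years of monthly history

history = pd.DataFrame({"units": rng.normal(10_000, 800, n)})
history["gross_margin"] = 35 * history["units"] + rng.normal(0, 20_000, n)

# Fit a one-driver linear model (slope and intercept) on the history
slope, intercept = np.polyfit(history["units"], history["gross_margin"], deg=1)

scenarios = pd.DataFrame({
    "scenario": ["downside", "base", "upside"],
    "units": [8_500, 10_000, 11_500],   # differing demand assumptions
})
scenarios["predicted_gross_margin"] = intercept + slope * scenarios["units"]
print(scenarios)
```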

Finance teams are also establishing new ways to distribute information so it’s easier to digest. They are using more data visualization tools, natural language generation (NLG) solutions and chatbots. In our 2020 Key Issues Study, respondents said they expect a 26% year-over-year increase in the adoption of data visualization tools in 2020. Thirty-eight percent reported they already have large-scale deployments of such tools, which are both the most broadly adopted digital solution after legacy applications and the technology with the highest expected growth rate.
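To illustrate the NLG idea in its simplest form, the sketch below turns a small, invented variance table into plain-language commentary using templates; commercial NLG solutions are of course far more sophisticated.

```python
# Illustrative sketch of template-driven commentary of the kind an NLG layer
# might generate from variance data. Figures and wording are invented.
import pandas as pd

variances = pd.DataFrame({
    "metric": ["Revenue", "Operating expenses"],
    "actual": [12.4, 7.9],      # in $ millions
    "forecast": [11.8, 8.3],
})
variances["pct_var"] = (variances["actual"] / variances["forecast"] - 1) * 100

for _, row in variances.iterrows():
    direction = "above" if row["pct_var"] >= 0 else "below"
    print(f"{row['metric']} came in {abs(row['pct_var']):.1f}% {direction} "
          f"forecast (${row['actual']:.1f}M vs. ${row['forecast']:.1f}M).")
```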

To complete the end-to-end data analytics process, FP&A needs to operationalize insights by turning them into actions and recommendations. They can do it by working closely with their business-partner community. Unfortunately, this part of the process is often not well established, but it can be a critical driver in improving business conversations. The onus is on finance to shape the agenda and flow of these conversations to ensure the systematic inclusion of data analytics in management decision-making and strategy execution.

The Next Normal

The prospect of a prolonged recession and economic uncertainty will heighten management’s demand for actionable insight. In this environment of relentless change, agility, defined as the ability to make and execute decisions quickly, is more important than ever before. Our research shows that highly agile organizations significantly outperform typical companies on a range of metrics, such as EBITDA margin, net margin and total shareholder returns. Finance will need to update its planning process so it can quickly adjust to changes in market conditions. And it must speed up the insight-to-action cycle by leveraging new data-management platforms that can feed predictive models, improving the accuracy and utility of its forecasts.