Softwarepakketten.nl

Blogger contributions (blogs)

Big Data for Finance, is it necessary? - Part II

Date posted 05-03-2016
Article date March 2016

Blog by Marco Van der Kooij

Big Data for Finance, is it necessary? - Part II

After reflecting on it a bit more, I have come to two main drivers for why big data for Finance is a necessity:

The first is that regulators, especially in the financial services industry, are asking for data at a more granular level. Initiatives like IFRS 4 Phase 2 and AnaCredit require data to be reported at the contract level, and I expect more initiatives like this will follow. Say what you want about regulations, but one positive is that they create a sense of urgency in organizations: IFRS 4 and other regulations will force finance organizations to act on big data in order to meet these regulatory requirements.

The second reason I discovered during a session on Integrated Reporting: predictive analytics is expected to be needed to define which KPIs are the most material to report on. In other words, predictive analytics is needed to identify those KPIs (from the hundreds or thousands of possibilities) which are most important to building an Integrated Planning and Reporting process and platform that supports the business model. It is not only about being accountable through reporting; forward-looking performance management with predictive analytics is also needed to optimize the business model for all stakeholders. Again, in the financial services industry I see a shift from accountability to forward-looking performance management, to be better in control and to better support the business with analytics and accurate rolling forecasts for both financial AND non-financial information.

Predictive analytics requires a lot of reliable data, calculation power and visualisation capabilities to run the scenario analysis that defines materiality and produces a reliable forecast. The finance organization therefore has to have confidence in, and control over, a much more granular level of data to support this.

At Tagetik we support our customers with our Business Integrator functionality to capture large volumes of data at a granular level in a staging database, perform calculations at a very detailed level, and aggregate the results for reporting. This solution has been in use for several years by our banking customers in Italy and North America. In Italy it is used to report to the Italian and European central banks; in North America it is also used by several retail banks to calculate reliable cash flow and portfolio projections at the financial instrument level. Both use cases entail millions of rows of data and banking-specific calculations at the lowest level of detail. In the past, data volume limitations, performance degradation and limited calculation sophistication made this close to impossible for the finance organization to accomplish. Now this type of big data use case is not only possible, it is becoming essential: for regulatory reasons in financial services, and in other industries for profitability analysis and KPI reporting at the customer, SKU or other very granular levels.
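To make the stage-calculate-aggregate pattern concrete, here is a minimal sketch in Python/pandas. The column names and figures (`contract_id`, `portfolio`, `balance`, `rate`) are hypothetical; the snippet only illustrates the general pattern described above and does not reflect Tagetik's Business Integrator API.

```python
import pandas as pd

# Stage: load a granular, contract-level extract into a staging table
# (hypothetical columns; in practice this comes from source systems).
staging = pd.DataFrame({
    "contract_id": [1001, 1002, 1003, 1004],
    "portfolio":   ["retail", "retail", "corporate", "corporate"],
    "balance":     [250_000.0, 120_000.0, 1_500_000.0, 900_000.0],
    "rate":        [0.031, 0.028, 0.024, 0.026],
})

# Calculate: apply a detailed, row-level calculation, e.g. a simple
# one-year interest projection per contract (illustrative only).
staging["projected_interest"] = staging["balance"] * staging["rate"]

# Aggregate: roll the detailed results up to the level used for reporting.
report = (
    staging.groupby("portfolio", as_index=False)
           .agg(total_balance=("balance", "sum"),
                total_projected_interest=("projected_interest", "sum"))
)

print(report)
```

The point of the pattern is that the detailed rows remain available in the staging layer for audit and drill-down, while reporting consumes only the aggregated view.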

My expectation is that the easy creation of a staging area for predictive analytics, modeling, calculations and aggregations, for both reporting and rolling forecasts at a much more granular level, will be common practice in the very near future, just as it is happening right now in financial services.


Previous blog by Marco Van der Kooij, mid-February 2016

Big Data for Finance, is it necessary? - Part I

I never believed in big data for Finance departments, because that kind of data volume used to come from building data warehouses for web analytics. Financial accounting and management information for reporting & analysis is mostly at the level of monthly, or sometimes weekly, balances. For budgeting & planning, the lowest level of detail is the product or product group, and/or brand, and/or FTE level (in some cases the employee level). In my frame of reference that is not really big data compared to storing and analysing clicks, page views and site visits.

Recently, I started to gain new insights on this, driven by customer requests to report, analyse and plan at a more granular level of detail. These requests mostly come from Retail (SKU level), Manufacturing (bill-of-material level) and Financial Services (contract level for individual customers). Customers are looking for a fully integrated solution, a single point of truth, to support their planning & control cycle and management processes: consolidation, reporting, budgeting, planning & forecasting software combined with business intelligence-style analyses and predictive analytics capabilities, all from a single source.

I think the time has passed to respond that a Corporate Performance Management (CPM) solution is not capable of doing this: technology has evolved to support huge amounts of data with fast response times, thanks to in-memory capabilities, without necessarily depending on an IT-owned data warehouse.

The difficulty for me is that the granularity of data differs between the various processes in financial and management reporting that support analysts and decision makers. SKU-level information is not needed for statutory consolidation or P&L reporting. On the other hand, for operational planning SKUs (or a subset of SKUs, applying the 80-20 rule) are required for a rolling forecast from sales to production in a manufacturing company, or even at the bill-of-material level for a sourcing strategy with impact on inventory levels and cash flow. The companies I'm talking to are desperately looking for a fully integrated software solution to support both needs without creating redundant data and reporting tools. Moreover, they want to improve the business value of variance analysis by taking it to a more detailed level where they can take action, for example utilisation at the plant or staff level to improve operational excellence. While they are not necessarily going there yet, the next step would be that more predictive analytics capabilities are required in the planning process to optimize demand and supply. The one thing I'm hearing loud and clear is that they want one place, one system, one set of data, where this is done. Bolting on a new product that they have to integrate to do this is not what they want.
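As an aside, the 80-20 subset mentioned above can be sketched in a few lines. Assuming a hypothetical table of SKUs with their revenue, the snippet below picks the smallest set of SKUs that covers roughly 80% of revenue, which could then be planned individually in the rolling forecast while the long tail is forecast as one group. This is an illustration of the idea only, not a method prescribed in the blog.

```python
import pandas as pd

# Hypothetical SKU-level revenue (illustrative figures only).
skus = pd.DataFrame({
    "sku":     ["A", "B", "C", "D", "E", "F"],
    "revenue": [500_000, 300_000, 120_000, 50_000, 20_000, 10_000],
})

# Rank SKUs by revenue and compute their cumulative share of the total.
skus = skus.sort_values("revenue", ascending=False).reset_index(drop=True)
skus["cum_share"] = skus["revenue"].cumsum() / skus["revenue"].sum()

# Keep the SKUs that together reach ~80% of revenue (including the one
# that crosses the threshold); plan these at SKU level and treat the
# remainder as an aggregated "tail" group in the rolling forecast.
threshold = 0.80
cutoff = (skus["cum_share"] >= threshold).idxmax()
forecast_skus = skus.loc[:cutoff, "sku"].tolist()
tail_skus = skus.loc[cutoff + 1:, "sku"].tolist()

print("Plan at SKU level:", forecast_skus)
print("Plan as tail group:", tail_skus)
```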

As a software vendor focused squarely on the needs of the Office of the CFO, we know that nothing should disturb or add risk to the statutory consolidation and regulatory reporting from the financial reporting department. So how do you reconcile these things?

Actually, I don't have the perfect answer right now. I'd love to hear from those of you in finance: what are your thoughts on the merging of traditional finance with Big Data? How would you like them to come together, and how do you see it benefiting your role and your company? Please comment with your thoughts; in the meantime I'll be working on a follow-up post on the topic over the next few weeks.

Category(ies) Industry > Financials, Type > Financial Planning and Analysis (FP&A)
Source Tagetik
Internet URL Tagetik



Related posts

Reporting 3.0 - The future of corporate reporting [21-12-2015]


 
