Capturing both structured and unstructured data, and sharing it across the business, is crucial if organisations are to analyse it effectively.

However, as organisations plan their evolution from descriptive to predictive analytics, one of the key issues they face is a lack of infrastructure.

A recent SAP report, Challenges for Analytics as Data-Driven Enterprises Chase Scale, explained that this is due to the majority of infrastructure being built before the explosion in unstructured data.

Similarly, machine learning and artificial intelligence were not even out of the testing stage when this infrastructure was installed.

The effect on the move to new ways of working in analytics is significant and multifaceted.

At a base infrastructure level, many small- to medium-sized organisations, and all corporates, depend on critical platforms to support their core business model.

And a number of these are more than ten years old.

In addition, many of these platforms, such as CRM systems or the general ledger, are transactional, designed to minimise the data a customer must provide to achieve an outcome.

But, in the predictive world, and when machine learning is involved, the general rule is the more data the better.

Unfortunately, these legacy platforms are simply not designed to capture, store and distribute mass volumes of data. This is where problems start.

Take the Data Lake, for example. Its function is to centralise data from transactional platforms into a single repository. However, the sources feeding the Data Lake were never designed to capture data at the scale needed today. As a result, analytics teams are working with incomplete data sets.

Additionally, large product and consulting companies that specialised in building large transactional platforms are now trying to reinvent themselves with new solutions focused on modern-day data requirements.

The report noted that supporting new advanced analytics approaches such as predictive requires significant investment and transformation, including, potentially, a major rethink of the business model.

However, these businesses are often unwilling to commit at the scale required without clear evidence from customers that these new approaches have traction in the market.

It’s the curse of incumbency.

Enterprises that partner with such organisations find themselves being pushed to support use cases that justify the consultants' transformations.

Courtney McCabe, a participant in the round table that provided the foundation for the research, and a senior analytics executive with a strong finance sector pedigree, said, "A very common conversation I have is, 'Try this new product, if you like it I will sign you up for a five-year enterprise deal, we will give you a great discount.' It just doesn't work like that anymore."

Another issue many analytics teams are grappling with is being forced to work with specific vendors and platforms that lack experience in solving complex analytics issues.

The rationale is often that these vendors and partners have an existing relationship, “so you have to use them”.

Participants said vendor selection is critical and can do damage when executed poorly.

About the author

Athina Mallis is the editor of the Which-50 Digital Intelligence Unit of which SAP is a corporate member. Members provide insights and expertise for the benefit of the Which-50 community. Membership fees apply. 