Three key factors increasingly motivate organisations to build and then leverage a modern data architecture.

First, business strategy requires democratised access to structured and unstructured data, delivered to decision-makers in real time.

Next, there is a need to operationalise data in a way that maintains quality and consistency while keeping it secure.

And finally, organisations want to streamline business process automation.

These factors are addressed in a new research paper from AWS called Modern Data Architecture, which looks at the issue specifically from the perspective of organisations using SAP.

For SAP customers, there are huge benefits to be had from combining SAP and non-SAP data, and the research paper describes how this is much easier to achieve when all of the data resides inside a data lake on AWS.
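As a minimal sketch of what that combination can look like in practice, the Python snippet below uses Amazon Athena (via boto3) to join an SAP sales extract with non-SAP clickstream data held in the same S3-backed lake. The database, table, and bucket names are hypothetical placeholders, not details from the research paper.

    import boto3

    # Hypothetical example: join an SAP extract with non-SAP clickstream data,
    # both held as tables over the same S3 data lake.
    athena = boto3.client("athena")

    query = """
    SELECT o.order_id, o.net_value, c.campaign_id
    FROM sap_sales_orders o      -- extracted from SAP
    JOIN web_clickstream c       -- non-SAP source landed in the lake
      ON o.customer_id = c.customer_id
    WHERE o.order_date >= date '2021-01-01'
    """

    response = athena.start_query_execution(
        QueryString=query,
        QueryExecutionContext={"Database": "enterprise_lake"},  # placeholder database
        ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
    )
    print(response["QueryExecutionId"])  # poll get_query_execution for the result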

The kinds of large, distributed enterprises typically found amongst the SAP community are able to extract more value from their SAP core by extending data out to the edge. The data can be mobilised to field staff or remote devices, or used to feed AI and machine learning, according to the authors of the report.

Foundational data lakes

Whereas traditional data warehouses were designed as staging environments for gathering structured data that had been transformed by a normalisation process, a foundational data lake is designed for any kind of data or content. That data might come from a traditional SAP ERP solution, from other SAP apps, or from a growing range of new sources, including unstructured data from social media and semi-structured data from IoT devices or streaming video and audio.

Data lakes allow organisations to aggregate all these types of data in a cost-effective manner.

They are straightforward to set up in a cloud environment such as AWS, using AWS Lake Formation templates in an automated process built on Amazon S3.

The technical barrier to entry for customers is low: an S3 bucket is effectively the start of a data lake. Authorisation and security parameters can then be added to the Lake Formation templates with a high degree of granularity.
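A minimal sketch of that starting point, assuming Python and boto3: the bucket, account, role, database, and table names below are hypothetical, and in practice these steps would usually be driven from Lake Formation templates rather than ad hoc calls.

    import boto3

    s3 = boto3.client("s3")
    lf = boto3.client("lakeformation")

    # An S3 bucket is effectively the start of the data lake (name is a placeholder).
    s3.create_bucket(Bucket="example-enterprise-data-lake")

    # Register the bucket as a Lake Formation data lake location.
    lf.register_resource(
        ResourceArn="arn:aws:s3:::example-enterprise-data-lake",
        UseServiceLinkedRole=True,
    )

    # Grant granular, column-level access to a hypothetical analyst role.
    lf.grant_permissions(
        Principal={"DataLakePrincipalIdentifier": "arn:aws:iam::123456789012:role/Analysts"},
        Resource={
            "TableWithColumns": {
                "DatabaseName": "enterprise_lake",
                "Name": "sap_sales_orders",
                "ColumnNames": ["order_id", "order_date", "net_value"],
            }
        },
        Permissions=["SELECT"],
    )

The final call is where the granularity comes from: analysts can be granted access to individual columns of a table rather than to the whole dataset.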

Once implemented, the data lake becomes a key piece of infrastructure that allows organisations to treat data as a pervasive resource: democratising access, automating processes, building digital factories, and deploying edge computing solutions.

Importantly, these approaches also equip organisations with the bedrock they need to build a sophisticated and contemporary approach to analytics.

While basic business intelligence services are table stakes, they are not nearly enough in a modern and highly competitive marketplace. Organisations need to move with alacrity along the analytics and insights continuum: from diagnostic insights, through predictive insights and, ultimately, to prescriptive insights.

But data silos, legacy systems, security policies, inertia, and even corporate politics all contribute to an environment in which organisations struggle to get hold of data when they need it most.

For businesses today, getting the data infrastructure in order is not optional. In fact, many of the problems they face will only worsen as sources of data, such as IoT and SCADA devices and sensors, begin to blend, and as unstructured documents and even streaming services created for better customer experiences propagate across the network.

Data needs to be aggregated and accessible so that it can be made available to SAP business processes and ultimately put in front of data analysts, whose work drives business decisions. And of course, companies need to do this cost-effectively.

Take the example of online fashion platform Zalando, which cut the cost of obtaining business insights — including near-real-time business performance — by 30 per cent, after it migrated its SAP systems to AWS.

Zalando employed Amazon Redshift, along with tools such as AWS Auto Scaling, which automatically adjusts capacity to maintain steady performance, and AWS CloudFormation, which lets companies model and provision application resources using programming languages or simple text files. As a result, Zalando achieved better performance at a lower cost.
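As an illustration of that text-file approach (not Zalando's actual configuration), the sketch below embeds a small CloudFormation template that provisions a single-node Amazon Redshift cluster, then creates the stack with boto3. The stack name, database name, and the Secrets Manager reference for the password are all hypothetical.

    import boto3

    # Illustrative CloudFormation template: a minimal Amazon Redshift cluster.
    TEMPLATE = """
    AWSTemplateFormatVersion: '2010-09-09'
    Resources:
      AnalyticsCluster:
        Type: AWS::Redshift::Cluster
        Properties:
          ClusterType: single-node
          NodeType: dc2.large
          DBName: analytics
          MasterUsername: admin
          # Hypothetical secret; avoids hard-coding credentials in the template.
          MasterUserPassword: '{{resolve:secretsmanager:redshift-admin:SecretString:password}}'
    """

    cfn = boto3.client("cloudformation")
    cfn.create_stack(StackName="example-analytics-stack", TemplateBody=TEMPLATE)

Because the environment is just a template plus an API call, stacks like this can be created and deleted on demand, which is what makes the rapid sandbox provisioning described below possible.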

“We can provision sandbox environments and test applications for quality assurance in hours,” said Yuriy Volosenko, Director for Enterprise Applications and Architectures at Zalando. “With an on-premises environment, it could take weeks. In projects like SAP S/4HANA, we needed to provide tens of sandbox environments with more than two terabytes of random access memory each. It wasn’t an issue because of the scalability of AWS.”

The Zalando example demonstrates the benefits of building a contemporary data architecture with SAP on AWS.

To learn more, read Modern Data Architecture.

This article is published by Which-50’s Digital Intelligence Unit (DIU) on behalf of AWS. DIU Members such as AWS pay to share their expertise and insights with Which-50’s audience of senior executives. 
