How to deploy data exchanges and integration for data monetization
Now that companies are getting a better handle on extracting value from mountains of customer data for their own business operations and marketing purposes, they are taking the next steps in learning how to capitalize on that very same data as a resalable asset.
According to Accenture, “…payment providers have a treasure trove of customer data at their fingertips,” and to monetize that data will require that they take “advantage of distinctive data sets and apply advanced analytics techniques” from multiple sources.
These new techniques are enabling companies in all industries to monetize years of valuable data assets (e.g., customer, market, historical, genealogical) via fee-based access.
These and other digital business models are predominantly based on data exchanges, which are now a prerequisite for doing digital business at scale.
Because moving data around the world can be expensive and impractical, a company’s ability to “exchange access” to data with customers and business partners becomes critical to realizing the full value of that data.
This new perspective on making data more useful and profitable can lead to a number of market forces that will change how companies handle and protect their data assets, including:
- Data will soon replace traditional products as the primary revenue-generating asset in most firms in a growing global digital economy. By 2020, Accenture estimates the global digital economy will rise to $24.615 trillion, representing 25% of global gross domestic product (GDP).
- Universal methods of accessing, passing and transferring data between and among disparate systems, companies and networks will be a critical capability for digital businesses. As the need for data exchange increases, it will have to be balanced with risk mitigation.
- The dependency on multiple data sources will drive legal and service expectations along with the associated monetary impact, in the event of unavailability or data loss. A recent IBM Security and Ponemon Institute global study shows the average total cost of a data breach, which includes what an organization spends on the discovery of and the immediate response to a single breach, is $3.62 million.
- As data leaves an enterprise’s domain of control, companies will need to gain greater visibility into and management of where it will go, how it will be protected and what it will be used for.
Combining data from disparate sources into meaningful information and delivering it so it can be monetized and exchanged with trusted business partners can set into motion a variety of new challenges for most companies.
Accessing multiple types of data from numerous sources and locations, in either an event-based or time-scheduled manner, places a significant burden on existing enterprise data infrastructures and services, particularly those that are centralized in one location.
For businesses to effectively and efficiently reap optimal value from their data assets, they will need to:
- Deal with hundreds of permutations of data transfers, some requiring governance items that may not already be applied.
- Create a “trust-nothing” environment, in which every action that changes the data receives some level of governance review. This can be impractical for individual service implementers, since anyone can call their service dynamically through an API, with no humans involved.
- Address random data transformations (in some form or level) throughout the environment. Services such as extract, transform and load (ETL) consume resources and lack macro-coordination.
- Reduce the difficulty “data guardians” (who are still responsible for the data) experience when governing dynamic data transformation.
By leveraging an Interconnection Oriented Architecture (IOA) strategy, you can deploy a data exchange and integration platform at your company’s digital edge, where digital commerce, population centers and business ecosystems meet.
Data exchanges are groups of companies (“ecosystems”) that are securely interconnected in an edge node (“interconnection hub”) for the purpose of accessing/sharing data that can be monetized.
New data sources are valuable to these data-oriented partners, and these relationships create “data gravity” as data attracts more data, as well as the digital partners and services that stand to gain from accessing that data.
Also, as in analytical processing, more data sources (e.g., IoT data, scientific data, medical trial data) translate directly into more experience and greater value from the insights and information they produce.
Even if translation between data sources is not required and data is passed straight through, the other governance functions and controls made available within the business ecosystem can provide significant value and needed oversight in a dynamic, automated data environment.
A data integration platform essentially takes data sources from a number of supported source interfaces (i.e., file, database, object store, etc.), transforms it into a universal format and then uses data services within a digital edge node to provide various consumer, API-enabled interfaces to access the data.
This approach has already created widespread value in organizations that frequently need to integrate data between disparate applications in one or more clouds.
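To make the adapter-and-universal-format idea above concrete, here is a minimal sketch in Python. The adapter classes, field names and the plain-dict “universal format” are all hypothetical illustrations, not part of any specific integration product:

```python
import csv
import io
import json

class CsvAdapter:
    """Source adapter: reads CSV text and yields records in a universal dict format."""
    def __init__(self, text):
        self.text = text

    def records(self):
        for row in csv.DictReader(io.StringIO(self.text)):
            yield dict(row)

class JsonAdapter:
    """Source adapter: reads a JSON array and yields records in the same universal format."""
    def __init__(self, text):
        self.text = text

    def records(self):
        for obj in json.loads(self.text):
            yield dict(obj)

def serve(adapters):
    """Consumer-facing interface: merges all sources into one uniform record stream."""
    for adapter in adapters:
        yield from adapter.records()

csv_src = CsvAdapter("id,amount\n1,9.99\n2,4.50")
json_src = JsonAdapter('[{"id": "3", "amount": "1.25"}]')
records = list(serve([csv_src, json_src]))
```

The key design point is that consumers never see the source interfaces; they see only the universal format, so new source types can be added by writing one more adapter.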
The design pattern below illustrates how you can deploy data exchanges and integration at your digital edge.
[Figure: Data Exchanges and Integration at the Digital Edge]
Take the following steps to develop a data exchange and integration infrastructure at the edge:
- Configure a local data repository for data staging and local private storage for caching/performance.
- Deploy data access, transformation and delivery services (adapters/connectors).
- Ensure the following steps are “wired” to go through boundary control and inspection zone(s).
- Apply event processing and policy enforcement.
- Configure data profiling, data quality and operational processing.
- Provide internal and external (productized) APIs for data integration (as-a-service).
- Apply metadata and master data management, and re-encryption for destination key management.
- Integrate with a data pipeline service and update provenance information.
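The steps above can be sketched as a linear pipeline that every record passes through before it leaves the exchange. All stage names, policies and fields below are hypothetical placeholders, not a prescribed implementation:

```python
import hashlib
import json

def boundary_inspection(record):
    """Boundary control/inspection zone: reject records missing required fields."""
    if "id" not in record:
        raise ValueError("rejected at boundary: missing id")
    return record

def enforce_policy(record):
    """Policy enforcement: mask a hypothetical PII field before release."""
    if "email" in record:
        record["email"] = "***"
    return record

def transform(record):
    """Transformation service: normalize values into the universal format."""
    record["amount"] = float(record["amount"])
    return record

def update_provenance(record, trail):
    """Provenance: append a hash of each released record to an audit trail."""
    digest = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()
    trail.append(digest)
    return record

def exchange(record, trail):
    """Run one record through the full set of exchange steps, in order."""
    for stage in (boundary_inspection, enforce_policy, transform):
        record = stage(record)
    return update_provenance(record, trail)

trail = []
out = exchange({"id": "42", "amount": "9.99", "email": "a@b.com"}, trail)
```

Wiring every stage into one ordered pipeline is what guarantees that no record reaches a consumer without passing boundary control, policy enforcement and provenance tracking.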
By implementing this data exchange and integration infrastructure design pattern for monetizing data assets, you will realize the following benefits:
- A uniform way to exchange data between applications, cloud services and business ecosystem partners (with or without compression).
- High-performance, low-latency connectivity for delivering dynamically integrated, real-time data and managed batch data transfers/migrations.
- Full event processing of all data exchange points that go through a data exchange (internally and externally), forming a view of data events and an audit trail, with dependency analysis.
- Automated methods to apply data encryption, masking of personally identifiable information (PII), alerting and leakage prevention, as well as policy enforcement at exchange points.
- Real-time data corruption and tampering detection that leverages machine learning.
- Data exchanges that allow companies to produce entirely new business models built on integrated data.
One of many use cases for data exchanges and integration infrastructures is the fast-evolving digital payments ecosystem.
It consists of credit card networks, issuing and acquiring banks, processors, payment gateways, mobile and fixed line point of sales devices, mobile wallets, fraud and identity, and integrated solutions providers.
Key characteristics of the digital payments ecosystem are:
- A large, growing number of counter-parties exchanging data as a consumer or business payment is processed, vetted and settled.
- A standardized data format has yet to emerge in this industry; in its absence, the data exchange platform can play the crucial role of transforming the source data into a self-describing, well-defined format.
- New rules and stricter regulations are being imposed, such as the General Data Protection Regulation (GDPR) and the Fourth Money Laundering Directive (4MLD). Payments ecosystem participants can address these new policies by implementing a data exchange that provides the mandated data security mechanisms.
- In EMEA, the Second Payment Services Directive (PSD2) de facto ends the banks’ monopoly on payment services and customers’ account information. PSD2 creates open APIs, allowing non-banks to develop new business models to monetize data assets previously only accessible to the banks. By deploying data exchange platforms, the non-banks can now readily participate in the digital payments ecosystem via these open APIs.
- Digital payments regulations require monitoring of data events and the creation of an audit trail, and market participants will send the data from the exchange to adjacent cloud-based analytics platforms to increase the monetization potential derived from the data assets.
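Given that no standardized payment data format has yet emerged, an exchange can wrap source records in a self-describing envelope that carries its own field schema alongside the values. The envelope layout and field names below are hypothetical, shown only to illustrate the idea:

```python
import json

def to_self_describing(record, source):
    """Wrap a raw record with a schema describing each field's type."""
    schema = {field: type(value).__name__ for field, value in record.items()}
    return {"source": source, "schema": schema, "data": record}

raw = {"pan_last4": "4242", "amount": 9.99, "currency": "EUR"}
envelope = to_self_describing(raw, source="acquirer-x")
wire = json.dumps(envelope)  # any ecosystem participant can interpret this
```

Because each envelope describes itself, counter-parties can consume data from sources they have never integrated with before, without prior agreement on a shared schema.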
This and other industry-specific use cases can leverage an IOA data exchange and integration infrastructure design pattern to gain direct and secure access to critical data assets for monetization, or any other business expansion purposes.
In the next blog article on managing data at the edge, we’ll discuss how to leverage data orchestration and data provenance to facilitate and track data flows and consumption from disparate sources across the data fabric.
Article by Herbert Preuss, Equinix Blog Network