
Data Fabric

Data fabric is a term coined by the research firm Gartner to describe a distributed IT architecture in which data is governed the same way whether it is located on premises, in the cloud, or at the edge of a network.

Essentially, data fabrics are woven with data integration and management policies that address specific types of data. The goal of creating a unified data fabric is to ensure that an organization's data will always be readily available to authorized entities no matter where it resides.
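To make the idea concrete, here is a minimal, hypothetical sketch of that principle in code. The class and method names are illustrative assumptions, not any vendor's API: a catalog maps logical dataset names to their physical locations, and a policy check runs before every read, so an authorized consumer uses the same call whether the data sits on-premises, in the cloud, or at the edge.

```python
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str       # logical name consumers use
    location: str   # e.g. "on-prem", "cloud", "edge"
    uri: str        # where the data physically lives

class DataFabric:
    """Hypothetical unified access layer: one interface, many locations."""

    def __init__(self):
        self._catalog = {}   # logical name -> Dataset
        self._grants = set() # (principal, dataset name) pairs

    def register(self, ds: Dataset):
        self._catalog[ds.name] = ds

    def grant(self, principal: str, name: str):
        self._grants.add((principal, name))

    def read(self, principal: str, name: str) -> str:
        ds = self._catalog[name]
        if (principal, name) not in self._grants:
            raise PermissionError(f"{principal} may not read {name}")
        # A real fabric would dispatch to a storage connector here; the
        # sketch just returns the resolved URI to show that the caller
        # never needs to know where the data resides.
        return ds.uri

fabric = DataFabric()
fabric.register(Dataset("sales", "cloud", "s3://bucket/sales.parquet"))
fabric.grant("analyst", "sales")
print(fabric.read("analyst", "sales"))  # → s3://bucket/sales.parquet
```

The point of the sketch is the shape of the design: governance (the grant check) and location resolution (the catalog) live in one layer, not in each consuming application.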

Gartner predicts that the market for software products and services that facilitate the creation and management of data fabrics will reach $3.7 billion annually by 2026. To accommodate the need for interoperability, cloud services that help enterprise customers build data fabrics are typically platform-agnostic and process-agnostic.

Popular vendors in the data fabric market space include NetApp and SAP.

One of the key benefits of a data fabric is that it eliminates data silos, which have been a problem in IT for as long as big data has existed.

Even decades ago, people were talking about the need to move valuable data to where it could be put to the best use. Developers and others lamented situations where important data got stuck in an isolated environment and couldn't be accessed for new and important uses.

A data fabric helps eliminate many of those problems and obstacles by creating a system in which data flows more freely. Think of it as traffic infrastructure with a better design: data that carries its own routing information can move more easily to the destination that has been planned for it.

The metaphor falls apart a little when pushed too far, but in important ways it holds true. Here's how it works: when there is a coherent system for delivery, each individual process becomes more capable, and delivery flows better.

This was true of Eisenhower's creation of the interstate highway system. All of a sudden, there were freeways connecting cities and faraway destinations to one another, and traffic improved.

The data fabric does many of the same things, or similar things, for data in transit. In any complex big data system there is a sophisticated environment to navigate; if each process is left to navigate it on its own, bottlenecks and other problems result.

So the data fabric works as a framework and a model for data transit. It is a smarter network design than simply cobbling hardware together and letting data find its own path, and it makes all of those different routes easier to manage.
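One way to picture that framework role is a small, hypothetical routing sketch (the connector names and URI schemes below are assumptions for illustration): each data source registers a connector once, and every request is dispatched through the same logic, instead of each process wiring up its own path ad hoc.

```python
# Hypothetical fabric-style dispatch: connectors register themselves
# by URI scheme, and fetch() routes every request centrally.
connectors = {}

def connector(scheme):
    """Decorator that registers a read function for a URI scheme."""
    def register(fn):
        connectors[scheme] = fn
        return fn
    return register

@connector("s3")
def read_s3(path):
    return f"read {path} via cloud connector"

@connector("file")
def read_file(path):
    return f"read {path} via on-prem connector"

def fetch(uri):
    # Split "scheme://path" and dispatch to the registered connector.
    scheme, _, path = uri.partition("://")
    if scheme not in connectors:
        raise ValueError(f"no connector for scheme {scheme!r}")
    return connectors[scheme](path)

print(fetch("s3://bucket/events"))   # routed to the cloud connector
print(fetch("file:///data/events"))  # routed to the on-prem connector
```

The design choice this illustrates is centralization of the transit logic: adding a new source means registering one connector, not teaching every downstream process a new path.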

Another way to think about data fabric is to look at some popular use cases. One example is anti-fraud systems: a versatile, unified data environment helps the technology and its human operators do more to mine the valuable information that drives insights.

Other use cases come from the business world, where a data fabric helps support specific kinds of analysis.

In short, data fabric has to do with delivery. It is seen as a progressive descendant of earlier, more primitive systems in which there was hardware and there was a network, but no overall management layer for getting data where it needed to go. In general, those who design and set up a data fabric will be involved in the key work of data governance for an organization, looking at how its data can be used most efficiently and effectively.
