A product of the New Order DAO

The Data Economy: Three Protocols

There’s a massive amount of data in the world, and it’s growing at an incredible rate. IDC, the global market intelligence firm, estimates that the world’s data will grow from 33 zettabytes in 2018 to 175 zettabytes by 2025. That is 175 followed by 21 zeros, in bytes!

Despite the sheer volume of data, it is not used very effectively across companies, industries, and economies. The chief problem is data silos. A data silo arises when data collected by one unit or department is not available to the rest of the organization. The same is true at higher levels: the data gathered by one company is typically not accessible to other companies.

The data economy will change all of this.

The data economy is the global digital ecosystem in which both producers and consumers collect, organize, and share data to derive insights. Importantly, they also monetize it. Data in this economy tends to be eclectic, meaning it comes from a wide variety of sources: search engines, social media platforms, online data vendors, devices connected to the Internet of Things, you name it.

Participating in the data economy offers many benefits. By exchanging their data with other actors, companies may develop a new business line. For example, medical device producers have a wealth of information on their users’ health, such as heart rate or insulin level. In addition to the revenue they generate from selling medical devices, they can also collaborate with health care organizations by providing them with the patients’ tracking data in an ethical and secure way. All participants will benefit from this data exchange, and a medical device manufacturer will have created a new revenue stream.

Streamr

According to the forecasts, nearly 30% of global data will be generated in real time, and 95% of that will come from IoT devices. If this proves true, then Streamr is building the future now. Streamr is a decentralized platform where users can exchange and monetize their real-time data streams, including those generated by IoT devices. At its core lies the Streamr Network, which transfers streams of real-time data from producers to consumers.

All the data in the Streamr Network comes in the form of streams. A stream is any sequence of data points; the data can be of any kind and come from any source. Sources include, but are not limited to, sensors in a smart house, commercial data vendors, and database systems. To illustrate what a stream may look like, consider the example below.

Temperature and RPMs of an Active Motor

These figures are taken from a high-performance engine, whose temperature rises with RPMs. This data can obviously be useful to mechanical engineers, who can now employ field measurements when redesigning their engines.
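To make the idea concrete, here is a minimal Python sketch of what such a stream of data points could look like. The field names (`temperature`, `rpm`) are illustrative only, not part of any actual Streamr schema:

```python
import json
import time

def motor_data_point(temperature: float, rpm: int) -> dict:
    """One data point in a stream: a timestamped measurement."""
    return {
        "timestamp": int(time.time() * 1000),  # milliseconds since epoch
        "temperature": temperature,            # degrees Celsius
        "rpm": rpm,
    }

# A stream is simply an ordered sequence of such points, e.g. serialized
# as JSON and pushed to subscribers as they are produced.
stream = [
    motor_data_point(85.0, 3000),
    motor_data_point(97.5, 4500),
    motor_data_point(110.2, 6000),  # temperature rises with RPMs
]

for point in stream:
    print(json.dumps(point))
```

Any sequence of timestamped measurements fits this shape, which is why a stream can represent anything from smart-house sensors to database change logs.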

Under Streamr, streams gathered by users are packaged into Data Unions. With consent, real-time data from different users can be pooled into one Data Union. Data Unions are what is offered for sale on the Streamr Marketplace, and this is how users monetize their real-time data: when a buyer purchases (“subscribes to,” in Streamr jargon) a Data Union, DATA tokens are distributed to all producers of its data streams. Not all Data Unions are created equal; they can differ in membership model, use case, revenue structure, and other features.
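The revenue-sharing idea can be illustrated with a short sketch. This is not Streamr’s actual contract logic; it assumes, purely for illustration, that a fixed operator fee is deducted and the remainder is shared equally among members:

```python
def distribute_purchase(amount_data: float, members: list[str],
                        operator_fee: float = 0.1) -> dict[str, float]:
    """Hypothetical split of a Data Union purchase.

    A fixed operator fee is deducted, and the rest is shared equally
    among all current members of the Data Union.
    """
    fee = amount_data * operator_fee
    share = (amount_data - fee) / len(members)
    payout = {member: share for member in members}
    payout["operator"] = fee
    return payout

# A buyer pays 1000 DATA for a union with three members.
payouts = distribute_purchase(1000.0, ["alice", "bob", "carol"])

# The member payouts plus the fee sum back to the purchase amount.
assert abs(sum(payouts.values()) - 1000.0) < 1e-9
```

Real Data Unions can weight payouts by contribution or membership duration; the equal split above is just the simplest possible model.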

I believe that the Streamr platform in general, and Data Unions in particular, will play a huge role in the data economy. It is a permissionless, decentralized, peer-to-peer product that also allows retail data producers to monetize their real-time data.

Ocean Protocol

One of the leading players in the field of DataFi is Ocean Protocol. DataFi can be defined as the sector of decentralized finance in which data and data services are treated as a new, emerging asset class. Ocean Protocol is a decentralized data-sharing protocol that allows data producers to sell their products directly to consumers.

The protocol allows data providers to securely monetize their data without giving full ownership to buyers. Data consumers, such as policy makers or AI and machine-learning engineers, benefit from the protocol by gaining access to private data sets that would otherwise be hard or impossible to obtain.

One of the most important concepts in Ocean Protocol is the datatoken, which grants access to a particular data set or data service. Every data set or data service on the protocol has its own datatoken. To gain access to a data set, you send 1.0 of its datatokens to the data producer. You can even transfer your access to someone else by sending them your 1.0 datatokens. Note that you don’t buy the data itself; you only buy access to that data.
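The access mechanics can be sketched with a toy model in Python. This is an illustration of the idea, not Ocean’s contract code; the token symbol and account names are made up:

```python
class Datatoken:
    """Toy model of a datatoken gating access to one data set."""

    def __init__(self, symbol: str):
        self.symbol = symbol
        self.balances: dict[str, float] = {}

    def mint(self, account: str, amount: float) -> None:
        self.balances[account] = self.balances.get(account, 0.0) + amount

    def transfer(self, sender: str, recipient: str, amount: float) -> None:
        if self.balances.get(sender, 0.0) < amount:
            raise ValueError("insufficient datatokens")
        self.balances[sender] -= amount
        self.mint(recipient, amount)

    def request_access(self, consumer: str, publisher: str) -> bool:
        """Sending 1.0 datatokens to the publisher buys access,
        not the data itself."""
        self.transfer(consumer, publisher, 1.0)
        return True

dt = Datatoken("DT-WEATHER")  # hypothetical data set token
dt.mint("alice", 2.0)
assert dt.request_access("alice", "publisher")
assert dt.balances["alice"] == 1.0  # one more access remains spendable
```

Because access is just a token balance, transferring 1.0 datatokens to another account transfers the access right along with it.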

Compute-to-Data (CtD) is a clever technological solution that allows businesses or individuals to share their data while maintaining privacy. Let’s say you have a data set that you would like to “rent” out, but you don’t because of privacy concerns, and a data scientist wants to use your data. CtD resolves the problem: any Data Consumer can run their models on your data while the data never leaves its premises (which can be your hardware, a Google Sheets file, or anything else). Think of CtD as a protective layer between a Data Owner and a Data Consumer.

The way it works is that when an algorithm is run on the data, only the results, not the data set itself, are sent to the Data Consumer. This allows the data owner to monetize the data while still preserving privacy; you can sell your data directly to a Consumer or on the marketplace. Data Consumers, in turn, get more data to train their models on. Another advantage for data consumers is that they don’t need their own computation infrastructure, since all computations run on the data owners’ hardware.
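Conceptually, the CtD flow looks like the sketch below. This is a simplified illustration, not the Ocean Provider API: the consumer’s algorithm travels to the data, and only the result travels back:

```python
from statistics import mean

class DataOwner:
    """Holds a private data set; only computation results ever leave."""

    def __init__(self, private_data: list[float]):
        self._private_data = private_data  # never exposed directly

    def compute(self, algorithm) -> float:
        # The algorithm executes on the owner's premises; the raw
        # records never cross the boundary to the consumer.
        return algorithm(self._private_data)

# Consumer side: submit an algorithm, receive only the result.
owner = DataOwner([98.6, 99.1, 101.2, 97.8])  # e.g. patient temperatures
result = owner.compute(mean)
print(result)  # the aggregate, not the underlying records
```

The consumer learns the average temperature but never sees any individual patient record, which is exactly the privacy guarantee CtD is after.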

Not only data but also algorithms are regarded as assets on Ocean Protocol, so a researcher can monetize their algorithm. As with other data assets, providers can sell either the algorithms themselves or merely access to them. If the algorithm itself is sold, it is public; if only the computation service is sold, not the algorithm itself, the algorithm remains private.

Chainlink

Finally, we have an old classic - Chainlink.

Many DeFi applications need external data. For example, an on-chain betting market needs real-time odds from multiple bookmakers, and a decentralized trading app offering a security linked to the ETH futures price must be able to fetch ETH futures prices from outside exchanges, such as the Chicago Mercantile Exchange (CME). In most cases, then, smart contracts need to be connected to information from the outside world.

This is what a blockchain oracle does: it is a third-party service that feeds real-world data into the smart contracts powering DeFi. Decentralized oracles go one step further by combining many oracles into one system. They query multiple data sources and return the data to the blockchain, with the aim of removing any single point of failure.

Chainlink is a leading decentralized oracle network. Its architecture consists of three parts: the Basic Request Model, the Decentralized Data Model, and Off-Chain Reporting. The Basic Request Model is what its name suggests: if a smart contract needs to know the price SOL is trading at on Binance, the Basic Request Model handles it. This part of the Chainlink architecture is responsible for querying data from a single data source.

Decentralized Data Model (DDM) introduces the idea of on-chain aggregation. Data is aggregated from multiple independent oracle nodes, which increases the reliability and trustworthiness of the answer. Chainlink’s Data Feeds function is based on the Decentralized Data Model. Data Feeds are sources of off-chain data, such as weather events, business financials, outcomes of sports events or asset prices. Data is aggregated on-chain so that consumers can always retrieve the answer.

Finally, Off-Chain Reporting (OCR) is what makes Chainlink truly special in the context of decentralization. The execution happens mainly off-chain: oracle operators (nodes) communicate with each other over a peer-to-peer network, and each node regularly reports its data and signs its report. All reports are aggregated into one transaction carrying the final answer for that round, which is then transmitted on-chain. The main advantage for oracle nodes of aggregating reports into a single transaction is that they pay much less in gas, and submitting one transaction instead of many also decreases congestion on the underlying blockchain.
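The aggregation idea can be sketched as taking the median of many independent node reports, so that a single faulty or malicious node cannot skew the answer. This is a simplified model, not Chainlink’s actual OCR code, and the node names and prices are invented:

```python
from statistics import median

def aggregate_reports(reports: dict[str, float]) -> float:
    """Combine per-node observations into one answer.

    Simplified model of oracle aggregation: in the real protocol each
    report is signed by its node, and the aggregated answer is posted
    on-chain in a single transaction per round.
    """
    if not reports:
        raise ValueError("no oracle reports")
    return median(reports.values())

# Three honest nodes and one outlier reporting a SOL/USD price.
reports = {
    "node-a": 101.2,
    "node-b": 101.3,
    "node-c": 101.1,
    "node-d": 250.0,  # faulty or malicious node
}
answer = aggregate_reports(reports)
print(answer)  # the median resists the single bad report
```

Using the median rather than the mean is the key design choice here: the mean of these reports would be dragged far above the true price by the one bad node, while the median stays near the honest majority.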

Conclusion

It isn’t an overstatement to say that data is now an asset class in its own right. Data and data services are becoming increasingly more important to the overall economy. The data economy, the global digital ecosystem in which data is gathered, analyzed, and shared with other participants to get value from information, is going to impact all sectors and industries, from health care to e-commerce.

However, there are several problems with how data flows in the global economy. One of the worst is that data is generated and kept in silos: the inability of data to reach other participants reduces its value. The second problem is the lack of ownership. Though our data is being recorded comprehensively and at an accelerating rate by big players, such as social media platforms or medical device manufacturers, we are usually not the owners of that data, and it is used or sold without our consent.

DataFi is going to change the way data is collected and shared. Even now, while DataFi is nascent, several projects have emerged with the aim of decentralizing data services. These services, among which we can mention Ocean Protocol and Streamr, allow data producers to monetize products and services on their marketplaces. Besides creating revenue streams for users, these platforms will return the ownership of data to data producers. The benefit of DataFi for data consumers is that data that would otherwise be hard or impossible to obtain becomes accessible to them in a secure and ethical manner.

Published on Mar 24 2023

Written By:

Risk_Taker88

@risk_taker88

Copyright © 2024 NEW ORDER. All Rights Reserved
