The Data Daily

Data Tokenization: Morphing The Most Valuable Good Of Our Time Into A Democratized Asset

One can argue that everything – yes, everything – consists of data: sensor data in industry, social media profiles, health care operations, any type of value exchange, the texture of a surface, to name just a few. It quickly becomes evident why data is the most valuable and crucial good of our time. Often we do not even realize that we (or our IoT devices) constantly produce data, and even less do we realize that others frequently make money off this data or leverage it to influence elections (e.g. Cambridge Analytica). This industry therefore needs to change. This article aims to explain what that disruptive journey will look like and to raise awareness of the need for democratization in the data industry.

The data industry has emerged as one of the most diverse and influential industries on our planet. Communication, marketing, politics, finance, technology, health care, law and more are, to a large extent, constructs of data that is produced, analyzed, leveraged and recycled. An ever-growing sector, data is poised to become one of the most valuable asset classes, seamlessly converging finance and technology once tokenized. According to a Wall Street Journal report of April 7, 2021, global IT spending is expected to rise by 8.4%, further underscoring a digital trend of continuously increasing data usage. Nevertheless, in our contemporary economy data often lacks privacy, tangibility and accessibility because centralized conglomerates are in control, so it has not yet reached its full potential. A 2020 report on “The Big Data Industry to 2025” by Research and Markets demonstrates the level of diversification data provides as an asset.

Blockchain is at its core a data-based technology, so building a data token economy on top of this infrastructure seems like low-hanging fruit – at least when connecting the dots. This is not the only functionality tokenized data can offer, however: enabling real-time data utility for various application segments is especially attractive for linking data with IoT infrastructure, for instance. From the regulatory perspective of data residency laws, tokenizing data and leveraging the transparency features of blockchain also help provide data sovereignty and compliance. As an asset class on blockchain, data can be integrated into DeFi applications, and by staking an asset such as OCEAN or securitized data, liquidity can be provided to data pools. This leads to a significant differentiation in how data is tokenized: as utility tokens (e.g. granting access to the data) or as security tokens (e.g. granting participation in profits).
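
To make the data-pool idea concrete, here is a minimal sketch of staked liquidity pricing a datatoken against OCEAN. It uses a simple constant-product formula for illustration only; real data pools (for example Ocean's Balancer-based pools) use more general weighted math, and all numbers below are hypothetical.

```python
# Toy constant-product pool: OCEAN staked against a datatoken.
# Hypothetical reserves; not Ocean Protocol's actual pool implementation.

class DataPool:
    def __init__(self, ocean_reserve: float, datatoken_reserve: float):
        self.ocean = ocean_reserve          # OCEAN provided by stakers
        self.datatokens = datatoken_reserve

    def spot_price(self) -> float:
        """Price of one datatoken in OCEAN implied by the reserves."""
        return self.ocean / self.datatokens

    def buy_datatoken(self, ocean_in: float) -> float:
        """Swap OCEAN for datatokens while keeping ocean * datatokens constant."""
        k = self.ocean * self.datatokens
        new_ocean = self.ocean + ocean_in
        datatokens_out = self.datatokens - k / new_ocean
        self.ocean, self.datatokens = new_ocean, k / new_ocean
        return datatokens_out

pool = DataPool(ocean_reserve=10_000, datatoken_reserve=100)
print(pool.spot_price())          # 100 OCEAN per datatoken
print(pool.buy_datatoken(1_000))  # ~9.09 datatokens received for 1,000 OCEAN
```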

“If you can obtain all the relevant data, analyze it quickly, surface actionable insights, and drive them back into operational systems, then you can affect events as they’re still unfolding. The ability to catch people or things ‘in the act’, and affect the outcome, can be extraordinarily important.” – Paul Maritz, Chairman of Pivotal Software

Paul Maritz perfectly describes the potential of data when it is efficiently leveraged. Tokenizing any asset tends to bring a host of benefits, such as making the asset tangible, and the same goes for data. The core benefits can be grouped as follows: 1) security, 2) privacy, 3) democratization, 4) monetization, 5) decentralization and 6) transparency. Moreover, once data is an established asset class available to retail investors (e.g. as security tokens), more regulatory scrutiny of how the data is utilized is expected to arise through the compliance procedures of the respective jurisdiction (e.g. prospectus filing).

When analyzing the period between 2011 and 2020 in the so-called “datasphere” (or ocean of data), the overall volume rose from approximately 1.8 to 59 zettabytes, and an International Data Corporation (IDC) report forecasts 175 zettabytes by 2025. Since data has become ubiquitous, progress is required in areas such as management, privacy and storage. While the big data market is steadily growing (see Figure 1 below), the number of data-industry employees and data-related service providers has skyrocketed, according to a report by the Big Data Value Association.

Despite these promising indicators and data’s considerably increasing market share, it is at times difficult to value data transparently because of the manifold domains in which data is generated, and the valuation process is not easy to standardize. As noted in a PwC report on “Putting value on data”, there are thus far no definitive indicators for how to value data as an asset. The authors define the key drivers of value for data (see Figure 2 below) as the following: exclusivity, liabilities and risk, accuracy, interoperability/accessibility, completeness, usage, restrictions, timeliness, and consistency. Additionally, the three common principles for valuing any asset can be applied: the income, market and cost approaches.
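
To make the income approach tangible, a minimal sketch follows: it values a dataset as the present value of the licensing revenue it is expected to generate. The revenue figures and discount rate are purely hypothetical assumptions for illustration, not figures from the PwC report.

```python
# Income approach, simplified: value a data asset as the discounted sum of
# the licensing revenue it is expected to generate. All figures are assumed.

def present_value(cash_flows, discount_rate):
    """Discount a list of yearly cash flows back to today."""
    return sum(cf / (1 + discount_rate) ** year
               for year, cf in enumerate(cash_flows, start=1))

expected_licensing_revenue = [50_000, 60_000, 70_000]  # EUR per year, assumed
discount_rate = 0.12                                   # assumed risk-adjusted rate

print(round(present_value(expected_licensing_revenue, discount_rate)))
# ≈ 142,299 EUR as an indicative value for the data asset
```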

Generally, there are various methods by which data can be monetized:

Data has steadily grown into a force to be reckoned with in the global economy. Along with its growing influence, the problems associated with it have become more evident. On the one hand, the control and storage of data is ever more centralized: over 50% of data is stored in the public cloud, according to a Gartner report, and the cloud space is clearly dominated by the likes of Amazon, Microsoft, IBM, and Apple. On the other hand, there is an omnipresent lack of privacy and access to valuable data. A normal person surfing the internet rarely has any control over the data they produce or its privacy, or any say in who can access it (unless they use privacy tools or avoid cookies). These pain points are compounded by the fact that it is very hard to monetize your own data.

Additionally, reports by PieSync and Solvexia list the following bottlenecks in the contemporary data economy:

While perhaps not obvious at first glance, data-related assets can be integrated into the DeFi space quite seamlessly (when legally compliant) by leveraging the fungibility of, for example, ERC-20 tokens. This paves the way for an effective data on/off-ramp in the form of a data utility token (e.g. granting access) or a security token (e.g. participation in profits). Since tokenizing data allows a value to be assigned to data as an asset, it can then, for instance, be used as collateral for lending; conversely, lenders receive additional data tokens in the form of interest payments.
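
The toy ledger below sketches that collateralized-lending flow. It is a conceptual illustration only: the collateral ratio, interest rate and token prices are hypothetical, and it does not reflect the API of any particular lending protocol.

```python
# Toy illustration of a fungible data token used as lending collateral.
# Hypothetical parameters; not any real protocol's implementation.

COLLATERAL_RATIO = 1.5   # borrower must post 150% of the loan value
INTEREST_RATE = 0.05     # interest settled in data tokens, per period

def max_borrow(collateral_datatokens: float, datatoken_price: float) -> float:
    """Stablecoin amount that can be borrowed against posted data tokens."""
    return (collateral_datatokens * datatoken_price) / COLLATERAL_RATIO

def interest_in_datatokens(loan: float, datatoken_price: float) -> float:
    """Interest owed to the lender, paid out in data tokens."""
    return (loan * INTEREST_RATE) / datatoken_price

loan = max_borrow(collateral_datatokens=300, datatoken_price=2.0)   # 400.0
print(loan, interest_in_datatokens(loan, datatoken_price=2.0))      # 400.0 10.0
```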

In line with this merging of data and the crypto markets, standard wallets like MetaMask can be used to “store” the data tokens. Merging data with DeFi brings many benefits:

Another auspicious use case, apart from directly integrating tokenized data into DeFi, is using efficient, tailored analysis of essential data to optimize decision-making in DeFi (e.g. yields, insurance, DEX trading). Aspects such as price data feeds, risk models, higher-order instruments (e.g. stablecoins) and margin trading consequently become significantly more accurate, and integrating this data through oracles into the DeFi ecosystem becomes far more useful.
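
As one common example of such an oracle integration, the sketch below reads a Chainlink-style price feed with web3.py. The RPC endpoint and feed address are placeholders, and the checksum helper is named `to_checksum_address` in web3.py v6 (older versions use `toChecksumAddress`); this is a generic illustration, not part of the article's described stack.

```python
# Read the latest price from a Chainlink-style AggregatorV3Interface feed.
from web3 import Web3

RPC_URL = "https://example-rpc.invalid"                   # placeholder endpoint
FEED = "0x0000000000000000000000000000000000000000"       # placeholder feed address

# Minimal ABI covering only the two calls used below.
ABI = [
    {"name": "latestRoundData", "inputs": [], "stateMutability": "view",
     "type": "function", "outputs": [
        {"name": "roundId", "type": "uint80"},
        {"name": "answer", "type": "int256"},
        {"name": "startedAt", "type": "uint256"},
        {"name": "updatedAt", "type": "uint256"},
        {"name": "answeredInRound", "type": "uint80"}]},
    {"name": "decimals", "inputs": [], "stateMutability": "view",
     "type": "function", "outputs": [{"name": "", "type": "uint8"}]},
]

w3 = Web3(Web3.HTTPProvider(RPC_URL))
feed = w3.eth.contract(address=Web3.to_checksum_address(FEED), abi=ABI)

_, answer, _, updated_at, _ = feed.functions.latestRoundData().call()
decimals = feed.functions.decimals().call()
print(f"price: {answer / 10 ** decimals}, last updated (unix time): {updated_at}")
```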

In the current data economy, consumers and businesses do not feel comfortable sharing data, partially due to prominent instances of data abuse (e.g. Cambridge Analytica). Consumers perceive that they have limited say over how their data gets used, as they are typically presented with a binary choice: in return for access to valuable online services, often presented as “free”, they either sign away most of the rights to their data in the Terms & Conditions, or they forego digital services that might be essential.

It is also challenging for consumers to assess the value of their data. While big tech has developed tools and methods to assess and extract value from data, the average consumer cannot accurately price, say, their browsing habits or shopping preferences. Data incumbents routinely collect massive amounts of data that stays within the walls of the company, becoming “siloed”; the value extracted from user data is usually not shared with the data subjects. By design, this means that massive amounts of data are locked up and only utilized by a small number of big players; only 3% of companies say they have access to sufficient quality data, according to Forbes. Even companies that do have access to enough quality data often lack the ability to monetize or utilize it due to concerns about privacy, regulation, and losing their competitive advantage.

History has led to data silos: there is a lot of data, but most of it sits in the hands of data monopolies, and its latent potential is underutilized. This lack of access to quality data hampers innovation: without it, industry and academia cannot develop new solutions to the pressing problems facing business and society. Although we generate more and more data every day (the amount of data being generated has increased 10-fold in the past 10 years), 97% of that data is underutilized, and in the current construct there is insufficient incentive to share it. Unless these data silos are broken down and the various sources of data are integrated to provide a holistic, enterprise-wide view, companies will be limited to functional-level projects rather than digital transformation.

One potential solution to the problem of data silos is to turn data into an asset class that can easily be owned and traded on blockchain. This would open up the Data Economy, enable data sovereignty, break down data silos, provide access to more quality data, and allow individuals to monetize their data. More and higher-quality data would enable businesses to innovate and to create new value and new markets. Goldman Sachs, in a new report detailing a bright future for blockchains, describes how crypto could power the Data Economy: blockchains and cryptocurrencies are predicted to be a key infrastructure for the data economy.

Ocean Protocol is working on this solution by incentivizing data sharing and by building an ecosystem where data, including consumer data, is an asset class that is priced according to market mechanisms. This would enable anyone to publish, share, and monetize data on granular, customizable terms. Ocean Protocol’s solution consists of 4 key elements to enable and incentivize data sharing:

The above elements enable the incentivization of data sharing:

Decentralized Autonomous Organizations (DAOs) are a form of governance that characterize tokenized ecosystems: holders of a particular token make decisions collectively about the direction and future of the system. 

OceanDAO grants funds to projects that create positive ROI for the Ocean ecosystem. OCEAN holders decide by vote which project proposals receive funding, i.e. which are most likely to lead to growth. At the time of writing, OceanDAO has distributed funding worth 435,500 OCEAN across 49 investments, all by vote. Long-term goals include improved voting and funding mechanisms, incentivized engagement, and streamlined processes. Anyone is able to submit proposals, which typically further the following goals:

1 OCEAN token equates to 1 vote. Ocean plans to eventually transition all of its currently centrally organized plans and features to the self-governing DAO. Proposals and strategy are discussed in weekly Town Halls.
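
In code, one-token-one-vote tallying is straightforward. The sketch below is a generic illustration of the principle; the proposal names and balances are hypothetical, and it is not OceanDAO's actual on-chain voting implementation.

```python
# Generic token-weighted voting tally: each OCEAN held counts as one vote.
# Hypothetical proposals and balances, for illustration only.

from collections import defaultdict

votes = [
    # (voter's OCEAN balance, proposal voted for)
    (12_000, "data-marketplace-frontend"),
    (3_500,  "defi-integration-grants"),
    (8_000,  "data-marketplace-frontend"),
]

tally: dict[str, float] = defaultdict(float)
for balance, proposal in votes:
    tally[proposal] += balance          # 1 OCEAN == 1 vote

winner = max(tally, key=tally.get)
print(dict(tally), "->", winner)        # data-marketplace-frontend wins with 20,000 votes
```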

Tokenizing data brings some legal challenges. General Data Protection Regulation (GDPR) compliance is essential, and patent law, trademark law and domestic civil law have to be considered. Most of these, however, should not present bottlenecks.

Rather more complicated is financial regulation. Looking at the tokenization of data from a European regulatory perspective, we have to pay attention to the Markets in Crypto-Assets Regulation (MiCAR), which will come into force in late 2022. Under MiCAR, all crypto assets are regulated unless they are already covered by a different regime (e.g. the Markets in Financial Instruments Directive, MiFID). Crypto assets under MiCAR are digital representations of value or rights which may be transferred and stored electronically using distributed ledger technology or similar technology. This is a very wide definition, covering tokenized data as well. Once a token qualifies as a crypto asset under MiCAR, the issuer must publish a Whitepaper and notify it to the respective National Competent Authority, and a crypto-asset service provider (essentially anyone providing crypto-asset services to others) has to be regulated. With respect to the Whitepaper requirements, MiCAR generally exempts NFTs: crypto assets which are unique and not fungible with other crypto assets do not require a Whitepaper. MiCAR thus also leaves the door open for tokenizing data as NFTs, but the crypto-asset service providers would still require a license under the soon-to-be-effective MiCAR. It is assumed that MiCAR will come into effect in Q2 2022, with an 18-month transition period.

Until then, domestic regulatory law within the EEA remains very diverse. Some EEA member states regulate crypto-asset service providers (e.g. Germany and France), while others only require registration (e.g. the Netherlands, Luxembourg, Liechtenstein). When it comes to documentation requirements, the securities prospectus for the issuance of securities (including tokenized securities sui generis) is largely harmonized within the EEA. An issuer may tokenize data, seek regulatory approval of the respective securities prospectus and passport it to other EEA member states for fundraising purposes.

In January 2020, new blockchain laws came into force in Liechtenstein with the TVTG. With this step, Liechtenstein took account of the developments of the age of digital transformation based on blockchain. The Liechtenstein Token Act allows rights and assets to be tokenized in a legally compliant manner by applying the Token Container Model. As discussed before, data tokenization can generally be divided into two sectors: utility tokens and security tokens. Let’s take a look at how one can legally tokenize data as a security by leveraging the Liechtenstein Token Act, and how to subsequently passport it (with a prospectus).

Firstly, an SPV (Special Purpose Vehicle), mostly in the form of an AG (German: Aktiengesellschaft), has to be established in Liechtenstein. This can be done with crypto as the initial capital contribution, for instance, and without a bank account. Once the establishment process is complete, the data to be monetized is “packaged” into the SPV as its sole asset, which allows its subsequent tokenization in line with the TVTG. Private placements of this securitized data can then already be made. However, since it is considered an ordinary security from a foreign perspective, offering it to retail investors requires that a prospectus be filed; thanks to Liechtenstein’s membership in the European Economic Area (EEA), the prospectus can afterwards be passported to the EU seamlessly.

The passporting process works as follows: due to Liechtenstein’s EEA membership, it is possible to tokenize rights and, via a prospectus approved by the Liechtenstein supervisory authority (the Financial Market Authority, FMA), passport them as securities sui generis to other EEA states and thus also to Germany. This is based on the regulatory requirements of the Prospectus Regulation (Regulation (EU) 2017/1129 of the European Parliament and of the Council of 14 June 2017 on the prospectus to be published when securities are offered to the public or admitted to trading on a regulated market, repealing Directive 2003/71/EC), which are harmonized under European law. What is special here is that the Liechtenstein civil- and company-law advantages of the TVTG can be combined with the regulatory single-market approach.

For EU passporting, the tokenized rights must first be approved with a prospectus at the FMA. In addition to the classic securities prospectus, a so-called EU Growth prospectus can also be considered; this prospectus type allows certain facilitations in the approval process for small and medium-sized companies, and thus especially for startups. The approval procedure is standardized throughout Europe thanks to the Prospectus Regulation: the content, scope and approval procedure are uniformly specified, as is the approval period, which provides a certain degree of planning certainty. If there are no reasons to refuse approval, the passporting can be applied for at the same time as the approval; this is done in the same procedure and takes only two working days. As soon as the FMA has confirmed the approval and the passporting, the issuer can distribute in the German market and in other European markets.

To conclude, it has to be noted that while the tools and knowledge to foster the democratization of data are there, we still have a long way to go before this becomes mainstream. For that to happen, a lot of education is needed and intuitive tools must be offered that allow users to migrate seamlessly to the instruments of data democratization. Applying blockchain technology to this sector not only leads to more efficient and decentralized access, but also gives users the opportunity to apply DeFi tools to monetize the data they produce and share. In a nutshell, we regain control of our data and can decide how, and whether, we want to monetize it. This can turn an industry that everyone contributes to (but very few actually gain from) into a truly decentralized and auspicious source of passive income. A valuable but opaque sector could become tangible, transparent and democratized.

Daniel Tóth (Ocean Protocol), Alireza Siadat (Partner at Annerton) and Nicolas Weber (Head of Business Development at Amazing Blocks) also contributed to this article.
