As a follow-up to the latest events in Boston and New York during Blockchain Week, this piece discusses the current state of crypto data and research, and the critical role it plays in the maturation of the space.
Last week, RIA participated in a panel on the state of data in crypto. One question someone raised was: "Why are data and research companies needed when all the data in the space is public and free?"
The answer is that while almost everything is public in its raw, unfiltered form, much of that data is noisy and misleading. Aggregating, filtering, and standardizing crypto data is not a trivial task – it is a full-time job, and it is neither efficient nor scalable for individual actors such as institutions or funds to do it themselves.
However, high-quality data is a must if the space is to mature, attract institutions and asset managers, and gain regulatory support.
The challenges weighing on the current state of crypto data can be analyzed through three key criteria – comprehensibility, integrity, and accessibility. What challenges do these criteria refer to, and what are research and data companies doing, or what do they still have to do, to solve them?
Comprehensibility of data
Comprehensible data is easy to understand and analyze in order to extract precise insights. In general, data on traditional companies and assets meets this criterion. Traditional companies fit neatly into sectors, and companies within a sector have metrics that are easy to track and compare. We know that if Samsung sells more phones at a higher margin than Apple, then it is likely doing better as a company at that point in time.
The same cannot be said for cryptoassets and networks. The main reason is that metrics that appear simple come with a series of caveats that are neither intuitive nor easy to explain. A couple of examples that come to mind are transaction volume and transaction count on the Bitcoin network.
Transaction volume is the value of all transactions that take place on-chain. What is not intuitive is that when we talk about transaction volume we often refer to the gross amount, which includes real economic volume but also non-economic volume such as change outputs.
Change outputs are the excess portion of a transaction's volume that is sent back to the originating wallet – it is incorrect to count the value of these outputs as economic volume. It would be like going to the grocery store, paying for a $5 bagel with a $10 bill, receiving $5 back in change, and claiming that the economic value of the transaction was $10. A couple of data providers that have rolled out adjusted economic volume measures include CoinMetrics and TokenAnalyst.
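The adjustment described above can be sketched in a few lines of Python. This is a simplified illustration, not any provider's actual methodology: it assumes each output is already labeled as change, whereas real providers have to infer change outputs with address-clustering heuristics.

```python
# Simplified sketch: adjusting gross on-chain volume for change outputs.
# Assumes outputs are pre-labeled as change; in practice this labeling
# must be inferred heuristically.

def adjusted_volume(transactions):
    """Return (gross volume, volume excluding change outputs)."""
    gross = 0.0
    adjusted = 0.0
    for tx in transactions:
        for value, is_change in tx["outputs"]:
            gross += value
            if not is_change:
                adjusted += value
    return gross, adjusted

# The $5 bagel paid with a $10 bill: one $5 payment, one $5 change output.
bagel_tx = {"outputs": [(5.0, False), (5.0, True)]}
gross, adjusted = adjusted_volume([bagel_tx])
print(gross, adjusted)  # 10.0 5.0
```

The gross figure counts the full $10 moved on-chain, while the adjusted figure counts only the $5 of genuine economic value.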
Another metric often used to track Bitcoin's performance is the transaction count, or the number of transactions that take place on-chain. The limitation of transaction count is that it does not separate out the individual payments contained within batched transactions.
Transaction batching is a practice used by exchanges, mining pools, and other large users of the network to reduce transaction fees and save valuable block space. One problem is that the level of batching on Bitcoin can vary over time, which makes it difficult to compare Bitcoin's transaction count with itself historically.
In addition, the level of batching varies across chains, which makes it difficult to make apples-to-apples comparisons of transaction counts between networks. A more appropriate metric that incorporates all the payments made within batched transactions is the payment count.
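The difference between the two counts can be illustrated with a short sketch. One common approximation, assumed here for illustration, treats each output of a transaction as a separate payment, minus one presumed change output per transaction:

```python
# Sketch: transaction count vs. payment count.
# Approximation (an assumption, not a standard): payments per transaction
# equal its output count minus one presumed change output.

def tx_and_payment_counts(transactions):
    tx_count = len(transactions)
    payment_count = sum(
        max(len(tx["outputs"]) - 1, 1)  # at least one payment per tx
        for tx in transactions
    )
    return tx_count, payment_count

# One batched exchange payout to 100 recipients plus a change output:
batch = {"outputs": [f"recipient_{i}" for i in range(100)] + ["change"]}
print(tx_and_payment_counts([batch]))  # (1, 100)
```

Transaction count registers this as a single event, while payment count registers the 100 payouts the exchange actually made – which is why the latter is more robust to changing levels of batching.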
Arriving at reliable economic metrics becomes harder as the complexity of network analysis increases. That is why we need research and data providers – first to educate stakeholders about the dangers of using raw data, and then to provide them with more comprehensible alternatives.
Integrity of data
High-integrity data is trustworthy, reliable, and consistent. While there is a wealth of data in crypto, certain data sets are susceptible to intentional or unintentional gaming, manipulation, or inconsistencies.
A few frequently cited examples of data prone to gaming or inconsistencies include user and transaction counts on low-fee blockchains, exchange volume, certain measures of development activity on GitHub, and the inputs of – for lack of a better word – market capitalization.
In the case of market capitalization – calculated as spot price times the total number of tokens circulating on-chain – many of the challenges in obtaining an accurate measure of market cap or network value derive from inconsistencies in supply.
For traditional companies, for example, price is variable but supply is largely fixed. In crypto, both price and supply are variable, so it is difficult to know whether an increase in market capitalization is the result of an increase in price or in supply. Moreover, if it is the result of an increase in supply, the reasoning behind the supply change can be quite opaque.
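The ambiguity can be made concrete with a simple decomposition. The sketch below (with illustrative numbers, not real asset data) splits a change in market cap into a price-driven term, a supply-driven term, and a small cross term:

```python
# Sketch: decomposing a market-cap change into price and supply effects.
# All numbers are illustrative.

def market_cap(price, supply):
    return price * supply

def decompose_change(p0, s0, p1, s1):
    """Split the market-cap change into price, supply, and cross terms."""
    price_effect = (p1 - p0) * s0
    supply_effect = (s1 - s0) * p0
    cross_term = (p1 - p0) * (s1 - s0)
    total = market_cap(p1, s1) - market_cap(p0, s0)
    return price_effect, supply_effect, cross_term, total

# Price rises 10% while supply inflates 5%:
effects = decompose_change(p0=100.0, s0=1_000_000, p1=110.0, s1=1_050_000)
print(effects)  # (10000000.0, 5000000.0, 500000.0, 15500000.0)
```

Here roughly a third of the headline market-cap growth comes from dilution rather than price appreciation – exactly the distinction a naive market-cap figure hides.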
Additional challenges include comparing the supply curves of different cryptoassets and flaws in the calculation of circulating supply. Cryptoasset supply schedules differ significantly from one asset to another and are often not clearly disclosed.
All else being equal, a cryptoasset with a higher inflation rate will see its market capitalization grow at a faster rate than an asset with a lower inflation rate. As for circulating supply, the measured on-chain supply cannot account for lost coins, coins locked up in contracts, or unclaimed coins from forked assets.
So are we doomed to use low-integrity metrics forever? Fortunately not.
Crypto analysts and companies have introduced proxies that account for some of these challenges. One example is the realized cap metric created by Nic Carter and Antoine Le Calvez. Realized cap attempts to account for lost coins and unclaimed coins from forked assets.
Another proxy is liquid market capitalization, recently introduced by Messari, which attempts to account for coins subject to restrictions, among other adjustments.
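The core idea behind realized cap can be sketched as follows: instead of valuing every coin at the current spot price, value each coin at the market price at the time it last moved on-chain. Coins lost years ago then carry old, low prices instead of inflating the total. The data and structure below are illustrative, not a provider's actual implementation:

```python
# Sketch of the realized-cap idea. Each UTXO is represented as
# (amount, price_at_last_move); all numbers are illustrative.

def naive_market_cap(utxos, spot_price):
    return spot_price * sum(amount for amount, _ in utxos)

def realized_cap(utxos):
    """Value each coin at the price when it last moved on-chain."""
    return sum(amount * price_at_last_move
               for amount, price_at_last_move in utxos)

# Two coins last moved at $10 (possibly lost), three moved at $5,000:
utxos = [(2.0, 10.0), (3.0, 5000.0)]
print(naive_market_cap(utxos, spot_price=8000.0))  # 40000.0
print(realized_cap(utxos))                         # 15020.0
```

The likely-lost coins contribute almost nothing to realized cap, whereas the naive market cap values them at today's spot price as if they were fully liquid.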
But these proxies have limitations too. That is why it is important to include complementary measures of network strength and performance in any comprehensive analysis. Examples include hashrate or fees paid to miners on proof-of-work chains, and coins locked up in staking on proof-of-stake chains.
Progress has been made in helping stakeholders separate signal from noise. Data integrity, however, remains a frequent problem facing the space, and as more institutional capital enters, data and research companies will have to continue acting as its stewards.
Accessibility of data
Accessibility refers to how easy it is to collect and manage data. In traditional markets, investors and analysts have a single point of access to most of the financial data they need through Bloomberg, FactSet, or Thomson Reuters. These companies also have advisors whom customers can contact if they need help finding or evaluating anything.
Crypto data, by contrast, is fragmented across hundreds of websites and is quite the opposite of easy to find and manage. Many of these websites offer a sliver of market or network data with varying levels of context and sophistication.
There are a few key reasons for this level of chaos and fragmentation. Crypto has different categories of data – market data, on-chain data, off-chain data – each of which has its own challenges and requires a different set of skills and tools to aggregate, clean, and standardize. Another issue is that, unlike traditional markets, the crypto space does not have a uniform set of data reporting standards.
Crypto companies are not required by oversight agencies to disclose performance. Finally, when it comes to market data, a challenge unique to the crypto space is that these assets trade across multiple venues worldwide with differing levels of transparency, integrity, and accessibility.
However, a number of companies are shouldering the burden of creating industry data standards and promoting self-regulation among crypto projects. The integration, consolidation, and investment of time and money needed to attack the accessibility challenge should continue in the coming years.
While there is a long way to go, it is important to remember that this industry is still only ten years old, and it is impressive to see how far crypto data and research have come in such a short time.
The number of data and research providers has grown strongly over the last year as attention to separating signal from noise has increased. Why? Because we understand that it is fundamental – not optional – for the crypto industry to start offering comprehensible, high-integrity, and accessible data and research that current and future members of the crypto community can trust if we want to see the space evolve.