Decoding the Layered Tokenization Enigma
Unlocking the World of Blockchain Needs Business Leaders Well Versed in Interpreting Available Data and Realizing Its Potential. Case Study Included.
If you ask me, the most valuable, concise, and insightful market research on digital assets can be found in 21.co's reports. If someone could somehow distill the expertise of the people involved into a machine learning model and add a touch of innovation to ChatGPT, I'd be ready to invest. Consider this business idea my gift to you!
One of their recent reports provides a comprehensive overview of the current tokenization market. However, what strikes me is how it reflects a particular worldview of tokenization I often encounter in industry circles - a worldview that diligently compiles concrete and observable market data only to then disregard it entirely. Instead, it seems to embrace tokenization as a utopian vision that is believed to be just around the corner. It might be a bit unfair, but it occasionally reminds me of the speech given by Erich Honecker at the 40th anniversary of the GDR in Berlin, with distinguished guests from around the world in attendance:
The GDR will also enter the year 2000 with the conviction that socialism is the future [...]
Our position [...] is not based on outdated doctrine [...] Our position is a policy founded on the highest principles [...]
We do not rest on our past achievements [...] We discard what is outdated and restrictive; we continue to progress on our path.
It could serve as an intriguing prologue for a tokenization manifesto, don't you think? Even in the face of protests outside the 'Palace of the Republic' that eventually led to the regime's collapse, Honecker's belief in being on the right side of history remained unshaken. And given his biography, I cannot even blame him, but that is beside the point. Admittedly, it's a far-fetched analogy, but one that ignites my vivid imagination. Allow me to illustrate my point with a few examples.
The Destabilising Feedback Loop
On page 7, the report presents an intriguing idea about how Layer 1 blockchains, through their native currencies, capture a significant portion of the value created. The initial premise, which I wholeheartedly share, is that:
“Decentralized blockchains are financially sustained by the underlying native token coordinating stakeholders to secure the network and validate transactions."
The authors go on to suggest that a growing digital asset market increases demand for these tokens and, therefore, their price, implying that investors can gain meaningful exposure to the growth of blockchain-enabled solutions. This infrastructure, so the argument goes, is as important as the communication protocols supporting the internet, or as oil is to our present economy. Since I am not in the business of giving investment advice, I'll let you make of it what you like. But I do want to make a few general points.
There is a trend towards Layer 2 (L2) solutions to overcome the performance limits of Layer 1 (L1) protocols. The growth of L2 solutions, while expanding the overall blockchain ecosystem, can lead to a relative decrease in the share of activity captured by L1 blockchains. As L2 solutions become more efficient and handle more transactions, they can absorb a significant portion of activity previously managed on L1s. This dynamic doesn't necessarily diminish the absolute value or importance of L1 blockchains, but it does suggest a shift in how value and activity are distributed across the layers. The evolving landscape reflects a more complex and interconnected blockchain ecosystem, in which both L1 and L2 solutions play crucial, albeit evolving, roles.
So, the first question I would ask my financial adviser is how the various components of a digital infrastructure capture economic rents. In other words, what is the market structure of the future, factoring in the dynamic and evolving nature of Layer 1 and Layer 2 interactions?
The reason I am raising these questions is not to predict the future price of crypto. My point is that since native currencies have a coordination function (as the report acknowledges), they need a price to perform this function. But that does not mean they become financial assets. The analogy of transaction fees to commodities like oil is intriguing, although I prefer electricity (it lets me steer clear of the choppy waters of decarbonisation, which subject oil to quite particular dynamics). Comparing crypto to utility services like energy should remind us that these markets are not trading financial assets either. Electricity markets sometimes experience negative prices due to oversupply, and trading electricity is a way to coordinate and balance supply and demand across grids to keep them functioning properly. What this tells me is that there is a delicate balance to be struck in monetising these digital commodities. Overemphasis on monetisation risks skewing the primary function of these networks, which is efficient resource allocation.
The Last Will Be First …
Page 8 provides an overview of the activity and relative performance of different protocols used for tokenization. What is striking is this: according to the table, Ethereum's transaction costs exceed competitors by a ratio of 9:1 to 197,000:1, depending on which network you pick. Additionally, Ethereum significantly underperforms in the speed of execution, yet it maintains a 58% market share in tokenization. The authors comment on this extraordinary achievement rather nonchalantly: "Ethereum optimises for security and decentralisation, while the rest compete on speed and scalability.” Hm!
I don't know about you, but getting an almost-as-good-as-Starbucks cappuccino 197,000 times cheaper would make me seriously reconsider my loyalty. Ethereum's dominance despite high costs and slow speeds seems like a paradox. Is it just a first-mover advantage? Maybe not entirely. The real twist might lie in how non-native assets, tokens such as USDC, cleverly bypass the sluggishness of Layer 1 through faster, cheaper side channels. This trend hints at an evolving blockchain equilibrium: minimising Layer 1 settlements in favour of efficient Layer 2 solutions, all while keeping the Layer 1 network stable and validators happy. These dynamics have strong parallels in banking: settlement internalisation away from CSDs, or alternatives to the SWIFT network that offer secure but less costly communication. They pose risks and opportunities, indeed, a delicate balancing act! And for my liking, there is surprisingly little debate about these huge cost deltas and the kind of behaviour they will encourage.
Mislabelled Labels
Editorial comment for page 10, I could not stop myself: the European Investment Bank (EIB) is a supranational entity, not a government body, so its bonds are not government bonds. Moving on.
The 10 Percent Enigma
Page 29 is what compelled me to write this blog:
“We estimate that the market value for tokenized assets will be between $3.5 trillion [..] and $10 trillion [..] by 2030. For the base case, we assume that tokenization will capture about ~10% of the net assets of regulated open-end funds [..]”
This ominous ten percent is everywhere.
BCG writes “tokenization market to be 10% of GDP by 2030.” So does Citi: “According to a BCG and ADDX study, tokenization of global illiquid assets is estimated to be [..] nearly 10% of global GDP by 2030 [..] based on conversations with internal and expert domain experts — we forecast $4 trillion to $5 trillion of tokenized digital securities [by 2030].” And so does everyone else.
I really don’t like this 10%! And we have to thank the “Global Agenda Council on the Future of Software & Society” for creating this chimera in our minds, not unlike Coca-Cola inventing a Santa Claus in a red outfit or the greeting card industry inventing Valentine’s Day. And yes, that really is their name, and yet nobody asks whether this was meant to be a parody publication by the WEF. I don’t need to ask the question, because I am convinced I already have the answer, of course.
Please judge for yourself. What the council wrote in 2015 is quite revealing:
“Shift 16: Bitcoin and the Blockchain
The tipping point: 10% of global gross domestic product (GDP) stored on blockchain technology.
Expected date: 2027.
By 2025, 58% of respondents expected this tipping point to have occurred.
An explosion in tradable assets, as all kinds of value exchange can be hosted on the blockchain
Better property records in emerging markets, and the ability to make everything a tradable asset
Contracts and legal services increasingly tied to code linked to the blockchain, to be used as unbreakable escrow or programmatically designed smart contracts
Increased transparency, as the blockchain is essentially a global ledger storing all transactions.”
Ok, I repeat: how can this not be meant as a parody? And yet it’s driving all these forecasts. Putting that aside, let’s look at the model presented.
Current market size of tokenized assets: $118.57 billion.
Digital Dollar / USD stablecoins represent 97% of all tokenized assets.
The addressable market lists various asset types: equities, debt, etc.
And we want to land between $3.5 trillion and $10 trillion by 2030.
The required Compound Annual Growth Rates (CAGRs) for these scenarios would be as follows (see table):
Bear case, $3.5 trillion, CAGR: approximately 167.62%.
Base case, $6.8 trillion, CAGR: approximately 194.38%.
Bull case, $10 trillion, CAGR: approximately 210.92%.
And be quick, the clock is ticking, starting 1 January 2024. My numbers are higher than 21.co's because I would exclude stablecoins here, for two reasons: the report's list of assets considered as the addressable market does not mention stablecoins, and I assume the authors are not advocating that stablecoins would still account for 97% of tokenization balances.
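The arithmetic is easy enough to check. Here is a minimal sketch of the calculation, under my own assumptions (not necessarily 21.co's exact model): a non-stablecoin starting base of roughly 3% of the $118.57 billion market, compounding over the seven years from the start of 2024 to the end of 2030.

```python
# Required compound annual growth rates for the 2030 tokenization scenarios.
# Assumption (mine, not 21.co's model): stablecoins are excluded, so the
# starting base is ~3% of today's $118.57bn tokenized-asset market, and
# growth runs over the 7 years from start-2024 to end-2030.
base = 118.57e9 * 0.03   # ~$3.56bn of non-stablecoin tokenized assets
years = 7

def required_cagr(target: float, start: float = base, n: int = years) -> float:
    """Annual growth rate needed to compound `start` into `target` over `n` years."""
    return (target / start) ** (1 / n) - 1

for label, target in [("Bear", 3.5e12), ("Base", 6.8e12), ("Bull", 10e12)]:
    print(f"{label} case ${target / 1e12:.1f}tn: {required_cagr(target):.1%} per year")
```

Under these assumptions the three scenarios come out at roughly 168%, 194%, and 211% per year, in line with the figures above.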
Now here, I think, past and present collide. Writing a speech for the Chairman of the State Council of the German Democratic Republic in 1989 about everlasting socialism and predicting the size of tokenized markets by 2030 are not so different. Whatever assumptions were used, then or now, to construct their respective future scenarios, I cannot for a moment believe either author thought them highly realistic.