Tokenization: Gimmick or Game-Changer? Unpacking the Real Challenges
Exploring the complexities and realities of tokenizing financial instruments, this article delves into the true potential and limitations of blockchain in modern finance.
Curse you, Mr. Harrold, and your suggestions on which topics this series, Rolling the Dice on Tokenization, should cover.
Perhaps I should explain this. Mr. Harrold is a founding member of TSSAG, now with Alfasec Advisors and a custody guru and industry legend, so how could I not listen to what he has to say? But here’s my problem: my blog articles are very unplanned. Their topics emerge as I encounter pieces of information that spark my thinking. And now this: I’ve got homework and need to think of a proper structure and an analytical framework.
A planned article? God forbid.
No more helpful ideas, please. This is getting too stressful. I have a larger research project on this topic and want to ensure consistency between what I write here and what is yet to come. My other concern is making sure that what I write is of sufficient quality to enrich the debate, which means taking a critical view, avoiding unnecessary mistakes, and not simply wasting your time by repeating what we all know anyway.
So what are the challenges of tokenizing financial instruments, and why and how?
Good question. Hmm.
The obvious choice to give an overview would be a domain-based structure, perhaps. There are technological, legal, and operational considerations, for example.
Or how about an asset class-based structure: equities, debt, cash?
Or perhaps a more process-oriented structure: issuance, distribution, lifecycle events?
None of these frameworks reflects how I think about the problem. This is not to say there are no legal challenges, for example; it's just that knowing this fact doesn't tell you much about how to think about the opportunity from a business perspective.
And the reason is best explained by looking at a typical article that takes such an approach, like a recent one by Schroders. According to its website, Schroders was founded in London as J.F. Schröder & Co in 1800. But let’s be honest, we wouldn’t be reading their articles today if Johann Friedrich Schröder hadn’t gotten some help—in the form of his younger brother, Johann Heinrich Schröder, who moved to London at the tender age of 16 to help manage this new enterprise. Don’t worry, he only became a partner in 1804, so the Schroders financial empire isn’t exactly the product of a 16-year-old’s brainchild.
Interestingly, I checked, and it turns out that even today, you only need to be 16 years old to be a director of a company in the UK, including a public limited company (PLC). However, managing shares and other legal obligations might still require a bit of adult supervision or a legal guardian. It’s an amusing thought, isn’t it? Mr or Mrs CEO, when will your parent sign off the board minutes?
Now, back to Schroders’ article on tokenization:
“Current regulations provide a stable foundation, but further development is needed to fully realise this technology's potential.”
And assuming that will be taken care of somehow, we are then presented with various incoherent claims.
“[..] tokenisation will reduce value leakage across the financial value chain, eliminating the need for constant data conversion between systems, dramatically lowering ownership costs and boosting scalability.”
Value leakage is a subjective and context-dependent concept, which lacks objective criteria. It is an inherently imprecise term because it is a judgment call about what costs or losses are avoidable or inefficient based on the goals and perspectives of the decision-makers involved. What might be seen as leakage in one scenario could be a necessary expenditure in another.
This perfectly summarises the overall challenge: a diffuse idea about objectives and even less clarity on how such objectives could be achieved. In this case, Schroders postulates the elimination of the need for converting data between systems.
For tokenization to truly eliminate the need for constant data conversion, there would need to be a significant reduction in the number of systems used across the financial value chain or a move towards a standardized protocol that all institutions agree upon. This is highly unlikely in the near future, as financial institutions have deeply entrenched systems and regulatory requirements that demand specific formats and processes. Translation from one format to another is not a cost driver. What makes this complicated is the interpretation of information that is described differently, i.e., at a different abstraction level. The reduced need for data conversion would thus be the outcome of significant business process standardization, and the chances of that are zilch.
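A toy example may make the distinction concrete. All names and statuses below are illustrative assumptions, not real message standards: the point is that syntactic conversion between formats is mechanical and cheap, while reconciling information described at different abstraction levels requires business judgment, which is where the real cost sits.

```python
def convert_date(iso_date: str) -> str:
    """Syntactic translation: trivial and cheap (YYYY-MM-DD -> DD/MM/YYYY)."""
    y, m, d = iso_date.split("-")
    return f"{d}/{m}/{y}"

# Semantic translation: hypothetical System A reports a coarse settlement
# status, System B uses finer-grained states. There is no lossless 1:1
# mapping -- someone must decide how "MATCHED" relates to "affirmed".
STATUS_MAP_A_TO_B = {
    "MATCHED": "affirmed",   # judgment call: could arguably be "confirmed"
    "SETTLED": "settled",
    "PENDING": "unmatched",  # System B has no generic "pending" state
}

def translate_status(status_a: str) -> str:
    """Fails where the institutions never agreed on the semantics."""
    try:
        return STATUS_MAP_A_TO_B[status_a]
    except KeyError:
        raise ValueError(f"No agreed mapping for status {status_a!r}")

print(convert_date("2024-05-31"))   # 31/05/2024
print(translate_status("MATCHED"))  # affirmed
```

The first function is the kind of "data conversion" tokenization is supposed to eliminate; the dictionary is the part that actually costs money, and no ledger technology makes that mapping decision for you.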
The argument also lacks a clear connection between fractionalization, scalability, and personalization, particularly on the critical question of cost efficiency. Scalability typically refers to the ability to manage an increasing volume of transactions efficiently, without a corresponding rise in costs or complexity. For blockchain-based fractionalization to contribute to more scalable and personalized financial products, there must be a clear articulation of how blockchain technology will reduce the costs associated with these processes, making them economically viable while addressing existing operational challenges.
Without addressing the central issue of how blockchain technology can achieve the necessary cost efficiency, the argument that tokenization inherently makes financial products more scalable and personalized remains incomplete. Simply enabling fractionalization on a blockchain does not automatically result in these benefits without a significant reduction in transaction costs or operational complexity.
The starting point should be understanding how the technology works and how it is supposed to deliver benefits, focusing, as Schroders suggests, on the most promising assets:
“[...] certain assets, including funds, are particularly well-suited due to potential improvements in settlement time, distribution efficiency, and intermediary reduction.”
Funds processing is a diverse activity with distinct subtypes, including mutual funds, exchange-traded funds (ETFs), money market funds (MMFs), and private funds. Each of these has its own operational requirements and market specificities, particularly regarding whether these assets are eligible for settlement through a Central Securities Depository (CSD). There are significant variations across markets in how these instruments are utilized. For instance, the U.S. ETF market is heavily dominated by institutional investors, whereas in Europe, retail participation is more prominent.
In mutual funds and MMFs, investors purchase new fund units at a price determined daily by the fund issuer, with the process facilitated by a transfer agent. In contrast, ETFs are traded throughout the day on exchanges by brokers who have acquired inventory from the fund provider.
Schroders’ statement suggests that blockchain and tokenization will bring new capabilities to enable fractionalization, but this raises the question of what these capabilities truly entail. Current CSD settlement systems and the custody systems used by banks already facilitate fractionalization to some extent. For example, TARGET2-Securities typically settles shares at a minimum of one share, a static value set by the issuer, but this could be modified to allow for fractional shares below one. Therefore, can it be said that TARGET2-Securities already enables fractionalization? This “benefit” is rarely emphasized because fractionalization usually occurs in specific, often complex scenarios, such as corporate actions triggered by an issuer or intentional decisions by a custodian. The fractional entitlements recorded by a bank in such cases are not settled on the open market.
So, here’s an example of the future, present today, courtesy of Revolut (though many others offer this):
Revolut Trading provides the investment service of reception and transmission of orders in relation to whole shares (“shares”) and fractions of shares (“fractional shares”) to retail clients [..]
The fractional shares [..] cannot be traded on regulated markets, such as public exchanges. That is because a fractional share is made available to you only once the third party broker has purchased a share and makes available fractions of that share to you. As such, you can only sell fractional shares you acquired via the Revolut app back to the third party broker and, as a result, fractional shares may be subject to greater liquidity risk than shares.
Consequently, tokenization does not introduce new fractionalization capabilities—these capabilities already exist within the current infrastructure. Moreover, it is unlikely to be appealing to the majority of investors because such a model introduces inherent operational complexities, and no satisfactory solution has yet been provided to address the underlying issues.
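The broker-intermediated model described in the Revolut terms above can be sketched in a few lines. This is a minimal illustration with invented names, not anyone's actual implementation: the broker buys whole shares on the open market, records client fractions on its own internal ledger, and remains the only counterparty a client can sell back to, which is precisely the liquidity risk the terms disclose.

```python
from decimal import Decimal

class FractionalBroker:
    """Sketch of a broker offering fractional shares off-market."""

    def __init__(self):
        self.inventory = Decimal("0")   # whole shares the broker owns
        self.client_fractions = {}      # internal ledger, invisible to the market

    def allocated(self) -> Decimal:
        return sum(self.client_fractions.values(), Decimal("0"))

    def buy_fraction(self, client: str, qty: Decimal):
        # The broker must first own enough whole shares on the open market...
        while self.allocated() + qty > self.inventory:
            self.inventory += 1  # buy one whole share (market transaction)
        # ...then the client's fraction is recorded internally, off-market.
        self.client_fractions[client] = (
            self.client_fractions.get(client, Decimal("0")) + qty
        )

    def sell_fraction(self, client: str, qty: Decimal):
        # The only exit: selling back to the broker, never on an exchange.
        if self.client_fractions.get(client, Decimal("0")) < qty:
            raise ValueError("client does not hold enough fractions")
        self.client_fractions[client] -= qty

broker = FractionalBroker()
broker.buy_fraction("alice", Decimal("0.4"))
broker.buy_fraction("bob", Decimal("0.9"))
print(broker.inventory)    # 2 whole shares back 1.3 allocated fractions
```

Note the residual: the broker carries 0.7 of an unallocated share on its own book, an inventory risk that exists regardless of whether the internal ledger is a database or a blockchain.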
For the time being, regulators have taken notice of this development and issued guidance, such as the one by the UK FCA, requiring that firms “should carefully consider whether their fractional share offerings are delivering good outcomes for consumers” given the often negative impact on transferability, income, and ownership rights, “including how investments may be recovered in the event of a firm failure.”
Some aspects could be addressed by changing the product definition. For example, distributing ETFs, which pay out income to investors, introduce more complexity in the context of fractionalization compared to accumulating ETFs, which reinvest income into the fund. Accumulating ETFs are particularly favored in Europe, especially in Germany, due to their tax efficiency, whereas distributing ETFs are more common in the United States, where the tax treatment of qualified dividends encourages their use. Under these circumstances, it is unlikely that U.S. investors would switch fund types solely to benefit from tokenization.
The other key open issue that makes fund tokenization at present more of a gimmick than a robust solution is the mismatch between any efficiency gains that could be achieved through tokenization of fund units and the requirements of the current market. Assume for a second that an ETF fund share could settle on a T+instant basis as a result of tokenization. The broker, however, would not be able to purchase the required securities on an instant basis because equities would still settle on a T+1 or T+2 basis, or whatever the market convention dictates. Consequently, the token, traveling at lightning speed, cannot be made available to investors without the broker either taking on new and significant market price risk for which it would not be compensated, or losing the tax benefit that stems from delivering shares to the fund when purchasing new fund shares.
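A back-of-the-envelope calculation shows what that mismatch costs the broker. The figures below are illustrative assumptions, not market data, and the square-root-of-time scaling is the usual simplification: the broker sells the tokenized unit at T+instant but receives the underlying equities only at T+2, carrying unhedged price risk over the gap.

```python
import math

def unhedged_risk(notional: float, daily_vol: float, gap_days: int) -> float:
    """One-sigma price risk carried over the settlement gap,
    using sqrt-of-time scaling (a simplifying assumption)."""
    return notional * daily_vol * math.sqrt(gap_days)

# Illustrative only: EUR 1m of fund units delivered at T+instant,
# underlying equities settling at T+2, 1% assumed daily volatility.
risk = unhedged_risk(1_000_000, 0.01, 2)
print(round(risk))  # ~14142: one-sigma exposure the broker is not paid to take
```

Either that risk gets priced into the product, eroding the claimed efficiency gain, or the broker absorbs it, which no rational broker does for free.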
Question: Is this a likely outcome? According to my very complex calculations and projections, I’d say the chances are infinitesimally small.
I had previously written about the fallacy of present MMF tokenization models, so I won’t repeat that here.
This one, which discusses similar issues, could also be interesting.
So what are the issues?
A lack of understanding of how blockchain, as a deterministic system, drives efficiency, and of what is required to operate as one. How does this differ from current market structure arrangements, which are not deterministic, and why does this distinction matter?
What token standard is needed to enable the deterministic properties of blockchain?
Which functional gaps need to be resolved to support these deterministic properties?
Would the required changes be suitable to support financial instruments as they are currently defined, or would they necessitate a targeted redefinition of certain product aspects to make this approach useful?
And so, where should we start?
I think that’s a good list for now to inform the topics I should cover in the next articles.