How to FIX Token Standards When ISO, FpML, and XBRL Have a Say
Exploring the challenges of token standardization in blockchain and its interplay with established financial protocols like FIX, ISO, FpML, and XBRL.
In my last article as part of the collaboration with The Securities Services Advisory Group (TSSAG), I explored the psychological and technical shifts required for tokenization on blockchain to succeed. This time, I want to focus on a crucial but often overlooked piece of the puzzle: token standards and their role in ensuring blockchain's deterministic properties.
This deterministic quality isn’t guaranteed. It depends on how data is recorded and entered into the system, and on overcoming current market challenges and mindsets.
A Recap of the Key Points, in My View:
Challenges for Financial Tokenization: While deterministic systems work well for cryptocurrencies, the tokenization of financial instruments faces hurdles because regulatory and operational requirements demand flexibility to manage performance and to intervene when needed.
Psychological Barriers: Blockchain adoption relies on a mindset shift, requiring decision-makers to accept constraints on their ability to interfere—this could unlock structural cost savings.
Necessary Preconditions: Tokenization requires data that is verifiable and trustworthy digitally. Without this, it cannot deliver significant advantages over existing systems.
The term "token standards" gained prominence with the advent of Ethereum, and in particular ERC20, and refers to application-layer standards that define how applications running on Ethereum interact with one another. In essence, these standards ensure that smart contracts remain composable, meaning they can integrate and function together seamlessly. To fully grasp this, we need to delve into the technical aspects underpinning these standards.
Token Standards vs. Traditional Financial Schemas
The financial industry has long relied on data schemas like FIX, ISO, and FpML, which provide common definitions for data types. For example, when exchanging data about a financial asset, the value ‘100’ might be tagged with an attribute to indicate the currency is ‘USD.’ This standardization allows different systems to exchange information efficiently.
A blockchain like Ethereum also employs a data schema to define data structures, but it takes this a step further: it specifies the behavior (functions) associated with the data. This critical feature distinguishes blockchain standards from traditional financial schemas, where behavior specifications are typically absent.
For instance, ISO 20022 includes a Business Process Model that derives the data elements used in its message definitions. However, an ISO 20022 file does not define how that data should be processed or consumed within specific business operations. Blockchain standards, by contrast, embed rules for both the structure and behavior of data, enabling automated, deterministic processes.
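To make the contrast concrete, here is a minimal sketch in TypeScript. The names (TaggedAmount, TokenStandard) are purely illustrative, not taken from any standard: the first declaration captures what a traditional schema does (structure only), the second what a token standard adds (behavior that every compliant implementation must expose).

```typescript
// A traditional schema defines structure only: the shape of the data,
// e.g. an amount tagged with its currency (much as ISO 20022 tags an
// amount with a Ccy attribute). How this record is processed is left
// entirely to each consuming system.
type TaggedAmount = {
  value: number;    // e.g. 100
  currency: string; // e.g. "USD"
};

// A token standard defines structure AND behavior: every compliant
// contract must expose these functions with these exact semantics,
// which is what makes applications composable.
interface TokenStandard {
  balanceOf(account: string): bigint;
  transfer(to: string, amount: bigint): boolean;
}
```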
Why Current Token Standards Fall Short
A token standard, such as ERC20 on Ethereum or BEP-20 on the Binance Smart Chain, defines a set of rules and functions that govern how users and applications can interact with a smart contract. These standards specify both the data structure (schema) and the associated functional behavior, enabling consistent and predictable interactions. While this approach limits the overall scope of functionality, it ensures a finite and standardized set of transactions and responses.
For example, the ERC20 standard includes essential functions such as:
transfer: Moves tokens from one account to another.
balanceOf: Retrieves the current token balance of a specific account.
These standardized functions ensure that any ERC20-compliant smart contract will behave in a consistent manner, simplifying integration and promoting interoperability across applications.
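To illustrate why this matters, here is a minimal in-memory sketch in TypeScript. It is my own simplification, not the actual standard: a real ERC20 contract is written in Solidity, runs on-chain, and includes further functions such as approve and transferFrom.

```typescript
// Minimal in-memory sketch of ERC20-style behavior. Balances live in a
// map, and the two functions behave identically for every caller.
class SimpleToken {
  private balances = new Map<string, bigint>();

  constructor(initialHolder: string, supply: bigint) {
    this.balances.set(initialHolder, supply);
  }

  // balanceOf: read the current token balance of an account.
  balanceOf(account: string): bigint {
    return this.balances.get(account) ?? 0n;
  }

  // transfer: move tokens between accounts. (On-chain, the sender is
  // implicit in the transaction; it is an explicit parameter here.)
  // Insufficient funds fail deterministically; there is no room for
  // discretionary intervention.
  transfer(from: string, to: string, amount: bigint): boolean {
    const fromBalance = this.balanceOf(from);
    if (fromBalance < amount) return false;
    this.balances.set(from, fromBalance - amount);
    this.balances.set(to, this.balanceOf(to) + amount);
    return true;
  }
}
```

Because every compliant token exposes the same functions, a wallet or an exchange integrates once against the interface rather than once per token; that is composability in practice.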
However, the current token standards are insufficient to drive deterministic behavior: the information about which asset a token represents is merely declarative and cannot be confirmed through cryptographic means. A token standard for financial instruments would need to be designed from the perspective of how a deterministic settlement arrangement can be ensured. This, in turn, requires that the correctness of the data elements describing the asset can be cryptographically proven. That may sound trivial when the question is whether an equity is denominated in USD or EUR, but it becomes a complex problem for scenarios like voluntary corporate actions.
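One conceivable building block, sketched below, is an issuer-signed attestation: canonicalize the asset's descriptive data, hash it, and have the issuer sign the hash so that any party can verify the token's reference data cryptographically. To be clear, the field names and the attestation flow are my own assumptions for illustration, not an existing standard.

```typescript
import { createHash, generateKeyPairSync, sign, verify } from 'crypto';

// Illustrative reference data; real instruments would need far richer,
// industry-agreed fields (corporate action terms, not just an ISIN).
const assetData = JSON.stringify({
  isin: 'US0000000000',
  instrumentType: 'equity',
  currency: 'USD',
});

// Hash the canonicalized data so the attestation covers its exact content.
const digest = createHash('sha256').update(assetData).digest();

// The issuer signs the digest. Ed25519 keys stand in here for whatever
// PKI the market would actually agree on.
const { publicKey, privateKey } = generateKeyPairSync('ed25519');
const signature = sign(null, digest, privateKey);

// Any counterparty can verify that a token's descriptive data matches
// what the issuer attested to, without trusting an intermediary.
console.log(verify(null, digest, publicKey, signature)); // true
```

The cryptography is the easy part; the open questions are which golden source signs, and how event-driven data such as the terms of a voluntary corporate action would be attested and kept current.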
The Blockchain Hype vs. Reality
An article reporting from the FIX Americas Trading Conference in October 2024 shared this feedback regarding digital assets:
“Panelists said blockchain should reduce costs and trading errors in securities markets, similar to how FIX connectivity did 30 years ago.”
Really? Comparing blockchain's potential to FIX connectivity is misleading because FIX addressed a specific set of problems in a narrowly defined domain, whereas blockchain’s scope is far broader and far less mature in terms of standardization.
Such claims lack grounding unless the industry addresses foundational challenges like standardization, interoperability, and integration with entrenched protocols (e.g., FIX, ISO, FpML, XBRL). The suggestion also assumes standardization and interoperability that do not currently exist. It’s like claiming a smartphone will make calls better without addressing cellular network compatibility.
FIX was designed as a front-office standard to facilitate trading and reduce communication errors between trading parties. It is not comprehensive across all asset classes (e.g., funds), nor is it intended for back-office processes. The benefits FIX brought were achieved by aligning the industry on a common protocol for specific functions (pre-trade and trade). Blockchain doesn’t currently provide a similar unified framework for cross-functional, cross-asset processes.
Blockchain operates as a data ledger rather than a common messaging standard like FIX or ISO. Without a universal data scheme or abstraction across these standards, blockchain alone cannot harmonize data flows across front-office, middle-office, and back-office operations.
Tokenization Complexity and Interoperability
Tokenization introduces an additional layer of complexity. If different firms tokenize the same asset (e.g., a bond) with varying standards or processes, the interoperability problem grows, not shrinks. The industry has yet to reconcile blockchain’s potential with the practical realities of siloed standards. FIX and ISO remain critical because they are established, interoperable standards for specific domains.
Blockchain may enable new efficiencies, but expecting it to unify fragmented data schemes and the underlying distinct business processes without addressing these hurdles is, to put it mildly, unrealistic.
Conclusion: A Universal Data Abstraction Layer?
Standardizing ahead of the market is also unlikely: it would require investing in a problem that doesn’t hurt yet, as well as a clear understanding of future requirements. A speculative design for a speculative market typically loses out to more immediate, concrete problems.
The answer to all of this seems obvious: a universal data abstraction layer. But what that layer should look like, and how and when it would emerge, is the million-dollar question.
Another way to approach this issue is to consider how much functionality and domain specificity should be encoded at the protocol level versus the application layer. Today’s securities markets, which by their very nature deal in intermediated securities, rely on common principles. Embedding these principles into the protocol could reduce reliance on higher-level applications to handle critical functions, ensuring consistency and interoperability at the most foundational layer.
Deterministic Operations: Encoding business logic, such as for a DvP transaction, directly into the protocol could enhance the deterministic properties of the blockchain, ensuring uniform behavior across all network participants (a sketch of such logic follows this list).
Avoiding Redundancy: If each application layer has to reinvent the wheel by encoding the same securities-related rules, inefficiencies and interoperability issues will arise. A protocol-level implementation would streamline operations by standardizing key processes.
Cross-Border and Multi-Party Use Cases: Securities are traded globally, and a neutral blockchain may not suffice for harmonization. Embedding certain region-specific or market-specific rules directly into the protocol could simplify international operations.
General-Purpose Blockchains vs. Modular Solutions: While general-purpose blockchains are often chosen for their flexibility and governance advantages, they may not fully address the unique demands of the securities market. Modular blockchains, which allow domain-specific rules to be applied at a customizable protocol layer while maintaining neutrality at the base layer, could offer a compelling alternative.
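To illustrate the first point above, here is a hedged sketch of what protocol-level DvP logic might look like. The Ledger interface and all names are hypothetical, not any existing protocol's API; the point is simply that the delivery leg and the payment leg either both settle or neither does.

```typescript
// Hypothetical ledger primitive; in a real system this would be a
// protocol-level facility, not application code.
interface Ledger {
  balanceOf(account: string): bigint;
  move(from: string, to: string, amount: bigint): void;
}

// Atomic delivery-versus-payment: check both legs before moving anything,
// so every participant observing the same state sees the same outcome.
function settleDvP(
  cash: Ledger,
  securities: Ledger,
  buyer: string,
  seller: string,
  price: bigint,
  quantity: bigint,
): boolean {
  if (cash.balanceOf(buyer) < price) return false;           // payment leg would fail
  if (securities.balanceOf(seller) < quantity) return false; // delivery leg would fail
  cash.move(buyer, seller, price);          // payment leg
  securities.move(seller, buyer, quantity); // delivery leg
  return true;
}
```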
Ultimately, it comes down to performance. Securities servicing requires high throughput, which may not be achievable with a generic blockchain. This may warrant custom blockchains or modular architectures where specific business logic and data structures are embedded at the protocol level.
In my opinion, this remains an open question that deserves further exploration and discussion.
The question remains as to who will drive these interoperability 'protocols': the clearing houses, the exchanges, or the custodians? In some ways, all of this detracts from decentralization, and we are left with yet another messaging layer...