Author: Edward Glyn
The benefits of tokenisation are increasingly evident, but the path to realising those benefits is less clear-cut, says Edward Glyn, Managing Director, Head of Global Markets at Calastone.
The CEO of BlackRock, the world’s largest asset manager, has called it the ‘next generation for markets’. According to one estimate, it will account for 10% of global GDP as soon as 2030. Few could now doubt that the tokenisation of assets is a trend that has entered the financial mainstream. The idea that a security can be created, traded and held as a digital entity via distributed ledger technology (DLT) is no longer something for the future, but fundamental to how asset managers are working to transform their business models today.
The benefits of tokenisation are increasingly evident. It can bring accessibility and liquidity to both traditional and alternative assets – facilitating fractional ownership of everything from real estate and fine art to private equity funds. By digitising the asset, it increases both the transparency and the efficiency of how it is traded and held, potentially heralding operational savings for the asset manager and fee reductions for the client. And it points the way towards a future in which managers can offer the kind of highly personalised solutions hitherto reserved for institutions and the wealthiest clients to a much wider customer base – with bespoke portfolios built from the ground up on tokenised holdings across a range of asset classes.
The path to realising those benefits is less clear-cut. While tokenisation is a widely accepted priority for the industry – 97% of institutions agreed that it will ‘revolutionise asset management’, according to a recent BNY Mellon survey – consensus breaks down over the details of implementation. Definitions of what tokenisation means in practice, views on exactly how it should be manufactured, and agreement on client outcomes and the most important use cases all vary considerably.
The need to understand those differences and evaluate the strengths and weaknesses of the various approaches has become critical as the industry moves from talking about tokenisation to implementing it, with experimental products now arriving on the market. The decisions made today about tokenised assets – the shape of products, underlying systems and regulation – will do much to determine the direction of the industry for years to come. It has never been more important for asset managers to understand precisely what they mean by tokenisation and exactly what they are trying to achieve with it.
The two faces of tokenised funds
The tokenised funds that have launched over the last year, from ETFs to private equity funds, have served to illustrate the different visions that surround this technology.
The first and simplest approach to tokenisation has focused on the unit level. While the fund continues to operate as before, an additional register of tokens is recorded on a public blockchain, a new layer of record keeping alongside the traditional register of units maintained by the transfer agent.
This allows fund units to be traded as tokens – alternative managers including KKR and Hamilton Lane have already launched digital funds along these lines.
Yet this is tokenisation at its most superficial level: it leaves the entirety of fund management’s complex value chain intact. It doesn’t change how a fund operates, maximise value to the investor, or fully exploit the potential of DLT as a platform.
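Purely as an illustration of this dual-register model – no real platform, chain or library is assumed, and every name here is hypothetical – the unit-level approach amounts to a token ledger that simply shadows the transfer agent’s authoritative register:

```python
# Illustrative sketch of unit-level tokenisation: the transfer agent's
# register of fund units remains authoritative, while a parallel token
# register mirrors each holding one-to-one. A plain dict stands in for
# the public blockchain; all names and figures are hypothetical.

from dataclasses import dataclass


@dataclass
class UnitHolding:
    investor_id: str
    fund_id: str
    units: float


class TransferAgentRegister:
    """The traditional, authoritative record of fund units."""

    def __init__(self):
        self._holdings: dict[tuple[str, str], float] = {}

    def record(self, holding: UnitHolding) -> None:
        key = (holding.investor_id, holding.fund_id)
        self._holdings[key] = self._holdings.get(key, 0.0) + holding.units

    def balance(self, investor_id: str, fund_id: str) -> float:
        return self._holdings.get((investor_id, fund_id), 0.0)


class TokenRegister:
    """An additional record-keeping layer that mirrors the unit register."""

    def __init__(self, source: TransferAgentRegister):
        self._source = source
        self._tokens: dict[tuple[str, str], float] = {}

    def mint_to_mirror(self, investor_id: str, fund_id: str) -> float:
        # One token per unit: the token supply simply shadows the
        # balance held on the traditional register.
        units = self._source.balance(investor_id, fund_id)
        self._tokens[(investor_id, fund_id)] = units
        return units


ta = TransferAgentRegister()
ta.record(UnitHolding("INV-001", "FUND-A", 150.0))
tokens = TokenRegister(ta)
print(tokens.mint_to_mirror("INV-001", "FUND-A"))  # 150.0
```

The point the sketch makes is that the token layer adds nothing the transfer agent does not already know: every balance still originates in, and must be reconciled back to, the legacy register.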
More ambitious is the vision that the assets in the fund, not just the unit, are tokenised. Under this model, an entire collective investment vehicle is built and administered on DLT, running digitally end-to-end. The tokens do not represent fund units but the proportions of the assets themselves – whether equities, fixed income or alternatives.
This approach addresses not just the question of how to trade funds digitally, but how to digitise the entire process of building, distributing and managing them. By bringing the entire investment value chain onto DLT, it directly tackles much of the inefficiency that dogs the legacy fund model – ensuring data is commonly accessible to all counterparties rather than being laboriously passed along every link in the chain. Not only should this boost operational efficiency, it should also aid innovation, fostering collaboration in product development and enabling a deeper level of real-time analytics.
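To make the contrast with the unit-level model concrete – again as a purely hypothetical sketch, with invented names and figures rather than any real fund or platform – asset-level tokens confer a pro-rata claim on the underlying holdings themselves, so any counterparty with access to the shared ledger can compute an investor’s look-through exposure directly:

```python
# Illustrative sketch of asset-level tokenisation: tokens represent
# proportional claims on the fund's underlying assets recorded on a
# shared ledger, not units in a wrapper vehicle.
# All names and figures are hypothetical.

portfolio = {"equity:ACME": 600_000.0, "bond:GOV-10Y": 400_000.0}  # asset values
total_tokens = 1_000_000  # tokens in issue


def look_through(holder_tokens: int) -> dict[str, float]:
    """Each token confers a pro-rata share of every underlying asset,
    so a holder's exposure is computed directly from the ledger."""
    share = holder_tokens / total_tokens
    return {asset: value * share for asset, value in portfolio.items()}


print(look_through(50_000))
# a 5% holder sees a 30,000 claim on the ACME equity and 20,000 on the bond
```

Because exposure is derived from one shared record rather than reconciled across intermediaries, the same data can feed dealing, reporting and analytics without being re-keyed at each link in the chain.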
This platform-based, asset-level approach to tokenisation is one Calastone has been piloting with several global asset managers, and in consultation with regulators. It is one we believe will harness DLT as a powerful platform for providing, managing, securing and distributing tokenised assets, helping to unlock meaningful improvements in operational efficiency and product innovation.
Transforming the industry
Ultimately that is the long-term prize of tokenisation: not simply to change the way funds are structured and distributed, but to transform the basis on which asset management operates – from fragmented systems that engender cost and complexity to shared digital platforms that allow data to flow seamlessly, modern products to be launched rapidly, and the benefits of both efficiency and innovation to be felt by all.
More than a technology to be deployed, tokenisation should be seen as a catalyst for much-needed change in an industry that is approaching its centenary and carrying considerable legacy baggage. At a time when fund managers face margin compression, pressure to differentiate in a crowded market and the challenge of attracting a new generation of investors, tokenisation represents the means to radically transform the products they bring to market – making them cheaper, more personal, more flexible and more competitive. The more ambitiously tokenisation is pursued now, the greater those benefits stand to be in the crucial years to come.
Edward Glyn is Managing Director, Head of Global Markets at Calastone