DAML ensures integrity and confidentiality within and across markets
During 2018 we published a series of blogs describing DAML, our modeling language for developing smart contracts (if you missed the series, you can start here). That series focused on the contract developer’s point of view, detailing how DAML abstracts away the complexity of guaranteeing ledger integrity and preserving the confidentiality of sensitive information, making it possible for developers to safely and concisely capture multi-party business workflows with minimal effort and risk. There is another side to DAML that we haven’t explored externally yet: how the language itself can pave the way for interconnecting markets in ways that are simply not possible today. In this blog post I’ll touch on how DAML can help facilitate market interoperability. I’ll focus on capital markets so that I can provide a consistent set of examples, but the concepts apply equally to any industry where participants engage in multiple independently operated markets.
Siloed, multi-operator capital markets today
Interoperability across markets is of course nothing new. There are approximately 80 stock exchanges worldwide, and an even larger number of commodity, bond, derivatives, and OTC markets; it is very common for multinational firms to trade in more than one market. The IT world is awash in legacy gateways, message brokers, enterprise service bus architectures, and more, all designed to help data flow across disparate markets.
Interoperability across markets today
However, even with all of this technology, it remains impossible for a participant with trades in multiple markets to validate their overall position at any moment in time with absolute certainty. Why?
The problem stems from the fact that today each market is operated by a distinct legal entity that follows its own set of rules and practices, many of which are mandated by law or regulation. Operators have sole responsibility for managing who can participate in their market, have final say in determining when a transaction is final, and maintain the ‘golden record’ of all transactions conducted within their market. On top of that, each market participant also maintains its own records of its trades, generally compiled from messaging among stakeholders and end-of-day reports.
Within a market today it takes time for these messages and reports to be generated, propagated, and reconciled across all trading participants. Although a participant can have a fair idea of their positions in a market at any given moment in time, these positions are not known with certainty until the reconciliation process completes — typically at the end of a trading day. If market participants do not have a certain, real-time view of their positions within a market today, how can there be any expectation of having such a view across markets?
Many market operators are looking to DLT to solve this data consistency problem within their individual markets by replacing independently-maintained silos of information with a single source of truth that is shared across the entire market. Mutualizing infrastructure in this manner gives market participants the means to efficiently automate and coordinate transactions and workflows across and within organizations. If architected correctly, DLT can also make it possible for participants to be able to — for the first time — maintain an accurate, real-time, cross-market view of their positions.
But data consistency is not the only cross-market deficiency that needs to be addressed. In addition to having confidence in their cross-market view of data, market participants would also like to be able to execute trades atomically across markets. This is of course not even conceivable today because transaction commitment decisions are made on an individual market basis and there is no effective mechanism for multiple, independent market operators to collaborate on committing a transaction that crosses their markets.
Here again, it is possible for DLT to deliver this brand-new capability — to atomically transact across markets — but success is highly dependent on how one goes about architecting the solution.
The interconnected capital markets of tomorrow
Interoperability across markets tomorrow?
This sort of interoperability across markets requires the DLT infrastructure to be able to maintain the concept of a consistent, cross-market state and atomically deliver cross-market validation of transactions. How can this be achieved?
It’s natural to think that the only way to get there is to adopt an architecture where one distributed ledger is shared and maintained across a set of market operators. In such an architecture, ledger integrity is typically maintained by a Byzantine fault tolerant consensus algorithm to allow the mutually distrusting operators to come to agreement on how data is written to that shared ledger.
Multi-party, Byzantine fault tolerant deployments are likely not permissible in capital markets (and likely not in many other markets either), where the individual operators are bound by law to guarantee the behavior of the systems maintaining their markets. Market operators simply cannot relinquish or share responsibility for their legislatively mandated rules and practices. In other words, market participants need a mechanism to ensure they all commit transactions consistently, but the ultimate arbiter deciding whether a transaction is committed or rejected for any specific market at the business level is not a vote between participants; it is the operator of that market, or the participant itself for its own internal books and records.
Consensus algorithms are not necessary for commitment protocols
There is, however, a deeper problem with consensus algorithms: they are fully replicated and only work if every node has access to all of the data in the market. This means that even nodes with no stake in a given transaction would be exposed to data about that transaction — violating the strict data confidentiality laws under which most markets operate (this blog entry provides an in-depth perspective on how difficult it can be to keep data private in distributed ledgers) and violating the responsibility of market operators to deliver consistent operation of their market.
In any market, what’s really important is that the stakeholders to each transaction commit that transaction locally; there is no need for network-wide consensus to determine when to commit every transaction! We only need the stakeholders to a given transaction to agree over commitment of that transaction, and this sort of per-transaction consensus is a much smaller problem to solve than traditional consensus algorithms are designed to handle.
I pointed out earlier that every market is operated under a set of rules that are particular to that market. The market’s rules are typically codified into a legally binding agreement (often called the Master Service Agreement and/or the Operator’s Rulebook, as was discussed in this blog entry) that all market participants must voluntarily accept in order to participate in that market.
Whether or not a transaction should be committed is defined deterministically by these market rules. So when any set of market participants enters into an agreement covered under the MSA, a transaction will be committed to the ledger if and only if the rules of the agreement are observed by all parties. This is a cleaner, simpler mechanism that limits both the decision scope and the data exposure strictly to the stakeholders in the agreement.
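This commit-if-and-only-if-the-rules-hold behavior maps directly onto DAML’s authorization model. The following is a minimal illustrative sketch (the template and field names are hypothetical, not taken from any real rulebook): a trade can only be committed with the authority of every signatory, and the `ensure` clause encodes data constraints from the market rules, so an ill-formed or unauthorized transaction is simply never written to the ledger.

```daml
module BondTrade where

-- Hypothetical sketch of a bilateral trade under a market's MSA.
-- 'operator' stands for the market operator; all names are illustrative.
template BondTrade
  with
    operator : Party   -- the market operator, per the MSA
    buyer    : Party
    seller   : Party
    isin     : Text
    quantity : Int
    price    : Decimal
  where
    -- The transaction commits only with the authority of every signatory;
    -- a missing authorization means no commit, with no network-wide vote.
    signatory operator, buyer, seller
    -- Data constraints from the market rules, checked at commit time.
    ensure quantity > 0 && price > 0.0
```

Note that only the three stakeholders ever see this contract; no other node in the market is involved in, or exposed to, the commitment decision.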
This is not to say that consensus algorithms have no place in a market-based DLT. For example, since the decision to commit is a separate process from the actual writing of committed transactions to the ledger, it’s important to be able to resolve the proper ordering of transactions to avoid race conditions. One could use a consensus algorithm here; but considering that there are institutions in any given market that could be trusted to properly timestamp these requests, why impose the complexity of a consensus algorithm on a process that does not inherently require ‘trustlessness’?
DAML is about sharing rules across markets — not data
And this is where DAML comes into play. In previous blog entries, we have described DAML as
- a language of contracts, where agreements and the parties to them are native constructs in the language;
- a language of privacy, where only the data that a party is authorized to access is revealed to them;
- a language of ledgers, where contract data is stored using a structured ledger data model; and
- a functional language, with functions as its mathematical building blocks.
With these properties built into the language, it is a straightforward task to encapsulate the rules of a market (the MSA) into DAML libraries. The publisher of a DAML library encodes who is allowed to act, and how they are allowed to act, in those libraries. As such, a market operator has full control over which members of a network can act within a given market.
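As an illustrative sketch of what such an operator-published library might look like (the template names and fields here are hypothetical, not an actual market rulebook), the operator alone decides who may act in the market: only a party holding a membership contract signed by the operator can exercise market choices.

```daml
module Membership where

-- Illustrative sketch of an operator-published rulebook library.
template Membership
  with
    operator : Party
    member   : Party
  where
    signatory operator
    observer member

    -- A right granted by the operator: the member may place an order,
    -- but only because the operator's library says so.
    nonconsuming choice PlaceOrder : ContractId Order
      with
        instrument : Text
        quantity   : Int
      controller member
      do create Order with
           operator = operator
           member = member
           instrument = instrument
           quantity = quantity

template Order
  with
    operator   : Party
    member     : Party
    instrument : Text
    quantity   : Int
  where
    signatory operator, member
```

Revoking a participant’s access is then just archiving their `Membership` contract: with no contract to exercise, the party has no way to act in the market at all.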
A practical example: the ISDA CDM
It’s very common for organizations within capital markets to codify best practices in order to streamline market operations. Consider, for example, the emerging Common Domain Model (CDM) from ISDA (the International Swaps and Derivatives Association). The stated goal of the CDM is to “provide an industry standard blueprint for how derivatives are traded and managed across the lifecycle, and how each step in the process should be represented.”
For a derivatives market, the CDM spec defines a set of data types and the constraints they must satisfy, along with workflows built from rights and obligations. If you’ve read the prior blog series, you’ll recognize these as concepts that are central to DAML, so it should come as no surprise that a DAML-driven system is a natural choice for derivatives markets adhering to the CDM.
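To make the correspondence concrete, here is a rough sketch of how a CDM-style product and lifecycle event might land in DAML. The field and choice names below are loosely inspired by the CDM but are placeholders of my own, not the actual CDM schema: data types become template fields, constraints become `ensure` clauses, and a lifecycle step becomes a choice exercisable by the entitled party.

```daml
module CdmSketch where

-- Placeholder enumeration; not the real CDM reference data.
data RateIndex = LIBOR | EURIBOR deriving (Eq, Show)

template InterestRateSwap
  with
    partyA        : Party
    partyB        : Party
    notional      : Decimal
    fixedRate     : Decimal
    floatingIndex : RateIndex
    maturity      : Date
  where
    signatory partyA, partyB
    -- CDM-style data constraints become 'ensure' clauses.
    ensure notional > 0.0

    -- A CDM lifecycle event (e.g. a partial termination) becomes a
    -- choice: an obligation-changing step the entitled party may take.
    choice PartialTermination : ContractId InterestRateSwap
      with
        reduceBy : Decimal
      controller partyA
      do
        assertMsg "reduction must not exceed notional" (reduceBy < notional)
        create this with notional = notional - reduceBy
```

The real mapping work at DerivHack was of course richer than this, but the shape is the same: the CDM’s types, constraints, rights, and obligations each have a direct DAML counterpart.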
Barclays recently sponsored a two-day event called DerivHack to get feedback on, and ‘kick the tires’ of, the emerging standard. The structure of DerivHack called for each team to simulate the derivatives market by applying the CDM to representative post-trade processing use cases, using sample trade data in CDM format. Out of the 15 teams at the London event, the DA team — working in DAML, of course — was awarded Best Overall Solution. What’s most interesting from the perspective of this blog entry is that it took very little effort for the team to generate the appropriate DAML from the CDM and complete all six use cases (the team in fact completed and presented two additional use cases that streamlined the overall solution).
The DerivHack experience validated that DAML and the DA Platform are a natural fit for rapidly and completely expressing the rules under which markets operate. Other platforms require developers to spend a significant amount of their time worrying about distributed systems, the mechanics of a transaction, and the operation of the ledger; DAML allows developers to focus strictly on business-relevant workflows. As this blog entry attests, it makes a difference: none of the 14 DerivHack teams using Corda or Hyperledger Fabric, DLT platforms without a fit-for-purpose language, were able to complete all of the tasks in time — but the one team using DAML was.
You can read more about the DerivHack experience here.
DAML: interoperability without sacrificing confidentiality
In capital markets, confidentiality of sensitive data is king. Within a market, DAML was architected to put complete control over who sees what data in the hands of the publisher of the DAML libraries. The data itself is kept private to the parties directly involved in a particular transaction, period, and this is enforced by DAML workflows. This extends to Digital Asset ourselves: we are a technology vendor shipping software to other entities, and so we do not see any confidential client data.
A DAML-driven system is built on the concept of standard, shared data formats, shared business logic for managing data changes, and a shared interpretation of instructions. It is this approach to building DLT systems that sets the stage for expanding data consistency and transactionality across markets. When you enable interoperability across markets by sharing workflows rather than data, you keep the scope of data privacy within the control of the participants; operators do not gain access to data unless the specific rules of their market as represented in DAML — and agreed to by all participants — entitle them to it. Sharing workflows does not result in any loss of sovereignty for the participants or operators.
Interoperability can be extended across multiple markets simply by sharing the appropriate DAML libraries!
Join the community and download the DAML SDK at www.daml.com
This story was originally published 13 November 2018 on Medium
About the author
W. Eric Saraniecki, Head of Product, Digital Asset
Eric was part of the founding team of Digital Asset. Previously, Eric was at DRW Trading Group, which he joined in 2006, where he spent more than six years managing a commodities trading desk that he created. Eric is also one of the founders of Cumberland Mining, one of the world’s largest crypto-asset liquidity providers.