Thanks to Alex Felix and Oleg Golubov for challenging me in the process of writing this post and providing comments both useful and difficult.
Picture this: you are having a romantic evening with your lover, but there is one extra person present, and he controls whether you can interact with your date at all. What’s even worse, you can’t have a date at all unless such a person is present, and you even have to pay him for the “service”. Am I describing some crazy dystopian world that has nothing to do with how we live today? On the contrary! This is the reality of business and financial relationships today, because most of them are mediated by paid intermediaries that do exactly this: they enable, and hence can prevent, complicate, or simply diminish the quality of your business relationships.
When Bitcoin was first launched, it was designed to remove such intermediaries from one’s financial relationships. Soon after, it became obvious that the ideas and technologies pioneered by Bitcoin made disintermediation possible globally and across a variety of areas, not just payments. This brings me to the topic of this post: mutualized business relationship models, those in which, like our lovers, the participants may dismiss the mad chaperone and finally be alone.
Simply put, a mutualized business relationship is one where the ultimate participants are appropriately in charge of their interaction. This may have to do with payments, asset transfer, betting, accounting, borrowing, insurance, social networking, and even development of new pharmaceutical drugs. One can imagine a sharing economy system not run by a single huge company such as AirBnB or Uber, but constructed and operated by a large network of participants, customers and providers of services, for their ultimate benefit.
Two factors are driving the inevitable emergence of mutualized business. First, organizations that mediate the relationships of others have incentives that are misaligned with those of their customers. This stalls the improvements in service quality and reductions in service price that would otherwise be the natural trajectory of today’s fast-paced, rapidly evolving markets.
Second, there are functions and services that intermediaries simply cannot provide using traditional business models. For example, digital assets emerge in the decentralized/disintermediated world of crypto precisely because the counterparty risk of a single issuer is not acceptable in many cases. Moreover, the legal and regulatory risks that fall on an intermediary are often more than a single organization can agree to bear.
Current efforts at decentralization follow a similar philosophy, but take it to the extreme. Decentralized systems reject human authority completely, which leads to architectures that are largely impractical. Moreover, radically decentralized architectures often achieve the opposite effect. Rejecting human authority in areas such as recourse and identity leads to systems that cannot be adopted in a compliant way by risk-averse and regulation-minded enterprise customers. To solve this problem, projects are forced to adopt architectures in which centralized hooks must be installed (for example, to reflect the state of a centrally run identity database on the blockchain), effectively reverting the entire system to a centralized operating model.
So to the extent that mutualization may be contrasted with decentralization as it stands today, the term refers to a balanced point on the spectrum between centralized and decentralized systems. Some human authority (recourse, KYC, and reputation are notable examples) is intentionally built into the framework rather than decentralized away completely, but it is limited in a way that improves the system’s social guarantees while precluding the possibility of unilateral damage by a single party.
This post covers two specific trajectories for creating balanced mutualized systems. The first trajectory is one in which a centralized system moves towards greater decentralization to adapt to regulatory pressures, to align incentives between various types of stakeholders and customers, to reduce counterparty risks, or to gain access to digital ecosystems not possible in a fully centralized world. The second trajectory is one in which a radically decentralized system acquires more centralized features for reasons of compliance, practicality, and user experience.
First, let’s look at some notable examples of situations that are ripe for mutualization along the first trajectory (centralized→mutualized).
Example 1. DTCC
The Depository Trust & Clearing Corporation (DTCC) was created to simplify the process of inter-company settlement of securities. Note that DTCC is already somewhat mutualized by virtue of being a “user-owned and directed” organization. Nevertheless, it is clear that DTCC, which settled $1.7 quadrillion in value in 2011, has too much power and introduces too many frictions into the process despite being governed by its (largest) customers. The main reason for this is the centralization of operational responsibility. Any technological change to a complicated system that hosts such a tremendous amount of economic value places an unbearable amount of responsibility on its operators. Any individual responsible for such a system will be inclined to wait and be overly cautious with respect to any improvements, especially invasive ones. In other words, a guarantee of an insidious 0.1% inefficiency will be preferable to an improvement that risks a significant one-time loss. This is a situation where a well-governed blockchain infrastructure will prove to be more, not less, agile than incumbent centralized systems.
Example 2. Sharing Economy
Technology-enabled sharing economy networks like Uber and AirBnB provided a model for a new type of ultra-high-efficiency business. Compared to how taxis and hotels operated before, these organizations were able to provide a better service for a lower price by
(1) sharply increasing the size of the service-provider side of the market;
(2) enabling greater utilization of underutilized resources; and,
(3) inducing service providers to compete with each other on an open market.
These models are the beginning of mutualization and clearly demonstrate its benefits. We should expect this trend to continue. An early indication of the direction towards even greater mutualization comes in the form of sharing economy companies’ attempts to convince regulators that their customers should be able to own shares of the companies.
The goal is to eliminate the incentive misalignment between shareholders and customers on service pricing. Sharing economy companies, once large enough, create a strong network effect that makes them into effective monopolies or near-monopolies. At that point, the pricing will suffer greatly because the companies are governed and operated to the benefit of their shareholders.
In order to fully mutualize a large sharing economy network, one must carefully consider the function it serves. To date, all attempts to create unmediated two-sided markets have resulted in failure everywhere except the dark web and the shadow economy, where there is no other choice. This is because within such systems there must be individuals providing specialized mediation services (conflict resolution, recourse, guidance), which cannot be eliminated or replaced by purely technological solutions.
Would a tour guide in Barcelona own tokens of some decentralized apartment sharing solution?
In such cases mutualization would mean the creation of competitive and appropriately incentivized marketplaces for such auxiliary services. For example, in each city where a Mutualized AirBnB operates, there must be a competitive marketplace for insurance, repair, and assistance services that provide a good experience for travelers and hosts in distress. This leads one to conclude that mutualized sharing economy solutions are not two-sided, but N-sided markets, where auxiliary services form a small but critical part of the process in a fair and productive way.
A depiction of an N-sided market for a sharing economy network (Image courtesy of Michael Zargham)
In the case of traditional sharing economy mediators, such as Uber and AirBnB, the company itself fulfills the need for such auxiliary services. Obviously, its incentives are such that it will provide the minimum necessary service for the network to operate (and to avoid liability — which is why they often offer very good insurance coverage), but will not improve on such services, because they do not, by themselves, generate profit, especially when a company dominates its market. In contrast, one can and should create incentives within a fully mutualized network that will enable small local businesses to provide auxiliary services in a self-sustaining way.
We now turn to the other trajectory (decentralized→mutualized).
Example 3. Mutualized Finance
In crypto over the past two years, we saw the rapid emergence of DeFi — decentralized finance. The term covers a broad range of financial services automated on blockchain, so it warrants some specifics. The two projects that first entered the DeFi market and largely defined it (long before the word “DeFi” was coined) were MakerDAO and the Augur prediction market. Both drew on the blockchain’s ability to (1) cheaply issue fungible and programmable digital assets integral to their use case, and to (2) establish a secondary market ecosystem which enables price discovery and trading of these assets with minimal involvement from the asset creators.
DeFi, as it currently stands, is an extremely compelling proof-of-concept. The market currently includes a broad range of nascent offerings, such as insurance, derivatives, lending products, and exchanges. Two aspects of blockchain enable this market: operational decentralization, which leads to a decrease in counterparty and regulatory risk, and programmable assets, which enable creation of financial contracts that don’t require major technological efforts on the part of every entity that interacts with them.
DeFi is no more than a proof-of-concept because it is currently built on top of technology frameworks that are radically decentralized, and so it suffers from risks endemic to this environment. These include cyberattacks, lost private keys, and compromised wallets, as well as regulatory risks related to KYC/AML requirements that regulators increasingly apply to decentralized systems. In order to become broadly adopted, decentralized finance must account for these requirements and, while preserving the essence of decentralization that enables these assets in the first place, build platforms that deliver highly practical, enterprise-grade solutions. Such improvements would fall along the decentralization→mutualization trajectory.
Decentralization vs. mutualization
In order to be useful, systems must include a variety of human-provided services. In mutualized systems it is expected that some services will be provided by people endowed with a sufficient level of control to fulfill their role. In fully decentralized systems, no human has any more power over the system than any other. In other words, while decentralized systems see decentralization as their very purpose, mutualized systems will prioritize the quality of the service and compromise in favor of centralization in cases where this improves that quality.
Nevertheless, mutualized systems will mitigate centralization risks by marketizing the respective services, as in the example of local tourist offices that provide support to travelers in distress. When a human-based service is provided by a market where multiple entities compete with each other on cost and quality, the outcome is better, cheaper service for the customers.
Software Development as a Mutualized Service
Importantly, one must ensure continuous and uninterrupted funding and management of the teams that are responsible for the core technology in such networks. Our position as investors in the space of decentralized technology affords us great visibility into the protracted struggles that development teams face with respect to both funding and decision-making. This is a large topic that warrants a whole other blog post.
In summary, there are significant ongoing concerns about the soundness of the business models such teams put in place. In some cases, teams run out of the funds received from token sales and cease operating; in other cases, teams issue equity in addition to tokens, which causes significant incentive misalignments between different types of investors. While funding is a well-known issue in the advancement of any open source technology, all of this puts the future of any single decentralization experiment in grave peril.
A mutualized approach should rectify this issue, because it permits greater complexity of multi-sided relationships than fully decentralized systems do. It would treat developers as just another player in the N-sided service market, with development being one of the N sides, properly incentivized and jointly governed by all participants. We see some early attempts at creating markets for development work in projects such as Gitcoin, and I am certain that this type of service will continue to improve and will eventually garner huge demand.
The Anatomy of Mutualization
Let’s now turn to a discussion of what sort of components, organizations, technologies, and efforts will enable the new wave of mutualization, as distinct from decentralization. The core insight is that mutualization requires a hybrid centralized-decentralized functional structure. While all functions that can be provided by software may and should gravitate towards decentralization, the functions that are essentially human in nature can and should remain in the charge of actual people. Where there exists an intrinsic need for centralized human-provided services (software development being a notable and uncontentious example) mutualized organizations should build markets for such services, enabled by transparency and well-thought-through reputation systems. When incentivized by a market, rather than a central mediator, such services arise in ways that are both resilient and fair to all participants.
The dichotomy of human-mediated and technology-mediated relationships.
From our experience with decentralized finance and other types of decentralized services, we have learned that they critically require an existing ecosystem of services, components, infrastructure, and organizations. The components in such an ecosystem once again naturally fall onto either a centralized→mutualized trajectory, or onto a converse decentralized→mutualized trajectory. The former are:
(1) Issuers and exchanges, which enable trading of dedicated tokens of value. This allows token creators to establish the game-theoretical incentives for participants to provide the required services. These tokens must establish a free economy that enables both trading and price discovery.
(2) Auxiliary components, such as wallets, developer tools, webkits, and other software, which enable the developers to focus on the business model of the network, rather than a host of labor-intensive peripheral tasks.
(3) Entities willing to almost immediately run nodes (generalized miners) and other elements of the network’s infrastructure. These entities are incentivized by the network’s ability to issue tokens of value and are willing to take long-term risks and engage in viability analysis for all networks they consider participating in.
On the decentralized→mutualized trajectory lie all the components that bridge the gap between radically decentralized systems and traditional enterprise, allowing decentralized networks to deliver enterprise-grade services. These are:
(1) Legal frameworks, which enable real-world recourse for digitally-mediated relationships, especially if real-world assets are to be represented and traded digitally (see Mattereum).
(2) Identity systems, which ensure reliable identity verification of network participants and implement appropriate privacy guarantees while enabling counterparties, regulators, and law enforcement to find bad actors when required.
(3) Mutualization-aware recourse and arbitration companies, which are empowered via a clear governance procedure to rectify errors and malicious behavior in such networks.
(4) A broad range of support companies, which guide new or inexperienced users when needed, or simply assist people that encounter problems.
(5) Markets for developers, designers, experienced team managers, and other roles, which are critical for software-enabled networks to adjust to the changing conditions, fix problems, grow, and evolve.
(6) And, finally, working approaches to governance and decision-making, which enable a large group of people to jointly make decisions and coordinate around them.
Components that make possible cheap and fast creation of mutualized business models
Most importantly, the emergence of mutualized systems is predicated on technology platforms that answer their broad yet unique needs. Here we must recall that an effort to adopt blockchain technology to improve the efficiency of DTCC was scrapped, citing a lack of benefits as measured against the complexity of the effort. To advance mutualization past the proof-of-concept stage, the tools we use to build such technology must advance significantly.
Most of the existing blockchain and peer-to-peer technology was built under a rather strict decentralization premise, and consequently created operational models too clunky for broad adoption. The most glaring consequence of the radical decentralization philosophy so far has been the lack of recourse by design, leading to stringent security requirements and untenable key management processes on the part of all users. Even if you can avoid being hacked, losing your private key is fatal in most existing decentralized networks — a situation that deters all but the most ardent adopters.
Mutualized systems do not require radical decentralization. A network operated by ten well-known parties may be sufficient for the vast majority of the needs of a mutualized organization. Such a network establishes a social contract much like that of a fully decentralized blockchain (resilience, some censorship resistance, some dispersion of responsibility), and where such social promises are relaxed, the resulting improvement in user experience is significant enough to warrant the compromise.
In fact, even centralized systems can now be engineered in a way that provides social guarantees similar to those of blockchain. A centrally-run server that executes on trusted hardware, such as an Intel SGX enclave, can reliably ensure non-interference into the system’s functioning by its operator, and preserve privacy of participants.
I’d like to propose several ways in which mutualized systems will arise and take hold. These potential trends inform our view as to what we might expect in the future and what products we should support as investors.
Driver 1. Asset Tokenization and Programmable Finance
Observing the progress of DeFi to date, we can’t help but wonder what would drive adoption of programmable finance by large enterprises. After all, programmable digital assets possess unique properties and offer a significant increase in operational efficiency compared to traditional financial contracts. The improvements come from the basic principle that blockchain-based digital assets are write-once-use-everywhere software products. Creating a new financial contract using such automation tools is orders of magnitude cheaper, because it frees all trading participants from developing the same or similar software in order to use the contract. Additionally, the game-theoretical structure of DeFi products and markets enables models that simply can’t exist in traditional finance, a prime example being MakerDAO, which demonstrates that decentralized systems are able to provide loans at significantly lower costs than traditional borrowing.
So if the incentives are all there, what’s the main barrier to adoption? Simply speaking, until decentralized systems provide usability and recourse features appropriate for broad adoption by risk-averse enterprise customers, DeFi will remain firmly within the purview of the few daring experimenters playing with it today. The required features include recourse, enterprise-level identity and KYC solutions, and organization-oriented features such as role-based delegation of responsibility.
Driver 2. Supply Chain Finance and Accounting
Significant inefficiency in global trade stems from the haphazard informational permeability of organizational boundaries. In some cases organizational boundaries are used to hide bad behavior; in other cases they prevent, complicate, or raise the cost of financial audits, which by definition require accurate data from all counterparties in a given set of transactions. In some cases the difficulty of collecting required information arises from the sheer variety and number of in-house systems involved in the process. Sometimes information is intentionally withheld, because organizations are unwilling to expose private data to each other.
In this area mutualized systems enable the best-of-both-worlds approach where data is shared on standard informational rails while preserving privacy when necessary. Additionally, the ability of mutualized systems to create useful economic incentives may be used to reward organizations for sharing, in a way that makes entire markets significantly more efficient.
Driver 3. Capital Markets
The early ICO experiments failed — and for a good reason — but the lesson remains front and center: given access to a large enough pool of investors, capital becomes cheaper. Attempts to give a large global pool of investors access to equity of startup companies have been made before, but the expense of entering a global market for a centralized issuer or trading platform is large. Conversely, decentralized systems made a splash in just this area because they are by definition global. Both demand for capital, and demand for access to investment are strong drivers that will cause this area to develop further as evidenced by the emergence of nascent security token platforms.
Driver 4. Customers Are Tired
Yes, customers are tired. Customers of Facebook are tired of their data being sold to advertisers; patients are tired of being bankrupted by drug manufacturers; homeowners are tired of being screwed by their banks, which, in turn, are tired of the inefficiency of DTCC; courier companies are tired of Amazon; farmers are tired of paying exorbitant insurance premiums, most of which go to the insurance companies, not towards…
Rabbithole Talks is a monthly CoinFund meetup, focused on doing deep dives into innovative token economics and technical design decisions for interesting blockchain-based projects. Past guests have included Doug Petkanics of Livepeer, Hunter Hillman of Connext and Luke Duncan of Aragon. We’re excited to continue the 2019 series with a roster of incredible speakers.
Open Source powers billions of dollars of economic value for the world. Why, then, is Open Source often built on the backs of volunteers? Why does Open Source lack a business model? Blockchain technology has the potential to solve the age-old problem of Open Source Sustainability. In this talk, Gitcoin’s founder, Kevin Owocki, will explore the emergent blockchain Open Source Ecosystem and share insights from the millions of dollars of OSS transactional value that the Gitcoin network has facilitated. Read more on the Gitcoin blog here.
Aligning stakeholders’ interests in an organization is hard. The current fundraising models (ICO or private fundraising) impose significant limitations on the mechanisms available to align stakeholders’ interests. A Continuous Organization (CO) is a new model designed to make organizations more fluid and more robust by overcoming those limitations. Using the Continuous Organization model, organizations can set themselves in continuous fundraising mode while benefiting from solid and flexible mechanisms to align stakeholders’ interests in their financial success. Read more here.
CoinFund is announcing Grassfed Network, an initiative that uses generalized mining and proprietary software strategies to directly participate in decentralized networks.
As its first external collaboration, Grassfed Network and Placeholder are announcing a staking pool — referred to as a Voting Service Provider (VSP) within Decred’s community — to contribute to the security and governance of the network.
Launch of Grassfed Network and Placeholder partnership
We are pleased to announce the launch of Grassfed Network, a CoinFund initiative that directly supports decentralized networks through the development and deployment of software built to gain access to network incentives, aid in network governance, and provide value-additive services and resources.
Grassfed Network engages in generalized mining (also referred to as “keeping”, “supply-side services”, and “mining 2.0”), an activity in which decentralized networks compensate third parties for providing useful services to the network. CoinFund and Placeholder have both spoken previously on the use of generalized mining as a differentiated strategy for cryptofunds to directly engage networks and generate returns. These services include, but are not limited to, transaction processing, staking, software or Merkle Mining, content curation, market making, active governance, and many others.
Via Grassfed Network, CoinFund is presently participating in live networks such as Livepeer, Steemit, Compound, and others. CoinFund is also in the preparatory phases to service the soon-to-launch networks NuCypher and GEO, and is in the process of diligencing a wide set of opportunities to generate service provision rewards.
Additionally, CoinFund is announcing a partnership with Placeholder, a cryptoassets-focused venture capital firm managed by Chris Burniske, Joel Monegro, and Brad Burnham. Earlier this month, Grassfed Network launched a Voting Service Provider (VSP) on the Decred network, which Placeholder will delegate its own voting tickets to, and we welcome other investors to delegate to this pool should it match with their own requirements and governance ethos. Readers can review Placeholder’s investment thesis on Decred here.
The Decred Network and VSPs
Decred was founded in 2015 with the mission to improve upon Bitcoin by building best-in-class governance, security, and funding. Today, the Decred network features a system of checks and balances between users, miners, and developers. The network uses a hybrid Proof-of-Work (PoW) and Proof-of-Stake (PoS) voting mechanism to secure the network and achieve consensus.
The Decred cryptocurrency (DCR) is required to participate in validation of blocks on the Decred blockchain, approve protocol upgrades, and vote on allocation of the network’s treasury. To participate in network consensus, users lock DCR in exchange for digital voting tickets.
Some Decred stakeholders may not have the resources to run a full node or keep their Decred wallet unlocked 24/7. These parties have the option to authorize a VSP to vote their tickets on their behalf, and thus to participate in governance of the protocol without bearing the aforementioned technological responsibilities. Delegators still receive rewards from the protocol for contributing to the network, and the VSP receives a small fee.
Generalized mining as a differentiating strategy for cryptofunds
CoinFund, Placeholder, and other crypto-focused funds have previously written and spoken at length about the various benefits that a generalized mining strategy may bring a crypto-investor.
A strategy of active network participation may represent:
A prospective opportunity to make principal investments productive and accrue network rewards from decentralized networks. At times, network growth is most effectively captured through direct network access and strategic entry points.
An active role in increasing the chances of success for a portfolio network through bootstrapping decentralization, liquidity, or governance.
A higher level of commitment to post-investment services for forward-looking teams seeking highly aligned investors.
An accretive or differentiated risk-return profile.
Messari has compiled an extensive repository of materials on generalized mining here.
CoinFund looks forward to growing its portfolio of generalized mining strategies through Grassfed Network and is excited to work alongside Placeholder in their efforts to support Decred.
Disclaimer: The content provided on this site is for informational and discussion purposes only and should not be relied upon in connection with a particular investment decision or be construed as an offer, recommendation or solicitation regarding any investment. The author is not endorsing any company, project, or token discussed in this article. All information is presented here “as is,” without warranty of any kind, whether express or implied, and any forward-looking statements may turn out to be wrong. CoinFund and its affiliates may have long or short positions in the tokens or projects discussed in this article.
James synthesizes the Ethereum 2.0 roadmap from the point of view of the developer. Ethereum 2.0’s initiatives, which include Proof of Stake using a Beacon Chain, sharding, state rents, and an eWASM VM, will subvert many assumptions, implications, and tooling of writing code on top of Ethereum. James takes us through the likely changes and speculates on how inter-shard communication might work.
In this short article from the Federal Reserve Bank of St. Louis, economists posit that the price of Bitcoin will be somewhere between $0 and a high number — and they’re right. Perhaps best aimed at bank CEO-type cryptocurrency detractors in language they can understand, the summary is that “economic theory predicts that the price dynamic of an unbacked asset is likely to be highly volatile and inherently unforecastable.”
Will Moloch DAO, a new DAO design spearheaded by Ameen Soleimani, be a better model for coordinating developer grants for Ethereum? Simon takes us through the features — and concerns — of Moloch’s design, from the idea of incentivizing defection (and thus creating risks of “collapse”) to feeding the DAO with any manner of ERC-20, a completely novel feature of any DAO yet described.
Some companies from the 2017 ICO boom have been quietly building while others focused on price action. Lithuania’s Mysterium Network, a decentralized VPN product, falls squarely in the former category. In this end-of-year update, Robert highlights some launch successes and modest adoption metrics for the early prototype.
If you’ve never heard of Urbit, you should catch up by reading the 17-year-old project’s pristinely-written primer, which launched along with the network’s Ethereum address management facility, Azimuth, on Ethereum this week. The self-described “personal server built from scratch” might be one of the most streamlined and intentional product launches in the entire decentralization space, and the concept of the Urbit network is a fascinating rabbit hole.
Something interesting is happening over on Augur right now. There is an Augur market that, due to its confusing language, is currently in a shoot-out dispute round which may end up pushing the system to its limit. Given enough capital and patience, a few participants with a lot of REP may be able to push the market to a resolution that is not accepted by the majority of observers. Nick Tomaino covers the question in detail. Here’s a quick summary.
Once a market closes, the amount of ETH locked in the market remains constant, though the rights to those shares may be traded over time.
The market is then sent to the dispute resolution rounds, where REP holders can stake toward the outcome they deem truthful.
Let A(n) = the total stake over all of this market’s outcomes at the beginning of dispute round n.
Let ω = any market outcome other than the market’s tentative outcome at the beginning of this dispute round n.
Let S(ω, n) = the total amount of stake on outcome ω at the beginning of dispute round n.
Let B(ω, n) = the size of the dispute bond needed to successfully dispute the current tentative outcome in favor of the new outcome ω during round n.
Note that the independent variables refer to the beginning of round n.
Each dispute round has a window of ~7 days.
The market is resolved when a dispute round ends with the tentative outcome unchanged; in other words, when the opposing outcome does not meet the minimum bond requirement.
The protocol pays out the REP staked to the losing outcome to the stakers of the winning outcome, pro rata to their REP stake.
The protocol recognizes the outcome as final and pays out the underlying ETH accordingly to the shareholders.
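The bond-sizing rule itself isn’t spelled out above; per the Augur v1 whitepaper it is B(ω, n) = 2·A(n) − 3·S(ω, n). A minimal Python sketch (variable and function names are mine) shows how this sizing targets roughly 50% ROI for the winning side:

```python
def dispute_bond(total_stake, outcome_stake):
    # B(w, n) = 2 * A(n) - 3 * S(w, n)  (Augur v1 whitepaper bond sizing)
    return 2 * total_stake - 3 * outcome_stake

# A single successful dispute: outcome "Y" overturns tentative outcome "X".
s_x = 100                        # REP staked on the tentative outcome X
s_y = 0                          # REP staked on Y so far
total = s_x + s_y                # A(n) at the start of the round
bond = dispute_bond(total, s_y)  # 200 REP needed to flip to Y
s_y += bond

# If Y ultimately wins, Y's stakers split X's losing 100 REP pro rata,
# which is exactly 50% of what they put at risk.
assert s_x / s_y == 0.5
```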
Let’s go through a basic example where we calculate the REP and ETH payout for a particular Augur market dispute. Say there is ~$100 in open interest, with Outcome “X” holding 80% of the market. In this dispute, an initial 10 REP has been staked on Outcome “X”. The disputes go through several rounds until Round 5, where the minimum bond required to flip the tentative outcome is not filled.
Assume an initial stake of 10 REP
According to the formula, the bond sizes are calculated such that the winning outcome should earn ~50% ROI (in REP), as well as the upside of the open interest on the underlying market (in ETH).
REP Payout: 80 REP is paid out to the stakers for Outcome “X”, pro rata.
ETH Payout: market settles for outcome “X” and underlying ETH shares are paid out accordingly (~20% of the $100, pro rata).
How long can these disputes keep flipping? There is a last-resort mechanism known as the Fork Phase, where all markets freeze and the entire universe of REP holders must vote on the market in question to determine the “correct” outcome.
The Fork State initiates once the dispute bond size exceeds 275K REP (2.5% of all REP in existence).
For the market in question, each outcome creates alternative “child” universes, where REP holders can migrate their tokens towards the outcomes they deem true.
The child universe that receives the most REP is considered the “Winning Universe” and its corresponding outcome is deemed the true outcome; all existing markets can only port to this universe.
Because REP tokens may only be used for dispute resolution within the universe they reside in, REP migrated to “losing” universes is effectively non-fungible with REP in the “winning” universe.
The backers of the winning outcome earn (i) the market’s underlying ETH, (ii) the opposing REP staked on the market, and (iii) the price effect of the losing universe’s REP tokens being effectively taken out of circulation.
To The Limit
The House market is a fun one to observe because of how the dispute economics have been revised to deal with situations of asymmetric upside in the underlying market.
If the market were to resolve in favor of the Republicans, the 3% who own the Republican shares would earn the remaining 97% of ETH in open interest (~$679K); conversely, there is a relatively limited 4% of ETH upside for resolving “Democrat” (~$28K).
One could expect that Republican shareholders are therefore more inclined to aggressively stake REP for the Republican outcome. This introduces the intended game-theoretical dynamic — market observers would prepare for such behavior and realize that despite limited upside for the market to resolve “Democrat,” there may be interesting REP returns by outlasting the Republican REP tide.
Below is the projected path for the market resolution assuming disputes continue going back and forth. The market reporter put down an initial stake for the “Republican” outcome (Round 0). Since then, the outcomes have been flipping consistently, with a cumulative 728 REP staked so far (~$7,280 at current REP prices). The subsequent rounds are merely illustrative, and a Fork State would initiate at Round 14 given that the dispute bond filled is greater than 275K REP.
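To see why a fork becomes inevitable if disputes keep flipping, here is an illustrative simulation using the 2A − 3S bond rule from the Augur v1 whitepaper. The initial stake is an assumption, so the round numbers will not match the market above exactly; the point is that bonds roughly double each round:

```python
def dispute_bond(total_stake, outcome_stake):
    # B(w, n) = 2 * A(n) - 3 * S(w, n)  (Augur v1 whitepaper bond sizing)
    return 2 * total_stake - 3 * outcome_stake

def rounds_until_fork(initial_stake, fork_threshold=275_000):
    """Alternate disputes between outcomes X and Y until the required
    bond reaches the fork threshold (2.5% of all REP)."""
    stakes = [initial_stake, 0]   # [outcome X, outcome Y]
    tentative, total, round_no = 0, initial_stake, 0
    while True:
        round_no += 1
        challenger = 1 - tentative
        bond = dispute_bond(total, stakes[challenger])
        if bond >= fork_threshold:
            return round_no, total  # the fork initiates in this round
        stakes[challenger] += bond
        total += bond
        tentative = challenger

# Even from a tiny 10 REP initial stake, geometric bond growth forces
# a fork within a couple dozen rounds.
fork_round, rep_at_risk = rounds_until_fork(initial_stake=10)
assert fork_round == 16
```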
Outcome X = Republican, Outcome Y = Democrat, ignoring the “Tied” and “Invalid” Market Outcomes…
Because the monetary consensus implies the Democratic outcome, a crypto-trader could keep filling the bond for the Democratic outcome with the expectation of ~50% ROI in REP terms. Extending the idea of “Keepers”, lenders may take out their deposits on products like Compound to maximize their returns (50% vs <2%) by taking advantage of this opportunity. In mature crypto markets with many networks and participants, this might exhibit some form of local equilibrium.
Though forking should generally be avoided, it is entirely possible that this market could be taken to that end-game scenario. The variables at play for forking would likely hinge on the expected behavior of exchanges — specifically which child universe they would migrate their REP towards.
Some might bet that most exchanges will not participate in the migration vote, out of risk perception or general apathy. Others might grossly underestimate the role of exchanges in determining on-chain decisions, which would bolster the idea that crypto capital operators exert active network influence, especially in the capital-based security models of the near future (Proof of Stake).
Of course, as this process unfolds, a savvy trader could play the underlying market (volume is high during the resolution periods) as the disputes go deeper and closer to the potential fork state.
Though most of this analysis is speculative, this market is definitely one to watch in the first quarter of 2019. The dispute resolution is currently in Round 4, which you can track in real time. I predict that the dispute resolution mechanism will prove robust enough, and that the market will probably not reach the ultimate fork state, resolving itself before the REP stake becomes exorbitantly expensive (in the six figures).
Disclaimer: The content provided on this site is for informational and discussion purposes only and should not be relied upon in connection with a particular investment decision or be construed as an offer, recommendation or solicitation regarding any investment. The author is not endorsing any company, project, or token discussed in this article. All information is presented here “as is,” without warranty of any kind, whether express or implied, and any forward-looking statements may turn out to be wrong. CoinFund Management LLC and its affiliates may have long or short positions in the tokens or projects discussed in this article.
In July 2018, I published a market map of the various state channel projects in development. In this post, I will provide an overview of where we are with this technology today and a summary of the projects below. In a nutshell, we’ve come a long way this year. While there are still many technical challenges that need to be addressed, teams are diligently working on solutions that are moving the industry towards widespread adoption.
“Wait, what are state channels again?”
To provide an overview, state channels are, at the core, an architecture choice that says “let’s not go to the blockchain if we don’t have to”. They introduce a design that looks at the blockchain as a “judge”, “supreme court”, or “settlement layer”. The main concept is that state channels move transactions “off-chain”. By doing so, state channels minimize the number of transactions that actually have to go “on-chain”. They do this with a comparable level of security to actually being on-chain, since participants could go on-chain whenever they want. More specifically, state channels use the threat of resolving disputes on-chain to create secure and trustless transactions off-chain.
Design choices represent a set of trade-offs. For example, the iPhone and Samsung Galaxy represent two different design choices, the former maximizing usability and UX, and the latter maximizing features. As such, state channels represent a trade-off between speed/finality and liquidity; opening and maintaining a state channel requires capital lock-up for the duration of that channel, but you gain the benefit of instant finality, meaning that as soon as both parties sign a state update, it can be considered final. They also trade off those benefits for “long-tail risk”; in the low-probability event where everyone wants to close their channels at the same time (e.g. someone discovered a bug which caused a reaction akin to a “run on the bank”), the volume of pending transactions on the underlying blockchain might prevent users from safely recovering their funds. Lastly, state channels also represent “philosophical” trade-offs in consensus. While blockchains have byzantine fault-tolerant consensus models that don’t require input from the user, state channels require unanimous consensus and interactivity (i.e. action from the user).
Because of these trade-offs, state channels inherit properties that make them best suited for certain types of use-cases. State channels are most useful when the participants are going to be exchanging many state updates between each other; while there is an initial cost to creating the channel, state updates inside the channel are free and the cost per state update is amortized across the number of transactions. State channels are only able to be used for applications with a defined set of participants; funds are held in multisignature wallets where only the participants in the state channel are the signers to that multisig. Finally, state channels are best for use-cases that benefit from increased privacy; because everything is happening inside a channel between participants, only the opening and closing transactions must be broadcast publicly and recorded on-chain. In practice, this just means that two machines on the internet are sending messages to each other across TCP/IP.
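As a back-of-the-envelope illustration of that amortization (all gas and price figures below are assumptions for the sketch, not measurements):

```python
def amortized_cost_per_update(n_updates, open_gas=250_000, close_gas=100_000,
                              gas_price_gwei=10, eth_usd=100):
    # Only the opening and closing transactions pay gas on-chain;
    # every in-channel state update is free, so the fixed on-chain cost
    # is spread across however many updates the channel carries.
    fixed_cost_usd = (open_gas + close_gas) * gas_price_gwei * 1e-9 * eth_usd
    return fixed_cost_usd / n_updates

# The more updates a channel carries, the cheaper each one becomes.
assert abs(amortized_cost_per_update(1) - 0.35) < 1e-9
assert amortized_cost_per_update(1_000) < amortized_cost_per_update(1)
```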
Put together, many state channel developers see Layer 1 as the Security Layer and Layer 2 as the Scalability Layer. Furthermore, Layer 2 provides lower latency and cost per transaction than would otherwise be possible with Layer 1 solutions beyond a certain level of throughput. In summary, I believe that state channels are a readily-accessible solution for developers looking to build internet-scale decentralized applications.
Payment channel networks: many routes, similar destinations
Because there are very few people with whom you regularly transact, most payments have to go to a new person with whom you don’t have a channel. Payment channel networks, such as Bitcoin’s Lightning Network (discussed further below), make this possible by “routing” transactions between nodes who don’t have direct channels.
The main technique for implementing these networks is the use of “Hashed Timelock Contracts” (HTLCs); however, generalized state channels have introduced alternative constructions for “routing” state.
Routing via HTLCs
If Alice has a channel open with Bob, Bob has a channel open with Charlie, and Charlie has a channel open with Dave, it is possible for Alice to route her payment to Dave via Charlie and Bob. To do this without introducing additional trust assumptions, Alice needs to ensure that Bob or Charlie cannot run away with her money. This attack vector is addressed using HTLCs, which follow a multi-step process to complete the transaction.
In this model, Dave generates a pre-image, R, which is a random number, and shares the hash of the pre-image, H, with Alice. Alice then generates an HTLC with Bob that says “I will pay you 1 BTC if you show me the pre-image, R. If you don’t show me R within three days, I will take back my 1 BTC”. Bob then generates a similar HTLC with Charlie, with the difference that it will refund the 1 BTC after two days. Similarly, Charlie generates an HTLC with Dave, also with a shorter refund period.
HTLCs are generated from left to right (A->B->C->D)
Dave will now show the pre-image, R, to Charlie, who will now pay Dave the 1 BTC. Because Charlie now has R, he will show R to Bob, who will then pay Charlie the 1 BTC. Lastly, Bob will show R to Alice to redeem his 1 BTC. In this way, Alice was able to indirectly and securely send a payment to Dave.
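The hash-lock and the decreasing timeouts are the whole trick; here is a toy, non-cryptocurrency sketch of the mechanics (the string outcomes and day-based timing are simplifications for illustration):

```python
import hashlib

def make_htlc(hash_lock, amount, timeout_days):
    """Toy HTLC: pays `amount` to whoever reveals the preimage of
    `hash_lock` before the timeout; otherwise the sender is refunded."""
    def claim(preimage, day):
        if day > timeout_days:
            return "refund"
        if hashlib.sha256(preimage).hexdigest() == hash_lock:
            return "paid"
        return "invalid"
    return claim

# Dave picks a random preimage R and shares only its hash H with Alice.
R = b"dave-secret"
H = hashlib.sha256(R).hexdigest()

# Timeouts shrink along the route, so each hop still has time to claim
# upstream after paying downstream.
alice_to_bob    = make_htlc(H, 1, timeout_days=3)
bob_to_charlie  = make_htlc(H, 1, timeout_days=2)
charlie_to_dave = make_htlc(H, 1, timeout_days=1)

# Dave reveals R to Charlie; the secret then propagates right to left.
assert charlie_to_dave(R, day=1) == "paid"
assert bob_to_charlie(R, day=2) == "paid"
assert alice_to_bob(R, day=3) == "paid"
# After the timeout, Alice reclaims her coin instead.
assert alice_to_bob(R, day=4) == "refund"
```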
R is revealed from right to left (D->C->B->A)
Routing via “Virtual Channels”
Routing payments could be done in an alternative form via a technique known as “virtual channels”, introduced by Perun (more detail later), which use an intermediary that serves as a “virtual payment hub”. Anyone with a payment channel connected to the hub could establish virtual channels between each other. More specifically, if Alice and Bob both have a payment channel open with Ingrid, then Alice and Bob could establish a virtual channel between themselves. Unlike routing payments via HTLCs, Ingrid does not need to be involved in every payment between Alice and Bob. This property reduces latency and costs, and increases privacy.
To open a virtual channel, Alice and Bob essentially need to “lock-up” a set number of coins from their payment channel with Ingrid. The amount of locked coins will become the value of the virtual channel between Alice and Bob. Notice that Ingrid remains financially neutral by mirroring balances, effectively becoming Bob to Alice, and Alice to Bob.
Z(A) and Z(B) become the balances in Alice’s and Bob’s virtual channel (dotted line), respectively.
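Ingrid’s financial neutrality can be sketched with simple bookkeeping. Nothing below is Perun’s actual protocol; it is a toy model (names and numbers are assumptions) showing that mirroring balances leaves the hub’s total unchanged:

```python
def open_virtual(chan_ai, chan_ib, z_a, z_b):
    # Alice locks z_a; Ingrid mirrors Bob's side by locking z_b.
    chan_ai["Alice"] -= z_a
    chan_ai["Ingrid"] -= z_b
    # Bob locks z_b; Ingrid mirrors Alice's side by locking z_a.
    chan_ib["Bob"] -= z_b
    chan_ib["Ingrid"] -= z_a
    return {"Alice": z_a, "Bob": z_b}  # the virtual channel's balances

def close_virtual(chan_ai, chan_ib, final):
    # Apply the virtual channel's final split to both underlying channels.
    chan_ai["Alice"] += final["Alice"]
    chan_ai["Ingrid"] += final["Bob"]
    chan_ib["Bob"] += final["Bob"]
    chan_ib["Ingrid"] += final["Alice"]

chan_ai = {"Alice": 10, "Ingrid": 10}   # Alice <-> Ingrid channel
chan_ib = {"Ingrid": 10, "Bob": 10}     # Ingrid <-> Bob channel
virtual = open_virtual(chan_ai, chan_ib, z_a=5, z_b=5)

# Alice pays Bob 3 inside the virtual channel; Ingrid is not involved.
virtual["Alice"] -= 3
virtual["Bob"] += 3
close_virtual(chan_ai, chan_ib, virtual)

# Ingrid ends exactly where she started: financially neutral.
assert chan_ai["Ingrid"] + chan_ib["Ingrid"] == 20
```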
This technique could be reapplied to increase the length of the channel. Because a virtual channel is an instance of an off-chain contract, increasing the length does not require another transaction on-chain.
Bob becomes the “Ingrid” between Alice and Charlie.
Virtual channels also represent a different business model from channel routing. Lightning Network HTLCs have a “pay-per-payment” fee model since you need to incentivize each intermediary to route the payment. Virtual channels, on the other hand, have a “rent-a-path” fee model. In this model, an intermediary acts as a virtual payment hub that has direct channels with multiple parties. If Ingrid is the intermediary, then Alice and Bob pay Ingrid to keep the channel open for a certain period of time. It’s worth noting that such a model might have better economics for high-volume micropayments.
Routing via “Meta Channels”
This technique, developed by Counterfactual (more on the project below), achieves a similar goal to Perun’s virtual channel network (i.e. participants without a direct channel could interact), but with differences in construction. In the Counterfactual design, Alice & Ingrid, and Ingrid & Bob each have a generalized state channel open, each of which has a payment channel instantiated. Alice and Bob then create a counterfactually instantiated payment channel object owned by themselves, which could be called “O”. Lastly, two proxy payment counterfactual objects are created, one each in the Alice-Ingrid and Ingrid-Bob channels, that have state deposits (a, b) assigned to them, and that observe O.
Suppose the Alice-Bob payment counterfactual object has state (a, b) representing balances of Alice and Bob, respectively. The proxy payment object on the left assigns a to Alice and b to Ingrid, and the proxy payment object on the right assigns a to Ingrid and b to Bob. That way, Ingrid always has a + b assigned to her, remaining financially neutral.
Counterfactual uses the same “assign Alice’s balance to the left and Bob’s balance to the right” to generalize this to intermediary chains of arbitrary length.
While Perun and Counterfactual have introduced novel ways of routing payments across multiple intermediaries, HTLC-based routing is currently the only implementation on blockchain mainnets. We need to wait and see the effectiveness of these new designs as implementations come to market.
Stale state & unavailability griefing
The primary attack vector that state channels face is “stale state griefing”, which occurs when a malicious actor posts an outdated state to the blockchain.
The “challenge period”: a partial solution
Let’s take the case of a state where (Alice: 10 ETH, Bob: 0 ETH). Bob might be incentivized to publish the prior state, which is (A: 5 ETH, B: 5 ETH). To address this scenario, the closing process for state channels involves a “challenge period”, which gives Alice the chance to post a later state to the chain. In practice, each state update contains a “nonce”, essentially a counter that is incremented with each update. If Alice submits a state with a higher nonce that is also signed by Bob, it will supersede Bob’s posted state. It will also restart the challenge period timer, giving Bob a chance to submit a newer state.
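The on-chain dispute logic just described can be sketched as a toy adjudicator (a simplified model, not any project’s actual contract; signature checking is reduced to a boolean):

```python
class ChannelAdjudicator:
    """Toy dispute logic: a higher-nonce, co-signed state supersedes
    any previously posted state and restarts the challenge timer."""
    CHALLENGE_PERIOD = 7  # days (illustrative)

    def __init__(self):
        self.best_state = None
        self.deadline = None

    def submit(self, state, now):
        # state = {"nonce": int, "balances": dict, "signed_by_both": bool}
        if not state["signed_by_both"]:
            return False
        if self.best_state is None or state["nonce"] > self.best_state["nonce"]:
            self.best_state = state
            self.deadline = now + self.CHALLENGE_PERIOD
            return True
        return False

    def finalize(self, now):
        # Payouts happen only after the challenge period elapses.
        if self.deadline is not None and now >= self.deadline:
            return self.best_state["balances"]
        return None

adj = ChannelAdjudicator()
# Bob posts a stale state at day 0...
adj.submit({"nonce": 4, "balances": {"Alice": 5, "Bob": 5},
            "signed_by_both": True}, now=0)
# ...and Alice counters with the later state during the challenge period.
adj.submit({"nonce": 9, "balances": {"Alice": 10, "Bob": 0},
            "signed_by_both": True}, now=3)
assert adj.finalize(now=5) is None                      # timer restarted at day 3
assert adj.finalize(now=10) == {"Alice": 10, "Bob": 0}  # Alice's state wins
```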
Introducing third parties to address the “always online” assumption
An issue with the challenge period is the assumption that Alice will be online to submit her state. If Alice is offline or is DDoS’d, she will be unable to post the finalized state and Bob will succeed in cheating Alice out of her money.
The main category of solutions to this issue is to introduce a third party that acts as a safeguard against these scenarios. This means that Alice could hire a third party to submit a later state in the case where she’s offline. Each of these designs has various benefits and drawbacks.
Lightning Network “Monitors” / “Watchtowers”
The “Monitor” was initially designed for the Bitcoin Lightning Network. The concept is to hire full nodes to watch the blockchain for fraudulent transactions. The participant in a channel outsources the monitoring to multiple third parties by sending consolidated transaction data to them. If the Monitor catches the other party trying to cheat, they could publish a “fraud proof” and earn a reward. One issue with this is that there is limited incentive for the Monitor to participate because fraudulent transactions are quite rare and channels are meant to be open for long periods of time. The other issue is that each Monitor has to maintain a full history of the chain, which is not scalable.
Pisa is a project which proposes a model with lower storage requirements and additional incentives. In this model, there is a third party called the “Custodian”, which is required to put down a large security deposit to get the job. If the channel participant proves that the Custodian did not do its job correctly, the Custodian loses its deposit. One issue with this design is that it leads to additional centralization, since there is a large collateral requirement for Custodians who want to service multiple channels. Another issue is that this doubles the liquidity cost of maintaining a state channel.
Celer “State Guardian Network”
Celer Network proposes a “State Guardian Network (SGN)”, which has a side chain construct similar to Plasma. Similar to Pisa’s deposit model, State Guardians must stake Celer tokens to participate in the network. When the user goes offline, they submit their state and payment to the SGN. This addresses the liquidity problem because guardians stake Celer tokens rather than locking up additional channel collateral, and addresses the pricing issue by creating a market for guardian services. One argument against this model is that it introduces additional trust assumptions: users now need to trust the side chain and its actors.
These solutions are relatively early in their development and will require a robust third-party economy to produce the intended effects at scale, which presents a large opportunity for layer 2 service provisioning.
Layer 1 capability limits Layer 2 complexity
State channels are only as good as the underlying blockchain, meaning that an application that is unreasonable to build in Layer 1 will also likely be unreasonable to build in Layer 2. If we take a complex two-person game like battleship and build it on layer 2 state channels, it opens up an attack vector where the cost of disputing state and having the code execute on-chain (e.g. $100) could be more expensive than the initial bet for playing the game (e.g. $10). Patrick McCorry summarizes this dilemma with the following question:
If the player is about to win a $10 bet, but the counterparty has stopped responding in the channel, then is it worthwhile for the player to turn off the channel, complete the dispute process, re-activate the application and win the bet via the blockchain if this process costs $100?
In Patrick’s experiment of building a two-person battleship game on Ethereum state channels, the cost of executing the smart contract on-chain was ~12 million gas, which is more than could fit inside a block (currently ~7 million)!
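Patrick’s question reduces to simple arithmetic; the sketch below uses the battleship figures from the text plus illustrative gas-price and ETH-price assumptions:

```python
def worth_disputing(payout_usd, dispute_gas, gas_price_gwei=100, eth_usd=100):
    # Disputing is only rational if the prize exceeds the cost of
    # executing the application's logic on Layer 1. The gas price and
    # ETH price here are illustrative assumptions.
    cost_usd = dispute_gas * gas_price_gwei * 1e-9 * eth_usd
    return payout_usd > cost_usd

# A $10 battleship bet vs. ~12M gas of on-chain dispute execution:
# not worth it -- the unresponsive counterparty wins by attrition.
assert not worth_disputing(10, dispute_gas=12_000_000)
```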
For Ethereum specifically, off-chain computation is still limited by the EVM. Web Assembly (WASM) will allow for the execution of off-chain contracts directly within the end-users’ browser, which will improve execution speed, enable interoperability between other WASM-based blockchains, and reduce reliance on Infura, which is increasingly becoming a centralization risk for the ecosystem.
Applications: Gaming, gambling, and micropayments
The majority of payment channel networks will be used for, well, payments. P2P is the most prevalent use-case, but several projects, such as Althea Mesh and Machinomy, are tackling M2M payments as well. From a B2C perspective, virtual payment channel hubs are being used as a solution for content micropayments, such as PopChest for videos and SpankChain for adult entertainment.
Many state channel infrastructure projects, such as Connext, Kava, and Sprites, are primarily being used to improve the performance and UX of payment channel networks. That said, state channels have also emerged as a popular scalability solution for gaming and gambling use-cases, with projects like FunFair, Finality Labs, and Horizon Games implementing the technology.
As projects increase their focus on user adoption, I think we will continue to see the proliferation of two-party games built on state channels.
Firmly on the road to user adoption
Projects are continuing to #BUIDL. Many have launched on mainnet in 2018, and the others are expecting to do the same within the next year. Here is a more detailed summary of the projects building or incorporating state channel technology:
Direct Payment Channels
SpankChain
SpankChain is an adult entertainment ecosystem powered by blockchain technology. The project is one of the earliest adopters and developers of state channel technology. SpankChain raised $6.5 million in their ICO in November 2017 and, in April 2018, shipped the first production implementation of Machinomy’s unidirectional payment channels on the Ethereum mainnet. It also released, in partnership with Finality Labs, a generalized state channels proof-of-concept in March 2018, and launched, in partnership with Connext and Kyokan, the first non-custodial payment hub on mainnet in September 2018.
Stack is a cryptocurrency that uses state channels for real-time point-of-sale transactions. To make a purchase, their ERC-20 token (STK) provides access to a payment channel between a crypto wallet (i.e. users) and a third-party liquidity provider (i.e. Stack’s own wallet), which then converts and pays the merchant in fiat via existing payment rails. It’s also working on “multi-token channels”, which allow an existing payment channel to create “sub-channels” that support additional ERC-20 tokens. The project raised $17mn in their ICO in December 2017 and is currently on the Ethereum testnet.
Commonwealth Crypto is building a way of doing cross-chain atomic swaps at centralized cryptocurrency exchanges. They also allow traders to maintain custody of their coins while trading via “escrow-backed trading”, which is implemented through the use of unidirectional payment channels. The project started in early 2017 and closed a $1.5mn seed round later that year.
Popchest is an Ethereum-based video distribution platform (think “YouTube on the blockchain”) using micropayments and token incentives instead of relying on advertising and paid subscriptions. Micropayments are currently implemented via Machinomy’s open sourced unidirectional payment channels.
Payment Channel Networks
Lightning Labs
Lightning Labs is the company behind Bitcoin’s Lightning Network, a bidirectional, HTLC-based payment channel network. The project was founded in 2016 and has been on the Bitcoin mainnet since May 2018, when it raised a $2.5 million seed round. As of December 2018, the network has over 4,000 nodes and 12,000 payment channels. The protocol continues to explore innovative upgrades, such as multi-party payment channels based on the idea of “channel factories”, which could achieve off-chain payment routing similar to Perun’s virtual state channels or Counterfactual’s metachannels.
Liquidity Network is a payment channel network that uses payment hubs to enhance liquidity. Members of a payment hub can pay any other member with the allocated funds; those funds are not locked between only two users, but instead are accessible to all users on the same hub. The project raised $23 million in their ICO and has been active on the Ethereum mainnet since June 2018.
Raiden Network is Ethereum’s version of Bitcoin’s Lightning Network. It is a bidirectional, HTLC-based payment channel network. The main difference is that Raiden uses a token to pay for services, such as path finding or channel monitoring, within the network. The project raised $33 million in November 2017 and launched its mainnet in December 2018.
Trinity Network is NEO’s version of Raiden, though the project has started developing on Ethereum and Zilliqa as well. It is an HTLC-based payment channel network with a native token that’s used to pay for fees and services. Trinity raised $20 million in its ICO in January 2018, and is currently in testnet.
Althea Mesh is a project that aims to replace centralized ISPs with a competitive market of individuals and businesses participating in one decentralized network. Each node on the network establishes payment channels with each of its neighbors, allowing users to send and receive Ethereum micropayments for forwarding packets. The project started in mid-2017 and currently has two small live mesh networks in Colombia and Oregon.
Teechain is a payment channel network that’s built using Trusted Execution Environments (TEEs), specifically Intel SGX hardware enclaves. At the cost of introducing an additional trust assumption (i.e. trusted hardware), this design allows for higher throughput, asynchronous blockchain access, faster channel creation, and lower collateral costs relative to non-hardware-based approaches. The project published their whitepaper in July 2017 and is currently live on the Bitcoin mainnet.
Kava is leveraging the work of Cosmos and Interledger to build a fast-finality blockchain for interoperable payment channel networks. Their initial implementation uses unidirectional payment channels, which can be opened by a sender and closed immediately by the receiver, or by the sender subject to a dispute period. The project started in mid-2017 and is currently in public testnet.
Sprites is a research paper that proposes a payment network in which payment channels are derived from a more general state channel construction. The paper claims several improvements over Lightning Network and Raiden, specifically around reduced collateral costs and improved throughput. Enuma Technologies received a $200K Ethereum Foundation grant to implement their state channel construction.
Direct State Channels
FunFair
FunFair, a casino on the blockchain, is one of the earliest projects implementing state channel technology. It uses “Fate Channels”, which are state channels with the added ability to verify a progressive reveal scheme by both parties, advancing a deterministic (“fated”) but unpredictable sequence of random numbers. The project raised $26 million in their ICO in June 2017 and has been on mainnet since May 2018.
Aeternity is a smart contract platform (i.e. a new blockchain) with native support for two-party generalized state channels. It also introduces the concept of a “snapshot”, which addresses the data unavailability problem by allowing a recent off-chain state to be recorded on-chain. After its inclusion, the channel cannot be closed using an older state than the one provided in the snapshot. The project raised $24 million in June 2017 and launched their mainnet in December 2018.
Magmo is a team of researchers working on state channels for Ethereum. They have developed the “Force-Move Games” framework, which is a (not fully general) state channel framework designed to support turn-based games whose moves don’t depend on time or data that is external to the channel (e.g. chess, rock-paper-scissors). Magmo is supported by the Ethereum Foundation and L4, and collaborates closely with the Counterfactual team.
State Channel Networks
Connext
Connext is a layer two scaling platform that uses an implementation of Perun’s virtual channels to offer payment hub infrastructure for Ethereum projects. Within a Connext Hub, users can trustlessly pay any other user of the Hub without needing to pay gas or wait for block confirmation. The project’s first hub went live on mainnet in September 2018, and the team recently received…
The growing demand for decentralized network data: a survey and some predictions
Network participants need easily accessible protocol data — and today that data is difficult to aggregate and understand
When I first entered the crypto space, I was shocked by the lack of query and analytics tools that were available to understand basic metrics around decentralized protocols and applications. There were very few, if any, tools that could show me metrics like number of active users, the average gas cost to call a particular function, or the most actively used modules for a given decentralized application (“dapp”).
In a previous role as a product manager for BlackRock’s proprietary asset management platform, I relied heavily upon my ability to query and analyze key product metrics and market indicators on a daily basis. When I transitioned to a role in the crypto space and no longer had these tools, I felt like I was flying blind.
For years, there have been platforms available in the cryptocurrency space to help investors analyze and dissect market data like price, trade volume, and circulating supply. But, when I was starting out, I was more concerned with another set of metrics. Despite the fact that blockchains make their underlying data publicly available by design, there was still a shortage of tools displaying that information in a real-time, user-friendly way.
As a growing number of networks requiring third-party participation have begun to hit mainnet, the demand for these tools has grown beyond development teams to include the larger community of participants in networks. This group includes the set of users that support a network, providing the computing resources, storage, validation, curation, and delegation necessary for the network to succeed.
A few examples of participants and the data they may need include the following:
A Keeper on the Maker network may want to keep track of the number of CDPs under a given collateralization ratio, so it may shore up enough DAI to take advantage of potential bite opportunities in the future
A Decred DCR token holder deciding where to delegate their voting power may want to review the uptime, fees, and total number of users currently delegating to a set of Voting Service Providers (VSPs)
A Transcoder for the Livepeer network may want to track the sensitivity of delegation to changes in fee structure, while also keeping an eye on the network participation rate and the growth in the number of active transcoders
Decentralized network data dashboards and analytical tools are growing in number and kind, with varying degrees of customizability
Below, I have mapped out a few kinds of tools available today, beginning at the most basic and moving to the most customizable.
Block explorers: these tools are the most basic, and the first to pop up when a new protocol is deployed. A few examples include Etherscan and Ethstats for Ethereum, Tzscan for Tezos, and bloks.io for EOS.
Network-specific dashboards: over time, development teams, Staking-as-a-Service companies, and third-party network supporters began to build out network-specific dashboards. These offer an array of key metrics, often with some added filtering and query functionality. A few good examples are the Livepeer Transcoder Dashboard, mkr.tools, DCRStats, the Hubble Cosmos Dashboard (built by Figment Networks), and Supermax, which covers key statistics on a wide variety of networks.
Supermax provides users with macro-level statistics on a variety of decentralized networks, including Livepeer. Credit: https://supermax.cool/livepeer
Developer dashboards: as dapps have approached mainnet, a set of tools built with the intention of monitoring, growing, and retaining users (among many other things) has emerged. These offer views of core network statistics with an added degree of customization specifically for development teams. Two examples of teams working on these tools are TRM Labs and Dune Analytics.
Index & Query: this category includes a set of protocols and interfaces that allow users to directly query and parse blockchain data just as one would a more traditional database. A few of these tools are The Graph, Fluence, BigChainDB, and Google BigQuery. Notably, some of these tools (including The Graph and Fluence, currently in development) plan to offer decentralized index and query solutions.
Evolved, but still room to grow: a showcase of mkr.tools
I should note that some protocol dashboards have been around for some time now, and have evolved significantly to fit the needs of their users. Mkr.tools is one great example here.
About a month ago, I began to take a closer look at Maker Collateralized Debt Positions (CDPs). DAI had maintained its stability amidst the recent market downturn, and I wanted to understand the lifecycle of a CDP in more detail. Any team or person interested in becoming a Keeper on the Maker platform would likely go through this same analysis prior to doing any work on the network.
Initially, I had to aggregate data from a few different sources in order to get a complete picture. These included:
Mkr.tools for a macro-level view of a given CDP
Etherscan for the more detailed set of transactions making up each action taken on a given CDP
GDAX for historical ETH prices (this source was replaced, over time, by direct calls to the Maker price oracle)
Various Maker smart contracts
The process of analyzing one simple CDP would take me at minimum a few minutes, and up to an hour for older, more complicated CDPs.
I was thus thrilled when mkr.tools was recently updated with improved tooling. Notable enhancements include the ability to track historical ETH prices, collateralization ratios, and liquidation prices for a given CDP over time. After these updates, I might still open Etherscan to understand a certain action in more detail, but I no longer find it as necessary. A CDP can now be reviewed on a single page in a matter of minutes.
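The two per-CDP figures mentioned above are straightforward to compute by hand. The sketch below assumes single-collateral Maker’s 150% liquidation ratio; the sample CDP numbers are made up for illustration.

```python
# Back-of-the-envelope versions of the two CDP figures mkr.tools charts:
# the collateralization ratio and the ETH liquidation price.

LIQUIDATION_RATIO = 1.50  # single-collateral Maker's liquidation threshold


def collateralization_ratio(eth_locked, dai_drawn, eth_price_usd):
    """Collateral value (USD) divided by DAI debt."""
    return (eth_locked * eth_price_usd) / dai_drawn


def liquidation_price(eth_locked, dai_drawn, ratio=LIQUIDATION_RATIO):
    """ETH price at which the CDP falls to the liquidation ratio."""
    return (dai_drawn * ratio) / eth_locked


# Hypothetical CDP: 20 ETH locked, 1200 DAI drawn, ETH at $130
eth_locked, dai_drawn = 20.0, 1200.0
print(round(collateralization_ratio(eth_locked, dai_drawn, 130.0), 2))  # 2.17
print(liquidation_price(eth_locked, dai_drawn))  # 90.0 USD per ETH
```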
To help improve the onboarding experience of all potential network participants, I hope to see mkr.tools and similar dashboards continue to develop. In the future, Mike McDonald and other contributors plan to build out mkr.tools to include statistics on DAI usage in individual DeFi apps, among other things.
I should note that it is possible to build a set of Python scripts that interact with Maker’s pymaker API and pull out a more customized set of data. But for the average user new to the Maker project, this requires a lot of time, effort, coding expertise, and a high level of familiarity with Maker’s code. Publicly available dashboards democratize this information, improving accessibility for a wider swath of users.
Some thoughts on value creation
If a decentralized protocol relies upon active participation from third parties, it is in that protocol’s best interest for its data to be open sourced and easily accessible. If stakers, resource providers, delegators, and content curators are not able to effectively and efficiently quantify the risk and return profile of participating in a specific network, they will likely pass on that network in favor of another.
Abstracting away from the more detailed layers of the data query and analytics stack, let’s take a look at two core components:
The index and query module that allows protocol data to be quickly and easily parsed by a user
The user interface displaying that data, sometimes with proprietary analytics layered on top
Both of these components will be able to create and capture value, though in very different ways.
The Graph is an example of an index and query module. In addition to serving up real-time network data to dapps themselves, tools like The Graph can help inform network participants in two ways: (1) by supplying metrics to network-specific dashboards, and (2) by allowing participants to query the protocol directly, in order to access metrics not otherwise available or to inform proprietary strategies. A huge amount of value is generated in mapping protocol data to a format that is easily indexed and queried.
Now let’s look at the user interface layer. Protocol development teams and supporters are likely to continue to build various open source dashboards and tools for the benefit of the network, at least in the short to medium term. However, due to the transparency of network data and the propagation of tools to understand that data, they are not likely to capture much value from those efforts — the ability to replicate certain views will become easier over time.
In order to build a high-margin business providing protocol-specific dashboards for development teams, Staking-as-a-Service companies, and crypto-funds (a “Bloomberg for decentralized networks”), a team would need some proprietary, off-chain data or predictive analytics to differentiate themselves. A good example of a team that has done this today for identity, compliance, and fraud is Chainalysis. That being said, I am excited to see which teams are up to the challenge.
In conclusion: with more data, active network participants can make more educated decisions about how to efficiently grow the networks they support. Decentralized networks will, in turn, compete on protocol design to incentivize users to join their network over others. I am excited to become an active user of the tools being developed to analyze protocol data, and to see where this wave of transparency takes the space as a whole.
Disclosure: At the time of publication CoinFund and its affiliates hold interests in MKR, LPT, and The Graph.
Thanks to the CoinFund Team, Ria Bhutoria, and Viktor Bunin for thoughts and feedback on this post.
Disclaimer: The content provided on this site is for informational and discussion purposes only and should not be relied upon in connection with a particular investment decision or be construed as an offer, recommendation or solicitation regarding any investment. The author is not endorsing any company, project, or token discussed in this article. All information is presented here “as is,” without warranty of any kind, whether express or implied, and any forward-looking statements may turn out to be wrong. CoinFund Management LLC and its affiliates may have long or short positions in the tokens or projects discussed in this article.
Decentralized governance today: A case study on AGP-1
In this post I review Aragon’s 0.6 release, as well as the results from a vote on a recent network Governance Proposal. I also discuss some factors that I believe will impact the health of decentralized organization governance in the future.
Aragon: An overview
The Aragon Project was founded in November 2016 with the goal of empowering freedom by creating tools that allow decentralized organizations to thrive. The project aims to remove the frictions that exist in setting up and running more traditional organizations. These could, for instance, take the form of the paperwork and bureaucracy that get in the way of a team spending their days building a budding startup’s core product.
The project’s most recent 0.6 release allows users to create Aragon organizations on mainnet. An Aragon organization runs entirely on Ethereum and offers basic functionality for running a decentralized organization, including the ability to create and manage an organization-specific token, vote on proposals (including, for example, whether to add a person to the organization), and coordinate finances. The organization can take one of two forms: (1) a general purpose democracy or (2) a more business-oriented multi-signature entity.
In the future, new Aragon applications offering additional functionality will become available. These include Planning, Payroll, Espresso (a collaborative data vault), and Liquid Democracy (for delegation). As of December 10th, 177 DAOs have been formed on mainnet. This includes 47 multi-sigs and 129 democracies.
A review of AGP-1 and some factors impacting healthy decentralized organization governance
Shortly following the 0.6 release, the Aragon team held a vote for Aragon Governance Proposal 1 (AGP-1). AGP-1 passed, putting in place a formal proposal and voting process for changes to the Network.
However, what is most interesting about this vote is the turnout. For one, the proposal passed with a staggering 99.97% majority. Yet only 2.6% of the total ANT supply participated in the vote, and 57% of participating tokens were held by a single address. The Aragon team itself has admitted that there is considerable room to improve turnout, and a discussion within the Aragon Forum has begun to explore approaches to achieve this.
The crypto community should recognize that there are a few major endogenous and exogenous factors that are likely behind these results.
A few of these endogenous factors include:
Lack of skin in the game. If a proposal has low stakes, as may have been the case for AGP-1, voters may not necessarily feel compelled to spend the time and effort to cast their vote. Users may not have been aware of this specific vote, or may have agreed with the proposal but assumed it would pass without their input. Ensuring high turnout will require clear communication about proposals and parameters like voting start and end dates. Building in additional incentives, like token rewards for active voters, or an Aragon Community Token (ACT) as proposed in the previously mentioned Aragon Forum discussion, could also improve participation.
Unsuitable voting parameters. The vote for AGP-1 lasted 48 hours and required 67% support. Over this two-day period, ANT token-holders may have been too busy with work obligations or social events taking them away from their laptops. For important votes, 48 hours may not be enough. The 67% support bar, on the other hand, was easily met. This number could be increased for AGP votes in the foreseeable future, especially if these proposals continue to see low voter turnout (and thus a higher likelihood of whales swaying the outcome). One way to improve turnout without adjusting these parameters is delegation; Liquid Democracy, an Aragon app which will allow for this, is already in the works.
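As a rough illustration of how these parameters interact, the sketch below checks a vote outcome against a support requirement and an optional turnout quorum. The 67% support figure comes from AGP-1; the vote counts and total supply are hypothetical numbers chosen to resemble the ~2.6% turnout and 99.97% majority reported above, and the separate quorum parameter is my own generalization.

```python
# Hedged sketch of an AGP-style vote check. Token amounts are invented.

def vote_passes(yes, no, total_supply, support_req=0.67, min_turnout=0.0):
    """Pass iff turnout meets the quorum and yes-share meets the support bar."""
    cast = yes + no
    turnout = cast / total_supply
    support = yes / cast if cast else 0.0
    return turnout >= min_turnout and support >= support_req


# AGP-1-like numbers: ~2.6% turnout, ~99.97% support
print(vote_passes(yes=1_039_688, no=312, total_supply=40_000_000))  # True

# The same vote would fail under a hypothetical 10% turnout quorum
print(vote_passes(yes=1_039_688, no=312, total_supply=40_000_000,
                  min_turnout=0.10))  # False
```

Raising either parameter trades decisiveness for legitimacy, which is exactly the tension the Aragon Forum discussion is wrestling with.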
The exogenous factors I see mostly revolve around the Ethereum network. These include:
Ethereum network congestion. Development teams may want to experiment with scalability solutions to handle functions which should remain cheap and fast, like voting.
Poor integration between wallet and browser. Over the past few years, my peers and I have experienced considerable latency and complete transaction failures when using MetaMask with some browsers. Usability here has generally improved and will continue to do so, but any outstanding infrastructural frictions could impede voter turnout.
Lack of Ethereum network knowledge and setup. Although less likely to be an issue today, a larger, more diverse group of future ANT token-holders may not be set up to use MetaMask. General Ethereum education will be a foundational requirement for a healthy decentralized organization governance process.
Ultimately, the implicit requirement that voter turnout statistics be made public is one of the benefits of DAOs. Low voter participation and whale voters cannot be hidden in a transparent governance process. And the Aragon team’s willingness to admit to and discuss governance issues with their broader community will help to promote positive change in the future. The next Aragon vote will take place in January 2019. I look forward to seeing whether the team makes any changes to its voting process to improve turnout, and to reviewing the results.
If DAOs are to fulfill the vision of organizing decentralized groups via fair governance processes, protocol development teams must continue to monitor usability and to build systems that encourage high voter turnout. The ability to iterate on a governance system at all, patching bugs and introducing new functionality over time, is revolutionary. Aragon has certainly led the way in showing what is possible. I will also note that there are a number of other projects in the process of building out decentralized organization and governance tools, including DAOstack and Common Interest.
I hope that Aragon and other protocol development teams continue to publish and analyze the results of Governance Proposals, and to promote discussion on ways to improve their governance processes. I am excited to watch — and participate in — the evolution of these organizations in the future.
As decentralized staking networks continue to develop and proliferate, it is interesting to consider their impact on other areas of the crypto economy. One area where the vocation of staking is bound to have an impact is cryptoasset borrowing.
Today, the foremost examples of staking networks in production include Tezos, Livepeer, and even SpankChain. These networks rely on staking to provide security and governance. Staking also offers an antidote to high token velocity (if that phenomenon ever manifests in actual cryptoeconomics): by taking assets out of circulation, it addresses the main foil of localized digital currencies and payment tokens.
Staking networks today are being bootstrapped by early investors and funds who typically hold sizable network ownership based on investment in early project stages. But most funds find it difficult to technologically engage networks, especially large numbers of them, and look to service providers for security, custody, and domain-specific utilization of assets across systems. However, staking service providers themselves don’t hold early ownership of networks in general. They find themselves in a position where they must rely on delegation of assets (a limiting factor, since delegation is not available on-protocol in all networks) or fund and investor relationships for access.
However, as lending platforms and protocols come online, staking companies have new options for commanding larger stakes. Borrowing assets for the purpose of staking will serve two key purposes in the staking space: (1) access to the network, as most networks will require using tokens to perform work or run functional nodes; and (2) ability to create asset-neutral staking positions.
Staking networks are great: they create a new source of returns backed by the real economies of decentralized network participants. But while most opportunities fall into a 5–20% return range in token-denominated terms, the fiat-denominated returns are much more volatile; they depend more on market sentiment and network growth than on the quoted token rate of return.
Borrowing tokens on decentralized protocols may in the future provide a simple way to lower fiat-denominated downside while staking operations are taking place. Suppose you wanted to earn a return on a staking network which offers a token-denominated 5% rate of return without being exposed to its fiat-denominated volatility. If you can borrow the network asset at a 1% token-denominated rate, you can stake it in the network, earn 5%, and then keep 4% for yourself. Borrowing the asset has a similar effect to shorting, neutralizing the long (fiat market) exposure to the asset.
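The arithmetic of this carry trade is simple enough to sketch. The 5% staking rate and 1% borrow rate below are the illustrative figures from the example above; all rates are annualized, token-denominated, and simple (no compounding).

```python
# Minimal sketch of the borrow-to-stake carry: borrow the network asset,
# stake it, repay the borrow interest, and keep the token-denominated spread.

def staking_carry(staking_rate, borrow_rate):
    """Net token-denominated return from staking borrowed tokens."""
    return staking_rate - borrow_rate


def tokens_earned(principal, staking_rate, borrow_rate, years=1.0):
    """Tokens kept after repaying borrow interest (simple rates)."""
    return principal * staking_carry(staking_rate, borrow_rate) * years


print(staking_carry(0.05, 0.01))          # ≈ 0.04: keep 4% in token terms
print(tokens_earned(10_000, 0.05, 0.01))  # ≈ 400 tokens on a 10,000-token stake
```

Because the position is funded in the network’s own token, the 4% carry is earned regardless of where the fiat price of the asset goes, which is the sense in which the position is asset-neutral.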
Further opportunities are available to curb the fiat-denominated volatility of the position as well. For instance, the token-denominated return can be periodically skimmed and liquidated to lower volatility exposure.
Since being able to borrow tokens is useful in the context of staking, we might envision a market where staking opportunities create far more borrowing demand than we see now. So what is the potential relationship between borrowing rates and the returns of networks?
Borrowing rates and staking returns converge
Today, borrowing rates on Compound Finance are rather low as compared to prospective staking opportunities. For instance, Livepeer’s current daily inflation rate (paid to transcoders and delegators as a bonding incentive) annualizes to over 25.8%. But the borrow APR of BAT, the most expensive asset borrowable on Compound, is 8.25%.
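For a quick sanity check on that comparison, one can compound a daily inflation rate into an annual figure and set it against a borrow APR. The daily rate below is chosen so the example roughly reproduces the 25.8% Livepeer figure cited above; it is illustrative, not a live value.

```python
# Compound a per-period inflation rate into an annualized rate, then
# compare it to a borrow APR. All inputs here are illustrative.

def annualize_daily(daily_rate, periods=365):
    """Compound a per-period rate into an annual rate."""
    return (1 + daily_rate) ** periods - 1


lpt_staking_apr = annualize_daily(0.000629)  # ≈ 0.258, i.e. ~25.8% annually
bat_borrow_apr = 0.0825                      # BAT borrow APR per the text

print(round(lpt_staking_apr, 3))                   # ≈ 0.258
print(round(lpt_staking_apr - bat_borrow_apr, 3))  # positive spread
```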
As demand increases for staking returns, this kind of differential is bound to bring borrowing rates higher. If a significant portion of staking begins to rely on borrowed tokens (due to stakers seeking access and asset-neutral portfolios built by funds), then lenders will push up rates accordingly.
Much of on-chain borrowing happens with collateralized smart contracts today, and underwritten or fractional-reserve systems that don’t require collateral have been proposed. Decentralized lending may carry default risk, in addition to the technological and hacking risks inherent to smart contracts. Similarly, staking activities carry the risk of slashing conditions (situations where staking nodes can lose tokens due to downtime or behavior that doesn’t comply with the protocol).
While the spread between borrowing rates and staking returns converges, it should still be non-zero to compensate these types of risks.
High borrowing rates
What if borrowing rates were higher than staking returns? This would create an incentive to withdraw stake from staking networks in order to earn a higher return through lending, a potential disincentive for staking activities. Whether this can happen requires case-by-case analysis of individual networks; in Livepeer, for example, the inflation rate adjusts as a function of the staking participation rate (p-rate) to encourage more delegation and bonding.
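That feedback loop can be sketched as follows, modeled loosely on Livepeer’s mechanism: each round, inflation steps up when the participation rate is below target and down when it is above. The target, step size, and starting inflation here are illustrative placeholders, not Livepeer’s live parameters.

```python
# Hedged sketch of a participation-rate feedback loop on per-round
# inflation, loosely inspired by Livepeer. All parameters are invented.

def next_inflation(current, p_rate, target=0.50, step=0.00005):
    """Nudge per-round inflation toward an equilibrium participation rate."""
    if p_rate < target:
        return current + step             # too few tokens staked: pay more
    if p_rate > target:
        return max(0.0, current - step)   # plenty staked: pay less
    return current


inflation = 0.000600
for p_rate in (0.40, 0.40, 0.55):         # participation observed each round
    inflation = next_inflation(inflation, p_rate)
print(round(inflation, 6))                # ≈ 0.00065: two raises, one cut
```

If lending rates pull stake out of the network, participation falls and inflation rises, raising staking returns until the outflow stops; this is one way a protocol can defend itself against the scenario above.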
Oversized demand for borrowing may also be an indicator of growing short interest in an asset, so using comparisons of borrowing rates and staking returns may be an indicator of market sentiment with respect to an asset or network.
In summary, crypto borrowing may be an important tool for stakers, crypto-focused investors, and analysts assessing the health of staking network markets. Borrowing creates new avenues of access to networks for new entrants into the generalized mining space, as well as a mechanism to build asset-neutral portfolios of staking opportunities. Borrowing rates and staking returns should generally converge, leaving some spread to compensate for technological and protocol-specific risks like slashing. Both the staking space and the decentralized lending space are important elements of the crypto economy worth following.
A wave of new groups has come to market to build mining companies and network-facing services that provision staking and validator nodes. Today, they are focused on defensibility strategies having to do with specialized hardware, operational security, and the deep economic understanding of proof-of-stake (PoS) systems. But as decentralized networks become ubiquitous as large-scale coordination mechanisms and, ultimately, publicly-owned institutions, it will become apparent that the role of technological actors in these networks goes far beyond PoS-style block validation. The set of domain-specific problems that can be solved by global networks, from organizational governance to computational resources to social media, is bewilderingly large. We have termed the activity of engaging the set of techno-financial opportunities in decentralized networks generalized mining, a nod to the roots of third-party cryptoeconomics in systems such as Bitcoin and Ethereum. We have termed the general marketplace of networks, service providers, and cryptoeconomic business opportunities the third-party economy. Investors, companies, and projects play an important role in these upcoming networks. This meetup is intended to bring together the foremost participants in the generalized mining space to explore this set of exciting opportunities. Beer, wine and snacks will be provided.
Rabbithole Talks is a monthly CoinFund meetup, focused on doing deep dives into innovative token economics and technical design decisions for interesting blockchain-based projects. Join us for our kick-off event on 10/25 with Doug Petkanics (Livepeer)!
Doug Petkanics is Founder and CEO at Livepeer, where he focuses on protocol research and software development. Livepeer is building an open source video infrastructure platform, which leverages decentralization to drive down costs and meet the needs of developers building video applications at scale. Prior to Livepeer, Doug was founder and VP Engineering at Hyperpublic (acquired by Groupon) and Wildcard.
Crypto Talks is an interactive, livestreamed Q&A, hosted by Jake Brukhman (Co-Founder, Managing Director, & CEO of CoinFund), engaging founders of blockchain technology projects. The goal of Crypto Talks is to dive deeper into proposed projects and allow founders to present their visions and plans to a fairly sophisticated audience of experienced cryptoinvestors, and beyond. Past guests include: Chris Burniske (Co-author of “Cryptoassets”, Partner at Placeholder VC), Bryce Bladon (Co-Founder of Crypto Kitties), Ari Paul (CIO, Co-Founder of BlockTower Capital).