June 12th 2020
VP of Engineering at Blockdaemon
One of the great strengths of the blockchain sector is the vast and ever-increasing number of protocols in existence. These protocols offer incredible diversity in the functions and use cases they deliver for developers and enterprises alike, with particular strengths and weaknesses inherent to each blockchain.
For example, Bitcoin's often-touted weakness in scalability has been addressed by a number of other technologies such as Hyperledger, Corda and the Lightning Network, each in turn with its own downsides, whether around centralization or the complexity of conducting off-chain transactions.
Yet this incredible diversity in protocols also brings a set of challenges. Anybody – developers, exchanges, wallets, institutions and enterprises – who deals with more than a couple of blockchain protocols knows how tedious it is to keep switching contexts and relearning a new set of commands and APIs.
This was the challenge that drove us to build a tool for developers to maximize the efficiency of projects operating across different protocols. Our Ubiquity API allows software projects that communicate with a number of different protocols to use standardized, less repetitive code, minimizing the switching logic needed to handle API differences.
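To make the idea concrete, here is a minimal sketch of what a unified multi-protocol interface looks like from the caller's side. The class and field names are illustrative only, not the actual Ubiquity API: each protocol adapter normalizes its native response into one shape, so callers never branch on which chain they are talking to.

```python
from abc import ABC, abstractmethod

class ProtocolAdapter(ABC):
    """One adapter per protocol, all returning the same normalized shape."""
    @abstractmethod
    def get_balance(self, address: str) -> dict:
        """Return {"address": ..., "balance": ..., "unit": ...}."""

class BitcoinAdapter(ProtocolAdapter):
    def get_balance(self, address: str) -> dict:
        # In reality this would call a Bitcoin node's RPC interface.
        sats = 150_000_000  # stubbed node response, in satoshis
        return {"address": address, "balance": sats / 1e8, "unit": "BTC"}

class EthereumAdapter(ProtocolAdapter):
    def get_balance(self, address: str) -> dict:
        # In reality this would call an Ethereum node's JSON-RPC endpoint.
        wei = 2 * 10**18  # stubbed node response, in wei
        return {"address": address, "balance": wei / 1e18, "unit": "ETH"}

def balances(adapters: dict, address: str) -> dict:
    # One loop, no per-protocol switching logic in the caller.
    return {name: a.get_balance(address) for name, a in adapters.items()}
```

The caller's code stays identical whether it is querying two protocols or twenty; only the set of registered adapters changes.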
Creating such a tool in the rapidly evolving blockchain sector is no easy feat. The constant innovation in the sector meant that our solution had to stay in tune with cutting-edge developments in protocols and consensus mechanisms. Another significant consideration was that not all protocols are created equal. For this reason, our solution had to be flexible enough to deal with a lot of variation, yet still provide a uniform interface for users.
Ensuring that our solution met these objectives required testing and development across a number of core areas:
Uptime was crucial, and yet as we moved through the different environments, it wasn’t something we had initially prioritized in the development phase. We started with Docker Swarm, then quickly realized that more thought around orchestration was required and migrated to Kubernetes to take advantage of what it has to offer. We also revisited how we managed configurations and secrets, and arrived at a solution we’re proud of: scalable, performant and secure.
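One common Kubernetes pattern for configuration and secrets, sketched below under assumed paths and variable names (not Blockdaemon's actual setup): secrets are mounted as files from a Secret volume, with an environment-variable fallback for local development.

```python
import os
from pathlib import Path
from typing import Optional

def load_setting(name: str, secrets_dir: str = "/etc/secrets") -> Optional[str]:
    """Read a setting from a mounted Secret file, else from the environment."""
    secret_file = Path(secrets_dir) / name
    if secret_file.is_file():
        # Mounted Secret volumes are updated in place when the Secret
        # changes, so re-reading the file picks up rotations without a
        # pod restart.
        return secret_file.read_text().strip()
    # Fallback for local dev: a plain environment variable
    # (e.g. injected from a ConfigMap).
    return os.environ.get(name.upper())
```

Reading from mounted files rather than baking credentials into images keeps secrets out of version control and lets them rotate independently of deployments.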
While trialling a few different architecture options, we decided that adding an abstraction layer above block and object storage would give us the flexibility to change where we store large amounts of data. Our initial selection, Ceph, has so far been a success.
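The value of that abstraction layer can be sketched as follows: application code talks to a narrow object-store interface, so the backing store (Ceph, a cloud bucket, or local memory for tests) can be swapped without touching callers. The interface and backend names here are illustrative assumptions, not the actual implementation.

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Narrow storage interface the rest of the application depends on."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(ObjectStore):
    """Test/dev backend; a CephStore would speak RADOS or Ceph's
    S3-compatible gateway instead, with the same interface."""
    def __init__(self) -> None:
        self._objects = {}
    def put(self, key: str, data: bytes) -> None:
        self._objects[key] = data
    def get(self, key: str) -> bytes:
        return self._objects[key]

def archive_block(store: ObjectStore, height: int, raw_block: bytes) -> str:
    # Application code depends only on the interface, never the backend.
    key = f"blocks/{height}"
    store.put(key, raw_block)
    return key
```

Because `archive_block` only sees `ObjectStore`, migrating storage backends is a deployment decision rather than a code change.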
Cost efficiency at scale
One of the benefits of Event-Driven Architecture (EDA) is better scalability and fault tolerance. RabbitMQ was already part of our platform stack and we decided to expand its usage in other areas. Based on the robust Erlang language, RabbitMQ is a highly resilient, battle-tested message broker. Other contenders, such as NATS and Kafka, were also considered, yet RabbitMQ has proven to be just right for our needs.
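The decoupling that makes EDA scale can be shown with a minimal in-process sketch. This is an illustration of the publish/subscribe pattern only, not our RabbitMQ code: with RabbitMQ, the bus below would be an exchange with bound queues, and the topic names are assumptions.

```python
from collections import defaultdict
from typing import Callable

class EventBus:
    """Tiny in-process stand-in for a message broker."""
    def __init__(self) -> None:
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Producers never know which consumers exist, so consumers can
        # be added, scaled, or restarted independently.
        for handler in self._subscribers[topic]:
            handler(event)

# Two independent consumers of the same event stream:
seen_by_indexer, seen_by_alerter = [], []
bus = EventBus()
bus.subscribe("block.confirmed", seen_by_indexer.append)
bus.subscribe("block.confirmed", seen_by_alerter.append)
bus.publish("block.confirmed", {"chain": "bitcoin", "height": 634000})
```

A real broker adds what this sketch lacks: durable queues, acknowledgements, and redelivery on consumer failure, which is where the fault tolerance comes from.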
We carefully weighed the hidden benefits of a monolith against microservices, and decided to abstract our services along the logical separations in the code, splitting out functionality into a separate service only where it made sense in terms of scaling. By taking the microservices approach, we could break the architecture down into independent components that can be updated, revised and deployed faster without disrupting the current platform experience.
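This boundary-first approach can be sketched as follows, with hypothetical names: functionality sits behind a narrow interface, so it can start life inside the monolith and later be extracted into its own independently scaled service without changing its callers.

```python
from typing import Protocol

class TxDecoder(Protocol):
    """The contract callers depend on, regardless of where it runs."""
    def decode(self, raw: bytes) -> dict: ...

class InProcessDecoder:
    """Runs inside the monolith today."""
    def decode(self, raw: bytes) -> dict:
        return {"size": len(raw), "decoded_by": "in-process"}

class RemoteDecoder:
    """Same contract, backed by a separate service (network call stubbed)."""
    def decode(self, raw: bytes) -> dict:
        # A real implementation would call the decoder service over
        # HTTP or RPC; only this class changes when the split happens.
        return {"size": len(raw), "decoded_by": "remote"}

def ingest(decoder: TxDecoder, raw: bytes) -> dict:
    # The caller is indifferent to where the decoder actually runs.
    return decoder.decode(raw)
```

Drawing the interface before drawing the network boundary is what makes the later extraction a low-risk change.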
The blockchain space evolves at pace. The diverse range of protocols on the market offers benefits specific to particular use cases, and means that significant consolidation of blockchains is unlikely in the medium term. Projects in the space will need to efficiently manage operation across these protocols in order to compete effectively.