To achieve the business advantages promised by blockchain/distributed ledger technology, organizations need a QA strategy that identifies potential performance and scalability challenges before they transition from pilot to production. Here’s how to move from concept to reality.
Blockchain and its underlying distributed ledger technology seem a perfect fit for organizations seeking to enhance transactional security, speed up processing and reduce infrastructure costs via a federated system that verifies the transacting parties without a central authority. With its open, consensus-driven and shared design, blockchain replaces traditional trust mechanisms such as paper trails with faster, more secure transaction throughput. It fosters trust via a shared version of truth: a distributed ledger, accessible to all participants, that cannot be altered without consensus.
However, to deliver on its promise, the distributed ledger must be assured from both a business process and a technology standpoint. This requires verifying the preset conditions, encoded as business logic, under which a syndicated network of nodes authenticates a transaction. Data immutability, integrity and interoperability in a blockchain network are crucial, as are its performance and security.
As a result, a QA strategy is critical for organizations seeking to move blockchain from proofs of concept and pilots into everyday production.
Assuring business flows for blockchain
Business processes, such as role-based permissions and adherence to predefined business logic and conditions, are crucial to building trust in a blockchain network. To assure them, enterprises need a domain-aligned QA strategy. For instance, a change to a blockchain-hosted legal document takes effect only if the majority of nodes/participants agree to it.
To assure business process flows in a blockchain network, QA needs to rise to the role of an orchestrator, approaching quality with these three pivots:
Instant update & access. Data integrity is crucial to blockchain's model of trust, so validating data synchronization is essential. QA teams should reboot nodes (taking them offline and bringing them back online) to verify that the ledger is instantly updated across all nodes after a transaction takes place.
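As a sketch of this restart-and-compare check, the following uses an in-memory `Node` class as a hypothetical stand-in for a real node client; a production test would drive the same cycle through the platform's own RPC or REST API:

```python
# Minimal sketch of a ledger-synchronization check. The Node class below is
# an assumed in-memory stand-in, not any real platform's client interface.
import hashlib
import json

class Node:
    """In-memory stand-in for a blockchain node."""
    def __init__(self, ledger):
        self.ledger = list(ledger)
        self.online = True

    def restart(self):
        # Simulate taking the node offline and bringing it back up
        self.online = False
        self.online = True

    def ledger_hash(self):
        # Hash the full ledger so cross-node state comparison is cheap
        return hashlib.sha256(json.dumps(self.ledger).encode()).hexdigest()

def ledgers_in_sync(nodes):
    """After restarting every node, all must report an identical ledger hash."""
    for n in nodes:
        n.restart()
    return len({n.ledger_hash() for n in nodes}) == 1

nodes = [Node([{"tx": 1}, {"tx": 2}]) for _ in range(4)]
print(ledgers_in_sync(nodes))  # True: every node sees the same ledger
```

Comparing a hash of the ledger, rather than the ledger itself, keeps the check cheap even as the chain grows.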
Smart contracts. Data immutability requires that new blocks can be added only by authorized data owners and that existing blocks cannot be altered. QA should verify the metadata from previous blocks to ensure that new blocks are linked according to existing business rules. Additionally, audit logs should be used to verify the order of new blocks.
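The block-linkage check can be illustrated with a minimal hash-chain validator. The `prev_hash` field and block layout here are illustrative assumptions, not any specific platform's schema:

```python
# Sketch of verifying that each block links to its predecessor via metadata.
# The block dictionaries below are an assumed, simplified layout.
import hashlib
import json

def block_hash(block):
    # Hash the block's contents (excluding any stored hash field)
    payload = {k: v for k, v in block.items() if k != "hash"}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def chain_is_valid(chain):
    """Each block's prev_hash must match the hash of the block before it."""
    for prev, curr in zip(chain, chain[1:]):
        if curr["prev_hash"] != block_hash(prev):
            return False
    return True

genesis = {"index": 0, "prev_hash": None, "data": "genesis"}
b1 = {"index": 1, "prev_hash": block_hash(genesis), "data": "tx batch 1"}
b2 = {"index": 2, "prev_hash": block_hash(b1), "data": "tx batch 2"}
print(chain_is_valid([genesis, b1, b2]))  # True
```

Tampering with any block's data changes its hash, which breaks the `prev_hash` link of the block after it, so the validator flags the chain as invalid.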
Maker-checker balance. Authorized makers on a blockchain network trigger a transaction; checkers then validate and approve it. QA should verify end-to-end business processes, roles, permissions and approval order to ensure seamless data flows and smooth functioning.
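The maker-checker rule can be sketched as a simple role-and-order check. The role names and in-memory role map below are hypothetical, chosen only to illustrate the control:

```python
# Minimal sketch of a maker-checker control: a transaction commits only when
# a maker initiated it and a distinct checker approved it. Roles are assumed.
class MakerCheckerError(Exception):
    pass

def commit(tx, maker, checker, roles):
    """Commit only when a maker initiated and a distinct checker approved."""
    if roles.get(maker) != "maker":
        raise MakerCheckerError("initiator lacks the maker role")
    if roles.get(checker) != "checker" or checker == maker:
        raise MakerCheckerError("approval requires a distinct checker")
    return {**tx, "status": "committed"}

roles = {"alice": "maker", "bob": "checker"}
print(commit({"id": 1}, "alice", "bob", roles))  # status: committed
```

Negative tests matter as much as the happy path here: QA should confirm that a self-approved or unauthorized transaction is rejected, not merely that a valid one succeeds.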
Technology assurance for blockchain
Since a blockchain network requires several nodes to verify a transaction, it architecturally introduces redundancy, scalability and latency issues. In a legacy application pivoting around a centralized database, transactions are processed only once, whereas in a blockchain network, each node processes the same transaction independently. This is why, when it comes to real-world implementations, blockchain platforms carry a higher risk of performance and scalability challenges, which can undermine their proposed benefits.
To ensure that the blockchain platform performs to acceptable standards, QA should factor in:
Transaction verification. Every transaction initiated within a blockchain requires proper sign-off before it is committed to the distributed ledger. As a result, these environments need a well-established validation technique from functional, performance and security standpoints.
Consensus validation. Since establishing a consensus is fundamental to blockchain operation, there is an inherent need for optimal hardware resources to ensure that this process does not introduce latency and bottlenecks. Proper performance validation is required to ensure that the platform can establish the necessary consensus in a reasonable timeframe.
Block size scaling. To determine the correct block size, it is imperative to take into account the expected number of transactions to be handled and the block generation time; together with block size itself, all three are key to optimal blockchain performance. Once the block size is established, both functional validation of transaction counts and performance validation of the desired scalability need to be performed.
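The arithmetic behind this sizing exercise can be sketched as a simple upper bound; the figures used below are illustrative, not platform recommendations:

```python
# Sketch of the throughput ceiling implied by block size and generation time.
# All numbers are illustrative assumptions for the sake of the arithmetic.
def max_tps(block_size_bytes, avg_tx_bytes, block_interval_s):
    """Upper bound on transactions per second for the given block parameters."""
    tx_per_block = block_size_bytes // avg_tx_bytes
    return tx_per_block / block_interval_s

# A 1 MB block of 250-byte transactions generated every 10 seconds
print(max_tps(1_000_000, 250, 10))  # 400.0 transactions per second
```

Performance validation then checks whether the platform actually sustains a rate near this ceiling under load, and functional validation checks behavior when a block fills before the interval elapses.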
Instantaneous data transmission. Multiple nodes in a blockchain network communicate and exchange data multilaterally. These data exchanges grow rapidly with every new transaction, and an increasing volume of data leads to longer processing times, creating latency. QA should ensure that latency is minimized and kept within agreed-upon service level agreements by validating:
A single version of truth: The digital ledger is a single version of truth that must be accessible to all nodes to establish data immutability. Therefore, the blockchain network must be tested to ensure accessibility across the network.
Security: Universal accessibility comes with the threat of data pilferage. Data should be secured against leaks and external attempts to infiltrate the blockchain network. There are three types of security testing at a high level:
Blockchain static and dynamic application testing, including assuring wallets, databases, graphical user interfaces (GUI) and application logic.
Blockchain integrity testing to validate that only authorized entities are able to access and holistically validate end-to-end transactions from a business process assurance perspective.
Blockchain network penetration testing, where threat scenarios are used to detect security vulnerabilities at different layers of the blockchain that could have gone unnoticed otherwise.
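One way to check latency against the agreed service levels described above is a percentile assertion over recorded per-transaction timings. The samples below are simulated; a real test would record timings from actual transactions:

```python
# Sketch of a latency-SLA check using a 95th-percentile assertion.
# The latency samples are randomly simulated stand-ins for real measurements.
import random

def p95_latency(samples):
    """95th-percentile latency from recorded per-transaction timings."""
    ordered = sorted(samples)
    return ordered[int(0.95 * (len(ordered) - 1))]

def within_sla(samples, sla_seconds):
    return p95_latency(samples) <= sla_seconds

random.seed(7)
latencies = [random.uniform(0.1, 0.8) for _ in range(200)]  # simulated timings
print(within_sla(latencies, sla_seconds=1.0))  # True: all samples under 1 s
```

Asserting on a percentile rather than the mean keeps the check sensitive to the tail latency that users actually experience under load.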
Automated testing of blockchain networks is a must for organizations that want to accelerate production implementations. Several automated functional testing tools, primarily open-source solutions such as Truffle, Ganache, Populus and Manticore, are readily available. These tools can accommodate testing libraries specific to various blockchain platforms, thereby significantly reducing regression testing timelines.
Case in point
A global payments company sought to compare its existing in-house blockchain platform with an external permissioned blockchain system to identify areas of improvement to enhance its scalability, performance and throughput rates.
We evaluated the performance of its in-house blockchain framework against an external permissioned blockchain by comparing response time vs. throughput, CPU utilization based on load, peak transaction throughput (TPS), and block confirmation time variance with load.
To assess scalability, we adopted load simulation techniques and workload modeling to match real-world scenarios. For instance, setting up virtual users, simulating load by calling a RESTful web service to initiate payment requests.
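A minimal sketch of such a load simulation is shown below. Here `submit_payment` is a hypothetical stub standing in for the RESTful payment call, so the example stays self-contained; a real test would issue HTTP requests to the payment gateway instead:

```python
# Sketch of virtual-user load simulation with a thread pool. submit_payment
# is an assumed stub; the real workload called a RESTful web service.
import time
from concurrent.futures import ThreadPoolExecutor

def submit_payment(user_id, amount):
    """Stand-in for POSTing a payment request to the blockchain gateway."""
    time.sleep(0.01)  # simulated network + processing delay
    return {"user": user_id, "amount": amount, "status": "accepted"}

def run_load(virtual_users, requests_per_user):
    """Fire concurrent payment requests and return observed throughput."""
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=virtual_users) as pool:
        futures = [
            pool.submit(submit_payment, u, 100.0)
            for u in range(virtual_users)
            for _ in range(requests_per_user)
        ]
        results = [f.result() for f in futures]
    elapsed = time.perf_counter() - start
    return len(results) / elapsed  # requests per second

tps = run_load(virtual_users=20, requests_per_user=5)
print(f"observed throughput: {tps:.0f} requests/s")
```

Scaling `virtual_users` while recording throughput and latency yields the response-time-versus-throughput and peak-TPS curves used in the comparison above.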
After the evaluation, the external permissioned blockchain system outperformed the in-house framework. The client decided to hone its in-house blockchain and commissioned us to devise a progressive improvement roadmap.
We suggested removing the single-entry-point client proxy to eliminate throttling and adding a horizontally scalable network topology that provisions customer consensus nodes on different servers. Doing this enabled our client to:
Achieve higher TPS at 1,000-user load by attaining a transaction volume of 100K-plus in just 10 minutes.
Improve both business and blockchain transaction response times along with block confirmation statistics.
Optimize CPU utilization at customer nodes and achieve higher block injection rates.
QA as an orchestrator of trust
Blockchain exemplifies the teething challenges that multi-stakeholder ecosystems bring when compared with traditional monolithic, top-down technologies. Blockchain's distributed infrastructure and business model raise trust and data integrity concerns for some organizations, since there are more potential failure points than with a single authorizing agency. This adds business risk.
As a result, QA needs to orchestrate quality at each touchpoint and for each stakeholder across the ecosystem. QA must pivot from being the guardian of quality to the custodian of quality, helping build customer trust and boost adoption.