
Why is Distributed Ledger Technology a Good Solution for Deep Process Automation?

By Irena Lee

Welcome to the final part of our 3-part series on the Business Process Automation landscape!

In our last 2 blog posts, we discussed:

  • What the landscape of Business Process Automation looks like: https://www.luthersystems.com/blog/2020/10/13/the-landscape-of-process-automation
  • What led to the rise of Deep Process Automation: https://www.luthersystems.com/blog/2020/11/3/the-rise-of-deep-process-automation

And now, we will be discussing why distributed ledger technology is a good solution for deep process automation.

Complex enterprise processes are distributed by nature.

Enterprise processes have multiple tasks. Different tasks are executed by different entities (teams, systems, functions, organizations), which collectively form the whole process. These entities are operated separately by different operational and technical teams, bound by their own governance rules.

Over time, these entities make changes and adjustments to their operations and their technology. These changes are often not fully communicated to, or properly implemented by, the other entities in the process. Consequently, different entities operate under varying assumptions about what the process is and what the other entities are doing. This lack of coordination leads to errors in executing the process. Enterprises remedy this with rigorous rectification and reconciliation processes, which in turn drive up cost and processing times. This is the problem that Deep Process Automation aims to solve.
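To make that reconciliation burden concrete, here is a minimal sketch in Go. Everything in it (the processStep type, the reconcile function, the step and team names) is made up for illustration; it simply shows two entities whose local views of the same process have drifted and must be compared after the fact:

```go
package main

import "fmt"

// Hypothetical illustration of the coordination problem described above:
// two entities each hold their own copy of the process definition, the
// copies drift, and a reconciliation pass is needed to spot the mismatch.

type processStep struct {
	Name  string
	Owner string // entity responsible for the step
}

// reconcile compares entity B's view against entity A's and reports every
// step that B holds but A records differently or not at all.
func reconcile(a, b []processStep) []string {
	byName := map[string]processStep{}
	for _, s := range a {
		byName[s.Name] = s
	}
	var mismatches []string
	for _, s := range b {
		if other, ok := byName[s.Name]; !ok || other.Owner != s.Owner {
			mismatches = append(mismatches, s.Name)
		}
	}
	return mismatches
}

func main() {
	// Entity A's view of the process.
	viewA := []processStep{{"validate-claim", "ops-team"}, {"approve-payout", "finance"}}
	// Entity B changed a step locally and never told Entity A.
	viewB := []processStep{{"validate-claim", "ops-team"}, {"approve-payout", "risk"}}

	fmt.Println("steps needing rectification:", reconcile(viewA, viewB))
}
```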

In the past, there have been attempts at solving this problem, including (i) centralization and (ii) Microservices Architecture.

CENTRALIZATION OF THE PROCESS EXECUTION

Centralisation is the introduction of a central system or entity whose responsibility is to coordinate and orchestrate the execution of the process across the different entities involved in the process.

Assume a process has 5 tasks performed by 5 entities. A centralized entity would be a 6th entity attempting to coordinate and orchestrate the execution of the process across these 5 entities. While this seems like a logical choice in theory, in practice there are now 6 entities that can fall out of sync. Each of the 6 entities is operated separately by a different team, each team makes changes, and those changes are not fully or properly communicated to the rest of the entities. The result is even more scope for miscoordination, and therefore more errors and more reconciliation effort.

MICROSERVICE ARCHITECTURE

Microservice architecture is one of the leading approaches to enterprise architecture today. It organises an application as a collection of specific services, each with predetermined inputs and outputs, that execute specific tasks. This has been a massive leap in the capabilities of enterprises developing enterprise applications.

However, microservice architecture has limitations in practice:

  • Scaling by adding or modifying services, functions, and entities requires cross-team change management coordination and additional discovery systems. Without extreme discipline, these changes produce more fragmentation across services and data silos.
  • Meeting data residency requirements is costly and bespoke - each operating entity requires separate infrastructure and databases within each region, plus a custom reconciliation process to synchronise those databases.
  • No built-in checks for process and data integrity, auditability or durability - these require additional add-on systems, often with bespoke, non-standard integrations. Cloud providers are starting to offer entire product lines to tackle the challenges microservice architecture introduces here, for example AWS X-Ray for observability and Amazon Quantum Ledger Database (QLDB) for data integrity.

While these approaches partly address the problem, the broader issue persists: a lack of coordination across entities still leads to errors in executing the process, and to the reconciliation and rectification work that follows.

HERE’S WHERE LUTHER COMES IN!

Luther Systems' journey began with the realisation that distributed ledger technology can be applied to solve this long-standing problem in enterprise processes. It is naturally a great fit for the following reasons:

  • Distributed Architecture
  • Rapid Scalability
  • Standardization
  • Luther’s Operating System

DISTRIBUTED ARCHITECTURE: IT’S A DISTRIBUTED ARCHITECTURE FOR A DISTRIBUTED PROBLEM

At face value, it makes sense to introduce a central system to coordinate and orchestrate the execution of the process across the different entities involved. To that end, traditional architectures force all data into a single team or data silo, which introduces further problems. APIs allow participants to retrieve this data in a standard format, but the data processing within that silo is opaque to the other participants. Consequently, the enterprise becomes a spider web of disjoint data silos and opaque processes.

Distributed ledger technology enables federated operations. It achieves this by: (i) enabling the various participants to execute an event, agree on what happened and store the results, while continuing to operate separately and without requiring any of them, or a separate entity, to run the process; and (ii) allowing the various entities (and consequently, parts) of the process to run in unison without interference in how they operate. In many existing cases, entities would otherwise have to change how they operate in order to fit into the process as a whole.
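As a rough illustration of the "execute, agree, store" idea, the sketch below has each participant apply the same deterministic rule independently and commit the result to every ledger copy only when all outcomes match. All names here (participant, agreeAndStore, the rule itself) are hypothetical and not any specific ledger's API:

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// A participant keeps its own copy of the agreed results and runs its own
// copy of the shared rule (in practice, the shared smart contract).
type participant struct {
	Name        string
	RuleVersion string
	ledger      []string
}

// execute applies this participant's rule to the event deterministically.
func (p *participant) execute(event string) string {
	sum := sha256.Sum256([]byte(p.RuleVersion + ":" + event))
	return hex.EncodeToString(sum[:8])
}

// agreeAndStore has every participant execute the event independently and
// commits the result to every ledger copy only if all outcomes match.
func agreeAndStore(event string, parts []*participant) bool {
	first := parts[0].execute(event)
	for _, p := range parts[1:] {
		if p.execute(event) != first {
			return false // no central referee: disagreement means no commit
		}
	}
	for _, p := range parts {
		p.ledger = append(p.ledger, first)
	}
	return true
}

func main() {
	network := []*participant{
		{Name: "claims", RuleVersion: "v2"},
		{Name: "underwriting", RuleVersion: "v2"},
		{Name: "payments", RuleVersion: "v2"},
	}
	fmt.Println("committed:", agreeAndStore("claim-1042-approved", network))
}
```

No single participant orchestrates the others: each runs the rule itself, and agreement is what gets stored.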

So what holds all these entities together? Smart contracts. They codify the interactions and opaque processes between groups and act as the operating rails for cross-functional processes, making them visible to all of the participants and therefore transparent. They are the technical mechanism that achieves process harmonisation: 'the design and implementation of business process standards across different regions/business units to achieve targeted business benefits arising out of the standardisation, whilst ensuring a harmonious acceptance of the new processes by the different stakeholders' (1).
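A minimal sketch of what "codifying the interactions" can look like, assuming a toy claims process with made-up states and entity names rather than any real contract language: the shared contract lists which entity may perform which transition, and every step is checked against it.

```go
package main

import (
	"errors"
	"fmt"
)

type state string

const (
	submitted state = "submitted"
	validated state = "validated"
	approved  state = "approved"
	paid      state = "paid"
)

// transition captures one codified rule: which entity may move the
// process from one state to the next.
type transition struct {
	from, to state
	entity   string
}

// contract is the single, shared definition of the process, visible to
// every participant.
var contract = []transition{
	{submitted, validated, "operations"},
	{validated, approved, "risk"},
	{approved, paid, "finance"},
}

// apply enforces the codified rules: a step is accepted only if the
// contract names this entity for this exact transition.
func apply(current, next state, entity string) (state, error) {
	for _, t := range contract {
		if t.from == current && t.to == next && t.entity == entity {
			return next, nil
		}
	}
	return current, errors.New("transition not permitted by the shared contract")
}

func main() {
	s := submitted
	s, _ = apply(s, validated, "operations") // allowed by the contract
	if _, err := apply(s, paid, "finance"); err != nil {
		fmt.Println("rejected:", err) // finance cannot skip the approval step
	}
}
```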

RAPID SCALABILITY: DISTRIBUTED LEDGER TECHNOLOGY ENABLES RAPID SCALABILITY

With distributed ledger technology, enterprises are also able to scale (i.e. set up new entities such as teams, functions, systems and organisations) a lot faster. How is this achieved?

  • Processes, data and functions are added and updated with low friction, and are immediately available across the enterprise with little to no coordination and management overhead.
  • Participants can rely on these changes, as the cross-organisation processes are codified and synchronised in real time.
  • The architecture is federated by design, with independent entities able to operate out of the box (see the sketch below).
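As a small, hypothetical sketch of that low-friction onboarding (sharedProcess and onboard are illustrative names only, not a real API), a newly added entity simply reads the same codified definition as everyone else and can participate immediately:

```go
package main

import "fmt"

// sharedProcess is the single codified definition visible to everyone.
var sharedProcess = []string{"validate-claim", "assess-risk", "approve-payout"}

type entity struct {
	Name  string
	Steps []string // the steps this entity can see and execute against
}

// onboard adds a new entity; it needs no bespoke integration because it
// reads the same shared definition as every existing participant.
func onboard(name string) entity {
	return entity{Name: name, Steps: sharedProcess}
}

func main() {
	existing := []entity{onboard("operations"), onboard("finance")}
	newcomer := onboard("reinsurer") // added later, with no changes to the others
	fmt.Println(len(existing), "existing entities;", newcomer.Name, "sees", len(newcomer.Steps), "steps immediately")
}
```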

STANDARDISATION: DISTRIBUTED LEDGER TECHNOLOGY STANDARDISES PROCESSES

The tech industry today has not converged on standards for building systems that cut across multiple participants. This is a pervasive problem across multiple industries (including finance, healthcare and telecommunications), and as a result there is no official standard for solving the issues that arise with data sharing, reconciliation, discovery, entities falling out of sync and peer-to-peer communication.

Distributed ledger technology is able to solve these problems and standardise enterprise processes in a similar manner to how MuleSoft and Apigee provided standards for APIs (via function interfaces). Both provide API specification standards which define the inputs and outputs of connected applications across numerous enterprises, allowing teams to easily connect their separate systems.

Distributed ledger technology provides standards by adopting a new architecture that changes the way systems interact and process data. It also standardises the execution of these processes as smart contracts, rather than just the input and output data formats. It is a one-stop shop for orchestrating, executing and verifying enterprise processes. It achieves this by providing:

  • A standardised platform on top of which enterprise applications can be developed
  • Template scripts that are able to perform a set of common enterprise functions
  • A 'discovery' repository containing process logic, so enterprise developers can search for and reuse processes which have been built before (sketched below)
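Here is a minimal sketch of the 'discovery' repository idea, assuming a simple in-memory registry rather than any particular product; registry, register and discover are illustrative names only. Teams publish process logic once and other teams look it up instead of rebuilding it:

```go
package main

import "fmt"

// processLogic is a reusable piece of process logic keyed by name.
type processLogic func(input string) string

// registry stands in for the shared 'discovery' repository.
var registry = map[string]processLogic{}

// register publishes a piece of process logic so other teams can find it.
func register(name string, logic processLogic) {
	registry[name] = logic
}

// discover looks up previously built logic instead of rebuilding it.
func discover(name string) (processLogic, bool) {
	logic, ok := registry[name]
	return logic, ok
}

func main() {
	register("kyc-check", func(customer string) string { return "kyc-passed:" + customer })

	if logic, ok := discover("kyc-check"); ok {
		fmt.Println(logic("acme-ltd")) // reuse an existing process rather than rebuilding it
	}
}
```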

Blockchain is a global and cross-industry effort to standardize these systems, lower their cost to deploy and improve their maturity. There is a large community that is growing and contributing to these projects.

OPERATING SYSTEM: PROVIDES FOUNDATION FOR PROCESS & DATA ORCHESTRATION, EXECUTION & MONITORING

Smart contracts provide the process & data orchestration, execution, and monitoring at every single step of the process. As a result we can think of smart contracts as the foundation for the Enterprise Operating System.

The smart contract is the script that runs the whole process for the enterprise, and no part of the process can change its operations without updating the smart contract that is shared across the whole process. This is why the smart contract is such a powerful tool.
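To illustrate why a silent local change cannot slip through, here is a hedged sketch in which the "contract" is just a string of rules and fingerprint and executeStep are made-up names: each participant checks that its local copy matches the version recorded on the shared ledger before executing a step.

```go
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// fingerprint identifies a contract version by its content hash.
func fingerprint(contract string) string {
	sum := sha256.Sum256([]byte(contract))
	return hex.EncodeToString(sum[:8])
}

// executeStep runs only when the participant's local copy matches the
// version recorded on the shared ledger.
func executeStep(localContract, ledgerFingerprint, step string) error {
	if fingerprint(localContract) != ledgerFingerprint {
		return fmt.Errorf("refusing %q: local contract differs from the shared version", step)
	}
	fmt.Println("executed:", step)
	return nil
}

func main() {
	shared := "validate -> approve -> pay"
	ledgerFP := fingerprint(shared)

	_ = executeStep(shared, ledgerFP, "approve")                 // matches: runs
	err := executeStep(shared+" (local tweak)", ledgerFP, "pay") // silent local change: rejected
	fmt.Println(err)
}
```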

Distributed ledger technology is a coherent architecture that cleanly encompasses several key data and processing capabilities. Normally, each of these (data orchestration, execution and monitoring) is provided as a standalone, disjoint product.

Blockchain is a holistic system that seamlessly integrates these services (process and data orchestration, execution and monitoring), providing the foundation for these capabilities on one single platform. Just as a traditional operating system provides protection between processes, synchronises those processes and manages resources on a user's computer, distributed ledger technology can provide protection between processes, synchronise them and manage resources across the entire enterprise.

We believe that Deep Process Automation is uniquely suited to tackling complex enterprise processes, and that distributed ledger technology is the only solution able to automate end-to-end processes featuring a large number of tasks and logical rules without being extremely costly.

If you want to learn more, contact us at contact@luthersystems.com !
