

How to Push BPM to the Extreme

It is an inconvenient reality that many BPM suites are too slow for today’s increasingly data-intensive environments. Their poor performance is exacerbated by the use of a workflow-only architecture, a "Web services-centric" definition of SOA, overly general and inappropriate models for business rules, and poor software technology for translating models into executable processes.

A BPM suite (BPMS) allows users to collaboratively model and automate business processes. Such a suite generally includes tools for the design and automation of processes, workflows and business rules, together with integration facilities, a forms generator, and the means to simulate, optimize and continuously improve the developed system.

The aim of such tools is to enable business transformation and to increase the adaptability or "agility" of companies so that they can respond quickly to changes in the market. The majority of products, however, have tended to focus exclusively on long-lasting human workflows, leaving data-intensive processes by the wayside.

Middleware tools can process transactions quickly, but while they offer the necessary speed and integration, they do not enable business users to be directly involved in the design or ongoing maintenance of their business applications. It is this involvement that produces many of the benefits of a BPMS.

As regulatory demands, mergers and acquisitions, and the move to more centralized processes make the business world more data-intensive, the need to process high-volume transactions is growing rapidly, and BPMS tools that cannot cope with these volumes will have restricted applicability. We will now look at the tools within a BPMS that have a major effect on performance and suggest a new approach that addresses both worlds.

Business Rules
In the popular "atomic and indivisible" approach to rule formulation, all business rules are reduced to a set of simple conditional statements. Thousands of rules are typically needed, and a single change of data can activate hundreds of them. This has two consequences:

1. Performance is poor, and special algorithms (such as Rete) are needed to organize the rules optimally; and

2. There is a lack of transparency about exactly which rules will be activated by any particular event.

The alternative approach (used by Microgen Aptitude, among others) includes business rules as part of a business process and allows them to be placed in context. The rules do not need to be reduced to the equivalent of a "low-level language" but can be as complex as needed—the user decides whether to have 10 simple rules or one complex one. This architecture allows much higher levels of performance and transparency.
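To make the contrast concrete, here is a minimal sketch in Python. The rule names, data fields and thresholds are invented for illustration and are not drawn from any product: the atomic style scatters a decision across many one-condition rules, while the contextual style expresses the same decision as a single rule attached to a process step.

```python
# A transaction to be validated; fields and values are invented for illustration.
txn = {"amount": 15000, "currency": "USD", "country": "DE"}

# Atomic style: many one-condition rules, all evaluated on every data change.
atomic_rules = [
    ("large-amount",    lambda t: t["amount"] > 10000),
    ("foreign-country", lambda t: t["country"] != "US"),
    ("usd-only",        lambda t: t["currency"] == "USD"),
]

fired = [name for name, cond in atomic_rules if cond(txn)]
# Which combination of fired rules means "needs review"? That logic lives
# somewhere else, which is the transparency problem described above.

# Contextual style: one rule, as complex as the business requires,
# attached to a specific step in the process.
def needs_manual_review(t):
    """Flag large transactions that involve a foreign country or currency."""
    return t["amount"] > 10000 and (t["country"] != "US" or t["currency"] != "USD")

print(fired)                     # every atomic rule that fired
print(needs_manual_review(txn))  # a single, inspectable decision
```

With ten such fields the atomic style multiplies into dozens of fragments, while the contextual style stays as one rule whose effect at its process step can be read directly.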

Modelling, Deployment and Performance
A BPMS should allow business users to model processes and rules, enabling them to take ownership of the areas for which they are responsible. These models should be capable of direct deployment, so that agility is not lost to a conventional IT development stage interposed between design and deployment.

Most BPM products use a graphic modeling tool and then translate this into conventional code, but it is difficult to make such code efficient. Microgen Aptitude takes a novel approach to this problem, by utilizing the Manchester University Data Driven Programming Language (MUDDPL) as its graphic modeling language. This language was designed to translate a graphic model directly into a form of portable code, which is then read by a highly optimized engine. It generates very high-performance applications through the segmentation of a model into parallel streams that can be processed by separate threads or servers. The result is performance in excess of 35,000 simple transactions per second.
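The segmentation idea can be sketched in a few lines of Python. This is not Aptitude's engine, just the general technique the paragraph describes: partition the input into independent streams and let each stream run on its own worker thread (a production engine would distribute streams across servers as well).

```python
from concurrent.futures import ThreadPoolExecutor

def transform(record):
    # Stand-in for one segment of the modeled process.
    return record * 2

records = list(range(8))

def run_stream(stream):
    # Each stream is processed independently, with no shared state.
    return [transform(r) for r in stream]

# Partition the input into 4 independent streams.
streams = [records[i::4] for i in range(4)]
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_stream, streams))

processed = sorted(r for stream in results for r in stream)
print(processed)  # [0, 2, 4, 6, 8, 10, 12, 14]
```

Because the streams share no state, throughput scales with the number of workers until I/O or memory bandwidth becomes the bottleneck.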

SOA and Transaction Process Management
Raw performance alone is not enough. At the end of a process, the BPMS is expected to write to multiple targets under full transactional control; it is not acceptable for a core business system to receive a data update while its associated data warehouse does not.
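The all-or-nothing requirement can be illustrated with a simplified two-phase pattern in Python. The target classes and the update format are invented for illustration, and the sketch omits the durable prepare log a real coordinator needs: every target stages the update first, and only if all of them succeed is anything committed.

```python
class TargetFailed(Exception):
    pass

class InMemoryTarget:
    """Toy stand-in for a core system or a data warehouse."""
    def __init__(self, name, fail=False):
        self.name, self.fail = name, fail
        self.staged, self.committed = [], []

    def prepare(self, update):
        if self.fail:
            raise TargetFailed(self.name)
        self.staged.append(update)

    def commit(self):
        self.committed.extend(self.staged)
        self.staged.clear()

    def rollback(self):
        self.staged.clear()

def transactional_write(update, targets):
    """Commit the update to every target, or to none of them."""
    try:
        for t in targets:          # phase 1: stage everywhere
            t.prepare(update)
    except TargetFailed:
        for t in targets:          # any failure: undo all staged writes
            t.rollback()
        return False
    for t in targets:              # phase 2: commit everywhere
        t.commit()
    return True

core = InMemoryTarget("core")
warehouse = InMemoryTarget("warehouse", fail=True)
ok = transactional_write({"id": 1}, [core, warehouse])
print(ok, core.committed)  # False [] -- neither target was updated
```

When the warehouse fails, the core system's staged write is rolled back too, which is exactly the guarantee that compensating-transaction schemes cannot make after the fact.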

It is generally accepted that an SOA that relies entirely on the orchestration of Web services, as BPEL-based solutions do, is inappropriate in data-intensive environments. Even when "compensating transactions" are used, they do not meet the strict criteria for transactional control that most organizations need today.

Web services also have performance issues: they are generally much slower than library calls and other technologies such as DCOM. There is, however, no need to restrict an SOA to Web services, and a BPM application should be able to make service calls through a number of different technologies.
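One way to make an SOA transport-agnostic is to bind the process model to a service interface rather than to a specific technology. The sketch below, with invented class names, shows the idea in Python: the process invokes a `ServiceCall`, and whether that call is an in-process library call or a network round trip is a binding decision, not a modeling one.

```python
from abc import ABC, abstractmethod

class ServiceCall(ABC):
    """Transport-agnostic service interface: the process model binds to
    this, not to a particular wire technology."""
    @abstractmethod
    def invoke(self, payload): ...

class InProcessCall(ServiceCall):
    """Direct library call: no serialization, no network hop."""
    def __init__(self, fn):
        self.fn = fn
    def invoke(self, payload):
        return self.fn(payload)

class WebServiceCall(ServiceCall):
    """Would serialize the payload and send it to a remote endpoint; the
    round trip and XML/JSON handling are what make this path slower."""
    def __init__(self, url):
        self.url = url
    def invoke(self, payload):
        raise NotImplementedError("network transport omitted in this sketch")

fast = InProcessCall(lambda p: p["x"] + 1)
print(fast.invoke({"x": 41}))  # 42
```

Swapping `InProcessCall` for `WebServiceCall` changes only the binding, so the fastest available transport can be chosen per service without touching the process model.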

A high performance BPMS with SOA facilities should provide transaction process management, orchestrating all types of service calls and being able to impose transactional control where appropriate.

Conclusions
To achieve high levels of performance within a BPMS, we would argue:

1. BPM suites should be capable of modeling and processing both workflow processes and data-intensive processes; one without the other is simply not enough.

2. To achieve the performance and transparency necessary, the business rules component should allow complex rules to be placed in context as part of the overall business process.

3. The BPM model should be directly deployable. To achieve this while maintaining performance, code generation should be avoided and models which encourage parallel processing are preferable.

4. If the system is deployed within an SOA, then this should not be limited to the orchestration of Web services only, as these will not provide the necessary performance or transactional control. 
