Old posts from bpm.com — CMW Lab Blog
https://www.cmwlab.com/blog/bpm-com/
Plan. Manage. Collaborate.
Thu, 11 Apr 2024 13:48:01 +0000

End-to-End: the Industrial Equipment Case
https://www.cmwlab.com/blog/end-to-end-the-industrial-equipment-case/
Tue, 14 Mar 2023 12:52:55 +0000

The difference may seem subtle, but in practice the time interval from successful acceptance testing to productive operation of the equipment may vary considerably. Why is that? Because for the customer it’s a one-time (or rare) project, so it’s hard to anticipate all possible issues. For the supplier, on the contrary, it’s a regular […]

The post End-to-End: the Industrial Equipment Case appeared first on CMW Lab Blog.



And this is the moment of truth: in a truly customer-oriented world, “may” means “must”!

Following the principle “what’s good for our customers is good for us”, the company revised the process so that it is now considered complete when the equipment produces goods, not when it has merely passed testing. The duration of the process from start to this moment is used to measure overall process performance.

This principle should be followed with caution, however. We must indeed strive to deliver value to customers, but we should ask ourselves: are they willing to pay for it? And we shouldn’t forget about costs either. Maximizing customer satisfaction may lead to increased costs, unsatisfied investors, smaller budgets for new products and development, etc.

Yet in this particular case it’s probably about supporting the customer with advice and recommendations based on the company’s experience: good value for a small cost.

What can be learned from this case: how you name your process and where you draw its boundary determines whether your customer will be satisfied.

Getting back to end-to-end processes: at Comindware we believe that an end-to-end process runs from the very beginning to the very end and is named “… to …”, for example “lead to cash”, “idea to product”, etc.

Does this sound provocative? Please feel free to comment below; your opinion counts for us.

BPM Maturity Model: Go Deep vs. Go Wide Strategy
https://www.cmwlab.com/blog/bpm-maturity-model-go-deep-vs-go-wide-strategy/
Tue, 14 Mar 2023 12:51:36 +0000

The post BPM Maturity Model: Go Deep vs. Go Wide Strategy appeared first on CMW Lab Blog.



The process maturity model is probably the most underappreciated concept of the Business Process Management discipline.

Virtually every company becomes caught by the business process idea, sooner or later. The BPM promises – sales up! costs down! unprecedented agility! – make people eager to implement the “BPM thing” as soon as possible, if not yesterday.

But here is the trap: people tend to view BPM as an ocean of opportunities where almost any course may be charted. In reality, it’s rather a railroad line named “BPM maturity scale”. Knowing the map of this line is absolutely critical, because process maturity is part of a company’s culture: it can’t be picked arbitrarily or changed easily, only step by step. One should honestly evaluate at which station the organization currently stands and then buy a ticket to the next one. Taking what seems to be a shortcut may derail the company’s BPM train.

The maturity model comes in many flavors: Gartner has one, Forrester has another, Carnegie-Mellon SEI has CMMI. The BPM Common Body of Knowledge by ABPMP references all of these, plus Michael Hammer’s PEMM, and introduces its own model. While in consensus on what the topmost level is, different models propose slightly different paths to the “process nirvana”.

Let’s have a closer look. The table below aligns the models by the top level – “Optimized” or “Proactively Managed”:
SEI CMMI | Gartner | Forrester | ABPMP
– | 0: Acknowledge Operational Inefficiencies | 0: Non-Existent | –
1: Initial | 1: Process-Aware | 1: Ad Hoc | 0: Ad-Hoc
2: Managed | 2: Intra-Process Automation and Control (Coordinated Processes) | 2: Repeatable | 1: Defined
3: Defined | 3: Inter-Process Automation and Control (Cross-Boundary Process Management) | 3: Defined | 2: Controlled
4: Quantitatively Managed | 4: Enterprise Valuation Control (Goal-Driven Processes) | 4: Measured | 3: Architected
5: Optimizing | 5: Agile Business Structure (Optimized Processes) | 5: Optimized | 4: Proactively Managed
Table 1. Well-known process maturity models

Organizations moving along the maturity scale must overcome two barriers:

1) “Go wide” – increasing the scope of process work: from selected priority processes to the system of processes constituting the enterprise.

2) “Go deep” – increasing the degree of control: from basic process documentation, to structured definition, to full management of the business process lifecycle.

These two challenges are essentially independent, hence it makes more sense to present the maturity model as a two-dimensional matrix rather than a linear scale:

Table 2. Process maturity matrix

The only way from the Initial/Ad-hoc level (1) is to Repeatable/Defined (2). The next move is less obvious: an organization can increase the degree of control (“go deep” – 3A), broaden the scope of BPM efforts (“go wide” – 3B), or try to combine both. Either way, the final destination (4) is the same, so the question is only about what should come first.
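The moves just described form a small directed graph with exactly two routes to the destination. A minimal sketch (node names here are illustrative labels of mine, not taken from any of the cited models):

```python
# Possible moves through the two-dimensional maturity matrix.
# Node names are illustrative, not taken from any formal maturity model.
moves = {
    "1 Initial/Ad-hoc": ["2 Repeatable/Defined"],
    "2 Repeatable/Defined": ["3A Deep (more control)", "3B Wide (more scope)"],
    "3A Deep (more control)": ["4 Optimized"],
    "3B Wide (more scope)": ["4 Optimized"],
}

def paths(start, goal, trail=()):
    """Enumerate every route from start to goal through the moves graph."""
    trail = trail + (start,)
    if start == goal:
        return [trail]
    found = []
    for nxt in moves.get(start, []):
        found.extend(paths(nxt, goal, trail))
    return found

for p in paths("1 Initial/Ad-hoc", "4 Optimized"):
    print(" -> ".join(p))
```

Running this prints the two strategies – deep-first and wide-first – that the rest of the article compares.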

Now let’s get back to the maturity models presented in Table 1: which path do they suggest?
  1. In SEI CMMI, “Quantitatively Managed” immediately precedes the final “Optimizing” level. This means the model prescribes broadening the scope first, then increasing the degree of control.
  2. Gartner’s model puts the emphasis on “Automation and Control” at the early level 2, before widening the scope to the system of processes.
  3. Forrester puts “Measured” immediately before “Optimized” – an approach similar to CMMI (the two models are in fact pretty close).
  4. ABPMP puts “Controlled” before “Architected”.
To sum up, Forrester and SEI CMMI propose a “Go Wide first” strategy, while Gartner and ABPMP recommend “Go Deep first”.

With the votes split 50/50, a BPM practitioner has to make the decision on his or her own. The first purpose of this article was to draw attention to this fork in the BPM maturity model: whichever path one chooses, it’s much better to take it with eyes open.

As for my personal recommendation: I have tried both paths, and the experience made me a strong believer in the “Go Deep” strategy. The reason is simple – the “Go Wide” strategy systematically fails because:
  1. People just don’t understand what a business process is until they have gained experience in managing one through the whole PDCA cycle – and quickly enough! Without a deep understanding of these fundamentals, the process map a company develops as part of a “go wide” effort will probably be flawed; chances are high that it will be yet another catalog of business functions rather than a true business process architecture based on the enterprise value chain.
  2. A business process that is well defined yet not placed into an environment granting tight control over execution, automatic monitoring and agility is too expensive to manage. A qualified manager would have to spend a large part of his or her working time making sure that process participants act in accordance with the process definition and workplace instructions, and gathering process metrics semi-manually. It’s purely mechanical work that should be entrusted to a software robot (yes, I’m promoting BPMS here). An attempt to widen the BPM initiative without such tools would exhaust the management resources pretty fast.
  3. A process defined yet left unmanaged for lack of management resources becomes just wishful thinking, with little connection to how the work is really done in the organization. The more processes we define and leave unmanaged, the deeper we sink into “analysis paralysis”: processes change faster than we are able to analyze the changes, let alone manage them proactively.
Most BPM practitioners’ recommendation is the same: focus on selected processes first. Structure and prioritize the multitude of processes, define a few of the most valuable ones, and then proceed to gain full control over them without delay. Then carefully plan the next campaign: widen the process front by cloning the process team and make the company a true process organization. This is essentially the “Go Deep” strategy.

End-to-End: the Energy Case
https://www.cmwlab.com/blog/end-to-end-the-energy-case/
Tue, 14 Mar 2023 12:47:21 +0000

The post End-to-End: the Energy Case appeared first on CMW Lab Blog.


Here is what the company CEO told us: “It also happens that when the ordered pipes are finally delivered, they are partially used for emergency needs, e.g. to fix a breakage that happened elsewhere. As a result, the original customer won’t get the ordered goods. It wouldn’t be a problem, because often there is enough time to re-order, but the point is that no one knows that the order will not be delivered.”

Interestingly, this was revealed only during the discussion with the company CEO. And it’s a lesson for business analysts, by the way: never limit your communication to a single representative of the customer, the so-called Subject Matter Expert. That approach generally doesn’t work well. Do your best to communicate with executives (at least briefly) to get their feeling of the business problem. It’s also very important to talk with those who really do the job. Typically the role of subject matter expert is played by a middle manager, and too often he or she tries to prevent communication between the analysts and everyone else: “you don’t need them, I’ll tell you everything.” Don’t let that happen – nobody knows everything in any non-trivial business process – and don’t forget that the analysis is your responsibility, not the expert’s.

Now what does this “little peculiarity” mean? The process scope wasn’t defined right from the very beginning!

The customer was talking about a process named “purchase order approval.” It therefore starts with the needs list from a site and ends with a set of purchase requests, each approved to start a tender.

But is this enough? Obviously not. A purchase request may be perfectly filled in and approved, yet the site won’t get the ordered pipes because they were used elsewhere. So if we are targeting the business problem – that is, meeting the sites’ needs for materials, equipment and services – then we can’t help extending the process scope to include control over the delivery of the ordered goods.

The incoming goods should be associated with the specific site’s original needs list. The needs were summed up during approval; now the reverse operation should be performed: incoming total volumes should be split back across the initial needs. This way we’ll be able, for example, to re-order the pipes if some of them were taken away.
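The “reverse operation” amounts to a simple allocation: delivered quantities are matched back against the original needs lists, and whatever remains uncovered is flagged for re-ordering. A minimal sketch with invented site names and quantities (real data would come from the purchasing system):

```python
# Split a delivered total back across the original needs lists.
# Site names and quantities are invented for illustration.
needs = {"Site A": 120, "Site B": 80, "Site C": 50}   # ordered pipes, pieces
delivered = 180  # part of the order was diverted to an emergency elsewhere

allocation, shortfall = {}, {}
remaining = delivered
for site, qty in needs.items():          # allocate in the original order
    got = min(qty, remaining)
    allocation[site] = got
    if got < qty:
        shortfall[site] = qty - got      # this amount must be re-ordered
    remaining -= got

print(allocation)  # {'Site A': 120, 'Site B': 60, 'Site C': 0}
print(shortfall)   # {'Site B': 20, 'Site C': 50}
```

The shortfall dictionary is exactly the trigger the article argues for: it tells each site what will not arrive, early enough to re-order.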

The old saying “as you name a ship, so shall it float” fits this case: if we call the process “purchase order approval”, then the approved order is the process result, period.

We’ve found that “purchase order approval” isn’t the end-to-end process. What, then, should the process scope and process name be? What should the process “ends” be?

Let’s think together. Please leave your suggestions below. (To be continued in a week.)

Round Trip Revisited
https://www.cmwlab.com/blog/round-trip-revisited/
Tue, 14 Mar 2023 12:45:37 +0000

The post Round Trip Revisited appeared first on CMW Lab Blog.

Many business process management initiatives around the world suffer from the same pitfall. Most projects are successful in identifying and solving a particular business problem by redesigning and/or continuously improving the corresponding business process. The ROI figures are impressive and BPM gains the executives’ trust. Yet attempts to leverage the initial success and apply BPM enterprise-wide are often far less successful.

In terms of BPM maturity, we may say that these organizations fail to reach the higher levels. According to CMMI, enterprise-wide process management is a Level 3 capability:

Fig. 1. Capability maturity levels according to CMMI

Other sources may define the levels differently, but the core idea is the same: at a certain level, organizations must progress from managing isolated processes to defining the whole enterprise as a system of business processes. The BPM Common Body of Knowledge (BPM CBOK) by ABPMP, for example, names Level 3 “Architected”:

Fig. 2. Process maturity curve according to BPM CBOK

“The State of Business Process Management 2014”, a report by Paul Harmon and Celia Wolf (BPTrends), indicates that process maturity has not increased but actually gone down in recent years. The table below shows that the percentage of organizations that practice process management activities “Occasionally” grows at the expense of those that do it “Most Times” and “Always”.

Fig. 3. Frequency of specific organizational activities that suggest organizational maturity

The authors conclude that “the response pattern in 2013 is more like the pattern we observed in 2009 than the one we saw in 2011”.

There is hardly a single explanation for the slow progress in BPM adoption; we’ll discuss just one aspect – technology. Does current BPM technology fully answer the needs of organizations willing to go from managing business processes one by one to managing the system of processes constituting enterprise operations? The answer will be “not exactly”, but before saying that, let’s step back and see how the current progress in managing business processes has been achieved.

Bridging Modeling/Execution Gap

The progress in business process management achieved so far owes a great deal to BPM Suites. The previous generation of BPM tools consisted mostly of process modelers; process execution was handled by ERP systems and/or standalone workflow engines.

It worked fine for reengineering-type projects: organizations modeled their processes in, say, ARIS EPC, then semi-automatically translated the process model into, e.g., SAP program code, and there it is: the company’s core processes implemented in a robust execution environment.

But sooner rather than later, every organization following this route hits the so-called “round-trip problem”. The essence of the problem is that it’s terribly hard to keep two models in sync: the analyst’s view of the business process depicted in a process notation and the developer’s view implemented in program code. When both representations evolve over time, there is no good way to merge changes made by the analyst with changes introduced to the program code.

At the end of the day the program code prevails, and business analysts drop out of the game because the process diagrams become irrelevant. Only the IT department really knows how the process works, so in fact it’s the programmers who manage the business process! Sounds absurd, yet it’s the reality of many organizations. Today’s mainstream BPM Suites leverage the power of BPMN to overcome this issue. BPMN was designed to be simple enough to be intuitive for business people and at the same time precise enough to be unambiguous for developers. This doesn’t come automatically, of course – a good “Method and Style” should be in place (this is the title of Bruce Silver’s famous book on BPMN) – but many organizations have shown that it’s doable.

There is no round-trip problem in such an environment because BPMN is the code! Hence no gap; business analysts, and the business people behind them, stay in the game and everybody is happy.

Bridging Architecture/Execution Gap

So far so good – we are perfectly equipped to deal with any single business process.

But how do we manage processes enterprise-wide? Unfortunately, most current BPM Suites offer no more than a list or hierarchy of processes. What’s even worse, BPMN – the de-facto standard notation used in leading BPM Suites – can’t model anything above the single-process level. (There is message-based communication in BPMN, but from a business perspective it’s no more than modeling process internals.)

On the other hand, there are very versatile Enterprise Architecture tools. They support various artifacts, multiple notations and collaborative work within a single repository. But they are separated from the execution environment.

Once again, we have two sources of truth – it was EPC vs. program code in the example above; now it’s architecture diagrams vs. BPMN diagrams. Even if it’s possible to export a process definition from an EA tool and import it into a BPMS, this reproduces the same round-trip problem, now at the architecture level. (Thanks to Keith Swenson, who perfectly explained the difference between model-transforming and model-preserving strategies. BPMN-based BPMS follow the latter, while the EA/BPMS gap is an example of the former.)

This is not theory, speculation or guessing about customer needs. My five years of experience teaching BPMN show that almost every organization that wants to get the most from BPMN and BPMS comes to the same question: how can we model the process hierarchy above the single-process level? Unfortunately, the honest answer is: you can’t. Not within current mainstream BPM Suites.

What You Architect Is What You Run

We at Comindware attack this problem by implementing what we call “Executable Architecture”. It combines modeling process internals in BPMN with modeling high-level capabilities in a notation similar to Value Chain diagrams. Each capability may be mapped to a process, an adaptive case or a project – the product supports all forms of collaborative work, not just BPMN processes.

Process analysts and process designers work within a framework set up by the enterprise architect. There is no gap between them: e.g. when the business requests a certain activity to be implemented in the execution environment, it should first be checked against the current architecture, and a new capability should be introduced if it is missing.

Process performers are affected by decisions made at the architecture level, too: what a user may or may not do depends on how the capabilities and resources on the architecture diagram were mapped to processes and data records at the execution level.

Admittedly, the link is not that tight: it’s possible to use only architecture and modeling without execution, or modeling and execution without defining the architecture. Yet companies targeting the higher levels of process maturity should appreciate the ability to manage the process hierarchy from the top-level enterprise value chain down to individual tasks within a single tool.

We will show Executable Architecture in action at the bpmNEXT 2015 conference; our demo is scheduled for the morning of March 31. Please join us at the conference or visit bpmNEXT.com later to watch the recorded video.

Managing Projects, Processes and Cases
https://www.cmwlab.com/blog/managing-projects-processes-and-cases/
Tue, 14 Mar 2023 12:38:36 +0000

In reality, however, most organizations have to deal with processes, projects and cases that are somewhere between the two. Therefore they need a balanced, unbiased view of projects, processes and cases, which in essence are just different kinds of collaborative work. Projects, processes and cases have more in common than it may seem at the […]

The post Managing Projects, Processes and Cases appeared first on CMW Lab Blog.



Yet it’s hard to be an expert in several knowledge areas. One can learn the theory, but it takes years to become an experienced practitioner. That’s why it’s relatively easy for an organization to find project or process experts, but there is a risk that they will overestimate the importance of one approach at the expense of the other.

As for business people, they often confuse these approaches. It’s not unusual for a conversation to start with projects and suddenly switch to processes, and vice versa.

To make things more complicated, project-oriented people talk a lot about processes – e.g. the Project Management Body of Knowledge (PMBOK) is more about the processes governing project planning, execution and control than about projects themselves. Yet the way PMBOK defines a process differs considerably from the process definition given in the Business Process Management Common Body of Knowledge (BPM CBOK).

At the end of the day, projects, processes and cases are just different methods of solving the issues that every mid-size to large organization faces – the “silo mindset” and the gap between business units’ targets and the goals of the organization. The root of these issues is the division of labor.

Business needs to resolve these issues by whatever means. It may be project management or process management, but there is also adaptive/dynamic case management, document-oriented workflow, issue tracking… Easy to get confused, right?

This article aims at the following:
  1. To help choose the best approach (or combination of approaches) to collaborative work depending on the organization’s profile.
  2. To provide a basic understanding of available software tools supporting these approaches.
  3. To analyze the differences and similarities between these approaches and define the vision of the integrated software product implementing them all.
The first two discussions are not new but will hopefully be useful for practitioners. The third part is based on research currently performed at Comindware and may therefore be considered a request for comments.
  1. The Forms of Collaborative Work
We will not consider the work performed by an individual – only the teamwork – and will not consider the essence of the work, only the coordination aspects of the teamwork.

Definitions:
  • Project is a sequence of activities following a defined plan and aimed at delivering a unique result, product or service. Example: road construction.
Note: “defined” here and below means “defined before the beginning of work”. By contrast, “some” means “defined in the course of work”.
  • Process is a defined, repeating sequence of activities started by a defined event and producing a defined result, product or service. Example: processing a customer order.
Note: the terms “process” and “business process”, “activity” and “work” are used as synonyms here.
  • Case is some sequence of activities aimed at a defined goal. Examples: a patient’s treatment at a hospital, a legal case.
  • Docflow (document-oriented workflow) is a defined sequence of activities related to a particular document. Examples: contract approval, incoming mail processing.
  • Issue is a defined event that needs to be addressed by some sequence of activities aimed at a defined result. Example: a help desk ticket.
  2. Classifying Attributes of Collaborative Work
The boundaries between the different forms of collaborative work are often blurred. For example, new product development may be considered a project, a process, a case or even docflow depending on the industry, the type of product and the organization’s culture.

Nevertheless, they may be differentiated by the following aspects:
  1. Repeatable. Is it possible to typify the sequence of activities, i.e. to give multiple instances a common name? The answer is positive for processes, cases and docflows. Projects and issues are not repeatable, generally speaking. (Although repeating projects and issues do occur.)
  2. Predictable. Is it possible to define the sequence of activities in advance, or is it determined “on the go”? Cases, docflows and issues are not predictable, generally speaking. A case is “rolled out”: the next activities result from activities already performed. A document in a typical docflow system may be reassigned at any step. In contrast, a process is fully predictable: although there may be gateways, all options and conditions are known in advance. A project is predictable, too – the project plan comprises a complete list of tasks. There is some degree of unpredictability because the project plan may be amended as the project progresses, yet it’s common to consider projects predictable.
  3. Structured. Is it possible to describe the work’s input and output with structured data? Processes and cases deal with structured data: numbers, amounts, dates, references, etc. Projects, documents and issues deal with unstructured information: text descriptions, attached spreadsheets and other content.
Let us summarize the above:
Table 1. Attributes of collaborative work
 | Repeatable | Predictable | Structured
Process | ✓ | ✓ | ✓
Project | – | ✓ | –
Case | ✓ | – | ✓
Document | ✓ | – | –
Issue | – | – | –
The table above shows that processes and issues are two poles: repeatable, predictable and structured processes and unique, unpredictable and unstructured issues. Other forms are somewhere in between.
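Read as data, the table also suggests a simple way to pick the right form for a given piece of work. A sketch (the attribute values follow the table; the lookup logic is mine):

```python
# Attributes of collaborative work, as in Table 1:
# (repeatable, predictable, structured)
FORMS = {
    "process":  (True,  True,  True),
    "project":  (False, True,  False),
    "case":     (True,  False, True),
    "document": (True,  False, False),
    "issue":    (False, False, False),
}

def suggest(repeatable, predictable, structured):
    """Return the forms whose attribute profile matches exactly."""
    target = (repeatable, predictable, structured)
    return [name for name, attrs in FORMS.items() if attrs == target]

# Recurring, unpredictable, data-centric work points to case management:
print(suggest(repeatable=True, predictable=False, structured=True))  # ['case']
```

Of course real work rarely matches one profile exactly, which is precisely the point of the discussion of mixed work below.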

  3. Framing Collaborative Work
At first glance, it may seem that the check marks in Table 1 are placed randomly. To systematize them, we’ll use the classifying attributes as dimensional axes. Let’s start with “repeatable” and “structured”:

Fig. 1. Framing collaborative work

Fig. 1 shows the correlation of structured and repeatable work. When dealing with recurring, typified work (even unpredictable work, as with cases), it may be expected that the work is performed on the same type of business objects. Therefore the information can be represented as structured data rather than documents.

It’s worth noting here that while processes and cases work with structured data, they can also process unstructured content.

Working with structured data has clear advantages:
  • Verification. While anything can be entered into a text document, input to a screen field bound to a database table column can be thoroughly checked and verified. E.g. the phone number entered matches a telephone number mask, or the date of the return flight is later than the date of the outbound flight.
  • Ability to integrate with enterprise systems. E.g. if an expense report is submitted as a text file, its processing will be manual and error-prone. The same report implemented in business process management software will be well structured, and passing it to the accounting system becomes a matter of copying from one database to another.
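The verification examples above become trivial once the data is structured. A minimal sketch (the phone mask and field names are made up for illustration, not taken from any product):

```python
import re
from datetime import date

# Illustrative mask: +<country>-<area>-<prefix>-<line>
PHONE_MASK = re.compile(r"^\+\d{1,3}-\d{3}-\d{3}-\d{4}$")

def validate_booking(phone, outbound, returning):
    """Checks that are impossible on free text but easy on structured fields."""
    errors = []
    if not PHONE_MASK.match(phone):
        errors.append("phone number does not match the mask")
    if returning <= outbound:
        errors.append("return flight must be later than the outbound one")
    return errors

print(validate_booking("+1-555-123-4567", date(2015, 3, 30), date(2015, 4, 2)))  # []
print(validate_booking("555-1234", date(2015, 4, 2), date(2015, 3, 30)))
```

A free-text document would accept both inputs silently; the structured form rejects the second one with two concrete errors.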
Document-oriented workflows in Fig. 1 look like an exception: repeatable and yet unstructured. The docflow approach is often criticized for doing no more than simply replacing paper documents with electronic ones. More value could be produced if structure were applied to the information. Not surprisingly, the complexity of integration with enterprise systems is a well-known weakness of docflow systems.

As for projects and issues: when dealing with truly unique work, the information will be unstructured. This is inevitable and hence justified. But if we treat non-unique, recurring work as a project or issue instead of a case or process, the benefits of processing structured data are lost.

Now let’s look at the repeatable/predictable axis:

Fig. 2. The repeatable/predictable axis

All cells are filled; only documents and cases overlap. In fact, case management and docflow software are close relatives; with the advent of ACM, some ECM vendors re-labelled their offerings as ACM.

It may be expected that in the future ACM software will fully replace docflow, because it’s able to process both unstructured content and structured data. On the other hand, today docflow software is generally more mature. (It should be emphasized that the author’s criticism is aimed solely at document-oriented workflow as a way to organize collaboration and doesn’t extend to the content storage and delivery provided by Enterprise Content Management systems.)

Fig. 3 depicts the full matrix, with the illogical combination “repeatable + unstructured” (documents) excluded:

Fig. 3. The full matrix of collaborative work forms
  4. Pure Forms and Mixed Work
In reality, purely project or process work is rare; a mix of work is more common. Projects, processes and cases 1) execute (“call”) each other and 2) transform into each other over time. Some examples:
  • IT help desk operations are often treated as process work: first and second lines of support are introduced, SLAs and escalations are established, etc. Yet this only covers control; the physical work that must be done to resolve the issue can be virtually anything and therefore should be represented as a case. Here a process executes a case.
  • Classical project management follows a similar pattern: project initiation, project closure and project rescheduling may be represented as well-defined processes that execute or communicate with the project.
  • A patient in the hospital emergency room is an opposite example. It’s hardly possible to represent the medical treatment as a process because there are too many barely predictable variants. Hence the top level is a case. But at the lower levels there are treatments and tests that can very well be defined as processes. Here we see a case executing a series of processes.
  • An organization may treat collaborative work that is essentially a process (highly predictable and repeatable) as project or case work, because modeling a process requires skills, effort and time. For example, a pharmaceutical company treated new drug development as a project for years and then, when the “recipe” of this work became clear, implemented it as a process.
Unfortunately, most existing tools support just one form of collaborative work and therefore support neither interoperability nor transformation. This leaves room for a new generation of integrating tools that support all kinds of collaborative work in any combination. We will present the vision of such a tool in the final part.

The post Managing Projects, Processes and Cases appeared first on CMW Lab Blog.

How the Division of Labor Lowers Productivity https://www.cmwlab.com/blog/how-the-division-of-labor-lowers-productivity/ https://www.cmwlab.com/blog/how-the-division-of-labor-lowers-productivity/#respond Tue, 14 Mar 2023 12:28:59 +0000 https://www.cmwlab.com/blog/?p=6580 It happens all the time: as soon as we find a solution for a problem, the solution becomes a problem itself. The division of labor is not an exception: it increases the productivity indeed, but it also decreases in other cases. The separation of labor is a clear benefit at first sight: doing something right […]



The separation of labor is a clear benefit at first sight: doing something right implies training, expertise and specialization. So one goes to a college or university and becomes a professional in economics, agriculture, mechanical engineering, etc.

Then he or she graduates, finds a job and becomes "installed" into some company and department. And what's interesting about people is that we tend to identify ourselves with a small group first – in our case, the department rather than the company. We take the interests of our immediate colleagues and our department much closer to heart than the interests of the company as a whole, let alone the customers' interests. We are all great professionals, able to demonstrate great productivity in our own area, but it turns out that a department's productivity doesn't automatically guarantee the end-to-end productivity of the company. Besides, the distance between "we are able" and "we do" may be significant.

It wasn’t that crucial at the early days of Adam Smith and later at Frederick Taylor’s because it was mostly about the division of industry workers’ labor. As long as each worker performs a single operation and the sequence of operations is predetermined, coordinating them is an easy job. Just measure the time spent for each operation and calculate the conveyor speed and headcount for the given productivity. This is how the scientific process management was born.

Problems arise when we turn from Adam Smith's sewing needles and the legendary Ford Model T (which could be "painted any color as long as it is black") to something more complicated and diverse. The greater the range of finished goods and parts, the greater the range of manufacturing operations – and the more complicated job coordination becomes. To cope with this problem, Western businesses relied on computers and developed MRP, MRP II, ERP and APS algorithms, while the Japanese invented "Just in Time" and "Kanban" (a kind of "analogue computer"). One way or another (or rather, by combining both), the problem can be solved.

It gets worse when we move from the shop floor and manufacturing processes to the office and business processes. The factors that add complexity are multitasking, creativity and cross-functionality.

Multitasking means that we switch between tasks many times a day, or even within an hour. Conveyor workers complain about the dullness of their job, yet office work is the other extreme: the more skilled and responsible an employee is, the more processes he or she is involved in.

There are two possible ways for an employee to react. First, he or she can minimize the number of switches between tasks. E.g. Finance processes payment orders only after 4 PM, because processing them as soon as they arrive would mean a "productivity decrease". This is a classic example of so-called sub-optimization: the Finance clerk's performance increases while the company's efficiency, from the customer's perspective, decreases.

The second option is a tricky one: don't let anyone figure out how productive you really are. In many cases it doesn't make sense for an employee to do his best: the more you do, the more load you get from the boss. As for the boss, he or she is probably a seasoned professional, well able to evaluate a subordinate's true performance. But is it in his or her best interest to get the most from the staff? In many cases it's a better strategy for a line manager to ask for extra workforce, arguing that the staff is overloaded. After all, the larger the headcount, the more power and weight within the organization the manager has. Pressing subordinates would therefore mean not only spending emotion and effort, but also risking losing the career race to other managers.

Now let's talk about creativity. It's relatively easy to measure the performance of a manufacturing worker doing a routine job and to set performance targets accordingly. But how would you measure, say, a software developer's performance? The number of lines of code is a very bad metric indeed, but there is hardly anything better. In fact, this is the case with all knowledge workers: there is no reliable way to measure the result.

Over one hundred years ago, the French professor Maximilien Ringelmann discovered the effect later named after him. He performed a set of experiments in which men pulled a rope alone or in a team. The professor found that individual performance in a team decreases: whereas a single man can pull, say, 100 kg, each member of a team of two pulls 80 kg, and of a team of eight – only a pitiful 50 kg. If a man is certain that no one can tell he isn't doing his best, then he saves his efforts, consciously or not.
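The quoted figures can be turned into a tiny back-of-the-envelope model. The numbers below are just the ones from the paragraph above (not Ringelmann's original data), used for illustration:

```python
# Per-member pull (kg) as quoted above: alone, in a pair, in a team of eight.
PULL_PER_MEMBER = {1: 100.0, 2: 80.0, 8: 50.0}

def team_total(size: int) -> float:
    """Total force of a team of the given size, under the quoted figures."""
    return size * PULL_PER_MEMBER[size]

# The team's total still grows (100 -> 160 -> 400 kg),
# yet each member contributes ever less - the Ringelmann effect.
for n in sorted(PULL_PER_MEMBER):
    print(f"team of {n}: {team_total(n):.0f} kg total, {PULL_PER_MEMBER[n]:.0f} kg each")
```

Note what the model makes visible: adding people still raises the total, so the loss is easy to overlook until you measure per-head contribution.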

The famous Parkinson's Law says much the same: "work expands so as to fill the time available for its completion".

Dan Ariely explored the problem in a series of experiments at MIT. A behavioral economist, he demonstrated that the vast majority of people are unable to resist cheating if they are sure they will not be caught, or when their cheating would cause harm only in the distant future.

The rope pullers example shows how even a homogeneous team may become inefficient. Now what should we expect when the efforts of several departments must be coordinated? The problems above seem like nothing compared to cross-functional coordination.

Every time an organization faces a problem that can only be addressed by the joint efforts of several departments, a purely hierarchical organization is in deep trouble. A classic example is the "design to order" business: the company receives a Request for Proposal; handling it properly requires the committed participation of a) the sales manager who communicates with the client, b) an engineer who designs the requested product, c) Purchasing, which knows where to buy the needed parts, d) Manufacturing, which schedules production, and e) Accounting, which calculates costs. A purely hierarchical organization has no chance of doing the job on time and with acceptable quality. Why should Manufacturing obey Sales? Each team has its own chief, budget, performance measures… What did you say – customers? Nobody cares.

And this isn't the most complicated case yet: at least the sequence of activities is the same from one customer request to another. It gets worse when the sequence is unpredictable: geological research at a construction site, a law firm's actions in court, a patient in the hospital emergency room, etc.

Getting back to the topic, note that all these issues result from the separation of labor: a medieval guild master never ran into anything like this.

On the positive side, humanity has raised nominal productivity manyfold thanks to the division of labor. Yet it also brings fuzzy performance measurement, blurred responsibility and poor coordination, all of which decrease overall performance. The larger the organization, the larger these negative side effects. The effect is non-linear, so both absolute and relative losses grow. There is a certain scale limit beyond which the benefits of the separation of labor are negated by the increasing losses, so net productivity stops growing and starts to decrease.
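One simple way to see why the losses grow non-linearly – an illustration of ours, not a figure from the post – is to count pairwise coordination links: every pair of employees is a potential communication channel that may need attention.

```python
def coordination_channels(headcount: int) -> int:
    # Each pair of people is a potential communication/coordination link:
    # n * (n - 1) / 2 pairs in a group of n.
    return headcount * (headcount - 1) // 2

# Headcount grows 10x, coordination links grow roughly 100x:
print(coordination_channels(10))    # 45
print(coordination_channels(100))   # 4950
```

This quadratic growth is only a crude proxy for coordination overhead, but it shows why losses can eventually outpace the linear gains from adding specialists.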

How can this limit be estimated? Let's define the metric first. Keeping in mind that manufacturing processes are easier to manage – there is no multitasking, performance measures are clear, and a single department is involved – it looks reasonable to use the "white collar" headcount as the measure of organization scale.

This limit depends on a multitude of subjective factors such as the CEO's personality, corporate culture and company age, so there is no single number. Supposedly, the line after which a company should look for ways to compensate for the growing negative effects lies somewhere between 20 and 100 "white collar" employees, with a mean of around 50.

The known remedies – the ways the negative effects of purely functional management can be handled – will be discussed in the following parts. We won't throw away the division of labor, of course; we should find a way to compensate for the negative effects while keeping the advantages.

The post How the Division of Labor Lowers Productivity appeared first on CMW Lab Blog.

Functional and Process Management: Tools Support https://www.cmwlab.com/blog/functional-and-process-management-tools-support/ https://www.cmwlab.com/blog/functional-and-process-management-tools-support/#respond Tue, 14 Mar 2023 12:27:33 +0000 https://www.cmwlab.com/blog/?p=6577 Let’s start with the functional management. First, there are standalone applications – accounting, warehouse, product lifecycle management (PLM), advanced planning & scheduling (APS), etc. targeted to specific departments. Historically, these applications have appeared first as the earliest form of management was functional management. Most organizations have a set of such applications. The IT management dislikes […]



Most organizations have a set of such applications. IT management dislikes this "applications zoo", but we must understand that it is not a disease but rather a symptom. The fragmentation of applications is just a consequence of the fragmentation of business units, which is caused by the functional-hierarchical management model.

If employees and managers of different departments cannot align their efforts effectively for the benefit of the entire company and its customers, then it is clear that they will vote for applications serving the needs of a single department. Application integration issues are also predictable, because they reflect the cross-departmental integration issues.

CIOs try to deal with this issue, but it is a fight with symptoms rather than root causes. If feudal relations prevail in the company, departmental managers will find a way to get a budget for local "optimization" of their activities and spend it on automation.

The alternative to standalone applications is integrated, or enterprise, systems – first and foremost ERP systems.

Historically, ERP systems emerged as the evolution of resource and capacity planning applications (MRP, MRP II), when financial resources (i.e. accounting) and human resources were added to material resources. The big idea was to plan all the resources of the enterprise, and that was encoded in the acronym ERP – Enterprise Resource Planning.

The ERP concept and software emerged in the early 90s. After a while, the concept gradually expanded to include customer relations (CRM), supply chain management (SCM), maintenance and repairs, etc.

In terms of technology, the progress of integrated applications was greatly facilitated by the emergence of commercial relational DBMS. The integration of corporate applications is primarily the integration of databases. Simply put, a single database: single reference entities, no gap between material and financial transactions, etc. This was a huge step forward compared to isolated applications, where each has "a truth of its own" and data stored in different applications does not match.

Yet common data is necessary but not sufficient for effective cross-functional collaboration: what is crucial is the end-to-end workflow. An attempt at this was made within ERP systems: according to best practices, an ERP implementation should include business process analysis and optimization.

Unfortunately, at that time (the early to mid 90s) the understanding of how to manage end-to-end business processes was far behind today's. This was the era of reengineering – the period of "technocratic idealism":
  1. Analyze current processes (“as-is”)
  2. Design the optimal business process (“to-be”)
  3. Develop a transition plan
  4. Implement new processes
  5. PROFIT!!!
Most ERP implementations follow this approach.

It took about 10 years, and many companies paid a high price, to find out that this approach… let's put it this way: is far from perfect. A deep contradiction was uncovered: ERP vendors treated the implementation as a one-time automation project, while business processes are by nature subject to frequent change.

This volatility is caused by many factors: changes in regulations and in the global and national business environment, ever-growing customer expectations, competitors challenging us with new technologies and practices, etc. Companies have to respond to all this, and many come to the conclusion that since processes will inevitably change, it makes more strategic sense to initiate business process changes to achieve competitive advantage rather than to be forced to change in response to challenges.

The ability to change business processes fast is a certain type of corporate culture: it requires mature process capabilities and presupposes adequate tool support. Many companies have found that their ERP system is more a liability than an asset when it comes to process management.

The vendors of current ERP systems followed the reengineering paradigm, which did not assume that business processes and supporting applications would change frequently. Of course, all ERP systems are very flexible, but it is the flexibility of concrete: while liquid, it can be cast into anything, but once it sets, only a hammer can help…

This became evident in the early 2000s, and in response to this challenge BPM – Business Process Management – emerged. The concept now has various interpretations, but we at Comindware follow the one laid out in Smith and Fingar's classic "Business Process Management: The Third Wave". This book positioned BPM as a holistic approach to managing business processes in a closed modeling–execution–analysis loop and described a new class of software: BPMS. (Smith and Fingar read the BPMS acronym as "Business Process Management System"; it has since evolved to "Business Process Management Suite".)

Any BPMS (be it SAP, Oracle or Comindware) allows you to define processes, then add a data model, user and system interfaces and business rules, and load it all into a so-called "process engine" to obtain an executable system that will assign tasks to performers (employees, a.k.a. business users), call legacy applications, execute business rules, respond to events, etc.
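The core idea – a process definition loaded into an engine that hands tasks to performers – can be sketched in miniature. The `ProcessEngine` below is entirely hypothetical and deliberately simplified (a real suite releases each task only when the previous one completes, handles rules, events and integrations, etc.):

```python
from collections import deque

class ProcessEngine:
    """Toy 'process engine': loads a process definition, then turns each
    started instance into tasks in the performers' inboxes.

    Simplification: all tasks are assigned at once, whereas a real engine
    releases the next task only after the previous one completes.
    """

    def __init__(self) -> None:
        self.definition: list[tuple[str, str]] = []
        self.inbox: dict[str, deque[str]] = {}

    def deploy(self, definition: list[tuple[str, str]]) -> None:
        # definition: ordered (task, performer) pairs - the process model.
        self.definition = definition

    def start_instance(self, subject: str) -> None:
        # Instantiate the process for one concrete subject, e.g. one order.
        for task, performer in self.definition:
            self.inbox.setdefault(performer, deque()).append(f"{task}: {subject}")

engine = ProcessEngine()
engine.deploy([("review order", "clerk"), ("approve order", "manager")])
engine.start_instance("PO-42")
```

The separation is the point: the model is data (`deploy`), execution is generic (`start_instance`), so changing the process means changing the definition, not the engine.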

The value of BPMS lies in its compliance with the modern concept of business process management: it supports a cycle of process design – execution – analysis, followed by the next cycle of re-design, execution, and so on.

BPMS systems also comply with the principles of agile development: short iterations, reliance on live user experience, a dynamic wish list of enhancements (the backlog). To enable this, BPMS puts a strong emphasis on prototyping and the graphical modeling of practically everything: processes, data, user interfaces, business rules, etc.

Thus, BPMS-based BPM provides perfect alignment of technologies, methodologies and implementation principles.

The following table compares ERP and BPM as management support tools:
  • Methodology – ERP: reengineering (as-is/to-be, a transition plan from the current to the optimal process); BPM: continuous improvement (processes change quickly in response to changing business needs).
  • Technology – ERP: DBMS (data modeling, storage and retrieval, a common database, user interfaces, queries and reports); BPM: BPMS (business process modeling, a process engine, user interfaces, performance analysis and process execution monitoring).
  • Implementation – ERP: waterfall or "big bang", a one-time project; BPM: agile – a program, i.e. a series of projects of different sizes, implementing radical transformations as well as evolutionary process improvements.
Now, what prevents ERP vendors from implementing the latest achievements of business process management in their products?

Nothing, in theory. In fact, they do: modern ERP systems contain process modeling tools and process engines. Yet ERP vendors have made little effort to promote these ideas to customers. Adding features to the software is the lesser part of the job; much of the problem lies in the minds of customers, developers and consultants. If you have been implementing a particular technology, methodology and approach to project execution (see above) for two decades, it is not easy to say "we are going to change the rules of the game!" Do you think your clients or partners will be happy? After all, the current approach is replicated in millions of documents, thousands of presentations, hundreds of training courses, project charter templates, etc. Why not leave it all as is, as long as sales aren't falling?

In any case, the fact is that advances in business process management technology did not lead to a transition of ERP systems to a new, process-based architecture, but rather to the development of a new class of software: BPMS. From an architectural point of view, BPMS differs radically from ERP and other enterprise applications because it is not an application for the end user but a platform – an environment that makes it possible to define business processes and turn them into executable applications. From this perspective, BPMS is similar to DBMS, which is also a versatile tool for creating applications for any business area.

BPMS does not aim to replace ERP or any other enterprise application. The combination of both is optimal: ERP is responsible for functional management support, while BPMS supports process management. In the previous article we noted that process management complements functional management by compensating for its drawbacks. The same can be seen at the software level: BPMS is not a substitute but a natural add-on to ERP and other applications.

This separation – ERP supporting functions, BPMS supporting end-to-end, cross-functional business processes – is beneficial for both sides.

The practice of ERP implementation projects shows that they are predictable and not overly expensive as long as they deal with business functions. However, the higher-level the business processes being implemented, and the greater the scope and coverage of the project (i.e. the more functional areas involved), the more challenging the project. The volatility of business processes doesn't match the waterfall implementation methodology: it's practically impossible to automate a business process as a one-time project, because the process – and therefore the software requirements – changes faster than consultants and programmers can implement it at a reasonable cost.

Installation, system configuration and the initial data load are a relatively easy and predictable part of ERP implementation projects. It gets much harder when end-to-end processes are in scope. In that case the implementation team must take care of communication and collaboration across business unit boundaries, and of aligning the workflow within business units with the ultimate process goals and the overall business strategy. This area is mined with conflicting interests and politics. At the end of the day, it's not about replacing separate applications with an integrated suite – it's about reshaping the organization's culture and changing its values.

It can't be done in a "command and control" style. On the other hand, it cannot last too long either, because an ERP implementation project has a schedule and a deadline. The paces of business function automation and business process automation differ too much to combine these activities in a single project and use a single tool for both. The best way is to remove end-to-end business process support from the ERP scope and focus on business function support. This is a clear and predictable job, and it must be done anyway, since functional management is the foundation.

BPMS, on the other hand, is ideal for implementing process management, but too costly for implementing standard business functions. As mentioned above, BPMS is not off-the-shelf software; it implies custom design and development, which is a priori more expensive. For core business processes there is no real alternative, because they are always company-specific; for business functions, standardization and ERP-based automation are justified from the methodological, technical and economic standpoints.

The post Functional and Process Management: Tools Support appeared first on CMW Lab Blog.

The Unified Collaborative Work Environment https://www.cmwlab.com/blog/the-unified-collaborative-work-environment/ https://www.cmwlab.com/blog/the-unified-collaborative-work-environment/#respond Tue, 14 Mar 2023 12:23:57 +0000 https://www.cmwlab.com/blog/?p=6572 In the previous article we divided the collaborative work continuum into projects, processes, cases, document-oriented workflows and issues. We also noted that it was made for analysis purposes only; in reality, they are interrelated. As an illustration, the PMBOK (Process Management Body of Knowledge) talks about processes more than about projects; similarly, the big part […]

In the previous article we divided the collaborative work continuum into projects, processes, cases, document-oriented workflows and issues.

We also noted that this was done for analysis purposes only; in reality, they are interrelated. As an illustration, the PMBOK (Project Management Body of Knowledge) talks about processes more than about projects; similarly, a big part of the BPM CBOK (Business Process Management Common Body of Knowledge) is devoted to process improvement and process transformation projects.

This interrelation shows itself in the following:
  1. Interoperability: one form of collaborative work initiates or calls another.
Examples: a patient's treatment at the hospital (a case) calls a series of tests and procedures (processes); a project is instantiated by the project initiation process, according to the PMBOK.

  2. Migration: the classification of collaborative work changes over time.
Attributing collaborative work to projects, processes or cases is often a matter of interpretation. A real-world example: a pharmaceutical company considered new drug development a project, until one day they came up with the idea of a standard project template. Soon after, they realized that this work would be better treated as a process.

Another common example is case-to-process migration. The popularity of case management is partly due to lower initial implementation and deployment costs compared with processes. So even if the flow of activities is fully predictable and could be managed as an end-to-end process, an organization may choose to treat it as a case to save on process analysis, design and automation costs. Case management doesn't require a process model or diagram: it starts from a given goal and an assigned performer, who then defines and completes a series of tasks, either by himself or by assigning them to others. After a while, best practices can be collected, analyzed and implemented as a process – hence the migration.
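This migration can be sketched as mining completed cases for a recurring task sequence, which then becomes a candidate process template. The task names and data below are invented purely for illustration:

```python
from collections import Counter

# Task sequences actually performed in completed cases (hypothetical data).
completed_cases = [
    ("collect documents", "legal review", "sign-off"),
    ("collect documents", "legal review", "sign-off"),
    ("collect documents", "extra audit", "legal review", "sign-off"),
]

# The most frequent sequence is a candidate process template:
# work that began as ad-hoc cases has revealed a repeatable "recipe".
template, occurrences = Counter(completed_cases).most_common(1)[0]
print(template)      # ('collect documents', 'legal review', 'sign-off')
print(occurrences)   # observed in 2 of the 3 cases
```

Real process-mining tools do far more (variant analysis, branching, timing), but the principle is the same: collected case histories are the raw material for the future process model.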

  3. Common task management: every collaboration eventually decomposes into tasks. The task content remains the same whether it comes from a project, a process or a case. The "My tasks" portal should contain all tasks assigned to a performer, wherever they come from.
  4. Common resource management: tasks coming from various collaboration channels usually require the same resources, i.e. people. A manager responsible for efficient resource utilization should see the whole picture: which processes, cases and/or projects a particular employee participates in.
 

The post The Unified Collaborative Work Environment appeared first on CMW Lab Blog.

What Is End-To-End Process https://www.cmwlab.com/blog/what-is-end-to-end-process/ https://www.cmwlab.com/blog/what-is-end-to-end-process/#respond Mon, 09 Feb 2015 14:39:03 +0000 https://www.cmwlab.com/blog/?p=3055 BPM is full of terms that are either ambiguous (which is inevitable to some extent) or taken for granted. One term that no one bothers to explain is “end-to-end process”. Some people believe it means a process spanning through the organization from one end to another. However, it’s rather the description of a cross-functional process. […]


Some people believe it means a process spanning the organization from one end to the other. However, that is rather a description of a cross-functional process.


The end-to-end characteristic of a process emphasizes another aspect: such a process comprises all the work that must be done to achieve the process goal. Therefore, end-to-end means "from the very beginning to the very end".

But what exactly are these “very beginning” and “very end”? That’s the most interesting question.

The answer depends on the perspective and context. The "end-to-end" clarification is an appeal to pay more attention to the context, and to ask yourself and the process team what the business problem is and how it maps to the process. Like in the case below.

Case #1: Energy

A company in the energy industry owns and operates heating equipment (boilers, pipelines, etc.) in thousands of small towns scattered across Siberia. For these settlements it is a key infrastructure component, given the long and severe winters.

The company requested automation of its "purchase order approval" process. This is the core process that supplies each site with necessary equipment (boilers), components (valves), materials (pipes), repair kits and related services. These items are purchased annually in preparation for the cold season.

According to the customer, the major process activities are:

  • A field engineer performs an on-site equipment inspection and submits a list of needs.
  • The needs list goes for approval to the district and regional levels and finally to headquarters. The application can be adjusted along the way, returned for revision, etc.
  • The needs are subject to aggregation: similar items are summarized at each level. The reason is that a single large order – e.g. for pipes – gets a better price from a manufacturer than many small ones.
  • A portion of the total needs can be satisfied by goods in stock, but the larger part must be purchased. Headquarters creates a set of purchase requests, splitting the total needs list by goods/services type and/or supplier. Thus the individual needs from each site are first grouped and then ungrouped.
  • A separate tender is announced and conducted for each purchase request, the best vendor is selected, and contracts are negotiated and signed.
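The grouping-then-ungrouping step at the heart of this process can be sketched as two dictionary passes. The item names, quantities and supplier categories below are invented for illustration only:

```python
from collections import defaultdict

# Site-level needs as submitted by field engineers (hypothetical data).
site_needs = [
    {"site": "A", "item": "pipe 100mm", "qty": 40},
    {"site": "B", "item": "pipe 100mm", "qty": 60},
    {"site": "A", "item": "valve", "qty": 5},
]

# Step 1 - grouping: sum similar items across all sites into one big order,
# which gets a better price from the manufacturer than many small ones.
totals: dict[str, int] = defaultdict(int)
for need in site_needs:
    totals[need["item"]] += need["qty"]

# Step 2 - ungrouping: split the totals into purchase requests by supplier
# category; a separate tender is then conducted per purchase request.
CATEGORY = {"pipe 100mm": "pipes", "valve": "fittings"}
purchase_requests: dict[str, dict[str, int]] = defaultdict(dict)
for item, qty in totals.items():
    purchase_requests[CATEGORY[item]][item] = qty
```

The hard part the sketch leaves out is exactly what makes the real process challenging: tracing each aggregated quantity back to the originating site, so that an asynchronous amendment or cancellation can be propagated through already-grouped requests.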


Processing thousands of initial applications, with all the grouping and ungrouping, and conducting dozens of tenders is a challenge in itself, especially if the only tools available are Excel and e-mail. Besides, a requester should be able to asynchronously amend or cancel a request already being processed – a requirement that complicates the process considerably.

Business process analysis and automation with the help of a BPMS looks well justified. The business case is obvious: a lot of money is at stake, and losses are large if the process fails. It's also worth mentioning that it isn't only about finances: just imagine a small township in a sub-arctic climate not properly prepared for the winter.

OK, here is the case – should we build a project team and a project plan and go for it? Or should we stop for a while and rethink the business problem?

Please leave your thoughts and suggestions below. (The case will be continued in a week.)

Anatoly Belaychuk has over 20 years of professional and managerial experience in the software and consulting industry. He is an acknowledged BPM (Business Process Management) expert, writer, keynote speaker at BPM conferences, blogger and trainer. He currently serves as BPM Evangelist at Comindware.

The post What Is End-To-End Process appeared first on CMW Lab Blog.
