Process Observability: Three Highlights from a Recent Study
2023-12-04 | Author: blogs.sap.com

To navigate a changing world, organizations need to have a good grip on their operations. This pertains to both understanding the current state of operational processes and enabling the enactment of change in a fast and controlled manner. The corresponding assessment of the current and future state of operations must hence be conducted from a holistic perspective, fusing data and knowledge from a wide range of systems and stakeholders affecting a particular process. SAP Signavio has coined the term Process Observability to tackle this challenge from first principles. Process observability describes the extent to which an organization can observe its processes in a complete and correct way and make these observations accessible to relevant stakeholders to achieve organizational objectives. Achieving high process observability typically requires a broad perspective on business process intelligence. For example, in a hiring process, many crucial aspects are cultural and ingrained in informal networks. Hence, mining an event log from the system that underlies the formal hiring process may be insufficient for increasing the fit of hires to the organization. 

A recent study by the analyst firm IDC sheds more light on process observability, based on interviews with executives from hundreds of organizations. In this blog post, we discuss three particularly intricate, yet intriguing findings and how these findings relate to our long-term business process intelligence vision. All three findings are statistically significant associations between reported results that IDC’s data science experts have elicited. While correlation is not causation, we can see how these findings confirm intuitively fundamental principles of Business Process Management (BPM) in the big data and AI era.

According to the IDC study, the organizations that successfully find and tackle the root causes of business problems with business process intelligence are more likely to assemble BPM teams across functions based on what is best for the current objective — for example, in contrast to organizations that primarily rely on a center of excellence driving process management. This insight intuitively reflects SAP Signavio’s long-running BPM for everyone principle, stipulating that a process management culture must be established throughout the organization to effect meaningful change. In the past, the antipole to BPM for everyone was the ivory tower business process center of excellence, which mapped the processes of entire organizations, often without much effect: organizational buy-in did not exist, and the broader organization is typically culturally unwilling to have process change shoved down its throat by central staff functions.

In our current data-driven world, the same rings true for quantitative analyses: technologically mature organizations can create many dashboards with sophisticated analyses. But these analyses must be placed in the right context by experts and stakeholders, and tied to specific actions, to effect meaningful change. Here, trust in the analyzed metrics becomes crucial, for example to avoid process participants feeling incentivized to game Process Performance Indicators (PPIs), to the long-term detriment of the organization.
In the age of Artificial Intelligence (AI), this problem can be expected to exacerbate further. It becomes even more important to nurture a culture of trust, so that human-AI collaboration is not limited to a few selected experts but considers the operational reality, as well as the experience of the rank-and-file process participants whose work is affected the most and who tend to know the intricacies of their work best, in particular when it comes to aspects that are not fully captured in database tables and event logs.

In the IDC survey, organizations whose data is dispersed across a variety of source systems report worse business results. Considering that data management for business process analysis approaches such as process mining is a well-known challenge, this result is somewhat unsurprising, yet crucial to highlight. The trade-off between using a variety of execution systems (for example, following a best-of-breed approach) and deploying overarching solutions that cover a process end-to-end with one system is not only of crucial relevance when one or several new systems are introduced: it also has a severe impact on the continuous ability to transform and excel. Here, it is crucial to ensure that highly customized processes that aim to distinguish an organization from its competitors are actually understood, and that technology support is mature enough to allow for integrated data-driven analysis.

The organizations surveyed by IDC that reported fusing several data sources for business process analysis are statistically more likely to have obtained (self-reportedly) good business results. Here, we must highlight the contrast to the previous point:

  • Analyzing process data across source systems is challenging and hence may lead to less impact and worse results, all else being equal.
  • If data across source systems is competently fused, more meaningful and actionable insights can be achieved, thus facilitating better business results. 
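The second point — competent fusion across source systems — starts with a mundane but essential step: merging events from different systems into one log, ordered by timestamp within each case. The sketch below illustrates this with hypothetical CRM and ERP exports; the field names, activities, and `fuse_event_logs` helper are illustrative assumptions, not part of the study or any SAP Signavio API.

```python
from datetime import datetime

# Hypothetical event records exported from two source systems
# (field names and values are illustrative only).
crm_events = [
    {"case_id": "C-1", "activity": "Create Opportunity", "ts": "2023-01-02T09:00"},
    {"case_id": "C-1", "activity": "Send Quote", "ts": "2023-01-03T11:30"},
]
erp_events = [
    {"case_id": "C-1", "activity": "Create Sales Order", "ts": "2023-01-04T08:15"},
]

def fuse_event_logs(*logs):
    """Merge events from several systems into one log, ordered by
    timestamp within each case -- the minimal prerequisite for any
    end-to-end, cross-system process analysis."""
    merged = [event for log in logs for event in log]
    merged.sort(key=lambda e: (e["case_id"], datetime.fromisoformat(e["ts"])))
    return merged

fused = fuse_event_logs(crm_events, erp_events)
trace = [e["activity"] for e in fused]
# trace: ["Create Opportunity", "Send Quote", "Create Sales Order"]
```

In practice this step also involves harmonizing case identifiers and activity names across systems, which is where most of the real-world effort lies.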

This insight supports the general idea of process observability, which mandates not only the integration of different data sources, but, beyond that, the fusion of data, knowledge, and experience: 

  • Obviously, fusing data from different source systems is crucial to ensure a holistic picture of how a process runs across systems.
  • Utilizing knowledge about a process allows organizations to draw more meaningful conclusions from data and to sanity-check data with respect to compliance to constraints that logically must hold in a given environment. For example, in a logistics scenario, items cannot be first shipped and then packed. 
  • Augmenting data and knowledge with experience information helps us check to what extent the hard facts align with the subjective reality of a process. For example, the operational PPIs of an internal support process may be good, but a deep dive into qualitative feedback may unearth issues, such as escalations to 2nd- or 3rd-level support that come too late in intricate cases.
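The second bullet — sanity-checking data against domain knowledge — can be sketched as a simple ordering check over a trace. The logistics rule from the example (items cannot be shipped before they are packed) is encoded below; the `violates_order` helper and the activity names are illustrative assumptions.

```python
def violates_order(trace, before, after):
    """Return True if `after` occurs earlier in the trace than `before`,
    i.e. the log contradicts domain knowledge about required ordering."""
    seen_after = False
    for activity in trace:
        if activity == after:
            seen_after = True
        if activity == before and seen_after:
            return True
    return False

# Domain rule: an item must be packed before it is shipped.
ok_trace = ["Pick Item", "Pack Item", "Ship Item"]
bad_trace = ["Pick Item", "Ship Item", "Pack Item"]

violates_order(ok_trace, before="Pack Item", after="Ship Item")   # False
violates_order(bad_trace, before="Pack Item", after="Ship Item")  # True
```

A violation like the second trace usually signals a data-quality problem (e.g. clock skew or mis-mapped activities) rather than a genuinely impossible process, which is exactly why such knowledge-based checks belong in the analysis pipeline.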

To conclude, the IDC study confirms two of our key principles, one long-standing and the other positioned towards the future of intelligent BPM:

  • The bread & butter: BPM is a collaborative effort and hence requires a culture of collaboration. Siloed BPM efforts and teams did not work in the days of modeling-centric BPM, when Signavio – then still a startup – disrupted the market with tooling geared towards the ‘BPM for everyone’ vision. Today’s data-centric BPM approaches must still subscribe to this vision: data depends on context, and the context crucial for interpreting results and enacting relevant changes is only available to the people who live and breathe the operational reality subject to analysis and change.
  • The future of process observability: results-driven BPM requires the fusion of data and, beyond that, the fusion of data, knowledge, and experience to ensure analysis results are reliable and actionable. Accordingly, organizations must utilize a well-integrated collection of tools for fusing different analysis techniques, such as model-driven analysis, process mining, and quantitative process analysis directly on tabular enterprise system databases.
     

Download the IDC paper on Process Observability here: IDC White Paper, sponsored by SAP, Business Process Observability: A Collaborative Approach to Transformation Enablement, doc #EUR251308223, November 2023 

