The VSM Tooling Landscape

Our previous post described Value Stream Management (VSM) as an economically principled, evidence-based approach to improving information flow and workflow across the systems where people develop software products.

In the final post in our series, we will examine the role of data and measurement tools in implementing an effective VSM program.

The Role of Data in VSM

Peter Hines and colleagues originally introduced VSM as “a strategic and operational approach to the data capture, analysis, planning, and implementation of effective change within the core cross-functional or cross-company processes required to achieve a truly lean enterprise.” [1]

This definition underscores VSM’s emphasis on cross-functional collaboration to improve customer experiences by optimizing the flow of value through various organizational processes using data and evidence-based techniques. 

However, in most large companies, the entrenched culture of functional silos, combined with the deep technical specialization needed to deliver valuable customer experiences, presents a formidable barrier to achieving the seamless flow that VSM advocates.

These silos hinder effective communication and collaboration, making it challenging to create a system that can swiftly respond to customer needs and market demands. 

To make matters worse, much of the collaboration in work systems today is mediated through tools, a trend that has only accelerated with the transition to remote and hybrid work across globally distributed teams. 

These tools are not merely facilitators but the backbone of modern teamwork, especially in the virtual settings that are now the norm, where direct, face-to-face interaction is limited.

To build a picture of how a “system of work” functions as VSM requires, we need to analyze and interpret a large corpus of data from the systems used to collaborate and align this with the lived experiences of the people on the ground doing the work.

  • For people whose job is to build the organizational wiring to create the flow of value, this is a critical competency to develop, augmenting the management techniques honed via face-to-face interactions with teams. 
  • For people doing the work, having a holistic view of the larger context in which their work is happening is likewise critical when collaborating in an asynchronous, distributed environment. 

Sophisticated analytics and tooling are essential to achieving excellence in both areas; hence, data and tooling are a core part of VSM in software product development. 

The Role of Tools

In their recent book, “Wiring the Winning Organization: Liberating Our Collective Greatness through Slowification, Simplification, and Amplification” [2], Gene Kim and Steven Spear call out how critical tools are at the three different layers of work in companies:

  • Layer 1, focused on the direct interaction with technical artifacts in knowledge work;
  • Layer 2, focused on the creation and maintenance of the tools and instrumentation used to work on these artifacts; and
  • Layer 3, focused on improving the socio-technical wiring of the system.


Figure 1: The Three Layers of Organizational Work

Gartner Research [3] classified the software development tooling landscape into DevOps Platforms, Engineering Intelligence Platforms, and Value Stream Management Platforms. This categorization aligns well with Kim and Spear's layers, so let's use it as a basis for thinking about the tooling landscape for VSM.

Figure 2: DevOps Tooling Landscape from Gartner Research

Operating mainly at Layers 1 and 2, DevOps Platforms automate and integrate processes to improve individual productivity when working at the level of the artifact. These include well-known platforms like GitHub, GitLab, and Atlassian.

Engineering Intelligence Platforms, including products like Allstacks, operate at Layers 2 and 3. They provide engineering managers and executives with insights to optimize development team performance, with deep analytics capabilities for analyzing work, focused primarily on technical software delivery.

Value Stream Management Platforms provide visibility and connectivity to data sources that form touchpoints throughout the value stream. By surfacing data on work done all along the value stream, from ideation to delivery and even further downstream into operations, customer support, and sales, these platforms help analyze and manage the end-to-end flow of work, identify bottlenecks, and support cross-functional collaboration company-wide.

While these tools have many overlapping capabilities, the critical thing they all share is that they collect, visualize, and analyze how teams work using data derived from the system logs of the tools that mediate knowledge work.
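To make this log-derived analysis concrete, here is a minimal sketch, with invented work items and timestamps, of the kind of computation such platforms perform: deriving cycle time per work item from timestamped status-change events as a tool's system log might record them. The event schema and data are assumptions for illustration, not any vendor's actual format.

```python
from datetime import datetime

# Hypothetical status-change events: (work_item_id, new_status, timestamp).
# Invented data for illustration only.
events = [
    ("ITEM-1", "in_progress", "2024-03-01T09:00"),
    ("ITEM-1", "done",        "2024-03-04T17:00"),
    ("ITEM-2", "in_progress", "2024-03-02T10:00"),
    ("ITEM-2", "done",        "2024-03-03T12:00"),
]

def cycle_times(events):
    """Elapsed days from the first 'in_progress' event to 'done', per item."""
    started, finished = {}, {}
    for item, status, ts in events:
        t = datetime.fromisoformat(ts)
        if status == "in_progress":
            started.setdefault(item, t)  # keep the earliest start
        elif status == "done":
            finished[item] = t
    return {item: (finished[item] - started[item]).total_seconds() / 86400
            for item in finished if item in started}

print(cycle_times(events))  # ITEM-1 ≈ 3.33 days, ITEM-2 ≈ 1.08 days
```

Real platforms do far more (handling reopened items, queues, and stage-by-stage timings), but the raw material is the same: timestamped events from the tools that mediate the work.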

These are necessary components of a VSM data management program, and it is reasonable to expect that these capabilities will consolidate under a single category of tools as the market matures.  

But does this current landscape sufficiently address everything one might need to manage a VSM program? 

Aside from topics like the economic framework, strategy, and the other elements of VSM's overall scope discussed in our last post, our thesis at Exathink Research is that we are still very early in the maturation of this space, even when it comes to the tooling and data used to analyze and improve the flow of work.

To understand why, we need to zoom out more and examine the bigger picture of VSM's challenges in the digital domain. 

Navigating Complexity: The Essential Challenge in Digital VSM

A key unrecognized element in digital VSM is that these value streams are complex adaptive systems (CAS). 

In these value streams, human dynamics and technical architecture significantly influence the system's behavior, and value creation involves continuously changing the system that creates value.

Contemporary VSM approaches, such as Value Stream Mapping and Flow Analysis, are rooted in Lean manufacturing practices and tend to embrace a static and mechanistic perspective of the “system.”

Decision-making is largely centralized and based mainly on lagging data derived from system logs.

While these techniques are vital to VSM, this static worldview ignores the subtleties and complexities inherent in knowledge work. 

It's crucial to enrich these traditional methods with insights that capture the tapestry of human interaction, creativity, and innovation that characterizes knowledge work and explicitly model the impact of technical architecture on the flow of work in digital value streams. 

The work of Nigel Thurlow and colleagues is an excellent example of incorporating these social aspects into reasoning about flow. Building upon Dave Snowden’s work on complexity, combined with a deep understanding of the Lean operations principles that underpin VSM, The Flow System [4] is an analytical approach that addresses this gap head-on.

In contrast to models like Mik Kersten’s Flow Framework [5], the canonical value stream management framework in use today, The Flow System starts by recognizing that digital value streams are complex adaptive systems, where the interactions between individuals and the collective behavior of teams cannot be fully predicted or controlled through process optimization alone.

Thurlow and colleagues emphasize the need for Complexity Thinking, one of the core pillars of this system, along with Distributed Leadership and Team Science, to better understand and manage these dynamic interactions.

This work is also measurement-intensive, but its techniques are qualitative, deriving from those more commonly used in the social sciences and psychology. To fully understand all the factors contributing to the flow of work in socio-technical systems, we must incorporate these analysis techniques into reasoning about software product value streams.

Complexity thinking also requires us to view socio-technical systems holistically. It recommends reasoning about systems using explicit models that represent the interactions and relationships between the system’s parts, and analyzing the system’s behavior as a whole using techniques like direct real-time observation, social network analysis, stochastic modeling, and simulation, rather than simply reporting and tracking lagging, high-level system metrics on dashboards.
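As one small illustration of the social network analysis mentioned above, here is a sketch, using invented names and data, that builds a collaboration graph from the groups of people who touched shared work artifacts (say, pull requests, as mined from system logs) and computes a crude degree centrality. Everything here is an assumption for illustration; real analyses would weight edges, window them in time, and use richer centrality measures.

```python
from collections import Counter
from itertools import combinations

# Hypothetical collaboration data: the set of people who touched each
# pull request, as might be mined from system logs. Invented for illustration.
pull_requests = [
    {"ana", "bo", "chen"},
    {"ana", "chen"},
    {"ana", "chen", "dev"},
]

def collaboration_edges(groups):
    """Count how often each pair of people co-occurs on a work artifact."""
    edges = Counter()
    for group in groups:
        for pair in combinations(sorted(group), 2):
            edges[pair] += 1
    return edges

def degree_centrality(edges):
    """Number of distinct collaborators per person -- a crude centrality."""
    degree = Counter()
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    return degree

edges = collaboration_edges(pull_requests)
print(degree_centrality(edges).most_common())
```

Even this toy model surfaces structure a dashboard metric hides: who the connectors are, and which collaborations never happen at all.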

We now have the technology to perform much of this reasoning in the digital domain using real-time data from system logs. We expect new generations of powerful log-based tools to emerge for visualizing and analyzing the behavior of complex socio-technical systems in real-time.
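To make the stochastic modeling and simulation idea concrete, here is a toy Monte Carlo sketch of a value stream. The stage names and duration distributions are purely invented assumptions; the point is only to show how simulation over a model of the stream, rather than a single lagging average, reveals where lead time accumulates.

```python
import random

random.seed(42)

# A toy stochastic model of a value stream: each work item passes through
# stages whose durations (in days) are drawn from invented distributions.
stages = {
    "ideation":    lambda: random.expovariate(1 / 2.0),   # mean 2 days
    "development": lambda: random.expovariate(1 / 5.0),   # mean 5 days
    "review":      lambda: random.expovariate(1 / 1.0),   # mean 1 day
    "deploy":      lambda: random.expovariate(1 / 0.5),   # mean 0.5 days
}

def simulate(n_items=10_000):
    """Monte Carlo estimate of average time spent per stage."""
    totals = {stage: 0.0 for stage in stages}
    for _ in range(n_items):
        for stage, draw in stages.items():
            totals[stage] += draw()
    return {stage: total / n_items for stage, total in totals.items()}

averages = simulate()
for stage, avg in averages.items():
    print(f"{stage:12s} {avg:5.2f} days on average")
```

A realistic model would add queues, shared resources, and feedback loops; even so, a sketch like this shows how log-calibrated simulation can answer "what if" questions that a static map of the stream cannot.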

These next-generation tools, augmented by AI, will be decentralized and used by front-line workers and executives alike, based on a globally visible, distributed, real-time model of a company's operations.

We need to train new generations of leaders on the underlying principles behind these tools so they can use them to operate effectively in a complex environment. 

In the long run, such an approach will be critical to solving VSM problems on the ground. What we have built today are the building blocks of a comprehensive VSM solution that practitioners are only starting to flesh out. 

Conclusion

In wrapping up our series on Value Stream Management, our thesis is that it is essential to build upon the strength of current VSM practices to address system complexity and the human dynamics in our systems of work. 

We don't mean to diminish the value of existing methodologies or technologies but rather highlight areas for expansion, deeper exploration, and improved data and tool use for this purpose. 

At Exathink Research, we are actively engaged in R&D work connecting these streams of thinking. We believe this will be critical in effectively developing the remaining VSM technology pillars, like analyzing the behavior of value stream networks and linking these to economic models and strategy as core parts of VSM.

To summarize, VSM will continue to be a tremendously exciting and fertile area for innovation and progress in organizational improvement in the coming years.

Acknowledgment: I want to thank the Allstacks team, Emily Luehrs, Parker Ennis, and Adam Dahlgren, for inviting me to write down my thoughts on Value Stream Management for this series on their blog.

I plan to continue writing about this topic and the R&D work we are doing at Exathink Research at The Polaris Flow Dispatch. If you are interested in periodic updates, please subscribe.

References

[1] Value Stream Management, Peter Hines, et al. 

[2] Wiring the Winning Organization: Liberating Our Collective Greatness through Slowification, Simplification, and Amplification,   Gene Kim and Steven Spear, IT Revolution.

[3] Market Guide for Value Stream Management Platforms, Hassan Ennaciri, et al., Gartner Research. 

[4] The Flow System: The Evolution of Agile and Lean Thinking in an Age of Complexity, John Turner, Nigel Thurlow, and Brian Rivera. 

[5] Project to Product: How to Survive and Thrive in the Age of Digital Disruption with The Flow Framework, Mik Kersten. 
