Let me share a little more about the things I do as part of my job of “leading all kinds of testing activities”. I am currently the (test) track lead for a major €100 million program. Some of the things I’m currently looking at are:
- Stakeholder reporting on the progress of integration testing towards third parties
- A test report for verification of security controls, resilience and service levels
- A process of evaluating program milestones, deliveries and obligations
The lessons rhyme with what I have previously seen about test cases, defects and user stories. Perhaps it rings a tune for you?
Less data about what, more about direction
We have around 30 integrations (across four environments) to go through, and over time we have identified a seven-state progress model:
- Not started
- Scope agreed
- 3rd party meeting planned
- 3rd party alignment done
- Basic connectivity verified
- System integration testing booked
- Test complete

Which on the surface is fine. At any point in time we can now see what the status is on each of the environments. We can count how many integrations are in each state. But the number of countable things does not tell us where we are going – only where we are at this point in time. What management is really asking for, when they see this data point, is the consequences – not the facts as such. It is what it is. It’s back to the old game of “When are you done testing?” and “How would it look if I shipped this now?”. As test lead I need to connect the dots from the data points to the expected business outcomes. The better story builds upon the data points and evaluates whether we are on the right path. Are we on track? What are the blockers? What priorities need to be set?
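One way to turn the status counts into a statement about direction is to diff two snapshots of the progress model. The sketch below is a hypothetical illustration – the state names are the seven from the list above, but the integration names and snapshot data are invented:

```python
# Hypothetical sketch: the states match the seven-state progress model,
# but the integrations and snapshots below are invented for illustration.

STATES = [
    "Not started",
    "Scope agreed",
    "3rd party meeting planned",
    "3rd party alignment done",
    "Basic connectivity verified",
    "System integration testing booked",
    "Test complete",
]
RANK = {s: i for i, s in enumerate(STATES)}

def direction(previous: dict, current: dict) -> dict:
    """Compare two snapshots {integration: state} and report which
    integrations advanced, stalled, or regressed since last time."""
    report = {"advanced": [], "stalled": [], "regressed": []}
    for name, state in current.items():
        delta = RANK[state] - RANK[previous[name]]
        if delta > 0:
            report["advanced"].append(name)
        elif delta == 0:
            report["stalled"].append(name)
        else:
            report["regressed"].append(name)
    return report

# Two invented weekly snapshots of the same environment:
last_week = {"ERP feed": "Scope agreed", "Payments API": "Basic connectivity verified"}
this_week = {"ERP feed": "3rd party alignment done", "Payments API": "Basic connectivity verified"}
print(direction(last_week, this_week))
# {'advanced': ['ERP feed'], 'stalled': ['Payments API'], 'regressed': []}
```

The counts per state answer “where are we?”; the diff between snapshots answers “are we moving, and where?” – which is the question management is actually asking.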
What we do, tells about who we are
For the test report I’m writing I’m looking into a 7-step review process. …yeah.
- drafted
- to be internally reviewed
- ready for review
- stakeholder review complete
- stakeholder review comments implemented
- ready for approval
- approved
The first issue I have with this is the mixing of terms and verbs. Obviously the above is not the actual list – but the problem holds for many progress states I see. Perhaps it’s my old formal methods training kicking in… A state is supposed to be a discrete setting of the system, affected by interactions. A state is something finite and concluded. I recognize the review states as a tool to model hand-offs and wait states. In my time I have worked with very elaborate defect states – trust me, I’ve been there.
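The formal-methods view of the review list can be sketched as a finite state machine: each state is a discrete, concluded setting, and the hand-offs are explicit transitions. This is a hypothetical model, not the actual workflow – the transition set (including the rework loop) is assumed for illustration:

```python
# Hypothetical sketch: the 7-step review list modelled as a finite state
# machine. The allowed transitions (and the rework loop) are assumptions.

TRANSITIONS = {
    "drafted": {"to be internally reviewed"},
    "to be internally reviewed": {"ready for review"},
    "ready for review": {"stakeholder review complete"},
    "stakeholder review complete": {"stakeholder review comments implemented"},
    "stakeholder review comments implemented": {"ready for approval"},
    "ready for approval": {"approved", "ready for review"},  # assumed rework loop
    "approved": set(),  # terminal state
}

def advance(state: str, next_state: str) -> str:
    """Move to next_state if the hand-off is allowed, else fail loudly."""
    if next_state not in TRANSITIONS[state]:
        raise ValueError(f"illegal hand-off: {state!r} -> {next_state!r}")
    return next_state

state = "drafted"
for step in ("to be internally reviewed", "ready for review"):
    state = advance(state, step)
print(state)  # ready for review
```

Writing the transitions down makes the smell visible: every wait state and hand-off you have to encode is a place where collaboration has been replaced by a queue.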
The smell I get from it, though, is a story about interaction and culture: the level – or lack – of collaboration and interaction between the persons involved. Systems theory tells us that we get the outcomes our frameworks are designed to produce. When things go haywire and review states explode – the damage is already done.
A little less action, a little more interaction please
For the program milestones, we have around 30 milestones, each with around 10 delivery items to confirm and approve. It falls to 4-5 people to work through the details, supported by the whole delivery team. The seemingly obvious first step would be yet another 7-step process. LOL – right. Perhaps there is a better path by now.
The book “The Staff Engineer’s Path” ( https://www.noidea.dog/staff ) mentions the “Nemawashi” approach from the Toyota Production System. I recognize it from “Continuous Test Plan Alignment” ( https://www.o2sn.dk/2023/01/13/continuous-customer-feedback/ ) and “Intentionally Loose Test Planning” ( https://www.o2sn.dk/2026/01/25/intentionally-loose-test-planning/ ). Some would even call it agile or similar “small batch” delivery. Whatever the name – I have been more successful in my customer interactions with this approach. Rather that than crunching numbers on the inside, tossing them over the fence and requesting yet another project extension.
The Toyota trick is in sharing information early and often. This helps lay the foundations, so that by the time a decision is required, there is already a consensus. While the formal evaluation of all the milestones is far off on the horizon, we can start now. Interactively and collaboratively. More interaction – less action over the fence. To butcher Elvis. Again.
Where are you heading in your stakeholder interaction?

One response to “Not where we are – but where we’re going”
Mentioned on the Ministry of Testing Observatory, Feb 18 2026,
https://www.ministryoftesting.com/observatory