Do you think big is better?
…
A batch is a collection of “stuff” which is treated as one unit with respect to a process.
Still here 🙂 Great… All right, imagine a cake. We consider that cake the batch. The size of the batch would be the amount of stuff it consists of. When baking our cake we would go through process steps like picking materials, mixing, and baking it in the oven. Now imagine we use the same amount of stuff, but instead of baking one cake we bake 10 small cakes.
One would be crazy to go for the small-batch approach, right?
Well… this is the classical way of thinking about batches. With this thinking the conclusion is clear: if we want to be efficient and fast, we should increase the batch size and push as much cake as possible into that oven every time we get the chance.
A manager with a classical mental model of batch economics might think: “Agile is trendy, colorful, nice to people and all. I can live with that, and even the sticky notes, but I draw the line at small batches. It’s clearly inefficient, expensive and time-consuming.
There must be a limit to this economically irresponsible madness. Agile might work for others, but clearly that part of Agile is not a fit for our specific context.”
The most important blind spot in classical thinking is that it assumes the cost we incur every time we run a batch through the steps is fixed. Lean calls this cost the transaction cost.
Lean teaches us that transaction costs can actually be reduced dramatically when we have a lot of small, repeatable transactions. The other blind spots are the significant economic benefits of small batches. These benefits are listed below.
In the cake example we would have to invest in automation and change how the steps in the process are done to make it feasible to bake small cakes in a single-piece flow.
We look for opportunities to reduce the transaction cost until it doesn’t really matter economically whether we choose to bake a few big cakes or many small ones. The only difference would be the speed of the assembly line as the cakes slide through the long oven. Bigger cakes need more heat (work) and thus move more slowly.
All right, enough about cakes. How does this translate to product development?
In product development our batch is a piece of functionality in the product — a product feature. The smaller we slice the features, the smaller the batch size will be. A feature we spend many hours building is a large batch.
The transaction costs are the costs we incur every time we deliver a feature, no matter how big that batch/feature is.
So we find the transaction costs by asking:
What do we have to do every time we deliver a feature, no matter the size of the feature?
Think about that for a second. What would it be in your context?
Regression tests are usually a big transaction cost in product development, but what the transaction costs are, and how they can be reduced, varies from context to context.
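To make the trade-off concrete, here is a toy model of what it costs to deliver a fixed amount of scope. It is my own sketch with invented numbers, not figures from any real project: a transaction cost paid per batch, plus a holding/delay cost that grows with batch size because finished work sits longer before it is released.

```python
# Toy model of batch-size economics. All numbers are invented for illustration;
# "hours" and "cost units" are deliberately abstract.

def total_cost(scope, batch_size, transaction_cost, holding_rate):
    """Cost of delivering `scope` hours of feature work in batches of `batch_size` hours.

    transaction_cost: fixed cost paid every time a batch goes through the pipeline
                      (e.g. a manual regression test round).
    holding_rate:     cost per hour of finished-but-unreleased work, per hour it waits.
    """
    batches = scope / batch_size
    transaction_total = batches * transaction_cost
    # Work in a batch waits, on average, half the batch duration before it ships.
    holding_total = scope * holding_rate * (batch_size / 2)
    return transaction_total + holding_total

scope = 1000  # hours of feature work to deliver in total

# Manual regression testing: every delivery costs ~40 hours of transaction cost.
print("manual,    batch=500h:", total_cost(scope, 500, transaction_cost=40, holding_rate=0.004))  # 1080
print("manual,    batch= 10h:", total_cost(scope, 10,  transaction_cost=40, holding_rate=0.004))  # 4020

# Automated regression testing: the transaction cost drops to ~1 hour.
print("automated, batch=500h:", total_cost(scope, 500, transaction_cost=1,  holding_rate=0.004))  # 1002
print("automated, batch= 10h:", total_cost(scope, 10,  transaction_cost=1,  holding_rate=0.004))  # 120
```

With an expensive manual regression step the big batches win; once the transaction cost is driven down, the many small batches become dramatically cheaper. That is exactly the shift the cake example describes.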
…
In waterfall, where we take all the functionality through each process step once, the whole project is one large batch, like in the unautomated oven bakery.
In Agile we take a small portion of the total functionality (a small batch) and run it through all the process steps: define, build, test, … To make this economically feasible, we look for opportunities to reduce the transaction costs.
Does it make economic sense to build in tiny batches if our company only has big-oven technology? It might, but probably not. We probably have to reduce the batch size gradually as we make it economically feasible, by building and improving an “assembly line” or parts of it.
More on that below, but first:
Why is all that batch size reduction worth the trouble anyway?
Small batches influence outcomes positively through:
- Reduction in management overhead costs
- Increased adaptability
- Reduction in time to market
- Increased feedback
- Reduction in Risk
- Increased efficiency
- Increased motivation
- Increased feeling of urgency
- Increased transparency
- Increased trust
- Increased predictability and planning ability
- Reduction in invisible inventory
- Leveling effect on competence demand
- Increased ability to prioritize work
I will revisit and explain the above list in a future story, so follow me if you find that interesting.
Some of the above effects are surprising to people who are used to working with large-batch processes.
Take the management overhead cost, for instance. If we do one project, we need to initiate, approve, report on and manage that project. If we do the same project scope but split the project into ten 1/10-sized projects, the total management cost must go up, right?
We would now need to run ten projects and get each initiated, approved, reported on, etc. But as the cake story showed, Lean thinking says this mental model does not tell the complete story.
Can the management cost really decrease when we split a big project into 10 small ones? How come?
Let’s look at the test manager as an example. In a typical waterfall project the test manager would manage and analyse product quality risks, manage test cases, create test plans and manage the test process.
In a fully Lean/Agile organisation, we would expect to find zero test managers.
That management overhead is 100% gone.
What is done instead is that quality is built in.
The regression tests are automated like an assembly line. The transaction cost of running a complete system regression test is ideally a push of a button at near zero cost and time.
There is no need for a test manager here, at least in the classical sense. We still need to think a lot about how we test, and some of the same competences are still needed, but now the management problem is more of a test-system engineering problem.
Running a test is now almost FREE. Which is great! Not because we save money on testing (we probably don’t, at least not as a first-order effect), but because:
Our economics are no longer influenced significantly by how big the feature/batch is.
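As a rough illustration of what “push of a button” can mean in practice, here is a minimal sketch of a single-command regression entry point. The project layout, suite names and the choice of pytest are my assumptions, not something from the article.

```python
# run_regression.py — hypothetical single-command regression run.
# Paths, suite names and the use of pytest are assumptions for illustration only.
import subprocess
import sys

SUITES = [
    ["pytest", "tests/unit", "-q"],         # fast unit tests first
    ["pytest", "tests/integration", "-q"],  # slower integration tests next
]

for suite in SUITES:
    result = subprocess.run(suite)
    if result.returncode != 0:
        sys.exit(result.returncode)  # stop the "assembly line" at the first red step

print("Full regression passed.")
```

The point is not the tooling: once the whole check is one command, running it for every small feature stops being a cost worth batching up to avoid.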
The automated test system does not come for free — far from it — just as a cake assembly line doesn’t. Lean suggests we invest in our product development infrastructure to make our problem look more like the assembly line than the single huge oven case.
This way we can enjoy all the benefits listed above and improve our total economic outcome.
Big batches should not be seen as an isolated problem in the development department.
The whole company is part of the “assembly line”, or “value stream” to use the Lean term, so when searching for opportunities to decrease transaction costs, and then batch size, we need to look at all the areas where big batches are usually found:
- Marketing
- Analysis
- Funding
- Design
- Purchasing
- Prototyping
- Testing
- Management reviews
- Tasks for specialized central resources
This blog post was written by Tomas Eilsø.
Follow Tomas Eilsø on Medium: https://medium.com/@tomas.eilsoe
Tomas Eilsø is an Agile coach, F-16 pilot and owner of several startups. He loves dealing with problems related to collaboration in complex and uncertain environments.