Lessons from a conversation on simulation
Recently, I had a fascinating conversation with Fourier-E, a company that, among other things, builds simulations using FlexSim, about how organizations actually use simulation.
What started as a first introduction quickly turned into something broader: how we think about learning, complexity, and value in operations design.
Below are the reflections that stayed with me. They tell a story about common pitfalls in building simulations and how to use a simulation as a real decision-support tool.
Starting Big, Then Zooming In
Many companies begin their simulation journey with ambitious goals. They want a full-scale digital twin: the entire warehouse, the full operation, every process, every forklift.
Yet after a few months, they often realize what they really wanted was something smaller and sharper: to test a few bottlenecks, validate layout alternatives, or compare resource strategies.
It raises an interesting question: should those have been the starting point all along?
In practice, probably not. Many clients aren’t ready to think small at first; they need to “see the whole picture” before they can appreciate what simulation can do. The evolution can therefore run from a full digital twin to a targeted decision model.
The Agile Alternative
Building an entire digital twin upfront is costly and the opposite of Agile.
As a consultant, I see a lot of untapped potential for simulation in SMEs because of this approach. Many businesses could profit from a small, well-defined simulation model, but they assume they must build a huge model first, which creates a very high entry barrier. It’s going to be costly and time-consuming, right? With an agile approach, it doesn’t have to be.
A more iterative approach starts with a single area, tests assumptions, refines logic, and then expands if needed.
The key questions become:
- What really needs full detail now, and what can evolve later?
- How do we demonstrate early value before asking for more investment?
Stakeholders often want to see the “final product” early, but that demand itself can raise barriers. Ironically, insisting on perfection early can delay the very insights that make simulation valuable.
The Right Tool for the Job
A simulation is not a shortcut that lets you skip the “boring” parts. It builds on them. Defining the process, collecting data, and understanding variability are what give the model its backbone.
What a simulation adds is the ability to capture how processes truly interact: not in isolation, but as a living system. It shows the ripple effects of timing, dependencies, and shared resources.
In other words, simulation connects logic to reality. It accounts for process variation, timing mismatches, and the very real limitations of space, motion, and human behavior. That’s where its value lies: not in replacing analytical tools, but in showing what happens when everything runs at once.
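To make that tangible, here is a minimal sketch, not taken from any client model: a single workstation where pallets arrive on average every 10 minutes and take on average 8 minutes to process. The names and numbers are illustrative assumptions; the point is that variability alone creates a queue even when capacity looks sufficient on paper.

```python
import random

random.seed(42)

def avg_wait(n_jobs, mean_interarrival, mean_service):
    """Average waiting time at a single-server station, using the
    Lindley recursion: wait[i] = max(0, wait[i-1] + service[i-1] - gap[i])."""
    wait = total_wait = prev_service = 0.0
    for _ in range(n_jobs):
        gap = random.expovariate(1.0 / mean_interarrival)   # time since previous arrival
        wait = max(0.0, wait + prev_service - gap)          # this job's wait in queue
        total_wait += wait
        prev_service = random.expovariate(1.0 / mean_service)
    return total_wait / n_jobs

# Arrivals every ~10 minutes, processing ~8 minutes: utilization is only
# 80%, yet the average wait ends up around half an hour, purely from
# timing mismatches between processes that each look fine on their own.
print(f"average wait: {avg_wait(100_000, 10.0, 8.0):.1f} minutes")
```

A handful of lines like this won’t replace a proper model, but it illustrates why averages and spreadsheets miss what happens when everything runs at once.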
The Power (and Price) of Visualization
One of the most debated topics in simulation is visual detail.
High-fidelity graphics (people, branding, forklifts, or even food ingredients) look impressive and help non-technical stakeholders connect to the model.
It’s easier for an operator or manager to spot errors when they can literally see their process. Therefore, visuals can add real value to a model. But there is also a trade-off: beautiful visuals take time and money.
Budget approval often depends less on analytical merit and more on how visually convincing the model looks.
Decision-makers who aren’t familiar with simulation may struggle to grasp its technical value, but they do respond to what looks real.
As a result, simulation becomes a two-in-one tool: part analysis, part marketing.
It’s a tricky balance. On one hand, visual realism gets attention and funding. On the other, it risks overshadowing the real question: Is the model actually driving better decisions?
Modeling Human Behavior (or Not)
Real human behavior doesn’t follow neat logic.
The extent to which a waiter pays attention to their customers’ needs can produce entirely different outcomes in a restaurant simulation. However, it’s neither practical nor useful to model that level of psychology.
Simulation should capture systemic logic, not individual mood swings.
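As a purely illustrative sketch (the distribution and its parameters below are assumptions, not measurements): attentiveness can be folded into a model as a response-delay distribution fitted to observed data, rather than as a simulated state of mind.

```python
import random

random.seed(7)

# Illustrative only: "attentiveness" enters the model as a distribution on
# how long it takes before a table's request is noticed, not as psychology.
def response_delay_minutes():
    # Assumed shape: most requests are noticed within a couple of minutes,
    # with a long tail for the unlucky tables.
    return random.lognormvariate(0.5, 0.8)

samples = sorted(response_delay_minutes() for _ in range(10_000))
print(f"median delay: {samples[len(samples) // 2]:.1f} min")
print(f"95th percentile delay: {samples[int(len(samples) * 0.95)]:.1f} min")
```

The systemic question the model should answer is how often those long delays ripple into the rest of the service, not what any individual waiter was thinking at the time.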
When the Model Breaks, Pay Attention
Every model has its limitations, so when something unexpected happens, it cannot always accommodate that scenario.
The restaurant model, for example, couldn’t handle a party of eight because it wasn’t designed for that. It’s tempting to dismiss this as a shortcoming of the model and to expand the simulation to cover the case. But often, the limitation itself is the insight: if your standard process can’t handle exceptions, maybe your real-life system can’t either.
Pause and discuss: does the process need to change? Do we need to add this to the model? Or are we dealing with an outlier that we shouldn’t build our entire process around, or spend too much time simulating?
A good model doesn’t just run smoothly, it reveals where your process doesn’t.
The Expert’s Trap
Simulation experts love complexity. It’s fun to build, fun to show off, and adding complexity pushes experts to new limits of their craftsmanship.
But the risk is overbuilding: creating something technically beautiful that adds no real decision value.
True consulting discipline means questioning every feature. What value does it add?
The best model is the one that is most useful, not the one that is most complete.
Sitting on Both Sides of the Table
Having been both a simulation buyer and a builder myself, I recognize how hard it is for decision-makers to know what simulation can (and can’t) do.
My advice? Get your hands dirty. Try a free demo version, move a few objects around, run a simple process.
That small bit of practical experience makes for much better collaboration.
Once you understand the basics, the conversation with consultants shifts from “show me something impressive” to “help me learn something valuable.”
Summary
If I had to summarize our discussion, it would come down to this:
- Start small and iterate. Modular models lower barriers and accelerate learning.
- Use visuals wisely. They build buy-in and help with understanding, but beauty is not insight.
- Model logic, not moods. Focus on flow, not personalities.
- Welcome the glitches. They often reveal the truth.
- Challenge complexity. Every detail must earn its place.
- Educate everyone. Shared understanding is half the battle.
- Remember the dual purpose. Simulation is both a decision tool and a communication tool.
Final thought
Simulation isn’t just about predicting outcomes, it’s about creating understanding.
When used well, it’s a bridge between data and decision, between what we imagine and what actually happens.
And like every good bridge, it’s built one solid piece at a time.
Your own simulation
Interested in applying the power of simulation to your own business needs? Take a look at the Simulation services FlowFast offers.
