The Language Nobody Gave Us
At some point in every robotics HMI project, a designer opens a file that was never written for them. It might be a 60-page system architecture document. A ROS node graph. A manual dense with coordinate system notation. Whatever form it takes, the message beneath it is the same: you're on your own here. This isn't a skill gap. It's a vocabulary gap, and nobody in the design industry has solved it yet. Here's what it actually looks like.
Category
Industry Insights
Date
Reading Time
4 mins
TLDR by Avaline
Every robotics HMI design starts from a blank Figma canvas with no pattern library, no vocabulary, and no foundation built for machines. This isn't a design skill gap. It's a structural one: the design discipline has never built a shared language for robots. Osiflow HMI is building it.
A scenario that plays out more than anyone admits
Imagine a designer joins a startup building collaborative robot arms for industrial packaging. The brief? Create an interface that lets operators teach the robot new tasks, step by step, without writing code. The designer is experienced. They test the machine, then open Claude or Figma and start with what they know.
They sketch a task-manager flow, design the screens, and share the mockups. The engineers ask: "What happens between the steps?" The designer freezes. "Between the steps? That was never part of the brief. What do you mean?"
"When the robot moves from position A to position B, what path does it take? At what speed? Does it blend through the waypoint or stop completely? That changes the whole UI."
The designer had designed the task, not the transition. In their mental model, a step was discrete, like a checklist item. In the robot's physical reality, the movement between waypoints was as important as the waypoint itself. This is a documented challenge in cobot interface design. Even experienced operators struggle to bridge the gap between how they think about a task and how a robot executes it.
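To make the gap concrete, here is a minimal sketch of a task model where the transition is first-class rather than implied. This is purely illustrative; every type and field name here is a hypothetical example, not Osiflow's or any vendor's actual data model.

```typescript
// Hypothetical sketch: a teach-by-steps task where the motion BETWEEN
// waypoints carries its own semantics the UI must surface.
type BlendMode = "stop" | "blend";

interface Waypoint {
  id: string;
  position: [number, number, number]; // XYZ in the robot's base frame, mm
}

interface Transition {
  from: string;            // waypoint id
  to: string;              // waypoint id
  speedMmPerSec: number;   // commanded tool speed
  blend: BlendMode;        // stop at the waypoint, or blend through it
  blendRadiusMm?: number;  // only meaningful when blend === "blend"
}

interface Task {
  waypoints: Waypoint[];
  transitions: Transition[];
  homeWaypointId: string;  // safe home position, distinct from step 1
}

const pickAndPlace: Task = {
  waypoints: [
    { id: "home", position: [0, 0, 400] },
    { id: "pick", position: [250, 100, 50] },
    { id: "place", position: [-250, 100, 50] },
  ],
  transitions: [
    { from: "home", to: "pick", speedMmPerSec: 250, blend: "stop" },
    { from: "pick", to: "place", speedMmPerSec: 500, blend: "blend", blendRadiusMm: 25 },
  ],
  homeWaypointId: "home",
};
```

Notice that a checklist-style mental model has no place to put `speedMmPerSec`, `blend`, or `homeWaypointId` at all; that is exactly the information the engineers were asking about.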
They iterate. Rework the flow with AI. Add transitions. Run operator testing. Then an operator asks: "Which button brings the robot home?"
There is no home in the design. Not because the designer forgot. Because no one told them that every cobot has a safe home position entirely distinct from the task start position. This is in the machine manual. Not in any Figma community file. Not in any UX guideline for industrial software. They iterate again. And again.
Designing a machine interface without a shared vocabulary is like building a bridge from both ends without agreeing on the units. Each side does good work, but they won't meet in the middle.
So what exactly is the vocabulary gap?
We found three things are missing, and they compound each other.
There are no established patterns for machine states. Every designer working on robotics HMI today invents their own affordance for "fault recovery in progress" or "manual override active." There is no agreed visual language for it, the way there is for loading states, error messages, or empty states in consumer software.
The knowledge doesn't transfer. Research on collaborative human-robot interaction in manufacturing environments shows that ineffective HMI design is a leading cause of unplanned shutdowns and process failures. That knowledge exists in research papers and machine manuals. It has never been translated into design tooling.
Neither side has the translation layer. Engineers know the machine. Designers know the interface. Every conversation between them requires a translation that neither is equipped to make cleanly. UX practitioners in industrial environments have been naming this for years. The tooling has never caught up.
The result? Every company rebuilds the same 15 patterns independently. Every designer asks the same questions. Every project runs the same iteration cycles.
What founders see from the other side
Robotics founders watch this from a different angle, and it's equally costly.
You hire a strong designer. You give them access to engineers, product specs, and user research. Weeks disappear into back-and-forth that nobody can fully explain. The feedback arrives in UAT, when an operator on the floor can't tell if the robot is in autonomous mode or manual override. At that point, the rework is not a design problem. It is a schedule problem, and sometimes a safety problem.
This is why robotics companies are now creating dedicated HMI UX roles, not as a design afterthought but as a core function. Across the wave of funded robotics startups (AMRs, cobots, drone fleets, and industrial inspection systems), the job descriptions tell you everything: they want designers who can navigate workflows spanning hardware, operators, safety systems, and APIs.
Hiring a great designer without giving them the vocabulary is like hiring a skilled translator and asking them to work in a language that hasn't been written yet.
The collaborative robotics startup ecosystem is growing fast. Every founder in this space solves the vocabulary problem independently, at their own cost and on their own timeline. That cost shouldn't still be this high.
What we're building
That's the gap Osiflow HMI exists to close.
The Osiflow HMI Design System is a Figma-native pattern library built for designers working on operator interfaces for robots and industrial machines. It also works well with AI tools such as Claude and Google Stitch.
It includes universal patterns covering the states every machine has (fault recovery, mode transitions, telemetry display, emergency stop, manual override, and operator action logging), a semantic token system designed for physical system states rather than consumer UI, and design principles derived from real operator deployments, documented in our design guidelines (coming soon).
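The idea behind semantic tokens for machines can be sketched like this: tokens are keyed to the physical state ("fault recovery in progress"), not to a color name. The states, token names, and values below are hypothetical examples for illustration, not the actual Osiflow token set.

```typescript
// Hypothetical sketch: semantic tokens keyed to machine states.
// A UI asks "how do I render faultRecovery?" instead of hardcoding orange.
type MachineState =
  | "autonomous"
  | "manualOverride"
  | "faultRecovery"
  | "emergencyStopped";

interface StateTokens {
  statusColor: string; // banner / border color
  pulse: boolean;      // animated attention cue
  label: string;       // operator-facing wording
}

const machineStateTokens: Record<MachineState, StateTokens> = {
  autonomous:       { statusColor: "#2E7D32", pulse: false, label: "Running autonomously" },
  manualOverride:   { statusColor: "#F9A825", pulse: true,  label: "Manual override active" },
  faultRecovery:    { statusColor: "#E65100", pulse: true,  label: "Fault recovery in progress" },
  emergencyStopped: { statusColor: "#B71C1C", pulse: false, label: "Emergency stop engaged" },
};
```

Because the token is named for the state, a designer and an engineer can argue about what `manualOverride` should look like without first inventing the vocabulary to talk about it.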
It gives designers a language to start from. For founders, it compresses the onboarding curve. A designer who opens the system on day one starts with the vocabulary already in hand. The design industry has built vocabularies for every major UI paradigm. Machines are next.
The free design system Figma kit is launching soon. Join the early access list!


