Introducing Osiflow HMI: An AI-Native HMI Design Workspace
A product by Osiflow Creative Agency. A design workspace built for the human layer of the robotics stack.
Category
Product announcements
Date
Reading Time
5 mins


TL;DR by Avaline
Robots are being deployed at unprecedented scale. No shared design infrastructure exists for the interfaces and experiences operators use to control them. Osiflow HMI is building it.
Background
Ten years ago I designed a robot. Not the code or the circuitry, but the form. Crafting a form forces you to think about HMI: how it should make a person feel, standing in front of something they have never seen before. Perhaps that is where my love for robots started. The human side of machines is the question that fascinates me to this day. After years in the field and many projects, it is time to launch Osiflow's first product as a research preview.
Why now?
The scale of robotics deployment in 2026 is not gradual. Driven by the AI revolution, it is growing exponentially.
Boston Dynamics' Atlas is in commercial production at Hyundai, with a target of 30,000 units per year by 2028 and the entire 2026 allocation already spoken for.
BMW deployed humanoids on a European factory floor this February.
Zipline crossed 2 million autonomous drone deliveries with zero recorded injuries.
At MODEX 2026, the entire logistics industry became a physical AI showcase in four days. This is not a trend. It is a transition.
Every one of these deployments has a human somewhere in the loop, looking at an interface to control a robot or a fleet of them. The tools to address that interface layer have arrived at the same moment.
Claude Design launched this April.
Figma opened its canvas to AI agents this year.
The distance between a design idea and something a real operator can test has never been shorter. For a discipline as under-resourced as robotics HMI, that is a meaningful shift, and we intend to act on it.
The Landscape
The tools that exist today were built for adjacent problems. Formant and InOrbit help fleet managers monitor robots, not design how operators experience them. Foxglove and RViz make robot data legible to engineers, not to operators. Good tools, just not focused on HMI needs.
On the design side, Qt Design Studio has an HMI canvas and component library, but it targets embedded automotive panels and requires QML to ship anything real. ProtoPie handles HMI simulation well but produces prototypes, not deployable interfaces. Osiflow Creative Agency and HelloRobo are the only firms explicitly designing for robotics HMI, and they are agencies, not products: a Figma file per engagement, built from scratch.
Every robotics team today adapts Figma with no robotics-specific primitives, assembles components from Transitive Robotics, or builds from scratch in React via Claude. There is no shared infrastructure. That is the gap Osiflow HMI was built to close.
The Cost of the Gap
The cost of that gap is not aesthetic. When a humanoid bot fails at 2am in a warehouse, what lands on the operator's phone determines whether they respond correctly or freeze. When a drone fleet encounters an edge case mid-mission, the interface is the only thing standing between a confident decision and a dangerous one.
Emotion mediates how fast operators decide under pressure. Cognitive load determines whether they act correctly or hesitate at the wrong moment. Mode confusion, the documented failure pattern across aviation and industrial automation, happens when an automated system changes state and the interface never communicates the transition.
Robotics is inheriting every one of these failure patterns right now, at deployment scale, with no shared design language to address them. And as AI agents enter the loop alongside human operators, reading the same interface and acting at machine speed, the stakes compound further. The interface layer is not a product detail. It is operational infrastructure.
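To make mode confusion concrete at the data level, here is a minimal sketch in TypeScript. Every type and field name is hypothetical, invented for illustration; this is not an Osiflow HMI API. The idea it shows follows directly from the failure pattern above: a system-initiated state change is never allowed to pass silently.

```typescript
// Hypothetical sketch, not an Osiflow HMI API: a state-transition event
// that the operator UI must render explicitly, so an autonomous mode
// change is never silent.

type RobotMode = "teleop" | "autonomous" | "paused" | "fault";

interface ModeTransition {
  robotId: string;
  from: RobotMode;
  to: RobotMode;
  initiatedBy: "operator" | "system"; // who caused the change
  reason: string;                     // human-readable cause
  at: string;                         // ISO timestamp
  acknowledged: boolean;              // has the operator seen it?
}

// The rule: an unacknowledged, system-initiated transition outranks
// everything else on screen.
function requiresOperatorAttention(t: ModeTransition): boolean {
  return t.initiatedBy === "system" && !t.acknowledged;
}

// The 2am warehouse failure from above, as an event:
const faultEvent: ModeTransition = {
  robotId: "humanoid-07",
  from: "autonomous",
  to: "fault",
  initiatedBy: "system",
  reason: "left gripper torque limit exceeded",
  at: new Date().toISOString(),
  acknowledged: false,
};

console.log(requiresOperatorAttention(faultEvent)); // true: surface it first
```

Whether the event arrives over ROS, MQTT, or a vendor API is beside the point. What matters is that the transition itself is a first-class object the interface is obliged to show and the operator is obliged to acknowledge.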
The problem nobody has named
The problem in 2026 is not that engineers and designers don't talk. Roles are blending: engineers design, and designers learn systems. The cognitive failure modes are documented and well understood in other industries; that understanding has simply not been applied here yet.
Mode confusion caused the majority of serious incidents in commercial aviation before systematic HMI research addressed it; the documented examples run into the hundreds, if not thousands.
Cognitive overload, poor telemetry hierarchy, and broken situational awareness have the same effect on robotics operators today.
These are not design preferences. They are operational risks being inherited at deployment scale right now.
Six years inside this problem
That robot in 2016 was where it started, in college. I then worked with Team Astral on CanSat, a competitive satellite-engineering program that embedded a certain kind of systems thinking early.
Four years at FlytBase followed, building ground control systems and multi-operator interfaces for enterprise drone deployments at scale. The translation tax, the overhead of turning engineering systems into operator experience, was visible in every project. Hardware had rigour. Software had rigour. The interface layer had urgency and whatever the team could ship before the deadline.
Over the past two years designing robotics and deep-tech products at Osiflow Creative Agency, I watched the pattern hold across every engagement. Design teams working on robotics products spend more time trying to understand the robot's system than they spend designing the interface for it. The process has no infrastructure. Everything starts from scratch.
Introducing Osiflow HMI
Osiflow HMI is a design workspace for the human layer of the robotics stack. It starts where every other robotics tool stops. At the operator, the interface, the moment between a robot's state and a human decision.
The work is grounded in human factors research, not design convention or adapted general-purpose tools.
What does an operator need to comprehend in three seconds?
How does an interface communicate robot intent, not just robot data?
How do you design for AI agents and humans reading the same screen simultaneously?
These questions have no shared answers in robotics today. That is what this workspace is being built to address.
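As a sketch of the second and third questions, consider the difference between what a robot reports and what it intends. The TypeScript below is purely illustrative; none of these types or fields correspond to a real Osiflow HMI, ROS, or vendor schema.

```typescript
// Hypothetical sketch: robot data versus robot intent.
// No type or field here is a real Osiflow HMI or ROS schema.

// What most tooling exposes today: raw telemetry.
interface RobotData {
  pose: { x: number; y: number; headingDeg: number };
  batteryPct: number;
  jointTempsC: number[];
}

// What an operator (or an AI agent reading the same screen) needs in
// three seconds: what the robot is trying to do, how confident it is,
// and what happens next if nobody intervenes.
interface RobotIntent {
  goal: string;         // e.g. "deliver pallet to dock 4"
  currentStep: string;  // e.g. "navigating aisle 12"
  confidence: number;   // 0..1, the planner's own estimate
  ifUnattended: string; // e.g. "wait 30s, then reroute via aisle 14"
}

// A three-second summary leads with intent; raw telemetry stays one tap away.
function threeSecondSummary(i: RobotIntent): string {
  const pct = Math.round(i.confidence * 100);
  return `${i.goal}: ${i.currentStep} (${pct}% confident). ` +
    `If unattended: ${i.ifUnattended}`;
}

console.log(threeSecondSummary({
  goal: "deliver pallet to dock 4",
  currentStep: "navigating aisle 12",
  confidence: 0.82,
  ifUnattended: "wait 30s, then reroute via aisle 14",
}));
```

The same structure speaks to the agent question: an AI agent can consume intent as data while a human reads its rendering, so both act on one source of truth.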
This is a research preview. We are building Osiflow HMI with the first teams who join, not for them. If you are designing operator interfaces for robotic systems and you know what this process is missing, we want to hear from you.
Request early access today.

Your operators deserve better interfaces.
Generate your first operator UI in minutes. No robot required to start.

