At CEATEC 2025, Hitachi presented a connected vision for the future of industrial operations. Rather than showcasing isolated technologies, the company demonstrated how AI agents, metaverse-based digital twins, conversational machinery, and wearable sensing can work together to address a growing manufacturing challenge: maintaining efficiency, safety, and quality amid a shrinking and less experienced workforce.
The challenge is particularly acute in Japan, where skilled technicians are retiring faster than they can be replaced. Training new workers takes time, and traditional manuals or classroom instruction often struggle to transfer the tacit, hands-on knowledge built through years of experience. Hitachi’s response is not simply automation but augmentation: embedding expert know-how directly into the work environment and making it accessible in real time.
That approach was showcased through three core demonstrations at CEATEC 2025: Naivy, which guides frontline workers inside a metaverse-based digital twin; Talkative Products, which give industrial machines a conversational interface; and the Sensor Glove, which digitizes human skill itself. Together, they point toward factories becoming living knowledge spaces that continuously support workers as conditions change.
Hitachi Naivy: An AI Agent That Guides Frontline Workers Inside a Living Digital Twin
Hitachi presents Naivy at CEATEC 2025 as a next-generation AI agent and “Frontline Coordinator” designed to support on-site industrial operations.
At CEATEC 2025, Hitachi introduced Naivy as a next-generation AI agent designed to guide frontline workers through complex industrial environments using a metaverse-based digital twin. The booth presentation made it clear that Naivy is not a conceptual experiment, but a system already grounded in real factory workflows and operational constraints.
A booth slide outlines Naivy’s role as an AI agent operating within an extended metaverse to connect physical sites, field knowledge, and human decision-making.
Large-format displays and Japanese-language panels described Naivy as a “Frontline Coordinator”, operating within an on-site extended metaverse that mirrors an actual factory. Diagrams shown on the main screen position Naivy at the center of a layered system that connects the physical site, accumulated field knowledge, AI reasoning, and human decision-making. Naivy is meant to support workers on location, not replace them or automate decisions without oversight.
A navigable digital twin tied to real equipment
Naivy’s interface combines a navigable digital twin of the factory with an AI chat panel, linking spatial context with real-time guidance.
Naivy’s practical role became evident in the live interface displayed on the booth monitors: workers navigate a digitized factory environment that closely resembles a real mechanical room, complete with pipes, valves, and machinery arranged as they would be on site. Directional arrows and movement controls let users move through the space, while specific assets are visually framed and identified.
Naivy operates on a site-specific digital twin rather than a generic 3D model. The system knows where equipment is located and presents that information spatially, reducing uncertainty for workers who may not yet be familiar with the facility. Manuals, diagrams, and related documents are directly linked to the virtual environment, allowing users to access relevant knowledge without leaving the task context.
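Hitachi has not published Naivy’s internal data model, but the idea of tying documentation to spatially placed assets can be illustrated with a minimal sketch. The `Asset` and `Document` classes and the `nearby_assets` helper below are hypothetical, chosen only to show the pattern of surfacing the right manual based on where the worker is standing.

```python
from dataclasses import dataclass, field

@dataclass
class Document:
    """A manual, diagram, or procedure linked to an asset (hypothetical model)."""
    title: str
    url: str

@dataclass
class Asset:
    """A piece of equipment placed inside the site-specific digital twin."""
    asset_id: str
    name: str
    position: tuple[float, float, float]          # x, y, z in site coordinates
    documents: list[Document] = field(default_factory=list)

def nearby_assets(assets: list[Asset],
                  worker_pos: tuple[float, float, float],
                  radius: float = 5.0) -> list[Asset]:
    """Return assets within `radius` meters of the worker, so their linked
    documents can be surfaced without leaving the task context."""
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
    return [a for a in assets if dist(a.position, worker_pos) <= radius]

# Example: a cooling pump with its manual attached to its location in the twin.
pump = Asset("PUMP-07", "Cooling water pump", (12.5, 3.0, 0.0),
             [Document("Pump maintenance manual", "https://example.local/pump-07.pdf")])
print([a.name for a in nearby_assets([pump], (11.0, 2.0, 0.0))])
```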
AI-driven troubleshooting with concrete, step-by-step guidance
A tablet close-up shows Naivy providing equipment-specific troubleshooting with step-by-step procedures and explicit human confirmation prompts. Editor’s Note: The stripes visible in the photo were not present on the tablet screen during the on-site demo. They are camera artifacts caused by the shutter speed interacting with the display’s refresh rate.
On the tablet-based interface displayed at the booth counter, I could see how Naivy combines spatial awareness with an AI chat panel that analyzes abnormal conditions and proposes corrective actions. In the depicted example, the agent identifies a specific piece of equipment by ID and explains a likely cause of a temperature-related issue, such as insufficient cooling water.
Then, Naivy provides structured procedural guidance. The interface lists confirmation steps, references concrete components like valves, and includes explicit safety reminders. Users are encouraged to verify actions and confirm whether a step was completed, reinforcing that the AI’s role is to guide and document, not to act autonomously.
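That confirm-before-proceeding pattern is easy to picture in code. The sketch below is a hypothetical illustration of the pattern, not Naivy’s actual logic: each proposed step must be explicitly acknowledged by the operator, and the outcome is logged rather than acted on automatically.

```python
from dataclasses import dataclass

@dataclass
class GuidanceStep:
    instruction: str        # e.g. "Check that the cooling water valve is fully open"
    safety_note: str = ""   # explicit reminder shown alongside the step

def run_guided_procedure(steps: list[GuidanceStep]) -> list[dict]:
    """Walk the operator through each step, requiring explicit confirmation.
    The agent only guides and documents; it never performs the action itself."""
    log = []
    for i, step in enumerate(steps, start=1):
        print(f"Step {i}: {step.instruction}")
        if step.safety_note:
            print(f"  Safety: {step.safety_note}")
        answer = input("  Completed? [y/n] ").strip().lower()
        log.append({"step": i, "instruction": step.instruction,
                    "confirmed": answer == "y"})
        if answer != "y":
            print("  Step not confirmed; escalate before continuing.")
            break
    return log

procedure = [
    GuidanceStep("Verify the cooling water supply valve is open",
                 "Wear gloves; the line may be hot."),
    GuidanceStep("Confirm the pump outlet pressure is within the normal range"),
]
# run_guided_procedure(procedure)  # interactive; requires operator input
```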
Designed for abnormal situations and real-world constraints
A live booth demonstration illustrated how Naivy supports real-time response to abnormal on-site situations through visual context and structured AI guidance.
Booth panels emphasized the conditions Naivy is built for: abnormal events, night shifts, time pressure, and understaffed operations. Naivy was presented as an evolving system that coordinates information rather than replacing human judgment. At CEATEC 2025, Hitachi showcased Naivy as a way to make the metaverse practically useful by embedding it directly into frontline work.
Hitachi Talkative Products: Giving Industrial Machines a Conversational Interface
Hitachi Talkative Products at CEATEC 2025: an AI agent that enables maintenance workers to converse with industrial machines via chat or voice using operational data and documentation.
Hitachi also demonstrated Talkative Products, an AI-driven system currently under development that explores how industrial machines could communicate directly with maintenance workers through natural conversation. Rather than presenting a finished commercial product, the booth showcased a working prototype designed to illustrate how conversational AI might reshape maintenance and service workflows in industrial environments.
The idea behind Talkative Products is to move beyond dashboards and error codes by allowing workers to ask machines directly about their condition. In the live demo, visitors interacted with equipment such as air compressors, printers, and water pumps using chat or voice. The system returned concrete operational data, including temperature, pressure, operating hours, and maintenance status, and indicated whether the machine was operating normally or required attention.
This interaction is enabled by an AI agent that synthesizes multiple data sources, rather than by autonomous machine behavior. As shown in the booth diagrams and reflected in Hitachi’s official materials, Talkative Products combines generative AI with retrieval-augmented generation (RAG), drawing on manuals, operational data, and diagnostic information to generate human-readable explanations. The goal is not to automate decisions, but to make machine behavior easier to understand and act upon.
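Hitachi has not disclosed implementation details, but the general retrieval-augmented pattern it describes can be sketched as follows. The keyword retriever, telemetry fields, and placeholder `call_llm` function are illustrative assumptions, not Hitachi’s actual pipeline.

```python
# Minimal RAG-style sketch: retrieve relevant manual passages, combine them with
# live operational data, and assemble a prompt for a generative model.

MANUAL_SNIPPETS = [
    "If compressor discharge temperature exceeds 90 C, check the cooling fan and air filter.",
    "Replace the intake air filter every 2,000 operating hours.",
    "A pressure drop below 0.6 MPa under load may indicate a leaking hose or valve.",
]

def retrieve(question: str, snippets: list[str], top_k: int = 2) -> list[str]:
    """Toy keyword-overlap retriever standing in for a real vector search."""
    q_words = set(question.lower().split())
    scored = sorted(snippets, key=lambda s: -len(q_words & set(s.lower().split())))
    return scored[:top_k]

def build_prompt(question: str, telemetry: dict, context: list[str]) -> str:
    """Combine the worker's question, current machine readings, and manual excerpts."""
    lines = [f"Machine telemetry: {telemetry}", "Relevant manual excerpts:"]
    lines += [f"- {c}" for c in context]
    lines += [f"Worker question: {question}",
              "Answer in plain language and say whether attention is required."]
    return "\n".join(lines)

def call_llm(prompt: str) -> str:
    """Placeholder for a generative model call; returns the prompt for inspection."""
    return "[model response would appear here]\n" + prompt

question = "Why is the compressor running hot?"
telemetry = {"discharge_temp_c": 94, "pressure_mpa": 0.72, "operating_hours": 2150}
print(call_llm(build_prompt(question, telemetry, retrieve(question, MANUAL_SNIPPETS))))
```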
Hitachi positions Talkative Products as a step toward more intuitive, data-driven maintenance, especially in manufacturing environments facing labor shortages and growing equipment complexity.
Hitachi Sensor Glove: Digitizing Human Skill Itself
While AI agents and conversational machines address knowledge gaps, Hitachi recognizes that a significant portion of factory work remains stubbornly manual. Even with increasing automation, tasks such as assembly, inspection, and maintenance still depend heavily on human hands. This is where the Sensor Glove comes into play.
The Sensor Glove integrates pressure, inertial, and microphone sensors, along with a Bluetooth unit, to capture fine-grained hand movements that are difficult to observe visually. A live CEATEC demonstration showed the Sensor Glove in use during a manual task, with sensing occurring directly on the worker’s hand.
The Sensor Glove was initially developed to solve a particular problem: connector insertion defects in assembly lines, where products were shipped with components partially inserted. Traditional cameras could not reliably detect these errors due to blind spots. Hitachi’s solution was to instrument the worker’s hands instead. The glove accurately captures hand movements, pressure, and timing, regardless of visual obstructions.
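To illustrate how glove data could catch a partial insertion that a camera misses, here is a hypothetical check on a recorded fingertip-pressure trace. The threshold and hold-time values are made up for the example and are not Hitachi’s criteria.

```python
def insertion_complete(pressure_trace: list[float],
                       sample_rate_hz: float = 100.0,
                       min_pressure: float = 4.0,
                       min_hold_s: float = 0.2) -> bool:
    """Return True if fingertip pressure stayed above `min_pressure` (arbitrary
    units) for at least `min_hold_s` seconds, a rough proxy for a fully seated
    connector regardless of whether a camera can see the worker's hands."""
    needed = int(min_hold_s * sample_rate_hz)
    run = 0
    for p in pressure_trace:
        run = run + 1 if p >= min_pressure else 0
        if run >= needed:
            return True
    return False

# A short, weak press (likely a partial insertion) vs. a firm, sustained one.
partial = [0.5] * 30 + [3.0] * 10 + [0.5] * 30
full    = [0.5] * 30 + [5.2] * 40 + [0.5] * 30
print(insertion_complete(partial), insertion_complete(full))  # False True
```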
Over time, the scope of the Sensor Glove expanded. By digitizing manual actions, Hitachi created a way to record, analyze, and share human skills in real time. This enables improvements in quality control, productivity, and training, allowing inexperienced workers to be compared with skilled operators using data-driven guidance.
Real-time sensor dashboards visualize pressure, acceleration, gyro, geomagnetic, and audio data captured simultaneously during manual work.
Most importantly, the Sensor Glove addresses skill transfer, one of the hardest challenges in manufacturing. By capturing movements rooted in muscle memory and intuition, Hitachi helps preserve expertise that might otherwise be lost to retirement.
Multiple sensor streams are displayed together to analyze subtle differences in motion, force, and workload that distinguish expert performance.
Safety is another key application. By identifying excessive or hazardous workloads, the system can help prevent injuries and improve workplace ergonomics. While manufacturing remains the initial focus, applications are being explored in healthcare, construction, welfare, and service industries.
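A simple version of that idea is to flag spans where grip force stays above an ergonomic limit for too long. The limit and duration below are illustrative assumptions for the sketch, not values Hitachi cited.

```python
def flag_sustained_load(force_trace: list[float],
                        sample_rate_hz: float = 100.0,
                        limit: float = 8.0,
                        max_sustained_s: float = 3.0) -> list[tuple[float, float]]:
    """Return (start_s, end_s) spans where force exceeded `limit` for longer than
    `max_sustained_s`, as candidate moments for an ergonomic warning."""
    max_samples = int(max_sustained_s * sample_rate_hz)
    spans, start = [], None
    for i, f in enumerate(force_trace + [0.0]):      # sentinel closes a trailing run
        if f > limit and start is None:
            start = i
        elif f <= limit and start is not None:
            if i - start > max_samples:
                spans.append((start / sample_rate_hz, i / sample_rate_hz))
            start = None
    return spans

# Four seconds of heavy grip in the middle of an otherwise light task.
trace = [2.0] * 100 + [9.5] * 400 + [2.0] * 100
print(flag_sustained_load(trace))  # [(1.0, 5.0)]
```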
At CEATEC 2025, Hitachi presented the Sensor Glove as a working demonstration and future-oriented system under continued development, showcasing how digitized human motion could be applied to real industrial workflows. The company positions the Sensor Glove as a wearable human-measurement technology that digitizes manual labor itself, complementing AI-driven systems for machines and environments.
In the context of CEATEC 2025, the Sensor Glove completes Hitachi’s vision. Where Naivy digitizes the environment and Talkative Products digitize machines, the Sensor Glove digitizes people, creating a comprehensive, data-rich view of industrial work.
Conclusion: The Shift Toward Industry 5.0 Is On
These three demos give a good sense of where Hitachi sees industrial AI heading next. Rather than pushing toward fully humanless automation, the focus is on helping people do their jobs better on the factory floor. In my opinion, this aligns closely with the key principles of the next big industrial shift, Industry 5.0; you can read more about it here.
For instance, Naivy helps workers navigate complex environments, Talkative Products make machine behavior easier to understand, and the Sensor Glove captures hands-on skills that are usually hard to put into software.
I genuinely enjoyed spending time at the Hitachi booth at CEATEC 2025 because the demos stayed close to real-world factory problems, such as abnormal temperature alerts, routine maintenance, and day-to-day safety checks. The takeaway was simple: AI is not meant to replace frontline workers, but to help them move faster, make fewer mistakes, and feel more confident about what they are doing.
With labor shortages worsening and industrial systems becoming more complex, Hitachi’s approach points to a factory that is not just more automated but also easier to understand, easier to work with, and ultimately more human.