LAS VEGAS, Jan. 6, 2026 /PRNewswire/ -- At CES 2026, LLVision today announced Leion Hey2, the world's first AI-powered professional AR translation glasses, marking the product's official launch in the United States.
Supporting more than 100 languages and dialects, Leion Hey2 delivers sub-500-millisecond translation latency in real-world use and offers six to eight hours of continuous translation on a single charge. By turning spoken language into real-time subtitles displayed directly in the wearer's line of sight, the glasses are designed to keep conversations natural and uninterrupted.
"This is what translation really means to us," said Dr. Wu Fei, founder and CEO of LLVision. "It's not just about words. It's about giving people the freedom to speak, to connect, and to be truly understood."
Founded in 2014, LLVision has spent more than a decade developing integrated AI and AR solutions focused on real-world multilingual communication.
Translation as a Purpose, Not a Feature
As communication increasingly crosses borders and cultures, translation technology has become more capable, but often at the expense of human presence. Smartphones pull attention away from the person in front of us. Earbuds isolate users from their surroundings. Many smart glasses attempt to do everything at once, yet struggle to support what matters most in conversation: staying connected.
Leion Hey2 is designed around live, face-to-face conversation from the ground up. While many smart glasses treat translation as a secondary feature layered onto entertainment, recording, or social experiences, Hey2 takes the opposite approach. Translation is not an add-on. It is the core.
From hardware architecture to interaction design, every decision is shaped by the demands of real-time, human communication. Spoken language is converted into real-time subtitles displayed directly in the wearer's line of sight, allowing conversations to unfold naturally without interruption.
Hey2 supports more than 100 languages and dialects with sub-500-millisecond latency in real-world conditions. In live conversations, even brief delays can disrupt rhythm, tone, and comprehension. By keeping translation fast and visually immediate, Hey2 allows dialogue to unfold naturally rather than feeling mediated by technology.
Battery performance reinforces that professional focus. With six to eight hours of continuous translation and up to 96 hours of total use via the charging case, Hey2 is designed for full workdays rather than short demos. It is built to last through international meetings, conferences, travel days, and multilingual classrooms without constant recharging.
Head Up, Human Connection First
Conversations stay natural when subtitles appear directly within the wearer's field of view, a design approach used in Leion Hey2.
Through an optical AR head-up display, subtitles appear where conversations actually happen. There is no phone to hold, no device to pass around, and no need to look down mid-sentence. Information stays in the forward line of sight, preserving eye contact and conversational flow.
Waveguide optics paired with a micro-LED light engine deliver crisp, stable text across real-world lighting conditions. The display is intentionally restrained, minimizing visual artifacts and avoiding distraction while keeping subtitles clear and readable over extended use.
With subtitles visible at all times, Hey2 also supports real-time captioning for Deaf and hard-of-hearing users, expanding accessibility without requiring separate tools or workflows.
Engineered for Real-World Conversations
Face-to-face conversation remains the most common and demanding use case for real-time translation, and Hey2 is designed with that reality in mind.
In Free Talk mode, face-to-face directional pickup prioritizes the person you are facing. Voices within an approximately 60-degree forward range are emphasized, while side conversations and ambient noise are suppressed. This keeps translation clear without requiring users to manage speakers or adjust settings mid-conversation.
A four-microphone array with 360-degree spatial voice detection continuously identifies the active speaker's direction before processing begins. Proprietary beamforming and neural noise reduction then isolate speech in real time, maintaining accuracy in meetings, group discussions, and busy public environments.
By aligning audio pickup with natural conversational posture, prioritizing the person you face rather than whoever speaks loudest, Hey2 keeps translation intuitive, focused, and uninterrupted.
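LLVision describes its beamforming and neural noise reduction as proprietary, so the details are not public. As a rough, hypothetical sketch of the general directional-pickup idea, the Python snippet below implements a basic delay-and-sum beamformer with a forward-cone gate; the microphone layout, sample rate, and function names are assumptions made for illustration, not LLVision's design.

```python
# Illustrative delay-and-sum beamformer for a small microphone array.
# NOTE: This is not LLVision's proprietary pipeline. The array geometry,
# sample rate, and steering math below are generic assumptions used only
# to show how a forward "pickup cone" can be emphasized over off-axis sound.
import numpy as np

SPEED_OF_SOUND = 343.0   # meters per second
SAMPLE_RATE = 16_000     # Hz, a common rate for speech processing

# Assumed square four-microphone layout, 2 cm from center (x, y in meters).
MIC_POSITIONS = np.array([
    [ 0.02,  0.02],
    [ 0.02, -0.02],
    [-0.02,  0.02],
    [-0.02, -0.02],
])


def steering_delays(azimuth_deg: float) -> np.ndarray:
    """Per-microphone compensation delays (seconds) for a source at azimuth_deg.

    0 degrees means the person directly in front of the wearer.
    """
    azimuth = np.deg2rad(azimuth_deg)
    direction = np.array([np.cos(azimuth), np.sin(azimuth)])
    # A mic closer to the source hears the wave earlier; delaying it by the
    # projection of its position onto the arrival direction re-aligns it
    # with the mics farther away.
    return MIC_POSITIONS @ direction / SPEED_OF_SOUND


def delay_and_sum(frames: np.ndarray, azimuth_deg: float) -> np.ndarray:
    """Steer the array toward azimuth_deg and average the aligned channels.

    frames: shape (4, n_samples), one row per microphone. Speech from the
    steered direction adds coherently; side conversations and ambient
    noise arriving from other angles are attenuated.
    """
    delays = steering_delays(azimuth_deg)
    n = frames.shape[1]
    freqs = np.fft.rfftfreq(n, d=1.0 / SAMPLE_RATE)
    aligned = []
    for channel, delay in zip(frames, delays):
        # Fractional-sample delay applied as a phase shift in the frequency domain.
        spectrum = np.fft.rfft(channel)
        aligned.append(np.fft.irfft(spectrum * np.exp(-2j * np.pi * freqs * delay), n=n))
    return np.mean(aligned, axis=0)


def inside_forward_cone(azimuth_deg: float, cone_deg: float = 60.0) -> bool:
    """Free Talk-style gating: keep speakers within the forward pickup range."""
    return abs(azimuth_deg) <= cone_deg / 2.0
```

In a real pipeline the steering angle would come from a direction-of-arrival estimate across the four channels, and the release notes that LLVision layers neural noise reduction on top; both steps are omitted here for brevity.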
AI That Supports, Not Distracts
Beyond translation, Hey2 includes an optional AI Q&A feature for quick, contextual look-ups in the real world. It is built for moments of curiosity, not continuous conversation.
To use it, press the right touchpad to activate AI Q&A, ask a question naturally, then wait for the response to appear in view. The interaction requires no manual text input and no phone, making it discreet and well suited to settings such as museums, exhibitions, city walks, and travel stops.
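As a purely illustrative sketch of that interaction sequence, the snippet below models the press-to-ask flow as a single handler. Every function and object name here is a hypothetical placeholder, not an LLVision API.

```python
# Illustrative sketch of the touchpad-to-answer flow described above.
# None of these functions are LLVision APIs; they are hypothetical
# placeholders showing the interaction sequence, not an implementation.

def on_right_touchpad_press(speech_to_text, assistant, display):
    """Handle one AI Q&A interaction: listen, ask, show the answer in view."""
    display.show_subtitle("Listening...")      # hypothetical in-view display API
    question = speech_to_text.listen()         # capture the spoken question
    if not question:
        display.show_subtitle("No question heard.")
        return
    answer = assistant.ask(question)           # one-shot contextual lookup
    display.show_subtitle(answer)              # render the answer as a subtitle
    # The assistant then steps back: no follow-up turn starts unless
    # the touchpad is pressed again.
```

The point of the sketch is the shape of the interaction: one press, one question, one in-view answer, with nothing running continuously in the background.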
Instead of competing for attention, AI in Hey2 is intentionally constrained. It provides on-the-spot knowledge when you want it, then steps back, reinforcing the device's core role of helping people understand each other naturally and in real time.
Designed to Be Worn, Not Noticed
Weighing just 49 grams, Hey2 is designed to blend seamlessly into daily life, both visually and socially.
A lightweight magnesium-lithium alloy frame, adjustable titanium nose pads, and a classic browline silhouette recognizable since the 1950s combine to deliver comfort and familiarity. A stepless spring hinge adapts naturally to different face shapes, ensuring stability throughout the day.
Just as intentionally, Hey2 avoids the visual and social signals typically associated with smart glasses. There is no camera and no external speakers. Audio input is used solely for translation, allowing the device to remain discreet and appropriate in professional, educational, and diplomatic environments.
All data processing follows GDPR-aligned privacy principles and is supported by secure cloud infrastructure built on Microsoft Azure. Users remain in control, with clear options to review, manage, or delete translation history at any time.
The result is AR glasses that look and feel like everyday eyewear, enabling real-time translation without drawing attention to technology or raising privacy concerns.
Proven in High-Stakes, Real-World Environments
Leion Hey2 has already been used and validated in real-world environments where accuracy and reliability are critical.
In 2025, the glasses were demonstrated at the United Nations Accessibility for All Exhibition in Geneva. They also supported multilingual communication at diplomatic and international forums, and were used in large-scale live trials. These included a two-hour presentation delivered entirely through the glasses by LLVision's CEO.
The technology has also contributed to award-winning accessibility research. LLVision served as an industry partner on a project recognized by the AIS Impact Award 2025 for improving communication access for Deaf and hard-of-hearing communities.
The first-generation Leion Hey has shipped more than 30,000 units worldwide, with users averaging 150 minutes of daily use. It was previously recognized among the UNESCO Netexplo Innovation Award Top 10.
Hey2's global unveiling in Seoul generated more than 10,000 pre-orders within a single day, underscoring demand for a translation device designed for real-world use.
Pricing and Availability
Leion Hey2 is now available for order in the United States through LLVision's official online store, priced at USD 549. From January 6 to January 31, customers can pre-order Leion Hey2 at USD 499. Pre-order purchases include a clip-on sunglass lens and 1,200 minutes of Pro translation service.