We witnessed the presentation of the Meta Ray-Ban Display, the new smart glasses shown by Mark Zuckerberg at the Meta Connect event at the Menlo Park headquarters in California. The glasses, made in collaboration with EssilorLuxottica and fitted with photochromic lenses, are the first to feature an integrated high-resolution micro-display. They also offer a neural control system via the Neural Band bracelet, which proposes a new wearable approach to the interaction between human beings and artificial intelligence. Their realization required advances in applied optics and wearable neuroengineering.
The new smart glasses, weighing 69 grams with a declared 6-hour battery life under mixed use, are powered by Meta AI and let you exchange messages and make video calls through Instagram and Facebook, use the integrated camera and walking navigation, and receive multilingual translations in real time. The Ray-Ban Display will be available in the USA from 30 September and in Italy at the beginning of 2026, although no precise release date or price has yet been announced for the Italian market.
How the display works
The heart of the glasses is a color micro-display inserted in the right lens. The image is not projected directly onto the retina; it is the result of a complex optical system.
The light emitted by the micro-display, based on high-efficiency microLED technology, is coupled into an optical waveguide. Within the lens, transparent layers and diffraction gratings steer the light rays by exploiting the principle of total internal reflection. The light is deflected several times until it emerges in front of the eye as a virtual image suspended in space. This image appears at a focal distance of between 1.5 and 2 meters, so the eye perceives it as an object at medium distance. In this way it is possible to maintain visual focus without continuous accommodative adjustments and without obscuring the natural field of view. The user can thus read messages, receive simultaneous translations or follow navigation directions without taking their eyes off the real world.
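The waveguide works because light stays trapped inside the lens as long as it strikes the glass-air boundary beyond a critical angle. A minimal sketch of that geometry, using an assumed refractive index for illustration (not the actual lens material used by Meta):

```python
import math

# Total internal reflection keeps light trapped inside the lens waveguide.
# Refractive indices below are assumed values for illustration only.
n_waveguide = 1.7  # typical high-index waveguide glass
n_air = 1.0

# Rays hitting the boundary beyond the critical angle are reflected back
# into the guide; shallower rays escape. Gratings nudge the angle so the
# light finally exits in front of the eye.
theta_c = math.degrees(math.asin(n_air / n_waveguide))
print(f"Critical angle: {theta_c:.1f} degrees")  # ~36.0 degrees
```

A higher-index glass lowers the critical angle, which is one reason waveguide displays favor high-index materials: more ray angles stay trapped, allowing a wider field of view.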
How the Neural Band reads muscle movements
The most radical innovation is perhaps the Neural Band, a bracelet capable of transforming electrical signals from the wrist muscles into digital commands. This result is made possible by surface electromyography. Whenever the brain sends a motor impulse, even for micro-movements imperceptible to the naked eye, the muscles generate electrical potentials on the order of a few millionths of a volt. The bracelet's electrodes detect these variations and pass them to an amplification and digital filtering circuit that reduces background noise and isolates the useful signal. Machine learning algorithms then interpret the muscle activation patterns and translate them into actions: a click, a scroll, a selection gesture or even the "ghost" typing of text without the fingers actually moving. The commands are finally sent to the glasses over a low-latency wireless connection, with response times below 50 milliseconds.
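The processing chain described above (filtering out noise, extracting the useful signal, classifying the pattern) can be sketched as follows. The filter band, threshold and synthetic signals are illustrative assumptions, not Meta's actual algorithm, which relies on learned models over multi-channel data:

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000  # sampling rate in Hz, matching the bracelet's reported rate

def bandpass(emg, low=20.0, high=450.0, fs=FS):
    """Band-pass filter keeping the typical surface-EMG band (20-450 Hz)."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, emg)

def detect_gesture(emg, threshold=0.05):
    """Toy 'classifier': flag a pinch when the RMS envelope crosses a
    threshold. A real system uses a learned model over many channels."""
    rms = np.sqrt(np.mean(bandpass(emg) ** 2))
    return "pinch" if rms > threshold else "rest"

# Synthetic demo: a quiet baseline versus a burst of muscle activity
# (amplitudes in arbitrary units)
rng = np.random.default_rng(0)
rest = 0.005 * rng.standard_normal(FS)                           # noise floor
burst = rest + 0.2 * np.sin(2 * np.pi * 120 * np.arange(FS) / FS)  # activity
print(detect_gesture(rest), detect_gesture(burst))  # rest pinch
```

The same structure, amplify, filter, featurize, classify, underlies most surface-EMG interfaces; the hard part in practice is the per-user calibration the article mentions later.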
Engineering challenges
Behind this technology lie significant scientific and engineering challenges. The micro-display must guarantee sufficient brightness, above 2,000 nits (the glasses presented at Meta Connect reach up to 5,000 nits), to remain visible in full sunlight without compromising battery life. The Neural Band must sample the muscle signals at high frequencies, up to 1,000 Hz, so as not to miss crucial details. Anatomical variability from one user to another also requires artificial intelligence systems able to adapt quickly to each wrist with a minimal calibration phase. Finally, all this technology must fit into a lightweight wearable device that dissipates the heat generated by the electronics without affecting the user experience.
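The 1,000 Hz figure is no accident: by the Nyquist criterion it captures frequency content up to 500 Hz, comfortably covering the band where surface-EMG energy lives. A quick back-of-the-envelope for the resulting data budget; the channel count and ADC resolution below are assumptions for illustration, not published specifications:

```python
# Back-of-the-envelope for the Neural Band's raw signal budget.
fs = 1000          # samples per second (reported sampling rate)
nyquist = fs / 2   # highest representable frequency: 500 Hz
channels = 8       # assumed number of electrode channels
bits = 16          # assumed ADC resolution

rate_kbps = fs * channels * bits / 1000
print(f"Nyquist limit: {nyquist:.0f} Hz")      # 500 Hz
print(f"Raw data rate: {rate_kbps:.0f} kbit/s")  # 128 kbit/s
```

Even under these modest assumptions the raw stream is far too heavy to ship continuously over a low-power radio, which is why the gesture classification has to run on the bracelet itself, with only compact commands sent to the glasses.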