Researchers reconstruct 3D environments from eye reflections

Researchers at the University of Maryland have turned eye reflections into (somewhat discernible) 3D scenes. The work builds on Neural Radiance Fields (NeRF), an AI technique that can reconstruct environments from 2D photos. Although the eye-reflection approach has a long way to go before it spawns any practical applications, the study (first reported by Tech Xplore) provides a fascinating glimpse into a technology that could eventually reveal an environment from a series of simple portrait photos. The team used subtle reflections of light captured in human eyes (using consecutive images shot from a single sensor) to try […]
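For readers unfamiliar with NeRF: it represents a scene as a function mapping 3D positions to color and density, and renders a pixel by alpha-compositing samples along the camera ray through it. The sketch below is not the Maryland team's code, just a minimal illustration of that standard NeRF volume-rendering step (function and variable names are my own):

```python
import numpy as np

def composite_ray(densities, colors, deltas):
    """Alpha-composite samples along one ray, as in NeRF volume rendering.

    densities: (N,) non-negative volume density sigma at each sample
    colors:    (N, 3) RGB color at each sample
    deltas:    (N,) spacing between adjacent samples along the ray
    Returns the rendered (3,) pixel color.
    """
    # Opacity contributed by each sample: alpha_i = 1 - exp(-sigma_i * delta_i)
    alphas = 1.0 - np.exp(-densities * deltas)
    # Transmittance T_i: fraction of light surviving all samples before i
    transmittance = np.cumprod(1.0 - alphas + 1e-10)
    transmittance = np.concatenate(([1.0], transmittance[:-1]))
    # Each sample's weight, then the weighted sum of colors
    weights = transmittance * alphas
    return (weights[:, None] * colors).sum(axis=0)

# A dense (opaque) red sample occludes everything behind it:
pixel = composite_ray(
    densities=np.array([0.0, 100.0, 100.0]),
    colors=np.array([[0.0, 0.0, 1.0],   # transparent sample, ignored
                     [1.0, 0.0, 0.0],   # opaque red, dominates the ray
                     [0.0, 1.0, 0.0]]), # occluded green
    deltas=np.array([1.0, 1.0, 1.0]),
)
```

Training a NeRF amounts to optimizing the density/color function so that pixels rendered this way match the input photos; the eye-reflection work applies the same machinery to the tiny, distorted views mirrored in a subject's corneas.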

Originally published at www.engadget.com.
