
    Simulated human eye movement aims to train metaverse platforms

    Computer engineers at Duke University have developed virtual eyes that simulate how humans look at the world. The virtual eyes are accurate enough for companies to use in training virtual reality and augmented reality applications.



    "Virtual eyes" replicate how human eyes track and react to stimuli. Photo: pixabay


    "The aims of the project are to provide improved mobile augmented reality by using the Internet of Things to source additional information, and to make mobile augmented reality more reliable and accessible for real-world applications," said Prabhakaran Balakrishnan, a program director in NSF's Division of Information and Intelligent Systems.

    The program, EyeSyn, will help developers create applications for the rapidly expanding metaverse while protecting user data.

    "If you're interested in detecting whether a person is reading a comic book or advanced literature by looking at their eyes alone, you can do that," said Maria Gorlatova, one of the study authors.

    "But training that kind of algorithm requires data from hundreds of people wearing headsets for hours at a time. We wanted to develop software that not only reduces the privacy concerns that come with gathering that sort of data, but allows smaller companies that don't have those levels of resources to get into the metaverse game."

    Eye movements carry information about a person's responses to stimuli, emotional state, and concentration. The team of computer engineers developed virtual eyes, trained with artificial intelligence, that mimic how human eyes react to different stimuli.
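    The article doesn't say how such signals are extracted, but a standard first step in eye-tracking analysis is to separate a gaze trace into fixations (stable gaze) and saccades (rapid jumps). The Python sketch below applies the common velocity-threshold (I-VT) rule; the function name, the 30 deg/s threshold, and the summary features are illustrative assumptions, not details of EyeSyn.

```python
import numpy as np

def ivt_features(gaze, timestamps, velocity_threshold=30.0):
    """Summarize a gaze trace with a simple velocity-threshold (I-VT) rule.

    gaze: (N, 2) array of gaze angles in degrees
    timestamps: (N,) array of sample times in seconds
    velocity_threshold: deg/s boundary between fixations and saccades
    """
    # Angular speed between consecutive samples (deg/s)
    speed = np.linalg.norm(np.diff(gaze, axis=0), axis=1) / np.diff(timestamps)
    is_fixation = speed < velocity_threshold
    return {
        # Share of time spent fixating; focused reading tends to score high
        "fixation_ratio": float(is_fixation.mean()),
        # Number of fixation-to-saccade transitions in the trace
        "saccade_count": int(np.count_nonzero(np.diff(is_fixation.astype(int)) == -1)),
        # Typical saccade speed, 0.0 if the trace never leaves fixation
        "mean_saccade_speed": float(speed[~is_fixation].mean()) if (~is_fixation).any() else 0.0,
    }
```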

    The information could be a blueprint for using AI to train metaverse platforms and software, possibly leading to algorithms customized for a specific individual. It could also be used to tailor content production by measuring engagement responses.

    "If you give EyeSyn a lot of different inputs and run it enough times, you'll create a data set of synthetic eye movements that is large enough to train a [machine learning] classifier for a new program," Gorlatova said.

    To test the accuracy of the virtual eyes, the engineers compared the behavior of human eyes with that of the virtual eyes viewing the same event. The results demonstrated that the virtual eyes closely matched the movements of the human eyes.
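    The article doesn't name the metric behind this comparison. One minimal way to score agreement between a human trace and a synthetic trace recorded on the same stimulus is a per-axis correlation, as in this illustrative sketch:

```python
import numpy as np
from scipy.stats import pearsonr

def gaze_similarity(human, synthetic):
    """Score agreement between two equal-length (N, 2) gaze traces
    recorded while viewing the same stimulus. Per-axis Pearson
    correlation is an illustrative choice; the article does not
    name the team's actual evaluation measure."""
    r_x, _ = pearsonr(human[:, 0], synthetic[:, 0])
    r_y, _ = pearsonr(human[:, 1], synthetic[:, 1])
    return (r_x + r_y) / 2
```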

    "The synthetic data alone aren't perfect, but they're a good starting point," Gorlatova said.

    "Smaller companies can use this rather than spending time and money trying to build their own real-world datasets [with human subjects]. And because the personalization of the algorithms can be done on local systems, people don't have to worry about their private eye movement data becoming part of a large database." (National Science Foundation)

    APRIL 20, 2022



