TECHNOLOGY set to radically transform the metaverse has been unveiled by computer engineers at Duke University.

A group of academics have come up with new tech that will make interactions in virtual reality spaces seem far more realistic.

The computer engineers have developed virtual eyes that accurately simulate how humans look at the world.

It is hoped the new technology will allow companies building metaverse platforms to change the programming of avatars to make interactions and conversations between two people in the digital space seem more real.

Called EyeSyn, the virtual reality tech replicates how human eyes react to real-life scenarios.

Human eye movements change and pupils dilate depending on the situation.

They can give away if someone is bored or excited and how much they are concentrating.

Maria Gorlatova, one of the computer engineers who works at Duke, said: “If you’re interested in detecting whether a person is reading a comic book or advanced literature by looking at their eyes alone, you can do that.”

She said that after gathering data from hundreds of people wearing headsets for hours at a time, the academics had successfully come up with an algorithm to mirror the eye movements.
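
To illustrate the kind of inference Gorlatova describes, here is a minimal Python sketch that guesses reading material from a handful of gaze features. The features, thresholds and names are hypothetical assumptions for illustration only; the article does not describe EyeSyn's actual models.

```python
# Illustrative sketch only: the features and thresholds below are
# hypothetical, not EyeSyn's published method.
from dataclasses import dataclass

@dataclass
class GazeFeatures:
    mean_fixation_ms: float   # average time the eyes rest on one spot
    saccades_per_sec: float   # how often the eyes jump to a new spot
    pupil_dilation: float     # relative pupil diameter (1.0 = baseline)

def guess_reading_material(f: GazeFeatures) -> str:
    """Toy heuristic: dense text tends to produce longer fixations
    and more pupil dilation than a comic book."""
    if f.mean_fixation_ms > 250 and f.pupil_dilation > 1.1:
        return "advanced literature"
    return "comic book"

print(guess_reading_material(GazeFeatures(310.0, 2.1, 1.2)))  # advanced literature
print(guess_reading_material(GazeFeatures(180.0, 3.5, 1.0)))  # comic book
```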


When a person is watching someone talk, their eyes alternate repeatedly between the person’s eyes, nose and mouth.

In metaverse meetings, those eye movements can now be replicated to make digital conversations more natural.
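
A minimal sketch of what that replication could look like appears below, assuming hypothetical face landmarks and dwell weights; it illustrates the described behaviour rather than EyeSyn's actual code.

```python
# Minimal sketch of synthetic "listening" gaze, not EyeSyn's actual code:
# the gaze hops between the speaker's eyes, nose and mouth, dwelling
# longest on the eyes, as the article describes.
import random

# Hypothetical 2D face landmarks and dwell weights (illustrative assumptions).
LANDMARKS = {"left_eye": (0.35, 0.40), "right_eye": (0.65, 0.40),
             "nose": (0.50, 0.55), "mouth": (0.50, 0.75)}
WEIGHTS = {"left_eye": 0.3, "right_eye": 0.3, "nose": 0.15, "mouth": 0.25}

def synthesize_gaze(duration_s: float, rng: random.Random) -> list:
    """Return a list of (timestamp, landmark, x, y) gaze fixations."""
    gaze, t = [], 0.0
    while t < duration_s:
        target = rng.choices(list(WEIGHTS), weights=list(WEIGHTS.values()))[0]
        dwell = rng.uniform(0.2, 0.8)  # fixations last a few hundred ms
        x, y = LANDMARKS[target]
        gaze.append((round(t, 2), target, x, y))
        t += dwell
    return gaze

for fixation in synthesize_gaze(3.0, random.Random(0)):
    print(fixation)
```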

“Where you’re prioritising your vision says a lot about you as a person, too,” Gorlatova added.

“It can inadvertently reveal sexual and racial biases, interests that we don’t want others to know about, and information that we may not even know about ourselves.”

The results of the EyeSyn program will be presented at the International Conference on Information Processing in Sensor Networks in May.

In the academic paper published this week outlining the findings, the computer scientists said of their technology: “EyeSyn can benefit applications that feature animated characters or avatars, such as video games, social conversational agents, and photo-realistic facial animation for virtual reality.

“In these applications, the virtual avatars should have realistic eye movements that are consistent with the ongoing activity and the visual stimuli.

“The gaze signals synthesised by EyeSyn can be used as the inputs of the avatar model to produce realistic eye movements for the facial animation.”
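
As a rough illustration of that last point, the sketch below maps a short synthetic gaze trace onto yaw and pitch angles that a hypothetical avatar eye rig could consume; the geometry and interface are assumptions, not EyeSyn's published API.

```python
# Hedged sketch: feeding synthesized gaze samples into an avatar's eye
# animation. The rig and maths here are illustrative assumptions.
import math

def gaze_to_eye_angles(x: float, y: float, depth: float = 1.0):
    """Map a normalised gaze point on the viewed face (0..1 in x and y,
    centre at 0.5) to yaw/pitch angles, in degrees, for the eyeballs."""
    yaw = math.degrees(math.atan2(x - 0.5, depth))
    pitch = math.degrees(math.atan2(0.5 - y, depth))
    return yaw, pitch

# A short synthetic gaze trace: (timestamp in seconds, landmark, x, y).
gaze_trace = [(0.0, "left_eye", 0.35, 0.40),
              (0.4, "mouth", 0.50, 0.75),
              (0.9, "right_eye", 0.65, 0.40)]

for t, target, x, y in gaze_trace:
    yaw, pitch = gaze_to_eye_angles(x, y)
    print(f"t={t:.1f}s look at {target}: yaw={yaw:+.1f} deg, pitch={pitch:+.1f} deg")
```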

Hoping the research will be built on in future, Gorlatova said: “The synthetic data alone isn’t perfect, but it’s a good starting point.”
