Want to spot a deepfake? The eyes could be a giveaway

MONews

Clues to deepfakes may be in the eyes.

Researchers at the University of Hull in England reported July 15 that eye reflections offer a potential way to suss out AI-generated images of people. The approach relies on a technique also used by astronomers to study galaxies.

In real images, light reflections in the eyeballs match up, showing, for instance, the same number of windows or ceiling lights. But in fake images, there’s often an inconsistency in the reflections. “The physics is incorrect,” says Kevin Pimbblet, an observational astronomer who worked on the research with then–graduate student Adejumoke Owolabi and presented the findings at the Royal Astronomical Society’s National Astronomy Meeting in Hull. 

To carry out the comparisons, the team first used a computer program to detect the reflections and then used the pixel values of those reflections, each a measure of light intensity at a point in the image, to calculate what’s called the Gini index. Astronomers use the Gini index, originally developed to measure wealth inequality in a society, to understand how light is distributed across an image of a galaxy. If one pixel has all the light, the index is 1; if the light is evenly distributed across pixels, the index is 0. This quantification helps astronomers classify galaxies into categories such as spiral or elliptical.
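The calculation itself is compact. Here is a minimal sketch in Python of a Gini coefficient applied to pixel intensities; the exact normalization astronomers use varies slightly between papers, so treat this as illustrative rather than as the Hull team’s code.

```python
import numpy as np

def gini(pixels: np.ndarray) -> float:
    """Gini coefficient of a set of pixel intensities.

    Returns 0 when light is spread evenly across the pixels and
    approaches 1 when a single pixel holds nearly all the light.
    """
    values = np.sort(pixels.astype(np.float64).ravel())  # ascending order
    n = values.size
    total = values.sum()
    if n == 0 or total == 0:
        return 0.0
    ranks = np.arange(1, n + 1)
    # Sorted-values identity for the Gini coefficient:
    # G = sum((2i - n - 1) * x_i) / (n * sum(x)), x sorted ascending.
    return float(((2 * ranks - n - 1) * values).sum() / (n * total))

# Evenly lit patch: every pixel equal, so the index is 0.
print(gini(np.full(100, 0.5)))   # 0.0
# One bright pixel among dark ones: the index approaches 1.
spike = np.zeros(100)
spike[0] = 1.0
print(gini(spike))               # 0.99
```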

In the current work, the difference between the Gini indices of the left and right eyeballs is the clue to the image’s authenticity. For about 70 percent of the fake images the researchers examined, this difference was much greater than for real images, where there tended to be little or no difference.
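The article doesn’t give the researchers’ exact decision rule, but a plausible screening step, continuing the sketch above, could look like the following. The 0.15 cutoff is a made-up placeholder, not a value from the study.

```python
import numpy as np
# Reuses the gini() helper defined in the previous sketch.

def flag_for_review(left_eye: np.ndarray, right_eye: np.ndarray,
                    threshold: float = 0.15) -> bool:
    """Flag an image for human review when the two eyes' reflection
    Gini values disagree by more than a chosen threshold."""
    return abs(gini(left_eye) - gini(right_eye)) > threshold
```

A rule like this matches Pimbblet’s framing: it doesn’t declare an image fake, it only surfaces candidates for a closer human look.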

Each of these pairs of eyes (left) has reflections (highlighted at right) that reveal the image as a deepfake. Adejumoke Owolabi

“We can’t say that a particular value corresponds to fakery, but we can say it’s indicative of there being an issue, and perhaps a human being should have a closer look,” Pimbblet says.

He emphasizes that the technique, which could also work on videos, is no silver bullet for detecting fakery (SN: 8/14/18). A real image can look like a fake, for example, if the person is blinking or if they are so close to the light source that only one eye shows the reflection. But the technique could be a part of a battery of tests — at least until AI learns to get reflections right.
