Most of the time, when we read a news story, we can tell whether it's fake or real. But it's much harder to differentiate between a real and a fake video. Google engineer Supasorn Suwajanakorn has come up with a tool that can produce a convincingly realistic video by mimicking the way a person talks. How is that possible, you must be wondering?
Well, the tool scans existing footage of a person's mouth and teeth in order to create the perfect lip-sync. While it sounds exciting, the tool could prove dangerous if it falls into the wrong hands. This is why the engineer is working closely with the AI Foundation on a 'Reality Defender' app, which would run automatically in web browsers to spot and flag fake pictures and videos.
"I let a computer watch 14 hours of pure Obama video, and synthesized him talking," Suwajanakorn commented while showing his work at the TED Conference in Vancouver on Wednesday.
Such technology could be used to create virtual versions of people who are no longer with us. It sounds like something out of a Black Mirror episode! Imagine asking advice from your late grandparents, or meeting a loved one who has passed away.
What's more, Suwajanakorn pointed to the New Dimensions in Testimony project, which allows people to have conversations with holograms of Holocaust survivors. However, he also raised concerns over the potential misuse of this technology.
"These results seemed intriguing, but at the same time troubling; it concerns me, the potential for misuse," he was quoted as saying. "So, I am also working on counter-measure technology to detect fake images and video."
He also gave an example of how a war could break out over a fake video of a world leader announcing a nuclear strike. This is why he has built the 'Reality Defender' tool, which will not only scan for manipulated pictures and videos, but also let users report fake videos or pictures.
"Video manipulation will be used in malicious ways unless counter-measures are in place," the Google engineer noted. "We have to make it very risky and cost-ineffective."
Needless to say, the technology will take some time to mature. For now, we can only wait.
Inputs from AFP