Facebook AI Wants To See The World Through Your Eyes, And That’s Scary

Facebook recently kicked off a research project that reflects the company's ambition to push the limits of first-person perception. The Ego4D project offers a massive collection of first-person video and supporting data, along with a set of challenges designed to help researchers train computers to learn from that footage and extract useful information from it.

Recently, Facebook also launched the Ray-Ban Stories smart glasses, which pack a digital camera and other smart features. Like Google Glass before them, the new product was met with privacy concerns. The Ego4D project aims to create software that would make smart glasses far more useful, but it could also raise eyebrows over privacy invasion. Let's understand it in detail.

What Makes Ego4D Different?

Facebook says the project is based on an egocentric dataset and benchmark suite gathered from 74 locations across nine countries. The dataset includes more than 3,025 hours of daily-life activity video.

The "Ego" in Ego4D stands for egocentric; in simpler words, it means first-person video. The "4D" refers to the three dimensions of space plus the dimension of time. The AI aims to merge images, video, geographical information, and other data to build a model of the user's life.

It involves two big components: a large dataset of first-person photos and videos, and a "benchmark suite" of five tasks used to compare different AI models or algorithms with one another.

These benchmarks involve analyzing first-person video to recall events that took place in the past, create diary entries, understand interactions with objects and people, and predict future events. For the dataset, 855 volunteers provided over 3,000 hours of first-person video, captured using a variety of devices such as GoPro cameras and AR glasses. The videos show activities at home, at work, and in other social settings.
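To make that structure concrete, here is a minimal, hypothetical sketch in Python of how a single first-person clip and its benchmark annotations might be represented. The field names and task labels below are illustrative only; they are not the official Ego4D schema or API.

```python
# Hypothetical sketch of one egocentric clip plus its benchmark annotations.
# Field names and task labels are illustrative, not the official Ego4D schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ClipAnnotation:
    task: str          # e.g. "episodic_memory", "hand_object_interaction", "forecasting"
    start_sec: float   # start of the annotated moment within the clip
    end_sec: float     # end of the annotated moment
    label: str         # human-written description, e.g. "picks up kettle"

@dataclass
class EgocentricClip:
    video_path: str                          # path to the first-person recording
    duration_sec: float                      # total clip length
    location: str                            # one of the worldwide collection sites
    annotations: List[ClipAnnotation] = field(default_factory=list)

# A single annotated clip: the wearer interacts with an object in a kitchen.
clip = EgocentricClip(video_path="clips/kitchen_0001.mp4",
                      duration_sec=312.0,
                      location="home_kitchen")
clip.annotations.append(ClipAnnotation(task="hand_object_interaction",
                                       start_sec=42.5, end_sec=47.0,
                                       label="picks up kettle"))
```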

How Will Ego4D Be Useful?

Ego4D isn't the first video dataset of its kind, but it is 20 times bigger than existing publicly available datasets. It comprises video, audio, 3D mesh scans of the surroundings, and synced footage of the same event from multiple camera angles.

Usually, computer vision models are trained on annotated images and videos to perform a specific task. Facebook claims that current AI datasets and models capture a third-person or "spectator" view, which limits their visual perception.
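For context, that usual supervised recipe looks roughly like the sketch below: a pretrained image backbone is fine-tuned on annotated frames. This is a generic illustration, not Facebook's code; the "frames/train" folder and its activity classes are hypothetical placeholders.

```python
# A minimal sketch of supervised training on annotated frames (generic, not Ego4D code).
import torch
import torch.nn as nn
from torchvision import models, transforms, datasets
from torch.utils.data import DataLoader

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Hypothetical folder of video frames grouped into activity classes.
train_data = datasets.ImageFolder("frames/train", transform=transform)
loader = DataLoader(train_data, batch_size=32, shuffle=True)

# Start from a pretrained backbone and replace the classifier head.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_data.classes))

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for frames, labels in loader:
    optimizer.zero_grad()
    loss = criterion(model(frames), labels)  # compare predictions to annotations
    loss.backward()
    optimizer.step()
```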

With first-person video, it will be easier to design robots that interact better with their surroundings. The social media giant also claims egocentric vision can potentially change the way we use VR and AR devices. If AI models can understand the world from a first-person view the way humans do, VR and AR devices could become as useful as a smartphone.

New Way To Breach Your Privacy?

It goes without saying that the new AI raises significant privacy concerns. If the technology is coupled with smart glasses that constantly record and analyze the world, it could result in people being continuously tracked as they move around in public.

Facebook claims to have maintained high ethical and privacy standards while collecting data for the project. But despite the reassurance, it's still concerning to imagine a future of smart glasses backed by a social media company that is infamous for collecting user data to generate revenue.
