Harvard students develop Meta smart glasses app that reveals people’s sensitive details


Two Harvard engineering students used Ray-Ban Meta smart glasses to create an app that can expose sensitive information about people without their knowledge. The students posted a demo video on X (formerly known as Twitter) showing the app's potential. Notably, the app is not being made publicly available; instead, they created it to highlight the dangers of AI-powered wearable devices that use discreet cameras to capture photos of people.

The app, called I-XRAY, uses artificial intelligence (AI) for facial recognition and then uses the processed visual data to dox individuals. Doxxing, Internet slang derived from "dropping docs" (the unauthorised release of someone's documents or records), is the act of revealing personal information about a person without their consent.

The app was integrated with Ray-Ban Meta smart glasses, but the developers said it will work with any smart glasses equipped with discreet cameras. It uses the same kind of AI models as PimEyes and FaceCheck for reverse facial recognition. This technology matches a person's face against publicly available pictures online and returns the URLs of the pages where those matches appear.

These URLs are then fed to a large language model (LLM), which is automatically prompted to infer the person's name, occupation, address and other similar details. The system also draws on publicly available government data, such as voter registration databases. In addition, the developers used an online people-search tool called FastPeopleSearch.
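The data flow described above can be sketched roughly as follows. This is an illustrative outline only, not the students' actual code: the function names, the `Profile` structure, and the placeholder return values are assumptions made for the sake of showing how a reverse face search feeds into an LLM extraction step.

```python
from dataclasses import dataclass, field

@dataclass
class Profile:
    """Structured fields the LLM step would try to fill in."""
    name: str = "unknown"
    occupation: str = "unknown"
    sources: list[str] = field(default_factory=list)

def reverse_face_search(face_image: bytes) -> list[str]:
    """Stand-in for a PimEyes/FaceCheck-style reverse face search.

    A real implementation would call a commercial API with the captured
    face image; here we return a placeholder URL to show the data flow.
    """
    return ["https://example.com/profile/123"]

def extract_with_llm(urls: list[str]) -> Profile:
    """Stand-in for the LLM step.

    A real implementation would scrape each URL and prompt a language
    model to pull out name, occupation, address, and so on.
    """
    return Profile(sources=list(urls))

def identify(face_image: bytes) -> Profile:
    # Step 1: match the face to public pages online.
    urls = reverse_face_search(face_image)
    # Step 2: have an LLM distil those pages into a profile.
    return extract_with_llm(urls)
```

The point of the sketch is the chaining: each stage on its own (face matching, web scraping, LLM summarisation) is an off-the-shelf capability, and the privacy risk comes from wiring them together behind a camera.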

In a short video demonstration, Harvard students AnhPhu Nguyen and Caine Ardayfio showed the app at work. They approached strangers with the glasses' camera on, asked for their names, and the AI-powered app took over from there to surface personal data about each person.

In a Google Docs file, the developers said, “This synergy between LLM and reverse face search allows for fully automated and comprehensive data extraction that was not previously possible with traditional methods alone.”

The students have said that they do not intend to make the app publicly available and that they developed it only to highlight the risks of AI-enabled wearable devices that can discreetly record people. However, this does not mean that bad actors could not build similar apps using the same methodology.




G Varshith