Two Harvard University students are exposing how much personal information is publicly available online by combining smart glasses with artificial intelligence to collect that data simply by looking at someone.
AnhPhu Nguyen is a junior at Harvard studying human augmentation, and Caine Ardayfio is a junior studying physics. Using a pair of Ray-Ban Meta Smart Glasses available to them through Harvard's Augmented Reality Club, they built software that combines existing search engines and facial recognition technology to identify people on the spot through those lenses.
"You get a video feed from the glasses, and we have a bot that takes those video data and tries to find a face in it," explained Ardayfio. "If it finds a face, then it will upload it to this tool called 'PimEyes,' and it will essentially, it's called 'reverse image search,' where you take an image and you find other similar images online. Once you have URLs of those other images, we use an AI basically to try and figure out a person's name. Once we find their name, we use databases like voter registration databases to find an address, phone number, that type of thing."
An app they created uses a large language model, a type of artificial intelligence algorithm, to compile the data within a minute.
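As a rough illustration, the pipeline Ardayfio describes could be sketched as follows. This is a hypothetical reconstruction, not the students' unreleased code: the face detection step uses OpenCV's bundled Haar cascade, while reverse_image_search, extract_name_with_llm and lookup_records are placeholder names for the reverse-image-search, LLM and records-lookup steps (PimEyes offers no public API), left deliberately unimplemented here.

```python
# Hypothetical sketch of the glasses-to-identity pipeline described above.
# Only the face detection step is real; the other steps are placeholders.
import cv2  # OpenCV, used here for off-the-shelf face detection


def find_faces(frame):
    """Detect faces in a video frame with OpenCV's stock Haar cascade."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)


def reverse_image_search(face_crop):
    """Placeholder: submit a face crop to a reverse image search service
    and return URLs of pages containing similar images."""
    raise NotImplementedError("No public PimEyes API; placeholder only.")


def extract_name_with_llm(urls):
    """Placeholder: prompt a large language model with the found pages
    to infer the person's likely name."""
    raise NotImplementedError("Placeholder for an LLM call.")


def lookup_records(name):
    """Placeholder: query public people-search or voter-roll data by name
    for an address, phone number, and similar records."""
    raise NotImplementedError("Placeholder for a records lookup.")


def identify(frame):
    """End-to-end flow: frame -> face -> image URLs -> name -> records."""
    for (x, y, w, h) in find_faces(frame):
        face = frame[y:y + h, x:x + w]
        urls = reverse_image_search(face)
        name = extract_name_with_llm(urls)
        return name, lookup_records(name)
    return None  # no face found in this frame
```

The point of the sketch is how short the chain is: every stage relies on tooling or data that is already publicly available, which is exactly what the students set out to demonstrate.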
They posted a video on social media demonstrating the technology by showing other Harvard students and strangers the personal information they learned through the process. The reveals were met with amazement and shock.
"Are we ready for a world where our data is exposed at a glance? @CaineArdayfio and I offer an answer to protect yourself here: https://t.co/LhxModhDpk"
— AnhPhu Nguyen (@AnhPhuNguyen1) September 30, 2024
"We were surprised just how much data you could extract now that large language models are the piece that unlocks the rest of the pipeline," said Nguyen. "Most people didn't even know that these tools could exist where you could just find someone's home address from their name, so people were rightfully scared. That's why when we started building this, we opted to also make that guide to solve the problem immediately."
They compiled a list of instructions on how to opt out of reverse face search engines and people search engines, and how to protect yourself from data leaks.
In a report Friday on Nguyen and Ardayfio's findings, the Boston Globe noted that no federal law bars the use of facial recognition systems and that legal standards vary across the country.
Some states, including Massachusetts, restrict government use of the technology, and Boston and Cambridge have banned government use entirely, the Globe wrote.
Nguyen said he can imagine some positive use cases, such as in networking situations or even helping people with memory loss.
"Some people have reached out about using this for dementia patients for known faces, so they wear glasses, it just identifies who the name is," said Nguyen. "That could really help a lot of people who struggle recognizing faces and people who struggle remembering things in general. I mean, my grandma had Alzheimer's, so I think that's a huge use case."
He said some magicians have even reached out about possibly using the technology, but the pair don't plan to release the product or their code because they recognize how it could be misused.
"Basically, why we did this was to raise awareness of how much data we have just publicly available," said Nguyen. "All of the data that we collect is publicly accessible by anyone. You can instantly delete yourself and make this tool ineffective."
These students aren't the first to experiment with this kind of technology.
Last year, the New York Times reported that tech giants Google and Facebook had built tools years ago that could put a name to any face but chose to hold them back, deciding the technology was too dangerous to make widely available. Other companies, however, continue to develop similar tools.