In a striking development, three Stanford graduate students have built an AI tool capable of pinpointing where photographs were taken, a feat that has raised alarm among privacy advocates. The tool, named Predicting Image Geolocations (PIGEON), showcases how far artificial intelligence has advanced in the realm of geolocation.
Originally devised to identify locations in Google Street View images, PIGEON proved capable of accurately identifying where personal photographs were taken as well. While the tool promises benefits like aiding biologists in ecological studies or helping people identify where ancestral photographs were shot, it also raises serious new privacy concerns.
Jay Stanley, a senior policy analyst at the American Civil Liberties Union with a focus on technology, expresses trepidation over the potential misuse of such advancements. The ease of access to location data, he argues, could pave the way for invasive government surveillance, corporate espionage, and even personal stalking.
The genesis of PIGEON traces back to a Stanford computer science course, where Michal Skreta, Silas Alberti, and Lukas Haas, united by their enthusiasm for the geolocation game GeoGuessr, set out to create an AI that could beat human players at the game. Building on OpenAI's CLIP, a neural network that learns visual concepts from natural-language supervision, the team trained PIGEON on a curated dataset of half a million Street View images. The result was a system that can identify a photo's country with 95% accuracy and typically place its specific location within about 25 miles.
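To make the "within about 25 miles" figure concrete, here is a minimal sketch (not the Stanford team's code) of how geolocation error is commonly scored: the great-circle distance between a predicted coordinate and the true one, computed with the standard haversine formula. The function names and example coordinates below are illustrative assumptions, not details from the article.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two (lat, lon) points,
    using the haversine formula on a spherical Earth."""
    R = 3958.8  # Earth's mean radius in miles
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * R * math.asin(math.sqrt(a))

def within_25_miles(pred, truth):
    """Would a guess at `pred` count as a hit under a 25-mile threshold?"""
    return haversine_miles(*pred, *truth) <= 25.0

# Hypothetical example: true location is Stanford's campus,
# and a model guesses a point in nearby Palo Alto.
truth = (37.4275, -122.1697)   # Stanford
guess = (37.4419, -122.1430)   # Palo Alto (~2 miles away: a hit)
far_guess = (37.7749, -122.4194)  # San Francisco (~28 miles away: a miss)
```

A guess a couple of miles off, like the Palo Alto point above, falls well inside the 25-mile radius, while a guess as far away as San Francisco does not.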
PIGEON's prowess was put to the test against Trevor Rainbolt, a renowned figure in the geoguessing community, and it emerged victorious, demonstrating an ability to pick up on subtle environmental cues such as variations in vegetation, soil, and climate.
Despite its remarkable accuracy, PIGEON did make occasional missteps, such as confusing the Snake River Canyon in Idaho with New Zealand's Kawarau Gorge. These rare errors, however, do little to diminish the significance of what the system can do.
Stanley, reflecting on PIGEON’s potential, highlights the concerns regarding the expansive power of AI, particularly in the hands of entities like Google, which possesses an extensive Street View database. The technology’s capacity to trace an individual’s travels or flag visits to sensitive locations remains a pressing concern.
In response to these privacy implications, the Stanford team, along with their professor Chelsea Finn, has refrained from releasing the complete model publicly. Their caution underscores the ethical considerations intertwined with advances in AI. Stanley concludes by urging vigilance about what is visible in the background of publicly shared photographs, a reminder of the evolving landscape of data privacy in the digital age.