Every smartphone ping, GPS coordinate, facial scan, online purchase, and social media like becomes part of your “digital exhaust”—a breadcrumb trail of metadata that the government now uses to build behavioral profiles. The FBI calls it “open-source intelligence.” But make no mistake: this is dragnet surveillance, and it is fundamentally unconstitutional.
Already, government agencies are mining this data to generate “pattern of life” analyses, flag “radicalized” individuals, and preemptively investigate those who merely share anti-government views.
This is not law enforcement. This is thought-policing by machine, the logical outcome of a system that criminalizes dissent and deputizes algorithms to do the targeting.
Nor is this entirely new.
For decades, the federal government has reportedly maintained a highly classified database known as Main Core, designed to collect and store information on Americans deemed potential threats to national security.
As Tim Shorrock reported for Salon, “One former intelligence official described Main Core as ‘an emergency internal security database system’ designed for use by the military in the event of a national catastrophe, a suspension of the Constitution or the imposition of martial law.”
Trump’s embrace of Palantir, and its unparalleled ability to fuse surveillance feeds, social media metadata, public records, and AI-driven predictions, marks a dangerous evolution: a modern-day resurrection of Main Core, digitized, centralized, and fully automated.
What was once covert contingency planning is now becoming active policy.
A data broker has been selling raw location data about individual people to federal, state, and local law enforcement agencies, EFF has learned. This personal data isn’t gathered from cell phone towers or tech giants like Google — it’s obtained by the broker via thousands of different apps on Android and iOS app stores as part of the larger location data marketplace.
The company, Fog Data Science, has claimed in marketing materials that it has “billions” of data points about “over 250 million” devices and that its data can be used to learn about where its subjects work, live, and associate. Fog sells access to this data via a web application, called Fog Reveal, that lets customers point and click to access detailed histories of regular people’s lives. This panoptic surveillance apparatus is offered to state highway patrols, local police departments, and county sheriffs across the country for less than $10,000 per year.
Fog Reveal will return a list of location signals associated with each device. Fog’s materials describe this capability as providing a person’s “pattern of life,” which allows authorities to identify “bed downs,” presumably meaning where people sleep, and “other locations of interest.” In other words, Fog’s service allows police to track people’s movements over long periods of time.
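To make concrete how "bed down" inference could work in principle, here is a minimal hypothetical sketch. Fog's actual method is proprietary and not public; this toy version simply assumes that the rounded coordinate cell where a device most often appears at night is likely where its owner sleeps. All data, names, and thresholds below are invented for illustration.

```python
from collections import Counter
from datetime import datetime

# Hypothetical timestamped pings for one advertising ID: (ISO timestamp, lat, lon).
pings = [
    ("2022-06-01T23:40:00", 40.7130, -74.0061),
    ("2022-06-02T02:15:00", 40.7131, -74.0060),
    ("2022-06-02T03:50:00", 40.7129, -74.0059),
    ("2022-06-02T13:05:00", 40.7580, -73.9855),  # daytime ping: workplace?
    ("2022-06-03T00:30:00", 40.7130, -74.0062),
]

def likely_bed_down(pings, night_start=22, night_end=6, precision=3):
    """Return the most frequent nighttime location cell (rounded lat/lon)."""
    cells = Counter()
    for ts, lat, lon in pings:
        hour = datetime.fromisoformat(ts).hour
        if hour >= night_start or hour < night_end:
            # Round coordinates so nearby pings fall into the same cell.
            cells[(round(lat, precision), round(lon, precision))] += 1
    return cells.most_common(1)[0][0] if cells else None

print(likely_bed_down(pings))  # → (40.713, -74.006)
```

Even this crude heuristic, run over months of app-sourced pings, would converge on a home address for most devices, which is why long location histories are so revealing.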
[...]
In other words, even if a user consents to an app collecting location data, it is highly unlikely that they consent to that data winding up in Fog’s hands and being used for law enforcement surveillance.
Fog’s second claim, that its data contains no personally identifying information, is hard to square with common understandings of the identifiability of location data as well as with records showing Fog’s role in identifying individuals.
Location data is understood to be “personally identifying” under many privacy laws. The Colorado Privacy Act specifically defines “identified individuals” as people who can be identified by reference to “specific geolocation data.” The California Privacy Rights Act considers “precise geolocation data” associated with a device to be “sensitive personal information,” which is given heightened protections over other kinds of personal information. These definitions exist because location data traces can often be tied back to individuals even in the absence of other PII. Academic researchers have shown over and over again that de-identified or “anonymized” location data still poses privacy risks.
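The re-identification risk those researchers document can be illustrated with a toy example: even a couple of coarse (hour, place) observations about a person often match exactly one trace in an "anonymized" dataset. The devices, cells, and observations below are all hypothetical.

```python
# Toy demonstration: each "anonymized" trace is a set of (hour, cell) points.
traces = {
    "device_a": {(8, "cell_12"), (13, "cell_40"), (23, "cell_12")},
    "device_b": {(8, "cell_12"), (13, "cell_41"), (23, "cell_77")},
    "device_c": {(9, "cell_55"), (13, "cell_40"), (22, "cell_55")},
}

def matching_devices(traces, observations):
    """Devices whose trace contains every observed (hour, cell) point."""
    return [d for d, pts in traces.items() if observations <= pts]

# One observation narrows the field; a second singles out device_a.
print(matching_devices(traces, {(8, "cell_12")}))                   # → ['device_a', 'device_b']
print(matching_devices(traces, {(8, "cell_12"), (13, "cell_40")}))  # → ['device_a']
```

Once a trace is unique, anyone who knows two facts about you (where you were at 8 a.m. and 1 p.m., say) can pull your entire movement history from the dataset, even though it contains no name, phone number, or other conventional PII.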