
Flock hires overseas gig workers to train AI that monitors U.S. movements


Flock, a startup that builds AI‑driven surveillance tools, has turned to a global pool of gig workers to annotate the video streams its cameras capture on American streets. The company outsources the painstaking task of labeling each frame—identifying license plates, vehicle colors, brands and models—to contractors based overseas, a practice that mirrors other AI‑training pipelines but carries its own set of implications. While the technical challenge of teaching a system to recognize moving objects is formidable, the stakes rise when the data set consists of continuous, city‑wide footage of everyday traffic.

Critics point out that the constant sweep of public roadways creates a repository far more personal than the typical image sets used for facial‑recognition or product‑tagging models. As Flock scales its operation, the question of how sensitive that visual record is—and who ultimately handles it—becomes central to any assessment of the technology’s broader impact.


But the nature of Flock's business, a surveillance system that constantly monitors US residents' movements, means its footage may be more sensitive than the material used in other AI training jobs. Flock's cameras continuously scan the license plate, color, brand, and model of every vehicle that drives by. Law enforcement agencies can then search cameras nationwide to see where else a vehicle has driven.

Authorities typically dig through this data without a warrant, which recently led the American Civil Liberties Union and the Electronic Frontier Foundation to sue a city blanketed in nearly 500 Flock cameras. Broadly, Flock uses AI or machine learning to automatically detect license plates, vehicles, and people, including what clothes they are wearing, from camera footage. A Flock patent also mentions cameras detecting "race." Multiple tipsters pointed 404 Media to an exposed online panel that showed various metrics associated with Flock's AI training.

It included figures on "annotations completed" and "annotator tasks remaining in queue," with annotations being the notes workers add to reviewed footage to help train AI algorithms. Tasks include categorizing vehicle makes, colors, and types, transcribing license plates, and "audio tasks." Flock recently started advertising a feature that will detect "screaming." The panel showed workers sometimes completed many thousands of annotations over two-day periods. The exposed panel also included a list of people tasked with annotating Flock's footage.

Taking those names, 404 Media found some were located in the Philippines, according to their LinkedIn and other online profiles.
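
To make the annotation workflow concrete, below is a minimal, purely illustrative sketch in Python of the kind of record such a labeling task might produce. Every field name and the overall structure are assumptions made for illustration; they do not reflect Flock's actual systems or schema.

from dataclasses import dataclass
from typing import Optional

@dataclass
class VehicleAnnotation:
    # Hypothetical schema: every field here is an assumption for illustration.
    frame_id: str                 # identifier of the reviewed video frame
    license_plate: Optional[str]  # transcribed plate text, if legible
    vehicle_make: Optional[str]   # e.g. "Toyota"
    vehicle_color: Optional[str]  # e.g. "silver"
    vehicle_type: Optional[str]   # e.g. "sedan" or "pickup"
    annotator_id: str             # worker who completed the task

# Example of the sort of record a contractor might submit:
example = VehicleAnnotation(
    frame_id="frame-000123",
    license_plate="ABC1234",
    vehicle_make="Toyota",
    vehicle_color="silver",
    vehicle_type="sedan",
    annotator_id="worker-42",
)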


Is it appropriate for a U.S. surveillance firm to hand its visual data over to workers abroad? Flock's practice of hiring Upwork freelancers to label footage of American streets came to light through leaked internal documents and the exposed online panel.

The training guides explicitly tell contractors how to sort images of people and vehicles captured by the company's license-plate readers. The cameras continuously record the plate numbers, colors, brands, and models of every passing car. As a result, the data set includes details that could identify individual movements.

The data is plainly sensitive, yet the company has not disclosed who ultimately sees the annotated material or how it is safeguarded after the gig workers finish. The exposure raises questions about the chain of custody for surveillance footage that is arguably more sensitive than typical AI training inputs.

Because the annotated images travel beyond the original capture point, regulators may need to consider whether existing oversight mechanisms address this transnational handling of domestic surveillance data. While Flock argues the process improves algorithm accuracy, the lack of transparency leaves it unclear whether privacy safeguards are sufficient. Stakeholders will likely scrutinize the balance between operational efficiency and the potential for broader access to domestic visual surveillance.

Common Questions Answered

Why does Flock outsource video annotation to overseas gig workers?

Flock outsources the work because labeling each video frame (identifying license plates, vehicle colors, brands, and models) is labor-intensive, and overseas contractors make it cost-effective at scale. Contractors on platforms like Upwork can process large volumes of footage quickly, giving the AI-driven surveillance system diverse data to learn from.

What specific types of visual data do Flock's cameras capture for AI training?

Flock's cameras continuously record license‑plate numbers, vehicle colors, brands, and models of every passing car on American streets. The footage also includes images of people near the vehicles, which are annotated according to detailed training guides provided to the freelancers.

How might law enforcement use the data generated by Flock's surveillance system?

Law enforcement can query the system to locate where a particular vehicle has been seen across the nationwide camera network. This capability allows authorities to track movements without obtaining a warrant, raising concerns about privacy and due‑process safeguards.
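
As a purely illustrative sketch, the toy Python example below models the kind of nationwide plate lookup described above. The data, names, and behavior are assumptions for illustration only and are not Flock's actual API or systems.

from datetime import datetime
from typing import NamedTuple

class Sighting(NamedTuple):
    camera_id: str
    city: str
    timestamp: datetime

# Toy in-memory stand-in for a nationwide network of plate reads.
SIGHTINGS = {
    "ABC1234": [
        Sighting("cam-ga-017", "Atlanta", datetime(2024, 5, 1, 8, 15)),
        Sighting("cam-tx-203", "Dallas", datetime(2024, 5, 3, 19, 40)),
    ],
}

def where_has_vehicle_been(plate: str) -> list:
    # Return every recorded sighting of the plate, oldest first.
    return sorted(SIGHTINGS.get(plate, []), key=lambda s: s.timestamp)

for s in where_has_vehicle_been("ABC1234"):
    print(s.city, s.timestamp, s.camera_id)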

What ethical concerns arise from Flock handing U.S. visual data to contractors abroad?

Handing sensitive footage of American streets to overseas freelancers may expose personal information to jurisdictions with weaker privacy protections, potentially compromising national security. Critics argue that this practice sidesteps domestic oversight and could enable misuse of detailed location data.
