Signal's Meredith Whittaker Asserts: AI is Inherently a 'Surveillance Technology'

Speaking at TechCrunch Disrupt 2023, Whittaker emphasized that AI is closely intertwined with the big data and targeting sector.

 

Many companies heavily invested in monetizing user data also show a keen interest in AI. Signal's president, Meredith Whittaker, argues this is no accident: in her view, "AI is a surveillance technology."

Speaking at TechCrunch Disrupt 2023, Whittaker emphasized that AI is closely intertwined with the big data and targeting sector, dominated by giants like Google and Meta, as well as influential enterprise and defense corporations. 

She pointed out that AI amplifies the surveillance business model, an extension of the trend observed since the late '90s with the rise of surveillance advertising. According to her, AI serves to solidify and expand this model. She metaphorically described the relationship as a complete overlap in a Venn diagram.

Whittaker further argued that using AI is itself inherently surveillant. Passing a facial recognition camera equipped with pseudo-scientific emotion analysis, for instance, generates data, accurate or not, about one's emotional state or character. These systems are ultimately tools of surveillance, marketed to employers, governments, and border control, entities that hold sway over individuals' access to resources and opportunities.

Ironically, she pointed out that the very individuals whose data underpins these systems are often the ones responsible for organizing and annotating it. This step is crucial in the process of creating datasets for AI.

Whittaker stressed that it is impossible to build these systems without human labor, which establishes the ground truth of the data, often through tasks such as reinforcement learning with human feedback, a term she described as jargon that disguises the labor involved. While this work is collectively expensive, individual workers are often paid meager wages. In essence, she said, the perceived intelligence behind these systems diminishes significantly once the curtain is pulled back.

However, not all AI and machine learning systems share the same level of exploitation. When asked if Signal incorporates any AI tools or procedures in its app or development work, Whittaker acknowledged the presence of a "small on-device model" that they didn't develop themselves, but rather acquired off the shelf. This model is utilized in Signal's face blur feature within their media editing toolkit. 

She noted that while it's not exceptionally effective, it aids in detecting faces in crowded photos and blurring them, ensuring that individuals' intimate biometric information isn't inadvertently disclosed on social media, particularly to entities like Clearview.
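The mechanics of such a feature are easy to sketch. Once a detector (the job of Signal's off-the-shelf on-device model) has produced face bounding boxes, blurring those regions is simple array manipulation. The snippet below is an illustrative Python/NumPy sketch under that assumption; the bounding box and kernel size are hypothetical, and this is not Signal's actual implementation:

```python
import numpy as np

def blur_region(image, box, kernel=9):
    """Box-blur one rectangular region of a grayscale image in place.

    image: 2-D NumPy array of pixel intensities.
    box:   (top, left, bottom, right), e.g. as reported by a face detector.
    """
    top, left, bottom, right = box
    region = image[top:bottom, left:right].astype(float)
    h, w = region.shape
    half = kernel // 2
    out = np.empty_like(region)
    # Replace each pixel with the mean of its kernel x kernel neighborhood,
    # clipped at the region edges.
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - half), min(h, y + half + 1)
            x0, x1 = max(0, x - half), min(w, x + half + 1)
            out[y, x] = region[y0:y1, x0:x1].mean()
    image[top:bottom, left:right] = out.astype(image.dtype)
    return image

# Hypothetical detector output: one face at rows 4-11, cols 4-11.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(16, 16), dtype=np.uint8)
face_box = (4, 4, 12, 12)
orig_var = img[4:12, 4:12].astype(float).var()
blur_region(img, face_box)
new_var = img[4:12, 4:12].astype(float).var()  # variance drops after blurring
```

A real implementation would run the detector and a separable or Gaussian blur on-device, but the privacy property is the same: the biometric detail inside the box is destroyed before the photo leaves the phone.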

Whittaker concluded by emphasizing that while this is a commendable use of AI, it doesn't negate the negative aspects she discussed earlier. She emphasized that the economic motives driving the costly development and deployment of facial recognition technology would never limit its application to this singular purpose.