Brazil’s data protection authority, the ANPD, has ordered Meta to stop using Brazilian users’ personal data to train its AI models. The order, similar to restrictions Meta faces in Europe, carries daily fines if the company does not comply.
The ANPD’s ruling, detailed in an official announcement, stems from concerns about potential harm to users and the difficulty they face in opting out. Meta has five business days to comply or face daily penalties of 50,000 reais ($8,808). The regulator warned of an “imminent risk” to the fundamental rights of Brazilian users.
Privacy Policy and User Data Collection
Meta amended its privacy policy in May to allow public data from Facebook, Messenger, and Instagram, including posts and images, to be used for AI training in Brazil. The ANPD argues, however, that the policy places “excessive and unjustified obstacles” in the way of users seeking to opt out. Meta insists its policy complies with Brazilian law and calls the order a setback for AI innovation and competition.
In a statement to the Associated Press, Meta voiced its disappointment with the ANPD’s decision, defending AI training on user data as standard practice in the tech sector. The company claims to be more transparent than many competitors and says it is willing to work with the ANPD to resolve its concerns.
Human Rights and Ethical Concerns
A report from Human Rights Watch indicated that the LAION-5B dataset, employed in training AI models, includes identifiable images of Brazilian children. This raises ethical issues about potential misuse, including deepfakes. Meta’s updated data collection policy has also been rolled out in the United States, where privacy protections are generally less stringent.
Meta’s regulatory challenges extend beyond Brazil. Similar objections in the European Union forced the company to suspend plans to use Facebook and Instagram data for AI training. The Brazilian order underscores the growing global scrutiny of Meta’s AI practices.
Last Updated on November 7, 2024 3:43 pm CET