The Electronic Frontier Foundation (EFF) has called on the Federal Trade Commission (FTC) to target companies employing misleading chatbots, labeling these practices as “unfair and deceptive.”
Legal Recommendations from EFF
Cory Doctorow, an advisor to the EFF, has recommended that the FTC utilize Section 5 of the Federal Trade Commission Act to confront this issue. This section prohibits “unfair or deceptive acts or practices in or affecting commerce.” Doctorow suggests that the FTC should issue guidelines to penalize companies whose chatbots deceive customers, framing such actions as violations warranting fines and other disciplinary measures.
A recent U.S. Supreme Court decision overturning Chevron deference complicates potential FTC action. Under that doctrine, courts deferred to agencies' interpretations of ambiguous statutes. Now judges have greater authority to decide the scope of regulatory power themselves, which is likely to invite more legal challenges from businesses against agency actions not clearly authorized by statute.
FTC's Historical Flexibility
Section 5's reach is deliberately left imprecise, reflecting Congress's intent to let the FTC adapt its enforcement to evolving market conditions and business practices. The agency has historically applied the statute flexibly, case by case, subject to judicial review.
Despite general resistance from the technology sector, there has been a growing push for AI-specific legislation worldwide. According to the National Conference of State Legislatures, AI-related bills were introduced in more than 40 states and territories in 2024, and several have already been enacted. Eric Goldman, a law professor at Santa Clara University, told The Register that he expects a challenging regulatory future for the AI industry given this surge in legislative activity.
Real-World Impacts of Deceptive Chatbots
Doctorow underscores the need for regulation with real-world examples of chatbots causing significant harm. An Air Canada passenger, misled by the airline's chatbot about bereavement fare policies, pursued a legal claim and was awarded CA$812.02 in compensation. Other incidents include chatbots dispensing incorrect legal advice, using offensive language with customers, and advising businesses to break the law.
The EFF stresses that chatbots' tendency to generate false information—known as “hallucinations”—poses a substantial risk to consumers. FTC Chair Lina Khan has proposed applying Section 5 to such conduct, having previously drawn on the agency's regulatory powers to address noncompete clauses and online privacy concerns.