The Center for Digital Democracy and other advocacy groups have filed a complaint with the Federal Trade Commission against Replika, an AI chatbot company, alleging deceptive marketing practices and potential harm to users. The complaint focuses on two main issues: the company’s handling of sexual content and its mental health claims. The groups argue that Replika markets itself as a mental health tool without proper clinical validation, potentially endangering vulnerable users seeking emotional support.

The complaint also raises concerns about the chatbot’s sexual content and role-play features, particularly access by minors and the creation of explicit AI-generated images. Critics argue that Replika’s marketing downplays the sexual nature of these interactions while simultaneously promoting them through targeted ads. The complaint emphasizes the absence of age verification systems and the risk of emotional manipulation, especially for young users who may form strong attachments to their AI companions.

The advocacy groups also question Replika’s data collection practices and the company’s claims about user privacy. The case reflects a broader concern about AI chatbot regulation and the need for stronger consumer protections in the emerging AI companion market. The groups are urging the FTC to investigate these practices and establish clearer guidelines for AI companies marketing emotional support services.