Facebook's "recklessness" has put children at risk, and it needs to "answer for its failures", said the Federal Trade Commission's (FTC) Director Samuel Levine yesterday. The comments come after the American competition regulator found that the tech giant, now rechristened as Meta, had allegedly failed to comply with a 2020 order directing it to enhance its privacy practices. Alleging that Facebook had misrepresented its privacy practices to the detriment of user privacy and children's safety online, the FTC has recommended changes to its 2020 order, including prohibiting Meta's services (like Facebook, Instagram, WhatsApp, and Oculus) from monetising the data of users under the age of 18, or once they turn 18. "The company could only collect and use such data to provide the services or for security purposes," the FTC recommended. Meta may also be banned from launching new or modified products without confirmation from an independent assessor that its privacy protocols comply with the FTC's 2020 order. Meta may also have to ensure compliance for companies it merges with or acquires, while it could also have to obtain users' "affirmative consent" for future use of facial recognition technology. "Some privacy program provisions in the 2020 order would be strengthened, such as those related to privacy review, third-party monitoring, data inventory and access controls, and employee training," the FTC recommended. "Meta’s reporting obligations also would be expanded to include its own violations of its commitments." So, what next? The FTC has issued an "Order to Show Cause" to Meta—now, it…
