The Narrative, August 1, 2023

U.S. AI Regulation Looks a Lot Like Content Moderation

On July 21, the White House announced it had secured Voluntary Commitments from several leading companies to help manage risks posed by artificial intelligence. The seven companies signing on are Amazon, Anthropic, Google, Inflection, Meta, Microsoft, and OpenAI. The Biden administration claims credit for jawboning them into this agreement. Under commitments covering “safety, security, and trust,” the firms will, among other things:

  • Commit to internal and external red-teaming of models or systems in areas including misuse, societal risks, and national security concerns, such as bio, cyber, and other safety areas.

  • Work toward information sharing among companies and governments regarding trust and safety risks, dangerous or emergent capabilities, and attempts to circumvent safeguards.

  • Invest in cybersecurity and insider threat safeguards to protect proprietary and unreleased model weights.

  • Incent third-party discovery and reporting of issues and vulnerabilities.

  • Develop and deploy mechanisms that enable users to understand when audio or visual content is AI-generated, including robust provenance, watermarking, or both.

  • Publicly report model or system capabilities, limitations, and domains of appropriate and inappropriate use, including discussion of societal risks, such as effects on fairness and bias.

Despite some howls of protest from those convinced government regulation is the only path forward, voluntary commitments by the leading developers are really the only intervention that makes sense right now. Those calling loudly for “AI regulation” have never been able to specify exactly what they would regulate, which agency would do it, and what criteria would be used. Crafting specific legislation or rules and anticipating specific results under conditions of rapid technological change is nearly impossible. Voluntary …
