New Democrat Bill Calls For A Federal Agency To Create Behavioral Codes and Introduce Disinformation Experts
Shared By Peter Boykin – American Political Commentator / Citizen Journalist
US politicians are sworn to defend the US Constitution. And yet, not a day goes by when they don’t try to undermine it. A new bill wants to create what is essentially a federal speech control agency. Plus, senators have been plotting ways to intervene in what AI is allowed to say.
A SPEECH AGENCY
On May 18, two US senators introduced the Digital Platform Commission Act of 2023, a bill that seeks to empower a new federal agency to set up a council regulating AI in the context of social platforms.
More precisely, the new body – the Federal Digital Platform Commission – would “rule” on what are termed enforceable “behavioral codes,” and its staff would include “disinformation experts.” The move by the two Democratic senators – Michael Bennet and Peter Welch – appears timed to congressional testimony delivered by OpenAI CEO Sam Altman, since the bill was introduced shortly afterwards, and it backs Altman’s idea of forming a new federal agency of this kind.
This is done by proposing that the future Commission be given authority over how personal information is used in decision-making or content generation, which is thought to refer specifically to tech like ChatGPT. A statement issued by Bennet justified the push for such regulation by accusing “technology” of corrupting democracy and harming children while operating without proper oversight.
Nor does he think that the Federal Trade Commission or the Department of Justice can do the job, because they lack both the expert staff and the resources to provide “robust and sustained” regulation of the digital platforms sector.
According to Bennet, the Commission’s job would be to regulate companies in the interest of consumers and competition, but also to “defend the public interest.” If passed, the Digital Platform Commission Act would introduce several key provisions into US legislation regulating the field.
The federal commission would have five members, who would be able to hold hearings, conduct investigations, assess fines, and establish rules through public rule-making.
Thus, the Commission would be allowed to designate some digital platforms as systemically important and then subject those to extra oversight and regulation – such as audits and “explainability” requirements for their algorithms.
As part of the Commission, the bill proposes establishing a Code Council whose job would be to come up with voluntary or enforceable behavioral codes, technical standards, or other policies – such as transparency and accountability for algorithmic processes.
The Council would have 18 members in all, representing digital platforms or their associations, as well as non-profits, academics, and experts whose focus is on technology policy, law, consumer protection, privacy, competition – and disinformation.
This hodgepodge of sometimes conflicting concepts and issues found in the description of the Council also appears in the very introduction to the bill, where an attempt is made to justify the need to establish the Commission.