Texas Becomes Latest State to Weigh Guardrails for Generative AI
U.S. state legislatures are weighing whether to build their own regulatory frameworks around artificial intelligence in the absence of any comprehensive federal rules that protect consumers and businesses against deep fakes, audio cloning and other AI concerns.
Texas became the latest to consider guardrails for the burgeoning technology at a hearing last month, where members of the Texas Senate Business and Commerce Committee called on the Motion Picture Association (MPA), along with a slate of other stakeholders from several industries, to explore issues with AI and propose solutions. The lawmakers are considering protections for the radio, TV, music and film industries, as well as the general public, against the unauthorized use of AI.
Last month, Illinois passed two laws regulating AI, and the California Legislature sent AI legislation to Gov. Gavin Newsom’s desk for final approval.
At a hearing in Texas on Aug. 27, representatives of the MPA and the Recording Industry Association of America (RIAA) both voiced support for the federal “NO FAKES Act,” a bipartisan bill also backed by SAG-AFTRA that was formally introduced in the U.S. Senate in July but remains in committee.
“Congress has a lot on its plate,” Jessica Richard, RIAA public policy SVP, told TheWrap. “But we are very hopeful that the Senate and the House will recognize the fast-moving nature of AI, the urgent need for guardrails and will pass the NO FAKES Act quickly.”
In the meantime, she added, RIAA is “grateful” that state legislators are moving to protect artists and others from deep fakes and voice clones. “It remains to be seen how Texas will take on and prioritize specific AI issues, but lawmakers educating themselves and taking stakeholder input is the right place to start,” she said.
NO FAKES, short for Nurture Originals, Foster Art, And Keep Entertainment Safe, would hold a person liable for making digital replicas of a performer without consent and hold platforms liable if they knowingly host unauthorized digital replicas.
The need for state action could be mitigated if Congress were to pass the NO FAKES bill, Ben Sheffner, MPA’s SVP and associate general counsel of Law & Policy, said at the Texas hearing. But Sheffner acknowledged that it’s tough to get such legislation through Congress this year. U.S. lawmakers, who returned this week from a long summer recess and face a major government funding deadline, have only about five weeks before the end of this legislative session.
Both Sheffner and Richard said that, should the Texas Legislature pursue its own legislation, the NO FAKES bill would be a good model.
Sheffner also warned that any AI legislation should take into account that the First Amendment “sharply limits” regulating the content of speech. He told the Texas lawmakers that they should steer clear of mirroring Tennessee’s Ensuring Likeness, Voice and Image Security (ELVIS) Act.
Because AI models are trained on vast amounts of data, some of it copyrighted, deep fakes and other AI issues also raise copyright concerns, an entertainment attorney who is closely following federal and state AI legislation told TheWrap.
The ELVIS Act, which became law in Tennessee earlier this year and was praised by RIAA, SAG-AFTRA and entertainment labor unions, targets the unauthorized use of AI in regard to a performer’s voice and likeness.
The ELVIS Act “was drafted in an over-broad manner, not limited to artificial intelligence or digital replicas, and does not include adequate protections for free expression,” Sheffner said.
RIAA’s Richard told the Texas committee that AI companies should be transparent to ensure the system works fairly for artists and copyright owners.
“Any artist or copyright owner who chooses to lean into AI should have the right to do so in a free market, giving them control over whether and on what terms their property — including their voice and likeness — can be used. If they prefer not to engage with AI, that should be their right too,” Richard said.
In August, the California Legislature passed two SAG-AFTRA-supported bills. Assembly Bill 1836 would require explicit consent before an AI-generated digital replica of a deceased performer could be used, and AB 2602 tightens consent requirements for using such digital replicas of living performers. At press time, the bills were still awaiting the governor’s approval.
Illinois also passed two bills — HB 4875, the Right of Publicity Act, and HB 4762, the Digital Voice and Likeness Protection Act — aimed at protecting artists and other individuals against the unauthorized use of AI-generated digital replicas.