Over the last 12 months, the proliferation of emerging, largely unregulated generative AI technologies has resulted in class action lawsuits, strikes, and congressional hearings over creatives’ concerns about job loss and copyright infringement. In an apparent step in the right direction, the White House said last week that it had clinched immediate “voluntary commitments” from seven companies — Amazon, Anthropic, Google, Inflection, Meta, Microsoft, and OpenAI, some of the largest companies leading AI’s advancement — regarding the “safe, secure, and transparent” development of the technology.
The meeting between US President Joe Biden and the companies’ executives addressed issues including cybersecurity and biosecurity risks, misuse prevention, safe testing, privacy protections, and public transparency. The commitments are intended to be upheld until laws addressing these same matters emerge, according to the administration’s press notice.
But the efficacy of these “voluntary” deals with leading AI companies is unclear, as Google, Meta, and OpenAI are each already embroiled in lawsuits over alleged copyright infringement and misuse of user information — and experts in the fields of art and technology are skeptical that they will achieve much.
“The ‘voluntary’ nature of these commitments renders them meaningless,” University of Chicago professor Ben Zhao told Hyperallergic, noting that “while the Biden administration has good intentions, they seem to be oblivious to the real risks at stake.” As a computer science educator, Zhao served as the faculty lead for the research project “Glaze,” a system designed to shield artists from AI-style imitation. The technology, currently available to download for free, uses stylized masks that apply barely noticeable alterations to artworks in order to misdirect generative models that try to steal an artist’s personal aesthetic.
“These are incredibly strong yet poorly defined goals that have been set forth, and many of these commitments involve technical problems that lack solutions or may be completely insolvable,” Zhao said, pointing to the example of “watermarking” content.
“There are no robust solutions for watermarking generative content, either text or images, known today,” he explained. “How hard will these AI companies work at ‘voluntarily’ building these difficult systems? What we need is real regulation with well-defined, transparent goals that are backed up with plans for testing, enforcement, and if necessary, penalties. The assumption that big tech will do the ‘right’ thing despite the obvious financial disincentives is naive.”
The Concept Art Association, an organization that supports concept artists and their work, also told Hyperallergic that because creators “are the true creative core at the heart of generative AI,” they must be allowed a say in the legislation around it.
“So far, the White House has been meeting with leaders of some of the top AI companies on the responsibility of developing safe and trustworthy generative artificial intelligence (genAI), but there is still one very important component on the subject around genAI that has been left out of the conversation entirely — the artists and creators whose intellectual property (IP) props up this entire new industry,” Deana Igelsrud, a spokesperson for the group, told Hyperallergic.
“If President Biden and Vice President Harris want to have as thorough of a perspective on this subject as possible when crafting these monumentally important policies, the creatives whose work product fuels this rapidly advancing technology are a very important component of the process that should not be forgotten,” Igelsrud concluded.
In last week’s announcement, the administration reaffirmed its commitment to assemble an executive order and pursue legislation that will protect the public in the era of AI, citing a larger governmental commitment to confront unforeseen risks posed by generative AI tools. In October 2022, the White House Office of Science and Technology Policy (OSTP) published a Blueprint for an AI Bill of Rights, which outlined voluntary guidelines that prioritize civil protections against unanticipated threats from developing AI software. Earlier this spring, Harris met with the top executives of OpenAI, Anthropic, Microsoft, and Alphabet to further discuss the importance of responsible technological advancement in AI.
The news also comes after a second round of congressional hearings on AI policy earlier this month. As Congress considers a route for AI legislation, the Senate Judiciary Subcommittee on Intellectual Property heard testimonies on July 12 from Universal Music Group executive Jeffrey Harleston, San Francisco-based illustrator Karla Ortiz, and Emory University School of Law professor Matthew Sag, in addition to representatives speaking on behalf of Adobe and Stability AI.
“‘AI’ stands for ‘artificial intelligence.’ But that’s a misleading term, because, in fact, these so-called artificial intelligence systems depend entirely on vast quantities of copyrighted work made by human creators like me,” Ortiz said during her testimony, decrying the “AI companies that use our work as training data and raw materials for their AI models without consent, credit, or compensation.”
Speaking about her experience of discovering that her own and others’ copyrighted work had allegedly been used without their knowledge to train AI software, Ortiz urged Congress to take action. The artist recommended that legislators amend the Copyright Act to reinforce the distinction of human authorship over machine-made work, as well as develop regulatory policy that prioritizes the rights of creators over AI.
“My livelihood is threatened as a result of the uninhibited growth of Generative AI. And I am not alone,” Ortiz said. “Indeed, I and artists like me may only be the first wave of Americans who will have their livelihoods erased by the onset of Generative AI. But tomorrow it could be any number of Americans in a multitude of other professions who may be replaced.”