The AI deal with Hollywood writers surely creates a lot of confidence that the studios will do the right thing

The Impact of Artificial Intelligence on the Entertainment Industry

The entertainment industry has undergone significant transformations over the years, with advancements in technology playing a crucial role. However, the rapid advancement of highly sophisticated and ubiquitous machine learning tools, commonly grouped under the label of artificial intelligence (AI), has created a new set of challenges for workers in the industry. This article explores the impact of AI on unions and negotiations in the entertainment industry and examines the need for stronger protections against the exploitation of artistic work by AI.

The Evolution of the Entertainment Industry

The entertainment industry has been an integral part of society for decades, bringing joy and inspiration to millions of people worldwide. From acting to writing and directing, artists have dedicated their lives to creating memorable experiences for audiences. However, the industry has also faced its fair share of labor crises and strikes throughout history.

The current work stoppage, which started last spring, has been a unique experience for the three major unions in the entertainment industry: the Screen Actors Guild (SAG), the Writers Guild of America (WGA), and the Directors Guild of America (DGA). The contracts of all three unions came up for renegotiation at the same time, and the Alliance of Motion Picture and Television Producers (AMPTP) rejected their proposed terms. This rejection set the stage for a major flashpoint in this year’s strikes, as AI technology became the focus of negotiations.

The Devaluation of Workers

One of the main points of tension for labor in the entertainment industry is the devaluation of workers, which has reached a boiling point with the rapid advancement of highly sophisticated and ubiquitous machine learning tools. AI has presented new challenges for artists, as their images, voices, and creative work can now be replicated by machine learning systems, leading to job insecurity and the potential for exploitation. This devaluation affects actors, writers, directors, and other members of the industry.

Actors have already experienced instances where their images or voices have been stolen outright by AI systems. Writers have seen their work plagiarized by AI language models like ChatGPT, while directors have seen their styles stripped down and replicated by AI tools such as MidJourney. Every area of the crew is now ripe for exploitation by the studios and big tech companies.

The Importance of Stronger Protections

In response to the growing concerns surrounding the exploitation of artistic work by AI, the DGA and WGA have reached agreements with the AMPTP that include provisions to protect their members’ work. The DGA contract insists that AI is not a person and cannot replace functions performed by members. The WGA’s language is more detailed, stating that AI cannot write or rewrite literary material and that AI-generated material will not be considered source material. The WGA contract also requires studios to disclose whether material given to writers has been generated by AI.

While these agreements are a step in the right direction, they may not provide broad enough protections for artists, given the significant investments studios have already made in AI technology. Studios are actively developing creative and administrative uses for machine learning tools, raising questions about their willingness to dismantle their AI initiatives. Additionally, proving attribution and addressing biases inherent in AI systems present further challenges.

Expanding Protections for Artists

To address these concerns, it is crucial for the entertainment industry, particularly the Screen Actors Guild (SAG), to negotiate even more specific and protective language in their ongoing negotiations with the AMPTP. The SAG has an opportunity to set new standards and shape the future of the industry by creating an agreement that acknowledges the inevitability of AI’s use by studios and reflects the mutual needs of both parties. Such an agreement should address issues like data collection, attribution, and the mitigation of biases in AI systems.

Furthermore, it is essential for everyone involved in the entertainment industry, including artists and unions, to familiarize themselves with how AI technologies work. Gaining a deeper understanding of the capabilities and limitations of AI will enable them to navigate the rapidly evolving landscape and make informed decisions. Embracing the potential benefits of AI while mitigating potential harm is a delicate balance, but it is achievable through collaboration and education.

The Way Forward: Learning from the Past

As the negotiations between unions and the AMPTP continue, it is crucial to learn from history and avoid repeating the mistakes of the past. The term “Luddite” is often misused to describe individuals who reject technology entirely. However, the true Luddites were not anti-technology but pro-union, fighting against the exploitation and devaluation of their work. Today, it is vital for unions, studios, and big tech companies to find common ground and work towards a mutually beneficial agreement.

In conclusion, the impact of artificial intelligence on the entertainment industry cannot be overstated. As technology continues to advance, it is essential for unions and industry stakeholders to prioritize the protection of artists’ work from exploitation by machine learning tools. Stronger agreements and collaborative efforts are needed to ensure that AI is used responsibly and ethically, enriching the industry without compromising the rights and livelihoods of artists.



I have been in the entertainment industry since I was nine years old. I joined the Screen Actors Guild (SAG) when I was 11 in 1977, the Writers Guild of America (WGA) when I was 22, and the Directors Guild of America (DGA) the following year. I started out as a child actor on Broadway, studied film at New York University, and then acted in films like The Lost Boys and the Bill and Ted franchise while writing and directing my own narrative work. I have experienced several labor crises and strikes, but none like our current work stoppage, which started last spring when the contracts of the three unions came up for renegotiation at the same time and the Alliance of Motion Picture and Television Producers (AMPTP) rejected their terms.

The unifying point of tension for labor is the devaluation of the worker, which has reached a boiling point with the rapid advancement of highly sophisticated and ubiquitous machine learning tools. Actors have been replaced by AI replicas of their images, or have had their voices stolen outright. Writers have had their work plagiarized by ChatGPT, directors’ styles have been stripped down and replicated by MidJourney, and every area of the crew is ripe for exploitation by studios and big tech. All of this set the stage for AI issues to become a major flashpoint in this year’s strikes. Last summer, the DGA reached an agreement with the AMPTP, and on Tuesday the WGA reached its own important agreement. Both include terms that the unions hope will meaningfully protect their work from exploitation by machine learning technology. But these deals, while a real start, seem unlikely to offer broad enough protections for artists, given how much the studios have already invested in the technology.

The DGA contract insists that AI is not a person and cannot replace functions performed by members. The WGA’s language, though more detailed, is fundamentally similar, stating that “AI cannot write or rewrite literary material, and AI-generated material will not be considered source material,” and requiring that studios “must disclose to the writer whether any material delivered to the writer has been generated by AI or incorporates material generated by AI.” Their contract also adds that the union “reserves the right to assert that exploitation of writers’ material to train AI is prohibited.”

But the studios are already busy developing countless uses for machine learning tools, both creative and administrative. Will they halt that development? Knowing that their own copyrighted product is at risk from machine learning tools they don’t control, and that the Big Tech monopolies could devour the entire film and television industry, will they rein in their AI efforts? Can the government be expected to rein in the big tech companies when those companies know that China and other global players will keep advancing these technologies? All of which brings up the question of proof.

It’s hard to imagine studios telling artists the truth when asked to dismantle their AI initiatives, and attribution is nearly impossible to prove with machine learning output. Likewise, it’s hard to see how to prevent these tools from learning from whatever data the studios want. It is already standard practice for corporations to act first and apologize later, and it should be assumed that they will continue to collect and ingest all the data they can access, which is all the data. The studios will grant some protections to higher earners. But those artists are predominantly white and male, a fraction of the unions’ membership. There will be little to no protection for women, people of color, LGBTQIA+ people, and other marginalized groups, as in all areas of the workforce. It is not my intention to begrudge the work of the DGA and WGA in crafting terms that may not adequately capture the scope of the technology. But we can go further, and SAG has the opportunity to do so in its ongoing negotiations.

The SAG is still on strike and plans to meet with the AMPTP next Monday. At that meeting, I hope the union can raise the bar another level with even more specific and protective language.

It would be nice to see terminology that accepts that AI will be used by the studios, regardless of the terms imposed on them. This agreement should also reflect the understanding that studios are as threatened by Big Tech’s voracious appetites as artists are, and that the unions and the AMPTP are sitting on opposite ends of the same life raft. To that end, contractual language that recognizes mutual needs will serve everyone’s interest, with agreements between AI users and those affected by its use on all sides of our industry. It would also be helpful to see language that addresses how AI’s inherent biases, which reflect society’s inherent biases, could be a problem. We must all make a pact to use these technologies with those realities and concerns in mind.

Mostly, I hope everyone involved takes the time to learn how these technologies work, what they can and can’t do, and engages with an industrial revolution that, like anything created by humans, can provide enormous benefits and also enormous harm. The term Luddite is often incorrectly used to describe an exhausted and bitter population that wants technology to go away. But the real Luddites were deeply engaged with technology and were adept at using it in their work in the textile industry. They were not an anti-technology movement but a pro-union movement, fighting to prevent the exploitation and devaluation of their work by rapacious corporate overlords. If you want to know how to solve the problems we face due to artificial intelligence and other technologies, get genuinely and deeply involved. Become a Luddite.


WIRED Opinion publishes articles from external contributors representing a wide range of points of view. Read more opinions here. Submit an opinion piece at ideas@wired.com.
