
Hollywood writers don't want to give generative AI any credit

Hot potato: The Writers Guild of America is discussing how to handle ChatGPT and other generative AI algorithms in scriptwriting. The organization appears willing to allow AI-assisted work, but only if the AI receives no credit or copyright.

As people begin to grapple with the plagiarism potential of AI algorithms, the union representing writers in film, television, radio and other media is weighing how to manage this new frontier in content creation. The WGA seems willing to accept AI as a legitimate tool in the scriptwriting process, but doesn't want its members to lose money over it.

According to three unnamed sources within the film industry, the WGA proposal does not explicitly ban the use of artificial intelligence in writers' work. Hollywood writers and screenwriters would be allowed to use generative AI, treating it simply as a "tool" with no implications for credit or monetary compensation.

The WGA is discussing the state of generative AI in its talks with the Alliance of Motion Picture and Television Producers (AMPTP) as the two organizations work on drafting a new work contract. The WGA later confirmed its proposal in a series of tweets about regulating “artificial intelligence-generated content.”

According to those tweets, such regulation should ensure that film and TV companies cannot use AI to "undermine writers' working standards" when it comes to compensation, residuals, separated rights and credits.

The WGA states that artificial intelligence cannot be counted as "source material" or "literary material" for any project covered by the MBA (Minimum Basic Agreement), as these are the two main definitions used to classify writers' work. Source material refers to original novels, plays, or even magazine articles on which a screenplay can be based. Literary material is the primary product of a writer's work, which in turn determines residuals and other compensation.

The WGA states that AI cannot be used as source material because AI software is not capable of creating anything on its own. ChatGPT and other machine learning algorithms are just statistical inference machines that generate “a regurgitation of what they’ve been fed,” the organization says.

AI is trained on both copyrighted and public domain content, and it has no intelligence or awareness with which to distinguish between the two. Therefore, AI output cannot be copyrighted, and AI software cannot sign a "certificate of authorship." In fact, the WGA concludes, plagiarism is an inherent feature of the AI process.



