ChatGPT: here’s what you need to know about the OpenAI contract

ChatGPT remains the driving force behind the generative artificial intelligence trend, the technology that creates content from prompts with increasingly capable results. After an auspicious debut of 100 million users in its first month of availability, the system has gained traction in the content creation segment, and that could become one of the most problematic aspects for users in the future, largely because few of them read the terms of use they agree to when signing up.

A recent report by Lance Eliot for Forbes highlights some little-noticed details in the terms of use that every user accepts when signing up for OpenAI's services. The author, a specialist in AI ethics and law, recommends that ChatGPT users take steps to minimize these risks, such as taking out liability insurance.

The article also proposes the need for greater transparency and accountability in the AI industry to protect both users and developers of technologies in this field.

ChatGPT and “indemnification” for harm to third parties

Eliot highlights an important element concerning “third-party damages”: cases in which a user’s use of OpenAI software could harm a company in various ways, including copyright infringement, misuse of original assets, and impersonation. On this point, the Forbes contributor cites this section of the contract:

“Article 7. Indemnification; Disclaimer of Warranties; Limitations of Liability: (a) Indemnification. You will defend, indemnify and hold us, our affiliates and our staff harmless from and against any claims, losses and expenses (including attorneys’ fees) arising out of or related to your use of the Services, including your Content, products or services you develop or offer in connection with the Services, and your breach of these Terms or violation of applicable law.”

In simple terms: if OpenAI is sued over a user's misuse of its products, it is the user who bears legal responsibility. That includes covering OpenAI's legal representation costs. Yes, you would have to pay for OpenAI's defense in court and cover its attorneys' fees.

ChatGPT: you own what it tells you

On intellectual property specifically, the article turns to the content provisions and focuses on Section 3 of the contract:

“Section 3. Content: (a) Your Content. You may provide information to the Service (‘Input’), and receive output generated and returned by the Service based on the Input (‘Output’). Input and Output are collectively referred to as ‘Content’. As between the parties and to the extent permitted by applicable law, you own all ‘Input’. Subject to your compliance with these Terms, OpenAI assigns to you all of its right, title and interest in and to the ‘Output’. This means that you may use the Content for any purpose, including commercial purposes such as sale or publication, if you comply with these Terms. OpenAI may use the Content to provide and maintain the Services, comply with applicable law and enforce our policies. You are responsible for the Content, including ensuring that it does not violate any applicable law or these Terms.”

In short, you own what ChatGPT “tells you,” although the company may use that Content, including your “Input” (your requests to the system), to maintain the service, comply with applicable law, and enforce its policies. It remains the user's responsibility, however, to ensure that their content does not infringe copyright.

This is not new

It is clear that all web services add protection clauses to head off future complications. For Eliot, this is very common in the technology market: “This is, shall we say, commonplace. You probably did not realize it was so. Nor is it likely that you have ever actually found yourself subject to the indemnity clause.”

What is interesting is that the rise of this technology could lead, in the not-so-distant future, to a wave of cases involving intellectual property defense or accusations of rights violations. Several companies that develop image-generating AI have already been criticized by artists who have spotted parts of their own work in the services' output.

It is important to note that generative AI is a tool for optimizing work, but it draws on a variety of sources to fulfill each request. In some cases, generated content or programming code may borrow from pre-existing works, and it is each user's responsibility to check for possible matches, especially in a field as prolific as audiovisual art.

Daniel Chapman