Camilla Alexandra Hrdy, ‘Keeping ChatGPT Secret While Selling It Too’

ABSTRACT
Generative artificial intelligence products such as ChatGPT raise novel issues for trade secret law. But one of the most important issues is an old one: How to sell an ‘information good’, like computer software, while also maintaining legal protection for the underlying content? When a company wishes to sell a new technology to the public, the normal recourse is to obtain a patent. Patents require public disclosure and end after twenty years. However, based on decades of precedents established for software, generative AI companies will be able to rely on trade secrecy and contract law instead – maintaining indefinite legal protection for their technology, even as they profit from making it widely available. This is what many companies have done with software, and this is what developers of ‘closed-source’ generative AI models like ChatGPT are doing today. ChatGPT’s terms of use prohibit reverse engineering, contain a noncompete clause, and, for businesses and developers who obtain an Enterprise License, require maintaining the confidentiality of information obtained through OpenAI. Case law involving closed-source software suggests that breaching some of these provisions could give rise to both contract law liability and trade secret law liability – triggering harsh injunctive and monetary remedies and potentially extending liability to third parties who never even agreed to these provisions. Some protection for information goods is essential. Otherwise, companies might not make them available to the wider public at all. But companies should not generally get the benefit of perpetual legal secrecy long after factual secrecy has ended. This article identifies a variety of doctrinal levers courts can use to reduce the term of protection for information goods – including but not limited to generative AI.

Hrdy, Camilla Alexandra, Keeping ChatGPT Secret While Selling It Too (February 2, 2024).