Using ChatGPT for your start-up? Think twice.

Alfred

BC EIC Law Student

October 20, 2023

You have worked hard building your start-up. Now it’s time to think about the legal housekeeping. With the growing popularity of ChatGPT, start-up founders are increasingly turning to this powerful chatbot for immediate legal advice. However, doing so carries significant risks. This blog outlines some of the major risks start-up founders face when using ChatGPT. Think twice before making your next critical business decision with this powerful AI tool.

 

 

Liability for Damages Is Restricted: You may only receive a maximum of $100 in compensation for business losses resulting from information fabricated by ChatGPT

 

ChatGPT has mastered the confidence trick. This can be highly appealing to start-up founders looking to make split-second decisions.[1] At the early stage of your business, you may wonder:

 

•    Which business structure should I establish: an LLC, a C-Corp, an S-Corp, or a sole proprietorship?

•    How can I protect my company’s intellectual property? Should I choose trademark or copyright protection?

•    What elements should I incorporate into my contracts with my co-founders, employees, or suppliers?

•    What are the legal requirements for raising capital? How about hiring foreign employees?

 

Those are some of the common questions founders ask in the early stages of a business. While it’s true that ChatGPT provides neatly structured responses to basic legal queries, this AI bot is not a legal expert and cannot resolve specific legal issues for your business. What if your business could benefit from converting from an LLC to a C-Corp? What if your business idea could be protected under both copyright and patent law? ChatGPT does not currently have the capacity to engage in this kind of complex legal analysis.

 

What is more alarming is that ChatGPT frequently makes things up, a phenomenon known as “AI hallucination”.[2] For example, a New York lawyer was sanctioned earlier this year for citing six fake ChatGPT-generated cases in a legal brief.[3] The federal judge found that the lawyer acted in bad faith and made “acts of conscious avoidance and false and misleading statements to the court.” In fact, ChatGPT’s answers contain so many errors that the company foresees legal disputes and caps its liability for any damages at $100 in its “Terms of Use”.[4]

 

 

Confidentiality Concerns - you may just have leaked your next business idea to the public.

 

Start-ups often rely heavily on confidential information to gain an advantage in a competitive market. Founders sometimes use ChatGPT to review an undisclosed business plan, proofread a contract, or ask compliance questions about products still in development. However, by chatting with the AI bot, you may just have exposed important business secrets to the general public.

 

Take a glance at ChatGPT’s “Terms of Use”.[5] They do not protect the confidentiality of anything you enter in the chat box. On the contrary, OpenAI explicitly warns users against sharing sensitive information in their conversations.[6] Many law firms now limit lawyers’ use of ChatGPT, or ban it outright, out of concern that it could inadvertently disclose confidential client information. Business founders should likewise think twice before putting any confidential information in the chat box.

 

Employment Discrimination - we do not need more bias.

 

We all know that start-up founders face disparate treatment based on their gender, race, age, and educational background when seeking investment capital for their businesses. ChatGPT, a tool designed to mimic human behavior, unfortunately adopts the same biased thinking.

 

Nowadays, more and more start-up founders use ChatGPT to evaluate resumes or answer questions about candidates’ experience.[7] ChatGPT can help HR pick whoever it thinks is qualified for the position you are hiring for. However, ChatGPT is only as good as the data it draws from. Apparent biases already exist in talent acquisition across corporate America, and ChatGPT will only replicate these biases, if not amplify them.

 

It is thus not surprising that some jurisdictions have already restricted the use of ChatGPT in employment decisions. New York City issued its “final rule” implementing Local Law 144, which requires a bias audit when employers use AI software like ChatGPT to make employment decisions.[8]

 

Staying away from ChatGPT when making employment decisions can not only shield you from legal risk but also help you recruit talent that represents the rich diversity of our society.

 

 

Affecting Due Diligence Down the Road - Apologies, but there might not be any M&A for you

 

Lastly, blindly incorporating ChatGPT into core product development can create risky outcomes for a business down the road. For example, when a company pursues a future merger or acquisition, due diligence may include a review of whether it has disclosed business secrets or confidential information to AI chatbots such as ChatGPT.[9] That can raise a big red flag for potential buyers considering acquiring your business.

 

Revolutionary technology always comes with great risks, and ChatGPT is no exception. Businesses should not shy away from its benefits. However, start-up founders should use the tool with great discernment and, when necessary, consult legal professionals.

 

 

 


[1] https://www.theregister.com/2022/12/12/chatgpt_has_mastered_the_confidence/

[2] https://apnews.com/article/artificial-intelligence-hallucination-chatbots-chatgpt-falsehoods-ac4672c5b06e6f91050aa46ee731bcf4

[3] https://www.reuters.com/legal/new-york-lawyers-sanctioned-using-fake-chatgpt-cases-legal-brief-2023-06-22/

[4] https://openai.com/policies/terms-of-use

[5] https://openai.com/policies/terms-of-use

[6] https://help.openai.com/en/articles/6783457-what-is-chatgpt. The FAQ states: “Please don’t share any sensitive information in your conversations.”

[7] https://www.shrm.org/resourcesandtools/hr-topics/behavioral-competencies/global-and-cultural-effectiveness/pages/how-chatgpt-could-discriminate-against-applicants.aspx

[8] https://legistar.council.nyc.gov/LegislationDetail.aspx?ID=4344524&GUID=B051915D-A9AC-451E-81F8-6596032FA3F9&Options=ID%7CText%7C&Search=

[9] https://www.cnn.com/2023/04/06/tech/chatgpt-ai-privacy-concerns/index.html