
"This call is being recorded" & AI consumer lawsuits

This formerly trusty disclaimer may not protect your company from the next wave of AI consumer lawsuits.



Consumer and class action lawsuits have arrived against consumer-facing companies and the AI vendors they use to record, transcribe, and analyze consumer phone calls. You may assume that your business is safe because you provide the well-worn disclaimer “this call may be recorded for training and quality assurance purposes.” To understand why this formerly trusty disclaimer may not be enough in the age of AI, you need to know about something that dates back to the 1800s: wiretapping laws.


Wiretapping laws in the United States


The phrase “wiretapping” perhaps conjures up images of spy movies or reminds you of the fallout after the Patriot Act was enacted. But the legal definition of wiretapping is quite mundane and surprisingly easy to violate without realizing it. Among other things, wiretapping simply entails the intentional recording of a private conversation without informing the participants.


Whether any given recording is illegal depends on the state, which means companies must consider 50 different sets of laws. When it comes to phone calls, over one-fifth of states have criminal and/or civil penalties for recording phone calls unless all participants on the call give consent. Some states take it a step further and also penalize anyone who aids someone else in making the unauthorized recording, which would implicate both AI vendors and their client companies. 


“This call is being recorded”


In light of state wiretapping laws, shouldn’t it be enough to say “this call is being recorded” or something to that effect? Those days may be coming to an end. In Galanter v. Cresta Intelligence (N.D. Cal. June 13, 2025), plaintiffs’ lawyers argued that United Airlines’ standard call recording disclaimer did not adequately alert consumers to the fact that there is a third party on the call, namely the AI vendor. They also highlighted the reality that the AI vendor uses consumer call data for its own purposes, such as marketing and software development.


Solutions


So, how can companies minimize the risk of a consumer or class action lawsuit due to the use of AI conversation intelligence tools?  


Be upfront


The old “this call is being recorded” disclaimer may not be specific enough now that AI tools do far more than a tape recorder ever could. A more thorough disclaimer might specifically call out the presence and purpose of the AI tool. To decrease risk even further, a company could seek active consent from the consumer before proceeding with the call.


Be educated on how AI vendors are using consumers’ data


The default policy of your AI conversation intelligence vendor may be to use consumer data for its own internal purposes. This is a situation where ignorance is not bliss: companies should be fully informed about how AI vendors are using consumer data (and employee data too: hello, potential employment lawsuit!). Companies that don’t want to absorb the inherent risk of AI vendors’ use of consumer data can consider negotiating different deal terms or selecting a different provider.


Update company privacy policies


At the rate technology is changing, company privacy policies should be living documents, consistently updated to accurately reflect and disclose current privacy practices. If your privacy policy has not been reviewed since your company’s adoption of AI tools and/or the hiring of new AI vendors, it’s likely time to find a good privacy lawyer to do a much-needed revamp.




Thanks for reading the Bevel Law Blog! While this information is hopefully helpful to you, nothing in this blog is intended to be legal advice. Always consult a lawyer before making any legal decisions based on topics in this blog.