
VA Answer Service for Law Firms

AI answering services can expose your firm to confidentiality breaches and ethical risks. Learn the questions every attorney must ask before using legal tech vendors.

 

AI answering services for law firms are exploding in popularity. They promise to save time, slash intake costs, and project a modern image. But for many attorneys, there is a dangerous blind spot: most lawyers never check whether these vendors actually protect client confidentiality.

If a vendor mishandles a prospect’s or client’s sensitive information, you—not the software company—face the consequences.

The Risks at a Glance:

  • State Bar investigations

  • Malpractice exposure

  • Loss of client trust

  • License-threatening ethical complaints

This Is Not Hypothetical. It Is Already Happening.

If you think “big tech” companies automatically protect your data, look at the recent headlines. The legal risks are real, and they are already in court.

Exhibit A: The Otter.ai Class Action

In August 2025, a class-action lawsuit (Brewer v. Otter.ai) was filed alleging that the popular AI transcription service was “secretly recording” private conversations and using that data to train its proprietary AI models.

  • The Risk for Law Firms: If you used a tool like this for a client deposition or intake call, that client’s private facts could theoretically be fed into the vendor’s model training—exposing privileged information to a third-party corporation.

Exhibit B: The Zoom “Terms of Service” Panic

In 2023, the legal community revolted when Zoom quietly updated its Terms of Service. The new language appeared to grant Zoom a perpetual license to use customer video and audio to train its AI.

  • The Lesson: Tech vendors will often default to grabbing your data unless they are caught.

Real-World Case Study: The Fine Print of Two Popular Vendors

To show you exactly what to look for, we audited the public policies of two popular services: Answering Legal and LEX Reception.

Note: This is based on publicly available information as of late 2025. Neither company publicly displays a SOC 2 Type II security audit on their website.

 

Vendor 1: LEX Reception

The Good: They explicitly state they do not use AI to answer phones (“Every one of your calls is answered by a live receptionist”). 

The Red Flag: Offshoring & Data Residency. While they market “24/7 support,” their own FAQ states that receptionists are “chosen based on their quality, not location.” Furthermore, industry directories link their operations to major outsourcing hubs in the Philippines (Manila).

  • The Ethics Trap: If your answering service uses overseas staff, your client data is leaving U.S. jurisdiction. That creates a serious compliance headache under strict data privacy rules—and makes your duty to supervise those non-lawyers far harder to meet.

Vendor 2: Answering Legal

The Good: They market themselves as having 100% U.S.-based staff, which solves the data residency issue. 

The Red Flag: AI Training Ambiguity. They recently launched a “Free AI Intake Chatbot” that is “trained to answer questions about your firm.” While this sounds helpful, their privacy policy allows data usage for “internal business purposes” and to “improve our Services.”

  • The Ethics Trap: Does “improving services” mean using your client chat logs to train their AI model for other law firms? Without a specific “No Training” clause in your contract, you cannot be sure.

Why This Matters (Even If You “Don’t Do Tech”)

You don’t need to understand code to understand this risk. Think of it this way:

If you hired a human receptionist and gave them full access to every client conversation, you would never allow them to:

  • Store case files at their personal home.

  • Share notes with a friend in another country.

  • Keep copies of confidential transcripts for 30 days after they quit.

Yet, many AI intake vendors do exactly this with your call transcripts, and you never see it happening.

The Real Problem: Developers vs. Duty

Many legal-tech startups are built by engineers who understand software but have zero understanding of attorney ethics.

When a vendor says:

“We only keep your client call recordings for 30 days for quality control.”

Here is what that actually means in plain English:

  • Access: Someone at the tech company can read or listen to your privileged conversations.

  • Identity: That person is likely not a lawyer and may not be background-checked.

  • Location: That person may be located outside of U.S. jurisdiction.

  • Privilege: They are likely not bound by attorney-client privilege.

This is not a “technical bug.” It is a professional responsibility issue.

Your Ethical Duties Do Not Vanish with AI

According to ABA Model Rules (specifically Rule 1.6 on Confidentiality and Rule 5.3 on Supervision of Non-Lawyer Assistance), you are required to:

  1. Protect confidential client information.

  2. Understand the risks of the relevant technology you use.

  3. Supervise the non-lawyers (including vendors) you delegate work to.

If you adopt an AI tool without verifying how it secures data, you could be violating all three.

The Solution: A Vendor-Vetting Checklist

There are secure ways to use AI. You simply need a rigorous vetting process. Use this list before signing with any AI answering service, legal intake bot, CRM, or outsourced receptionist.

If the vendor cannot answer these clearly, walk away.

Top 10 Security Questions for Law Firms

[Image: Attorney protecting client data, holding a confidential folder]

A Simple 4-Step Protection Process

Step 1 – Ask the questions above. Do not accept verbal assurances like “Don’t worry, it’s safe.” Get written responses via email.

Step 2 – Require documentation. Ask for their Data Retention Schedule and Incident Response Plan. If they don’t have these, they are not ready for law firm clients.

Step 3 – Compare vendors on security, not just price. You do not need the cheapest provider. You need the one that protects your license.

Step 4 – Review annually. Tech changes fast. Review your vendors once a year to ensure their policies haven’t changed.

[Image: Woman answering a call]

AI answering services can improve your firm’s responsiveness and client satisfaction—but only if you treat them like an extension of your staff, not a toy.

Attorneys should not be afraid of AI. They should be afraid of using AI without understanding it.

The firms that build a proper vetting process now will:

✔ Reduce malpractice risk

✔ Protect their online reputation

✔ Gain the speed of automation safely

The firms that don’t… will eventually become someone’s cautionary tale.
