This Week in eDiscovery: Maintaining Confidentiality with AI-Powered Discovery, Apple Sanctioned Over Deleted Data

Every week, the Array team reviews the latest news and analysis about the evolving field of eDiscovery to bring you the topics and trends you need to know. We’re back after the Fourth of July holiday with a post that covers June 24 through July 10. Here’s what’s happening.

The Potential Confidentiality Risks of Using Generative AI in eDiscovery

As in many other fields, artificial intelligence has become a valuable tool in the legal profession. More and more attorneys are experimenting with generative AI to conduct research, summarize information and perform other tasks.

When it comes to confidential documents obtained through eDiscovery, though, those experiments risk violating lawyers’ ethical obligations unless the documents are submitted to a secure large language model (LLM). A recent article from Owen Wolfe and Eddy Salcedo at Seyfarth Shaw does a great job explaining the potential conflicts.

As part of discovery, parties frequently agree to confidentiality stipulations when handling the other side’s sensitive documents. Or the judge might issue an order spelling out how those documents can be used. Typically, those rules limit who can see the information and how it can be used.

But adhering to confidentiality becomes tricky when feeding those documents to generative AI tools. Some of these tools retain any information that users load into them and then use that data later to produce better, more accurate answers for others. As a result, Wolfe and Salcedo write, attorneys run the risk that sensitive client information could appear in an answer shared with some other user – unless the AI tool specifically prevents submitted data from being used to further train the model.

Some AI-powered eDiscovery tools don’t use uploaded information for further training, and users can set how long the data is retained on the system for analysis. Before uploading data to a generative AI tool, lawyers must understand which version of the tool is being used – whether it’s public or a closed, enterprise-level model – and what level of data retention and data reuse the tool employs.

Additionally, there are a few ways to defend against breaches when producing documents in eDiscovery. Parties could request stipulations that bar the use of confidential documents with generative AI tools unless the security of those tools has been vetted, limit the use of the documents to AI-powered tools that are believed to be safe, or ban the use of that information with generative AI tools entirely.

(Make sure the stipulations specifically address AI. Most only say that confidential information can’t be shared with “any person or entity,” and AI isn’t either of those things, Salcedo and Wolfe note.)

Attorneys must take these risks seriously. As a recent legal ethics opinion from the Pennsylvania and Philadelphia bars reminds us, client confidentiality is still a core responsibility of the profession, no matter how far technology advances.

Apple Runs Into Rule 37 Sanctions Over Deleted Siri Data

Apple has been hit with a tough round of Rule 37 sanctions after it deleted electronically stored information (ESI) in a case from California.

The plaintiffs in Lopez v. Apple, Inc. allege the tech giant improperly stored and used recordings from consumers’ devices, according to eDiscovery Assistant’s latest “Case of the Week.” Those recordings were reportedly from users who didn’t intend to activate Siri – they accidentally triggered the voice assistant.

Apple had asked the court for a protective order, saying it needed relief from the burden of retaining every Siri recording and all associated data. At the same time, Apple was negotiating with the plaintiffs, trying to see if they would instead accept a sampling plan that provided more than 100 hours of Siri recordings and hundreds of thousands of Siri requests.

However, Apple had neglected to suspend its protocol of auto-deleting certain Siri data when litigation started. As a result, the plaintiffs lost access to a huge trove of data that couldn’t be replaced. Though the court didn’t find intent on Apple’s part, that didn’t stop it from laying down a series of sanctions that, in part, preclude Apple from arguing:

  • That plaintiffs lack standing to sue because they have no proof that they were subject to false triggers;
  • Against class-wide damages based on the number of false recordings;
  • Against Apple’s intent based on the number of, and instances of repeated, false recordings.

Kelly Twigger highlights two important takeaways from the situation:

  • When you have a case that heavily depends on data from your opponents, take “proactive steps” to make sure it’s preserved, and get the court involved if necessary.
  • Talk to your clients about their retention policies and how to respond when a duty to preserve arises.

Other recent eDiscovery news and headlines:


Julia Helmer; Director, Client Solutions

With a decade of expertise, Julia excels at optimizing enterprise eDiscovery workflows from start to finish. With a deep understanding of how to seamlessly integrate workflows across various eDiscovery platforms, Julia creates tailored solutions for data identification, legal holds, ESI collections, and productions. By harnessing the power of Technology Assisted Review and Analytics, she delivers efficient, cost-effective results that align with best practices and budgetary constraints. Julia’s exceptional communication and customer service skills have fostered strong, lasting relationships with both clients and Project Management teams, enabling her to effectively problem-solve and drive success across numerous projects.
