BIA and our partner, ACEDS, hosted a spirited webinar, Document Review: The EDRM’s Final Frontier, a discussion on managed document review and why successful document reviews rely on the other steps along the EDRM path.
Our speakers, Emily Cobb, a senior attorney at Ropes & Gray, along with Barry Schwartz and Lisa Prowse, who are BIA’s resident experts in all things review, talked about how to approach, plan for and execute a successful legal document review.
While document review might seem like a stand-alone process, the most successful reviews rely on the previous steps in the eDiscovery process, from legal holds to collection, to processing.
In the Document Review: The EDRM’s Final Frontier webinar, you’ll learn about:
- Best practices for document review
- Past challenges with document review
- eDiscovery roadmap preparation
- Data and production protocols
- Defining and working with search terms
- Predictive coding usage (including TAR, CAL, AI, etc.)
- Choosing the right blend of technology and people for each unique review
The team shared ideas on best practices and how everything we do from legal holds to collection to processing ultimately impacts review. This webinar is for seasoned pros, corporate in-house counsel, vendors looking to compare best practices and those of us in the trenches who make it all happen.
- Barry Schwartz, Esq., CEDS
- Lisa Prowse, Esq., CEDS
- Emily Cobb, Esq.
Watch the Webinar
Mary: Hello. My name is Mary Mack. I’m the executive director of ACEDS, the Association of Certified eDiscovery Specialists. And we are pleased to have on our ACEDS webinar channel, “Document Review: The EDRM’s Final Frontier.” It’s hosted and sponsored by our wonderful affiliate BIA. And we have an all-star panel here to discuss this. It’s a CEDS powered panel, which is just wonderful.
We’ve got Barry Schwartz, who is not only an attorney but also a CEDS, and he boasts 35 years of legal and business consulting management experience, which is just amazing. The things that you’ve seen, Barry, I am telling you. He oversees BIA’s advisory division and provides insight into all sorts of different areas spanning the EDRM, including information management, retention, regulatory compliance, and all-important IT security. And so he has got litigation experience, business experience, and eDiscovery experience, and we’re absolutely thrilled to have him. Thanks for being on, Barry. And we also have Lisa Prowse. She is also an attorney and a CEDS with 20 years of experience in document review. She oversees BIA’s attorney review team, including doing the hard work of creating databases and protocols, doing some of the training, technology assisted review, the all-important quality controls, and also security, which is just wonderful to see the emphasis on security. She’s got experience with multiple review platforms, including Catalyst, Relativity, and Reveal. She also has litigation experience as well. And from Ropes and Gray we’ve got Emily Cobb; I guess we want to get moving on this rather quickly here. So Emily Cobb is an attorney with Ropes and Gray, and she has wonderful expertise in creating and monitoring best practices and strategies across the entirety of eDiscovery, with an emphasis on metrics, workflows, and new technologies. She’s also cross-border and global and has forensic, cost-shifting, and all sorts of other deep experience in the EDRM model. And moderating is one of our newly minted rockstar CEDS, Johnny Olmedo. And I’m going to hand the presentation over to him in just a moment.
Just a little housekeeping, down at the bottom of your screen, you will see a Q&A button, and the Q&A button will allow you to ask questions both substantive and technical if you’re having some issues with the presentation. But more importantly, if you have questions for our presenters, if you’ll type them in there, we’ll get those answered for you. And Johnny, you are going to be doing the moderating and the question wrangling, so I will turn this over to you with our thanks for putting on a wonderful webinar.
Johnny: Thank you, Mary. We really appreciate it. Before we get started, I want to thank everyone for joining. This is our first ACEDS webinar, so we’re super excited to tackle this topic: document review, the EDRM’s final frontier. As all of us know, that’s all the way at the right end of the EDRM, and one of the last steps that we take before production and presentation. It’s typically one of the most expensive, and one of the most challenging, areas of eDiscovery. So you have three experts today who are really going to give you an idea of the best practices for document review, some of the challenges, some of the things we’ve done in the past with document review, and the new frontier. So we’re excited to hear from them today, and I will pass it along.
Emily: Thanks, Johnny. Hi everybody, I’m Emily Cobb from Ropes and Gray. And I think when a lot of us hear about document review, we think like either the guy on the left or the Road Runner on the right there, and we just want to get away from it as fast as possible or get through it as quickly as we can. And so I think this is a great topic for a webinar, because we wanted to take a minute and really talk about document review from a different kind of perspective: go through the managed review process and some of the newer tools and newer workflows that are available, and compare that against some of the older ways of document review that really were very difficult and more of an endurance test than anything else. But we also want to stay mindful of the fact that document review is a critical stage in a litigation or an investigation. It’s the process by which we as attorneys go about complying with the Federal Rules of Civil Procedure, or the rules of a requesting body, and it’s also an investigative process. We use documents to gain access to those nuggets of gold, those documents that are going to win our case for us, or that are going to position us to be in the best possible settlement position. You might not be in the settlement position you want, but at least you’ll have the best access to the information that you need. And it’s also a point in a litigation or an investigation where you can save your client costs. For those of us that are outside counsel on the call, and for those of you that are in-house counsel, it’s also an opportunity to keep an eye on what your outside counsel are doing and make sure that you are reining in costs, and to avoid increasing them potentially exponentially if the document review is mismanaged. And it’s also an opportunity to use some really cool stuff. At Ropes and Gray, we love technology, and I know a lot of the people here in the room with me and on the call do as well.
And it’s very exciting to see what’s going on in the world of technology, some of the new things that are available to us in terms of analytics, technology assisted review, even some of the reporting that is available in some of the new tools. It really positions us well to gather metrics, inform Rule 26 arguments, and better represent our clients at meet and confers.
Just a quick word about the good old days, which weren’t really all that good, as indicated by our air quotes. A lot of us have been doing this for a while and remember the time when document review wasn’t such a good deal for our clients in terms of higher costs. It was very difficult to stage information based on what we were looking for, there was a higher risk of privilege waiver, and it was harder to use tools in any really sophisticated way to identify work product or privilege. And we ended up reviewing a lot more non-responsive documents than necessary. Overall, we had collections that were imprecise. They were over-inclusive in some ways, and they were under-inclusive in some ways, because we had a limited ability to search and to sample. And so we would end up with these document reviews where there were tons of tags, because that was the only way to really go about categorizing documents. And so now we’re glad that we’re not dealing with that anymore. Now we have a modern approach. So, Barry from BIA, do you want to talk a little bit about what managed review and the modern approach incorporate?
Barry: Certainly. In our opinion, the modern approach is managed review. And managed review includes document review, which is the heart of our presentation today, as well as project management, so that as we’re doing the reviews, we manage the collections and the workflow and everything appropriately, and we can be efficient, as Emily was saying, with respect to costs and time and so forth. And in the modern approach, we also use the technology tools that are now available to us, including technology assisted review, as well as other tools and reporting, as Emily mentioned, with respect to email threading, clustering, concept approaches, and the like. And one of the key aspects of the modern approach is the acquisition of institutional knowledge for clients and across custodians, as well as knowing the client’s document idiosyncrasies and the custodian idiosyncrasies as well. Having that institutional knowledge breeds a whole lot of efficiencies as the process continues. And as we move forward, one of the key aspects of the modern approach is relying upon a vendor to facilitate project management across the project, so that the client is fully informed and counsel is fully informed as to the state of collection, any issues involved with collecting data from various custodians and data repositories, processing that data, searching that data to help cull down the document sets, and so forth. The project manager is key, from the vendor’s perspective, in keeping the flow of information and the pace of a document review on course. And lastly, the PM also assists the team in identifying, or helping to produce, the identified documents and working with counsel so that the right documents are produced in the right order at the right time.
Emily: And you know, before I talk about the RFPs, I think you made an excellent point, Barry, that I just wanted to pull out and talk about a little bit more in terms of institutional knowledge. Whereas before, when we were doing bare-bones document review, we were coding documents, producing them, and that was sort of it. Now, because of the technology that is available to us, we capture a lot of metrics that really allow us to learn about our clients, about their data sources, and about their custodians. In many cases, you can use that for your present matter. But what we found is we can really take a moment when we reach the end stage of a matter or an investigation, look at some of those metrics, and say okay, here’s a custodian that has a ton of HIPAA information in their emails. That might be an internal compliance issue, but it may also matter for future litigation. When we have to articulate a Rule 26 argument, we can say, okay, we actually know that this is a client that’s burdensome to review, because they improperly commingled all this HIPAA information in their emails, and we’re going to have to go through and redact it, and then there’s always the risk that we miss a redaction and that protected information gets out the door. So that’s one of the things that I find the most useful about the modern approach. Using a lot of these tools that are available gives us insight into the data on the client side that we really didn’t have a window into before. And although this isn’t a CLE to talk about Rule 26 and Rule 34 in depth, that’s the information that you need to give to a court. You can’t just regurgitate the Rule 26 proportionality factors without saying how it is disproportionate, why it is burdensome, why we think that there’s not a lot of information in that data source. If you can pull metrics from prior reviews that show, you know, we found something like 2% responsive documents, and these matters are really similar.
That’s something that’s going to help you articulate a strong position.
Lisa: Real quick on that: personally, I see that as the biggest change from older review to new review. While the review may have gotten cheaper, beyond all of that, the amount of analysis that you can get from the review now is just tremendous. Before, it was just a chore: collect the paper documents, look at them, produce them, and you learned nothing from it. Now you actually learn about your clients and about the case, and it helps tremendously with the cases themselves.
Emily: Yeah, that’s a great point. And sort of in tandem with this evolution of discovery, we’ve seen two iterations of rule amendments, in 2006 and the more recent 2015. RFPs and responses to requests for production should never have been boilerplate, and should never have been as broad as they were. Mancia v. Mayflower, out of the District of Maryland, is a great case to read that kind of goes through this; that was always the rule, and that case came down well before the rules were amended. But in tandem with this evolution of discovery tools and discovery platforms, the rules have been amended to at least codify and make clear for everyone that discovery should be proportional, and it should be precise. And so it requires the parties to really think critically about things like date ranges, and search terms, and how that’s all going to fit into the larger review plan. And all these things that happen upstream, meet and confers, Rule 34 requests and responses, give you your corpus for review in document review.
So before we really even get to the review management portion and the project management portion, we just wanted to take a moment to flag that whatever you negotiate upstream is what you’re stuck with in the middle of the EDRM. So don’t rush through the meet and confer and pump things downstream. Think critically, if you’re going to use search terms at all, about how you’re going to be using them. Will you evaluate them mid-review? If you agreed on certain search terms when you didn’t really know much about the case, then maybe consider retaining the right to revisit them if it looks like, mid-review, certain search terms are returning non-responsive documents. The same goes for ESI protocols. They can really bind you, in a good way or a bad way, during your document review. If an ESI protocol is a cut-and-paste from before the rules were amended, or some amalgam of ESI protocols from different prior cases in different jurisdictions, it’s not going to help you out all that much. A protocol should be a finely tuned machine. I always use the analogy of the Ferrari versus the school bus. Some protocols I see are like a school bus: they’re clunky, they’re big, and everybody gets on board. They’ve got 15 or 20 pages of provisions, and typically stuck somewhere in the middle is a landmine. Whereas a Ferrari is a finely tuned machine, and that’s really what the protocol should be, because that’s what’s going to drive your document review, and also potentially position the other side to move for sanctions against you if you don’t comply with a certain portion of that protocol. And so I think one thing to think about is to avoid getting overly complex. There are three main buckets that should always be in a protocol: privilege and confidentiality; scope, which is the custodians, the date ranges, and the data sources; and format, how you are going to account for families and metadata.
Obviously, for complex cases, those will be much more sophisticated than for simple cases, but really just be mindful of retaining control over the review. And a lot of the opportunity to do that is in the drafting of the ESI protocol. And just be mindful. I know a lot of people have different positions on this, but I’ll tell you mine. I don’t believe in discovery on discovery. If a party wants to come and say, well, we’re going to dictate how the review is going to be run and the protocol, that’s not going to get very far with me. We believe strongly in adhering to Sedona Principle 6. And that doesn’t mean that a party is not going to come and try to ask you to agree to letting them dictate how you’re going to run your technology assisted review or how you’re going to run your review, but I would just keep an eye out for that, and think about whether you think that’s really consistent with the federal rules, and where the language would be to allow something like that. I don’t think that they do.
Lisa: And I would also add: think about what happens if you haven’t discussed productions, and not even just the format of the productions, but how those documents are going to be delivered to the other side. Even for those things that seem so far off, when it comes time for production there will be a delay, because the vendor will say, how do you need them produced, where do they need to go? And you’ll either not have an answer, and you’ll need to reach out to opposing counsel and ask, and basically you’ve just added a day or so before you get back to the vendor with some answers; or you’ll give the answer that you think is right, the vendor will do it, and then you’ll call us back next week and say, can you reproduce all these documents in a different format because the other side is squawking. Typically that’s just not something you want to fight over. It’s not worth the money, fighting over something that minute. So just ask the simple questions upfront, so that your vendor has all the answers and you don’t have those delays. Because at the end of document review, when it’s time to produce documents, there’s typically a very short time period between that and depositions, and you just don’t want any delays.
Barry: I’ll talk about review management now to a certain extent. One of the keys that we’ve been talking about is organizing yourself throughout the process. As Emily mentioned with respect to the ESI protocol, that organization is key. So is managing the entire process of collecting the documents, processing the documents, setting them up for review, and then doing the quality control and the production of documents. All of it theoretically follows an orderly process. And we always recommend to our clients that we have that protocol in place, as Lisa mentioned with respect to, as an example, the production. And there is a process that we follow, and it has proven itself time over time, where privilege review typically follows the document review. We’ve had clients ask us to do a search-and-produce: just hold back any document that might have hit on privilege. And that’s risky behavior in our opinion, because it is possible, and in fact likely, unless you have an ironclad 502 order in place, that privileged documents are going to get out the door on a search-and-produce, no matter how rigorous a privilege screen you might have. You may not have Emily’s name as a search term, but clients refer to Emily giving us advice on such and such, and that becomes privileged, and it many times gets missed. And the same with confidentiality review. There are levels of confidentiality that typically come into play, and in intellectual property matters there’s usually a stipulation for attorneys’ eyes only. And that review needs to be done in an orderly process as well. And then lastly, we’ll talk about intelligent document assignments. Which custodians do we want to review first? Which issues do we want to review first, because they’re important to our case, or because that’s actually part of the protocol? We’ll review custodian Smith before we review custodian Jones because that’s what the parties agreed to.
But it’s important that we think about how we do our document review from that perspective in terms of search terms as well. We may want to look at specific search terms before we look at others.
Lisa: And let’s take a minute to talk about project management. I bet if we took a poll of everyone on this webinar right now, everyone would define project management differently. And I think there are a lot of folks out there who work in eDiscovery and use the title project manager. At the beginning of a project, it’s really important to understand what people mean when they say that. Project management, I’ve found, means a lot of different things to different people. Ultimately, if you’re outside counsel, you need to be the beachmaster of a project. There are certain duties you cannot ethically delegate, and you are ultimately responsible for making sure that the documents you produce and the discovery process are compliant with Federal Rule of Civil Procedure 26(g) and with your ethical obligations. You’re ultimately answerable to the client, setting aside all the other ethical and professional concerns that you have on your own end. And so it’s important at the beginning of a project to think about, okay, what is this project? What are the steps in this project? Who is best situated to handle each task, and who is going to implement that task? And proactively try to identify issues that could throw off your workflow at the beginning of a matter. Matters that tend not to go well are ones where it’s not clear who’s in charge of what. And that can very easily happen when you’re managing several vendors, all of whom have a project manager. And so the communications aspect of project management is really one of the critical factors, and that’s something that needs to happen at the beginning of a review. And there does need to be one person who’s ultimately accountable. For me, the thing that drives me the most crazy is when people say, and often through no fault of their own: I didn’t realize I was supposed to be doing that. And that’s bad project management.
And that’s on us as outside counsel to proactively make sure that things like that never happen. And the way to do that is to never rush into a document review, and that’s very often very difficult to do. We’ve got a second request, we’ve got a whistleblower, or we’ve got something from the DOJ or the SEC, and sometimes we’ll even rush for run-of-the-mill litigations, maybe just not as fast. There are often a lot of time constraints, and it’s penny wise and pound foolish not to take the time at the beginning to think about project management and workflow instead of jumping straight into a document review.
And so just going back to look at the slide here. Your project management is the structure of a document review that will either make or break it. One of the things that’s critical about it, now that we have all these great metrics and reporting tools, is keeping outside counsel and the client informed if you’re the managed review vendor. If you’re not sure how to do this, have a vendor come in and show you what types of reports are available. Most of the reports now will tell you, okay, your review is going to last 22 days; it’s going to cost you a dollar fifty to review your next document, based on the yield curve, based on how many responsive documents you’re getting as the review progresses. All of this reporting goes to how far along you are substantively, how much it’s costing against the client budget, and how much time is left, so you can make sure that you comply with whatever agreements you’ve made in your discovery plan, or the requirements of the court or agency to whom you’re responding. And it keeps all parts of the project moving. Your data collection is something that needs to be carefully managed, down through processing: all of those steps that are more to the left of the EDRM. It’s critical to keep project management at the fore and make sure that there’s a clear structure and a clear workflow in place that folks adhere to. And also have a plan in place for when things go awry, because they absolutely will go awry. That way people don’t just panic; you can say, okay, if it doesn’t work this way, here’s our second step. And I think for outside counsel, none of this is going to work unless you really roll up your sleeves as attorneys and understand how these new review platforms function. They are fantastic. They’re our friends, right?
Some people are a little reluctant to get involved in the technology because they feel that, as partners or associates, that’s not really my job. That is our job, and it makes us better at our jobs, because it helps us get to responsive documents faster. It helps us decrease costs, which, in the client relationship, is critical. And it’s also something we’re responsible for doing ethically, and you can get really good at it the better you understand how these tools function.
Then even setting aside what we traditionally think about document review: deposition prep. If you know how to run searches, old keyword searches like you do in Westlaw, you’ll find some documents. But if you understand how to really take advantage of clusters, or you want to look at an entire conversation but you’re not sure how to use threading, it’s really worth taking your time to figure out how to do those types of things. Also, sometimes you’re not sure what you’re looking for. That’s where things like analytics and clusters can really come into play. You may have a bunch of documents, and you may hear from a client: you know, something’s up with this guy. We have Office 365, so we know that the majority of his communications are happening after business hours, and we need you to go through his documents and figure out what’s going on here. And, by the way, the board is meeting tomorrow, so we need a high-level overview by then. That’s going to be something we’re probably not going to be able to do using what we call a traditional linear document review, going document by document. But roll up your sleeves and take the time. I’ve yet to meet a vendor that won’t perform a free training to show how to use analytics, or how to use something basic like Relativity or something more sophisticated like Brainspace. And there are so many different tools out there, not just those two.
And another thing that will help you get better at your job as an attorney is mitigating risks to the client. So, you know: costs, complying with the federal rules, getting the knowledge transfer; but always at the outskirts of all that is the risk. Anytime we produce a document, are we running afoul of European data protection laws? Are we inadvertently producing financial information? Is there customer information that might be, maybe not financial, but maybe it’s a telecom, or a cable provider, and they’re protected by some other act? Maybe there’s HIPAA information in there. Of course, there’s always privilege and work product. Or maybe your client is being sued by a competitor, and there’s very sensitive business information about meetings where they’re brainstorming, developing new products, and things like that. In the old way of reviewing documents one at a time, it didn’t mean you weren’t going to see those documents. But because now we can group documents in an intelligent way, you might be able to put together puzzle pieces that were there before, but that you didn’t realize, because you were looking at things grouped by theme, or grouped by conversation thread, or organized in a certain way, or even just prioritized so that all of the responsive documents are in one bucket for you to look at. So part of the project management aspect is not just retaining a vendor and saying, okay, so you have someone that’s going to send me review questions and answers at the end of each day and send me reports, great. It’s really about understanding how to steer that vendor, and how to tell them what you’re looking for. And in order to do that, you really need to understand how to use these tools, and it’ll make you a much more effective advocate.
Johnny: Do you find that, from the vendor’s side, having a project manager can steer you through that process as well? I know we’ve seen certain situations where attorneys aren’t knowledgeable on every review tool. Can project managers steer you through the technology, and also guide you through the different aspects of a tool that can help guide your review? Do you see that often with the vendors you’re working with?
Emily: Absolutely. And often it’s in response to the law firm asking the question. I’ve asked the question myself. The most recent tool I can think of where this applies is Brainspace, where I said: this looks awesome. I love technology, obviously, but I don’t know how to use it. Can we get on a call, and you guys show me how to get the best out of this tool? And vendors typically love to do that. I think vendors are a great partner, but you have to tell the partner, just like in any other relationship; it’s relational, right? You need to tell me what’s expected from my side, and I need to tell you what’s expected from your side. So it is on you, as the partner or the counsel or the associate: if you’re not sure how to do it, or you’re not even really sure what questions to ask, don’t hide that fact. Say it. There was a case in the news fairly recently involving Wells Fargo, where an attorney was reviewing documents. I’m not sure what the review tool was, but looking at it, I suspect it was Relativity. It seemed that this particular person, and Wells Fargo was not a party to the case, it was in response to a non-party subpoena, had gone through a certain set of documents, I think it was a thousand, and gotten to the bottom of the review pane. Assuming that it was Relativity, you all know that when you go through, you can look at 500 documents, or 1,000, and then you have to scroll to the next set. Well, this particular reviewer thought that she had come to the end of the review set, not realizing she had to scroll to the next set of documents, and greenlit the production. Now, if you’re familiar with the New York Times article, you know that there were other issues besides that, regarding private information that should have been redacted and wasn’t. There was apparently billions of dollars’ worth of customer financial information in there.
But I think the takeaway, and why it’s relevant to this conversation, is that it at least seems from the affidavit that the woman who was reviewing the documents just wasn’t that familiar with the review platform. And so the potential risk is, it’s unknowable. In many situations, receiving counsel might have just said, hey, it happens, right? It happens in the context of privileged data, and here it was privileged and customer financial information; we’ll send the production back. That’s not what happened in this case, obviously, because we’ve read about it in the New York Times. But the issue is, once a mistake like that is made, you can’t really control the other side’s response. Yes, we have some protections from clawbacks, and if the Federal Rules of Civil Procedure even apply to your case, there’s some protection in the language of Rule 26. But it’s not just preventative to learn how these tools work. They can really make you a better advocate and a better attorney.
Johnny: Thanks for that answer, Emily. Appreciate it.
Emily: So I’d love to talk about technology assisted review, and I promise not to spend all of our time on this. As we mentioned before, there are these great tools out there that can make document review so much better, so much faster, so much cheaper, and so much more precise. Is every document review a good candidate for technology assisted review? Well, first, let’s define a little bit what we mean when we say technology assisted review. I know there are some people that define it slightly differently, but here’s how I define it, and here’s what I mean when I talk about it today. I view technology assisted review as predictive coding. All that other fun stuff, the analytics, the clustering, the threading, that’s something else. I know some people group those all together, but I’m not going to do that when I’m using that term today; I’m just talking about predictive coding when I say TAR. And so there are certain data types that are not going to be good candidates for TAR. Photographs, architectural plans if you’re working on an issue where there are a lot of them, encrypted files, things without a lot of text: those are not going to be great candidates for technology assisted review. Email is our dream candidate for technology assisted review, as well as e-docs. And it’s not just for review. We also use technology assisted review to prioritize documents and to conduct QC. One great thing about technology assisted review is that it can make predictions about documents, or rank them based on how responsive it thinks they are. And you can say, well, show me all the instances where the contract attorneys coded something non-responsive, but the tool gave it a really high ranking, indicating that it’s probable those documents are responsive. Those are the ones that I want to QC.
So it’s not just something that you’re going to use for the first-level review.
And predictive coding is also a great step in a multi-modal approach. A lot of people, I think, sometimes still think of predictive coding as a computer coding your documents. You don’t have to have it code your documents at all. In fact, that’s not how we typically use it. It’s part of a process where you can still use search terms. You can still use some amount of linear review. You can use it to do really anything that you want. You can just use it to prioritize, which means putting all the responsive documents at the front of the review, or you can use it to look at a certain section of documents after you’ve run them through clusters. And for those of you that have never seen clusters, we have a pretty small image of them up here that I took off of a website. They’re a great way to get quick insight into what you’re looking at in a document population. And below that is a graph that shows a yield curve. It’s a little hard to see, I apologize for the size, but the curve angled all the way to the right is the old linear way of doing a document review: if you wanted to get to 100% of responsive documents, which is on the vertical axis, you had to review all the documents. That’s contrasted here with using an active learning tool, with using technology assisted review, where the team is able to get to all of the responsive documents. You see how quickly that red line goes almost vertical and gets to the top in just a couple of days of review, because they trained the tool and were able to successfully apply it to their review population. You may have heard people refer to TAR 1.0 and TAR 2.0. We’re not going to go into that too much today other than to say they’re not the same, and so you may find some challenges with vendors that are still using TAR 1.0 tools if you’re dealing with smaller document sets or low-richness sets.
And really take the time to talk to the vendor. I was surprised to hear some vendors describe some of their tools to me as technology assisted review, or TAR 2.0, in ways that really didn’t fit what I think is a pretty commonly accepted definition of those terms. So if someone’s selling you predictive coding or TAR 2.0, make sure you kick the tires a little bit and understand what they mean when they say that.
In terms of judicial acceptance, if you’re in front of Judge Peck, you’re golden. But he’s not the only judge; he just has the best cases to cite for that. Rio Tinto is one of his cases; Hyles is a recent one. But there are other good rulings, for example out of the District of Nebraska. There’s a lot of great case law out there that refers to predictive coding as a more accurate, more efficient, more cost-effective method of reviewing ESI. And then you can measure your success – it’s quantifiable when you’re using technology assisted review. You may have heard people talk about recall and precision. Those are ways of measuring how much of the responsive data we got and, in a sense, how much it cost us to get that information. Those are really valuable metrics that we didn’t have back in the olden days. They help show the success of the review and give us a certain level of confidence that we can represent to the government or to a judge that we’ve complied with what was asked, or that we can sign that 26G certification and feel pretty good about it. Lisa, did you have any comments you wanted to make about that?
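To make recall and precision concrete, here is a minimal sketch with hypothetical counts – the numbers are illustrative only, not from any real review discussed in the webinar:

```python
# Recall and precision for a document review, using hypothetical counts.
# recall    = responsive docs found / all responsive docs in the collection
# precision = responsive docs found / all docs flagged as responsive

def recall(true_positives: int, false_negatives: int) -> float:
    """Share of truly responsive documents that the review actually found."""
    return true_positives / (true_positives + false_negatives)

def precision(true_positives: int, false_positives: int) -> float:
    """Share of documents flagged responsive that really were responsive."""
    return true_positives / (true_positives + false_positives)

# Suppose (hypothetically) the workflow surfaced 9,000 truly responsive
# documents, missed 1,000, and pulled in 3,000 non-responsive ones.
print(f"recall:    {recall(9_000, 1_000):.0%}")     # 90%
print(f"precision: {precision(9_000, 3_000):.0%}")  # 75%
```

High recall with low precision means you found most of what matters but paid to review a lot of junk; the trade-off between the two is what these metrics let you quantify and defend.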
Lisa: We’ll go ahead and go to the next one just to make sure. I know there’s some bigger ones coming up.
Barry: I’ll speak about contract attorneys. We’ve got the pluses and the minuses, the green and the red flags, against these lines of text here. We’re talking now about review team considerations: makeup, where the team is located, and so forth. Contract attorneys typically have lower costs. The firm using or hiring the contract attorneys has no liability issues and isn’t worried about payroll taxes, workers’ comp, and so forth. There’s ultimate flexibility on team size – you can add or subtract people at will. Those are some of the advantages of using contract attorneys. And if you’re going to use contract attorneys, we recommend that you use agencies that employ their contract attorneys, because employees have different motivations than 1099 contractors. They’ll get benefits, their payroll taxes will be covered, and so forth. And that’s an advantage when dealing with the morale of contract attorneys. Then on the negative side, you’ve got limited subject matter knowledge. As we mentioned earlier, institutional knowledge doesn’t stick with contract attorneys because there’s generally a high turnover rate. And they have limited productivity incentives. They don’t mind delays and downtime – for example, unscheduled downtime when new documents are added and the system has to index, which, because of the speed of the case, may unfortunately have to happen during normal review hours. They don’t mind waiting for the documents to become available to them again. And Emily, you wanted to speak to firm associates and staff attorneys.
Emily: Yeah, just briefly. When you’re thinking about staffing and sending everything out to contract attorneys, remember that if you’re incorporating some of these technology assisted review steps, or using some of the clusters and analytics, that’s a really fast way to get information from your document collections in many cases. So if you’ve got associates who need to understand the lay of the land quickly, or who are going to be drafting some of the protocols, take time to have them do some of that. Spend a couple of days in the analytics, spend a couple of days in the clusters, and maybe cut down what you’re sending over to the managed review center. I know sometimes the instinct is to just get it out, get people looking at the documents, but it’s worth the time to pause and spend a couple of days looking through the documents to see if you can use the analytics to gain insight into what’s in them, or just to eliminate documents from the review stream. And I can’t tell you how often we’ll have a client who doesn’t even want to give input on the protocol, where the review team is thinking, okay, we’ve read the RFPs and we’ve seen the docs, but we don’t know your client, we don’t know your case – we really need that input from the law firm. That input just makes the review go so much quicker and more smoothly.
Barry: And with review team location, there are options. There’s onshore, by which we mean the law firm is here and the review team is either within the law firm or across the street or down the block. That’s advantageous because the associates and the Emilys of the world can come over and educate the reviewers as they proceed. However, it can mean a higher cost because, in many cases, the law firms are located in high real estate cost markets, so it’s not necessarily the advantageous way to go from a cost perspective for the client. Then there’s nearshore, where we’re looking at a remote, lower-cost locale where the attorneys can still get the training – the Emilys of the world can come in and do in-person training, or they can be available through electronic means like the webinar we’re doing here today, with screen sharing and video conferencing. One of the negatives of nearshore is that there can be a limited pool of reviewers available; if there’s a foreign-language review, that can be a problem, because smaller geographic markets don’t necessarily have many speakers of the language you might need in a given case. And then there’s offshore, which is obviously the lowest cost, and typically those offshore agencies do have full-time employees. We do see offshore used – we recently had a case where the data couldn’t leave the country. It was a far eastern country with very restrictive rules on data flow, so we needed to have staffing occur in that locale. One of the negatives with offshore is that the associates can’t travel there easily, except through electronic means.
Emily: So, review companies with full-time employed attorney reviewers. On our team, they’re all licensed attorneys. They do this job – well, not 24/7, though they’ll say it feels that way; it’s about 40 hours a week – and this is what they do every week for years on end. They are basically experts in the process of document review. They’re not going to be experts on every case, but they are experts in using the tools, in knowing how email threading works, in knowing where to find information in Excel files – things it takes a lot of document review experience to learn. That’s all they do; they’re experts at this. With that typically comes a lower cost. Obviously, they’re much lower cost than associates at a law firm, who usually also have better things to do with their time and the client’s money than to sit and review all the documents. But they’re typically lower cost than even the temps because, as Barry mentioned with the downtime and so forth, we don’t have to worry about that. We don’t have to worry about bringing an entire team up to speed on the technology. There are a lot of efficiencies you gain with full-time document review attorneys, and that translates into a lower cost. The biggest value for the client, though, I think, is the institutional knowledge. We’ve got clients whose cases we’ve been working on for 7, 8, 9 years at least – things like IP licensing and medical devices, which are fairly complex concepts to understand and work through. But our review team has reviewed this material for so long that it’s just easy. And granted, we’ve only had a few different medical device companies, but I’m pretty sure we could take on any medical device case at this point; we understand the medical terminology, the different roles, and the reports people write. So: the institutional knowledge.
These are the reviewers who, if you gave us a case a year ago and they reviewed it, and all of a sudden another piece of that case comes back this year, or it gets appealed and you have to go back and look at more documents, still remember the documents and the information on that case. You’re not spending weeks bringing another team up to speed on just the basics of understanding the documents themselves. And they’re also fully vetted and trained by the company. Our reviewers are trained on all the review platforms. Most people know how to use Excel – at a basic level it’s a glorified calculator, but it can do so much more. And while most of us can use Microsoft Excel, most of us also know the one or two people in our office who are Excel wizards and can make that software sing and do amazing things. That’s really what full-time document review attorneys can do with the document review tools. The tools are designed so that most people can get in with very little training and review their documents – they’re simplified enough for that. But put people in them who use them all the time and have really studied how they work, and they can make those tools sing and come back with so much more information for you.
And there’s also the review site itself. When we’re talking about temps, a lot of times temps get shoved into back rooms with folding tables and whatever crappy computers were left over. When you’ve got a dedicated review team, you’ve got people who are always using up-to-date software and the newest equipment. It’s their job all the time, so they’re set up for it.
Lisa: So I don’t think we need to go through these next two slides line by line. They go over the critical steps in training a team and establishing your priorities. I would just say it’s worth the time to review these if you’re in the middle of launching a review – keep them handy as a reference. You’re the teacher here. The most successful trainings I’ve ever seen were run by associates who’d done Teach for America or had been teachers in the past. You’re there to give someone the tools to code the documents. You’re not there to impress them with your knowledge or to talk about things in a super sophisticated way that nobody can understand. You’re there to give them the tools and information they need to code the documents, and that’s what the focus should be at a review launch. We can skip through this slide as well.
Emily: So I can actually get through this one fairly easily. Basically, document review usually starts with a first-level review, which is the basic pass: here are the documents we want you to look at, here’s what we’re looking for, go for it. Following that, you usually have a second-level or QC review, and what really needs to be stressed is that QC should not be about catching all the mistakes. Unless you’re actually going to look at every single document again, you’re not going to catch all the mistakes. What you’re looking for during QC are the people who are making mistakes. Say your sampling covers 10% of the documents. You’re looking for a pattern: is Joe Smith constantly missing this issue? If he is, let’s look at a bigger selection of Joe Smith’s documents, or a bigger selection of documents on that topic, because either Joe Smith didn’t understand something or the review protocol didn’t explain the topic well enough. Your QC is just helping you pinpoint areas that you then need to go back and examine further. Obviously, sampling 10% of your documents is not going to find every mistake, but it should help you identify any areas you need to go back, look at, and try to fix. And then metrics – the tools now have so many metrics you can check, it’s crazy. We could do an entire show on just metrics.
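The QC pattern Emily describes – sampling, then looking for reviewers whose coding gets overturned unusually often – can be sketched in a few lines. The reviewer names and sample here are hypothetical, purely for illustration:

```python
from collections import Counter

def overturn_rates(qc_sample):
    """qc_sample: list of (reviewer, was_overturned) pairs drawn from,
    say, a 10% QC sample. Returns each reviewer's overturn rate so that
    outliers (a 'Joe Smith' who keeps missing an issue) stand out."""
    seen, overturned = Counter(), Counter()
    for reviewer, was_overturned in qc_sample:
        seen[reviewer] += 1
        if was_overturned:
            overturned[reviewer] += 1
    return {r: overturned[r] / seen[r] for r in seen}

# Hypothetical sample: True means the QC reviewer changed the coding.
sample = [("joe", True), ("joe", True), ("joe", False),
          ("ann", False), ("ann", False), ("ann", True)]
rates = overturn_rates(sample)
# joe's rate (~67%) is well above ann's (~33%), so the next step would be
# to widen the net on joe's documents, or revisit the protocol language.
```

The point of the sketch matches the talk: a 10% sample won’t find every miscoded document, but per-reviewer (or per-issue) rates tell you where to look next.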
Lisa: Yup. And again, with this slide, I don’t think we need to go through it line by line. But take a look at that Irth Sols case, which is one of the cases we have for you on the last slide and in one of the attachments. That case is a great example of a production that got out the door – actually twice – with privileged documents, where appropriate checks and balances in place would have caught the privileged documents in the production. It really stresses the point, especially in that extremely well-written ruling, that you cannot un-ring the bell. If a court finds that you’ve acted recklessly, it’s going to find waiver. And in that case, they had a clawback, and the court essentially said: this clawback, in a word, sucks; I’m not going to use it; I’m going to fill in the blanks here with Federal Rule of Evidence 502. So it’s not enough to just sit back and take comfort in a clawback or Rule 26, because you never know what you’re going to get from the other side. Obviously, we have case law, and there’s quite a bit of predictability there, and we have the rules. But just as we saw with the Wells Fargo incident – the article called it a data breach – where financial information was produced, the other side might not give the data back, and the plaintiff might call the New York Times. So it’s really important to go through the steps we’ve outlined in these slides very carefully.
Barry: And I have one point on process management, which is critical. We mentioned ISO. The key here is that as you go through a document review, the protocols change, and typically they change for a good reason. A year from now, when you go back to see what the protocol was – because, as Lisa mentioned, there could be new custodians or new issues to review – if you haven’t documented each change clearly, you’re going to be stuck twisting in the wind, not knowing how you got to where you are now. So the critical aspect, in our opinion, with respect to process management is documenting the changes.
Lisa: It’s not enough to just make the changes in the protocol, because I can’t tell you how many times we’ve made a change and two years later somebody asks about it and nobody can remember why – obviously somebody told us to make the change. You really need to document who instructed the change, what the date was, and what the circumstances were, so that two years later you have some context for it.
So, moving into pricing models: document review is typically priced either hourly or per document. For hourly, you’re paying the attorneys per hour to review the documents. It’s harder to determine your overall cost upfront – it’s a gamble. If the review goes slowly, if there are technical issues, if any of a million things take more hours than you expected – a lot of spreadsheets, say – then it’s going to last longer than you planned and cost more. The other side of the gamble is that you could end up with fewer hours than expected, but you’ll never know that upfront, unfortunately. Per-document pricing is the model we usually like to go with. It gives clients very consistent pricing upfront – they know exactly what their bills are going to be. You’ve got one price per doc, and that covers the review, the QC, and everything. So if you’ve got a hundred thousand docs, you know exactly how much your cost is going to be at the beginning, middle, and end, no matter what happens with those documents.
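The trade-off between the two pricing models comes down to simple arithmetic. A quick sketch, with entirely made-up rates and review speeds:

```python
def per_doc_cost(n_docs: int, rate_per_doc: float) -> float:
    """Per-document model: fixed and known upfront; one rate covers
    the review, the QC, everything."""
    return n_docs * rate_per_doc

def hourly_cost(n_docs: int, docs_per_hour: float, rate_per_hour: float) -> float:
    """Hourly model: the total depends on actual review speed, so it is
    only an estimate until the review is done."""
    return n_docs / docs_per_hour * rate_per_hour

docs = 100_000
fixed = per_doc_cost(docs, 1.00)     # $100,000 no matter what happens
fast = hourly_cost(docs, 60, 45.0)   # about $75,000 if the review flies
slow = hourly_cost(docs, 30, 45.0)   # $150,000 if spreadsheets bog it down
```

With hourly billing, halving the review speed doubles the bill; the per-document model shifts that risk from the client to the review provider, which is the incentive question raised later in the Q&A.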
Emily: So we’ve talked a lot today about the Rule 26G certification. Some of the cases we’ve highlighted here also go into the ethical rules that can come into play. Rule 26G is not used as often as it could be, I think. It’s the section of Federal Rule of Civil Procedure 26 that requires that every discovery disclosure, request, response, or objection be signed by at least one attorney of record. Basically, you’re certifying that, to the best of your knowledge as an attorney after reasonable inquiry, your request, response, or objection is neither unreasonable nor unduly burdensome or expensive, considering the needs of the case and the issues at stake in the action. Some of the cases we’ve attached here, we attached because they go through Rule 26G. So if you’re interested, either as counsel or as a vendor, in understanding how it’s applied, the Branhaven case is a great one to read. It’s out of the District of Maryland, as is Mancia v. Mayflower. The court really took the attorney to task over what were frankly pretty egregious discovery violations, but it’s a good walkthrough of the rule. In addition to the ABA model rules, I think the California Standing Committee’s Opinion 2015-193 is a great reference as well. It goes through a list of factors that attorneys need to be familiar with in order to meet their requirements under the professional responsibility rules in California. It says things like: attorneys should understand and analyze the client’s ESI systems. You need to do that anyway if you’re going to be effective at your meet and confers, but it doesn’t always occur. You need to understand and identify the custodians of relevant ESI. You need to be able to engage in meaningful and competent meet and confers.
So aside from Rule 26G and the ethical rules, there are also rules promulgated by the jurisdictions in which you practice that you should be mindful of. I think it’s worth everyone’s time to review that California opinion – again, it’s Opinion 2015-193, issued in 2015 – and just look through that list and ask: am I really fluent in these? Do I really understand all of these things? It’s a great list of topics you should be fairly fluent in if you’re practicing in the eDiscovery space as an attorney.
Johnny: Thanks, Emily. So we’re actually going to send out a handout of the case law, with the notes that Emily and Barry talked about, to all the ACEDS members that joined. But there are a couple of questions.
Emily: Yeah, let’s not go through these cases. Note that the McDermott Will & Emery case is not false claims litigation – it’s really about malpractice; that’s why we included it. And we already talked a little about Irth Sols and Branhaven, so why don’t we use the last five minutes for questions. Do you want to start taking those?
Johnny: Yeah, I have a few questions that came through, so we want to thank the members for submitting those. One question was: how does the QC process mature over time with the modern approach, and how do we best leverage these tools to track errors and changes in QC? I’d love to hear your opinion on this.
Emily: Yeah, I can take that one. I recognize some of the names on the attendee list today, so some of you already know that I love to track errors. And one of the ways document review has changed is that we can actually do that now. I’ve worked with some vendors to generate reports that just track outliers, and you can see who’s coding – maybe not wrong or right, but differently. That might show that your training isn’t that good, if the one outlier is the person getting them all right and the rest of the group is getting them all wrong. So it allows for a more two-way QC of the training and of the coding. But you can also track specific errors, have those put into reports, and then see what the error percentages are and really home in not only on who’s making the errors, but on what type of errors. Are they missing privilege? Are they missing privacy redactions? So it’s not just: are people making mistakes? Now we have much more insight into what type of mistakes they’re making. And then you can run what I call a discrepancy analysis using a predictive coding tool – looking at where the computer and the person disagree – which also gives you insight: did I train the tool properly? I have to say, I have never found the tools to be wrong; it’s usually an issue with the reviewer or with the training. But one day it will happen that we didn’t train the tool the right way, and you’ll get insight into maybe needing additional training for your review tool, or for the humans doing the coding.
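The discrepancy analysis Emily mentions – surfacing documents where the human coding and the predictive coding tool disagree – can be sketched like this. The document IDs, scores, and the 0.7 threshold are hypothetical, not from any particular platform:

```python
def discrepancies(human_coding, model_scores, threshold=0.7):
    """human_coding: dict of doc_id -> bool (did the reviewer code it responsive?)
    model_scores: dict of doc_id -> float (tool's responsiveness score, 0 to 1)
    Returns doc_ids the reviewer coded non-responsive but the tool ranks
    above the threshold; these are prime candidates for a second look."""
    return sorted(doc_id for doc_id, responsive in human_coding.items()
                  if not responsive and model_scores.get(doc_id, 0.0) >= threshold)

human = {"d1": False, "d2": True, "d3": False}
scores = {"d1": 0.92, "d2": 0.95, "d3": 0.40}
flagged = discrepancies(human, scores)  # only d1: coded no, scored very high
```

Which side is right in each disagreement – the reviewer, the training set, or the tool – is exactly what the follow-up QC review then determines.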
Johnny: That’s a great response, and I think we’ve seen that a lot in the reviews that we’ve managed.
Lisa: We also like to use overturn reports a lot. An overturn report captures the places where, during QC, you changed the coding that the original reviewer applied, and we can run an awful lot of analytics on that afterward that give us a lot of insight. Like you said, who the outliers are and all that. But also things that are even more specific – like, are they tagging too much as work product when they should be tagging it as attorney-client privilege? Very small distinctions can make a lot of difference, and the overturn reports are really helpful for that.
Johnny: That’s a great point, Lisa. One of the questions raised by someone on the call: with the per-doc model, is there an incentive for the companies to go through the documents faster and not be as careful? Can you talk a little bit about that?
Lisa: I would say absolutely the opposite, basically because our QC and the finalized product are included – QC doesn’t cost any more. Say we’re charging a dollar a doc. That covers the cost of coding the documents the way we’re instructed to code them. Meaning, if you tell us A, B, and C are responsive and X, Y, and Z are privileged, and you get the docs back and they’re not coded that way, you give them back to us and we change them, because that wasn’t what you asked for, and we don’t charge you more for that. All of our QC is included in that pricing. So it really makes the reviewers much more careful, I would have to say. The reviewers make sure they don’t get anything wrong – or they try not to – so that we’re not spending so much time in QC, because QC is on us; it’s not costing the client anything. The less time we have to spend fixing mistakes, the better it is for us.
Johnny: That’s a great answer, Lisa. Just one last question before we sign off, it’s a question about production. If a party does not specify a format for production, can you produce it in a format that is reasonable?
Emily: Yup. That’s in the language of Rule 34. And one of the cases we cited today has some really great language about that – the party in the Branhaven matter didn’t do it right. They chose to produce in non-searchable static PDF form, and they ended up having to pay the other side to convert it into a searchable format. But if you look at the committee notes to Rule 34 – I think it’s in the 2006 amendment – there’s some pretty clear language that if the other side doesn’t specify a format, you can produce in a form as long as it’s reasonably usable.
Lisa: And the only comment I would add is that that case shows exactly the risk. You think a format is reasonable, so you produce it that way – but that doesn’t mean the other side isn’t going to argue about it. And any argument that takes the attorneys back to court is going to cost more money. If you just ask the question upfront and get the answer, none of that has to happen.
Johnny: Fantastic, Lisa. Thank you, everyone, for joining the call today, and thanks to the panelists for providing some great information about managing the review and the entire process from beginning to end. We will follow up with the handout of the case law, as well as the presentation given today. I’d like to thank everyone on the ACEDS call, and we look forward to our next presentation.