
Keeping Control of Your Managed Review
William B. McManus, Esq.

Consider:
Is a lawsuit against a corporation a legal problem or a business problem?
Does it have a business value?

What about the cost of e-discovery and document review as a part of the litigation process?
A legal problem or a business problem?

Successful businesses solve business problems by applying principles of project management: identifying a desired outcome, due within a certain time, at a defined cost.

HAVE A PLAN AND STICK TO IT


Project Managing Your Review:

Define scope (contingency for scope change?)


Who is responsible for what
Processing/hosting budget
Review budget
Project duration
Production budget
Required resources (generally)

Who is Responsible for What:


Litigation team (law firm) responsibilities
Corporate legal department responsibilities
Data collection/processing team responsibilities
Review team (firm vs. contract) responsibilities

Processing / Hosting Budget


Predicting Processing / Hosting Costs
125 GB inbound x $150 per GB processing = $18,750
70% elimination through filtering = 37.5 GB promoted
37.5 GB x $450 per GB = $16,875
25 licenses at $100 per month = $2,500 (recurring)
Data storage: 37.5 GB x $50 per GB per month = $1,875 (recurring)
Total estimated processing / hosting = $40,000
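These line items lend themselves to a simple calculator so the estimate can be re-run when volumes or rates change. A minimal sketch in Python; the default rates simply mirror the assumed figures above and should be replaced with your vendor's actual pricing:

```python
def processing_hosting_estimate(inbound_gb=125, processing_per_gb=150,
                                cull_rate=0.70, hosting_per_gb=450,
                                licenses=25, license_per_month=100,
                                storage_per_gb_month=50, months=1):
    """Estimate processing plus first-month hosting costs for a review."""
    promoted_gb = inbound_gb * (1 - cull_rate)           # 37.5 GB promoted
    processing = inbound_gb * processing_per_gb          # $18,750
    hosting = promoted_gb * hosting_per_gb               # $16,875
    recurring = (licenses * license_per_month
                 + promoted_gb * storage_per_gb_month)   # $4,375 per month
    return processing + hosting + recurring * months     # $40,000 for month one

print(f"${processing_hosting_estimate():,.0f}")          # $40,000
```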

Review Budget
Predicting Review Costs
200,000 documents / 65 documents per hour = 3,077 hours
3,077 hours x $65 per hour = $200,005
Training = 20 attorneys x 3 hours x $65 = $3,900
PM = 1 PM x $95 per hour x 8 hours x 25 days = $19,000
Total estimated review cost: $222,905
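The same review-cost arithmetic can be captured in a short function. This is only a sketch using the assumed rates above; hours are rounded up to whole hours, which is how the total lands at the $222,905 shown:

```python
import math

def review_budget(docs=200_000, docs_per_hour=65, reviewer_rate=65,
                  reviewers=20, training_hours=3,
                  pm_rate=95, pm_hours_per_day=8, review_days=25):
    """Estimate first-pass review cost: reviewer hours, training, and PM time."""
    review_hours = math.ceil(docs / docs_per_hour)               # 3,077 hours
    review_cost = review_hours * reviewer_rate                   # $200,005
    training_cost = reviewers * training_hours * reviewer_rate   # $3,900
    pm_cost = pm_rate * pm_hours_per_day * review_days           # $19,000
    return review_cost + training_cost + pm_cost                 # $222,905

print(f"${review_budget():,}")                                   # $222,905
```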

Project Duration
Predicting a reasonable review and production schedule
200,000 documents / 65 documents per hour = 3,077 hours
3,077 hours / 20 attorneys = 154 hours each
154 hours / 8 hours per day = 20 days
Plus 3-5 days for QA and production preparation
Review & QA duration = 25 days
Production = 5 days
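The duration math follows the same pattern. A minimal sketch, assuming the QA buffer (5 days here, the top of the 3-5 day range) and the production window are taken from the figures above:

```python
import math

def review_schedule(docs=200_000, docs_per_hour=65, reviewers=20,
                    hours_per_day=8, qa_days=5, production_days=5):
    """Estimate total project duration in working days."""
    total_hours = math.ceil(docs / docs_per_hour)        # 3,077 hours
    hours_each = math.ceil(total_hours / reviewers)      # 154 hours per reviewer
    review_days = math.ceil(hours_each / hours_per_day)  # 20 days of review
    return review_days + qa_days + production_days       # 30 days total

print(review_schedule())                                  # 30
```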
The greatest opportunity for cost escalation is the fire drill. Avoid it by planning ahead.
Fire drills result in down time; overstaffing; do-overs; and errors caused by fatigue, lack of training, or confusion.

Production Budget
Production Costs: Depend on the form of production and can be negotiated on a variety of bases:
e.g., $1,000 per 50,000 pages for TIFF, or by the GB for native, etc.
In our example of 200,000 documents, assume 30% responsive, or 60,000 documents, so $1,000-2,000 for OCR TIFF with appropriate load files.
Tech time for special projects @ $200 per hour; estimate 5 hours = $1,000
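Production costs resist a single formula because they depend on the negotiated basis, but a rough page-based estimate can still be sketched. The pages-per-document figure below is purely an assumption; the slide's $1,000-2,000 range corresponds to roughly 50,000-100,000 pages at $1,000 per 50,000:

```python
def production_estimate(docs=200_000, responsive_rate=0.30, pages_per_doc=1.5,
                        cost_per_50k_pages=1_000, tech_rate=200, tech_hours=5):
    """Rough TIFF production plus tech-time estimate."""
    responsive_docs = docs * responsive_rate             # 60,000 documents
    pages = responsive_docs * pages_per_doc              # 90,000 pages (assumed)
    tiff_cost = pages / 50_000 * cost_per_50k_pages      # $1,800
    tech_cost = tech_rate * tech_hours                   # $1,000
    return tiff_cost + tech_cost

print(f"${production_estimate():,.0f}")                   # $2,800 at 1.5 pages/doc
```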

Project Overview Generally


Projected Project Cost:
Review: $222,905
Process/Hosting: $40,000
Production: $2,000
Tech Time: $1,000
Total Estimated Cost: $265,905

Projected Project Duration:
Total time: 30 days

Required Resources
Project the resources you will need:
How will you collect data?
How will you process and host data?
Will you use an internal or external review platform?
The platform you use may depend on specific case needs
Who will review?
Foreign language documents?
Do you have reviewers to manage those?
Does your selected technology work with foreign language documents?

Specific Case Needs: Issues Unique to Each Matter


Where will you conduct your review?
Within the law firm? With an external review partner?
Do you have space / work stations?
How many reviewers do you need?
Appropriate hardware / software
Dual screens
Are you opening documents locally?
Appropriate bandwidth?
Adequate security?

Specific Case Needs


Have you agreed with your adversary on issues that can cause do-overs, e.g.:

Filters
Form of production
Form of review: Linear? Advanced?
Parent-family relationships
Metadata fields
Footers (confidentiality, attorneys' eyes only, etc.)
Have you agreed on descriptive language for the privilege log?

Specific Case Needs


Are you leveraging advanced review technology?
Culling/clustering; predictive ranking/coding; e-mail analytics, etc.

Are you leveraging de-duplication technology?
What about near duplicates, and if so, what is your threshold?
What documents will be subject to near de-duplication?

How will you handle parent / child relationships?
Parent-child when relevant?
Parent-child when privileged?

Will you run privilege filters / have a separate privilege team?

Consider Tag Structure: Get the Most from Your Review


Work with the litigation team to develop a sound tag structure.
Spend time considering how these documents will be used later in the case; this will take time.
Jury instructions are a good place to start.
The Case Theories and Defenses memo drafted for the client is another good starting point.

Sample Tag Structure

QA/QC Complete
Responsive
Non-Responsive
Specific Substantive Issues

Corporate Knowledge of Fraudulent Communications to Plaintiff


Corporate Knowledge of Reliance on Deceptive Statements by Plaintiff
Evidence Supporting Claims of Damages
Evidence of Michael Smith's Efforts to Hide His Conduct

Confidentiality / Attorneys' Eyes Only


Trade Secret
Attorney-Client Privileged
Redaction Required

Work Product Privileged


Statutory Privileged

Consistent Tagging
If multiple reviewers are working on the same documents; or
If multiple law firms (joint defense) are working on the same general documents:
Ensuring consistent tagging across reviewers / firms
Ensuring consistent privilege calls across reviewers / firms
Ensuring consistent redaction across reviewers / firms

Ensuring Quality Three Parts


Quality Staff
Quality Training
Quality Assurance / Quality Control

Quality Staff:
Proper hiring/background checks/reference checks
Align attorney experience with specific case needs
Patent attorneys with engineering background
Pharmaceutical attorneys with health science background
Environmental attorneys with science background
Will you use:
in-house corporate attorneys
law firm attorneys
outsourced review
onshore
offshore

Quality Training: Let Your Review Team Succeed


Prepare an informative project manual for your review team.
Client must sign off that the manual accurately describes the
goals of the project.
Manual includes two parts: (1) memo; and (2) supporting docs
Project memo:
Precise description of the objective of the review and what is expected of each reviewer
Brief facts of case
Outline of theories and/or defenses
Issues of interest/concern defined and described
Types of documents you expect and don't expect
Issue tags, hot documents, and footer tags defined with examples
Etc.

Supporting Documents

List of key persons by title and function


Keyword list
Organizational chart
Relevant pleadings and motions
Requests for production
List of outside and inside counsel (for privilege / work product identification)
Jurisdictionally relevant synthesis of law on privilege and work product
Etc.

Schedule a formal training session with the litigation team / in-house counsel and the review team.
Have them define exactly what they expect from reviewers.
Have them describe the case, theories, etc.
Go through sample documents as a group to define tagging.
Audio-video record the training in case deadlines are truncated.
Have the litigation team present during the first couple days of review to answer questions during the learning-curve period.

Quality Assurance/ Quality Control


How will you QC your review?
What is an acceptable QC rate? 5%? 8%? 10%?
What is your blend of PM to reviewers (e.g., 1 to 12)?
At 8% QC: 10 reviewers x 70 docs per hour x 8 hours = 5,600 docs per day; 8% means 448 docs need QC (roughly 6 hours)
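For planning purposes, the daily QC load can be estimated the same way. A minimal sketch, where the QC throughput (documents the PM can check per hour) is an assumption chosen to match the rough 6-hour figure above:

```python
import math

def daily_qc_load(reviewers=10, docs_per_hour=70, hours_per_day=8,
                  qc_rate=0.08, qc_docs_per_hour=75):
    """Documents reviewed per day, documents to QC, and PM hours needed."""
    reviewed = reviewers * docs_per_hour * hours_per_day   # 5,600 docs per day
    qc_docs = round(reviewed * qc_rate)                    # 448 docs to QC
    qc_hours = math.ceil(qc_docs / qc_docs_per_hour)       # ~6 hours of PM time
    return reviewed, qc_docs, qc_hours

print(daily_qc_load())                                     # (5600, 448, 6)
```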

What types of QA will you use?

Filtering QA (validating filters)
Review QA
The best QA is up-front training and ongoing communication
Hold team meetings to discuss errors, changes in policy, etc.
Self-QA by team members
Random eyes-on and word-search QA by the PM
TIP: Run privilege terms through the final production set for 100% privilege QA

What Reporting will you (or your client) need?


Individual reviewer and team production reports (daily, weekly, etc.)
Individual tag hit reports
Bulk tagging reports
Daily budget expended / review completed reports

Processing Data for Review


Have a plan in place for processing and culling data, e.g., use of search terms:
keyword, date, Boolean, stem, fuzzy, concept, etc.
"The only prudent way to test the reliability of the keyword search is to perform some appropriate sampling of the documents determined to be privileged and those determined not to be in order to arrive at a comfort level that the categories are neither over-inclusive nor under-inclusive."
Victor Stanley, Inc. v. Creative Pipe, Inc., 250 F.R.D. 251, 257 (D. Md. 2008)

Creating Search Terms


Recognizing that search terms are imperfect, if you are going to use them, make them the best they can be.
Involve the people whose documents you are searching in the process of creating those filters.
Interview custodians: how did they talk?
Include in-house counsel: how does the corporation talk?
Work on several iterations of search terms internally, and even with adverse counsel, to refine them.
Useless, poorly planned terms result in bad pulls of data.

Validating Your Searches


Define criteria for relevant/responsive docs
Test a purely random subset through actual eyes-on verification
Check the positive and non-positive sets for accuracy, and modify
Redefine your terms and resample until you reach acceptable accuracy

Define criteria for privilege/confidential docs
Test a purely random subset through actual eyes-on verification
Redefine until you reach acceptable accuracy
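One way to operationalize the "purely random subset" step is to pull random samples from both the hit and no-hit populations for attorney verification. A minimal sketch, assuming documents are tracked by ID and the hit list comes from the platform's search export; the function name, sample size, and seed are illustrative only:

```python
import random

def sample_for_validation(all_ids, hit_ids, sample_size=200, seed=1):
    """Draw random samples of search hits and non-hits for eyes-on checking."""
    hits = sorted(set(hit_ids))
    misses = sorted(set(all_ids) - set(hit_ids))
    rng = random.Random(seed)
    return {
        "hit_sample": rng.sample(hits, min(sample_size, len(hits))),
        "miss_sample": rng.sample(misses, min(sample_size, len(misses))),
    }

# Attorneys review each sample, their calls are compared with the filter's,
# and the terms are refined and resampled until accuracy is acceptable.
```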

Applying Your Methodology

Your Project Manager has responsibility for planning and organizing the review, keeping it on track, and getting you to the point of actual review.
Once you are at the point of actual review, the Project Manager assumes responsibility for actually executing your plan, which includes many responsibilities, e.g.:

Gather and dispense training materials
Select a review team to match specific case needs
Participate in keyword validation
Perform early case assessment: culling, filtering, and sorting data
Create a tag structure to fit case needs
Set up the database

Applying Your Methodology


Train the predictive ranking / coding technologies
Orchestrate substantive training for reviewers
Confirm that passwords are assigned and work stations work
Anticipate every conceivable bottleneck between the review location and the data hosting vendor (e.g., pop-up blockers; other security issues)
Prepare and keep a project diary to track scope and/or rule changes, quality checks, etc.: the things you will need for defensibility.
Batch and assign documents to appropriate team members
QA/QC
Prepare daily/weekly progress reports
Oversee privilege review and privilege log development
Oversee transfer of data to production team/ to end client

DOCUMENTING YOUR PROCESS


Document your agreements with adverse counsel

Scope of holds
Scope of custodians
Key word filters
All efforts to resolve conflict (white hat approach wins)

Document your QA/QC process


Document filter validation, culling, batching, etc.
Document your training and overall process
Document your policy changes / critical activities
Project manual is a terrific resource for showing defensibility

CHOOSING THE RIGHT TECHNOLOGY


Have you sampled various platforms?
Ask questions, interrupt, understand limitations.

What are your budgetary constraints?


Is your review team familiar with this platform?
What are your specific case needs?
Volume, sophistication, expectations, etc.

Does the platform you are considering give you the tools
that you need / want?
e.g., create your own tags, save search results, robust reporting
capabilities, redact on the fly, view in native, etc.

CHOOSING THE RIGHT TECHNOLOGY


Does it have the features you want?
E.g., multiple color highlighting, clustering, e-mail analytics,
predictive ranking, etc.

Have you vetted the vendor who will be hosting your data?
What are their security requirements?
How long have they been around?
How much capacity can they handle?

Never let up-front price be the sole driving factor in deciding what
technology to use.
Poor project management on the technology team can derail a project.
Sophisticated technology can help reduce the cost of review, which is traditionally the largest expense.
Bells and whistles can dramatically increase decisions per hour
Reducing the volume of documents that require eyes-on review
