Thursday, July 27, 2017

It's Not Just Legal Research -- Check Alerts & Analytics Too!

Numerous articles have been written lately about the differences in relevant results pulled back by various legal research databases (e.g., Robert Ambrogi's recent article and Susan Nevelow Mart's research, which really kicked off this discussion). That may just be the tip of the iceberg. In closely examining the resources you already have, or those you're trialing, you may soon discover more discrepancies than you'd expect when making apples-to-apples comparisons. Docket litigation alerts, and even those trendy analytics we can't seem to get enough of, are at least two additional areas where more critical evaluation is needed.

New Litigation Docket Alerts
Back in April, I evaluated and compared two weeks' worth of patent alerts from six different resources. After examining more than 250 cases, the results were quite surprising to me. Only two of the resources included all of the patent infringement cases, and one of those two, while including the relevant cases, did not always indicate the patent infringement cause of action. Three of the six resources missed over ten cases in that two-week period; the lowest performer missed seventeen! Some of the missing cases had both trademark and patent infringement causes of action; others were run-of-the-mill patent infringement cases that simply did not come through. My hypothesis is that this happens because of the nature of suit limitations on the civil cover sheets filed in federal district court cases:
[Image: federal civil cover sheet, showing the Nature of Suit section]
Unless the various vendors have diligent people parsing through the filings (hint: it's not the vendors charging an arm and a leg), or their technology is capturing every cause of action, our attorneys may be missing cases that they (and perhaps some information professionals) assume are coming through. I've also noticed that at least one of the databases will miss cases in the alerts but eventually go back and tag those cases correctly. While that helps when conducting analytics later, it still misses the point: we need to be able to count on these alerts for business development and for keeping our clients informed of new litigation of interest to them.

Analytics
I recently decided to evaluate three resources on their analytics for patent cases filed in federal district courts within a certain time period and for certain companies (making sure to account for company subsidiaries, etc.). This would seem like a fairly straightforward exercise, considering I was looking at data from the past two years and all three resources indicate that they have this coverage. Once again, the discrepancies were disconcerting. While two of the resources were pretty close in their numbers, one was significantly off and clearly missing relevant cases. Try a similar search with judge analytics and you're likely to be surprised. It's easy to foresee this playing out with a less commonly filed motion: one database identifies four motions granted and one denied, leading you to believe at first glance that the motion will probably succeed in front of that judge, while another database identifies a total of seven motions (four granted, two denied, and one granted-in-part/denied-in-part). A grant rate of four out of five (80%) suddenly looks more like four out of seven (about 57%). What looked like a likely outcome for the client doesn't look like such a sure win now.
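To make the arithmetic concrete, here is a minimal sketch (in Python, using the hypothetical motion counts from the example above) of how the apparent grant rate shifts once the missing motions are counted:

```python
# Hypothetical motion counts from the example above: one database's
# incomplete picture versus another database's fuller count.
incomplete = {"granted": 4, "denied": 1}
fuller = {"granted": 4, "denied": 2, "granted_in_part_denied_in_part": 1}

def grant_rate(outcomes):
    """Share of motions granted outright, out of all motions found."""
    return outcomes["granted"] / sum(outcomes.values())

print(f"Incomplete data: {grant_rate(incomplete):.0%} granted outright")  # 80%
print(f"Fuller data:     {grant_rate(fuller):.0%} granted outright")      # 57%
```

The point isn't the code, of course; it's that two more motions in the denominator turn a seemingly strong track record into a much shakier bet.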

Law Librarians & Information Professionals to the Rescue
This is a great opportunity for us to ask questions of the vendors and to compare databases against each other, so that when they claim the same scope and coverage of information, the search results we get back actually make sense. This isn't limited to the patent infringement arena, either; I've seen it in securities products as well. The underlying issue here is really information literacy. Most of our attorneys will assume that we've evaluated and chosen reliable, accurate resources with the coverage they need. Have we evaluated our resources to know that this is the case? With all of the analytics products coming out, do we emphasize in our training sessions that attorneys should, where possible, compare analytics across databases that track the same data, or at least take a deep dive into the analytics when relying on them for an important strategic decision in a client's case? It would be great to have one go-to place for all of our analytics needs, but we're not there yet.

4 comments:

Bob Ambrogi said...

Corrine - Have you published these evaluations anywhere? Can you identify the services you compared?

-Bob Ambrogi

Susan Nevelow Mart said...

Thank you for this work on evaluating the resources we use! Unless we expose the flaws, the algorithms won't get fixed. And knowing the potential for flawed results is now a key part of information literacy.

Corrine Latham said...

Hi Bob,

I have not published these evaluations. Initially I was just comparing the alerts for a project at the firm and was quite surprised by my findings. Your article and Susan's made me realize that sharing my findings would be a good idea too. The six resources that I evaluated for the alerts are Bloomberg, CNS, CourtLink, Court Wire, Docket Navigator, and Lex Machina. The three resources I looked at for analytics were Bloomberg, Docket Navigator, and Lex Machina.

Thanks,
Corrine

Corrine Latham said...

Thank you, Susan. I wholeheartedly agree.