Are you using data analytics to identify profitability of individuals or specific matters? Or to support or refute claims and allegations? Or to spot and exploit trends and opportunities? While definitions may vary, these are some of the things made possible by “big data,” according to legal and technical experts who shared their insights during a one-hour panel discussion hosted by LexisNexis® Litigation Solutions and developed by HB Litigation Conferences at LegalTech® New York on Jan. 30, 2013.

Speaking to a packed conference room, the panel comprised four thought leaders who were moderated by nationally recognized e-discovery expert George Socha, co-founder of EDRM.  Orrie Dinstein, Chief Privacy Leader and Senior IT & IP Counsel for GE Capital, started off by saying that “big data” is not just a large amount of data. He said there needs to be a problem to solve.

Dinstein defines big data with the “Three Vs.” First is volume: terabytes or petabytes, once-unimaginable volumes of data.

Second is velocity. Dinstein defined this with a hypothetical: a power plant has a thousand sensors, all connected to a central system, with each sensor reporting status information back at constant intervals. Taking Dinstein’s example, at a rate of a thousand readings every minute, each day would generate over 1.4 million data points, or more than 500 million a year, and that is only one source of data.
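As a quick sanity check on those numbers, the daily figure lines up with each of the thousand sensors reporting roughly once per minute (our assumption for the arithmetic, not something the panel specified):

```python
# Back-of-the-envelope check of Dinstein's power-plant hypothetical.
# Assumption (ours, not stated on the panel): each of the 1,000 sensors
# sends one status reading per minute.

SENSORS = 1_000
READINGS_PER_SENSOR_PER_MINUTE = 1
MINUTES_PER_DAY = 24 * 60  # 1,440

per_day = SENSORS * READINGS_PER_SENSOR_PER_MINUTE * MINUTES_PER_DAY
per_year = per_day * 365

print(f"{per_day:,} readings per day")    # 1,440,000 -> "over 1.4 million"
print(f"{per_year:,} readings per year")  # 525,600,000 -> over 500 million
```

Even at this modest reporting rate, a single plant crosses half a billion readings a year; a faster interval or a fleet of plants multiplies that accordingly.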

The third “V,” Dinstein said, is variety. In the e-discovery context, the parties may be gathering email, documents, Web activity, and voice mail, both structured and unstructured data. When all of this goes into a single bucket, that variety creates a big data type of issue, he said.

Browning Marean of DLA Piper gave the example of Google™ Flu Trends (www.google.org/flutrends), in which the company analyzes people’s search terms as indicators of, as Google puts it, “current flu activity around the world in near-real-time.” Another example of big data’s uses is the examination of information coming into call centers, such as details of product problems or HR complaints, data that can be mined to spot potential problems.

Chris Emerson, Director of Practice Economics at Bryan Cave, brought his technology background to the firm, applying data collection, coding, and analysis to determine, for example, whether the overtime violations alleged in an employment case were really as big an issue as the plaintiff’s attorney claimed. “We were able to show that the magnitude of the violations was much smaller than they thought, then we were able to push the case to a settlement. This also saved the client money by not having to get outside analysis,” he said.

Jon Neiditz of Nelson Mullins (now with Kilpatrick Townsend) said that if a company or firm wants to demonstrate — or tell its story — that it does not have a hostile work environment, for example, then they “need to work with this data on a privileged basis and demonstrate improvement over time. Then you have a story of improvement. If it is within a framework of a structured program it can be immensely powerful. These things have to take place in a high trust environment with a truly legitimate authority that is using that information very carefully and with consent.”