Do You Need the Government to Tell You How to be Ethical with Big Data?

  By Kristin Casler, featuring Tim Keller of Lindquist & Vennum LLP and Robert Carver of Brandeis International Business School

 

Regulators from all walks of government are eager to exert some control over data use and privacy. While companies wait for guidance or restrictions, experts agree there is a good deal that businesses can do on their own to use data ethically while still achieving their objectives. Because the implications of data analytics extend beyond the collection of personal information into many unexpected areas of business, corporate leaders need to keep their eyes on their technologies.

“The law is always behind the technology. Until the law catches up, we have to work with our understanding of the technology and an understanding of how regulators have previously addressed the kinds of issues the new technologies raise,” said Tim Keller of Lindquist & Vennum. 

For guidance, Robert Carver, an adjunct professor at Brandeis International Business School, went back to some of Supreme Court Justice Louis Brandeis’ writings on privacy. While they were generated to address concerns over photography and newspapers, he found they’re still relevant today. Carver said companies should discuss data use issues with a focus on their own internal values and their reputations. 

“As in Brandeis’ time, technological innovation can present new privacy and ethical challenges,” Carver said. “There are already some pretty well-established and relevant codes of ethics; the scale of the data and applications are just new.” 

 

Personal Data

Most people know that when they buy something online or use a search engine to find an article, their data is being analyzed and stored. But most don’t know how long it’s stored (probably long after it is valid) or the myriad ways it might be used. With data analytics, different data sources can be combined and used to sift information in ways that were unimaginable when people consented to providing their data, Carver said. Sometimes the combination and sifting is benign, and sometimes, not so much.

It makes sense for companies to ask whether the current privacy standards, definitions and permissions are adequate, Carver said. Companies should try to be open and honest about how they use data. 

At the same time, companies don’t want to stifle the advances in criminal justice and public health or the economic boosts that are made possible by data analytics.

Part of the answer may be greater public education about data availability and use, Carver said. Coming up with meaningful language that people could actually comprehend would be one giant, ethical step.

 

The Spectrum of Risk

Industry experts talk about the spectrum of risk, Carver said. At the bottom are companies like Netflix and Amazon that analyze customer data to make recommendations in order to increase profits. If they’re way off base, the risk is small. 

Using similar predictive methods, a pharmacy may recommend products or even a flu shot based on a customer’s age and gender. It’s still not a tremendous risk. But what if someone is applying for a mortgage, and the broker’s analytical software says the applicant’s profile looks like those of other people who have defaulted in the past, so the broker denies the loan? Or what if the police think a profile has too many similarities to child abusers, and they initiate an investigation?

 

And it may not be too great a leap for an entrepreneur to take the data used to make movie recommendations and find a clever new, unanticipated use for it.

The consequences of an error are quite different from one end of the spectrum to the other, Carver said. “Where are the safeguards?”

“We worry about people with nefarious motives—hackers looking for credit card data, corporate spies seeking intellectual property. Maybe for corporate counsel, the equally large concern should be protecting their organizations from unintended consequences that result from data use,” Carver said.

 

Beyond Personal Data

Data—not just personal data—is integral to so many aspects of every type of business. We tend to think of data issues as privacy issues. So, if you don’t work in a consumer-focused business, you might not give much thought to the other data-related issues businesses face. But data issues are increasingly unavoidable, and those other issues can be big ones, Keller said. 

As businesses begin to rely on externally sourced data, there can be an impact on their intellectual property rights, Keller said. When you use data to create data, there can be questions about ownership. Businesses that license data from outside sources for design, research and development know that they need to look carefully at their agreements with their data suppliers.  But how does the data vendor interpret things as simple as confidentiality provisions?  Does the vendor think their confidential information includes the results of your use of the vendor’s data?  As the value of data increases, data vendors are likely to become more particular about such things.  

Then there’s the problem of using bad data to create unsafe products that get into the marketplace. “As technology enhances our ability to do things, it enhances our ability to do things badly,” Keller said. “The faster you drive, the harder you hit the abutment.”

 

New Privacy Issues

Data and privacy issues are arising in areas where they haven’t been considered in the past.  In mergers and acquisitions, we now need to consider the data that is involved, Keller said. How are you going to deal with a merger of two businesses that have collected personal information under two different privacy policies?  How do you deal with different assumptions about how that is to be treated? Deal makers should be asking about data governance policies and practices and consider how data held by each party will be handled after the transaction, Keller said.  

“This is a big issue, and regulators have only recently started to address it,” Keller said. 

Dealing with data that you know is personal can be hard enough, but what about data that people would generally consider private yet is discovered through the analysis of public information? Such data is now coming under scrutiny, Keller said. He noted that European Union regulators are considering the privacy implications of such data.

There’s also the law of unintended consequences. The EU sponsored a study about the pressure being put on fisheries, looking anonymously at what boats were catching. This was useful for science. But because each boat submits a manifest, the data would also show how hard the workers on each boat were working. That sent up a privacy red flag, Keller said.

 

Using Data for the Good and the Bad

Data analytics takes information you’ve had for years and lets you do more with it. You now have the ability to know things that you simply couldn’t before. Keller pointed to a Federal Trade Commission workshop on Big Data that focused a lot on consumer protection issues.  Those issues could include the ability to use data to discriminate in matters such as insurance, lending and employment. It was abundantly clear that regulators are going to be all over this, he said. 

A great deal of positives also come out of new data-analysis tools and techniques, Keller said. Health care is a good example.  One researcher looking at a visualization of data from a study of *** cancer survivors spotted a pattern in the data that revealed previously unknown populations of cancer survivors.  The data had been analyzed for years, but new data tools made this new discovery possible.

 

Commence a More Careful Treatment of Data

The first step in the data diligence process should be examining your company’s data governance policies and practices. 

In-house counsel should take advantage of their position as members of the inside team to learn about the company’s data practices. You don’t have to know much about the technology to get started.  Your inquiry will lead you in the right direction. 

“It all circles back to having data governance policies that are current with technology and the data used, no matter where it comes from, so the business is working in a manner consistent with internal policy and anticipated regulations,” Keller said. 

Remember, data analytics is a very enabling technology. It enables the good and the bad. It is imperative, Keller said, to ramp up caution at the same time we ramp up capabilities. Failure to do so could harm third parties as well as your organization’s financial position and image in the world marketplace. 

 

Disclaimer: The views and opinions expressed in this article are those of the individual sources referenced and do not reflect the views, opinions or policies of the organizations the sources represent.