The Basics of Ethical Data Management for Technologists

If the idea of “ethical data management” sounds academic, even dull, ask yourself this: Would it bother you to learn that healthcare providers share your medical records with outside organizations, whether they tell you about it or not?

It shouldn’t, say companies such as Google, IBM, Microsoft and Amazon, all of which work with data from millions of patients gathered by the hospitals where they received care. It’s also perfectly legal, Lisa Bari, a consultant and former healthcare technology executive, told The Wall Street Journal. Despite all those HIPAA forms you fill out at the doctor’s office, patient information can be shared as long as federal privacy laws are met. “The data belongs to whoever has it,” Bari said.

The idea that consumers are protected as long as businesses follow the law offers little comfort to many people. In a separate article, the Journal said employees at Ascension, a St. Louis-based hospital chain, questioned the way records were collected and shared with Google as the search giant worked on a healthcare solution.

Today, at least 150 Google employees have access to medical data from millions of patients across 21 states, including lab results, diagnoses and hospitalization records. The data adds up to “a complete health history, including patient names and dates of birth,” the Journal said. None of the patients or physicians involved were notified.

If you’re one of the 15.1 million people who’ve visited an Ascension physician or clinic, does this still seem academic and dull? Data professionals don’t think so.

Data, Ethics and the Real World

Ethical data management “is a huge subject,” said Ben Yurchak, president of KnowClick, a business analytics firm in Bryn Mawr, Pa. The discussion encompasses everything from how information is captured to who’s analyzing it and for what purpose.

Yurchak suggests that any business using data should provide “not just ‘legalese’ disclosure, but disclosure from a [view of] let’s really simplify this, let’s make really clear what we’re capturing on people and how we’re using it.”

That kind of commitment has to be made on two levels, experts suggest. Organization-wide efforts, such as codes of conduct and other policies, are obviously important. But tech and data professionals also have a role to play. “It’s incumbent on anybody in this field to keep in mind what protections they have to enforce on themselves in their own practices,” said Jeffrey Lyons, principal of Performance Advantage Technologies, a research-based assessment firm in Plano, Texas.

For example, Lyons has collected data on employee performance since 1991, but he’s never shared the core database. “I made the choice that the end user, the participant, has to be respected in all cases,” he said. When customers ask for raw data, “I simply tell them it would be a violation of my participants’ rights to share information that they did not consent to share.”

That’s especially important when data use affects people at the individual level. “If you’re capturing a lot of data on people and you’re just using it to make larger-scale business decisions, people generally don’t worry about that,” Yurchak said. “But if it comes down to me and you’re now increasing my credit card fees or denying a loan or aggressively marketing certain products, I’m going to get pretty uncomfortable unless I know what data you have.”

This means technologists shouldn’t necessarily wait for their employer to set direction or unthinkingly go along with data policies, many data professionals say. When working on projects that involve data, it’s important “to open the questions up internally and be brave enough to challenge your own organization’s practices,” Lyons said.

Organizations, he noted, often have the luxury of keeping discussions on sensitive matters to a purely business level. Tech and data pros, on the other hand, often discuss issues more openly: “We, as the producers of that technology, need to use our innate sense of right and wrong and generate some dialogue amongst ourselves about how our organization is using our talents.” 

Of course, that’s often easier said than done. Even in today’s tight labor market, few employees enjoy going toe-to-toe with their managers and department heads over matters of policy. But Lyons believes technologists can encourage their bosses to put data and ethics issues on meeting agendas or form multi-functional task forces to explore the implications of different policies and approaches.

In doing this, employees have a strong business argument to make: If customers become frustrated by how a company handles their data, “they’re going to rebel,” Lyons said. Both internally and within industry organizations, technologists can explore ways to build customer trust.

“There’s a big incentive that if we’re better-trusted than our competitors, we’re going to gain business over them because consumers want that trust,” Lyons added.

Who Wants Regulation?

A number of tech companies have already developed codes of ethics to govern their approach to the cycle of data collection, analysis and presentation, executives say. Such efforts are “a good first step,” said Montra Ellis, senior director of product innovation for Weston, Fla.-based Ultimate Software. In addition to developing codes of ethics, she added, many firms are working with officials to develop “thoughtful” regulatory policies.

However, the idea of regulation makes some people, like Lyons, nervous. “Can we regulate data management without destroying the engine that’s driving innovation in data management?” he asked. Although the EU adopted its General Data Protection Regulation, or GDPR, to address privacy and disclosure concerns, he’s not convinced a mandated approach will work in the U.S.

“Our free enterprise approach and our innovative approach to data management is saying we don’t want to throw the baby out with the bathwater because there’s many other contingencies to these decisions besides protecting rights,” Lyons said. “There are many cases where consumers would be hurt more by regulation and bureaucracy than they would be by allowing the system to correct itself.”

In the end, Lyons believes businesses have an innate motivation to treat data in an ethical way: “It’s just plain good business… When you irritate the consumer, you create a horror story for yourself down the road as that blossoms out into ‘this company is not ethical.’ No company can afford that.”