By MJ Petroni and Jessica Long, with Steven Tiell, Harrison Lynch and Scott L. David
Produced as part of the Accenture Data Ethics research initiative and shared under Creative Commons.

As the capabilities of data analytics advance, the risks grow for those whose data is collected. With each new advance, the likelihood increases that previously anonymized data can be de-anonymized. Biases introduced through algorithm selection, training data, and hypothesis testing can carry through into automated decision-making.

Analytics can uncover information that was previously unavailable: in some cases, it is already possible for governments to use big data analytics to discover crimes that would otherwise have remained hidden. What should be done with that information? Is the question easier to answer when the culprits are terrorists or sex offenders? What if the government in question is an oppressive regime and the crime is violating a censorship law? It is difficult even to imagine the potential harm of unintended consequences in these areas, let alone to take active steps to prepare for that harm, mitigate it, and recover from it.


One prudent approach to minimizing the potential for harm is to obtain informed consent from the individuals who disclose data. With the increasing presence of (and reliance on) digital technologies, it is critical that individuals understand what they are consenting to when they share data. It is equally important to help designers and developers minimize unintended harm from the use of that data. In the field of data science, practitioners must integrate ethical decision-making into discussions about data in order to avoid the unforeseen risks that arise as data moves through a wide diversity of systems. Ethical behavior in this context concerns the treatment, protection, and transformation of data moving between systems (“data in motion”), not just recorded, static data (“data at rest”). This paper explores how informed consent might be addressed in a world of “data in motion” to help apply the doctrine of doing no harm in the digital age.