Increasingly, businesses rely on algorithms that use data provided by users to make decisions that affect people. For example, Amazon, Google, and Facebook use algorithms to tailor what users see, and Uber and Lyft use them to match passengers with drivers and set prices. Do users, customers, employees, and others have a right to know how companies that use algorithms make their decisions?
In a new analysis, researchers explore the moral and ethical foundations of such a right. They conclude that users have a moral right to an explanation and then address how companies might provide one.
The analysis, by researchers at Carnegie Mellon University, appears in Business Ethics Quarterly, a publication of the Society for Business Ethics.
“In most cases, companies do not offer any explanation about how they gain access to users’ profiles, from where they collect the data, and with whom they trade their data,” explains Tae Wan Kim, Associate Professor of Business Ethics at Carnegie Mellon University’s Tepper School of Business, who co-wrote the analysis. “It’s not just fairness that’s at stake; it’s also trust.”
In response to the rise of autonomous decision-making algorithms and their reliance on data provided by users, a growing number of computer scientists and governmental bodies have called for transparency under the broad concept of algorithmic accountability. For example, the European Parliament and the Council of the European Union adopted the General Data Protection Regulation (GDPR) in 2016, part of which regulates the use of automatic algorithmic decision systems. The GDPR, which took effect in 2018, affects businesses that process the personally identifiable information of residents of the European Union.
But the GDPR is ambiguous about whether it confers a right to explanation regarding how businesses’ automated algorithmic profiling systems reach decisions. In this analysis, the authors develop a moral argument that can serve as a foundation for a legally recognized version of this right.
In the digital era, the authors write, some say that informed consent—obtaining prior permission for disclosing information with full knowledge of the possible consequences—is no longer possible because many digital transactions are ongoing. Instead, the authors conceptualize informed consent as an assurance of trust for incomplete algorithmic processes.
Obtaining informed consent, especially when companies collect and process personal data, is ethically required unless overridden for specific, acceptable reasons, the authors argue. Moreover, informed consent in the context of algorithmic decision-making, especially for non-contextual and unpredictable uses, is incomplete without an assurance of trust.
In this context, the authors conclude, companies have a moral duty to provide an explanation not just before automated decision making occurs, but also afterward, so the explanation can address both system functionality and the rationale of a specific decision.
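The authors' distinction between explaining system functionality (before a decision) and explaining the rationale of a specific decision (afterward) can be illustrated with a small sketch. This is a hypothetical toy example, not the authors' method: the decision rule, feature names, and weights are all invented for illustration.

```python
# Hypothetical toy decision system illustrating the two kinds of explanation
# the authors distinguish. All names, weights, and thresholds are invented.

WEIGHTS = {"income": 0.5, "debt": -0.7, "tenure": 0.2}  # hypothetical model
THRESHOLD = 0.0

def system_functionality() -> str:
    """Ex-ante explanation: how the system works, before any decision is made."""
    factors = ", ".join(f"{k} (weight {v:+.1f})" for k, v in WEIGHTS.items())
    return f"Decisions are a weighted sum of {factors}; approve if the sum exceeds {THRESHOLD}."

def decide_and_explain(applicant: dict) -> tuple[bool, str]:
    """Ex-post explanation: the rationale behind one specific decision."""
    contributions = {k: WEIGHTS[k] * applicant[k] for k in WEIGHTS}
    score = sum(contributions.values())
    top = max(contributions, key=lambda k: abs(contributions[k]))
    approved = score > THRESHOLD
    rationale = (f"{'Approved' if approved else 'Denied'}: score {score:.2f}; "
                 f"most influential factor was {top} ({contributions[top]:+.2f}).")
    return approved, rationale

approved, why = decide_and_explain({"income": 1.0, "debt": 1.2, "tenure": 0.5})
```

The point of the sketch is that the ex-ante explanation can be published once, while the ex-post rationale must be generated per decision; real systems are far more complex, but the moral duty the authors describe covers both.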
The authors also delve into how companies whose businesses are built on algorithms can explain their use in a way that attracts clients while preserving trade secrets. This is an important decision for many modern start-ups, raising such questions as how much code should be open source and how extensive and exposed the application programming interface should be.
Many companies are already tackling these challenges, the authors note. Some may choose to hire “data interpreters,” employees who bridge the work of data scientists and the people affected by the companies’ decisions.
“Will requiring an algorithm to be interpretable or explainable hinder businesses’ performance or lead to better results?” asks Bryan R. Routledge, Associate Professor of Finance at Carnegie Mellon’s Tepper School of Business, who co-wrote the analysis. “That is something we’ll see play out in the near future, much like the transparency conflict of Apple and Facebook. But more importantly, the right to explanation is an ethical obligation apart from bottom-line impact.”