Increasingly, businesses rely on algorithms that use data provided by users to make decisions that affect people. For example, Amazon, Google, and Facebook use algorithms to tailor what users see, and Uber and Lyft use them to match passengers with drivers and set prices. Do users, customers, employees, and others have a right to know how companies that use algorithms make their decisions?
In a new analysis, researchers explore the moral and ethical foundations of such a right. They conclude that it is indeed a moral right, then address how companies might provide the explanations it requires.
The analysis, by researchers at Carnegie Mellon University, appears in Business Ethics Quarterly, a publication of the Society for Business Ethics.
“In most cases, companies do not offer any explanation about how they gain access to users’ profiles, from where they collect the data, and with whom they trade their data,” explains Tae Wan Kim, Associate Professor of Business Ethics at Carnegie Mellon University’s Tepper School of Business, who co-wrote the analysis. “It’s not just fairness that’s at stake; it’s also trust.”
In response to the rise of autonomous decision-making algorithms and their reliance on data provided by users, a growing number of computer scientists and governmental bodies have called for transparency under the broad concept of algorithmic accountability. For example, the European Parliament and the Council of the European Union adopted the General Data Protection Regulation (GDPR) in 2016, part of which regulates the use of automated algorithmic decision systems. The GDPR, which took effect in 2018, affects businesses that process the personally identifiable information of residents of the European Union.
But the GDPR is ambiguous about whether it confers a right to an explanation of how businesses’ automated algorithmic profiling systems reach decisions. In this analysis, the authors develop a moral argument that can serve as a foundation for a legally recognized version of this right.
In the digital era, the authors write, some say that informed consent—obtaining prior permission for disclosing information with full knowledge of the possible consequences—is no longer possible because many digital transactions are ongoing. Instead, the authors conceptualize informed consent as an assurance of trust for incomplete algorithmic processes.
Obtaining informed consent, especially when companies collect and process personal data, is ethically required unless overridden for specific, acceptable reasons, the authors argue. Moreover, informed consent in the context of algorithmic decision-making, especially for non-contextual and unpredictable uses, is incomplete without an assurance of trust.
In this context, the authors conclude, companies have a moral duty to provide an explanation not just before automated decision making occurs, but also afterward, so the explanation can address both system functionality and the rationale of a specific decision.
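The paper is conceptual, but this two-stage duty can be sketched in code: an ex-ante explanation describing how the system works in general, and an ex-post explanation giving the rationale for one specific decision. The scenario, feature names, and weights below are all hypothetical, chosen only to illustrate the distinction.

```python
from dataclasses import dataclass

# Hypothetical scoring weights; in a real system these would come from a model.
WEIGHTS = {"income": 0.5, "debt_ratio": -0.3, "payment_history": 0.2}
THRESHOLD = 0.5

@dataclass
class LoanDecision:
    approved: bool
    rationale: dict  # per-feature contribution to this specific decision

def system_functionality() -> str:
    """Ex-ante explanation: what the system does, offered before any decision."""
    return ("Applications are scored as a weighted sum of income, debt ratio, "
            "and payment history; scores above 0.5 are approved.")

def decide(features: dict) -> LoanDecision:
    """Ex-post explanation: the decision plus the rationale behind it."""
    contributions = {name: WEIGHTS[name] * features[name] for name in WEIGHTS}
    score = sum(contributions.values())
    return LoanDecision(approved=score > THRESHOLD, rationale=contributions)

decision = decide({"income": 1.0, "debt_ratio": 0.4, "payment_history": 0.8})
print(decision.approved)   # True (0.50 - 0.12 + 0.16 = 0.54)
print(decision.rationale)  # each feature's contribution, not just the outcome
```

Returning the per-feature contributions alongside the verdict is one way a company could address both system functionality and the rationale of a specific decision without disclosing its full model.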
The authors also delve into how companies whose businesses are built on algorithms can explain their use in a way that attracts clients while protecting trade secrets. This is an important decision for many modern start-ups, involving questions such as how much code should be open source and how extensive and exposed the application programming interface should be.
Many companies are already tackling these challenges, the authors note. Some may choose to hire “data interpreters,” employees who bridge the work of data scientists and the people affected by the companies’ decisions.
“Will requiring an algorithm to be interpretable or explainable hinder businesses’ performance or lead to better results?” asks Bryan R. Routledge, Associate Professor of Finance at Carnegie Mellon’s Tepper School of Business, who co-wrote the analysis. “That is something we’ll see play out in the near future, much like the transparency conflict of Apple and Facebook. But more importantly, the right to explanation is an ethical obligation apart from bottom-line impact.”