
via Rensselaer Polytechnic Institute
Computer scientists propose systemic changes in automatic content curation
As the volume of available information expands, the fraction any one person can absorb shrinks. People end up retreating into a narrow slice of thought, becoming more vulnerable to misinformation, and polarizing into isolated enclaves of competing opinions. To break this cycle, computer scientists say we need new algorithms that prioritize a broader view over catering to consumers' biases.
“This is a call to arms,” said Boleslaw Szymanski, a professor of computer science at Rensselaer Polytechnic Institute. “Informed citizens are the foundation of democracy, but the interest of big companies, who supply that information, is to sell us a product. The way they do that on the internet is to repeat what we showed interest in. They’re not interested in a reader’s growth; they’re interested in the reader’s continued attention.”
Szymanski and colleagues at the University of Illinois at Urbana-Champaign, the University of California, Los Angeles, and the University of California, San Diego, explore this troubling “paradox of information access” in a paper published on arXiv.org.
“You would think that enabling everybody to be an author would be a blessing,” said Szymanski, an expert in social and cognitive networks, with previous work that includes findings on the power of a committed minority to sway outcomes. “But the attention span of human beings is not prepared for hundreds of millions of authors. We don’t know what to read, and since we cannot select everything, we simply go back to the familiar, to works that represent our own beliefs.”
Nor is the effect entirely unprecedented, said Tarek Abdelzaher, a professor and the University of Illinois at Urbana-Champaign lead on the project.
“It’s not the first time that affordances of connectivity and increased access have led to polarization,” said Abdelzaher. “When the U.S. interstate freeway system was built, urban socioeconomic polarization increased. Connectivity allowed people to self-segregate into more homogenous sprawling neighborhoods. The big question this project answers is: how to undo the polarizing effects of creating the information super-highway?”
The effect is exacerbated when our own human limitations are combined with information curation systems that maximize “clicks.”
To disrupt this cycle, the authors contend that the algorithms that provide a daily individualized menu of information must be changed from systems that merely “give consumers more of what these consumers express interest in.”
The authors propose adapting a technique long used in conveying history, which is to provide a tighter summation for events further back from the present day. They call this model for content curation “a scalable presentation of knowledge.” Algorithms would shift from “extractive summarization,” which gives us more of what we consumed in the past, to “abstractive summarization,” which increases the proportion of available thought we can digest.
“As long as you do balance content, you can cover more distant knowledge in much less space,” said Szymanski, who is also the director of a Network Science and Technology Center at Rensselaer. “Although readers have a finite attention span, they still have a slight knowledge in new areas, and then they can choose to shift their attention in a new direction or stay the course.”
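The paper stops short of specifying an algorithm for such a curator, but the idea can be sketched. The short Python example below is a hypothetical illustration only: the topic names, the notion of a topic's “distance” from a reader's recent interests, and the 1/(1 + distance) weighting are assumptions made for this sketch, not anything proposed by the authors. It shows how a fixed attention budget might be spread so that distant topics are compressed into brief abstracts rather than dropped altogether.

```python
# Hypothetical sketch of a "scalable presentation of knowledge" curator.
# The topic names and the 1/(1 + distance) weighting are assumptions for illustration.

from dataclasses import dataclass


@dataclass
class Topic:
    name: str
    distance: int        # how far the topic sits from the reader's recent interests (0 = familiar)
    full_text_len: int   # words of full coverage available for the topic


def allocate_budget(topics: list[Topic], attention_budget: int) -> dict[str, int]:
    """Split a fixed attention budget (in words) across every topic.

    Weight falls off with distance, so familiar topics get detailed coverage
    while distant ones still receive a short abstract instead of vanishing,
    much as histories compress events further back in time.
    """
    weights = [1.0 / (1 + t.distance) for t in topics]
    total = sum(weights)
    return {
        t.name: min(t.full_text_len, max(1, int(attention_budget * w / total)))
        for t, w in zip(topics, weights)
    }


if __name__ == "__main__":
    feed = [
        Topic("local politics", distance=0, full_text_len=1200),
        Topic("macroeconomics", distance=2, full_text_len=900),
        Topic("climate science", distance=4, full_text_len=1500),
    ]
    print(allocate_budget(feed, attention_budget=600))
```

Under a click-maximizing curator, the distance-0 topic would crowd out the rest; here every topic keeps a foothold in the reader's feed, just at decreasing levels of detail.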
Few analytical models exist to measure the trend toward what the authors call “ideological fragmentation in an age of democratized global access.” But one, which the authors considered, treats individuals as “particles in a belief space” — almost like a fluid — and measures their changing positions based on the change in content they share over time. The model “confirms the emergence of polarization with increased information overload.”
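To make that framing concrete, the toy Python simulation below treats agents as particles on a one-dimensional belief axis: each agent shares content near its own position and reads only the few shared items closest to its current belief. It is a minimal sketch of the particles-in-a-belief-space idea, not the authors' model; the parameters, the Gaussian sharing noise, and the use of a simple standard deviation as a polarization proxy are all assumptions made for this illustration.

```python
# Minimal, hypothetical sketch of a "particles in a belief space" simulation.
# Not the model analyzed in the paper; parameters and update rule are assumptions.

import random


def simulate(n_agents=200, steps=50, attention=5, pull=0.1, noise=0.05, seed=0):
    rng = random.Random(seed)
    beliefs = [rng.uniform(-1.0, 1.0) for _ in range(n_agents)]
    for _ in range(steps):
        # Each agent shares one item of content close to its own belief.
        shared = [b + rng.gauss(0.0, noise) for b in beliefs]
        for i, b in enumerate(beliefs):
            # Read only the `attention` shared items nearest one's own belief ...
            read = sorted(shared, key=lambda x: abs(x - b))[:attention]
            # ... and drift slightly toward their average position.
            beliefs[i] = b + pull * (sum(read) / len(read) - b)
    return beliefs


def spread(beliefs):
    """Standard deviation of belief positions, a crude stand-in for a polarization measure."""
    mean = sum(beliefs) / len(beliefs)
    return (sum((b - mean) ** 2 for b in beliefs) / len(beliefs)) ** 0.5


if __name__ == "__main__":
    for attention in (200, 20, 5):  # from broad reading to heavy filtering
        print(f"attention={attention:3d}  spread={spread(simulate(attention=attention)):.3f}")
```

In this crude setup, shrinking the attention parameter (a stand-in for heavier information overload) tends to keep beliefs scattered where they started, while broader reading pulls them toward common ground, a qualitative echo of the pattern the article describes.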
The more ideologically isolated and polarized we are, the more vulnerable we are to disinformation tailored to reinforce our own biases. Szymanski and his colleagues offer a slew of technical solutions, including better data provenance and misinformation-detection algorithms such as internal consistency reasoning, background consistency reasoning, and intra-element consistency reasoning tools.
“The very sad development discussed in this paper is that today, people are not conversing with each other. We are living in our own universe created by the data which is coming from these summarization systems, data that confirms our innate biases,” Szymanski said. “This is a big issue which we face as a democracy, and I think we have a duty to address it for the good of society.”
Szymanski and his co-authors are working on mathematical models that both measure the extent of polarization in various media and predict how trends would change under various mitigating strategies.