
via Rensselaer Polytechnic Institute
Computer scientists propose systemic changes in automatic content curation
As the volume of available information expands, the fraction any one person is able to absorb shrinks. People end up retreating into a narrow slice of thought, becoming more vulnerable to misinformation, and polarizing into isolated enclaves of competing opinions. To break this cycle, computer scientists say we need new algorithms that prioritize a broader view over catering to consumer biases.
“This is a call to arms,” said Boleslaw Szymanski, a professor of computer science at Rensselaer Polytechnic Institute. “Informed citizens are the foundation of democracy, but the interest of big companies, who supply that information, is to sell us a product. The way they do that on the internet is to repeat what we showed interest in. They’re not interested in a reader’s growth; they’re interested in the reader’s continued attention.”
Szymanski and colleagues at the University of Illinois at Urbana-Champaign, the University of California, Los Angeles, and the University of California, San Diego, explore this troubling “paradox of information access” in a paper published on arXiv.org.
“You would think that enabling everybody to be an author would be a blessing,” said Szymanski, an expert in social and cognitive networks, with previous work that includes findings on the power of a committed minority to sway outcomes. “But the attention span of human beings is not prepared for hundreds of millions of authors. We don’t know what to read, and since we cannot select everything, we simply go back to the familiar, to works that represent our own beliefs.”
Nor is the effect entirely unprecedented, said Tarek Abdelzaher, a professor at the University of Illinois at Urbana-Champaign and lead on the project there.
“It’s not the first time that affordances of connectivity and increased access have led to polarization,” said Abdelzaher. “When the U.S. interstate freeway system was built, urban socioeconomic polarization increased. Connectivity allowed people to self-segregate into more homogenous sprawling neighborhoods. The big question this project answers is: how to undo the polarizing effects of creating the information super-highway?”
The effect is exacerbated when our own human limitations combine with information curation systems that maximize “clicks.”
To disrupt this cycle, the authors contend that the algorithms delivering each person’s daily, individualized menu of information must do more than merely “give consumers more of what these consumers express interest in.”
The authors propose adapting a technique long used in conveying history: providing a tighter summation of events the further back they lie from the present day. They call this model for content curation “a scalable presentation of knowledge.” Algorithms would shift from “extractive summarization,” which gives us more of what we consumed in the past, to “abstractive summarization,” which increases the proportion of available thought we can digest.
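As a rough illustration of that idea (a sketch under our own assumptions, not an algorithm from the paper, and with hypothetical names such as scaled_budgets), the snippet below splits a reader’s fixed attention budget across items so that older or more distant material is compressed into progressively shorter summaries while still remaining present:

```python
# Illustrative sketch of "scalable presentation of knowledge": older items
# receive tighter summaries, but nothing drops out of view entirely.
import math

def scaled_budgets(item_ages_days, total_words=1000, min_words=20):
    """Split a fixed word budget across items, compressing older items more.

    Weights decay logarithmically with age, so recent items get fuller
    treatment while distant ones are still represented in brief.
    """
    weights = [1.0 / math.log2(2 + age) for age in item_ages_days]
    total_weight = sum(weights)
    return [max(min_words, round(total_words * w / total_weight)) for w in weights]

if __name__ == "__main__":
    ages = [0, 1, 7, 30, 365, 3650]  # days since publication
    for age, words in zip(ages, scaled_budgets(ages)):
        print(f"{age:>5} days old -> ~{words} words of summary")
```

Running the sketch, an item from a decade ago still gets a few dozen words while today’s items get several hundred, which is the spirit of covering “more distant knowledge in much less space.”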
“As long as you balance content, you can cover more distant knowledge in much less space,” said Szymanski, who is also the director of the Network Science and Technology Center at Rensselaer. “Although readers have a finite attention span, they still gain a slight knowledge of new areas, and then they can choose to shift their attention in a new direction or stay the course.”
Few analytical models exist to measure the trend toward what the authors call “ideological fragmentation in an age of democratized global access.” But one, which the authors considered, treats individuals as “particles in a belief space” — almost like a fluid — and measures their changing positions based on the change in content they share over time. The model “confirms the emergence of polarization with increased information overload.”
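To make the particle picture concrete, here is a toy simulation built on our own simplified assumptions rather than the paper’s equations: agents sit on a one-dimensional belief axis, and each round every agent reads only the k shared items closest to its current position (a crude stand-in for limited attention under overload) and drifts toward their average.

```python
# Toy "particles in a belief space" model (illustrative only): narrow exposure
# (small k) preserves fragmented local clusters; broad exposure (large k)
# pulls beliefs toward consensus.
import random

def simulate(num_agents=200, k=5, steps=200, step_size=0.1, seed=0):
    rng = random.Random(seed)
    beliefs = [rng.uniform(-1.0, 1.0) for _ in range(num_agents)]
    for _ in range(steps):
        shared = list(beliefs)  # content shared this round mirrors current beliefs
        new_beliefs = []
        for b in beliefs:
            nearest = sorted(shared, key=lambda x: abs(x - b))[:k]
            target = sum(nearest) / len(nearest)
            new_beliefs.append(b + step_size * (target - b))
        beliefs = new_beliefs
    return beliefs

def spread(beliefs):
    mean = sum(beliefs) / len(beliefs)
    return (sum((b - mean) ** 2 for b in beliefs) / len(beliefs)) ** 0.5

if __name__ == "__main__":
    for k in (3, 20, 200):  # from narrow to broad exposure
        print(f"k={k:<3} belief spread after simulation: {spread(simulate(k=k)):.3f}")
```

With small k the spread of beliefs barely shrinks and agents settle into separate clusters, while with k equal to the whole population the spread collapses toward zero, echoing the finding that overload combined with narrow selection sustains polarization.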
The more ideologically isolated and polarized we are, the more we are vulnerable to disinformation tailored to reinforce our own biases. Szymanski and his colleagues offer a slew of technical solutions to reduce misinformation, including better data provenance and algorithms that detect misinformation, such as internal consistency reasoning, background consistency reasoning, and intra-element consistency reasoning tools.
“The very sad development discussed in this paper is that today, people are not conversing with each other. We are living in our own universe created by the data coming from these summarization systems, data that confirms our innate biases,” Szymanski said. “This is a big issue which we face as a democracy, and I think we have a duty to address it for the good of society.”
Szymanski and his co-authors are working on mathematical models that both measure the extent of polarization in various media and predict how trends would change under various mitigation strategies.