As technology has continued to evolve at an unprecedented rate, so have the styles of communication and public discourse on social media, for better and for worse. Colorado State University students and faculty gathered to discuss this rapidly changing landscape March 7 in CSU’s Lory Student Center.
Part of the College of Liberal Arts Democracy Summit 2025, the panel discussion, titled “Code vs. Consequence: The Tech & Policy Debate on Misinformation and Social Media,” featured the return of former CSU political science professor Dominik Stecuła, who is now an assistant professor at Ohio State University.
“His research agenda is situated primarily in the fields of political science and communication,” said Sam Houghteling, Straayer Center for Public Service Leadership program manager. “In his research, he analyzes both the supply and demand side of information and how it impacts public opinion formation, important societal issues like climate change, vaccinations … and how misinformation influences those processes.”
Moderated by student democracy fellow and political science student Ethan McGuinness, the discussion kicked off with the impacts artificial intelligence bots and deep-learning algorithms have on misinformation across the social media landscape. While research into their effects is still growing, their immediate impact is already evident.
“Really all this stuff does is not only spreads individual falsehoods, but it also fosters this epistemic uncertainty,” Stecuła said. “It erodes public trust in everything, not just the misinformation or specific information sources. It just undermines trust in all kinds of institutions, media and, you know, electoral institutions, governmental institutions, … and that’s the biggest problem.”
Misinformation is false or inaccurate information, while disinformation is false information designed to intentionally mislead through misrepresented facts. Yet actual consumption of both categories may be different from what one would assume, as illustrated by the share of Americans’ total internet browsing that includes news sites.
“Three percent,” Stecuła said. “Out of that 3%, 14% (are) websites that focus on political news specifically. So that’s a very small part of people’s information consumption, right? Fake news, misinformation is an even smaller percentage of that. … Who consumes a lot of (misinformation)? It’s essentially 1% of the population (that) consumes (the) overwhelming majority of all misinformation, right? So it’s a very skewed distribution.”
Evidence, as Stecuła explained, also shows that exposure to misinformation online does not always drastically change people’s political behavior relative to the ideologies they held before consuming it.
“The people who do seek out partisan political information tend to seek out partisan, ideologically aligned information, so the people who are already fairly polarized are the ones who are most likely to end up in an echo chamber,” Stecuła said. “Polarization has been something that’s been going on as a process before social media really became a major source of information.”
Additionally, regardless of political partisanship, online consumers of news media are rarely inclined to actively fact-check the information presented to them.
“The problem with fact checks is that the people who are exposed to misinformation usually don’t start their day by going to factcheck.org,” Stecuła said. “So where misinformation occurs and where fact checks happen are two different venues.”
When determining how to combat misinformation online, several factors must be considered, such as who gets the ultimate authority to define what is fact and what is fiction.
“So we have to think about kind of what, who are the actors that we feel comfortable with making these decisions,” Stecuła said. “And there’s no easy answer here. Do you want the government, or do you want the industry to regulate itself?”
To illustrate the nuance of this debate, Stecuła pointed attendees to Germany’s Network Enforcement Act. Passed in 2017, the law was designed to address online hate speech and extremism but instead produced unintended consequences.
“When Germany passed that law, it was definitely restricting speech,” Stecuła said. “It was not particularly transparent in terms of how they were making decisions (about) whether something is misinformation or not (and) what is fake news, and that law then itself became something that people like Vladimir Putin and other authoritarian rulers then said, ‘See, Germany is passing this law. We’re going to pass a law similar to that. We’re also going to crack down on misinformation,’ but in their context, misinformation was just essentially trailing journalists, shutting down speech and … trying to mute political dissent.”
Another way to combat the spread of misinformation is to build more fact-checking features into social media sites. When present, these contextual notes have been found to make media consumers pause and consider the evidence in front of them.
“That’s friction, right?” Stecuła said. “That’s slowing you down. … And then research shows that slowing you down helps because when you’re kind of on cruise control, that’s when all of your bad partisan instincts kick in. So the moment you slow down, that tends to work. So there’s kind of platform design things that can be introduced.”
Social media users can also advocate for themselves by diversifying their news sources beyond social media alone, both by curating their feeds and by seeking out traditional publications.
Only through constructive, intentional dialogue will combating misinformation and overcoming partisan divisions be possible.
“To quote, like, Marshall McLuhan: ‘(The) medium is the message,’ and, like, social media as a medium is just not a very conducive forum for really thoughtful dialogue or conveying information in a way that’s really meaningful, right?” Stecuła said. “It’s convenient. Short videos are fun, but I think, like, just spending less time on social media, or at least not thinking you’re being informed on social media (is crucial).”
This sentiment was echoed by political science and international studies student Kaitlyn Spencer.
“There will always be misinformation, but there are ways and practices that we can implement that will slow down the spread of misinformation,” Spencer said. “So there is some hope there, but it’s a very slippery slope to navigate, and I’d say, like, my biggest thing that I learned today is … (to) be patient with people that are definitely spreading this misinformation.”
Only through intentional, communal action will real change — both in the digital and physical realms — be possible.
“I think the takeaway is that we each need to do our own part to break out of that doom loop of political polarization (and) misinformation, and, you know, really do what we can to seek out that quality news to help you ensure that we continue to live in a really healthy democracy, and that we continue to be stewards of that,” McGuinness said.
Reach Katie Fisher at news@collegian.com or on social media @CSUCollegian.