This is interesting. Earlier today, Twitter chief Elon Musk replied to a commentator who questioned why he was seeing more ‘right wing’ posts in his feed.
People on the right should see more “left wing” stuff and people on the left should see more “right wing” stuff. But you can just block it if you want to stay in an echo chamber.
— Elon Musk (@elonmusk) January 16, 2023
Now, this is not a definitive policy change, nor an update that Musk is confirming. But it does seem to suggest that Twitter may, at the least, be looking to show people more opposing political commentary in their tweet feeds, as a means to spark broader awareness and engagement. Musk also pinned the tweet, which adds a little more weight to the suggestion.
Which is interesting, because, as various studies have shown, this approach simply doesn’t work.
Back in 2020, Meta executive Andrew Bosworth published a long blog post on the challenges of political polarization on social networks, and Facebook’s experience in dealing with them.
Bosworth explained that, while Facebook had, at different times, tried to show users more content from both sides of the political spectrum, the user response had been the opposite of the intended effect.
As per Bosworth:
“Ask yourself how many newspapers and news programs people read/watched before the internet. If you guessed ‘one and one’ on average you are right, and if you guessed those were ideologically aligned with them you are right again. The internet exposes them to far more content from other sources (26% more on Facebook, according to our research). This is one that everyone just gets wrong. The focus on filter bubbles causes people to miss the real disaster which is polarization. What happens when you see 26% more content from people you don’t agree with? Does it help you empathize with them as everyone has been suggesting? Nope. It makes you dislike them even more.”
Within this, Bosworth essentially acknowledges that Facebook usage has indeed amplified political division, though not in the way that many expect – i.e. by showing you more and more posts that align with your established beliefs. Bosworth says that Facebook users actually ended up seeing a lot more opposing viewpoints, but that only exacerbated political divides, because these posts drove more angst, and further embedded opposition, as opposed to opening people’s minds to another way of thinking.
Indeed, that same year, in a speech at the Munich Security Conference, Meta CEO Mark Zuckerberg explained:
“People are less likely to click on things and engage with them if they don’t agree with them. So, I don’t know how to solve that problem. That’s not a technology problem as much as it is a human affirmation problem.”
Shortly after this, in January 2021, Meta announced its intention to reduce the amount of political content in user feeds.
As per Zuckerberg (on Meta’s Q4 ‘20 earnings call):
“One of the top pieces of feedback we’re hearing from our community right now is that people don’t want politics and fighting to take over their experience on our services.”
People were getting sick of the angst and bickering on Facebook, which was causing them to log on less and less, so Meta sought to reduce political content, in favor of more enjoyable experiences.
In fact, as per more recent reports, Zuckerberg actually directed his engineering teams to effectively cut political content out of people’s News Feeds altogether. Which Facebook users also didn’t like.
Meta has since scaled back politics in-feed, but it’s stopped short of eliminating it.
As reported by The Wall Street Journal:
“Meta now estimates politics accounts for less than 3% of total content views in users’ newsfeed, down from 6% around the time of the 2020 election, the documents show. But instead of reducing views through indiscriminate suppression or heavy-handed moderation, Facebook has altered the newsfeed algorithm’s recommendations of sensitive content toward what users say they value, and away from what simply makes them engage, according to documents and people familiar with the efforts.”
In other words, Meta’s not showing people as many divisive, incendiary posts - which likely means that it’s not looking to highlight as much content from the opposite side of the political spectrum.
Which may, as Musk notes, keep users in their echo chamber. But the research and experiments show that users simply don’t want the constant provocation and angst, which, eventually, sees them use the app less.
On Twitter, that’ll likely drive users to switch over to the ‘Following’ feed instead of the ‘For You’ main listing, which includes recommended tweets, as chosen by Twitter’s systems. If Musk and Co. are indeed pushing more politics into this stream, these past experiments suggest that it won’t work out as they might hope – though in theory, you can see why Musk wants to expand people’s horizons, and get them to see more content from the other side of the political divide.
Or he just wants to promote his own political opinions, and get more people to see things from his perspective.
It’s difficult to understand the full motivations in this respect, particularly given Musk’s overt political leanings and opinions. But in essence, it looks like another idea that makes sense from an ideological view, but doesn’t work in reality - and we have a heap of studies and data to underline this.
Still, Musk has shown that he’s going to go his own way, even if that means challenging established concepts in order to test them for himself.
Maybe it works out differently on Twitter, but it seems like a risky move, especially when you’re trying to maximize discovery and engagement within that main ‘For You’ feed.
But if you start to notice more political content in your tweet feed, this is probably why.