By Peter Ramsey

16 Apr 22


Elon's Twitter, I mean X


You've probably heard: Elon Musk is trying to buy Twitter, with the rationale that it's a philanthropic move to preserve and promote democracy. 

Whether or not he's successful, it's encouraged conversations about free speech, the role of social media and how a more transparent Twitter could theoretically work.

Undoubtedly, a major asset in Elon's playbook will be the user experience: not necessarily improving it through the typical lens of conversions, but with the goal of transparency and understanding.

So how could his incarnation of Twitter look? And how could clever UX help promote transparency and freedom of speech?

What's covered:

  1. 👀 Promoting and demoting tweets

  2. 📝 The open-source algorithm

  3. 🧠 Better tools to block users

  4. 🔕 Trending transparency

Let's dive in:

1. Anti-viral notice

One of Musk's criticisms of the status quo is that Twitter silently promotes or demotes content through a mixture of algorithmic and human decisions.

At some point, someone, for some reason, decided to promote Tweet X, and to stop Tweet Y from showing up in other people's feeds.

His suggestion was that when intervention is necessary, the rationale, decision, method for deciding and consequences should be public.

i.e., a notice could appear like this:

[Image: mock-up of an anti-viral notice displayed on a Tweet]

Clicking to learn more could—as we'll see in a moment—take you into the Matrix (Twitter's Matrix).
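To make that concrete, here's a rough sketch of the information such a notice would need to carry. The field names are invented for illustration and mirror the four things Musk suggested should be public: the rationale, the decision, how it was decided, and the consequences.

```typescript
// Hypothetical shape of a public "anti-viral" notice.
// None of these fields exist in Twitter's real API.

interface VisibilityNotice {
  tweetId: string;
  decision: "promoted" | "demoted" | "unchanged";
  rationale: string;              // e.g. "Reported for potential misinformation"
  decisionMethod: "algorithmic" | "human-review" | "hybrid";
  consequences: string[];         // e.g. ["Excluded from Trends", "Replies collapsed"]
  decidedAt: string;              // ISO timestamp, so the decision can be audited later
  appealUrl?: string;             // where the author could contest the decision
}

// A notice rendered on the Tweet itself might look like this:
const example: VisibilityNotice = {
  tweetId: "1234567890",
  decision: "demoted",
  rationale: "Reported for potential misinformation",
  decisionMethod: "hybrid",
  consequences: ["Not shown in Trends", "Reduced reach in followers' feeds"],
  decidedAt: "2022-04-16T09:00:00Z",
};

console.log(`This Tweet was ${example.decision}: ${example.rationale}`);
```

The point isn't the exact fields; it's that every silent intervention becomes a small, inspectable record.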

2. Open-sourced algorithm

In his recent TED Talk, Elon outright mentioned releasing the algorithm(s) on GitHub, a popular platform where developers host and review code.

But whilst GitHub has an impressive 73 million users, Twitter has around 300 million. 

So for this to work, Twitter will need to teach people, in a non-technical way, how content is curated.

To be clear, I'm not suggesting that we need 300 million people peer-reviewing the code, but rather that everybody will need a basic understanding of how to control the content that they see.

As an example, why is this reply appearing third, when plenty of other replies with 100x more engagement sit further down?

[Image: a reply ranked third, above replies with far more engagement]

Now, consider an example like this, but where the Tweets are all politically charged. The lack of justification may obscure the real reason why that Tweet is being ranked so highly, fuelling theories that it's an intentional act to help popularise a bias.

Putting aside the algorithmic peer-reviewing aspect, this poses a major UX challenge: how do you get people to interact with, understand and trust the mechanism by which content is served?

i.e., how do you get non-technical people to trust that there aren't controlling biases?

The complexities need to be distilled and immediately accessible. For example, by introducing a new action: The Twitter Academy.

[Image: mock-up of a 'Twitter Academy' icon on a Tweet]

Clicking on the 'new' icon on any Tweet could take you to a visual breakdown of the factors influencing the position and promotion of that Tweet.

[Image: mock-up of a visual breakdown of the factors behind a Tweet's position]

This would allow non-technical users to peer-review the curation output with a greater understanding of the inputs.
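As a purely illustrative sketch, assuming made-up factor names and weights rather than Twitter's real ranking signals, the data behind that breakdown might be structured something like this:

```typescript
// Hypothetical data a "Twitter Academy" breakdown could expose.
// Factor names and weights are invented for illustration.

interface RankingFactor {
  name: string;          // e.g. "You follow the author"
  weight: number;        // contribution to the final score, from 0 to 1
}

interface TweetExplanation {
  tweetId: string;
  position: number;      // where the Tweet appeared in your feed
  factors: RankingFactor[];
}

// Turn the raw factors into the plain-language summary a non-technical
// user would actually read.
function explain(e: TweetExplanation): string {
  const top = [...e.factors].sort((a, b) => b.weight - a.weight)[0];
  return `Shown at position ${e.position} mainly because: ${top.name}`;
}

const reply: TweetExplanation = {
  tweetId: "42",
  position: 3,
  factors: [
    { name: "You follow the author", weight: 0.5 },
    { name: "Liked by people you follow", weight: 0.21 },
    { name: "High recent engagement", weight: 0.1 },
  ],
};

console.log(explain(reply)); // "Shown at position 3 mainly because: You follow the author"
```

The academy's job would be translating that raw structure into something visual and immediately digestible.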

This isn't the only solution, but it highlights the dilemma: it's not enough to simply fix the algorithm; you need to fix the public perception of the machine.

3. Better tools

Currently, when you report a Tweet as offensive, Twitter will "do something" with your complaint, and then give you two actions: block or mute the user.

[Image: Twitter's current block/mute prompt after reporting a Tweet]

Hearing opinions that you don't agree with is a necessary result of free speech. 

And as an organisation (or algorithm), being an arbiter of the rules is unfathomably hard. When is something merely disagreeable, and when does it incite violence?

But, at an individual level, it should be easy for you to control the type of content you see.

As an example, if I'd reported one of Elon Musk's Tweets, Twitter could ask me why, and then help me identify and sever the connection that led to me being shown it in the first place.

i.e., by outlining the ingredients that led to that content being served.

[Image: mock-up outlining why a reported Tweet was shown to you]
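To illustrate the idea, here's a hypothetical sketch of those "ingredients", with invented connection types standing in for whatever Twitter actually tracks internally:

```typescript
// A rough sketch of the "ingredients" idea: when you report a Tweet from
// someone you don't follow, surface each connection that caused it to be
// served, and let the user sever them individually.
// The connection types below are hypothetical, not Twitter's internals.

type Connection =
  | { kind: "retweetedBy"; account: string }
  | { kind: "likedBy"; account: string }
  | { kind: "topicInterest"; topic: string }
  | { kind: "promoted"; advertiser: string };

function describe(c: Connection): string {
  switch (c.kind) {
    case "retweetedBy":   return `Retweeted by @${c.account} (you follow them)`;
    case "likedBy":       return `Liked by @${c.account} (you follow them)`;
    case "topicInterest": return `You're marked as interested in "${c.topic}"`;
    case "promoted":      return `Promoted by ${c.advertiser}`;
  }
}

// The report flow could list these and offer a "sever" action for each,
// e.g. unfollow the topic, rather than only block/mute the author.
const ingredients: Connection[] = [
  { kind: "retweetedBy", account: "someoneYouFollow" },
  { kind: "topicInterest", topic: "Electric vehicles" },
];

ingredients.forEach((c) => console.log(describe(c)));
```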

Today, Twitter allows you to block or mute individuals, but makes very little effort to assist you in further curating the content that you see. It also doesn't help you understand why you're seeing Tweets from people that you don't follow.

As it stands, the user experience is not built as a curation tool, but optimised for endless consumption.

4. #Transparency

There's a similar UX challenge facing Musk, should the acquisition go ahead: helping users understand what's trending, and why.

I personally use Twitter's discovery tab daily, which I love because it's a real-time reflection of society. Things trend on Twitter before they've broken on major news networks.

But almost every time I use it, I see an item and wonder why I'm being shown it. It's very often neither notable nor getting huge engagement.

Today, you can click and mark hashtags as uninteresting:

[Image: Twitter's option to mark a trend as uninteresting]

In a vague world of "tell us stuff you don't like, and we'll show you better stuff", this feels akin to visiting a witch doctor rather than a traditional doctor.

The incentive to mark something as uninteresting is very low, because it's unlikely to reappear anyway.

Instead, the motivation to self-curate the trending content relies on a basic understanding of, and confidence in, how the topics are selected in the first place.

For example, when marking something as uninteresting, Twitter could help you select other topics that you're likely not interested in.

[Image: mock-up suggesting related topics you may also want to mark as uninteresting]
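As a toy illustration, assuming a fabricated list of topics that tend to trend together, the suggestion logic could be as simple as this:

```typescript
// A toy sketch of "you muted X, you might also want to mute these":
// suggest topics that frequently trend alongside the one just muted.
// The co-occurrence data here is fabricated for illustration.

const coTrending: Record<string, string[]> = {
  "#GrammyAwards": ["#RedCarpet", "#AlbumOfTheYear", "#Grammys2022"],
  "#PremierLeague": ["#MUFC", "#LFC", "#MatchDay"],
};

function suggestMutes(mutedTopic: string, alreadyMuted: Set<string>): string[] {
  // Only suggest topics the user hasn't already dealt with.
  return (coTrending[mutedTopic] ?? []).filter((t) => !alreadyMuted.has(t));
}

console.log(suggestMutes("#GrammyAwards", new Set(["#RedCarpet"])));
// ["#AlbumOfTheYear", "#Grammys2022"]
```

The real value is less in the matching logic and more in giving users an obvious, immediate payoff for curating at all.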

Summary

If Twitter is to transition towards free speech absolutism, then it will need to create tools, and build thoughtful UX, to help people self-curate the content that they see.

This isn't just a question of opening up the algorithm on GitHub and letting developers pore over every line, but rather a long-term exercise of helping people utilise the mechanics of the game.

Humans are lazy, and many people will be more likely to moan endlessly about their messy feeds than to go into the settings and curate the content themselves.

Providing the functionality to do this is one thing; creating an experience which makes that process enjoyable is another.

So, to maintain the coherence of Twitter without the vagueness of how it works, the complexities need to be approached with a thoughtful user experience in mind.
