I'm going to summarize this a bit and then muse about it because I think it's really, really important. There is support for all these points in the article.
- If you let people choose who they want to "friend" and those friends don't have limits on posting to an amorphous "feed", there will be more content than is reasonable to read through
- If you present this content in strict chronological order, people will end up getting whatever random sample of stuff was posted right before they happened to look
- There is a quote that is important enough that I'm going to excerpt and come back to it:
> Meanwhile, giving us detailed manual controls and filters makes little more sense - the entire history of the tech industry tells us that actual normal people would never use them, even if they worked. People don't file.
- The algorithmic newsfeed as championed by Facebook or Twitter is fundamentally an attempt at addressing this situation. It has a lot of bad effects, both for society and for individuals.
- In response, there are two big trends we're seeing:
- One is group chats coming to be a more important venue for sharing content. Because you know exactly what everyone is seeing in strict chronological order, there is a social etiquette that can arise around not dominating the conversation.
- Another is stories. Stories are limited temporally, such that people have to check in every day to see all the content, and they're presented in a way where you're really checking each person's feed in turn.
- There are also perhaps bad effects to a retreat to group chats, cf. how misinformation shares through WhatsApp.
Stories seem to me to actually be a counterexample to what Benedict is saying above about manual controls; it turns out people are willing to manually navigate through content to see what they care about, as long as the UX is smooth enough. They're even more willing to accept limits on content if those limits keep that UX smooth.
In my opinion, stories can be kind of a dark pattern if they make people feel like they have to check back in all the time or they'll miss something, but at the same time, hey, a bold and initially-ridiculous-seeming restriction enabling a new idiom of social communication was Twitter's whole 140 character thing too.
The way Lemmy handles this is à la Reddit: throw "wisdom of the crowd" at the situation, and divide the crowd up into aligned subsegments. The same content may be received differently by /c/aww and /c/photography, and get a different judgment.
But that doesn't work well in a few cases:
- Personal content: not everyone who posts a picture of a cocktail on Instagram is trying to be an influencer. Sometimes you want to tell your friends what you're up to. Upvoting and downvoting does not mesh well with this casual content.
- Niche content: in theory it could, but in practice -- and much to everyone's shame -- the best niche memes these days are in Facebook groups, not on Reddit. I don't have much to say about this except that if you doubt me, you are not thinking niche enough.
- Generally antidemocratic corner cases: Not everyone's judgment is equally valuable in all cases.
So let's move away from impersonal content. What can be done to address the newsfeed problem in a less bad way? As a Fediverse fan, I'm thinking here of how Mastodon has a mix of personal and impersonal content, sometimes in a longer form than Twitter and more like Facebook.
Broadly, it sucks when content is pushed at us in ways we can't understand, and in ways we can't control. Chronological sorting is comprehensible and you can scroll around in it. Every time you hear Instagram artists talking about changes in "the algorithm", you get a sense of something incomprehensible out of the control of both posters and consumers. So what does comprehensible control mean?
I like being able to toggle among ordering methods. People complain about how Twitter inserts content into their feed that was merely "liked" by someone they follow, but I like seeing that stuff, and toggling over to a separate feed of it would be cool.
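As a sketch of what that separation could look like (the `Post` shape and all names here are hypothetical, not any real API): the "merely liked" content becomes its own opt-in feed instead of being silently injected into the main timeline.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical post record; field names are illustrative, not any real API.
@dataclass
class Post:
    author: str
    created_at: datetime
    liked_by: set[str]  # usernames who liked this post

def chronological(posts):
    """The comprehensible default: strict reverse-chronological order."""
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

def liked_by_follows(posts, follows):
    """A separate, opt-in feed of posts merely liked by someone you follow,
    rather than mixing them into the main timeline."""
    return chronological([p for p in posts if p.liked_by & follows])
```

The design point is that both feeds stay comprehensible: each is a plain filter plus a chronological sort, and the user chooses which one they're looking at.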
There's lots someone could implement by diffusing judgment through a social graph -- weighting the engagement of people you engage with more heavily, then the people they interact with a lot, and so on -- but that doesn't end up being comprehensible to people, and we should take more seriously the idea that that should be disqualifying.
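To make the incomprehensibility concrete, here's a minimal one-hop sketch of that kind of "diffused judgment" weighting (all names, numbers, and the decay factor are hypothetical):

```python
from collections import defaultdict

def engagement_weights(my_interactions, graph, decay=0.5):
    """Weight each account by how often I engage with it, plus a decayed
    share for accounts *they* engage with (one hop of diffused judgment).

    my_interactions: {account: count of my likes/replies to them}
    graph: {account: set of accounts that account engages with}

    This is a sketch of the idea, not a real ranking system.
    """
    weights = defaultdict(float)
    for account, count in my_interactions.items():
        weights[account] += count
        # One hop outward: their engagements count for a discounted amount.
        for neighbor in graph.get(account, ()):
            weights[neighbor] += count * decay
    return dict(weights)
```

Even this toy version is already hard to explain to a user ("why am I seeing this person?"), which is exactly the objection in the text.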
Clicking around to look at content clusters can be both comprehensible and controlled. Spotify does this (bizarrely calling it "daily mixes"). The mechanisms by which clusters are determined don't have to be totally transparent so long as people can be explicitly told what the clusters are that they're looking at; this could be something like boiling topics down to keywords in text content to "discover" tags.
Well, I've talked a bit about how to present content after it's been created, but I'll have to return some other time to dig into the constructive limits on that content that can help. I'm really curious to hear people's thoughts on this.
Comment with a webmention, or with commentpara.de