Two of the stupidest people on Canadian television.
(I've already complained at length about Frankish elsewhere; that's why this post is labelled "II".)
They were probably riffing off articles like the one in Time from 2017. I say "probably" because they didn't actually give their sources. I mean, who cares about facts and attribution, right?
The fact is this: social media platforms use algorithms to create a positive feedback loop: the more you watch, the more you want to watch. Because that's how they stay in business. There's so much content out there, and its quality varies so widely, that it's impossible for a mere mortal to sort through it all. Left to their own devices, users¹ would become frustrated looking for content they like and leave the site. Enter the kindly algorithm, intended to help users find content that interests them.
There is, however, such a thing as too much of a good thing: people engage with social media more than they should. The positive feedback loop created by the algorithms isn't balanced by any controls, and so we get the so-called zombies who appear addicted to social media.
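To see how little machinery that loop actually needs, here's a toy sketch in Python. Everything in it - the topics, the numbers, the imaginary user - is invented for illustration (real ranking systems are vastly more complicated, and none of them publish their code), but the structure is the one described above: engagement raises a weight, the weight raises exposure, and exposure produces more engagement.

```python
import random

# Toy model of an engagement-driven recommender. The topics, numbers, and
# user model are invented for illustration only.

user_interest = {"cats": 0.3, "outrage": 0.6, "gadgets": 0.1}  # hidden from the platform
weights = {topic: 1.0 for topic in user_interest}              # what the platform learns

def recommend():
    """Show a topic with probability proportional to its learned weight."""
    topics, w = zip(*weights.items())
    return random.choices(topics, weights=w)[0]

for _ in range(1000):
    topic = recommend()
    if random.random() < user_interest[topic]:  # the user engages
        weights[topic] += 0.5                   # engagement boosts future exposure
        # ...and nothing ever pushes the weight back down

print(weights)  # the most engaging topic ends up crowding out the others
```

Run it and whichever topic is most engaging swallows the feed - not because anyone decided it should, but because nothing in the loop ever pushes back.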
If one were to draw an analogy to drug addiction - and I'm not drawing one, but it's interesting to consider anyway - it would be as if pushers and dealers were everywhere, entirely unhindered by law or ethics, delivering the drugs directly to you wherever you happened to be. And the drugs would be free, so long as you accepted advertising on the packaging, bongs, needles, etc.
This would be relatively easy to fix: alter the algorithms to start showing more and more dislikable content after a certain threshold² has been reached. Of course, this would quickly be recast as censorship and immediately painted with the same brush as communism, thought policing, deplatforming and all those other terrible things modern society loses its shit over so easily. Still, it is fixable at the source. And given that the problem is in fact a problem, and that it is being created by the content providers (the positive feedback mechanism of the algorithms is strictly their invention), it rather follows that they should also be on the hook for mitigating the harmful effects.
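For what it's worth, that fix would also be only a few lines. Reusing the toy recommender from the earlier sketch, and with a completely made-up threshold standing in for the evidence-based one footnote 2 calls for, it might look something like this:

```python
SESSION_THRESHOLD = 200  # hypothetical value; see footnote 2 - it should come from research

def recommend_with_brake(items_seen_this_session):
    """Same toy recommender, but past the threshold it stops optimizing for engagement."""
    topics, w = zip(*weights.items())
    if items_seen_this_session > SESSION_THRESHOLD:
        # Invert the weighting: deliberately surface the least "sticky" content.
        w = [1.0 / x for x in w]
    return random.choices(topics, weights=w)[0]
```

The specific rule doesn't matter; what matters is that the knob exists, and the people who built the loop are the ones holding it.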
But there's another aspect to this, one for which the responsibility resides strictly with the users: they choose to consume media that makes them unhappy!
How can that happen? Easy! Individual humans are generally poorly educated monkeys who have virtually no willpower and detest reflecting on their own behaviours due to marginal self-esteem.
The algorithms used by social media platforms are limited in what they can measure. You can easily find all kinds of information about them via Google. You will find that all the metrics they use are behavioural: clicking a "like" button, time spent watching a video, photos that are bookmarked, etc. This makes sense, because behaviour is really all they can measure.
However, they then all make the same assumption: that people will follow social media that is good for them. You will "like" a post that makes you happy. You will watch a video that brings you contentment all the way to the end. Similarly, you will avoid items that make you unhappy or that distress you.
News flash: this assumption is not valid. We don't only "like" things that make us happy; we also "like" things that we desire. "Oh! That is so cool! I wish I had that!" Do that often enough and the algorithms will notice, and start showing you more and more of those things. So now, what you see on social media is at least partly made up of things you want. Unreflective people will begin to conflate desire with happiness, but the two are not the same. And the resulting conflict leads to mental distress, since we usually can't reconcile desire and happiness except by acquiring the thing we desire - if that's even possible. Even worse, some people will only "like" what they desire, leading to a completely dissonant state.
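That blind spot is easy to show in code. The "reactions" below are invented for illustration - by definition, no algorithm can see them - but the behavioural signal that reaches the ranking model is identical in all three cases:

```python
# Three very different internal reactions...
posts = [
    {"id": 1, "reaction": "this made me happy"},
    {"id": 2, "reaction": "I wish I had that"},                       # desire, not contentment
    {"id": 3, "reaction": "this upset me, but I couldn't look away"}, # distress
]

def observed_signal(post, liked, watch_seconds):
    """Everything the platform actually gets: behaviour, stripped of its meaning."""
    return {"post_id": post["id"], "liked": liked, "watch_seconds": watch_seconds}

# ...producing exactly the same "strong engagement" signal.
signals = [observed_signal(p, liked=True, watch_seconds=90) for p in posts]
print(signals)
# The algorithm will happily serve up more of all three, including the ones
# that map to envy or distress rather than to anything resembling happiness.
```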
And since we can't all have what we want, we're painting ourselves into a psychological corner.
Not to mention the implied social hierarchy that results from distinguishing those who have what is desired from those who do not. If you're constantly bombarding yourself with images of people enjoying things you wish you had, you will be far more susceptible to feelings of inadequacy and low social standing.
While I can, and do, hold social media organizations responsible in part for creating this situation, it is also incumbent on users to moderate their own behaviours. With great power comes great responsibility, and all that.
Social media is, like everything else, a tool. Is it the fault of the hammer's manufacturer if a user decides to use it to put a nail through their hand without even bothering to think "This might be a terrible idea"?
So if Pugliese and Frankish actually wanted to have a meaningful discussion about social media, one that was rational and informed, they easily could have. There's no deep philosophy here, just a relatively general knowledge of how systems work, a little humility, some common sense... and a Spider-Man quote³.
What they did provide was a cringe-worthy mishmash of subjective "feelings" and anecdotal evidence that was an absolute waste of air-time.
But then again, that's all I've come to expect from those two.
1. I refuse to call people who use social media platforms "consumers."
2. A threshold to be determined openly and transparently, and grounded in evidence-based research.
3. Or Voltaire, depending on your preferences.