Today is one of my favorite days of the year. Don’t worry, it’s not some obscure holiday that only six hipsters celebrate. In the alternative universe that is my head, cutting the grass is one of my most beloved activities. And today is the first cut of the year!
I know that’s not exciting for most people, but for me it’s great. I grab my headphones, put on a podcast or two and get to work. I can’t tell you the number of conversations I’ve had on a Monday morning that start with, “I was cutting the grass this weekend and had a thought …”
Some people have their best thoughts in the shower, or in the middle of the night; I have mine while walking back and forth in my yard.
So, I was cutting my grass this morning, listening to the most recent two episodes of the Stuff To Blow Your Mind Podcast about storytelling, and how it might actually be bad for humans. The basic argument is that when we tell a story, it transforms an event or memory into narrative, possibly changing what we remember.
Coincidentally, I read an article earlier this morning in Psychology Today on how our memories are being weaponized by online platforms. The article describes several experiments, as well as real-world events, in which groups of people had, or were given, memories of events that never occurred.
The Psychology Today article starts with the following:
Read this list of words: table, sit, legs, seat, soft, desk, arm, sofa, wood, cushion, rest, stool.
Now count to 30.
Did you spot the word chair?
If so, I’ve just implanted a false memory—read the list again. In tests using lists such as this, people usually say they saw the lure word, a word that was danced around but never actually named.
It worked on me. I was sure I read the word chair. I was wrong.
A similar experiment, one that I love – and not just for the late-90s style – is the case of the invisible gorilla.
Did you see it?
What do storytelling, memory, and selective attention have to do with media?
Easy. They are at the core of what media is, and of how and why audiences choose to engage. The Stuff To Blow Your Mind Podcast doesn’t take a side, but what if the basic premise it lays out is correct and there is an inherent danger in storytelling?
There is plenty of evidence to suggest this is true. Even a poorly told story can implant a false memory in a person’s brain. And no matter what corrective action is taken, that false memory will always be there – especially if it confirms one’s own bias.
Consider the Mandela Effect where a large segment of the population believes Nelson Mandela died in the 1980s. He died in 2013.
And we all know Darth Vader’s line when he informs Luke who Luke’s father is, right? Go ahead, say it out loud.
Now watch this:
It’s a small difference from what most people remember as “Luke, I am your father,” but my point is that the person telling the story has great power over the collective mind of the audience.
Depending on your perspective this may be exhilarating or terrifying, but my guess is most people fall into the middle ground of, ‘meh.’ Which terrifies me.
Even when you know you’re being manipulated it’s nearly impossible to protect yourself against it. When you don’t realize it, or just don’t care, that’s when bad actors can begin to take control.
Speaking of bad actors …
Changes to the algorithm
Did your search traffic take a hit with Google’s 2019 Core Algorithm Update on March 12? According to the Search Metrics Blog the big change is around more sensitive topics, such as health news:
The analysis shows that Google continues to fine-tune its search results for queries related to particularly sensitive topics, such as those dealing with health questions. Following the E-A-T Update in August 2018, which is also known as the Medic Update, and E-A-T Update No. 2 in early October 2018, we see that, once again, the March 2019 Google Core Update has had a large impact on websites in this industry.
This isn’t anything new, but Google is really emphasizing trust in this update, which focuses heavily on what Google calls Your Money Your Life (YMYL) pages.
User signals, such as time on site, pages per visit, and bounce rate, also received added importance in the update.
In looking at the winners and losers noted on the Search Metrics Blog, a lot of traditional news publishers, including the New York Times and The Atlantic, were big losers, which makes sense when you consider the volume of content these publishers produce.
While most of the publishers on the losers list don’t fall into what you would consider a YMYL site, almost all of them dedicate a significant amount of content to these pages. And from a trust standpoint, they haven’t built a brand around YMYL news, so the drop makes sense. Not that this is any consolation to the people responsible for web traffic for these sites.
In more algorithm news, on Wednesday Wired posted this story about Facebook’s algorithm changes, including a new metric called Click-Gap.
What is Click-Gap you ask? According to Wired:
… the company’s attempt to limit the spread of websites that are disproportionately popular on Facebook compared with the rest of the web. If Facebook finds that a ton of links to a certain website are appearing on Facebook, but few websites on the broader web are linking to that site, Facebook will use that signal, among others, to limit the website’s reach.
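Facebook hasn’t published how Click-Gap is actually calculated, but the idea in the Wired excerpt – compare how often a site is linked on the platform with how often the broader web links to it – is easy to picture. Here’s a toy sketch of that ratio-based signal; the function names and the threshold are entirely invented for illustration, not Facebook’s real formula:

```python
# Hypothetical sketch of a "Click-Gap"-style signal. Facebook has not
# disclosed its actual formula; the names and threshold below are
# invented for illustration only.

def click_gap_score(platform_links: int, web_links: int) -> float:
    """Ratio of on-platform link volume to off-platform inbound links.

    A high score means a site is far more popular on the platform than
    on the broader web, which the article suggests is treated as a
    red flag.
    """
    # +1 avoids division by zero for sites with no external inbound links
    return platform_links / (web_links + 1)


def should_limit_reach(platform_links: int, web_links: int,
                       threshold: float = 100.0) -> bool:
    # A site heavily shared on-platform but rarely linked elsewhere
    # exceeds the (made-up) threshold and would get its reach limited.
    return click_gap_score(platform_links, web_links) > threshold


# A site that lives on platform virality: 50,000 on-platform links,
# only 20 inbound links from the rest of the web.
print(should_limit_reach(50_000, 20))      # True
# A site with a healthy footprint across the open web.
print(should_limit_reach(50_000, 10_000))  # False
```

The second example is exactly the kind of business I worry about below: a legitimate site built almost entirely on Facebook distribution would score high on this ratio even if its content were fine.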
I see what they are trying to do, but I can also see them harming legitimate sites that have built businesses around the virality of Facebook. There are still a lot of start-ups that insist on building businesses that rely on platforms (something I disagree with, but I understand it), which could inherently make them disproportionately popular on Facebook. This could kill these businesses.
Not that Facebook has ever considered the businesses it has killed.
Facebook’s coming changes will also focus on Groups, where it says fake news is amplified. There will be more scrutiny in Groups, and admins will be given additional tools and notifications.
Facebook also reiterated how hard it is to have humans do the work of fact-checking. From Wired:
“Our professional fact-checking partners are an important piece of our strategy against misinformation, but they face challenges of scale: There simply aren’t enough professional fact-checkers worldwide and, like all good journalism, fact-checking takes time,” the blog post reads. One possibility Facebook is considering is relying on groups of users for help with fact-checking, a system that would no doubt introduce the possibility for manipulation. In the meantime, Facebook recently began adding what it calls “Trust Indicators” to news stories. These ratings are created by a news consortium called The Trust Project, which assesses news outlets’ credibility based on things like their ethics standards, corrections policies, and ownership.
When more users start to leave and advertisers stop paying, I bet they’ll be able to fix the problems. Until then, the fight continues.
The new metric rolled out on Wednesday.