Some random tweet of mine got 2,900 likes and 800 retweets.
I could never have guessed it– it was nothing special.
But that’s why you let the data guide your decision making– not your own biased opinions.
I’ve learned over the years that I’m a lousy winner picker.
So I let the algorithm do the work for me– to find the needle in a haystack.
And I put more dollars and effort against what generates traction.
I’ve made thousands of pieces of content, which is “many shots on goal”.
And even though only 1 in 10 pieces of content does decently, that’s more than enough– since I can boost it, cross-post it to another social network, convert it to a blog post, repurpose it into an email, make a one-minute video about it, and so forth.
The author uses this extreme example to blame YouTube’s attention-hungry algorithm for creating radicals of all types.
If I watch videos of people eating bacon, the algo will recommend more videos of people eating bacon– SURPRISE!
You can use this same line of logic in any algorithmically-driven system that makes recommendations.
If you watch a lot of Disney movies on Netflix, guess what movies Netflix will be recommending for you?
The “filter bubble” ensures that red people see red content– so only 1% of people (call them purple) will also see blue content, because they have blue friends.
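The red/blue/purple dynamic can be sketched as a toy simulation. Everything here is an illustrative assumption– the `leak` probability, the color labels, and the feed mechanics are made up for the sketch, not real platform data:

```python
import random

def simulate_feed(own_color, friend_colors, n_items=1000, leak=0.01):
    """Toy filter-bubble feed (hypothetical model, not any real platform).

    The feed almost always matches the user's own leaning; with small
    probability `leak`, a friend's color crosses the bubble instead.
    """
    feed = []
    for _ in range(n_items):
        if friend_colors and random.random() < leak:
            feed.append(random.choice(friend_colors))  # rare cross-bubble item
        else:
            feed.append(own_color)  # reinforcing item
    return feed

random.seed(42)
# A "red" user with no blue friends sees a pure red feed.
red_feed = simulate_feed("red", friend_colors=[])
# A "purple" user is red-leaning but has blue friends, so a sliver of blue leaks in.
purple_feed = simulate_feed("red", friend_colors=["blue"])
print(purple_feed.count("blue") / len(purple_feed))  # a small fraction near the leak rate
```

The point of the sketch is how lopsided the feed stays even for the purple user: the cross-bubble exposure is a rounding error next to the reinforcing content.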
If people are seeing more of what they are already inclined to believe, is it possible that social media can shift their opinions, ever so slightly?
I believe it still can, in a powerful and subconscious way.