I’m an avid user of Pocket, a useful app that stores and recommends content. When I started using it, the recommendations were wide-ranging, interesting and from lots of sources. I started saving articles from the recommended feed. Over time, the range of topics narrowed: lots of philosophy, politics and productivity. The range of sources narrowed too, with The Atlantic, the New Yorker and Aeon dominating the feed.
Recommendation algorithms inherently narrow over time. They take your historical engagement as input, promote more of what you have engaged with and push everything else out of reach. What is out of reach you can no longer engage with (by default), so it stays out of reach (unless you proactively seek it out). And bingo: the loop closes, and the algorithm leaves you engaging with less and less variety over time.
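The feedback loop above can be sketched as a toy simulation. This is not Pocket's or anyone's actual algorithm, just a minimal model (a Pólya-urn-style process) where each round the feed recommends a topic in proportion to past engagement, and engaging reinforces that topic's weight. The topic names and parameters are invented for illustration:

```python
import random
from collections import Counter

def simulate_feedback_loop(topics, rounds=500, seed=42):
    """Toy model of an engagement-driven feed: each round, recommend a
    topic with probability proportional to past engagement, then record
    one engagement with it. Early random luck compounds into a narrow
    feed -- the rich get richer."""
    rng = random.Random(seed)
    engagement = Counter({t: 1 for t in topics})  # flat prior: every topic starts equal
    for _ in range(rounds):
        pool = list(engagement)
        weights = [engagement[t] for t in pool]
        recommended = rng.choices(pool, weights=weights, k=1)[0]
        engagement[recommended] += 1  # engaging makes it more likely to be shown again
    return engagement

topics = ["philosophy", "politics", "science", "sport", "art", "travel"]
final = simulate_feedback_loop(topics)
top_two = final.most_common(2)
share = sum(n for _, n in top_two) / sum(final.values())
print(f"after 500 rounds, the top two topics hold {share:.0%} of all engagement")
```

Run it with different seeds and different topics end up dominating, which is the point: the narrowing is a property of the loop itself, not of your actual preferences.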
Pocket’s solution to my problem was “saving articles to Pocket that are outside of your typical saves (including different sources and points of view)”. This would help, of course, but I could not use the recommended feed in Pocket to do it.
Algorithms are encroaching on every facet of our lives. Recommendation engines are everywhere. They are limited by the quality of their input data and by how accurately they predict what we actually want. We should be wary as we use them and as we make them.
Facebook’s algorithm optimises for engagement. You engage with cat videos and outrage pieces from the left, so that’s what you’ll get. This is a problem for reasons that Tristan Harris explains more eloquently than I ever could, but in essence it preys on our worst base instincts rather than facilitating what we want to be and it creates the echo chambers that we hang out in.
Facebook used to be quite a fun way to engage with people I know. It has now become a mechanism for the same people to broadcast random stuff (<1% of which they have created), and all it cares about is how I engage with that stuff so it can sell my engagement to advertisers. I’m done with it.