Industry Insights

What If Personalization Traps Us into a "Filter Bubble"?

It is no secret that personalization has been a huge topic, not only in the tech industry but in any industry concerned with customer experience. For years, we discussed the benefits of delivering personalized content and experiences, but personalization technology generated more discussion than actual results. In recent years, though, the technology has finally matured and caught up. Now, machine learning algorithms are injected into almost every platform to predict human intentions based on the behavioral and historical data the platform has learned from.

As you may already know, we on the CMS-Connected team often attend major tech providers’ annual user conferences to capture the buzz of those events through interviews with attendees. Whether the person we speak with is a user or a technology partner, everyone has been stressing the importance of the same web content management functionality: personalization algorithms.

First, for those who are not very familiar with how these algorithms work: they essentially filter information down to what is deemed relevant to the user, based on a list of criteria set primarily for the user’s convenience. The number of clicks and viewing time are the most common factors in determining that relevance.
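As a toy illustration of the mechanism described above, a minimal sketch of such a filter might score items by clicks and viewing time and surface only the top results. All names, weights, and data here are illustrative assumptions, not any vendor’s actual algorithm:

```python
# Minimal sketch of a relevance filter: items a user has clicked more often
# and viewed longer are ranked higher; everything else is filtered out.
# Weights and field names are hypothetical, chosen only for illustration.

def relevance_score(clicks: int, view_seconds: float,
                    click_weight: float = 1.0, time_weight: float = 0.1) -> float:
    """Combine the two most common signals into a single score."""
    return click_weight * clicks + time_weight * view_seconds

def personalize(items: list[dict], top_n: int = 3) -> list[str]:
    """Return only the top-N items by relevance, hiding the rest."""
    ranked = sorted(items,
                    key=lambda i: relevance_score(i["clicks"], i["view_seconds"]),
                    reverse=True)
    return [i["title"] for i in ranked[:top_n]]

history = [
    {"title": "Tech news", "clicks": 12, "view_seconds": 300},
    {"title": "Sports",    "clicks": 2,  "view_seconds": 40},
    {"title": "Politics",  "clicks": 1,  "view_seconds": 20},
    {"title": "Cooking",   "clicks": 9,  "view_seconds": 500},
]

# Everything outside the top-N never reaches the feed: the filter bubble.
print(personalize(history))
```

The point of the sketch is the cutoff: whatever falls below `top_n` simply disappears from the user’s view, which is exactly the behavior the rest of this article questions.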

Despite the buzz around such personalization tools and algorithms, end consumers seem to have a different view on personalized experiences than what the tech industry has been fixated on. To put this into perspective, I’d like to share a little story where I had my “aha” moment during a conversation with a friend who has nothing to do with the tech industry other than being a regular consumer. He said: “I don’t want my Facebook feed to consistently show me only what 10 out of my 1300 Facebook friends have been up to lately. Same thing with Instagram. I don’t want a platform to decide what I want to see on my behalf without even running it by me. I might prefer to see the content only based on a chronological order, rather than based on what Facebook thinks of what I’m interested in.”

In my view, he was right, because what he was getting at is that relevancy is a relative concept. We know those algorithms pick up patterns from many interactions, which is amazing technology, but personalizing an experience is one thing; controlling a consumer, or worse, painting them into a corner, is another. From where I see it, some brands have lately been conflating these two distinct concepts while executing their marketing campaigns and tailoring digital experiences.

Given the ongoing explosion in the volume of content, data, devices, and channels, having this sort of algorithm at our disposal comes in handy when we search for particular information. While this functionality sometimes makes us more productive, that productivity may come at a price, because it also traps us in a "filter bubble." I am not trying to diminish the power and benefits of machine learning technology, but the dangerous unintended consequence occurs when we are no longer exposed to information that could challenge or broaden our worldview. After all, seeing all sides of a story is essential to learning, making decisions, and forming an opinion.

When we look at this matter from a content creator’s perspective, the competition for people’s attention has never been fiercer. I discussed this topic at length in a recent article titled “How to Crack Consumer Code in the Attention Economy,” but in a nutshell: the demand for consumer attention is rapidly increasing while the supply seems to remain the same (there is no scientific evidence that our attention capacity is expanding). This situation drives marketers and content creators to take advantage of consumers’ obsession with their mobile devices and other online activities, and in doing so, they often implement personalization algorithms to cut through the noise.

Because personalization is so powerful, it can make or break a business. In fact, a recent study commissioned by Sitecore and conducted by Vanson Bourne found that 80% of 680 marketing and IT decision makers place a high priority on personalization, while 96% of 6,800 customer respondents believe many brands are, in fact, providing “bad personalization.” Obviously, many struggle to manage and mine customer data in a way that both informs customer experience strategies and delivers on the promise of personalization. To go in-depth on these interesting findings and more, I sat down with Joe Henriques, Vice President of Innovation at Sitecore, during Sitecore Symposium 2017 in Vegas. The full interview may be viewed here. One of my biggest takeaways from the conversation was that brands should approach personalization in a holistic manner; their method should be almost like “utilizing a digital body language.”

Do Personalization Algorithms Risk Narrowing Our Minds?

Google’s then-CEO Eric Schmidt once said of tailored online services: “The technology will be so good, it will be very hard for people to watch or consume something that has not in some sense been tailored for them.” That is exactly what is happening now. As innovative as they are, personalization algorithms risk narrowing our minds and limiting our exposure to attitude-challenging information. The dangerous unintended outcome is a static and ever-narrowing version of ourselves, taken down a road of confirmation bias in which what we happened to click in the past shapes what we see next.

There is a fabulous TED Talk on this very topic by Eli Pariser, in which he brilliantly explains how personalization traps us in a "filter bubble" and keeps us from being exposed to information that could challenge or broaden our worldview. I highly recommend spending nine minutes hearing what he has to say.

Brand Journalism Brings Editorial-Like Responsibility With It

In recent years, marketers have shifted their focus from the ad-driven business model, where the only thing that matters is whether a web page stays open long enough to make money, to content marketing, where marketers cultivate meaningful engagement with their target audience through interactive content. Content marketing’s success and long-lasting impact further drove brands to act more like publishers. In other words, marketers have become “brand journalists,” tasked with bringing forward valuable, informative, and educational content instead of pushing products and services onto consumers. As the godfather of content marketing, Joe Pulizzi, said on the Magnificent Marketing Podcast, “all businesses must think of themselves as media outlets. Retail, lawn care, accounting, healthcare, insert-your-industry-here. The goal is the same: you want to deliver relevant and valuable information to your audience.”

This shift toward creating content in an editorial manner gives marketers and content creators not only a new title but also an editorial-like responsibility toward the public. In fact, applying the principles of Responsible Research and Innovation to the design, development, and appropriation of technologies, content creators and technology providers have a corporate social responsibility to promote healthy democratic discourse. We can also throw governments into the mix, as they certainly need to step in and regulate online information sources, from social networks to technology platforms built on machine learning and artificial intelligence.

Whether a website hosts third-party content, as Facebook does; creates content itself, as CMS-Connected does; or, like many technology companies, provides content creation tools including algorithms, it should be subject to some form of media regulation and libel law. Facebook founder and CEO Mark Zuckerberg reportedly said during a live Q&A session in Italy that Facebook is a tech company, not a media company, because it builds the tools and does not produce any of the content. In my view, just because an algorithm makes decisions based on user-derived data doesn’t mean the algorithm’s creators are not responsible for the outcome. Fortunately, the Reuters Institute stepped in and challenged Zuckerberg’s statement with its findings. The institute found that digital intermediaries act as gatekeepers who exert editorial-like judgments to varying degrees as they “sort and select content to provide news which is of ‘relevance’ to their customers, and decide which sources of news to feature prominently.” They thereby affect the nature and range of news content that users have access to; hence, “… they do perform important roles in selecting and channeling information, which implies a legitimate public interest in what they do.”

Elon Musk, the CEO of SpaceX and Tesla, has also expressed his longstanding concerns about the potential danger of artificial intelligence, warning a meeting of U.S. governors this year: “AI is a rare case where we need to be proactive in regulation instead of reactive because if we’re reactive in AI regulation it’s too late.” He believes a war could be started by manipulating information: faking news, spoofing email accounts, and issuing fake press releases.

As we also reported this past Friday, Marc Benioff is the latest in a long line of executives and other tech-industry elites to raise concerns about the dangers of technology products and services. At once a successful CEO and a socially responsible tech leader, he quite rightfully criticizes the ignorance of other CEOs and entrepreneurs in the industry: “I mentor a lot of CEOs and entrepreneurs and when I see that product is the number-one thing, the only thing that matters, that’s a real red flag.”

From a content marketing perspective, providing a personalized experience is a double-edged sword. On one hand, personalized content and search make content consumption more efficient; on the other, results are narrowed down without users’ knowledge, awareness, or consent. There certainly are times when consumers welcome narrowed-down options for the sake of saving time; this approach works perfectly when people have a sense of what they want and would like to get straight to the point. But there are also times when they want to know everything the virtual world has to offer them.

As much as we consumers like personalized experiences and tailored search results, we need to know what our options are. In fact, 77% of consumers would trust businesses more if they explained how they’re using personal information to improve the online experience. With this in mind, to avoid contributing to a sense of distrust, brands can explain why a specific piece of content or product is being recommended. Amazon, for instance, provides a “Why recommended?” button alongside its personalized recommendations. When a visitor clicks it, the platform shows the triggering factors, such as previous purchases or information they have provided, and, more importantly, lets them exclude a specific purchase or piece of information from future recommendations.
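To make the Amazon example concrete, here is a hedged sketch of how a “Why recommended?” mechanism with per-signal opt-out could be modeled. The class, method names, and data model are hypothetical, not Amazon’s actual implementation:

```python
# Illustrative sketch: each recommendation keeps the signal that triggered it,
# and the user can exclude any signal from future recommendations.

class Recommender:
    def __init__(self):
        self.signals: list[str] = []      # e.g. the user's past purchases
        self.excluded: set[str] = set()   # signals the user opted out of

    def record_purchase(self, item: str) -> None:
        self.signals.append(item)

    def recommend(self, catalog: dict[str, str]) -> dict[str, str]:
        """Map each recommended item to the signal that triggered it."""
        return {rec: trigger for trigger, rec in catalog.items()
                if trigger in self.signals and trigger not in self.excluded}

    def why_recommended(self, item: str, recs: dict[str, str]) -> str:
        """The transparency piece: explain what drove the recommendation."""
        return f"Recommended because you bought: {recs[item]}"

    def opt_out(self, trigger: str) -> None:
        """Let the user exclude a purchase from future recommendations."""
        self.excluded.add(trigger)

catalog = {"camera": "tripod", "novel": "sequel"}  # purchase -> recommendation
r = Recommender()
r.record_purchase("camera")
r.record_purchase("novel")

recs = r.recommend(catalog)
print(r.why_recommended("tripod", recs))  # explains the trigger to the user
r.opt_out("camera")                       # "don't use this purchase anymore"
print(r.recommend(catalog))               # tripod no longer recommended
```

The design choice worth noticing is that the trigger is stored alongside the recommendation rather than discarded after ranking; that is what makes both the explanation and the opt-out possible.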

As HSBC’s former head of marketing in EMEA, Philip Mehl, once said: “Marketing used to be a creative challenge but it’s a data challenge now.” To me, the approach Amazon has taken is not only fair and valuable for consumers but also establishes, in an organic way, the brand’s good intentions and skill in its personalization marketing strategy.


I understand that the engineers who created personalization algorithms might not have anticipated their outcomes. However, that shouldn’t result in a lack of transparency toward users, to the point where the options and information consumers are exposed to fall beyond their ability to control. As digital transformation continues to engulf everything in its path, opportunities and challenges will arise in equal measure. These ethical side effects are among the serious challenges for society, because such powerful technologies reach beyond their intended scope of use. Keep in mind that technology providers, and especially social media companies, cannot simply absolve themselves of responsibility in this area.

At the end of the day, by its very nature, the act of personalizing necessarily brings some content forward while excluding other content from view. We all understand that, and it is acceptable as long as the consumer says so; otherwise, it amounts to dictating and controlling rather than liberating. Therefore, regulations such as the General Data Protection Regulation (GDPR), going into effect in May, should result in greater trust and greater transparency, as the legislation requires brands to collect explicit consent from EU residents before storing and using any of their data. A U.S.-based multinational enterprise doing business in the EU that fails to comply will likewise face hefty fines of up to €20 million or 4 percent of global annual revenue, whichever is greater. You can learn more about how to get ready here, but I believe that whether it’s GDPR, PIPEDA, the ePrivacy Directive, or AdChoices, privacy laws and programs around the world are necessary to truly regulate the internet and dispel the doubts of a somewhat distrustful consumer base. That way, consumers will not only be informed about how brands collect their data and what it will be used for, but will also be empowered to choose between personalized experiences based on that data and experiences that are not filtered at all.
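As a rough illustration of the consent-first model the GDPR pushes toward, the sketch below gates personalization on explicit opt-in and falls back to an unfiltered chronological feed when consent is absent. All names here are hypothetical, not taken from any real compliance toolkit:

```python
# Illustrative sketch of GDPR-style explicit consent gating: behavioral data
# is never used for personalization unless the user has actively opted in.

class ConsentStore:
    def __init__(self):
        self._consent: dict[str, bool] = {}

    def grant(self, user_id: str) -> None:
        self._consent[user_id] = True

    def revoke(self, user_id: str) -> None:
        self._consent[user_id] = False

    def has_consent(self, user_id: str) -> bool:
        # Absence of a record means no consent (opt-in, not opt-out).
        return self._consent.get(user_id, False)

def build_feed(user_id: str, consent: ConsentStore,
               personalized: list[str], chronological: list[str]) -> list[str]:
    """Serve the filtered feed only with explicit consent;
    otherwise fall back to the unfiltered chronological feed."""
    return personalized if consent.has_consent(user_id) else chronological

consent = ConsentStore()
chrono = ["post-3", "post-2", "post-1"]
tailored = ["post-1"]

print(build_feed("alice", consent, tailored, chrono))  # no consent yet: chronological
consent.grant("alice")
print(build_feed("alice", consent, tailored, chrono))  # opted in: personalized
```

The key detail is the default in `has_consent`: an unknown user counts as not consenting, which mirrors the explicit-consent requirement rather than a silent opt-out scheme.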

Venus Tamturk


Venus is the Media Reporter for CMS-Connected, with one of her tasks to write thorough articles by creating the most up-to-date and engaging content using B2B digital marketing. She enjoys increasing brand equity and conversion through the strategic use of social media channels and integrated media marketing plans.
