Note: The opinions expressed in this article are those of the author and do not necessarily reflect those of Insigniam.
According to The Wall Street Journal, some 700,000 unsuspecting Facebook users were subjected to psychological experiments in 2012, sparking widespread outrage among the social networking site’s community.
“To determine whether it could alter the emotional state of its users and prompt them to post either more positive or negative content, the site’s data scientists enabled an algorithm, for one week, to automatically omit content that contained words associated with either positive or negative emotions from the central news feeds of 689,003 users,” writes The Journal.
While the story notes that this is hardly the first time the world’s second-most-trafficked website has conducted experiments on its users, it points out that Facebook’s Data Science Team is “tasked with turning the reams of information created by the more than 800 million people who log on every day into usable scientific research.”
“What many of us feared is already a reality: Facebook is using us as lab rats, and not just to figure out which ads we’ll respond to but actually change our emotions,” wrote Animalnewyork.com in a blog post that drew attention to the study ahead of The Wall Street Journal’s report.
AdAge, however, takes a very different angle on the controversy, arguing that “Facebook, and every other tech company that collects and sells user data, should publicly embrace and proactively advocate this kind of data collection instead of trying to play it quiet, afraid of causing the next uproar.”
“If users recognized digital advertising as legitimately helpful content, then they wouldn’t care whether information that corresponds to their buying intent is delivered through paid media, which companies can control, or earned media like news coverage, which companies can’t control,” says AdAge.
Furthermore, the piece states that, “On the client side, marketers … have mounds upon mounds of data about [online behavior] to sift through for signals. That’s the real barrier, because the engineering talent necessary to turn seemingly countless pieces of data into a few accurate predictions of intent seems to exist already. More data collection would eliminate some off-target advertising that is based on demographics rather than on intent.”
In a previous post, we detailed how innovators can use data-driven evidence to find the blind spots and gain insights into how to solve customers’ problems, meet their needs in new ways, or even expand their customer base.
The crux of the issue seems to be this: Facebook’s innovative use of data-driven technology isn’t necessarily aberrant. Rather, as AdAge suggests, it is the company’s approach and lack of transparency that stand to alienate its 1.28 billion monthly users.
While Facebook is no doubt well aware that test subjects alter their behavior once they know they are being observed (thereby compromising the integrity of a study and its results), the company would be well served by getting ahead of such regular, public outcries over violations of its users’ privacy.
A possible solution: by opening a dialog with its users about the existence of such groundbreaking data-mining programs (which are sometimes even in users’ best interest), Facebook stands to build trust with its community and thereby break down a key barrier that impedes breakthrough innovation: customer sentiment.