Facebook has lost the plot
Facebook has been on a downward spiral for several months now, with the latest controversy concerning how many users it actually has. It’s a valid question: Facebook’s rate of user growth has been so consistent it’s scary, to the extent that when I plotted a linear trend against Facebook’s quarterly change in active users, the trend line was virtually flat.
Facebook’s share price followed a similar trajectory up until July 2018, when Facebook reported disappointing second-quarter earnings and a negative sales growth outlook. In percentage terms, growth in the number of active users has fallen to an all-time low, and in absolute terms it’s now the lowest since September 2014.
The poor result was the first sign that Facebook’s clients - companies buying access to Facebook’s targeting algorithms to market their products to the supposedly ever-growing number of eyeballs on its website - were beginning to lose faith in the advertising behemoth. Investors were understandably worried, and their concerns sent Facebook’s share price into a steep dive. But Facebook’s problems run deeper than a reported slowdown in active user growth.
Losing the plot
A recent report into Facebook’s fake-account problem puts it bluntly:
“Facebook has been lying to the public about the scale of its problem with fake accounts, which likely exceed 50% of its network. Its official metrics—many of which it has stopped reporting quarterly—are self-contradictory and even farcical. The company has lost control of its own product.”
“…documents recently revealed show that since 2012, management has worried about where it can find more warm bodies to sign on. Fake accounts have been keeping the company that Columbia professor Tim Wu has called an “attention merchant” afloat. The cost of Zuckerberg’s dissembling, dating all the way back to 2004, has accrued, and is finally coming due. Accordingly, it is increasingly likely that Facebook will go the way of AOL, CompuServe, and Prodigy—if legal liability doesn’t bankrupt it first.”
True or not, the report is consistent with a growing body of anecdotal evidence, the most recent being reports that companies are starting to move away from Facebook and back to more traditional media such as TV. As revealed last Thursday:
“…the inversion isn’t solely due to how fed up brands are with the social network; it’s also down to the emergence of an addressable alternative on TV.
Duracell… discovered that a large portion of the video ads it bought on the social network in the U.K. weren’t being viewed. [So] rather than try to focus on more expensive, but viewable impressions, Duracell decided to divert the money it had spent back to TV, a large portion of which went toward on-demand content.”
Fake-user scams seem to be popping up on Facebook on a regular basis. A very average American band called Threatin managed to amass 38,000 likes on its Facebook page, convincing a number of venues in the United Kingdom that it had an active fanbase and had toured extensively in the US. According to the Guardian, the stunt “has done nothing but fleece several UK venues out of money and time that would be far better spent on genuine artists.”
The scam was made possible by so-called “click-farms”, where you can buy likes and comments for your products. They are big business these days, and Facebook is powerless - or simply unwilling - to stop the onslaught of fake accounts. As one click-farm operator puts it, “Every system is made by humans, so there is always a way to beat it”.
Given that Facebook charges its customers based on impressions and clicks, not on actual conversions (people buying their products), a large number of fake users clicking on real, paid-for advertising is a big deal. If that trend continues for long enough, and conversion ratios remain low enough, I expect even more advertisers to follow Duracell back to TV, or, as the New York Times did, to abandon ad exchanges and behavioural targeting altogether.
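To make the arithmetic concrete, here’s a minimal sketch of how fake clicks inflate an advertiser’s effective cost per real conversion. All the numbers are hypothetical, invented purely for illustration:

```python
def cost_per_conversion(clicks, cost_per_click, fake_fraction, conversion_rate):
    """Effective cost per genuine conversion when a share of clicks is fake.

    The advertiser pays for ALL clicks, but only genuine clicks can convert.
    """
    spend = clicks * cost_per_click                 # billed on every click
    genuine_clicks = clicks * (1 - fake_fraction)   # fake clicks never convert
    conversions = genuine_clicks * conversion_rate
    return spend / conversions

# Hypothetical campaign: 10,000 clicks at $0.50 each, 2% of real clicks convert.
print(cost_per_conversion(10_000, 0.50, 0.0, 0.02))  # 25.0 - no fake clicks
print(cost_per_conversion(10_000, 0.50, 0.5, 0.02))  # 50.0 - half fake, cost doubles
```

In this toy example, a click-farm supplying half the clicks doubles the advertiser’s cost per customer, even though the invoice from the platform looks identical.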
Even if Facebook is telling the truth about the number of fake users on its platform, it needs to answer the question of why it isn’t delivering conversions for its customers, and to date it hasn’t been able to. Late last year a group of small advertisers even sued Facebook, claiming it overstated the amount of time users spent watching videos by up to 900%.
The only thing we know for sure is that Facebook lies, constantly. Whether it’s lying about video metrics, page post reach, or what it does with its users’ data, it has a solid track record of deception. So why would we expect it to behave any differently with click-farms and fake accounts?
Once the “best place to work in America”, Facebook has fallen to #7 in Glassdoor’s annual survey, the first time since 2015 its score has worsened. And it wasn’t because of things like compensation and benefits, on which it still ranks highly, but because of “continued scandals and the increased bureaucracy that comes with the maturing of any tech company”.
Companies are less eager to advertise on Facebook as it fails to deliver on its promises. Its stated rate of growth in monthly active users is dubious at best. Its own employees are increasingly looking elsewhere for employment. And its CEO is a raging lunatic. Knowing all of that, why would anyone buy Facebook?
What’s next for Facebook
In a Wall Street Journal op-ed last week titled “The Facts About Facebook”, CEO Mark Zuckerberg hit back at Facebook’s critics by stating:
“This [advertising] model can feel opaque, and we’re all distrustful of systems we don’t understand. Sometimes this means people assume we do things that we don’t do. For example, we don’t sell people’s data, even though it’s often reported that we do. In fact, selling people’s information to advertisers would be counter to our business interests, because it would reduce the unique value of our service to advertisers. We have a strong incentive to protect people’s information from being accessed by anyone else.”
The thing is, people are finally beginning to understand how Facebook works, and that’s Facebook’s problem: they don’t trust it precisely because they do understand how it works. Just because a CEO with a history of shady dealings, whom no one trusts, argues a technicality doesn’t mean it ain’t so.
Zuckerberg is right that it would be against Facebook’s interests to sell people’s data outright. But he’s being deceptive. Facebook collects and aggregates that data, then sells the insights generated from it to companies to help them optimise their adverts. Facebook’s users are still a commodity, and their data and privacy are treated as such, whether the data is directly “sold” or not.
So where does that leave Facebook? People are growing tired of its dated, invasive business model, but it doesn’t really have anything new on the horizon. I doubt very much that, knowing all of Facebook’s privacy issues, people are super keen to have a “Portal” to Facebook in their homes, a device Facebook describes as a “connected smart camera that lets you video chat with the ones you love”. Nice try, Mark.
The only other “innovation” on the cards is the amalgamation of Instagram, WhatsApp and Facebook Messenger. As most of you are probably aware, Facebook owns both Instagram and WhatsApp, but to date it hasn’t done much to integrate them into the main Facebook platform. That might be about to change, with Facebook announcing that it plans to do just that by early 2020. Somewhat surprisingly, its first step will be to add end-to-end encryption to Instagram messages and to encrypt Messenger by default (currently users have to opt in).
An act of desperation
Now colour me sceptical, but I’m not sure why Facebook would be doing this unless it’s going to backdoor the encryption. Yes it could rely solely on metadata analysis, but that’s not nearly as lucrative as being able to analyse entire messages. Maybe it’s a public relations exercise, along the lines of “see, encryption, we care about your privacy”, but Facebook wouldn’t do anything that might jeopardise its bottom line; it needs as much of your data as possible to improve its advertising algorithms.
A backdoor would be easy enough to accomplish, and users of its services wouldn’t even know. While WhatsApp’s encryption is built on the open-source Signal protocol, WhatsApp itself is closed-source, meaning there’s no way for users to verify whether Facebook retains a copy of their, or their contacts’, private keys.
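The underlying point is that the mathematics of key exchange never requires a private key to leave the device, so whether the client leaks it is purely a matter of trust in the software. Here’s a toy sketch of that idea using classic Diffie-Hellman. To be clear: this is not the Signal protocol, the parameters are deliberately tiny and insecure, and it exists only to illustrate the trust argument:

```python
# Toy Diffie-Hellman key exchange: the protocol can be mathematically sound
# while a closed-source client still betrays its user.
# Insecure toy parameters - real systems use X25519 or large MODP groups.
import secrets

P = 2**127 - 1  # a Mersenne prime, fine for illustration only
G = 3

def keypair():
    priv = secrets.randbelow(P - 2) + 1
    pub = pow(G, priv, P)  # only the public half ever needs to be sent
    return priv, pub

alice_priv, alice_pub = keypair()
bob_priv, bob_pub = keypair()

# Each side combines its OWN private key with the OTHER side's public key;
# both arrive at the same shared secret without private keys crossing the wire.
alice_secret = pow(bob_pub, alice_priv, P)
bob_secret = pow(alice_pub, bob_priv, P)
assert alice_secret == bob_secret

# The catch: nothing in the math stops a closed-source client from quietly
# uploading alice_priv to its own servers, defeating the encryption entirely.
```

Auditing the open-source protocol tells you nothing about that last comment; only auditing the client would, and WhatsApp’s client can’t be audited.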
Adding fuel to the speculative backdoor fire is the fact that the founders of both Instagram and WhatsApp exited Facebook in recent months, with the latter citing disagreements with Mark Zuckerberg and Chief Operating Officer Sheryl Sandberg over how to monetise the messaging service. Facebook paid $19 billion for WhatsApp in 2014, has yet to figure out how to monetise it in any significant way, and may never be able to. Unlike Facebook and Instagram, WhatsApp has secure substitutes, such as Signal and Wire.
Personally, I think it’s an act of desperation: a sign that Facebook has completely lost the plot in terms of what to do next. I’m leaning toward the theory that Facebook has reached “critical human”, with its reported active user growth mostly derived from fake accounts. With some creative tracking metrics, bringing WhatsApp and Instagram into the Facebook ecosystem might enable it to inflate its active user numbers by double-counting people across multiple platforms. As an added bonus, it can try to pull the wool over the eyes of its users and regulators by feigning concern for user privacy through end-to-end encryption (which may not be backdoored, but probably will be).
But now that the rot has begun, it will be hard to stop. People understand how Facebook works and are either spending less time on the platform or quitting altogether. How Mark Zuckerberg and Facebook react will be crucial in determining Facebook’s future: is it here to stay, or will it join MySpace and countless other once-glorious social network experiments on the internet’s scrap heap?
That's all for now. If you enjoyed this issue, feel free to share it via email.