Cambridge Analytica Slammed for Psychographic Advertising
Now that we know Cambridge Analytica used psychographic advertising for political gain, is the revenue model for Google, Facebook, Amazon and Twitter a violation of personal privacy?
Should advertisers and developers be allowed to target advertising based on psychographic profiles?
That’s the question everyone’s asking, despite the fact we willingly surrender any reasonable expectation of privacy every time we share our personal opinions online.
Social networks have been segmenting users based on psychographics for over a decade now.
We should all know by now that if the media we consume is free, we’re the product.
Psychographic vs Demographic Advertising
Mass media pays the bills by selling ads based on demographics.
And online media pays the bills by selling ads based on psychographics.
Facebook is and always has been a psychographic advertising platform.
And while the richest profile and social graph data a marketer can analyze these days may come from Facebook — since they can pair likes, comments and shares with extended user profile info — digital marketing pros know they’re not the only game in town.
YouTube, Twitter and LinkedIn are psychographic advertising platforms as well.
Even Amazon mines profile data to recommend stuff to buy.
That’s what big data marketing analytics are all about, and they’re not going away.
As grandma used to say, a leopard can't change its spots.
Cambridge Analytica got hit first, waking us all up to the concept of psychographic profiling.
But what they did, albeit a tad more sophisticated, is what Amazon does to serve product recommendations and what Facebook, Twitter and Google do to sell ads.
Big data is data about you that's been analyzed, interpreted and segmented to try to get you to take some sort of action.
Social networks sell ads based on user profiles they surveil. Amazon and Netflix make recommendations based on user data they surveil.
Social customer relationship management (or consumer surveillance) services and CIAM platforms are built to improve customer loyalty by assembling rich customer profiles, amassing psychographic segmentation variables from multiple sources in a single database to drive better marketing.
When you use a loyalty rewards card to save money at the check out counter, you’re selling your purchasing history to someone, somewhere, who can sell it to someone else, to build a consumer profile about you.
From the consumer's standpoint, what you say and buy in digital environments leaves a record behind, and managing those records has become a form of online reputation management.
Imagine how well you could predict someone's behavior, particularly someone unaware of what psychographic profiling is all about, if you could combine their Facebook, Twitter, Google, Amazon, Yelp and shopping data into one big digital dossier.
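To make that concrete, here's a minimal Python sketch of how per-source records could be folded into a single dossier per user. All names and data are hypothetical; real data brokers first resolve identities across sources, which is harder than the merge itself.

```python
from collections import defaultdict

# Hypothetical per-source records, keyed by a shared identifier (e.g. an email).
sources = {
    "facebook": {"jane@example.com": {"likes": ["hiking", "politics"]}},
    "amazon":   {"jane@example.com": {"purchases": ["tent", "boots"]}},
    "loyalty":  {"jane@example.com": {"groceries": ["organic", "gluten-free"]}},
}

def build_dossiers(sources):
    """Fold every source's records into one combined profile per user."""
    dossiers = defaultdict(dict)
    for source_name, records in sources.items():
        for user_id, attributes in records.items():
            dossiers[user_id][source_name] = attributes
    return dict(dossiers)

profiles = build_dossiers(sources)
# profiles["jane@example.com"] now holds likes, purchases and grocery habits
# side by side: raw material for a psychographic segment.
```

The point isn't the code, which is trivial; it's that once identifiers line up, combining surveillance streams is a dozen lines of work.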
CVS recently bought Aetna, which means your purchase history data is likely to be used, at some point, to determine your insurance premiums.
Think about that next time you use social sign-in options on a third-party website.
Without critical thinking skills, ease of use comes at a price, and in this case, it's handing a third party the ability to download your profile information into a database.
So if you're surprised about the Cambridge Analytica "scandal", you shouldn't be. And if you think there's any way Zuck can protect you, you're dreaming.
When you share information on social networks, you’re giving up your right to privacy, because it’s unreasonable to think that after sharing something publicly on Facebook or Twitter, you can still keep it private.
You’ve forfeited any reasonable expectation of privacy.
Having sex at home is private, but having sex in a glass elevator is not.
Social networks are glass elevators.
It’s incredibly naive to be shocked, particularly at this point in the game, that advertisers are using profile information for psychographic profiling.
Regulation isn't the answer either. It would be nothing more than a congressional witch hunt, which I doubt would yield meaningful consumer protections.
We already take ads on TV with a grain of salt.
We need to learn to take silly Facebook ads that shoot to the top of our newsfeeds (because they've been liked, commented on and shared by our connections) with the same healthy skepticism.
Psychographic factors don’t have to drive consumer behavior. We can circumvent their effectiveness through education.
After all, it’s just an ad. It can’t make up our mind for us.
Government regulations can’t protect us from ignorance. We need sharper critical thinking skills. And we need smart tools to help us predict outcomes.
Government regulators have been unable to adapt their rules to the digital media environment.
Want to make a difference? Mentor a kid.
Teach someone fairmindedness and intellectual integrity. Show them how to evaluate arguments from many points of view.
When questionable psychographic advertising tactics provoke international hysteria, it’s time to invest in the intellectual evolution of our species.