Is AI-Powered Sentiment Analysis in Media Monitoring a Sham?
With a good media monitoring platform, conversations on social media channels present a valuable listening opportunity for organizations.
Armed with the right information, organizations can engage in data-driven marketing programs.
Customers are rewarding brands, products and services, and getting even with them, by sharing their experiences on Yelp, Twitter, Charity Navigator and Facebook.
But with so many voices and so few ears, is offloading the analysis of those conversations to computers feasible or practical? Is sentiment analysis accurate enough to help digital marketing experts allocate budget? Big data programs bring both risks and rewards.
Toyota's recent decision to halt manufacturing for eight of its models underscores the difficulties and complexities involved with respect to surfacing meaningful business intelligence from customer data.
According to the National Highway Traffic Safety Administration, which conducted six separate investigations, there were no defects other than unsecured floor mats.
Despite numerous customer complaints filed with the regulator, the agency was unable to deduce a real problem, partly because the various complaints had been tagged with different keywords, preventing investigators from connecting the dots.
Social media training programs at Intel have long focused on leveraging customer data to inform the marketing message.
The business community is rife with vendors promising listening solutions that can gauge customer sentiment from social media conversations, automating the process of listening and data mining.
But in a world where CAPTCHA codes are required to discern people from computers, is it realistic for organizations to rely on computers to analyze the sentiment of these conversations online?
Rob Key of Converseon says we're still a good 10 years away from accurate sentiment analysis.
In his description of the seven layers of data analysis, Rob says human analysis is still key because computers struggle with sarcasm, neologisms, images, and implicit versus explicit meaning.
He went on to suggest that any sentiment analysis vendor promising 90% accuracy should be disqualified from the group of listening platforms you may be considering.
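To see why sarcasm in particular defeats automated sentiment analysis, consider a deliberately naive keyword-based scorer. This is a minimal sketch of a lexicon approach; the word lists and example text are hypothetical, not drawn from any vendor's product.

```python
# Naive lexicon-based sentiment scorer, illustrating the limitation
# Rob Key describes: the code only sees words, not intent.
POSITIVE = {"great", "love", "excellent", "fast"}
NEGATIVE = {"terrible", "hate", "slow", "broken"}

def naive_sentiment(text: str) -> str:
    """Score text by counting positive vs. negative keywords."""
    words = text.lower().replace(",", " ").replace(".", " ").split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

# A sarcastic complaint: a human reads this as clearly negative,
# but the lexicon only sees "great" and scores it positive.
print(naive_sentiment("Oh great, my brand-new phone died again"))  # positive
```

The misclassification above is exactly the gap that still requires human analysts, and it is why blanket accuracy claims deserve scrutiny.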
Public relations measurement specialist Katie Paine prescribes a listening method based on the concept of cost per message communicated (CPMC), a metric derived from outputs, outtakes and outcomes. But even those outputs must be ranked by people to assess positive versus neutral versus negative messages, a process that seems to me excessively error-prone.
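The arithmetic behind a cost-per-message metric can be sketched as follows. This is my own hypothetical reading of "cost per message communicated" (program cost divided by messages that carried the key message), not Katie Paine's published methodology, and the coded counts are invented for illustration.

```python
# Hypothetical sketch of a CPMC-style calculation.
def cpmc(program_cost: float, messages_communicated: int) -> float:
    """Cost per message communicated: total program cost divided by
    the number of captured messages carrying the intended key message."""
    if messages_communicated <= 0:
        raise ValueError("need at least one communicated message")
    return program_cost / messages_communicated

# Human coders first rank each captured message (the manual, error-prone
# step) and flag whether it carried the key message.
coded = [("positive", True), ("neutral", True), ("negative", False),
         ("positive", True), ("neutral", False)]
communicated = sum(1 for _, carries_key in coded if carries_key)
print(cpmc(15000.0, communicated))  # 5000.0
```

Note that the human coding step sits upstream of the arithmetic: the formula is trivial, but the denominator depends entirely on judgments people make about each message.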
While these models may be a significant improvement over measuring advertising equivalency, the notion of sentiment analysis has, perhaps, as many blind spots.
Mark Weiner says ad equivalency may be meaningless as an absolute, but as a relative measure of progress, it does have value for some organizations.
In my B2B digital marketing book, I address the issue of metrics and return on investment through listening, and sentiment analysis is one area I'm skeptical of.
B2B Digital Marketing Book Topics Covered
- How are practitioners determining return on investment?
- What are the strengths and weaknesses of sentiment analysis?
- Mapping objectives to measurement programs.
- How frequently should the program be tweaked?
- What are the best return on investment methodologies?
- How can organizations rely on technology to listen, particularly when the number of voices significantly outweighs the number of ears?
As I mentioned, if you're interested in B2B applications of social and digital media, it all starts with mainstream and social media monitoring.