Semantic technology: From sentiment to applications

In the last few months, sentiment has become the next big thing in enterprise content processing. Manning & Napier, an investment firm, funded a number of projects for its search and content processing system that could determine what computer scientists call "polarity" and what I call the positive or negative aspect of a document.

A human can read a document and make a comment like, "This customer is really annoyed at our warranty program," or "We need to get this letter over to marketing because our customer is raving about our new product." Computers, as it turns out, can do a reasonably good job of determining the sentiment of a document or processing a large number of documents and providing a report that says, "Sixty-three percent of the messages about our service are positive."
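Conceptually, the computation is straightforward: assign each document a polarity label, then roll the labels up into a summary percentage. The sketch below illustrates the idea with a tiny keyword lexicon; the word lists and sample messages are my own placeholders, not any vendor's model.

```python
# A minimal sketch of document-level polarity scoring and roll-up reporting.
# The keyword lexicon and sample messages are illustrative placeholders,
# not a production sentiment model.

POSITIVE = {"great", "love", "raving", "excellent", "happy"}
NEGATIVE = {"annoyed", "broken", "terrible", "refund", "angry"}

def polarity(text: str) -> str:
    """Classify a document as positive, negative, or neutral by keyword counts."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

def report(messages: list[str]) -> str:
    """Summarize a batch of messages, e.g. '67% of the messages ... are positive.'"""
    labels = [polarity(m) for m in messages]
    positive_share = 100 * labels.count("positive") / len(labels)
    return f"{positive_share:.0f}% of the messages about our service are positive."

if __name__ == "__main__":
    sample = [
        "I love the new product, the support team was excellent",
        "Still annoyed about the warranty program",
        "Customer is raving about the redesign",
    ]
    print(report(sample))
```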

Social media's impact

Sentiment analysis is one of the facets of text analytics that can discern the softer or intentional components of a report, an e-mail or other communication. What has boosted interest in tapping into the sentiment of a document? My view: social media and the promise of competitive advantage. The rise of Facebook and the proliferation of information from Twitter users have created awareness of social media across consumer and business markets. The competitive advantage angle has become increasingly important as examples of sentiment-centric content processing have diffused at conferences and in professional journals.

My research into this facet of content processing has identified the Southwest Airlines incident as one pivot point. You might remember that Kevin Smith, a Hollywood notable, was asked to get off a Southwest flight because he was too large for the seat. Smith used social media to call attention to Southwest's action. Within minutes of his leaving the aircraft, Southwest found itself behind the curve. Over the span of a day, the incident went viral. Not surprisingly, the airline apologized and set up a social media program. The story flitted across the national media, and the message was not lost on managers at other organizations.

Vendors of sentiment analysis technology and related monitoring and reporting services found that after that fateful day in February 2010, their products and services were of considerable interest to many companies.

One company has become a touchstone for me in measuring the interest in sentiment analysis. That firm is Attensity, originally positioned as a specialist firm with "deep extraction" technology. The founder of the company focused on processing text iteratively. With each cycle of his novel approach, the Attensity system would generate additional data about the text. In today's lingo, Attensity was generating metatags, connections and entities. The system would then perform a range of analytic functions.
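To make that pattern concrete, here is a minimal sketch of the general idea of iterative enrichment, with each pass adding entities, tags or connections to the document. The class names and extraction rules are illustrative assumptions, not Attensity's proprietary "deep extraction" pipeline.

```python
# A sketch of iterative enrichment: repeated passes over a document, each pass
# adding another layer of metadata (entities, tags, connections). The rules
# below are illustrative assumptions, not Attensity's actual pipeline.

import re
from dataclasses import dataclass, field

@dataclass
class Enriched:
    text: str
    entities: set[str] = field(default_factory=set)
    tags: set[str] = field(default_factory=set)
    connections: list[tuple[str, str]] = field(default_factory=list)

def extract_entities(doc: Enriched) -> None:
    # Pass 1: naive capitalized-phrase "entities".
    doc.entities.update(re.findall(r"\b[A-Z][a-z]+(?: [A-Z][a-z]+)*\b", doc.text))

def extract_tags(doc: Enriched) -> None:
    # Pass 2: tag the document using what earlier passes found.
    if "warranty" in doc.text.lower():
        doc.tags.add("customer-service")
    if doc.entities:
        doc.tags.add("named-entities-present")

def extract_connections(doc: Enriched) -> None:
    # Pass 3: connect entities that co-occur in the same sentence.
    for sentence in doc.text.split("."):
        found = [e for e in doc.entities if e in sentence]
        doc.connections.extend((a, b) for i, a in enumerate(found) for b in found[i + 1:])

def deep_extract(text: str) -> Enriched:
    doc = Enriched(text)
    for extraction_pass in (extract_entities, extract_tags, extract_connections):
        extraction_pass(doc)  # each cycle generates additional data about the text
    return doc

print(deep_extract("Kevin Smith complained to Southwest Airlines about the warranty."))
```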

Attensity's approach to the analysis of unstructured content was of interest to the U.S. government and certain agencies that performed intelligence and analytic functions for law enforcement and other federal activities. Attensity received an injection of investment money from In-Q-Tel in 2002.

The company, like other firms with technology of interest to In-Q-Tel, continued to win contracts in the government markets. However, Attensity wanted to grow more rapidly, so the company decided to expand its products and services for the commercial sector. Attensity merged with Empolis and Living-e to form Attensity Group and Attensity Europe GmbH. (Attensity also has a unit that focuses on the government market.)

Attensity is a privately held company, but the firm has been growing rapidly. Unlike some content processing companies, it has made the jump from government-centric vendor to commercial player successfully. As early as 2009, the company was trumpeting the payoff from the emerging interest in social media. At a time when many companies were mired in the financial doldrums, Attensity was revving its engines.

Credibility rankings

But sentiment may not be enough. According to Riza C. Berkan, founder and CEO of Hakia: "I think the next phase of search will have credibility rankings. For example, for medical searches, first you will see government results (FDA, National Institutes of Health, National Science Foundation), then commercial (WebMD), then some doctor in Southern California, and then user-contributed content. You give users such results with every search."
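The tiering Berkan describes could be approximated by sorting results on a credibility rank before relevance. The sketch below assumes hypothetical domain lists and result records; it is not Hakia's actual ranking logic.

```python
# A sketch of tiered "credibility ranking": government sources first, then
# established commercial sites, then individual professionals, then
# user-contributed content. Domain lists and records are illustrative.

CREDIBILITY_TIERS = {
    "government": ["fda.gov", "nih.gov", "nsf.gov"],
    "commercial": ["webmd.com"],
    "professional": ["drsmith-example.com"],          # e.g. an individual physician's site
    "user-contributed": ["healthforum-example.com"],
}

def credibility_rank(domain: str) -> int:
    """Lower rank means a more credible tier; unknown domains sort last."""
    for rank, domains in enumerate(CREDIBILITY_TIERS.values()):
        if domain in domains:
            return rank
    return len(CREDIBILITY_TIERS)

def rank_results(results: list[dict]) -> list[dict]:
    """Order results by credibility tier first, then by relevance score."""
    return sorted(results, key=lambda r: (credibility_rank(r["domain"]), -r["relevance"]))

results = [
    {"domain": "healthforum-example.com", "relevance": 0.95},
    {"domain": "webmd.com", "relevance": 0.90},
    {"domain": "nih.gov", "relevance": 0.80},
]
for r in rank_results(results):
    print(r["domain"])   # nih.gov, webmd.com, healthforum-example.com
```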

In early 2010, Hakia rolled out SENSEnews, a service that uses some of the methods of "sentiment" and applies them in a way that has struck a nerve with the financial services market. In the Semantic Technology Blog (http://priyankmohan.blogspot.com/2011/01/hakias-sensenews-can-they-they-really.html), Priyank Mohan described Hakia's semantic application as "a service to make buy-and-sell recommendations for any stock. SENSEnews reads news sources (more than 30,000 news sources), blogs (more than 1 million) and Twitter, and performs an advanced computation to make buy/sell recommendations to you."
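A pipeline of that general shape might score each collected item, average the scores per ticker and map the average to a recommendation. The toy scorer and thresholds below are my own assumptions, not the SENSEnews computation.

```python
# A minimal sketch of the kind of pipeline Mohan describes: collect items about
# a ticker from news, blogs and Twitter, score their sentiment, and turn the
# aggregate into a buy/sell/hold signal. Scorer and thresholds are illustrative.

from statistics import mean

def sentiment_score(text: str) -> float:
    """Toy scorer in [-1, 1]; a real system would use a trained semantic model."""
    words = text.lower().split()
    positives = sum(w in {"beats", "growth", "upgrade"} for w in words)
    negatives = sum(w in {"miss", "lawsuit", "downgrade"} for w in words)
    total = positives + negatives
    return (positives - negatives) / total if total else 0.0

def recommend(ticker: str, items: list[str]) -> str:
    """Aggregate item-level sentiment into a simple recommendation."""
    avg = mean(sentiment_score(t) for t in items) if items else 0.0
    if avg > 0.3:
        return f"{ticker}: BUY (avg sentiment {avg:+.2f})"
    if avg < -0.3:
        return f"{ticker}: SELL (avg sentiment {avg:+.2f})"
    return f"{ticker}: HOLD (avg sentiment {avg:+.2f})"

print(recommend("XYZ", [
    "XYZ beats earnings estimates, analysts issue upgrade",
    "Growth outlook strong for XYZ",
    "Minor lawsuit filed against XYZ supplier",
]))
```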

Hakia's engineers know that news about a company is not mathematical in nature. Hakia has looked at precursors like Monitor 110 (no longer in business), Relegence (acquired by AOL), Need to Know (acquired by Deutsche Börse, deutsche-boerse.com, in November 2009) and Reuters' NewsScope (now part of Thomson Reuters' Eikon, thomsonreuters.com/products_services/financial/eikon).

Hakia uses semantics in a way that makes sense to enterprise professionals and individual investors. In my opinion, SENSEnews is an application that makes "sense" to the user. Who does not want an edge when making an investment in a publicly traded stock?
