Analytics tool that recognises sarcasm

Posting date: 2/18/2014 7:55 AM

French company Spotter has developed an analytics tool that claims to be able to identify sarcastic comments posted online.

Spotter says its clients include the Home Office, EU Commission and Dubai Courts.

The algorithm-based analytics software generates reputation reports based on social and traditional media material.

However, some experts say such tools are often inadequate because of the nuances of language.

A spokeswoman for the Home Office said she could not comment at this time.

Spotter's UK sales director Richard May said the company monitored material that was "publicly available".

Its proprietary software uses a combination of linguistics, semantics and heuristics to create algorithms that generate reports about online reputation. The company says it can identify sentiment with up to 80% accuracy.
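Spotter has not published its algorithms, so the following is only a minimal sketch of the general idea behind lexicon-based sentiment scoring, the simplest form of the rule-based approach described above. The word lists and function names are invented for the example.

```python
# Illustrative only: a crude lexicon-based sentiment scorer.
# Real tools use far larger lexicons plus linguistic rules.
POSITIVE = {"thanks", "great", "love", "excellent"}
NEGATIVE = {"late", "delayed", "terrible", "awful"}

def sentiment_score(text: str) -> int:
    """Return +1 per positive word and -1 per negative word found."""
    words = text.lower().replace(",", " ").replace(".", " ").split()
    return sum((w in POSITIVE) - (w in NEGATIVE) for w in words)

print(sentiment_score("Great service, thanks Air France"))  # 2
print(sentiment_score("The flight was delayed"))            # -1
```

A scorer this naive is exactly what fails on sarcasm, which is why heuristics layered on top of the raw word counts matter.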

The company says these reports can also be verified by human analysts if the client wishes.

Algorithms had been developed to reflect various tones in 29 different languages including Chinese, Russian and Arabic, said Mr May.

"Nothing is fool-proof - we are talking about automated systems," he told the BBC.

"But five years ago you couldn't get this level of accuracy - we were at the 50% mark."

Mr May added that one of the most common subjects for sarcasm was bad service, such as delayed journeys.

"One of our clients is Air France. If someone has a delayed flight, they will tweet, 'Thanks Air France for getting us into London two hours late' - obviously they are not actually thanking them," he said.

"We also have to be very specific to specific industries. The word 'virus' is usually negative. But if you're talking about virus in the context of the medical industry, it might not be."
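The two quotes above describe heuristics that can be sketched in miniature: flag possible sarcasm when praise vocabulary co-occurs with a complaint cue, and let a per-industry lexicon override a word's default polarity. This is an illustrative assumption about how such rules might look, not Spotter's actual implementation; all word lists are invented.

```python
# Illustrative sketch of two heuristics mentioned in the article.
PRAISE = {"thanks", "thank", "great", "love"}
COMPLAINT_CUES = {"late", "delayed", "cancelled", "lost"}

GENERIC_NEGATIVE = {"virus", "crash", "failure"}
# Words that lose their negative polarity in a given industry context.
DOMAIN_NEUTRAL = {"medical": {"virus"}}

def looks_sarcastic(text: str) -> bool:
    """Crude cue: praise vocabulary alongside a complaint trigger."""
    words = set(text.lower().split())
    return bool(words & PRAISE) and bool(words & COMPLAINT_CUES)

def is_negative(word: str, domain: str = "") -> bool:
    """Polarity lookup with a per-industry override."""
    if word in DOMAIN_NEUTRAL.get(domain, set()):
        return False
    return word in GENERIC_NEGATIVE

print(looks_sarcastic("Thanks Air France for getting us into London two hours late"))  # True
print(is_negative("virus"))                    # True
print(is_negative("virus", domain="medical"))  # False
```

The Air France tweet trips the sarcasm cue because "thanks" and "late" appear together, while "virus" flips from negative to neutral once the medical domain is supplied.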

Spotter charged a minimum of £1,000 per month for its services, Mr May said.

Human effort

Simon Collister, who lectures in PR and social media at the London College of Communication, told the BBC there was "no magic bullet" when it came to analytics that recognise tone.

"These tools are often next to useless - in terms of understanding tone, sarcasm, it's so dependent on context and human languages," he said.

"It's social media and what makes it interesting and fascinating is the social side - machines just can't comprehend that side of things in my opinion."

Mr Collister added that human interpretation was still vital.

"The challenge that governments and businesses have is whether to rely on automated tools that are not that effective or to engage a huge amount of human effort."

