How algorithms shape our world

Author: Kian Dixon
Posting date: 2/7/2013 4:14 PM

Kevin Slavin argues that we're living in a world designed for -- and increasingly controlled by -- algorithms. In this riveting talk from TEDGlobal, he shows how these complex computer programs determine espionage tactics, stock prices, movie scripts, and architecture. And he warns that we are writing code we can't understand, with implications we can't control.

Watch this 15-minute video (which has divided opinion, judging by the feedback) to learn more...

Related blog & news

With over 10 years' experience working solely in the Data & Analytics sector, our consultants are able to offer detailed insights into the industry.

Visit our Blogs & News portal or check out the related posts below.

Harnham's Brush with Fame

Harnham have partnered with The Charter School North Dulwich as corporate sponsors of their ‘Secret Charter’ event. The event sees the south London state school selling over 500 postcard-sized original pieces of art to raise funds for its Art, Drama and Music departments. Conceived by local parent Laura Stephens, the original concept was to auction art from both pupils and contributing parents. Whilst designs from 30 of the school's best art students remain, the scope of contributors has rapidly expanded and now includes the work of local artists alongside celebrated greats including Tracey Emin, Sir Antony Gormley, Julian Opie, and Gary Hume. In addition to famous artists, several well-known names have contributed their own designs, including James Corden, David Mitchell, Miranda Hart, Jo Brand, Jeremy Corbyn, and Hugh Grant.

The event itself, sponsored by Harnham and others, will be hosted by James Nesbitt and will take place at Dulwich Picture Gallery on 15th October 2018. You can find out how to purchase a postcard and more information about the event here.

Breaking Code: How Programmers and AI are Shaping the Internet of Tomorrow

Data. It’s what we do. But before the data is read and analysed, before the engineers lay the foundations of infrastructure, it is the programmers who create the code: the building blocks upon which our tomorrow is built. And once a year, we celebrate the wizards behind the curtain.

In a nod to 8-bit systems, Programmers’ Day falls on the 256th day of the year, 256 being the number of values an eight-bit byte can represent. Innovators from around the world gather to share knowledge with leading experts from a variety of disciplines, such as privacy and trust, artificial intelligence, and discovery and identification. Together they discuss the internet of tomorrow.

The Next Generation of Internet

Under the Next Generation Internet (NGI) initiative, users are empowered to make choices about how their data is controlled and used. Fields from artificially intelligent agents to distributed ledger technologies support highly secure, transparent, and resilient internet infrastructures. Businesses can decide how best to evaluate their data through the use of social models, high accessibility, and language transparency. Seamless interaction with an individual’s environment, regardless of age or physical condition, will drive the next generation of the internet.

But, like all things which progress practically at the speed of light, there is an element of ‘buyer beware’, or in this case, of ‘coder to user beware’.

Caveat Emptor, or rather, Caveat Coder

The understanding, creation, and use of algorithms have revolutionised technology in ways we couldn’t possibly have imagined a few decades ago. Digital and Quantitative Analysts aim, with enough data, to predict some action or outcome. However, as algorithms learn, unpredictable code can have severe consequences.

We create technology to improve our quality of life and to make our tasks more efficient. Through our efforts, we’ve made great strides in medicine, transportation, the sciences, and communication. But what happens when the algorithms on which the technology runs surpass the humans at the helm? What happens when they build upon themselves faster than we can teach them, or faster than we can predict the near-infinite range of outcomes? Predictive analytics can become useless or, worse, dangerous. (A toy illustration of this kind of self-reinforcing loop is sketched at the end of this post.)

Balance is Key

Electro-mechanical systems that we could test and verify before implementation are a thing of the past, and Machine Learning now moves front and centre. Unfortunately, without the ability to test algorithms exhaustively, we must walk a tightrope of test and hope. Faith in such systems is a fine balance between Machine Learning and the idea that it is possible to update or rewrite a host of programs, essentially ‘teaching’ the machine how to correct itself. But who is ultimately responsible? These and other questions may balance out in the long run, but until then, basic laws regarding intention and negligence will need to be rethought.

Searching for a Solution

In every evolution there are growing pains, but there are also solutions. In the world of tech, it’s important to put the health of society first and profit second, a fine balancing act in itself. Though complete solutions remain elusive, there are precautions technology companies can take. One is to hold them responsible for the actions of their products, whether that means catching lines of rogue code or keeping a close eye on the tangled mass of ‘spaghetti’ code that can endanger us or our environment.
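As a footnote to the ‘Caveat Coder’ point above, here is a minimal Python sketch, purely illustrative and not taken from the original post (every name and number in it is invented), of how a self-reinforcing loop can drift: a toy model that keeps updating on its own predictions rather than on fresh, ground-truth data compounds a small bias into a large error, even though the real value never moves.

    # Toy sketch of a self-reinforcing learning loop (illustrative only).
    # A hypothetical price model is repeatedly "corrected" against its own
    # predictions instead of fresh real-world data, so a small bias compounds.

    true_price = 100.0      # the real-world value the model should track
    estimate = 100.0        # the model's current belief
    learning_rate = 0.5     # how strongly each update shifts that belief

    history = []
    for step in range(10):
        # The model predicts the next value from its own belief, with a small
        # systematic bias (think of an over-optimistic trading rule).
        prediction = estimate * 1.05

        # Crucially, the update uses the model's own prediction as the target,
        # not true_price: this is the feedback loop described above.
        estimate += learning_rate * (prediction - estimate)
        history.append(round(estimate, 2))

    print("true value stays at:", true_price)
    print("model estimate drifts:", history)

Each individual update looks harmless (a 2.5% nudge), yet after a handful of steps the estimate has left the true value well behind; the fault lies not in any single line of code but in the loop itself, which is exactly why exhaustive testing is so hard to apply.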
Want to weigh in on the debate and learn how you can help shape the internet of tomorrow? If you’re interested in Big Data and Analytics, we may have a role for you. Check out our current vacancies. To learn more, contact our UK team at +44 20 8408 6070 or email us at info@harnham.com.