The Tempest has been transported into the 21st century with a revolutionary new digital avatar created by Imaginarium Studios.
For the first time, the Royal Shakespeare Company worked alongside Intel on a new production, which incorporated a real-time avatar playing the sprite Ariel on stage in Shakespeare’s 400-year-old play.
This was done using performance capture technology, most commonly seen in gaming and film, which records the facial expressions and movements of an actor to create lifelike animations.
The company that created the extraordinary visuals was The Imaginarium, a digital studio based in London, set up in 2011 by producer Jonathan Cavendish and actor Andy Serkis, whose interest in animation was piqued during his time working on ‘The Lord of the Rings’.
The duo aimed to build a studio that could produce the highest-quality animation using motion capture technology for video games, television and film. Since then, the studio’s work has featured in major films such as ‘Rise of the Planet of the Apes’ (2011) and ‘Star Wars: The Force Awakens’ (2015), as well as the game ‘Battlefield 1’ (2016).
The new production of The Tempest aimed to unite today’s ground-breaking computing technology with the centuries-old tradition of theatre.
Intel's Tawny Schlieski revealed that the unlikely union came about after Gregory Doran, the Artistic Director at the RSC, saw a video of Intel’s giant digital whale that flew over the heads of the audience at the Consumer Electronics Show.
Doran wanted to capture the wonder of that spectacle in his interpretation of The Tempest.
Motion capture technology is usually applied in the post-production stages of films and television programmes, so the challenge for The Imaginarium lay in rendering the animation live to capture the “spontaneity of theatre”, an endeavour that demanded powerful Intel hardware.
Gregory Doran said that he was inspired by Shakespeare’s own innovations to create “a unique theatre experience, which marries [The RSC’s] distinctive theatre skills with cutting-edge technology”. The results of this collaboration were immersive and visually breathtaking.
Schlieski also hinted that Intel would be interested in taking part in further productions using this technology to “change the face of live events and make more amazing, immersive experiences possible.”