Over recent years, researchers at the University of Bath have been developing motion capture – the technology that was most notably used in The Lord of the Rings trilogy to create Gollum’s distinctive movements. We look back at the time Simon Horsford spoke to the team about one of their projects, which required a little help from some local canine friends.
In a small room festooned with cameras, dark blinds shutting out the autumn light, University of Bath researchers have been monitoring the movement of dogs, opera singers and actors from the Bristol Old Vic Theatre School. It’s all about motion capture – digitally recording the movements of people and animals – and applying their research to everything from films and video games to elite sports performers and rehabilitation technology.
Maggie’s movements are picked up by the cameras
The team is part of Camera – the nifty acronym for the Centre for Analysis of Motion, Entertainment Research & Applications – and its most recent project certainly captured the imagination. Researchers have been developing a new procedure that will use the movements of a two-legged human actor to drive a four-legged animal character, so as to make animal animations more realistic on film and in video games.
Martin Parsons, head of studio at Camera, says: “At the moment for animal animations, actors have to walk around on all fours and the software changes them into an animal. [Planet of the Apes, for instance, used motion capture techniques to transform the characters]. If it’s close to a human like a gorilla and bipedal then you can do something with that, but if the way the animal moves is fundamentally different, such as a four-legged animal, you need something a lot more sophisticated and that’s what we are trying to do.”
Fred’s movements are captured on the computer as he moves around the studio
Camera began life three years ago with Parsons being recruited by Professor Darren Cosker from the university, his role funded by the Engineering and Physical Sciences Research Council and the Arts and Humanities Research Council. The animal project came out of a bid related to Andy Serkis’ company, The Imaginarium, which was making a new version of George Orwell’s Animal Farm. Serkis, who played Caesar in the Planet of the Apes films and Gollum in The Lord of the Rings, “is a big fan of motion capture,” says Parsons, “and wanted to make Animal Farm using actors appearing as the animals. So what we had to do was take the human motion and convert it into animal motion. The way we do that is to take lots of animal data and teach the computer about how animals move and then feed in the human motion. It works a bit like a puppeteer with the actor using their body to drive the animal avatar.”
The project began by studying pigs from a nearby farm at Newton St Loe – the pigs were too big and uncooperative for the studio – and tracking their movements using reflective markers on the animals and special cameras. “We also used actors from the Bristol Old Vic Theatre School as we wanted to know how actors would perform as pigs, so that we had a better idea of what kind of human motion data we would be trying to convert.” Although that version of Animal Farm wasn’t greenlit (the film has now been taken up by Netflix), Camera continued its studies by moving on to dogs, using nine breeds from the nearby Bath Cats and Dogs Home. “We felt the easiest way forward for capturing quadrupedal motion in the studio was with a dog and we started with mine, Maggie, and then we thought of the Cats and Dogs Home,” says Parsons.
Buckbeak the hippogriff first appeared in Harry Potter and the Prisoner of Azkaban
“Humans are much more cooperative wearing markers, but the dogs tried to bite them off, so we designed special coats with reflective markers attached to them, denoting the hinge and shaft of the bones in their bodies. Infrared light bounces off the markers and is picked up by the cameras. We captured their movement with a standard agility set, using ramps, jumps and poles to weave through.” Apparently, the dogs enjoyed their time in the limelight and were rewarded with numerous treats!
Parsons and his team are still collecting data and processing it in their efforts to make a virtual dog. “There wasn’t much data before,” says Parsons, “and that is interesting in itself. It’s all about taking real world motion and making something that a computer can understand and also finding easier ways of doing that.”
Camera, which liaises with other departments at the University of Bath, such as health and psychology, is also involved in sports research. One example is using motion tracking technology to measure step length and foot contact time in runners. As one of the researchers, Murray Evans, puts it: “Someone like Usain Bolt is really tall and may take three fewer steps than everyone else during a race, but would he run faster if he did more steps?” However, it can be hard to monitor high-performance athletes, as the margins are so fine and they will apparently run differently if they know they are being watched and monitored.
Grawp the giant in Harry Potter with Emma Watson as Hermione Granger
Similarly, Camera researchers have been assessing the impact of tackles in rugby. In rehabilitation too, visual computing technology has been employed to study rugby players with spinal injuries and, more widely, in the fitting of artificial limbs and the monitoring of physical activity in amputees.
The beauty of this research and technology, says Parsons, is that “there are lots of different avenues where you can use it and Camera is about exploring these areas.”
Parsons’ background is in visual effects. A trained artist with a fine arts degree, he worked in Bristol on BBC Natural History projects before moving to London to work with visual effects companies, creating spiders and dragons for the Harry Potter films and picking up an Emmy award for the 1999 TV movie of Alice in Wonderland.
You understand Parsons’ enthusiasm when you realise where the technology can be used. For instance, the team is talking to psychologists about using virtual reality to create safe environments for people. Camera has also been working with the innovative chef Heston Blumenthal, using a VR headset to re-imagine the way we interact with food – giving familiar foods a different form and texture. “You might have something like a piece of chocolate or a protein ball, but it looks like an ice cube.”
Maggie in the Camera studio
Camera’s commercial enterprises have also seen it work with a company to use virtual reality to introduce people to opera. “We got a singer from the Welsh National Opera in the studio and fitted her with marker dots – I used a cut-up football net to recreate her flowing sleeves – and she mimed the arias,” adds Parsons. The result saw visitors enter a specially designed shipping container and don a VR headset to be immersed in scenes from Madam Butterfly and The Magic Flute.
In all, there’s a lot going on behind those darkened blinds on the University of Bath’s campus, and it could be coming your way soon. Don’t worry, though: we are still a long way off from the virtual environments imagined in sci-fi dramas and films like Westworld or Total Recall.