Electrical engineer Guy Griffiths heads up R&D at Animal Logic. An Academy Award winner, he is passionate about using science and technology to empower artists.
Trips to the local Radio Shack, rather than the cinema, helped set Guy Griffiths down his career path.
The long-time director of R&D at Animal Logic (AL) – the world-renowned, Sydney-headquartered visual effects, animation and production studio – had his heart set on chemistry at age 12.
But then something twigged and he latched onto electronics, spending countless hours at the electronics hobby store, reading its books and absorbing as much as he could for free.
“I immediately said, ‘That’s what I want to do,’” he told create of his epiphany.
“I just got fascinated by electronics, so I pretty much acquired as much knowledge as I could.”
His potted history takes him through school, onto the team that helped invent a key piece of digital filmmaking technology, and then to AL.
Following years of self-teaching through many visits to stores like Tandy and Radio Shack, spending pocket money on books for programming BASIC, and many afternoons staying back at school to use a shared Apple II until he was told to go home, Griffiths showed up at Swinburne University with a good understanding of digital logic.
In his final year he built a Macintosh clone, gained an internship at Kodak (his first project was building a CCD camera frame buffer) and was then offered a job in its metrology systems group.
He finally entered the film industry in 1990, joining Lindsay Arnold on the company’s Cineon Workstation project, which eventually earned the pair (and three others) a Scientific and Technical Academy Award in 2005.
“We were building essentially the first workstation image manipulation system, which we know these days as a compositing system,” he said.
Griffiths eventually joined Animal Logic at the start of the millennium (after six years in Hollywood as Director of Technology at a VFX studio and Co-CTO at a startup company) and since then has managed the development of the tools that enable artists to dazzle and delight us through our screens.
Despite Australia’s distance from Hollywood, it has an impressive reputation for VFX. Griffiths’ company has worked on movies including Alien: Covenant, The Great Gatsby and The Matrix.
Its R&D team often take off-the-shelf software programs from companies such as Autodesk and Pixar and “heavily, heavily extend them”, giving artists an edge.
By the time he came to AL, the company had developed a set of digital filmmaking innovations such as the pioneering effects software Eddie, and had been involved in VFX work, most notably for Babe: Pig in the City and The Matrix.
However, he has seen it grow “from small to big”, assisted by the success of its Oscar-winning animated feature in 2007.
“As we grew we were hired to do some pretty cool things on the back of Happy Feet, scale the company up,” he said. The company now employs around 500 staff in Sydney, 190 in Vancouver (where it set up shop in 2015) and a small team in LA focusing on film development.
Bricks by the bucket
Following work on the highly successful The Lego Movie, this year will see the release of The Lego Ninjago Movie, and The Lego Movie Sequel is scheduled for 2019.
“When we put the first trailer out [on YouTube] one of the most satisfying comments on there was basically, ‘Thank God! I thought it was going to be this horrible CG film. They’ve done stop motion. Of course they had to put the faces on as a post production process,’” Griffiths said.
Among the many challenges of animating the iconic bricks was achieving photoreal animation, along with the way the toy articulates and joins, its colours, and the idiosyncratic ‘look’ of Lego.
“The fact is everybody has an imprinted idea from childhood on what that Lego plastic looks like! There is nowhere to hide,” AL’s former Key Lighter Max Liani told Video and Filmmaker magazine afterwards.
“Traditional CG lighting tricks won’t work. Some approximate global illumination won’t cut it. The last thing we wanted was this movie to look like… well ‘CG’.”
The way light interacts with the bricks threw up hurdles for existing tools such as Pixar’s RenderMan, which struggled with the subsurface scattering (SSS) and glossy reflections and refractions of light on Lego. SSS is the way light interacts with a translucent object, entering and exiting at different points; a backlit ear is a familiar example.
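The idea behind SSS can be sketched with a toy photon random walk. The sketch below is purely illustrative (the extinction coefficient, albedo and slab thickness are made-up values, and production SSS models are far more sophisticated than this): photons enter a translucent slab, scatter a random number of times, and exit somewhere other than where they entered, which is exactly why a backlit ear glows.

```python
# Toy random-walk sketch of subsurface scattering (illustrative only):
# photons enter a translucent slab and scatter until they exit or are absorbed.
import math
import random

SIGMA_T = 1.0    # extinction coefficient: mean free path of 1 unit (made up)
ALBEDO = 0.9     # probability a scattering event is not absorption (made up)
THICKNESS = 2.0  # slab thickness in mean free paths (made up)

def walk_photon(rng):
    """Random-walk one photon; return (fate, lateral exit offset)."""
    x = y = z = 0.0
    dx, dy, dz = 0.0, 0.0, 1.0                     # enter travelling straight in
    while True:
        step = -math.log(rng.random()) / SIGMA_T   # sample free-flight distance
        x, y, z = x + dx * step, y + dy * step, z + dz * step
        lateral = math.hypot(x, y)
        if z < 0.0:
            return "reflect", lateral              # back out the front face
        if z > THICKNESS:
            return "transmit", lateral             # out the back face (backlit)
        if rng.random() > ALBEDO:
            return "absorb", lateral
        # isotropic scatter: pick a new uniform random direction
        while True:
            dx, dy, dz = (rng.gauss(0, 1) for _ in range(3))
            n = math.sqrt(dx * dx + dy * dy + dz * dz)
            if n > 1e-9:
                dx, dy, dz = dx / n, dy / n, dz / n
                break

rng = random.Random(7)
n = 20_000
counts = {"reflect": 0, "transmit": 0, "absorb": 0}
spread = []
for _ in range(n):
    fate, lateral = walk_photon(rng)
    counts[fate] += 1
    if fate != "absorb":
        spread.append(lateral)

print(counts)
print(f"mean exit offset: {sum(spread) / len(spread):.2f} mean free paths")
```

The average exit offset is on the order of a mean free path, so light visibly "bleeds" sideways through the material rather than bouncing straight off, which is the effect a plastic-brick shader has to capture.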
The company developed Glimpse (formerly an in-house previewing tool for lighting leads) into a full production renderer. The Lego Batman Movie was rendered entirely using Glimpse.
“Over the duration of the project we started adding more capability to Glimpse,” Griffiths said of The Lego Movie.
“It wasn’t a primary renderer, it was a ray accelerator. In fact, the team out here, we were still two weeks away from ending the film, adding capability to it, for Glimpse to speed up certain things.”
Asked about the main differences between the first Lego feature and The Lego Batman Movie, he explains in terms of scale – “there’s a bucket load more bricks” – though the two aren’t easily comparable.
“If you look at Batman’s cave – we wanted to be able to go the size of this massive cave. It’s got a space shuttle in it, it’s got every toy you can imagine. It’s like ‘holy crap, that’s got to be big.’ In the first film we actually were limited in terms of how many bricks we could get, or joints we could get in there.”
Gotham City consists of 41,605 models made up of 220,831,071 bricks. If built from real bricks it would measure 182 m long × 193 m wide × 19 m high!
Global impact
As mentioned earlier, Griffiths was a key part of Kodak’s Cineon project. As Platform Engineering Group Leader, he primarily worked on the platform, the hardware, and much of the image processing software.
“In the day there were Intel 386 processors inside most PCs, probably with 640 Kbytes of RAM – they were tiny! We were trying to process these 4K images: one image was 48 megabytes,” he said.
“Our system had 120 processors arranged in a 10×12 mesh with 480 Mbytes of RAM in a big box with 40 gigs of disk, which in those days was a lot of gigs, a lot of megabytes.”
Developed at Coburg, the workstation was manufactured from 1993 to 1997. According to Australia’s Greatest Inventions (2010), after a subsequent product gained an Oscar nomination, the Academy later “recognised that Cineon was the breakthrough technology behind the product.”
Other Australian film innovations include the first feature-length movie, The Story of The Kelly Gang (1906) and more recently cineSync, a review and approval tool developed by Rising Sun Research (now Cospective), which won a 2011 Technical Achievement Award.
“RSP’s CineSync has proven to be a game-changer in the VFX industry, particularly for companies that are geographically distant from Hollywood and London,” Dr Josh McCarthy, a senior lecturer in Media Arts at University of South Australia, said.
Making things with light
One rendering technique used in The Lego Movie franchise and other movies is ray tracing, which is underpinned by physics, maths and computer engineering know-how.
When a ray of light hits a surface it can be absorbed, reflected, refracted or fluoresced. The path of this light, which might continue to bounce off and be affected by other surfaces, can be expressed mathematically and computed.
A picture is built from billions of rays, and the paths they travel can be computed, at considerable expense, using Monte Carlo sampling. The private cloud at AL uses “north of 30,000 cores”.
“If you’ve got a certain side of a city, there’s a lot of places that ray can bounce, right? From one side of the city to the other and back,” explains Griffiths.
“The techniques we use these days are heavily physics-based. That’s the starting point.”
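The Monte Carlo principle Griffiths describes can be shown in miniature. The following Python sketch is purely illustrative (it has nothing to do with Animal Logic's actual Glimpse code): it fires uniformly random rays from a point and counts hits against a sphere, so the hit fraction converges on the sphere's solid angle in just the way a path tracer's random rays converge on the light arriving at a pixel.

```python
# Toy Monte Carlo ray-sampling sketch (illustrative only, not a production
# renderer): estimate a sphere's solid angle by firing random rays at it.
import math
import random

def hits_sphere(origin, direction, centre, radius):
    """Standard ray-sphere intersection test (quadratic discriminant)."""
    ox, oy, oz = (origin[k] - centre[k] for k in range(3))
    b = 2.0 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c          # a = 1 for a unit-length direction
    if disc < 0.0:
        return False                 # the ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t > 0.0                   # hit must lie in front of the origin

def random_direction(rng):
    """Uniform direction on the unit sphere via normalised Gaussians."""
    while True:
        d = (rng.gauss(0, 1), rng.gauss(0, 1), rng.gauss(0, 1))
        n = math.sqrt(sum(x * x for x in d))
        if n > 1e-9:
            return tuple(x / n for x in d)

rng = random.Random(42)
samples = 200_000
hits = sum(hits_sphere((0, 0, 0), random_direction(rng), (0, 0, 2), 1.0)
           for _ in range(samples))
estimate = hits / samples

# Analytic answer: the fraction of all directions subtended by a sphere of
# radius r at distance d is (1 - sqrt(1 - (r/d)^2)) / 2; here r = 1, d = 2.
exact = (1 - math.sqrt(1 - 0.25)) / 2
print(f"Monte Carlo estimate: {estimate:.4f} (exact: {exact:.4f})")
```

The estimate's error shrinks only with the square root of the sample count, which is why production-scale Monte Carlo rendering chews through tens of thousands of cores.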
Good effects need to be plausible, and plausibility demands a rich understanding of physics and mathematics. The pursuit can be summed up as ‘plausible though at times heavily stylised reality’, achieved through huge amounts of geometric description. For example, Batman’s cape contains about 93,000 curves to describe its weave.
“We had systems to describe all that weave in terms of thread going interlocking, and systems and the mathematical descriptions of that, and controllability,” adds Griffiths.
Another example that comes up is simulating water effects, specifically ‘Lego water’. This is grounded in computational fluid dynamics, governed by the Navier-Stokes equations.
“You can do the old volume approach, dice it up into little volumes and deal with pressure solves and the volume solves, or you can do it as particles,” explains Griffiths.
“It’s probably a little bit more straightforward than you would imagine to actually get to that point. That would have been solved with that sort of computational fluid dynamics, which we do a lot of, by the way. It’s not, like, attributable to Lego. We do it all the time for water.”
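The "dice it up into little volumes and deal with pressure solves" approach Griffiths mentions can be sketched as follows. This is a toy example (the grid size, test velocity field and Jacobi solver are all made up for illustration, not Animal Logic's solver): it enforces the incompressibility condition from the Navier-Stokes equations, div u = 0, by solving a Poisson equation for pressure on a small grid and subtracting the pressure gradient from the velocity.

```python
# Toy 2D pressure projection on a small grid (illustrative only): make a
# divergent velocity field approximately divergence-free, as a grid-based
# fluid solver does each time step.
import math

N = 16        # grid resolution (N x N cells), unit spacing (made up)
ITERS = 200   # Jacobi iterations for the pressure solve (made up)

def divergence(u, v):
    """Central-difference divergence of a collocated velocity field."""
    d = [[0.0] * N for _ in range(N)]
    for i in range(1, N - 1):
        for j in range(1, N - 1):
            d[i][j] = (0.5 * (u[i + 1][j] - u[i - 1][j])
                       + 0.5 * (v[i][j + 1] - v[i][j - 1]))
    return d

def project(u, v):
    """Jacobi pressure solve (laplacian p = div u), then subtract grad p."""
    div = divergence(u, v)
    p = [[0.0] * N for _ in range(N)]
    for _ in range(ITERS):
        new_p = [[0.0] * N for _ in range(N)]
        for i in range(1, N - 1):
            for j in range(1, N - 1):
                new_p[i][j] = 0.25 * (p[i + 1][j] + p[i - 1][j]
                                      + p[i][j + 1] + p[i][j - 1] - div[i][j])
        p = new_p
    for i in range(1, N - 1):
        for j in range(1, N - 1):
            u[i][j] -= 0.5 * (p[i + 1][j] - p[i - 1][j])
            v[i][j] -= 0.5 * (p[i][j + 1] - p[i][j - 1])
    return u, v

# A toy velocity field: a Gaussian "source" pushing fluid outwards.
c = (N - 1) / 2
u = [[(i - c) * math.exp(-((i - c) ** 2 + (j - c) ** 2) / 9.0)
      for j in range(N)] for i in range(N)]
v = [[(j - c) * math.exp(-((i - c) ** 2 + (j - c) ** 2) / 9.0)
      for j in range(N)] for i in range(N)]

before = max(abs(x) for row in divergence(u, v) for x in row)
u, v = project(u, v)
after = max(abs(x) for row in divergence(u, v) for x in row)
print(f"max |divergence| before: {before:.3f}, after: {after:.3f}")
```

The particle alternative Griffiths mentions swaps the fixed grid for moving particles that carry the fluid's mass and velocity, but the underlying pressure physics being solved is the same.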
This brings us to the type of people required to develop such tools.
Griffiths says there’s a mix of physics, maths and computational skills among an R&D group of roughly 40. Developers need to bake detailed mathematical descriptions of the physical world into systems that artists can then manipulate.
“It’s good to see the mix between art and the engineering and maths behind it,” he says.
“You’ve got the maths, you’ve got the art, but the engineering part is joining that up. Building a systematic release of the science into the hands of the artists.”