Wednesday, 30 May 2012
I haven't posted here much lately, so here are a few details on a fun project I worked on recently.
I spent the three weeks before NVIDIA's GPU Technology Conference working on the graphics for the galaxy simulation demo above, which appeared during the keynote presentation. You can watch the video here. The (ambitious) goal was to achieve something that looked like these classic Hubble telescope images.
Much of the look of these images is due to bright starlight being scattered by the surrounding dust, so we spent a lot of time trying to get the dust to look right. For efficiency, the dust particles are simulated separately: they are affected by the stars' gravity, but don't interact with each other. The colour of the stars is mainly due to variation in temperature and age, but we took some artistic licence here.
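The dust update amounts to a one-way N-body step: each dust particle sums the gravitational pull of the stars, and nothing pulls back. Here is an illustrative Python sketch of that idea, not the demo's actual CUDA code; the function name, the softening parameter `eps`, and the tuple-based layout are all my own assumptions.

```python
import math

def accelerate_dust(dust_pos, star_pos, star_mass, G=1.0, eps=1e-2):
    """One-way gravity: dust feels the stars, stars don't feel the dust,
    and dust-dust forces are skipped entirely.
    eps is a softening length that avoids a singularity at zero range."""
    accels = []
    for dx, dy, dz in dust_pos:
        ax = ay = az = 0.0
        for (sx, sy, sz), m in zip(star_pos, star_mass):
            rx, ry, rz = sx - dx, sy - dy, sz - dz
            r2 = rx * rx + ry * ry + rz * rz + eps * eps
            inv_r3 = 1.0 / (r2 * math.sqrt(r2))
            ax += G * m * rx * inv_r3
            ay += G * m * ry * inv_r3
            az += G * m * rz * inv_r3
        accels.append((ax, ay, az))
    return accels
```

Skipping the dust-dust interactions turns an O((N+M)²) problem into O(N·M), which is what makes simulating far more dust particles than stars affordable.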
The rendering was done in OpenGL, with a variant of my favourite technique from the old CUDA smoke particles demo (originally due to Joe Kniss). The particles are sorted back-to-front and rendered in slices, first to an off-screen light buffer, and then to the screen (sampling the indirect lighting from the light buffer). The light buffer is blurred after each slice to simulate scattering. Obviously this only simulates light scattering towards the camera, but it isn't a bad approximation in practice. The dust particles are drawn larger and less opaque than the stars.
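The slice loop interleaves two passes over the same sorted particles. Below is a deliberately simplified 1-D Python sketch of the structure, not the demo's implementation: the real technique works on GPU image buffers, sorts along the half-angle between the view and light directions, and uses over-compositing, whereas this sketch assumes a backlit scene (so one back-to-front sort serves both the light pass and the screen pass), accumulates the screen additively, and uses names that are entirely hypothetical.

```python
def box_blur(buf, radius=1):
    """Widen the light footprint after each slice, a crude stand-in
    for the blur that approximates multiple scattering in the dust."""
    n = len(buf)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(buf[lo:hi]) / (hi - lo))
    return out

def render_slices(particles, num_slices, light_res):
    """particles: list of (depth, x_index, alpha) with x_index < light_res.
    Each slice is drawn to the screen using the current light-buffer
    transmittance, then stamped into the light buffer, which is blurred
    before the next slice. Returns per-x screen brightness."""
    light = [1.0] * light_res                 # transmittance seen from the light
    screen = [0.0] * light_res
    particles = sorted(particles, key=lambda p: -p[0])  # back-to-front
    per_slice = max(1, len(particles) // num_slices)
    for s in range(0, len(particles), per_slice):
        chunk = particles[s:s + per_slice]
        for _, x, a in chunk:                 # 1) draw slice, lit by the buffer
            screen[x] += a * light[x]
        for _, x, a in chunk:                 # 2) attenuate the light buffer
            light[x] *= (1.0 - a)
        light = box_blur(light)               # 3) blur ~ scattering per slice
    return screen
```

The key property the sketch preserves is that particles nearer the camera sample a light buffer that has already been attenuated and blurred by everything behind them, which is where the soft, dusty look comes from.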
I also added a lot of post-process glow (which makes everything look better), and a cheesy star filter (see below), which they made me remove in the end!
Anyway, most of the credit for the demo should go to Jeroen Bédorf and Evghenii Gaburov, who wrote the Bonsai simulation code, which you can read about here. Props also to Mark Harris (who did a lot of the optimization), and Stephen Jones, who did the CUDA dynamic parallelism implementation (which is pretty cool, by the way).
My biggest regret is not doing proper anti-aliasing for the stars that were smaller than a pixel. On a 60 x 20 foot screen, each pixel was about the size of a sugar cube, and you could see the stars crawling from pixel to pixel!
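One common remedy for sub-pixel points (not something the demo did) is to clamp the rendered point size to a full pixel and scale the brightness down by the lost area, so each star's total energy stays roughly constant instead of snapping from pixel to pixel. A hypothetical sketch:

```python
def star_point_size(radius_px, min_size=1.0):
    """Clamp a sub-pixel star to a minimum point size and return a
    brightness gain that compensates for the inflated area, keeping
    the star's total emitted energy approximately constant."""
    if radius_px >= min_size:
        return radius_px, 1.0
    gain = (radius_px / min_size) ** 2  # area ratio of true vs drawn size
    return min_size, gain
```

A half-pixel star would then be drawn at one pixel with a quarter of the brightness, which fades smoothly as it moves rather than popping between pixels.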
Friday, 26 August 2011
With the recent news about Steve Jobs leaving Apple, everybody seems to be posting their stories about him. Mine isn't that funny, but here goes.
One of my first projects at NVIDIA was working on a real-time version of Pixar's famous "Luxo Jr" short film. Shadow maps were a new feature on the GeForce 3 graphics card, and our marketing guys thought this would be a great way of showing them off (without any concept of how difficult it would be, obviously). The GeForce 3 was planned to launch with the new iMacs at MacWorld Tokyo 2001.
The crazy thing was that we developed the whole demo without any permission (or help) from Pixar whatsoever. As my boss liked to say, it's easier to ask for forgiveness than permission.
Our intern Eugene D'Eon (now at Weta), modeled the lamps and rotoscoped the entire animation frame-by-frame from a rip of the original DVD. I worked on the shadows and shading. We finished the demo with a few days to spare (those bendy lamp cords were a real pain to get right, let me tell you), and our execs showed it to Steve at Apple. He said it was pretty close to the original, talked to Pixar, and gave us permission to show it at the launch.
So our chief scientist David Kirk and I flew out to Tokyo for the launch. My job was to drive the demo on stage, which was somewhat nerve-wracking, as you might be able to tell from the video above. During the rehearsal one of Steve's assistants told us we might have to leave the room because he was getting upset that his slide remote control wasn't working, and we wouldn't want to see that. I got the impression this happened quite often!
Later, David and I had a brief chance to talk to Steve. From what I remember of the conversation, he mainly talked about how great the new OS X user interface was. In my youthful ignorance I jumped in and said it was pretty, but had they thought about using the graphics hardware to speed up the rendering? (It was pretty slow at the time.) He said he had talked to his best engineers about this, and they told him it was impossible; there was no way they could get the quality they needed using the GPU. Anyway, a year or so later OS X did introduce a GPU-accelerated UI with smoothly scaling icons and so on. Coincidence? I don't think so!
We only got to show that demo once, but it was worth it, I think.