Tuesday, March 29, 2011

720p video of Unbiased Truck Soccer: sunny sky with caustics!

HD video: http://www.youtube.com/watch?v=ymo57ElhHvY
Rendered on my 8600M GT, which is still doing a pretty good job despite its age :) The caustics from the glass sphere are not reflected in the mirror ball because the max path length is set to 3 for performance reasons. Setting the max path length to 4 will show the reflection of the caustic light pattern.

Download executable: http://code.google.com/p/tokap-the-once-known-as-pong/downloads/list



Increasing the emission values (RGB) of the "sun" to 4, 4, 2 makes the caustics more obvious:

Monday, March 28, 2011

Unbiased Truck Soccer: Sunny sky with only a few traces of rayn!

Another test in the quest for faster convergence speed, this time using a skydome and sun.








One of the advantages of using a skydome to light the scene is that the difference between 2 and 3 bounces of indirect light is not as large as when using an area light:

max path length 1 (zero bounces)


max path length 2 (1 bounce)


max path length 3 (2 bounces)


max path length 4 (3 bounces)


Another advantage is the very fast convergence compared to area lights, due to the fact that almost every pixel can 'see' the skydome. Only the pixels that are occluded from the skydome (e.g. the ground patch under the car) clean up more slowly, because they are lit indirectly. Bidirectional path tracing would greatly increase the convergence speed of these pixels (edit: as pointed out by Iliyan in the comments, bidirectional path tracing would actually perform worse in this outdoor scene, where standard path tracing shines).
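
A minimal sketch (my reconstruction, not the actual Tokap code) of why the skydome is so cheap: in a path tracer the sky needs no explicit light sampling, because any ray that misses all geometry has by definition escaped to the sky, so the miss case simply returns the skydome radiance. The gradient below is a hypothetical sky, just to make the sketch runnable:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// Hypothetical skydome: white horizon blended to a blue zenith.
Vec3 skyRadiance(const Vec3& dir) {
    float t = 0.5f * (dir.z + 1.0f);          // 0 at nadir, 1 at zenith
    Vec3 horizon = {1.0f, 1.0f, 1.0f};
    Vec3 zenith  = {0.3f, 0.5f, 1.0f};
    return { horizon.x + t * (zenith.x - horizon.x),
             horizon.y + t * (zenith.y - horizon.y),
             horizon.z + t * (zenith.z - horizon.z) };
}

// In the trace loop, a miss terminates the path with sky light:
//   if (!intersectScene(ray, hit)) {
//       accum += throughput * skyRadiance(ray.dir);
//       break;
//   }
// Every camera ray and every bounce ray that escapes therefore carries
// light immediately, which is why unoccluded pixels converge so fast.
```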

An overcast sky can be simulated by using only the skydome for lighting (without an emitting sun sphere):




In this case, the noise clears up very fast with just a few samples. This lighting setup will be used for the Unbiased Truck Soccer game.

Download the executable for this sky test at http://code.google.com/p/tokap-the-once-known-as-pong/downloads/list (package updated with glass sphere for some nice caustics)

To brighten/darken the sun, select the sun sphere by right-clicking it, click the 'emi' button on the top right of the screen and change the values at the bottom of the screen to e.g. 10, 10, 5. Overbrightening the sun will cause the shadows to look sharper and more pronounced, but will also increase the noise to unacceptable levels:

Sunday, March 27, 2011

Unbiased Truck Soccer: motion blur tests

I ran some tests to determine how much motion blur (accomplished by accumulating samples from previous frames) is acceptable for relatively fast-moving objects.

The picture below compares the scene from 'Unbiased Truck Soccer' with different amounts of motion blur: no motion blur (left), averaging the last 3 frames (middle) and averaging the last 6 frames (right). These images were all rendered on an 8600M GT at barely 5 fps, so the comparison is not representative of more powerful GPUs, but it gives a general idea of the "noise freeness" of the static parts of the image, such as the walls, floor and ceiling. The three little rectangles at the bottom of the picture show a close-up comparison of a noisy area of the ceiling, which is lit only indirectly.
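
The frame-averaging trick can be sketched like this (my reconstruction, not the actual code): keep the last N rendered frames in a ring buffer and display their mean. Static regions effectively receive N times the samples; moving objects pick up a blur trail N frames long.

```cpp
#include <cassert>
#include <cmath>
#include <deque>
#include <vector>

// Sketch of frame averaging for fake motion blur: blend the freshly
// path-traced frame with the previous N-1 frames.
class FrameAverager {
public:
    explicit FrameAverager(size_t historySize) : historySize_(historySize) {}

    // Push the freshly rendered frame, get back the blended frame.
    std::vector<float> submit(const std::vector<float>& frame) {
        history_.push_back(frame);
        if (history_.size() > historySize_) history_.pop_front();

        std::vector<float> out(frame.size(), 0.0f);
        for (const auto& f : history_)
            for (size_t i = 0; i < f.size(); ++i) out[i] += f[i];
        for (float& v : out) v /= static_cast<float>(history_.size());
        return out;
    }

private:
    size_t historySize_;                       // e.g. 3 or 6 frames
    std::deque<std::vector<float>> history_;   // most recent frames
};
```

With `historySize_ = 3` a static pixel sampled at 4 spp per pass effectively shows 12 spp, which matches the "blurred to 12 samples per pixel" numbers quoted further down.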



A low-res video:


The truck is performing a looped animation. The camera can be moved by holding the middle mouse button (dragging in image plane) and Shift + middle mouse button (zooming).

The new test can be downloaded from http://code.google.com/p/tokap-the-once-known-as-pong/downloads/list. The package contains three executables with different amounts of motion blur (none, average of last 3 frames and average of last 6 frames).

On a GTX 580 this demo should run at 60 fps at default settings (768x512 resolution, max path length 4, 4 samples per pixel per pass blurred to 12 (motion blur = 1) or 24 (motion blur = 2) samples per pixel).

Monday, March 21, 2011

OTOY at the Abu Dhabi Media Summit 2011

Last week, AMD held a session about "Content and the Cloud" at the Abu Dhabi Media Summit with OTOY's Jules Urbach as one of the main speakers. A video of the complete session can be found at

http://www.youtube.com/watch?v=WGcyyTZfXTE

Some interesting snippets that were shown and talked about:

- Crysis 2 rendered in the cloud at the highest settings and streamed to an iPad using OTOY's tech

- games can be rendered for 16 concurrent users with a single GPU

- (around the 16:00 mark) path tracing!!! A very short clip was shown where Jules manipulates an extremely high detail model from the Transformers movie (created by ILM) on an iPhone in real-time, rendered in the cloud with path tracing and displayed at 60 fps. Path tracing will scale to as many servers as are available. This will really revolutionize the way games and films are made. A blurry picture below:

- Software tools such as Blender will be delivered through the cloud with OTOY

- WebCL! The next logical step after WebGL, which will make GPU computing power from the cloud accessible through a web browser. Very interesting.

- Operating systems, next-gen consoles and Blu-ray discs will become irrelevant when all apps run in the cloud

- The same assets from the Gaiking movie (to be released next year) will be used in a Gaiking game that can only be played in the cloud, due to the massive computing resources it will require to render the graphics in real-time. Tantalizing... :-D

Screen from the Gaiking teaser trailer:



Thursday, March 17, 2011

Unbiased Truck Soccer: First physics test with Bullet Physics

UPDATE: I've uploaded the executables for the Bullet physics test and the new scene with the soccer playing field at http://code.google.com/p/tokap-the-once-known-as-pong/downloads/list

I've recreated the truck from 'Unbiased Truck Soccer' in the Bullet Physics engine and applied some physical properties (suspension stiffness, damping, rolling and friction) to make it behave like a real vehicle.

This is a video of what the gameplay should be like when using the Bullet Physics engine, made with the built-in debug OpenGL renderer of Bullet:


And a soccer game is not complete without a huge open playing field. The goals will be represented by a blue and a red sphere. Players score by bumping the soccer ball against the opponent's sphere.
The hardest part is using the output of the Bullet Physics engine to update the position of the trucks and the soccer ball in the real-time path tracer. It should be fairly straightforward though, so I hope to have a working version soon!
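
The sync step can be sketched as follows (hypothetical types; in the real code the transform would come from something like Bullet's `btRigidBody::getWorldTransform()`). The truck is a set of spheres with fixed offsets in the chassis' local frame; after each physics step those offsets are transformed by the chassis' world rotation and translation, and the resulting centers overwrite the sphere array the path tracer reads:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// Minimal stand-in for a full rigid-body transform: a rotation about the
// vertical (z) axis plus a translation.
struct Transform2D {
    float yaw;      // heading of the chassis, in radians
    Vec3  origin;   // chassis position in world space
};

// Transform a sphere offset from chassis-local space into world space.
Vec3 toWorld(const Transform2D& t, const Vec3& local) {
    float c = std::cos(t.yaw), s = std::sin(t.yaw);
    return { c * local.x - s * local.y + t.origin.x,
             s * local.x + c * local.y + t.origin.y,
             local.z + t.origin.z };
}

// Per frame, for each sphere i of the truck:
//   sphereCenter[i] = toWorld(chassisTransform, localOffset[i]);
// then re-upload the sphere array to the GPU before the next render pass.
```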






Tuesday, March 15, 2011

Unbiased Truck Soccer: coming soon!




I had the idea for this game just yesterday. The goal is to push the ball against the moving goal (glowing paddle) of the opponent and score. Once it's finished and physics are actually working, you will be able to move the truck in every way, not just forward, backward and strafing left and right. Initially it will be a two player game, but hopefully I can make a single player game with an AI controlled truck.

In this particular case, path tracing provides very real and natural-looking lighting and shadows. And it's still somewhat real-time on my poor laptop with an 8600M GT (2.3 fps, with 4 samples per pixel, max path length 4, at default resolution), so I'm confident that it will look and play much better on a high-end GPU. A GTX 580, which is 20 times faster than my card (measured with Cornell Box Pong), should be able to reach 40 fps at 4 spp and default resolution. Image quality at 4 spp is very acceptable thanks to the frame-averaging trick (reusing samples from previous frames to fake motion blur) by Kerrash. Hopefully I can get the Bullet physics engine working soon.

Download the exe for this WIP Tokap Unbiased Truck Soccer at http://code.google.com/p/tokap-the-once-known-as-pong/

The following GIF (click on it to see the whole image) shows the effect of the max path length on the lighting in the scene. The difference in realism between the image with path length 1 (zero bounces = direct lighting only, no global illumination) and the image with path length 2 (1 bounce global illumination) is huge. Reflections and color bleeding (mostly visible on the surfaces facing downward and on the ceiling) are completely missing from the image with path length 1. Refractive objects need at least a path length of 3 to become (slightly) transparent. The effect on framerate is also interesting: rendering with path length 3 (1.08 fps) halves the framerate compared to rendering with direct light only (2.07 fps).
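
A toy calculation (not from the Tokap source) illustrates why the jump from path length 1 to 2 matters so much while later bounces add less and less: with an average surface albedo a, light arriving after k bounces is attenuated by roughly a^k, so each extra allowed bounce contributes a geometrically shrinking slice of the total energy (and, since many paths terminate early by escaping or hitting a light, each extra bounce also costs proportionally less to trace):

```cpp
#include <cassert>

// Total relative energy gathered when allowing up to `bounces` bounces of
// indirect light, for surfaces with the given average albedo.
float energyUpToBounces(int bounces, float albedo) {
    float total = 0.0f, term = 1.0f;        // term = albedo^k
    for (int k = 0; k <= bounces; ++k) {    // k = 0 is direct lighting
        total += term;
        term *= albedo;
    }
    return total;
}

// With albedo 0.5: direct only = 1.0, 1 bounce = 1.5, 2 bounces = 1.75.
// The first bounce adds 50% more light; the second only 25% more, and so
// on, which is why images with path length 3 and 4 look nearly identical.
```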








Below is a simple chart plotting max path length against framerate. The numbers are for the above scene at default resolution and 4 spp on an 8600M GT. The curve shows that the framerate is impacted less and less as the path length increases.



UPDATE: 2 more videos

Mapping the movement keys to the eye pupils gives this result:


Playing with the main light source (720p video):


Rocky's opinion about the current state of game graphics:

Update 12 on real-time path tracing: Meet Rocky



Path traced in real-time on an 8600M GT. I hope to have physics working soon, so I can make a (multiplayer) game with trucks bumping and crashing into one another. Or a game with two trucks and a ball, where each truck has to push the ball into the goal of the adversary. Unbiased Truck Soccer, path traced in real-time. Joy ;)

Download 'tokap truck eyes' executable and source code at http://code.google.com/p/tokap-the-once-known-as-pong/

Sunday, March 13, 2011

Update 11 on real-time path traced Tokap: Pimped out, chromed out truck!



Since Tokap can (currently) only ray trace spheres, I've decided to build a funny-looking car out of spheres: the body of the car consists of 3 merged diffuse spheres, the top sphere is a blue refractive sphere, and the wheels are grey diffuse spheres with reflective spheres inside them representing the rims. The headlight is a reflective, chrome-like sphere with an emitting sphere inside. The car can currently only move forward, backward, and strafe sideways.
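
A sphere-only scene like this can be described with a flat array of spheres tagged by material. The sketch below follows the description above, but the exact positions, radii, colors and material names are my guesses, not the actual Tokap data:

```cpp
#include <cassert>
#include <vector>

enum class Material { Diffuse, Reflective, Refractive, Emissive };

struct Vec3 { float x, y, z; };

struct Sphere {
    Vec3     center;
    float    radius;
    Vec3     color;
    Material material;
};

std::vector<Sphere> buildTruck() {
    std::vector<Sphere> s;
    // body: three merged diffuse spheres
    s.push_back({{-1.0f, 0.0f, 0.5f}, 0.8f, {0.9f, 0.2f, 0.2f}, Material::Diffuse});
    s.push_back({{ 0.0f, 0.0f, 0.5f}, 0.8f, {0.9f, 0.2f, 0.2f}, Material::Diffuse});
    s.push_back({{ 1.0f, 0.0f, 0.5f}, 0.8f, {0.9f, 0.2f, 0.2f}, Material::Diffuse});
    // top: blue refractive sphere
    s.push_back({{ 0.0f, 0.0f, 1.4f}, 0.6f, {0.3f, 0.4f, 1.0f}, Material::Refractive});
    // wheels: grey diffuse spheres with reflective "rim" spheres inside
    for (float x : {-1.0f, 1.0f})
        for (float y : {-0.7f, 0.7f}) {
            s.push_back({{x, y, 0.0f}, 0.35f, {0.4f, 0.4f, 0.4f}, Material::Diffuse});
            s.push_back({{x, y, 0.0f}, 0.20f, {0.9f, 0.9f, 0.9f}, Material::Reflective});
        }
    // headlight: chrome-like sphere with an emitting sphere inside
    s.push_back({{ 1.6f, 0.0f, 0.5f}, 0.30f, {1.0f, 1.0f, 1.0f}, Material::Reflective});
    s.push_back({{ 1.6f, 0.0f, 0.5f}, 0.15f, {1.0f, 1.0f, 1.0f}, Material::Emissive});
    return s;
}
```

The "rims" and the headlight emitter work because the tracer intersects the nearest sphere surface first, so a smaller sphere nested inside a larger one is only seen through reflection or refraction.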

Image with motion blur, 20 spp (reusing samples from previous frames), 0.84 fps on 8600M GT


Image without motion blur, 20 spp, 0.84 fps (on 8600M GT):


High resolution image with motion blur, 8 spp with frame averaging, 0.77 fps



This is an image where the main light source is turned off and the scene is only lit by the emitting white spheres in the headlights:



Some videos:

Notice the soft shadows and ambient occlusion under the car, and color bleeding from the floor onto the body of the car. Photorealism becomes a piece of cake with path tracing :-).

Everything is still rendered on my humble laptop with an 8600M GT; maybe it's time to upgrade ;-) Even on such low-end hardware, the amount of noise is quite acceptable in this particular scene (where everything is mostly directly lit).

Download this 'tokap truck' executable (CUDA enabled GPU required) at http://code.google.com/p/tokap-the-once-known-as-pong/

Stay tuned for more tests with hopefully some physics so the truck can drive up a slope, push a ball, collide with another truck, ...


Chromed out ;)



UPDATE: more videos!




If anyone has a better CUDA GPU than mine (which is not unlikely ;-), I would really appreciate it if you could capture a short video and upload it somewhere.

Friday, March 11, 2011

RenderSpud, a new path tracer for Blender

RenderSpud is a new path tracer which has been in development for some time. Recently the author released a very short but cool looking video made with the RenderSpud Blender plugin: http://www.youtube.com/watch?v=4q2jdIkgtd0
"Rendered with PyBlenderSpud (RenderSpud Blender plugin). 16 samples/pixel path tracing at 720x480, 15-20 seconds per frame to render the 70-frame sequence."
The image quality for just 16 samples/pixel looks great, and 20 seconds per frame is not bad for that resolution, especially keeping in mind that this is CPU-only path tracing. If it were optimized and ported to the GPU, it could probably reach render times of 1 second per frame or less (at 720x480 and 16 samples/pixel) when using multiple GPUs.

Some recent pics of RenderSpud can be found at https://picasaweb.google.com/mike.farnsworth/RenderSpud#5529166650522716434

Thursday, March 10, 2011

Update 10 on the real-time path traced Pong game: still looking for help! :)

During the last few weeks I've been trying to integrate Bullet Physics into Tokap while simultaneously teaching myself C++. I've made some progress, but I'm still stuck on getting user-controlled kinematic bodies working.

Having a quality 3D physics engine like Bullet Physics powering the gameplay mechanics in Tokap would be a big plus, opening up a multitude of physics-driven gameplay ideas and variations beyond the simple Pong game.

Sample code and a Bullet Physics-themed 'Hello World' can be found at http://www.bulletphysics.org/mediawiki-1.5.8/index.php/Getting_Started and http://www.bulletphysics.org/mediawiki-1.5.8/index.php/Main_Page. It's easy to get things up and running and to familiarize yourself with the Bullet Physics API, but it gets a little trickier when trying to get user-controlled objects working. If anyone's interested in helping me out with this project or has experience with Bullet Physics or other physics engines, I would very much appreciate it! You can contact me at the address on this page: http://i55.tinypic.com/2ppfqma.jpg
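
For what it's worth, the usual Bullet recipe for a user-controlled kinematic body is to flag it kinematic, keep it from deactivating, and feed it a fresh world transform every frame. The sketch below uses plain stand-in structs instead of the real bt* classes (so it compiles on its own); the actual Bullet calls, as documented in the Bullet wiki, appear only in the comments:

```cpp
#include <cassert>
#include <cmath>

// With real Bullet the setup would be roughly:
//   body->setCollisionFlags(body->getCollisionFlags()
//                           | btCollisionObject::CF_KINEMATIC_OBJECT);
//   body->setActivationState(DISABLE_DEACTIVATION);
// and every frame, before stepSimulation(), the new pose is pushed through
// the body's motion state via setWorldTransform().

struct Vec3 { float x, y, z; };

struct KinematicBody {
    Vec3 position;  // the engine reads this pose instead of integrating forces
};

// Map held keys to a displacement; dt keeps the speed frame rate independent.
void driveKinematic(KinematicBody& body, bool forward, bool back,
                    float speed, float dt) {
    float dir = (forward ? 1.0f : 0.0f) - (back ? 1.0f : 0.0f);
    body.position.x += dir * speed * dt;
}
```

The key point is that a kinematic body is driven by the game, not by the solver: Bullet only uses its per-frame transforms to push dynamic objects (like the soccer ball) out of the way.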

Some details about Project Denver

BSN has put up 2 articles with some additional details about Nvidia's Project Denver which is going to be part of Maxwell GPUs:

- http://www.brightsideofnews.com/news/2011/3/9/nvidia-reveals-64-bit-project-denver-cpu-silicon-die.aspx

- http://www.brightsideofnews.com/news/2011/3/8/nvidia-project-denver-is-a-64-bit-arm-processor-architecture.aspx

Some interesting bits:

Fermi can apparently run a custom version of Linux:
"Thus, we don't expect Project Denver to appear before late 2012 or early 2013 - in line with Maxwell GPU architecture, which is expected to integrate Project Denver architecture and become the first shipping GPU which could boot an operating system. It would not be the first GPU to boot an operating system, though. According to several PR representatives, the company already managed to boot a special build of Linux using Fermi GPU, but resources for that were abandoned as it proved too much of a hassle."

"In theory, Project Denver cores inside the Maxwell GPU die should enjoy access to 2+TB/s of internal bandwidth and potentially beyond currently possible 320GB/s of external memory bandwidth (using 512-bit interface and high-speed GDDR5 memory). If nVidia delivers this architecture as planned, we might see quite a change in the market - given that neither CPUs from AMD or Intel don't have as high system bandwidth as contemporary graphics cards."
With such extremely fast memory bandwidth between the ARM CPU and the Maxwell GPU (both on the same die), real-time ray tracing of dynamic scenes will benefit greatly, because building and rebuilding/refitting acceleration structures (such as BVHs) is still best handled by the CPU (although parallel GPU implementations already exist; see the HLBVH paper by Pantaleoni and Luebke or the real-time kd-tree construction paper by Rui Wang et al.).
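
The "refitting" mentioned here is the generic technique (nothing HLBVH-specific): when geometry moves but the tree topology is kept, each node's bounding box is recomputed bottom-up as the union of its children, which is far cheaper than a full rebuild. A minimal sketch:

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

struct AABB {
    float min[3], max[3];
};

AABB merge(const AABB& a, const AABB& b) {
    AABB r;
    for (int i = 0; i < 3; ++i) {
        r.min[i] = std::min(a.min[i], b.min[i]);
        r.max[i] = std::max(a.max[i], b.max[i]);
    }
    return r;
}

struct Node {
    int  left = -1, right = -1;   // child indices; -1 marks a leaf
    AABB box{};                   // leaf boxes come from the moved geometry
};

// Storing nodes so that children always come after their parent lets us
// refit the whole tree in a single reverse pass over the array.
void refit(std::vector<Node>& nodes) {
    for (int i = static_cast<int>(nodes.size()) - 1; i >= 0; --i)
        if (nodes[i].left != -1)
            nodes[i].box = merge(nodes[nodes[i].left].box,
                                 nodes[nodes[i].right].box);
}
```

This is the kind of per-frame CPU work that a wide on-die link to the GPU would accelerate: the refitted boxes have to be shipped back to GPU memory before tracing the next frame.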
David Luebke (Nvidia graphics researcher and GPU ray tracing expert) said in a chat session preceding the GTC 2010 conference in September:
"I think Jacopo Pantaleoni's "HLBVH" paper at High Performance Graphics this year will be looked back on as a watershed for ray tracing of dynamic content. He can sort 1M utterly dynamic triangles into a quality acceleration structure at real-time rates, and we think there's more headroom for improvement. So to answer your question, with techniques like these and continued advances in GPU ray traversal, I would expect heavy ray tracing of dynamic content to be possible in a generation or two."
This would imply that the Maxwell generation of GPUs will be able to ray trace highly dynamic scenes, and that path tracing of dynamic scenes could be feasible as well. A pretty exciting thought, and much sooner than expected :-)