My First Raytracer


7/03/2004

Got the idea to code a raytracer, so I'm going to write a very simple one in C++. At least it should be OOP and flexible to extend; besides that I don't have any goals to reach except creating nice images. Wrote a GDI template with a window and a pset/SetPixel function.

8/03/2004

Wrote sphere intersection code..


Two intersecting spheres

..and dot-product lighting..


Three spheres with dot-product lighting

..and ignored the deformed spheres :)
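For reference, the sphere intersection and the dot-product lighting together are only a handful of lines. A minimal sketch of the idea, not my original code (Vec3 and Sphere are illustrative names):

    #include <cmath>

    struct Vec3 {
        double x, y, z;
        Vec3 operator-(const Vec3& v) const { return {x - v.x, y - v.y, z - v.z}; }
        double dot(const Vec3& v) const { return x*v.x + y*v.y + z*v.z; }
    };

    struct Sphere { Vec3 center; double radius; };

    // Ray-sphere intersection: solve |o + t*d - c|^2 = r^2 for t.
    // Assumes d is normalized; returns the nearest positive t, or -1 on a miss.
    double intersect(const Sphere& s, const Vec3& o, const Vec3& d) {
        Vec3 oc = o - s.center;
        double b = 2.0 * oc.dot(d);
        double c = oc.dot(oc) - s.radius * s.radius;
        double disc = b * b - 4.0 * c;
        if (disc < 0.0) return -1.0;                 // ray misses the sphere
        double t = (-b - std::sqrt(disc)) / 2.0;     // nearer of the two roots
        return (t > 0.0) ? t : -1.0;
    }

    // Dot-product (Lambert) lighting: brightness = max(0, N.L), with the
    // surface normal N and the direction to the light L, both unit length.
    double lambert(const Vec3& n, const Vec3& toLight) {
        double d = n.dot(toLight);
        return (d > 0.0) ? d : 0.0;
    }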

9/03/2004

Started on specular highlights. Actually, once I had the view-vector/light-vector reflection code working, I hacked it into reflections and lost the specular highlight code :)


Spheres, 2 reflecting and 2 colored

The spheres are still deformed, and the reflection isn't physically correct because of a small mistake (I exchanged the light and view vectors, so the reflections now depend on the light position instead of the viewer).
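For completeness, the correct version reflects the incoming view ray about the surface normal: R = D - 2(D.N)N. A tiny sketch (illustrative names); feeding in the light vector as D instead of the view direction is exactly the mistake described above:

    struct Vec3 {
        double x, y, z;
        Vec3 operator-(const Vec3& v) const { return {x - v.x, y - v.y, z - v.z}; }
        Vec3 operator*(double s) const { return {x * s, y * s, z * s}; }
        double dot(const Vec3& v) const { return x*v.x + y*v.y + z*v.z; }
    };

    // Reflect the incoming ray direction d (pointing towards the surface)
    // about the unit surface normal n.
    Vec3 reflect(const Vec3& d, const Vec3& n) {
        return d - n * (2.0 * d.dot(n));
    }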

10/03/2004

Added adaptive supersampling. 3x3 supersampling looks quite nice for the speed; 16x16 looks best but takes kinda long.


16x16 adaptive supersampling
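The adaptive part works roughly like this: sample the corners of a pixel region, and only subdivide where the samples disagree. A hedged sketch of that idea; trace() stands in for shooting a primary ray, and the threshold is something to tune:

    #include <algorithm>

    struct Color {
        double r, g, b;
        Color operator+(const Color& c) const { return {r + c.r, g + c.g, b + c.b}; }
        Color operator*(double s) const { return {r * s, g * s, b * s}; }
    };

    Color trace(double x, double y);  // stand-in: primary ray through (x, y)

    // Largest per-channel spread among the four corner samples.
    double spread(const Color& a, const Color& b, const Color& c, const Color& d) {
        auto range = [](double p, double q, double r, double s) {
            return std::max({p, q, r, s}) - std::min({p, q, r, s});
        };
        return std::max({range(a.r, b.r, c.r, d.r),
                         range(a.g, b.g, c.g, d.g),
                         range(a.b, b.b, c.b, d.b)});
    }

    // If the corners of a region agree, use their average; otherwise split
    // the region into four quadrants and recurse, up to a maximum depth.
    Color samplePixel(double x, double y, double size, int depth, double threshold) {
        Color c00 = trace(x, y),        c10 = trace(x + size, y);
        Color c01 = trace(x, y + size), c11 = trace(x + size, y + size);
        if (depth == 0 || spread(c00, c10, c01, c11) < threshold)
            return (c00 + c10 + c01 + c11) * 0.25;
        double h = size * 0.5;
        return (samplePixel(x,     y,     h, depth - 1, threshold) +
                samplePixel(x + h, y,     h, depth - 1, threshold) +
                samplePixel(x,     y + h, h, depth - 1, threshold) +
                samplePixel(x + h, y + h, h, depth - 1, threshold)) * 0.25;
    }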

After that I decided to write the specular highlights again, which wasn't too much work. Shadows were added as well.


Specular highlights

And the spheres are still incredibly deformed because of the fish-eye vision :)
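Both features are small additions on top of the intersection code: the highlight is a power of a dot product, and a shadow is just one extra ray towards the light. A hedged sketch (intersectScene() is a stand-in for the scene's hit test):

    #include <cmath>

    struct Vec3 {
        double x, y, z;
        double dot(const Vec3& v) const { return x*v.x + y*v.y + z*v.z; }
    };

    // Phong-style highlight: (R.V)^shininess, where R is the light direction
    // reflected about the normal and V points towards the viewer (both unit).
    double specular(const Vec3& reflectedLight, const Vec3& toViewer, double shininess) {
        double d = reflectedLight.dot(toViewer);
        return (d > 0.0) ? std::pow(d, shininess) : 0.0;
    }

    // Stand-in: true if anything in the scene blocks the ray before maxDist.
    bool intersectScene(const Vec3& origin, const Vec3& dir, double maxDist);

    // Shadow test: shoot a ray from the hit point towards the light; if it
    // hits anything closer than the light, the point is in shadow.
    bool inShadow(const Vec3& point, const Vec3& toLight, double lightDist) {
        return intersectScene(point, toLight, lightDist);
    }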

11/03/2004

Created the Plane class plus its intersection algorithm. In the screenshot below it is 100% reflective.


16 spheres in a circle with drop shadows on a reflecting plane
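The plane intersection is even shorter than the sphere case. A minimal sketch, assuming the plane is stored as a unit normal n and offset d so that n.p + d = 0 for points p on the plane:

    #include <cmath>

    struct Vec3 {
        double x, y, z;
        double dot(const Vec3& v) const { return x*v.x + y*v.y + z*v.z; }
    };

    struct Plane { Vec3 normal; double d; };   // n.p + d == 0 on the plane

    // Returns positive t with hit = origin + t*dir, or -1 if the ray is
    // (nearly) parallel to the plane or the intersection lies behind the ray.
    double intersect(const Plane& pl, const Vec3& origin, const Vec3& dir) {
        double denom = pl.normal.dot(dir);
        if (std::fabs(denom) < 1e-9) return -1.0;
        double t = -(pl.normal.dot(origin) + pl.d) / denom;
        return (t > 0.0) ? t : -1.0;
    }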

And played a little more with planes to make a room for the spheres to live in.


Planes and reflecting spheres

14/03/2004

After sleeping for the past 3 days, I decided to make a kind of Global Illumination. I'm not sure if I'm using this name correctly, as I'm a total raytracer newbie. What I do is render a fish-eye view from the target point and take its average. A fisheye with 193 samples gives quite acceptable results.


Light emitting spheres


Reflected light emitting spheres
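Stripped down, the trick described above is just "shoot many rays from the hit point and average what comes back". A rough sketch of that loop; sampleHemisphere() stands in for however the fisheye directions are generated:

    struct Vec3 { double x, y, z; };
    struct Color {
        double r, g, b;
        Color operator+(const Color& c) const { return {r + c.r, g + c.g, b + c.b}; }
        Color operator*(double s) const { return {r * s, g * s, b * s}; }
    };

    Color trace(const Vec3& origin, const Vec3& dir);         // stand-in
    Vec3 sampleHemisphere(const Vec3& normal, int i, int n);  // stand-in

    // Average the light arriving over the hemisphere above the hit point,
    // e.g. with n = 193 samples as mentioned above.
    Color gatherLight(const Vec3& point, const Vec3& normal, int n) {
        Color sum = {0.0, 0.0, 0.0};
        for (int i = 0; i < n; ++i)
            sum = sum + trace(point, sampleHemisphere(normal, i, n));
        return sum * (1.0 / n);
    }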

The next thing I'll do is finally correct the uber-ugly fisheye :)

15/03/2004

Corrected the fisheye vision and wrote a .TGA saver. Now you can enjoy really round spheres ;)


Plane with global illumination
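A .TGA saver is attractive because an uncompressed 24-bit TGA is just an 18-byte header followed by raw BGR pixel data. A minimal sketch of such a saver (not my original code):

    #include <cstdio>

    // Write a 24-bit uncompressed TGA. pixels holds width*height*3 bytes in
    // BGR order, bottom row first (the TGA default origin).
    bool saveTGA(const char* filename, const unsigned char* pixels, int width, int height) {
        unsigned char header[18] = {0};
        header[2]  = 2;                                   // type 2: uncompressed true-color
        header[12] = (unsigned char)(width & 0xFF);
        header[13] = (unsigned char)((width >> 8) & 0xFF);
        header[14] = (unsigned char)(height & 0xFF);
        header[15] = (unsigned char)((height >> 8) & 0xFF);
        header[16] = 24;                                  // bits per pixel
        FILE* f = std::fopen(filename, "wb");
        if (!f) return false;
        std::fwrite(header, 1, 18, f);
        std::fwrite(pixels, 1, (size_t)width * height * 3, f);
        std::fclose(f);
        return true;
    }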

4/04/2004

I was busy with other things the past few weeks, but today I re-ordered some code so that the global-illumination code is usable for fuzzy reflections too. I also made its quality easily changeable, to speed up coding and testing, and implemented gamma correction. The number of samples n taken per calculation for illumination and fuzzy reflections is around π·q², where q is the quality.
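Both additions are tiny. For the record (illustrative code, not my original): gamma correction is one pow() per channel, and the π·q² rule gives about 50 samples at q = 4 and about 200 at q = 8:

    #include <cmath>

    // Gamma-correct a linear channel value in [0, 1]; gamma 2.2 is typical.
    double gammaCorrect(double linear, double gamma) {
        return std::pow(linear, 1.0 / gamma);
    }

    // Samples per illumination / fuzzy-reflection calculation at quality q,
    // following the n ~ pi*q^2 rule mentioned above.
    int sampleCount(double q) {
        return (int)(3.14159265358979 * q * q + 0.5);
    }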

Fuzzy transparency (and even normal transparency) is the next thing I'll code.


Results and render times of quality factors 2, 4, 8 and 16

The fuzzy reflections are calculated by rendering a fisheye view from the point being shaded, looking in the normal vector direction. I also checked how randomizing the averaged reflection directions looks.


Comparison of quality factors q=2 and q=4 with and without noise

You can see that q=4 with noise gives quite acceptable results.
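The noise variant simply perturbs each of the averaged reflection directions by a small random offset before tracing, which trades the banding of the regular fisheye grid for noise. A hedged sketch (the fuzz factor and the use of rand() are illustrative):

    #include <cmath>
    #include <cstdlib>

    struct Vec3 {
        double x, y, z;
        double dot(const Vec3& v) const { return x*v.x + y*v.y + z*v.z; }
    };

    // Perturb a unit direction by a random offset of strength 'fuzz' and
    // renormalize; larger fuzz means blurrier (and noisier) reflections.
    Vec3 jitter(const Vec3& dir, double fuzz) {
        auto r = [] { return 2.0 * std::rand() / RAND_MAX - 1.0; };  // in [-1, 1]
        Vec3 j = { dir.x + fuzz * r(), dir.y + fuzz * r(), dir.z + fuzz * r() };
        double len = std::sqrt(j.dot(j));
        return { j.x / len, j.y / len, j.z / len };
    }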


640x480, q=10, noise/montecarlo=on, 8x8 supersampling


19/04/2004

Yesterday and today I wrote ray-triangle intersection, texturing, and axis-aligned bounding boxes. The train and pig models are from a free model site (3dcafe.com).
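I didn't note which ray-triangle algorithm I used, but the standard choice is Möller-Trumbore; a hedged sketch, whose barycentric u and v conveniently double as texture-coordinate interpolation weights:

    #include <cmath>

    struct Vec3 {
        double x, y, z;
        Vec3 operator-(const Vec3& v) const { return {x - v.x, y - v.y, z - v.z}; }
        Vec3 cross(const Vec3& v) const {
            return {y*v.z - z*v.y, z*v.x - x*v.z, x*v.y - y*v.x};
        }
        double dot(const Vec3& v) const { return x*v.x + y*v.y + z*v.z; }
    };

    // Möller-Trumbore ray-triangle intersection. On a hit, returns t > 0
    // (hit = orig + t*dir) and fills in barycentric u, v; returns -1 on a miss.
    double intersectTriangle(const Vec3& orig, const Vec3& dir,
                             const Vec3& a, const Vec3& b, const Vec3& c,
                             double& u, double& v) {
        Vec3 e1 = b - a, e2 = c - a;
        Vec3 p = dir.cross(e2);
        double det = e1.dot(p);
        if (std::fabs(det) < 1e-12) return -1.0;   // ray parallel to triangle
        double inv = 1.0 / det;
        Vec3 s = orig - a;
        u = s.dot(p) * inv;
        if (u < 0.0 || u > 1.0) return -1.0;
        Vec3 q = s.cross(e1);
        v = dir.dot(q) * inv;
        if (v < 0.0 || u + v > 1.0) return -1.0;
        double t = e2.dot(q) * inv;
        return (t > 0.0) ? t : -1.0;
    }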


320x240, q=4, noise/montecarlo=on, 3x3 supersampling


640x480, q=4, noise/montecarlo=on, 3x3 supersampling, specular highlights=on.

Later I'll maybe optimize the use of the axis-aligned bounding boxes into an octree structure.
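The bounding-box test itself is the classic slab method: clip the ray against the three pairs of axis-aligned planes and check that the t-intervals still overlap. A sketch (illustrative; zero direction components would need extra care, or IEEE infinities):

    #include <algorithm>

    struct Vec3 { double x, y, z; };

    // Slab test: true if the ray (orig + t*dir, t >= 0) hits the
    // axis-aligned box [lo, hi]. Assumes non-zero direction components.
    bool hitAABB(const Vec3& orig, const Vec3& dir, const Vec3& lo, const Vec3& hi) {
        double tmin = 0.0, tmax = 1e30;
        const double o[3] = {orig.x, orig.y, orig.z};
        const double d[3] = {dir.x, dir.y, dir.z};
        const double l[3] = {lo.x, lo.y, lo.z};
        const double h[3] = {hi.x, hi.y, hi.z};
        for (int i = 0; i < 3; ++i) {
            double t0 = (l[i] - o[i]) / d[i];
            double t1 = (h[i] - o[i]) / d[i];
            if (t0 > t1) std::swap(t0, t1);
            tmin = std::max(tmin, t0);
            tmax = std::min(tmax, t1);
            if (tmin > tmax) return false;   // slab intervals don't overlap
        }
        return true;
    }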


23/05/2004

Played with environment maps. I made the first maps by photographing a Christmas ornament:



Too bad it isn't 100% round and is covered in dirt (like fake snow from spray cans). Anyway, using a picture I took later on the street near our house as environment map, I got this result:



Not stunning, but at least I had some fun ;). The environment maps I made are LDR maps, but, for example, the Eucalyptus Grove HDR light probe from the Light Probe Image Gallery gives results like this:




24/05/2004

Didn't make any changes to the code, just rendered a reflectance test scene:



Something I haven't discussed yet is render times. The image above, at high quality settings (recursion depth, accuracy), took 2300 seconds to render on my development box (a P2-350). That is a little more than 38 minutes; not as bad as I expected, since I haven't optimized a single line of code for speed.


07/10/2004

Still nothing changed, but after being IOTD (Image Of The Day) on Flipcode, I decided to render something other than just spheres:



A little Lego car (again, not modeled by me), without colors and with a low number of samples for Monte Carlo global illumination.



Again the car, now with colors and a texture. Note that only the first 50% of the image is supersampled; it took too long to render because of some mistake I made.


20/03/2005

Again nothing changed, but I am about to write a new raytracer, this time using photon mapping. For that reason I decided to take a look at my good old raytracer, and found out that I had left the code in the middle of implementing refraction. I fixed it and rendered some new scenes. This will probably be the last scene raytracer #1 renders; it may take a while, but there'll be a raytracer #2 for sure!
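Refraction is Snell's law in vector form: given the ratio of refractive indices eta, the transmitted direction follows from the incident direction and the normal, with total internal reflection when the square root goes negative. A hedged sketch (not necessarily what my 2004 code looked like):

    #include <cmath>

    struct Vec3 {
        double x, y, z;
        Vec3 operator+(const Vec3& v) const { return {x + v.x, y + v.y, z + v.z}; }
        Vec3 operator*(double s) const { return {x * s, y * s, z * s}; }
        double dot(const Vec3& v) const { return x*v.x + y*v.y + z*v.z; }
    };

    // Refract unit direction d through a surface with unit normal n, which
    // is assumed to face against the ray. eta = n1 / n2 (e.g. 1.0 / 1.5 when
    // entering glass). Returns false on total internal reflection, in which
    // case the ray should be reflected instead.
    bool refract(const Vec3& d, const Vec3& n, double eta, Vec3& out) {
        double cosi = -d.dot(n);
        double k = 1.0 - eta * eta * (1.0 - cosi * cosi);
        if (k < 0.0) return false;                 // total internal reflection
        out = d * eta + n * (eta * cosi - std::sqrt(k));
        return true;
    }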



Refraction test, rendered at 2304x1728 using low-quality, inefficient Monte Carlo, in 1766.8s at 1.1M rays/s. No supersampling, because the adaptive supersampling would have been triggered like crazy by the low-quality Monte Carlo noise. Bicubically resampled to 1152x864 with Photoshop. The transparent sphere appears to cast caustics on the floor; I'm not sure whether they are correct, but they look somewhat odd.



Same scene as above, but this time with some post-processing: a kind of HDR bloom effect, to simulate an imperfect camera lens (or even bleeding of the film emulsion). Whatever you call it, it just looks cool ;)



One last thing: a small movie of the camera rotating around a static scene, this time illuminated using the HDR map of the Uffizi Gallery, Florence, again from Paul Debevec's Light Probe Image Gallery. The movie was created by rendering 128 frames; each frame took about 1 minute to render. A lot of CPU time was wasted because of the lack of irradiance caching and other smart tricks. DivX, 2.2MB
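For anyone trying this at home: Debevec's light probes use an "angular map" layout, where a unit direction (Dx, Dy, Dz) maps to coordinates (u, v) = (Dx·r, Dy·r) with r = acos(Dz) / (π·sqrt(Dx² + Dy²)). A hedged sketch of the lookup:

    #include <cmath>

    // Map a unit direction to angular-map coordinates u, v in [-1, 1],
    // following the convention of the Light Probe Image Gallery; the caller
    // scales (u, v) to pixel coordinates in the probe image.
    void probeUV(double dx, double dy, double dz, double& u, double& v) {
        double denom = std::sqrt(dx * dx + dy * dy);
        if (denom < 1e-12) { u = 0.0; v = 0.0; return; }   // poles of the mapping
        double r = std::acos(dz) / (3.14159265358979 * denom);
        u = dx * r;
        v = dy * r;
    }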

Future

The next thing will be raytracer #2? With photon mapping, sub-surface scattering and participating media?
