I took this class with Dr. Pete Shirley in 2006. He was a very animated professor and loved talking about light and how it behaves. At one point I remember him, mid-sentence, running into the kitchen connected to our classroom and coming back with a glass container, which he held up and looked at from different angles while talking about refraction.
All of the assignments were completed via blog postings. I decided to write my renderer in Java for kicks, while most people chose C++. Looking back, I slightly regret that decision, since I had no experience writing "fast" Java code and my renders took noticeably longer than my classmates'.
== Project 1 ==

This first project samples light at various frequencies on the Macbeth color checker. If a sample passes a basic sampling check, its color is written to the frame buffer. The samples are accumulated over each time step to get a better average color. All of these images were rendered at 720x480 (they seemed like good numbers) on a Toshiba Tecra S2.

[[File:im_synth_p1_s1.png|thumb|center|1 Sample]]
[[File:im_synth_p1_s16.png|thumb|center|16 Samples]]
[[File:im_synth_p1_s256.png|thumb|center|256 Samples]]
[[File:im_synth_p1_s1024.png|thumb|center|1024 Samples]]
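The per-pixel accumulation is just a running average maintained as the time steps complete, which is why the image keeps refining from 1 sample up to 1024. A minimal sketch of that idea in Java follows; shade() here is a hypothetical stand-in for the actual per-sample code, not my implementation.

<syntaxhighlight lang="java">
import java.util.Random;

public class Accumulator {
    static final int WIDTH = 720, HEIGHT = 480;
    // One RGB triple per pixel, holding the running average so far.
    static final float[][] frame = new float[WIDTH * HEIGHT][3];
    static final Random rng = new Random();

    // Hypothetical stand-in for the real per-sample shading; returns RGB in [0,1].
    static float[] shade(float u, float v) {
        return new float[] { u, v, 0.5f };
    }

    // One time step: one jittered sample per pixel, folded into the average.
    static void accumulate(int step) {
        for (int y = 0; y < HEIGHT; y++) {
            for (int x = 0; x < WIDTH; x++) {
                float u = (x + rng.nextFloat()) / WIDTH;
                float v = (y + rng.nextFloat()) / HEIGHT;
                float[] c = shade(u, v);
                float[] px = frame[y * WIDTH + x];
                // Running average: mean += (sample - mean) / n
                for (int i = 0; i < 3; i++) {
                    px[i] += (c[i] - px[i]) / (step + 1);
                }
            }
        }
    }

    public static void main(String[] args) {
        for (int step = 0; step < 1024; step++) {
            accumulate(step);
        }
    }
}
</syntaxhighlight>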
== Project 2 ==
This second project is similar to the first. I sampled XYZ estimates using the tristimulus curves and converted the samples to RGB on the graphics card using the standard Adobe RGB conversion matrix. It took my poor little laptop almost two minutes to render 1024 time steps at 720x480.
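The conversion itself boils down to a 3x3 matrix multiply followed by a gamma curve. Here is a CPU-side sketch in Java (the real version ran on the graphics card); the matrix is the commonly published XYZ to Adobe RGB (1998) matrix for a D65 white point, so treat the exact constants as an assumption to verify against the spec.

<syntaxhighlight lang="java">
public class XyzToRgb {
    // Commonly published XYZ -> Adobe RGB (1998) matrix, D65 white point.
    // Assumed values; verify against the Adobe RGB specification.
    static final double[][] M = {
        {  2.0413690, -0.5649464, -0.3446944 },
        { -0.9692660,  1.8760108,  0.0415560 },
        {  0.0134474, -0.1183897,  1.0154096 },
    };

    static double[] convert(double x, double y, double z) {
        double[] xyz = { x, y, z };
        double[] rgb = new double[3];
        for (int r = 0; r < 3; r++) {
            for (int c = 0; c < 3; c++) {
                rgb[r] += M[r][c] * xyz[c];
            }
            // Clamp out-of-gamut values to [0,1], then apply the
            // Adobe RGB tone curve (gamma is approximately 2.2).
            rgb[r] = Math.pow(Math.min(Math.max(rgb[r], 0.0), 1.0), 1.0 / 2.2);
        }
        return rgb;
    }

    public static void main(String[] args) {
        // XYZ of a reddish stimulus, just to exercise the conversion.
        double[] rgb = convert(0.4124, 0.2127, 0.0193);
        System.out.printf("r=%.3f g=%.3f b=%.3f%n", rgb[0], rgb[1], rgb[2]);
    }
}
</syntaxhighlight>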