Our scene showcases a glimpse of a cyberpunk-inspired city, populated with high-rise buildings and
with neon lights breathing life into it. The city lies amidst dense fog, which adds to the eerie vibe of
our scene. The harmonious combination of neon signs and buildings, bathed in the ominous,
chaotic fog looming over the city, encapsulates the theme of this rendering competition.
A person residing within the scene might find their visual sensors chaotically overstimulated, yet
unable to get enough of the eye candy our scene has to offer.
This allowed us to add texture detail to meshes without additional geometric complexity
We improved photorealism by adding a frosted finish to the glass dielectric, as shown below
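One common way to obtain such a frosted look (a sketch of the general technique, not necessarily our renderer's exact implementation) is to treat the glass as a rough dielectric: a microfacet normal is sampled from a GGX distribution with roughness `alpha` and used in place of the geometric normal when refracting. The names below are illustrative only.

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { double x, y, z; };

// Sample a microfacet normal from the GGX distribution in the local frame
// (+Z = geometric normal). alpha is the surface roughness; u1, u2 are
// uniform random numbers in [0, 1). alpha = 0 recovers a perfectly smooth
// surface (the sampled normal is always +Z).
Vec3 sampleGGXNormal(double alpha, double u1, double u2) {
    const double kPi = 3.14159265358979323846;
    double phi = 2.0 * kPi * u2;
    // Invert the GGX CDF for tan^2(theta).
    double tanTheta2 = alpha * alpha * u1 / (1.0 - u1);
    double cosTheta = 1.0 / std::sqrt(1.0 + tanTheta2);
    double sinTheta = std::sqrt(std::max(0.0, 1.0 - cosTheta * cosTheta));
    return {sinTheta * std::cos(phi), sinTheta * std::sin(phi), cosTheta};
}
```

Higher roughness spreads the sampled normals further from +Z, which blurs refracted light and produces the frosted appearance.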
Not only did this light the scene in a more photorealistic fashion, it also reduced the overall noise, as shown
Simply changing the PDF sampling scheme to subtended-cone sampling reduced the noise further
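The idea behind subtended-cone sampling, sketched below under the assumption of a spherical light (the interface is illustrative, not our renderer's actual API): instead of sampling a point uniformly on the light's surface, we uniformly sample a direction inside the cone that the light subtends at the shading point, which wastes no samples on directions that miss the light.

```cpp
#include <cassert>
#include <cmath>

struct ConeSample {
    double dirX, dirY, dirZ; // direction in a frame whose +Z axis points at the light
    double pdf;              // solid-angle PDF of this direction
};

// dist: distance from the shading point to the sphere center;
// radius: sphere radius; u1, u2: uniform random numbers in [0, 1).
ConeSample sampleSubtendedCone(double dist, double radius, double u1, double u2) {
    const double kPi = 3.14159265358979323846;
    // Half-angle of the cone the sphere subtends at the shading point.
    double sinThetaMax2 = (radius * radius) / (dist * dist);
    double cosThetaMax = std::sqrt(std::max(0.0, 1.0 - sinThetaMax2));
    // Uniformly sample cos(theta) in [cosThetaMax, 1].
    double cosTheta = (1.0 - u1) + u1 * cosThetaMax;
    double sinTheta = std::sqrt(std::max(0.0, 1.0 - cosTheta * cosTheta));
    double phi = 2.0 * kPi * u2;
    // Uniform-cone PDF: 1 / (2*pi*(1 - cosThetaMax)).
    double pdf = 1.0 / (2.0 * kPi * (1.0 - cosThetaMax));
    return {sinTheta * std::cos(phi), sinTheta * std::sin(phi), cosTheta, pdf};
}
```

Because every sampled direction hits the light, the estimator's variance drops, which matches the noise reduction we observed.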
We show our results on a test case by Eric Veach, demonstrating improved sampling of the area light emitters
In keeping with our overall cyberpunk theme, we added fog to our scene with a volume tracer that handles homogeneous participating media
Using Intel's Open Image Denoise (OIDN) library, we reduce the noise in our scene even further
Adding bloom at the end makes light emissions and reflections even more realistic
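Bloom is typically a three-step post-process: extract pixels brighter than a threshold, blur them, and add the result back onto the image. The sketch below illustrates this on a 1-D luminance array for clarity; our actual implementation operates on the 2-D HDR framebuffer, and the threshold and filter here are illustrative choices.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <vector>

std::vector<double> bloom(const std::vector<double>& img, double threshold) {
    // 1) Bright pass: keep only the energy above the threshold.
    std::vector<double> bright(img.size());
    for (size_t i = 0; i < img.size(); ++i)
        bright[i] = std::max(0.0, img[i] - threshold);
    // 2) Blur the bright pass with a 3-tap box filter.
    std::vector<double> blurred(img.size(), 0.0);
    for (size_t i = 0; i < img.size(); ++i) {
        double sum = 0.0;
        int n = 0;
        for (int k = -1; k <= 1; ++k) {
            long j = static_cast<long>(i) + k;
            if (j >= 0 && j < static_cast<long>(img.size())) { sum += bright[j]; ++n; }
        }
        blurred[i] = sum / n;
    }
    // 3) Add the blurred highlights back onto the original image.
    std::vector<double> out(img.size());
    for (size_t i = 0; i < img.size(); ++i) out[i] = img[i] + blurred[i];
    return out;
}
```

The blurred highlights spill into neighboring pixels, which is what makes bright emitters and their reflections appear to glow.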
Key metrics and performance data for the renderer:
Render time: 4 hours 17 minutes
Maximum path depth: 6
Samples per pixel: 512
Resolution: 2565x1440 pixels
Hardware: Ryzen 7 7840HS, 16 GB RAM
Triangle count: 2.79 million
One of the key challenges we faced was getting volume path
tracing for homogeneous media to work. As opposed to
PBRT's implementation, which attaches the medium property to the ray,
we took an instance-centric approach: our scene lies inside a cube
that carries the medium property. We explicitly track and
maintain the number of bounces within the cube instance to determine entry
into and exit from the medium. Since we do not use localized homogeneous media, this
hack suffices for our use case.
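The tracking described above can be sketched roughly as follows (a simplified illustration under the assumption that the path starts outside the cube; the type and method names are ours for this sketch, not the renderer's actual interface): each boundary crossing of the medium cube flips the ray's inside/outside state.

```cpp
#include <cassert>

// Each time a ray crosses the boundary of the medium cube instance, we
// increment a crossing counter; odd parity means the ray is currently
// inside the medium (assuming the path began outside the cube).
struct MediumTracker {
    int crossings = 0;
    void onBoundaryHit() { ++crossings; }
    bool insideMedium() const { return (crossings % 2) == 1; }
};
```

Since our single fog cube encloses the whole scene and is never nested inside another medium, this parity test is all the bookkeeping the path tracer needs.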
We also faced minor issues getting the improved area light sampling to work
alongside the MIS path tracer. We had to ensure that PDFs were computed and returned in the
right domain, specifically handling the conversions between the spatial (area) and angular (solid-angle) domains.
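The conversion in question is the standard change of measure between the two domains: a PDF per unit area on the light becomes a PDF per unit solid angle at the shading point via p_omega = p_A * dist^2 / |cos(theta_light)|, where theta_light is the angle between the light's normal and the direction back toward the shading point. A minimal sketch (the function name is illustrative):

```cpp
#include <cassert>
#include <cmath>

// Convert an area-domain PDF on the light to a solid-angle-domain PDF at
// the shading point. dist is the distance between the two points;
// cosThetaLight is the cosine between the light normal and the direction
// from the light back toward the shading point.
double areaToSolidAnglePdf(double pdfArea, double dist, double cosThetaLight) {
    return pdfArea * dist * dist / std::fabs(cosThetaLight);
}
```

Mixing the two domains in the MIS weights silently biases the estimator, which is why getting this conversion right mattered.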
We would like to thank the course instructors for equipping us with the knowledge to write a ray tracer from scratch. We would also like to thank our tutors for their invaluable support in assisting us through the assignments and the rendering competition. Lastly, we provide references for the links that were used as additional resources for scene creation: