Projects
Destiny
8.5 years of Destiny.
I worked on several teams across the studio supporting the standalone and expansion releases.
The majority of my work revolved around test automation and building tools to improve the workflows of testers, content creators, and engineers.
A few highlights of my work include:
Helped establish a test automation pipeline, a crash pipeline and a quality bar as the first QA representative on an incubation project.
Helped grow the SDET Org by hiring and managing direct reports.
Helped streamline the automation pipeline to allow for easier debugging of farm jobs.
Helped create an automation framework that efficiently runs test automation jobs on multiple farm VMs.
Created an automation framework that tests graphics features in a given build by using computer vision to compare screenshots against ground-truth images (a simplified sketch of this comparison follows this list).
Created a world-editor plugin for running graphics automation locally, speeding up iteration on R&D graphics features.
Created a combatant browser within our world editor, helping testers and designers search and filter combatant-related content.
Bridged communication between the Stadia platform and our internal tools. This paved the way for automated tests to be run on Stadia.
Supported a suite of automation tests that run on multiple platforms and report their results to branch owners.
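Below is a minimal sketch of the screenshot-versus-ground-truth comparison idea behind the graphics automation framework, assuming both images have already been decoded into same-sized RGBA8 buffers. The ImageRGBA8 type, the tolerances, and the function name are illustrative, not the actual internal tooling.

```cpp
// Sketch only: per-pixel comparison of a captured screenshot against a ground-truth image.
#include <cstdint>
#include <cstdlib>
#include <vector>

struct ImageRGBA8 {
    int width = 0;
    int height = 0;
    std::vector<uint8_t> pixels;  // width * height * 4 bytes, RGBA order
};

// Returns true if the candidate matches the ground truth closely enough.
bool ScreenshotMatchesGroundTruth(const ImageRGBA8& truth,
                                  const ImageRGBA8& candidate,
                                  int perChannelTolerance = 4,
                                  double maxBadPixelFraction = 0.001) {
    if (truth.width != candidate.width || truth.height != candidate.height)
        return false;

    const size_t pixelCount = static_cast<size_t>(truth.width) * truth.height;
    size_t badPixels = 0;
    for (size_t p = 0; p < pixelCount; ++p) {
        for (int c = 0; c < 3; ++c) {  // compare RGB, ignore alpha
            const int diff = std::abs(int(truth.pixels[p * 4 + c]) -
                                      int(candidate.pixels[p * 4 + c]));
            if (diff > perChannelTolerance) { ++badPixels; break; }
        }
    }
    // Tolerate a small fraction of differing pixels to absorb driver/platform noise.
    return double(badPixels) / double(pixelCount) <= maxBadPixelFraction;
}
```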
Chrononaut
This project is a 3D first-person puzzle game in which players must solve a variety of puzzles to progress through a level.
Players can use a variety of time-based abilities to help them deal with the challenges in the game.
This game is a year-long project from DigiPen's Master's Program. The engine was built from scratch using C++, DirectX 11 and HLSL.
You can download the game here.
I had two roles in this project:
Graphics Programming:
I co-created the graphics engine in my game from scratch using C++, DirectX 11 and HLSL (Pixel Shader 5.0). I was solely responsible for setting up a deferred renderer that supports more than 1024 point lights in a scene with a negligible drop in frame rate. I optimized the deferred renderer by culling lights per screen tile and grouping lights together to avoid wasting graphics memory. This tile-based optimization closely follows work done by Intel's Andrew Lauritzen.
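As a rough illustration of the tile-based grouping described above, here is a CPU-side sketch that bins point lights into 16x16-pixel screen tiles using a conservative screen-space bound. The actual renderer does this work on the GPU following Lauritzen's tiled deferred approach; the types, the symmetric pinhole-projection assumption, and the tile size below are illustrative.

```cpp
// Sketch only: bin view-space point lights into screen tiles.
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

struct PointLight { float viewX, viewY, viewZ, radius; };

constexpr int kTileSize = 16;  // pixels per screen tile

// For each tile, collect the indices of lights whose projected bounds touch it.
std::vector<std::vector<uint32_t>> BinLightsIntoTiles(
    const std::vector<PointLight>& lights,
    int screenWidth, int screenHeight,
    float focalLengthPixels)  // assumed symmetric pinhole projection
{
    const int tilesX = (screenWidth + kTileSize - 1) / kTileSize;
    const int tilesY = (screenHeight + kTileSize - 1) / kTileSize;
    std::vector<std::vector<uint32_t>> tileLights(tilesX * tilesY);

    for (uint32_t i = 0; i < lights.size(); ++i) {
        const PointLight& l = lights[i];
        if (l.viewZ <= 0.0f) continue;  // behind the camera (view looks down +Z)

        // Conservative screen-space bounding square of the light's sphere of influence.
        const float cx = screenWidth * 0.5f + focalLengthPixels * (l.viewX / l.viewZ);
        const float cy = screenHeight * 0.5f - focalLengthPixels * (l.viewY / l.viewZ);
        const float r  = focalLengthPixels * (l.radius / l.viewZ);

        const int minTx = std::max(0, int((cx - r) / kTileSize));
        const int maxTx = std::min(tilesX - 1, int((cx + r) / kTileSize));
        const int minTy = std::max(0, int((cy - r) / kTileSize));
        const int maxTy = std::min(tilesY - 1, int((cy + r) / kTileSize));

        for (int ty = minTy; ty <= maxTy; ++ty)
            for (int tx = minTx; tx <= maxTx; ++tx)
                tileLights[ty * tilesX + tx].push_back(i);
    }
    return tileLights;
}
```

Each tile's light list can then be consumed by the lighting pass so that a pixel only evaluates the lights that actually touch its tile, which is what keeps the frame rate stable with 1000+ lights.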
Additionally, I implemented post-processing effects such as glow mapping, bloom, screen-space ambient occlusion (SSAO) and fast approximate anti-aliasing (FXAA).
I also implemented texturing techniques such as normal mapping and texture blending.
UI Programming:
I set up a text renderer within the engine that was mainly used as an in-game debugging tool for the Physics and Gameplay programmers.
Additionally, I implemented a 2D renderer that supports displaying and blending textures in screen space or world space. This allowed UI elements such as crosshairs and text prompts to be drawn on the HUD.
1000 Point Light Rigid Bodies
A graphics and physics stress test in my game engine. 1000 point lights attached to 1000 rigid sphere bodies are randomly spawned in a scene.
The graphics system uses deferred shading for lighting and rendering. Tile-based optimizations based on Andrew Lauritzen’s implementation were also used.
The physics system uses the Separating Axis Theorem (SAT) for the narrow phase and Sweep and Prune for the broad phase (sketched below).
This game engine was built from scratch by a team of five using only C++, HLSL and DirectX 11.
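For context, here is a minimal sketch of a 1D Sweep and Prune broad phase of the kind described above, assuming each rigid body already has an axis-aligned bounding box. The candidate pairs it produces would then be handed to the SAT narrow phase; the AABB type and function name are illustrative.

```cpp
// Sketch only: 1D Sweep and Prune along the X axis over AABBs.
#include <algorithm>
#include <cstdint>
#include <utility>
#include <vector>

struct AABB { float minX, maxX, minY, maxY, minZ, maxZ; };

std::vector<std::pair<uint32_t, uint32_t>> SweepAndPruneX(const std::vector<AABB>& boxes) {
    // Sort body indices by their minimum X.
    std::vector<uint32_t> order(boxes.size());
    for (uint32_t i = 0; i < order.size(); ++i) order[i] = i;
    std::sort(order.begin(), order.end(),
              [&](uint32_t a, uint32_t b) { return boxes[a].minX < boxes[b].minX; });

    std::vector<std::pair<uint32_t, uint32_t>> candidates;
    for (size_t i = 0; i < order.size(); ++i) {
        const AABB& a = boxes[order[i]];
        for (size_t j = i + 1; j < order.size(); ++j) {
            const AABB& b = boxes[order[j]];
            if (b.minX > a.maxX) break;  // no later box can overlap on X either
            // Only emit a candidate pair if the boxes also overlap on Y and Z.
            if (a.minY <= b.maxY && b.minY <= a.maxY &&
                a.minZ <= b.maxZ && b.minZ <= a.maxZ)
                candidates.emplace_back(order[i], order[j]);
        }
    }
    return candidates;
}
```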
Random Scattering: Landscape Generation
This video is a demonstration of an Artificial Intelligence research project that I worked on. The goal of the project was to procedurally generate a forest in which objects like trees, rocks and fireflies are randomly scattered in a controlled environment. Two approaches were implemented:
The first approach is random scattering with no overlap. This method is based on the article linked here.
The second approach is more physically based. This method involves randomly scattering spheres and letting the physics engine resolve collisions (thus avoiding overlap). The spheres are then replaced by the assets in the game. Additionally, the player can trace a path through the forest by simply walking on the grass.
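The first approach can be summarized with a small dart-throwing sketch: candidate positions are generated at random and rejected if they overlap anything already placed. The counts, radii, and the Placement type below are illustrative assumptions, not the project's actual code.

```cpp
// Sketch only: random scattering with rejection of overlapping placements.
#include <random>
#include <vector>

struct Placement { float x, z, radius; };

std::vector<Placement> ScatterWithoutOverlap(int desiredCount, float areaSize,
                                             float objectRadius, int maxAttempts = 10000) {
    std::mt19937 rng(std::random_device{}());
    std::uniform_real_distribution<float> pos(0.0f, areaSize);

    std::vector<Placement> placed;
    for (int attempt = 0; attempt < maxAttempts && (int)placed.size() < desiredCount; ++attempt) {
        Placement candidate{pos(rng), pos(rng), objectRadius};
        bool overlaps = false;
        for (const Placement& p : placed) {
            const float dx = candidate.x - p.x, dz = candidate.z - p.z;
            const float minDist = candidate.radius + p.radius;
            if (dx * dx + dz * dz < minDist * minDist) { overlaps = true; break; }
        }
        if (!overlaps) placed.push_back(candidate);  // accept; otherwise retry
    }
    return placed;
}
```

The second, physics-based approach skips the rejection test entirely and lets the collision resolution push the spheres apart instead.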
The fireflies are rendered as dynamic point lights with random velocities and initial positions. There are 512 point lights in this scene.
The engine has been built completely from scratch using C++, HLSL and the DirectX 11 framework.
D.O.T.S. (Defense of the Sphere)
D.O.T.S. is a top-down survival shooter brought to you by Team TDS, in which the player takes control of the only sphere on a giant cube and must survive all six sides in order to win the game.
This game is a semester-long project from DigiPen's Master's Program. The engine was built from scratch using C++ and DirectX 9, with FMOD for sound. The game was inspired by other survival shooters such as Geometry Wars, Killing Floor, and Beat Hazard, as well as strategy games like Frozen Synapse and even a little bit of Tim Schafer's Psychonauts!
I worked as a visual designer and graphics programmer on this project, implementing lighting and post-processing and contributing modeling and texturing work.
You can download the game project here, and view the readme here.
Omni-Directional Shadows via Cube Maps
Omni-directional shadow mapping is a technique applied to point lights to provide more realistic shadows in closed environments.
Shadows are created in two passes:
1. In the first pass (the depth pass), a cube-mapped depth texture is created from the light's point of view: all the objects in the scene are rendered from the six perspectives of the point light, and their depths are stored in the depth texture.
2. In the second pass (the lighting pass), the depth texture is sampled while calculating lighting for all the objects in the scene. Any pixel that is in direct view of the light source passes the depth test and is lit; all other pixels are in shadow.
This project was implemented in a C++ based game engine using the DirectX 11 framework and HLSL.
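As an illustration of the depth pass setup, here is a sketch that builds the six 90-degree view-projection matrices for a point light's cube map using DirectXMath. The face directions and up vectors follow the usual Direct3D cube-map convention; the near/far planes and the function name are illustrative.

```cpp
// Sketch only: view-projection matrices for the six cube map faces of a point light.
#include <DirectXMath.h>
using namespace DirectX;

void BuildPointLightCubeFaceMatrices(FXMVECTOR lightPos, float nearZ, float farZ,
                                     XMMATRIX outViewProj[6]) {
    const XMVECTOR dirs[6] = {
        XMVectorSet( 1, 0, 0, 0), XMVectorSet(-1, 0, 0, 0),   // +X, -X
        XMVectorSet( 0, 1, 0, 0), XMVectorSet( 0,-1, 0, 0),   // +Y, -Y
        XMVectorSet( 0, 0, 1, 0), XMVectorSet( 0, 0,-1, 0)};  // +Z, -Z
    const XMVECTOR ups[6] = {
        XMVectorSet(0, 1, 0, 0), XMVectorSet(0, 1, 0, 0),
        XMVectorSet(0, 0,-1, 0), XMVectorSet(0, 0, 1, 0),
        XMVectorSet(0, 1, 0, 0), XMVectorSet(0, 1, 0, 0)};

    // A 90-degree FOV with a square aspect ratio covers exactly one cube face.
    const XMMATRIX proj = XMMatrixPerspectiveFovLH(XM_PIDIV2, 1.0f, nearZ, farZ);
    for (int face = 0; face < 6; ++face) {
        const XMMATRIX view = XMMatrixLookToLH(lightPos, dirs[face], ups[face]);
        outViewProj[face] = view * proj;
    }
}
```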
Animation
The videos above showcase a few animation techniques that I have implemented within my game engine.
The first video shows the motion of an animated model along a path.
Enhancements include ease-in and ease-out with a skidding/sliding fix. Ease-in and ease-out slow the model down at the beginning and end of the path, while the skidding/sliding fix slows the character's animation playback to match the model's movement speed during ease-in and ease-out.
Additionally, the path followed by the animated model is rendered. The path is defined by only eight control points, which are interpolated into a smooth curve using Catmull-Rom interpolation. The control points can be edited using my engine's in-game editor.
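For reference, here is a minimal sketch of Catmull-Rom interpolation over a list of control points, similar in spirit to how the eight editable points are turned into a smooth path. The Vec3 type, the endpoint handling (clamping to the first/last points), and the sampling density are illustrative assumptions.

```cpp
// Sketch only: sample a Catmull-Rom spline through a list of control points.
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3 CatmullRom(const Vec3& p0, const Vec3& p1, const Vec3& p2, const Vec3& p3, float t) {
    const float t2 = t * t, t3 = t2 * t;
    auto blend = [&](float a, float b, float c, float d) {
        return 0.5f * ((2.0f * b) + (-a + c) * t +
                       (2.0f * a - 5.0f * b + 4.0f * c - d) * t2 +
                       (-a + 3.0f * b - 3.0f * c + d) * t3);
    };
    return { blend(p0.x, p1.x, p2.x, p3.x),
             blend(p0.y, p1.y, p2.y, p3.y),
             blend(p0.z, p1.z, p2.z, p3.z) };
}

// Each segment [i, i+1] uses its two neighbours as tangent helpers (clamped at the ends).
std::vector<Vec3> SamplePath(const std::vector<Vec3>& cp, int samplesPerSegment) {
    std::vector<Vec3> path;
    for (size_t i = 0; i + 1 < cp.size(); ++i) {
        const Vec3& p0 = cp[i == 0 ? 0 : i - 1];
        const Vec3& p3 = cp[(i + 2 < cp.size()) ? i + 2 : cp.size() - 1];
        for (int s = 0; s < samplesPerSegment; ++s)
            path.push_back(CatmullRom(p0, cp[i], cp[i + 1], p3, float(s) / samplesPerSegment));
    }
    return path;
}
```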
The second video demonstrates my implementation of inverse kinematics.
The arrow models represent the bones, while the claw represents the end-effector. The skeleton tracks the position of the beach ball every frame using the CCD (Cyclic Coordinate Descent) algorithm.
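Here is a minimal sketch of one CCD pass over a chain of world-space joint positions: each joint is rotated so the end-effector swings toward the target. A real skeleton would also update joint orientations and respect joint limits; the Vec3 helpers and function names are illustrative.

```cpp
// Sketch only: one Cyclic Coordinate Descent (CCD) iteration over joint positions.
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 Cross(Vec3 a, Vec3 b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
static Vec3 Normalize(Vec3 v) {
    const float len = std::sqrt(Dot(v, v));
    return len > 1e-6f ? v * (1.0f / len) : Vec3{0, 0, 0};
}

// Rotate point p around pivot by angle (radians) about a unit axis (Rodrigues' formula).
static Vec3 RotateAround(Vec3 p, Vec3 pivot, Vec3 axis, float angle) {
    const Vec3 v = p - pivot;
    const Vec3 rotated = v * std::cos(angle) + Cross(axis, v) * std::sin(angle) +
                         axis * (Dot(axis, v) * (1.0f - std::cos(angle)));
    return pivot + rotated;
}

// joints.back() is the end-effector; earlier entries are the bone joints toward the root.
void CCDIterate(std::vector<Vec3>& joints, Vec3 target) {
    for (int i = int(joints.size()) - 2; i >= 0; --i) {
        const Vec3 toEnd = Normalize(joints.back() - joints[i]);
        const Vec3 toTarget = Normalize(target - joints[i]);
        const float cosAngle = std::fmin(1.0f, std::fmax(-1.0f, Dot(toEnd, toTarget)));
        const float angle = std::acos(cosAngle);
        const Vec3 axis = Normalize(Cross(toEnd, toTarget));
        if (angle < 1e-4f || Dot(axis, axis) < 1e-6f) continue;
        // Rotate every joint downstream of joint i (and the end-effector) about joint i.
        for (size_t j = i + 1; j < joints.size(); ++j)
            joints[j] = RotateAround(joints[j], joints[i], axis, angle);
    }
}
```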
Ray Tracing
This project involved creating a C++ based raytracer.
All the intersection and lighting calculations were written from scratch.
After the basic intersections and lighting were up and running, several enhancements were implemented. These include tracing rays through reflective and refractive materials, upgrading the lighting from basic Phong shading to a BRDF-based model, and sampling shadows for every object in the scene.
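As a flavor of the per-ray math involved, here is a small sketch of ray-sphere intersection together with the reflection and refraction directions used when tracing through reflective and refractive materials. The Vec3 type and helper names are illustrative, not the raytracer's actual interfaces.

```cpp
// Sketch only: ray-sphere intersection plus reflection/refraction directions.
#include <cmath>
#include <optional>

struct Vec3 { float x, y, z; };
static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Returns the nearest positive hit distance along the ray, if any.
std::optional<float> IntersectSphere(Vec3 origin, Vec3 dir, Vec3 center, float radius) {
    const Vec3 oc = origin - center;
    const float b = Dot(oc, dir);                  // dir is assumed normalized
    const float c = Dot(oc, oc) - radius * radius;
    const float disc = b * b - c;
    if (disc < 0.0f) return std::nullopt;          // ray misses the sphere
    const float sqrtDisc = std::sqrt(disc);
    float t = -b - sqrtDisc;                       // nearer root first
    if (t <= 1e-4f) t = -b + sqrtDisc;             // origin inside the sphere: take the far root
    if (t <= 1e-4f) return std::nullopt;           // both hits are behind the origin
    return t;
}

// Mirror reflection about the surface normal n (both d and n normalized).
Vec3 Reflect(Vec3 d, Vec3 n) { return d - n * (2.0f * Dot(d, n)); }

// Refraction via Snell's law; eta = n_incident / n_transmitted.
// Falls back to reflection on total internal reflection.
Vec3 Refract(Vec3 d, Vec3 n, float eta) {
    const float cosI = -Dot(d, n);
    const float sinT2 = eta * eta * (1.0f - cosI * cosI);
    if (sinT2 > 1.0f) return Reflect(d, n);
    return d * eta + n * (eta * cosI - std::sqrt(1.0f - sinT2));
}
```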
The images above are rendered by my raytracer at 32x AA.
Airborne Assault
This game is based on the classic Paratrooper game. The goal of the game is to defend your home against the enemy troops using either surface-to-air cannons or surface-to-surface bombs.
You can download the game here.
Beach Ballin'
This game is based on a simple platformer concept. You control a beach ball and you must collect all the air canisters before the ball deflates. Also, spikes are bad. Avoid them!
You can download the game here.
Return of the Death Stars
This game is based on the classic Asteroids game with a Star Wars twist. You control the Millennium Falcon, and your mission is to destroy the Death Stars before they annihilate you.
You can download the game here.
Project Sentinel
Project Sentinel consists of a Nerf turret that can be controlled remotely over a network. It operates in two modes: manual and autonomous.
In manual mode, the user is able to send movement and firing commands to the nerf turret using an attached Xbox 360 controller.
In autonomous mode, the system uses an Xbox 360 Kinect sensor to track moving subjects in the environment. The subject closest to the Kinect will be followed and fired upon by the turret.