Melody Lee
Software Engineer - Programmer - Game Designer
Resume
I am a programmer who enjoys learning about and developing algorithms. I have programmed in Java, Python, Scheme (a Lisp dialect), C#, C++, and JavaScript, and am comfortable learning new languages and frameworks. My game code is crafted with game design in mind: I anticipate the possible needs and directions of a game and allow for easy tweaking of variables.
I graduated with a Master of Entertainment Technology from Carnegie Mellon University, as well as a Bachelor of Computing (First Class Honours) from the National University of Singapore, with a Minor in English. Currently, I am a Software Engineer at Garena, and am pumped that I get to learn new technologies every day.
I am Melody (Zhi Xin) Lee, an enthusiastic learner and problem solver from Singapore. I enjoy crafting creations in various mediums, including designing games and writing poetry. I also like working with people as enthusiastic as I am about making solid products that have emotional significance.
I have experience working with teams of diverse disciplines and nationalities. The key thing I learnt in these teams is communication, and I strive to make sure that all team members are kept up to date on each other's progress.
Contact Me
Send me a LinkedIn message! My profile can be found here. I look forward to hearing from you!
Flappy Pong is a game developed during a Game Design class. In the span of two weeks, it went through 13 iterations and playtests to arrive at this final product. It makes use of the simple, addictive mechanic of quick estimation to charge the ball to the correct height and distance. The height of the hurdles, as well as the distance between the top and bottom hurdles, is randomly generated within a range that increases in difficulty as the player's score increases. Power-ups are also strategically positioned along the arc that the player should ideally achieve for the ball to travel further.
Team: Melody Lee, Stephanie Fawaz, Arim Yoon, Shih-Tsui Kuo, Han Liu
Zoo Panic is a top-down projected game, created for a Building Virtual Worlds class, that uses the PS Move controller's coordinate position for input. Players play a zookeeper on his first day of work who has accidentally let the animals escape, and is now trying to recapture them by swinging his net over the called animal. As we were aiming for an experience where players catch an animal when the net is directly over it, we had to perform bilinear interpolation to make sure that the PS Move coordinates mapped correctly to the projected real-world space.
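The mapping step can be sketched in a few lines of Python. This is a minimal illustration, not our actual game code: it assumes a calibration pass recorded the controller's raw range and the four projected corner positions (all names and tuples here are made up for the example).

```python
def bilinear_map(raw, raw_min, raw_max, tl, tr, bl, br):
    """Map a raw PS Move reading into projected space.

    raw, raw_min, raw_max: (x, y) in controller space (raw range from calibration).
    tl, tr, bl, br: projected-space corners recorded during calibration.
    """
    # Normalize the raw reading to [0, 1] on each axis.
    u = (raw[0] - raw_min[0]) / (raw_max[0] - raw_min[0])
    v = (raw[1] - raw_min[1]) / (raw_max[1] - raw_min[1])
    # Blend the four projected corners bilinearly.
    x = (1 - u) * (1 - v) * tl[0] + u * (1 - v) * tr[0] \
        + (1 - u) * v * bl[0] + u * v * br[0]
    y = (1 - u) * (1 - v) * tl[1] + u * (1 - v) * tr[1] \
        + (1 - u) * v * bl[1] + u * v * br[1]
    return (x, y)
```

Blending the measured corners this way absorbs skew between the camera's view of the controller and the projected play space, which is why a plain linear remap per axis was not enough.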
I decided on the overall direction of the code, and was in charge of programming animal behavior as well as the AnimalManager.
Animals had three behavior modes: Normal, Fleeing, and Caught. Normal mode is further broken down into two states, Move and Stop. Animals in the Move state are given a randomized direction as well as a randomized distance to travel along that direction. The desired destination is stored based on that direction and distance, along with the normalized direction the animal is heading in. With these variables, we could check whether an animal had reached its destination by checking whether the normalized direction from the animal to the destination differed from the stored direction. Upon reaching its destination, an animal transitions to the Stop state and does not move for a randomized amount of time, after which it transitions back to the Move state. So that animals do not get stuck upon collision, a new randomized direction and distance are set whenever an animal collides with any object.
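The Move/Stop cycle described above can be sketched as a small state machine. This is an illustrative Python rewrite, not the original game code; the speed, distance, and timer ranges are placeholders.

```python
import math
import random

class Animal:
    """Sketch of Normal-mode behavior: Move towards a random destination, then Stop."""

    def __init__(self, pos):
        self.pos = list(pos)
        self.state = "Move"
        self.pick_destination()

    def pick_destination(self):
        # Randomized direction and distance, as in the game.
        angle = random.uniform(0, 2 * math.pi)
        dist = random.uniform(1.0, 5.0)
        self.direction = (math.cos(angle), math.sin(angle))
        self.dest = (self.pos[0] + self.direction[0] * dist,
                     self.pos[1] + self.direction[1] * dist)

    def reached_destination(self):
        # The destination is passed once the direction towards it no longer
        # matches the stored travel direction (the dot product flips sign).
        dx = self.dest[0] - self.pos[0]
        dy = self.dest[1] - self.pos[1]
        return dx * self.direction[0] + dy * self.direction[1] <= 0

    def update(self, dt, speed=2.0):
        if self.state == "Move":
            self.pos[0] += self.direction[0] * speed * dt
            self.pos[1] += self.direction[1] * speed * dt
            if self.reached_destination():
                self.state = "Stop"
                self.stop_timer = random.uniform(0.5, 2.0)
        elif self.state == "Stop":
            self.stop_timer -= dt
            if self.stop_timer <= 0:
                self.state = "Move"
                self.pick_destination()

    def on_collision(self):
        # Re-randomize so the animal does not push against obstacles forever.
        self.pick_destination()
```

The sign-flip check is cheaper than a distance threshold and never misses the destination even if a frame steps past it.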
Fleeing mode is activated when the player is within a certain distance of the animal. The animal will then move in the direction away from the player.
Caught mode is activated when the player catches the correct animal. Animals will begin their struggling animation and stay at the same spot. The AnimalManager class is responsible for randomizing which animal to call next, and checking if the animal that the player swings her net over is the correct animal.
Zoo Panic went through many iterations. We began with the idea of catching animals in a fixed order, with the objective of catching all the animals in the shortest time and no time limit. The lack of choice and motivation led us to another idea with carnivores and herbivores, where the aim was to catch the animals before too many herbivores were eaten. This devolved into mindless spamming. We went back to the first idea with a twist: players had to lead the animals back to their pens instead, but this slowed the pace so much that it became boring. We then tried a random order each time, which worked better, and finally settled on a time limit for catching each animal that got shorter and shorter as the game progressed. All of these ideas were prototyped. To support this rapid iteration, we made use of class inheritance, so that we could compare multiple ideas side by side by simply swapping a class component between a parent and child class.
Zoo Panic was eventually selected to be exhibited in the Building Virtual Worlds Festival. 20 games were exhibited, out of a total of 78 worlds made in the entirety of the course.
Team: Melody Lee, Zhetao Wang, Wenyu Jiang, Jack Tsai, Minoong Ji
The Old Men and the Sea is a co-op game for 2 players, each of them wielding a PS Move, made for a Building Virtual Worlds course. One player takes control of the lighthouse light, lighting the fish as well as determining the direction of the boat, while the other is in charge of throwing the spear to hit the fish.
I decided on the overall direction of the code, and programmed the fish, the boat, the wave sprite order, event triggers, and the light. The dynamic 2D lighting for the fish was a collaborative effort.
There were four main things to take care of in terms of programming: the fish, the light, the boat, and the spear. Because the y-coordinate also corresponded to depth in the game, we had to make sure that objects scaled linearly with their in-game depth, with farther objects appearing smaller and nearer objects larger.
In order to enable dynamic 2D lighting for the fish, we checked if the light was in the same "depth" as the fish, before checking for every pixel of the opaque layer over the fish whether the light collider was over it. We had to map the texture coordinate of the opaque layer to its in-world coordinate in order to perform the second check. In order to make this less computationally intensive, we used a lower-resolution opaque texture, and scaled the texture up.
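The per-texel check reduces to mapping each pixel of the (low-resolution) opaque overlay into world space and testing it against the light's collider. A simplified Python sketch, treating the light collider as a circle (the texture size, sprite placement, and radius are illustrative):

```python
def pixel_world_pos(px, py, tex_w, tex_h, sprite_origin, sprite_size):
    """Map a texel of the opaque overlay to its world-space position.
    sprite_origin is the world position of the sprite's bottom-left corner."""
    return (sprite_origin[0] + (px + 0.5) / tex_w * sprite_size[0],
            sprite_origin[1] + (py + 0.5) / tex_h * sprite_size[1])

def lit_mask(tex_w, tex_h, sprite_origin, sprite_size, light_pos, light_radius):
    """Boolean mask of texels covered by a circular light collider."""
    mask = []
    for py in range(tex_h):
        row = []
        for px in range(tex_w):
            wx, wy = pixel_world_pos(px, py, tex_w, tex_h,
                                     sprite_origin, sprite_size)
            dx, dy = wx - light_pos[0], wy - light_pos[1]
            row.append(dx * dx + dy * dy <= light_radius ** 2)
        mask.append(row)
    return mask
```

Because the cost is proportional to the texel count, halving the overlay resolution in each dimension cuts the work to a quarter, which is why we scaled a lower-resolution opaque texture up.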
We made a static class, WaveSpriteOrder, that provides a reference for which wave(s) the objects in the game are behind. All dynamic objects in the game, namely, the fish, the light, the boat, and the spear, refer to this class and set their sprite orders accordingly. These objects also store their own sprite order in relation to the waves, so that we can establish if they are in between the same waves. Only when the boat and fish are in between the same waves, give or take a tolerance we set, will the spear be able to hit the fish.
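The band bookkeeping above can be sketched as follows. This is an illustration only: the wave depths, the sorting-order scheme, and the hit tolerance are made-up placeholders, not values from the game.

```python
# Hypothetical wave "depth" boundaries, front to back.
WAVE_DEPTHS = [1.0, 2.0, 3.0, 4.0]

def wave_band(depth):
    """Index of the wave band an object sits in (0 = in front of all waves)."""
    return sum(1 for w in WAVE_DEPTHS if depth > w)

def sprite_order(depth, base=100, per_band=10):
    """Hypothetical sorting-order scheme: nearer bands draw on top."""
    return base - wave_band(depth) * per_band

def can_hit(boat_depth, fish_depth, tolerance=0):
    """The spear can hit the fish only when boat and fish share a band,
    give or take the tolerance."""
    return abs(wave_band(boat_depth) - wave_band(fish_depth)) <= tolerance
```

Keeping one static table of wave positions means every dynamic object derives both its draw order and its hit eligibility from the same source of truth.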
The light direction depends on the angle of the PS Move: moving the PS Move up directs the light farther away, and vice versa. Where the light hits the wave, determined through ray-casting, is where the boat travels towards, unless the light is hitting the fish, in which case the boat travels towards the fish instead.
The boat's x-axis movement is entirely programmed, while its y-axis movement is animated. We did this because we wanted greater control over how much the boat rocks depending on which stage the storm is in. The boat either moves forward or backward; each time it changes direction, it sets a random segment distance within a range that depends on whether it is moving forward or backward. When it has travelled that distance, it changes direction. Depth-wise, it simply moves towards the light.
The fish's x-axis movement is also programmed. The fish changes velocity within a random range at the end of each animated leap, with a chance of changing direction on both the x-axis and the z-axis. This is tied to the fish's animation so that the fish only flips when it is under the waves. As the stages progress, the fish has a higher chance of doing its high leap, making the game more exciting.
The spear's angle is determined by the angle of the PS Move. New spears are only instantiated when the thrown spear is out of the screen.
The Old Men and the Sea was eventually selected to be exhibited in the Building Virtual Worlds Festival. 20 games were exhibited, out of a total of 78 worlds made in the entirety of the course.
Guardian of the Forest is a Kinect game made for a Building Virtual Worlds course. Players play a tree which tries to connect the hollow branches within itself to make a path for the raindrops to reach the fire.
I decided on the overall direction of the code, and was in charge of programming the connection triggers and the level manager. I also did my best to smooth the Kinect movement.
We made use of 2D sprites for the tree parts, because we wanted the flexibility to update the level design by swapping sprites, instead of redoing a 3D model. To do that, we set each part to rotate based on the angle between the two joints it links and the x-axis.
For each end of a tree part that can connect to another tree part, there is a trigger that will set a boolean when it is in contact with another such trigger. When that boolean is true, the droplet will teleport from its current tree part to the connected tree part, creating the illusion of an unbroken path. In addition, each tree part can also by default be connected to another tree part - for example, if there are hollow branches in both the upper and lower arm, these parts are automatically connected, so if the branch is not currently in contact with another trigger, water droplets will flow to the default connected part. If the branch has neither, droplets will flow out of the branch instead.
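The droplet routing above boils down to a small priority rule at the end of each branch: an active trigger connection wins, then the built-in default connection, else the droplet falls out. A minimal Python sketch (class and field names are illustrative):

```python
class TreePart:
    """Sketch of a hollow branch segment."""

    def __init__(self, name, default_next=None):
        self.name = name
        self.default_next = default_next   # built-in connection, e.g. upper arm -> lower arm
        self.trigger_contact = None        # part whose end trigger touches ours, if any

def next_part(part):
    """Where a droplet goes when it reaches the end of `part`."""
    if part.trigger_contact is not None:
        return part.trigger_contact   # teleport to the touching branch
    if part.default_next is not None:
        return part.default_next      # follow the default connection
    return None                       # droplet flows out of the branch
```

Teleporting the droplet between parts, rather than simulating it across the joint, is what keeps the path looking unbroken while the tree's limbs keep moving.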
The level transition for this game is unique in its side-scrolling manner, as we wanted to reinforce the idea that there are fires at various parts of the forest. To make that happen, tree parts for the respective levels were set active or inactive, and the current level is stored as a variable inside LevelManager.
Kinect 1's movement proved to be jerky, so instead of taking the position values of each joint every frame, we averaged several frames to try to smooth the values out. We also adjusted smoothing, correction, prediction, and jitter radius values for the Kinect, while taking care not to introduce too much lag.
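The frame-averaging part is a simple moving average over the last few joint samples. A Python sketch (the window size is illustrative; the Kinect SDK's own smoothing parameters are tuned separately):

```python
from collections import deque

class JointSmoother:
    """Average a joint's 3D position over the last N frames."""

    def __init__(self, window=5):
        self.samples = deque(maxlen=window)  # old frames fall off automatically

    def add(self, pos):
        """Record this frame's (x, y, z) sample and return the smoothed position."""
        self.samples.append(pos)
        n = len(self.samples)
        return tuple(sum(p[i] for p in self.samples) / n for i in range(3))
```

The trade-off is exactly the lag mentioned above: a wider window smooths more jitter but makes the on-screen tree respond later to the player, so the window has to stay small.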
Guardian of the Forest was eventually selected to be exhibited in the Building Virtual Worlds Festival. 20 games were exhibited, out of a total of 78 worlds made in the entirety of the course.
Team: Melody Lee (Programmer), Rishit Bhatia (Programmer), Julian Toker (Sound and Level Designer), Hye Jeong Jeon (Artist), Adam Liss (Artist)
Climb to the Owl is an Oculus game made for a Building Virtual Worlds course. Players take control of two plungers and latch on to blocks in order to climb to the owl. Some blocks spring surprises when the player latches on to them. There were two main things to program: the plungers, or hands, as we call them, and the camera.
I decided on the overall direction of the code, and was in charge of programming the hands, the camera, and all the blocks.
Each hand essentially has two modes: gripped and free. Gripped mode is when the player is latched on to a block; free mode is the opposite. When a hand is free, the position of the plunger corresponds to the position of the PS Move in real life, relative to the camera. While latched, the hand stays in place. Hands can latch on to blocks when their triggers are in contact with them, taking the nearest block if multiple blocks are in contact.
These modes affect the camera differently. If only one hand is gripping a block, the initial position of the camera is directly below that hand, and the free hand automatically hangs under the camera at arm's length if the camera's new position is farther from the free hand than arm's length. If both hands are gripping, the camera gravitates towards the midpoint of the two hands.
When hands are latched onto blocks and the player moves the PS Move controllers, this is interpreted as a pulling motion. The camera will hence move opposite the direction that the player is moving - as an illustration, the player will normally move her hands down after latching on, so the camera will move up, as if the player is pulling herself up. The range of this is limited to a pre-set arm length.
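The camera rules above can be condensed into two small functions, shown here as a vertical-axis-only Python sketch (the arm length, hang offset, and function names are illustrative, not the game's actual values):

```python
ARM_LENGTH = 1.5   # illustrative pre-set arm length
HANG_OFFSET = 1.0  # illustrative distance the camera sits below a lone gripped hand

def camera_target(left, right, left_gripped, right_gripped):
    """Where the camera should gravitate, given hand positions (x, y)."""
    if left_gripped and right_gripped:
        # Both gripping: midpoint of the two hands.
        return ((left[0] + right[0]) / 2, (left[1] + right[1]) / 2)
    if left_gripped:
        return (left[0], left[1] - HANG_OFFSET)
    if right_gripped:
        return (right[0], right[1] - HANG_OFFSET)
    return None  # no grip: handled elsewhere

def pull_camera(camera_y, hand_dy, anchor_y):
    """Pulling: the camera moves opposite the latched hand's motion,
    clamped to within arm's length of the gripped block (anchor)."""
    new_y = camera_y - hand_dy
    return max(anchor_y - ARM_LENGTH, min(new_y, anchor_y))
```

Moving the camera opposite the controller is what sells the pull: the player's hands stay latched in world space, so the world slides down past the player exactly as much as the controllers moved.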
There are also special kinds of blocks in the game that act as elevators, taking the players up. When the player latches on to an elevator block, her other hand will automatically stop gripping whatever block it is gripping on. Both the velocity of the latched hand and the camera are overridden by the elevator block, while the free hand's initial position will hang under the camera.
Climb to the Owl was eventually selected to be exhibited in the Building Virtual Worlds Festival. 20 games were exhibited, out of a total of 78 worlds made in the entirety of the course.
Team: Melody Lee (Programmer), Nguyen Trung Hieu (Programmer), Chan Chung Leung (Producer), Sim Jiajin (QA Tester), Jermaine Cheng (Designer), Ha Thi Thuy Linh (Artist), Sara Jane Sim (Artist)
Skyward is a 2D single-player serious game made for the SUTD Game Lab Internship. Our product advisor was SG Enable, an organisation that seeks to integrate the disabled into society, and they wanted a game with two messages: first, for the able-bodied to empathize with the disabled, and second, for the able-bodied to look beyond disabilities and recognize the strengths of the disabled instead. Even though the programme was 14 weeks long, my team struggled to reconcile both messages into a game without being didactic, and iterated multiple times before arriving at Skyward, with three weeks to spare for production.
I was in charge of programming all the moving platforms, Rolly, as well as the lethal enemies.
The two characters in the game, Rolly Polly, the blue bird, and Lanky, the green anteater, each have unique abilities: Rolly Polly can jump, while Lanky can levitate objects. The two characters never meet until the end, so they have to use their abilities to help each other past their respective hurdles. The puzzle nature of our game meant that the environment needed diverse objects to interact with, such as normal moving platforms, switch-activated moving platforms that continue along their path only while the switch is pressed, and lethal enemies.
Team: Melody Lee (Programmer), Zongye Yang (Programmer), Sophia Mei Xue (Artist), Chuck Lee (Artist), Larry Chiang (Sound Designer)
Blow, Dude, Blow is a Jam-O-Drum game made for a Building Virtual Worlds course. The Jam-O-Drum is a table with four wheels that players can spin at each corner, each of which has a button in the middle to press. Players take control of a balloon whose direction they can change by spinning the wheel, and try to direct the falling man to a safe hole by pressing the button to make their balloons blow.
I was in charge of programming the scaling of the 2D floor, the procedural generation of floors, as well as the man's dynamically breaking limbs.
The tower walls are made from a single short 3D model with a texture that repeats, done through shaders, while the man and the floors of the tower are 2D sprites. Both 3D and 2D objects are rendered with two cameras.
To determine the speed at which the floor should scale up, we made use of the projected area of a 3D object on the screen as it gets closer to the camera, and scaled the floor accordingly. This was done by checking the WorldToScreenPoint of the left and right points of a hypothetical 3D ball, whose hypothetical velocity we increased after each successive floor.
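The underlying relationship is just pinhole projection: an object's projected size is proportional to its size divided by its distance from the camera. A Python sketch of the scale ratio (the focal length and ball size are made-up placeholders; the game used Unity's WorldToScreenPoint instead of this formula directly):

```python
def projected_half_width(world_half_width, depth, focal=1.0):
    """Screen-space half-width of an object under a simple pinhole projection."""
    return focal * world_half_width / depth

def floor_scale(ball_depth, start_depth, world_half_width=0.5, focal=1.0):
    """Scale factor for the 2D floor sprite: the hypothetical ball's projected
    width now, relative to its projected width at the start of the fall."""
    now = projected_half_width(world_half_width, ball_depth, focal)
    start = projected_half_width(world_half_width, start_depth, focal)
    return now / start
```

Sampling an actual projected 3D object, as we did, gives the same curve while staying consistent with whatever camera settings the scene uses.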
Hole positions were procedurally generated through a level system that sorts the different kinds of hole images into different stages, and allows us to set the number of floors per stage. For example, we could have 5 floors for the first stage, which has 3 possible hole sprites for us to choose from. The level manager would randomize the rotation of the sprite as well as which sprite to use for five floors, before moving on to the next stage, which might have 3 floors with 1 hole sprite.
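The stage system above can be sketched as a small table-driven generator. The stage table below mirrors the example in the text but is otherwise illustrative (sprite names and rotation scheme are placeholders):

```python
import random

# Hypothetical stage table: (number of floors, candidate hole sprites).
STAGES = [
    (5, ["hole_a", "hole_b", "hole_c"]),  # first stage: 5 floors, 3 sprites to pick from
    (3, ["hole_d"]),                      # next stage: 3 floors, 1 sprite
]

def generate_floors(stages=STAGES, rng=random):
    """Return (sprite, rotation_degrees) for every floor, stage by stage."""
    floors = []
    for count, sprites in stages:
        for _ in range(count):
            floors.append((rng.choice(sprites), rng.uniform(0, 360)))
    return floors
```

Randomizing both the sprite choice and its rotation multiplies the variety each hole image provides, so a small sprite set still reads as fresh floors.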
The dynamic breaking of the man's limbs is done by putting a trigger on each limb. The game ends when the trigger on the man's head hits the ground collider.
Team (Programmers): Melody Lee, Rose Marie Tan, Liew Jia Zheng Alex and Sing Keng Hua
VoxeLab is a game prototyping tool, made for a Thematic Systems Project, that allows you to plan out a game quickly by using voxels to build game worlds. You need not manipulate meshes to generate 3D worlds, allowing you to focus more on the game mechanics with respect to the environment. VoxeLab also aims to give game-makers a clear idea of the metrics of the in-game environment, allowing you to test your game prototype. You can manipulate voxels with the features of paint software, add basic game elements, and test the prototype.
I was in charge of programming the following tools: Selection, Move, Magic Wand, and Fill.
Selection works by clicking to set a world point as the center of the 3D selection box, then dragging to expand the box until the button is released. Magic Wand allows the player to select contiguous or non-contiguous voxels within the selection that have the same color, allowing for a certain level of tolerance; the contiguous variant was implemented using the flood fill technique. Move deletes the voxels within the selection box and reproduces them each frame at wherever the player drags the box. Fill paints the voxels within the selection box a chosen color.
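The contiguous Magic Wand is a standard flood fill over the voxel grid. A self-contained Python sketch, using 6-connectivity and a per-channel color tolerance (the data layout, a dict from coordinates to RGB, is illustrative):

```python
from collections import deque

def magic_wand(voxels, start, tolerance=0):
    """Contiguous magic wand: flood-fill from `start`, collecting 6-connected
    voxels whose color channels are each within `tolerance` of the start color.
    `voxels` maps (x, y, z) -> (r, g, b)."""
    target = voxels[start]

    def matches(color):
        return all(abs(a - b) <= tolerance for a, b in zip(color, target))

    selected = {start}
    queue = deque([start])
    while queue:
        x, y, z = queue.popleft()
        for dx, dy, dz in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
            n = (x + dx, y + dy, z + dz)
            if n in voxels and n not in selected and matches(voxels[n]):
                selected.add(n)
                queue.append(n)
    return selected
```

The non-contiguous variant drops the adjacency constraint and simply scans every voxel inside the selection box against the same `matches` test.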
VoxeLab can be used in two modes: Create Mode and Game Mode. In Create Mode, you can design, build, and edit 3D worlds and game levels using brushes, and group structures of voxels into objects to support object-based modelling. In Game Mode, you can edit basic game elements (such as jump height and gravity) so as to test your 3D world as a game environment.
Jiku Video Player is an Android application that enables users to zoom in and track objects of interest in videos, based on a combination of computation and crowd-sourcing. A lower resolution is sufficient for a video most of the time, especially on a screen as small as a mobile device's. Sometimes, however, you might want to view objects in the video in greater detail, for example in security camera footage, and hence want to render that particular area in higher resolution.
I developed this application and its user interface, given tracking points of objects of interest that were pre-generated by another program, as well as a backend program for decoding the video at various resolutions. This was done for my Final Year Project at the National University of Singapore.
Team (Programmers): Melody Lee, Ziwei Peng, Yang Zhao
Item Genie seeks to help players make better in-game choices by allowing comparisons between items' attribute bonuses and costs, so that players can tailor their in-game purchases to the current situation of the game. The UI focuses on facilitating these comparisons graphically, and provides functions to sort, search, and selectively focus on specific items and attributes for side-by-side comparison. This is based on backend data of all the in-game items and their properties.
I was in charge of programming the star chart.
The current version of the program uses Riot Games' League of Legends as an example to demonstrate its functions.
This program was done for a User Interface Development module.
Team (Programmers): Melody Lee, Goh Horng Bor, Steve Ng and Jiao Jing Ping
Geekdo is a to-do manager that enables you to schedule and organise your tasks quickly and efficiently. Primarily aimed at geeks, it allows you to access all functionality from the command line, although non-geeks will find Geekdo easy to use as well through the graphical user interface.
I was in charge of programming the addition, deletion, and editing of tasks. This project was done for a Software Engineering module, and was selected for the Hall of Fame, for which 11 out of 60 possible projects were chosen.
Some of Geekdo's functions include natural language processing; adding, deleting, and editing of tasks; auto-complete; prioritising of tasks; and reminders when tasks' deadlines draw near.
Musical compositions can often be expressed in a structured manner, whether in musical form or rhythmic patterns. With this basic idea, complex compositions can be derived from a certain set of musical rules. In fact, multiple compositions can be derived from a single set of rules, which makes it possible to create musical improvisations as well. Generative Grammar is a JavaScript program that generates a language from user-defined rules.
I was in charge of generating the tree of possibilities from the input rules.
This project was done for a research module.
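Generating the tree of possibilities from input rules amounts to recursively expanding every nonterminal through each of its alternatives. The original program was in JavaScript; this is a Python sketch with a made-up rule format and toy musical grammar, purely to illustrate the idea (the depth limit guards against recursive rules).

```python
import itertools

def expand(symbol, rules, max_depth=6):
    """Enumerate the strings a symbol can derive under user-defined rules.
    `rules` maps a nonterminal to a list of alternatives, each a list of
    symbols; anything without a rule is a terminal."""
    if symbol not in rules:
        return [[symbol]]          # terminal: derives only itself
    if max_depth == 0:
        return []                  # cut off runaway recursion
    results = []
    for alternative in rules[symbol]:
        # Expand each symbol of the alternative, then take every combination.
        parts = [expand(s, rules, max_depth - 1) for s in alternative]
        for combo in itertools.product(*parts):
            results.append([token for part in combo for token in part])
    return results

# Toy grammar: a phrase is two motifs, or a motif followed by a cadence.
RULES = {
    "phrase": [["motif", "motif"], ["motif", "cadence"]],
    "motif": [["C"], ["E"]],
    "cadence": [["G", "C"]],
}
```

Each leaf-to-root path through the resulting tree is one derived composition, which is exactly why a single rule set yields many improvisations.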