Team-based game creation at BYU
At Brigham Young University, a team of about 20 students builds a video game each year. This past year's game was Beat Boxers, completed in the summer of 2018, which won first place in the E3 College Game Competition.
At the start of each summer, BYU invites students in the animation and computer science departments to pitch ideas, and one pitch is chosen as the next group capstone project for the art and computer science majors involved.
Target piece painted by student Vanessa Palmer
In 2017, Beat Boxers won the vote. We wanted its core experience to give players a chance to steal the limelight in a performance battle royale. The project eventually boiled down to a fighting game in which the player chooses from three moves, each of which is more powerful when it lands on the beat of the music. The player also has to pick the right move each beat to trump the other player's choice, like several rapid games of rock-paper-scissors.
All positions on the project, including producer and director, were undergraduate students, and the person who originally pitched Beat Boxers became the programming lead.
Executing in Unreal
Beat Boxers was made not only for the enjoyment of making a game, but also to further students' education in game-making principles. One reason we used Unreal Engine is that it is a powerful, stable package that can achieve a professional look. BYU students apply to AAA studios all over the world, so the finished look was important to us. Unreal Engine is also valuable for teaching deeper principles: its node networks and deep capabilities let students experiment and solve difficult problems. UE4's Material system is built on node networks and procedural workflows, which our students are used to, so it was a good fit for that reason as well.
During the summer, the artists rapidly prototyped using UE4 Blueprints, iterating through several whiteboxed character and stage designs. This helped us cement the experience we wanted players to have and design toward it.
Whiteboxed version of Beat Boxers in Unreal Engine
Beat Boxers is both a fighting game and a rhythm game, but because each genre has its own mechanics, we had to find the right gameplay to mix elements of both. The Blueprints system let us create dozens of gameplay prototypes quickly and easily: the lead designer could change and extend the gameplay in Blueprints before we implemented it in C++.
Final gameplay screenshot
Striking a balance between fighting and rhythm games was difficult. In order to prevent ourselves from straying too far from either genre, we chose to focus on making a fighting game first and enhancing it with rhythmic elements.
From playtests, we determined it was impossible to encourage players to attack on the beat unless the pace of the game itself was built around the beat. This led to a design where inputs are buffered and fire only on the beat, at the same time as the opponent's inputs. The rock-paper-scissors nature of our moveset kept the control scheme simple for people new to fighting games, while giving us room to use inputs in creative ways for fighting-game enthusiasts.
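As a rough illustration of the trumping logic, a rock-paper-scissors resolution might look like the sketch below. The move names are hypothetical, not the game's actual identifiers.

```cpp
// Sketch of rock-paper-scissors move resolution. Move names are
// hypothetical; Beat Boxers' real moveset may differ.
enum class Move { Strike, Guard, Grab };

// Returns +1 if a trumps b, -1 if b trumps a, 0 on a tie.
// Strike beats Grab, Grab beats Guard, Guard beats Strike.
int Resolve(Move a, Move b) {
    if (a == b) return 0;
    bool aWins = (a == Move::Strike && b == Move::Grab) ||
                 (a == Move::Grab   && b == Move::Guard) ||
                 (a == Move::Guard  && b == Move::Strike);
    return aWins ? 1 : -1;
}
```

Because both players' buffered inputs fire on the same beat, a function like this can settle each exchange symmetrically.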
Blueprint of the logic to know if a player hit on or off beat
Using the FMOD plugin, we fired an event on every beat, in sync with the music. We used this event in conjunction with timers to open and close an "onbeat window" of approximately a tenth of a second before and after each beat of the song. When the window was open (determined by which timers were active), an attack counted as onbeat; otherwise it was offbeat.
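A minimal sketch of that window logic, assuming a beat callback records the time of each beat and using the roughly ±0.1 s tolerance mentioned above. The fixed tempo and names here are illustrative; the game itself drives this with FMOD events and Blueprint timers.

```cpp
// Sketch of the onbeat-window check. The fixed tempo is an
// assumption; the game drives this with FMOD beat events and timers.
struct BeatClock {
    double lastBeatTime = -1.0e9;  // time of the most recent beat event
    double beatPeriod   = 0.5;     // seconds per beat (120 BPM assumed)
    double window       = 0.1;     // tolerance before/after each beat

    void OnBeat(double now) { lastBeatTime = now; }

    // An attack is onbeat if it lands within `window` seconds after
    // the last beat or within `window` seconds before the next one.
    bool IsOnBeat(double now) const {
        double sinceLast = now - lastBeatTime;
        double untilNext = (lastBeatTime + beatPeriod) - now;
        return sinceLast <= window ||
               (untilNext >= 0.0 && untilNext <= window);
    }
};
```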
The first challenge was to personify genres of music as appealing characters, each with their own attitude. The classical violin character Maestra was developed first. She was based on a concept sketch from the game's initial pitch, which got everybody excited about what the game could be. The design passed through many hands and many iterations over several months until she became the tall, elegant, and deadly opponent we presented at E3 2018. Unreal Engine let us pull our models in quickly, so we could see them in context with the assets being made by other members of the team as early as possible.
Comparison between concept art (left) and execution in Unreal (right).
We also wanted a second character who contrasted with Maestra while belonging in the same world. We decided that a bulky, fun rock character who loves the spotlight would play against Maestra's slim silhouette while still unifying the game. He's been lovingly named Riff. Developing the characters one after the other instead of all at once proved valuable: a single team could solve one character's problems, then hand the lessons learned to the next character team, so no team had to solve the same problems twice.
Substance Painter was used to texture all of the assets in Beat Boxers. Maestra's wood texture is completely procedural; her eyebrows and gold accents are hand-painted. The maps were exported from Substance Painter as basic texture maps, then additional effects like Fresnel were added via shader networks in Unreal Engine.
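For context, the Fresnel term layered on top of exported maps in a shader network is typically an approximation like Schlick's. A minimal sketch follows; the constant 0.04 is a common dielectric base reflectance, not a value taken from the project.

```cpp
// Schlick's approximation of the Fresnel term, the kind of rim/edge
// effect added on top of Substance maps in a material network.
// cosTheta is the dot of the view vector and the surface normal;
// f0 is the base reflectance (0.04 is a common dielectric default).
float FresnelSchlick(float cosTheta, float f0 = 0.04f) {
    float m = 1.0f - cosTheta;
    return f0 + (1.0f - f0) * m * m * m * m * m;  // (1 - cosTheta)^5
}
```

Viewed head-on (cosTheta near 1) the term stays at the base reflectance; at grazing angles it climbs toward 1, producing the bright silhouette edge.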
Comparison between concept art and execution in Unreal Engine.
Both Riff and Maestra were designed to be quintessential examples of the genres they represent. They were modeled in Maya and ZBrush and textured in Substance Painter. Maestra has 37,670 triangles and Riff has 29,747; both use a single UV channel.
Regarding animation, we wanted to make sure the characters looked like they were fighting and also performing. We studied many Street Fighter-style games for reference and found ways to make each character's moveset unique. We attended concerts and discovered essential differences between the dignified elegance of a symphony and the raw energy unleashed at a rock concert. All of this helped us build the movesets for Maestra and Riff.
Introducing a rhythm component into a fighting game is challenging: every aspect of the game had to emphasize the importance of tempo and beat. To accomplish this, we created two animation files for each attack. We played with the timing and exaggeration of poses so that players who timed actions to the beat were rewarded with a more powerful and visually interesting move animation. Players who ignored the beat got a weaker version of the move using the second animation, as well as reduced points. We handed the animations to our magical programmers, who strung the files together in node networks, creating smooth transitions when a player chained combos on the beat and awkward pauses when the player acted off it. In the finished version, players could feel they were being held accountable for performing well, and we were very excited about that.
The movesets were arranged in a state machine with specific criteria such as player input determining the movement between states.
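A stripped-down sketch of such a state machine is below; the state names, inputs, and animation names are all hypothetical stand-ins for the game's real assets.

```cpp
#include <string>

// Minimal moveset state machine: input triggers a transition from
// Idle to Attack, and beat timing selects the onbeat or offbeat
// animation variant. All names here are hypothetical.
enum class FighterState { Idle, Attack };

struct MoveStateMachine {
    FighterState state = FighterState::Idle;
    std::string currentAnim;

    // Transition on input; pick the animation variant by beat timing.
    void HandleInput(const std::string& input, bool onBeat) {
        if (state == FighterState::Idle && input == "punch") {
            state = FighterState::Attack;
            currentAnim = onBeat ? "Punch_OnBeat" : "Punch_OffBeat";
        }
    }

    // Called when the attack animation finishes playing.
    void FinishAttack() {
        state = FighterState::Idle;
        currentAnim.clear();
    }
};
```

In the real game the criteria on each transition are richer (combo state, hit reactions, and so on), but the shape is the same: the current state plus the input decides the next state and which animation file plays.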
Many of the assets in the game were designed to be flexible and reusable. We used Houdini to build intricate procedural assets and ported them into Unreal using the Houdini Engine. The Unreal Editor’s flexibility to use the Houdini Engine plugin allowed us to fluidly adjust components from various sources in the Unreal Editor. This made it remarkably easy to art-direct even complicated pieces like the stadium seating and scaffolding.
We used minimal lights in the game. Each character was lit in isolation through lighting channels, so each could be lit and fine-tuned individually without washing out the other character or the environment. To make sure the characters stood out, we let background elements glow only dimly, kept all background lights low, and added fog. We iterated on the lighting with the art lead, who did paintovers for each pass.
This project has received a lot of high praise for its look, which was made possible by Unreal Engine’s quality artistic tools.
The crowds are hardware instanced meshes with animations baked into the texture. Animations are accessed through vertex offset information in the shader.
Normally the CPU has to send a draw call to the GPU for each mesh to be drawn. This can waste a lot of the GPU's time, since the GPU can finish a mesh before the CPU is able to send the next call. With hardware instancing, the GPU stores an array of transformation matrices, and with a single draw call it draws the model once for each matrix.
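As a toy illustration of the savings (not engine code): drawing a crowd mesh by mesh costs one call per member, while an instanced draw submits the whole transform array with a single call.

```cpp
#include <array>
#include <vector>

// Toy model of draw-call counts with and without hardware instancing.
using Matrix4 = std::array<float, 16>;  // a 4x4 transform, row-major

// One draw call per crowd member when drawing meshes individually.
int DrawCallsPerMesh(int crowdSize) { return crowdSize; }

// With instancing, the whole transform array rides along in one call.
int DrawCallsInstanced(const std::vector<Matrix4>& transforms) {
    return transforms.empty() ? 0 : 1;
}
```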
Our hardware instanced crowd
Hardware instancing has downsides: it doesn't work on computers without a dedicated graphics card, and it doesn't support skeletal animations, so out of the box the crowd won't move and will look very uniform.
To get animation on a hardware-instanced mesh, start with an animated skeletal mesh and process it and its animation. The result is a static mesh with specific values packed into its vertex colors, a texture encoding the animation, and a fairly complicated shader.
The node network that animates with a texture.
One color channel of the animation texture.
This process calculates the difference in location of each vertex from the bind pose at each frame. Those differences are stored in a texture, with one axis representing time (frames, really) and the other representing vertex ID (a unique identifier you assign to each vertex). We encoded the vertex ID across the RGBA channels of the model's vertex color data. A shader then decodes the vertex ID from the vertex colors, uses the elapsed time to figure out which UVs to sample, and samples the texture to determine how far to displace each vertex, feeding that value into the vertex offset. Make sure to disable sRGB and use nearest filtering, and be sure to transform the vertex displacements into world space before applying them as vertex offsets. To combine multiple animations into one image, sample the different animations like you would a sprite sheet.
Spritesheet timeline for the animated textures for crowd characters.
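A sketch of the packing and lookup described above, assuming frames run along one texture axis and vertex IDs along the other. The layout, frame rate, and names are our assumptions, not the project's exact pipeline.

```cpp
#include <array>
#include <cstdint>

// Pack a vertex ID into the four 8-bit RGBA vertex-color channels
// (little-endian byte order is an assumption).
std::array<std::uint8_t, 4> EncodeVertexId(std::uint32_t id) {
    return { std::uint8_t(id & 0xFF),
             std::uint8_t((id >> 8) & 0xFF),
             std::uint8_t((id >> 16) & 0xFF),
             std::uint8_t((id >> 24) & 0xFF) };
}

// Shader-side inverse: recover the vertex ID from the color bytes.
std::uint32_t DecodeVertexId(const std::array<std::uint8_t, 4>& rgba) {
    return  std::uint32_t(rgba[0])        |
           (std::uint32_t(rgba[1]) << 8)  |
           (std::uint32_t(rgba[2]) << 16) |
           (std::uint32_t(rgba[3]) << 24);
}

// Map elapsed time and vertex ID to the texel holding that vertex's
// displacement: U indexes the frame, V indexes the vertex. Sample
// with nearest filtering and sRGB disabled, as noted above.
struct UV { float u, v; };
UV AnimationUV(double time, double fps, int frameCount,
               std::uint32_t vertexId, int vertexCount) {
    int frame = int(time * fps) % frameCount;
    return { (frame + 0.5f) / frameCount,       // center of the texel
             (vertexId + 0.5f) / vertexCount };
}
```

The +0.5 texel-centering keeps nearest-filtered samples from landing on texel boundaries, which would otherwise pick up a neighboring vertex's data.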
You can vary the individuals within the crowd using the single random value Unreal's Instanced Static Mesh component assigns to every instance. Note that transforming the location deltas from local to world space can have issues with scaling. You will also notice the normals are off: the mesh still uses the normals from its static pose. If that becomes a problem, it can be fixed by also calculating, and storing in a second image (or a different portion of the same image), the change in normals over the course of the animation.
We used Unreal's fine-tuning options and powerful tools to optimize the game while keeping it looking good. We disabled collisions on everything but the characters and the stage they were on, and we set the background assets to static lighting.
We also adjusted the fog and particle effects in the crowd to keep performance up. Originally there were three types of fog effects on the map. Since the viewing angle doesn't change much in the game, our student in charge of optimization reduced the fog to a single effect and extended it across the camera's view. While that may not look as good from other angles, from the player's viewpoint it looks the same as our first iteration.
We had three master Materials with increasing levels of complexity. The simplest was used for the entire crowd and background; it had only a color channel. The second was used for the stage floor and set dressing; it had color, roughness, normal, metallic, and AO. The last was used only for the characters and had all of the above, plus controls for tinting, Fresnel effects, and a few others. By doing this, we kept all shader complexity levels in the green.
Our effects were created as hand-painted animated sprite sheets, which Unreal's Cascade particle system reads frame by frame from left to right, playing them as 2D animations in-game, timed to the player's attacks and actions. This process let us plan and design the shapes, pacing, and dissolve of the effects to achieve the stylized appearance we wanted. By layering the animations with sparks, light, and hit effects, we integrated them more smoothly into the 3D gameplay. The resulting effects are more stylized than 3D effects, similar to those in games such as Dragon Ball FighterZ and Guilty Gear Xrd. We varied which effect played depending on whether the player hit on or off the beat.
The offbeat effect on the left, the onbeat effect on the right.
Unreal Engine allowed us, as a team of students, to learn how to create professional-looking content quickly and optimize it to perform well. Its interface was approachable enough for our artists to bring in their own work and deep enough for our programmers to customize gameplay to achieve the experience we were after. Because of this, we were able to iterate quickly and polish, which was one of the key reasons we were prepared for E3's College Game Competition. Everyone on the team was excited to be selected for the opportunity to go to E3 in California, rub shoulders with our peers, and see what AAA game studios are making. We were also honored that Beat Boxers won the 2018 college competition. The entire team of programmers, modelers, texture artists, concept designers, gameplay designers, and logistics staff worked together to produce an experience we enjoy, and we hope you will enjoy it too. The game is available on Steam for free.
Director – David Burnham
Producer – Jessica Runyan
Design Lead – Mike Towne
Art Director – Vanessa Palmer
Music – Alastair Scheuermann, Jarrett Davis
Sound – Jared Richardson, Dallin Frank
Created by (the following students took on various roles in the making of the game):