The Bully Pulpit

Words about things and stuff.

March 21, 2017

Mixed Reality Goes Old Skool

Retro is all the rage. Minecraft, Terraria, and the Nintendo 64 sold out in minutes. So, how do you bring a retro feel to cutting-edge technology? That was the task we faced when client Mixed River hired us to help with Synergy Technical’s Game Changer HoloLens game. We know not everyone is a retro gamer, so the characters had to be engaging enough to appeal to today’s user, but still reminiscent of the aliens we loved in Space Invaders.

[Image: 2_MagicaVoxel_LineUp]
Awesome right?

In the beginning, we thought we’d keep a bit of the classic process we use for development: sketch the character, detail it out, and then run through production. We soon realized we needed a way to make a big character in an itty-bitty playing space.

Enter MagicaVoxel. Instead of the normal pipeline, our concept artists built the little guys in this lightweight voxel editor. MagicaVoxel helped us achieve two goals: excellent visuals and reduced production time.

[Image: Synergy_InProcess]

Kinda makes you feel bad shooting them, doesn’t it?

Our designers also used MagicaVoxel for our asteroids.

[Image: 4_Asteroids_b]

These little gems get blown apart and increase game complexity. As you progress through the game, more of them show up and blow up (if you’re good, that is). Again, using MagicaVoxel reduced production time while improving production quality for this application.

Sometimes, it really pays to think outside of the voxel.

February 1, 2017

Key Art production for Planet3

As part of the game modules we created for DC-based Planet3, we were asked to create a series of print-res promotional posters to capture the drama of each game.

We identified key moments and, using in-game Unity 3D assets, composed the scenes and positioned player avatars to strengthen the composition.

[Image: KeyArt]

Basic lighting and cubemaps were dropped in to rough in tone and overall mood.

[Image: KeyArt_2]

Using a custom script, each layer of the scene was rendered out at a very high resolution for compositing.

[Image: KeyArt_5]
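As a rough, single-pixel sketch of the compositing step (the function names and the straight-alpha RGBA convention here are ours, not the production script’s), separately rendered layers can be stacked back-to-front with a standard “over” blend:

```python
# Single-pixel sketch of "over" compositing. Each rendered layer is treated
# as one straight-alpha RGBA value in 0..1; a real compositing pass runs
# this per pixel over full-resolution images.

def over(top, bottom):
    """Composite one straight-alpha RGBA value over another."""
    tr, tg, tb, ta = top
    br, bg, bb, ba = bottom
    a = ta + ba * (1.0 - ta)
    if a == 0.0:
        return (0.0, 0.0, 0.0, 0.0)
    blend = lambda t, b: (t * ta + b * ba * (1.0 - ta)) / a
    return (blend(tr, br), blend(tg, bg), blend(tb, bb), a)

def composite(layers):
    """Stack layers back-to-front over a transparent background."""
    result = (0.0, 0.0, 0.0, 0.0)
    for layer in layers:
        result = over(layer, result)
    return result
```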

Elements were enhanced and reworked for a final overpaint and poster delivery…

[Image: KeyArt_6]

…and the process repeated for each additional module.

Climate Module

[Image: Climate-final-key-art]

Volcano Module

[Image: PLANET3_Volcano_key-art]

Plants Module

[Image: PLANET3_Plants_key-art]

May 16, 2016

Lighting for Mobile VR

Multi-lightmap walkthrough for Kellogg’s Marvel’s Civil War VR

The goal of Kellogg’s Marvel’s Civil War VR was to drop users into the action of the feature, allow them to choose Captain America or Iron Man and then fight their way through 8 levels of Crossbones and his Henchmen. The levels themselves were created from film references – with a generous helping of our own creative interpretation – and resulted in 3 main maps: Lagos, Berlin and the Leipzig Airport.

We set the experience in the early evening, partly to reduce eye strain but in large part to allow for much more dramatic lighting. Because our target was mobile VR, we were severely limited in what we could do with real-time lights and shadows. To ensure the experience ran across as many devices as possible, we needed to bake the lighting into the scenes using light maps.

Traditionally, light maps have meant static lighting. Our goal, however, was to build an experience that was much more dynamic. This led us to the multi-lightmap technique: a way to get a lot more life into a VR environment without breaking the lighting budget.

Here’s a breakdown of the multi-lightmap process:

First, the environments were divided into sectors. Each sector was given one main light and a couple of accent lights. Generally, the main light is something like sunlight or another primary light source that is consistent throughout all sectors. The accent lights are secondary light sources, such as lamp posts or traffic signals, and have their own colour and intensity, animated as needed. These lights should not affect any objects outside their sector, or unexpected results may occur.

Once we had our map subdivided into sectors, we began lighting. This is where we took a different approach: instead of baking the light colour and intensity, we used the light map to bake only light position and shadows. This was accomplished by using the RGB channels of the light map to store the lighting information for each light.

These three channels are the reason we are limited to one main and two accent lights per sector. Red could be used for the main light while blue and green are used for the two accents, for example. This gave us something that looked like this:

[Image: sputlight_probes_01edit]

In this case, the red channel was used as consistent point lighting throughout the Lagos market rather than a directional light that might be used in a daylight setting. An ambient occlusion pass, saved in the alpha channel, was also included in the light map. With a custom shader, we assigned colour and intensity to each of the channels and ended up with something like this:

[Image: sputlight_final_01edit]

One of the benefits of controlling lighting intensity and colour through a shader is that it becomes easy to make adjustments to the lighting in real time. The image below uses the same light maps with different colours and intensities assigned in the materials.

[Image: sputlight_alt01_01_1]
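Numerically, the channel math can be sketched like this (a minimal illustration with hypothetical names; the real version runs per texel inside the custom shader):

```python
# Sketch of the multi-lightmap combine: the baked map stores per-light
# masks (position/shadow) in R, G, B and ambient occlusion in A; colour
# and intensity live in the material and can change at runtime without
# rebaking.

def shade(lightmap_texel, lights):
    """Combine baked RGB light masks with runtime colours/intensities.

    lightmap_texel: (r, g, b, ao), baked values in 0..1
    lights: three (colour_rgb, intensity) pairs, one per channel
    Returns the lit RGB value for this texel.
    """
    r, g, b, ao = lightmap_texel
    out = [0.0, 0.0, 0.0]
    for mask, (colour, intensity) in zip((r, g, b), lights):
        for i in range(3):
            out[i] += mask * colour[i] * intensity
    return tuple(c * ao for c in out)  # occlusion darkens everything
```

Relighting the scene is then just swapping the `lights` list, which is exactly why the same baked data can read as night or as an overcast day.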

Since the ambient occlusion pass is also available through the shader, more drastic lighting changes can be achieved easily, such as changing the night scene to an overcast day:

[Image: sputlight_alt02_01]

Again, this is achieved by adjusting material settings using the same initial RGB light map. With all these lighting options tied to materials, it can be difficult to keep the colour settings in sync across the materials in each sector – and even harder to animate them all. To solve this, we came up with a Light Manager script.

This script is added to a null object and pulls in all material attributes from objects within its volume, which makes them editable in one place. An animation controller is then added, allowing the lights to be animated using only one animation clip per sector. This, in turn, allows all objects affected by the same light to share the same settings even if they use different materials – making it a very powerful tool.
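In spirit, the Light Manager works something like this sketch (hypothetical names, plain Python for illustration; the production version is a Unity script on a null object driven by an animation controller):

```python
# Sketch of the Light Manager idea: one object owns the per-sector light
# settings and pushes them to every material in its volume, so a single
# animation clip can drive many materials at once.

class Material:
    """Stand-in for a renderer material with named shader properties."""
    def __init__(self, name):
        self.name = name
        self.properties = {}

class LightManager:
    """Edits light colour/intensity in one place for a whole sector."""
    def __init__(self, materials):
        self.materials = materials  # everything inside this sector's volume

    def set_light(self, channel, colour, intensity):
        # Because the attributes share names across materials, one call
        # keeps every object in the sector in sync.
        for mat in self.materials:
            mat.properties[channel + "_colour"] = colour
            mat.properties[channel + "_intensity"] = intensity
```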

The three light managers for the Lagos map can be seen in yellow below:

[Image: sputlight_manager_01]

Once we had the environment looking the way we wanted, we realized we would have skinned characters – Crossbones’ henchmen – running through these maps. And if they were not affected by the flickering lights or by passing through the pools of light that made up the levels, it would break the immersion.

The Light Manager gave us access to all of the colours and intensities of the lights, so it was a matter of coming up with a way to apply these to a mesh without a static light map. The key was utilizing light probes – though not exactly in the way they were intended. In the same way the light maps were repurposed, the light probes were baked with the same RGB light data as the static light maps. But since the original lighting was baked outside of Unity, the lighting set-up had to be mirrored in Unity to bake the probes.

[Image: sputlight_probes_01edit]

To this end, we wrote a shader similar to the one used for the light maps, but one that reads the RGB channels of the light probes instead. Since all of the material attributes shared the same names in the Light Manager, we were able to apply the same colour and intensity to the skinned meshes as we did to the environments. This allowed the henchmen to blend into the scene seamlessly, picking up accent lights and, in the example below, flickering lights:

[Image: sputlight_skinned_01edit1]

This technique was used throughout the entire project, allowing us to create dynamic lighting effects and enhance the immersive feel of the experience without using a single real-time light.

Try it for iOS https://itunes.apple.com/us/app/kelloggs-marvels-civil-war-vr/id1093762466 or Android https://play.google.com/store/apps/details?id=com.kelloggs.civilwarvr

April 22, 2016

Kulipari Teaser Concept Art

Behind any beautifully animated project is often a treasure trove of equally beautiful concept art that is rarely seen outside the studio walls.

This is particularly the case with the Kulipari teaser we produced for Outlook’s multi-talented Trevor Pryce. To make sure some of this behind-the-scenes work from the teaser production sees the light of day, we’ve pulled a couple of selects. Creatures, storyboards, matte paintings – all amazing work in its own right. Hope you enjoy!

(Thanks Kit, Matt, Tommy and Nick.)

PS: Check out the Kulipari teaser here!

July 2, 2015

Google Cardboard Developer Tools: Black Screen Issue

Google Cardboard has brought virtual reality to the masses with its simple, effective, and inexpensive design. In the interest of sharing our experiences and furthering the Cardboard community, we wanted to share an issue we ran into while developing some VR experiences, and how we fixed it.

We started to experience a strange bug on iOS devices when testing our apps: instead of the actual game view, we would get a black screen saying “Let’s get you set up”. By looking at the Debug.Log() messages, it was clear that the game was working and seemed to be running behind the black screen. Several work-arounds later, the “Let’s get you set up” message was still blocking the view. Although we strongly believe in the power of Google, a thorough search did not produce any results on how to fix this. So we started to dig in.

After a couple of hours of banging our heads against the wall, we noticed that this screen appears after the game has started and a couple of frames have rendered. It was definitely related to the iOS SDK, and was probably some ViewController pushed on top of the game’s view controller. So we decided to find out where that code was called from.

After some file “grep’ing” we found that these strings are stored in the libplatform.a binary, for which we had no source code. The next step was to try to prevent this code from executing, so we explored all of the SDK code we did have, hoping to find the calling points. Luckily, the solution came pretty easily from there: CardboardAppController.mm has a couple of functions, one of which is “void launchOnboardingDialog(){…}” – by simply commenting out its contents, we solved the black screen problem!

Hopefully this helps some of you out there who may be dealing with the same issue. This was a quick fix, but we are working on a solution that will stop this problem from happening in the first place. Stay tuned…

January 9, 2015

Maestro! A Symphony of Mobile Audio Tools

Audio for mobile presents some pretty fearsome challenges. Today’s software offers almost limitless audio features and options, but once your application goes to mobile it’s a whole different ballgame. The audio developer is in a constant struggle for CPU, RAM, and the small amount of volume produced by that tiny little speaker.

Robust, interactive features like those found in today’s audio middleware seem completely out of reach for the mobile developer. Middleware like Wwise or FMOD can handle complex playlists of music and can change the music interactively based on the user’s actions. It knows the natural pulse of the song it is playing and can be used to synchronize other game events. It can automatically change volume levels to make sure every sound is heard clearly. All of these would be great features to have on a mobile platform, but they would require far too much CPU time to be practical.

But at Bully! we don’t believe in sacrificing quality in our mobile apps. In fact, we aim to improve it! Harnessing the immense power of Unity’s scripting engine, Bully! is working on a solution called Maestro! It’s a mobile-optimized plugin offering a complete suite of audio features, and it’s being battle-tested in a couple of recent and upcoming releases.

Maestro!’s goal is to make it easier for developers to drop the music in and have greater control over the sound with features including:

  • Interactive scoring: The software can change the audio only where it musically makes sense, so it feels locked to the action like a movie soundtrack.
  • A robust event system: Developers can synchronize game features with background music, lock animations to a beat, or create rhythm-based gameplay.
  • Familiar audio middleware functionality: Maestro! can easily “duck” the background music in a variety of ways, elegantly softening the score to allow sound effects to shine through.
  • Smart looping: Audio designers can also specify a “tail” segment of their sound files, allowing an audio clip to loop in the middle while still playing the natural echoes after the loop point.
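The smart-looping idea can be sketched as follows (a simplified sample-index model with hypothetical names, not Maestro!’s actual implementation): playback wraps inside the loop region while the sound is held, then plays on through the natural tail when released.

```python
# Sketch of "smart looping": a clip has a loop region and a trailing
# "tail" (natural echo). While looping, playback wraps inside the loop
# region; on release it continues through the tail instead of cutting off.

def render_playback(clip, loop_start, loop_end, release_at):
    """Return the samples heard when the loop is released after
    `release_at` output samples. All positions are sample indices."""
    out = []
    pos = 0
    while len(out) < release_at:
        out.append(clip[pos])
        pos += 1
        if pos == loop_end:   # wrap while still looping
            pos = loop_start
    out.extend(clip[pos:])    # released: play through to the tail's end
    return out
```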

Keep an eye out for updates on Maestro! in the coming months. And if you’re not a mobile developer, don’t fret, it will work on desktops too!