
Behind the Scenes

March 21, 2017

Mixed Reality Goes Old Skool


Retro is all the rage. Minecraft, Terraria, and the Nintendo 64 sold out in minutes. So how do you bring a retro feel to cutting-edge technology? That was the task we faced when our client Mixed River hired us to help with Synergy Technical’s Game Changer HoloLens game. We knew not everyone is a retro gamer, so the characters had to be engaging enough to appeal to today’s players while still reminiscent of the aliens we loved in Space Invaders.

[Image: 2_MagicaVoxel_LineUp]
Awesome, right?

In the beginning, we thought we’d keep a bit of the classic process we use for development: sketch the character, detail it out, and then run through production. We soon realized we needed a way to make a big character in an itty-bitty playing space.

Enter MagicaVoxel. Instead of the normal pipeline, our concept artists built the little guys in this lightweight voxel editor. MagicaVoxel helped us achieve two goals: excellent visuals and reduced production time.

[Image: Synergy_InProcess]

Kinda makes you feel bad shooting them, doesn’t it?

Our designers also used MagicaVoxel for our asteroids.

[Image: 4_Asteroids_b]

These little gems get blown apart and increase game complexity. As you progress through the game, more of them show up and blow up (if you’re good, that is). Again, MagicaVoxel reduced production time while improving visual quality.

Sometimes, it really pays to think outside of the voxel.

February 1, 2017

Key Art Production for Planet3


As part of the game modules we created for DC-based Planet3, we were asked to create a series of print-res promotional posters to capture the drama of each game.

We identified key moments and, using in-game Unity 3D assets, composed each scene and positioned player avatars for the strongest composition.

 

[Image: KeyArt]

Basic lighting and cubemaps were dropped in to rough in tone and overall mood.

[Image: KeyArt_2]

Using a custom script, we rendered each layer of the scene at very high resolution for compositing.

[Image: KeyArt_5]
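The production script itself isn’t shown in the post, but the core idea fits in a few lines of Unity C#. Everything below – class name, layer names, resolution – is our illustrative placeholder, not the actual code:

```csharp
using UnityEngine;

// Minimal sketch, not the production script: render each named layer through
// the scene camera into an oversized RenderTexture and write it out as a PNG
// for compositing.
public class LayerRenderer : MonoBehaviour
{
    public Camera renderCamera;
    public string[] layersToRender = { "Background", "Environment", "Avatars" };
    public int width = 8192, height = 4608; // print-res target; tile if the GPU objects

    [ContextMenu("Render Layers")]
    void RenderLayers()
    {
        var rt = new RenderTexture(width, height, 24);
        var tex = new Texture2D(width, height, TextureFormat.RGBA32, false);
        renderCamera.targetTexture = rt;

        foreach (var layer in layersToRender)
        {
            // Restrict the camera to one layer so each pass lands on its own
            // plate for compositing.
            renderCamera.cullingMask = 1 << LayerMask.NameToLayer(layer);
            renderCamera.Render();

            RenderTexture.active = rt;
            tex.ReadPixels(new Rect(0, 0, width, height), 0, 0);
            System.IO.File.WriteAllBytes(layer + ".png", tex.EncodeToPNG());
        }

        renderCamera.targetTexture = null;
        RenderTexture.active = null;
    }
}
```

Rendering each layer separately keeps every element on its own plate, so tone and mood can be pushed much further in the overpaint.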

 

Elements were enhanced and reworked for a final overpaint and poster delivery…

[Image: KeyArt_6]

…and the process repeated for each additional module.

Climate Module

[Image: Climate-final-key-art]

Volcano Module

[Image: PLANET3_Volcano_key-art]

Plants Module

[Image: PLANET3_Plants_key-art]

 

May 16, 2016

Lighting for Mobile VR


Multi-lightmap walkthrough for Kellogg’s Marvel’s Civil War VR

The goal of Kellogg’s Marvel’s Civil War VR was to drop users into the action of the feature, allow them to choose Captain America or Iron Man and then fight their way through 8 levels of Crossbones and his Henchmen. The levels themselves were created from film references – with a generous helping of our own creative interpretation – and resulted in 3 main maps: Lagos, Berlin and the Leipzig Airport.

We set the experience in the early evening partly to reduce eye strain, but largely to allow for much more dramatic lighting. Because our target was mobile VR, however, we were severely limited in what we could do with real-time lights and shadows. To ensure the experience ran across as many devices as possible, we would need to bake the lighting into the scenes using light maps.

Traditionally, light maps meant static lighting. Our goal, however, was to build an experience that was much more dynamic. This led us to the multi-lightmap technique: a way to get a lot more life into a VR environment without breaking the lighting budget.

Here’s a breakdown of the Multi-lightmap process:

First, the environments were divided into sectors. Each sector was given one main light and a couple of accent lights. Generally, the main light is a primary light source, such as sunlight, that is consistent across all sectors. The accent lights are secondary sources such as lamp posts or traffic signals, and have their own colour and intensity animated as needed. These lights should not affect any objects outside their sector, or unexpected results may occur.

Once we had our map subdivided into sectors, we began lighting. This is where we took a different approach: instead of baking light colour and intensity, we used the light map to bake only light position and shadows. This was accomplished by using the RGB channels of the light map to store the lighting information for each light.

These three channels are the reason we are limited to one main light and two accent lights per sector: red could hold the main light while green and blue hold the two accents, for example. This gave us something that looked like this:

[Image: sputlight_probes_01edit]

In this case, the red channel was used for consistent point lighting throughout the Lagos market, rather than the directional light that might be used in a daylight setting. An ambient occlusion pass, saved as an alpha channel, was also included in the light map. With a custom shader, we assigned colour and intensity to each of the channels and ended up with something like this:

[Image: sputlight_final_01edit]
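In spirit, the combine the shader performs looks something like this – a C# transcription of the math for clarity, with placeholder names; the real version lives in the custom shader:

```csharp
using UnityEngine;

// C# transcription of the multi-lightmap combine, for illustration only;
// in the project this math runs in the custom shader.
static class MultiLightmap
{
    // bakedTexel: R/G/B = each light's baked contribution (position + shadows),
    // A = the baked ambient occlusion pass. Colour *and* intensity for each
    // light are folded into the three runtime Color parameters.
    public static Color Combine(Color bakedTexel,
                                Color mainLight, Color accent1, Color accent2)
    {
        Color lit = bakedTexel.r * mainLight
                  + bakedTexel.g * accent1
                  + bakedTexel.b * accent2;

        return lit * bakedTexel.a; // darken by the baked AO
    }
}
```

Because the three light colours are ordinary material properties rather than baked data, relighting the scene is just a matter of assigning three new values – which is exactly what the variations below demonstrate.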

One of the benefits of controlling lighting intensity and colour through a shader is that it becomes easy to adjust the lighting in real time. The image below uses the same light maps with different colours and intensities assigned in the materials.

[Image: sputlight_alt01_01_1]

Since the ambient occlusion pass is also available through the shader, more drastic lighting changes are easy to achieve, such as changing the night scene to an overcast day:

[Image: sputlight_alt02_01]

Again, this is achieved by adjusting material settings using the same initial RGB light map. With all these lighting options tied to materials, it can be difficult to keep the colour settings in sync across the materials in each sector – and even harder to animate them all. To solve this, we came up with a Light Manager script.

This script is added to a null object and pulls in all material attributes from objects within its volume, making them editable in one place. An animation controller is then added, allowing the lights to be animated with a single animation clip per sector. This, in turn, lets every object affected by the same light share the same settings even when different materials are in use – making it a very powerful tool.
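A simplified sketch of the concept – every name here is ours, not the production script’s:

```csharp
using UnityEngine;

// Simplified sketch of the Light Manager concept. Placed on a null object,
// it drives one set of animatable light colours for every renderer inside
// its volume.
[ExecuteInEditMode]
public class LightManager : MonoBehaviour
{
    // One colour per lightmap channel; intensity is folded into the colour.
    // Keying these fields from an animation clip animates the whole sector.
    public Color mainLight = Color.white; // red channel
    public Color accent1 = Color.black;   // green channel
    public Color accent2 = Color.black;   // blue channel
    public Vector3 volumeSize = new Vector3(50f, 20f, 50f);

    void LateUpdate()
    {
        var volume = new Bounds(transform.position, volumeSize);

        // Searching every frame is wasteful; a real version would cache
        // the renderer list for its sector.
        foreach (var r in Object.FindObjectsOfType<Renderer>())
        {
            if (!volume.Contains(r.transform.position)) continue;

            // A MaterialPropertyBlock lets different materials share the same
            // property names without duplicating material assets.
            var block = new MaterialPropertyBlock();
            r.GetPropertyBlock(block);
            block.SetColor("_MainLightColor", mainLight);
            block.SetColor("_Accent1Color", accent1);
            block.SetColor("_Accent2Color", accent2);
            r.SetPropertyBlock(block);
        }
    }
}
```

A single animation clip keying mainLight, accent1 and accent2 can then flicker or pulse every object in the sector at once, regardless of which material each one uses.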

The three light managers for the Lagos map can be seen in yellow below:

[Image: sputlight_manager_01]

Once we had the environment looking the way we wanted, we realized we would have skinned characters – Crossbones’ henchmen – running through these maps. And if they were not affected by the flickering lights or by passing through the pools of light that made up the levels, it would break the immersion.

The Light Manager gave us access to all of the colours and intensities of the lights, so it was a matter of coming up with a way to apply them to a mesh without a static light map. The key was light probes – though not exactly in the way they were intended. In the same way the light maps were repurposed, the light probes were baked with the same RGB light data as the static light maps. But since the original lighting was baked outside of Unity, the lighting set-up had to be mirrored in Unity to bake the probes.

[Image: sputlight_probes_01edit]

To this end, we wrote a shader similar to the one used for the light maps, but reading the RGB channels of the light probes instead. Since all of the material attributes were named the same in the Light Manager, we were able to apply the same colour and intensity to the skinned meshes as we did to the environments. This allowed the henchmen to blend into the scene seamlessly, picking up accent lights and, in the example below, flickering lights:

[Image: sputlight_skinned_01edit1]

This technique was used throughout the entire project, allowing us to create dynamic lighting effects and enhance the immersive feel of the experience without using a single real-time light.

Try it on iOS (https://itunes.apple.com/us/app/kelloggs-marvels-civil-war-vr/id1093762466) or Android (https://play.google.com/store/apps/details?id=com.kelloggs.civilwarvr).

July 2, 2015

Google Cardboard Developer Tools: Black Screen Issue


Google Cardboard has brought virtual reality to the masses with its simple, effective, and inexpensive design. In the interest of sharing our experiences and furthering the Cardboard community, we wanted to share an issue we dealt with during development of some VR experiences and how we fixed it.

We started to experience a strange bug on iOS devices when testing our apps: a black screen saying “Let’s get you set up” instead of the actual game view. Looking at the Debug.Log() messages, it was clear the game was working and running behind the black screen. We tried several work-arounds, but the “Let’s get you set up” message still blocked the view. Although we strongly believe in the power of Google, a thorough search did not produce any results on how to fix this. So we started to dig in.

After a couple of hours of banging our heads against the wall, we noticed that the screen appears after the game has started and a couple of frames have rendered. It was clearly related to the iOS SDK, and was probably a ViewController pushed on top of the game’s view controller. So we set out to find where that code was called from.

After some file grep’ing, we found that the text is stored in the libplatform.a binary, for which we had no source code. The next step was to prevent this code from executing, so we explored all of the SDK source we did have, hoping to find the call site. Luckily, the solution came easily from there: CardboardAppController.mm contains a handful of functions, one of which is “void launchOnboardingDialog(){…}” – simply commenting out its contents solved the black-screen problem!

Hopefully this helps some of you out there who may be dealing with the same issue. This was a quick fix, but we are working on a solution that will stop this problem from happening in the first place. Stay tuned…

November 5, 2013

Toyota – Millennial’s AR Spot


Toyota spot live action shoot in downtown B-more

Bully! recently wrapped the live action component of a broadcast Toyota spot codenamed Millennial’s AR. Shot in beautiful downtown Baltimore, the spot features near-future glasses tech that allows a sidewalk café patron to scope out the new 2014 Corolla. From his POV, we see his Face-bubble Friends’ reactions (Bully!’s answer to Facebook), vehicle features popping out, and a prompt to find the nearest Toyota dealer – all in (simulated) augmented reality.

The project (with 186 Advertising – holla!) was an exploration in wearable-tech interface design and gesture recognition – and let us get back to our roots and play in the broadcast space again.

Establishing and POV shots of the Corolla from the vantage point of our coffee-drinking actor were expertly choreographed (thanks to HackStone Film Group), and a variety of swipes and gestures were shot on green screen for later compositing.

The spot is slated for release at the end of 2013. The actual glasses with our interface, hopefully sometime soon.

December 30, 2012

Natural User Interface in Mobile AR: SOUR PATCH KIDS – In Sour Vision


Bully! recently launched a new Augmented Reality game for Mondelēz International promoting the “SOUR THEN SWEET” SOUR PATCH KIDS candy and their Xbox game WORLD GONE SOUR.

The game, found on the App Store (here), uses AR image markers (found here and here) to spawn a Bully!-created set featuring Dolly Doll, a main boss character from the Xbox game. Dolly, controlled by the mischievous Yellow SOUR PATCH KID character, throws anything at hand – toy blocks, doll heads, firetrucks – at the user. The user must dodge the projectiles and, with the help of the Green SOUR PATCH KID character, throw tag-alongs (small SOUR PATCH KIDS) back at Dolly to win.

Beyond being among the first AR advergames, a key differentiator of the game is that the user must actually duck to avoid being hit by virtual projectiles – not just tap on-screen buttons, but physically move his or her body to avoid Dolly’s barrage.
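Mechanically, this works because the AR tracking moves the virtual camera to match the physical device, so detecting a successful duck reduces to a simple proximity test. A hypothetical sketch (our illustration, not the shipped code):

```csharp
using UnityEngine;

// Hypothetical sketch of the duck-to-dodge check. In marker-based AR the
// tracker moves the virtual camera to match the physical device, so "did
// the player duck?" is an ordinary proximity test between the camera and
// each incoming projectile.
public class DodgeCheck : MonoBehaviour
{
    public Camera arCamera;        // pose driven by the AR image tracking
    public float hitRadius = 0.2f; // rough head size, in scene units

    void Update()
    {
        foreach (var projectile in GameObject.FindGameObjectsWithTag("Projectile"))
        {
            if (Vector3.Distance(arCamera.transform.position,
                                 projectile.transform.position) < hitRadius)
            {
                // The player failed to move out of the way in real space.
                Debug.Log("Hit by Dolly's barrage!");
            }
        }
    }
}
```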

This technique – using an action that comes ‘naturally’ to people as an input to computer interaction – is referred to as a natural user interface (NUI). This kind of interaction is being actively explored, but is still in its infancy in mobile device experiences.

“One of the things we wanted to explore with the SOUR PATCH project was how NUI inputs could be used to add a new dimension to mobile game play – where the user has to move in real space to interact with virtual content – or, in the case of the SOUR PATCH game, duck the virtual blocks Dolly Doll throws,” said Carlson Bull, Creative Director and founder of Bully! Entertainment.

“As mobile technology progresses, we’ll be able to utilize this kind of real-world interaction to create some very interesting engagements,” said Bob Berkebile, VP of Technology and Innovation. “We can envision a new genre of experiences that seamlessly blends natural inputs and virtual content to dramatically enhance game play, learning and brand activations.”

“The SOUR PATCH game is a step toward this future,” said Bull. “When AR and gesture recognition systems make their way into eyewear, a world of possibilities will open up.”

March 12, 2012

COBRA PUMA Golf – Gyroscopic Games


How fast can you shoot flying oranges out of the sky? How fast can you do it without falling down the stairs? Download the COBRA Orange Out game from the App Store and you can answer both questions. Using the iPhone’s gyroscopic capabilities, Bully! created the game to let golfers stuck inside for whatever reason get their virtual hands on COBRA’s line of AMP clubs and obliterate flying oranges to their heart’s content.
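The post doesn’t include code, but the heart of a gyro-aiming loop like this is only a few lines in Unity – a sketch assuming a Unity-style setup, with our own naming:

```csharp
using UnityEngine;

// Minimal gyro-aiming sketch (our illustration; we're assuming a Unity-style
// setup). The device's physical orientation drives the in-game camera, so
// the player aims by moving the phone.
public class GyroAim : MonoBehaviour
{
    void Start()
    {
        Input.gyro.enabled = true; // the gyroscope is off by default
    }

    void Update()
    {
        // The gyro reports a right-handed rotation; flip z/w to convert it
        // to Unity's left-handed space, then tilt the rig so "up" matches.
        Quaternion q = Input.gyro.attitude;
        transform.localRotation =
            Quaternion.Euler(90f, 0f, 0f) * new Quaternion(q.x, q.y, -q.z, -q.w);
    }
}
```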

Check it out if you are itching for some time on the back nine, but life seems to be getting in the way. A word of warning, though: do watch out for stairwells.

Get a preview here >

Get the iOS app here >