We're a small team from Ayzenberg Group and we started this blog series to document our moments of sudden realization about where we could go, what we might do, and how this new frontier of VR/MR/AR is really shaped. We're calling them "Holographic Moments." We all have different responsibilities and insights, so you'll hear from each of us in this blog series whenever we are moved to write.
Holo-Moments - “Transitioning from Unity Development to HoloLens Development”
My name is Jeff Johar, and I come from an art background. I've been working in the animation and cinematics industry for over a decade. During my first job as a 3D modeler, my supervisor asked if I could rig a prairie dog's face, a job for a technical artist, not a 3D modeler. Within two weeks they said, "Yep, you're the guy." I've been a technical artist ever since.
Your typical technical artist knows how to program in order to automate certain pipeline tasks, so when I started, I focused on building automation tools to help other people in the pipeline. These gave my co-workers shortcuts that let them do their work in a fraction of the time it would take manually.
Not until I was put on jobs that required game asset preparation did I start playing around in Unity (a game engine used to make 2D and 3D games). I love seeing someone press a button and have something happen (a character moves, jumps, lights change…) as a result of code I've written. To me, this is a far more gratifying experience than watching someone view an animated film, which is a passive interaction.
When we heard that Unity was the tool Microsoft had picked for the HoloLens, I was excited. Whenever I delve into new technology, there is always this fear of the unknown, a fear of the unexpected, a fear that I'm too old for this. For me, that fear was alleviated by familiarity with the tool I would use to create and develop for the HoloLens: my old friend Unity.
Now, the more I develop and do R&D for the HoloLens, the more I realize that the same tricks from any typical Unity project work just as well on this device!
Here is one of the tricks I want to share:
Layers: Those who are new to 3D programs may struggle with the idea of object layering. An example of this is X-ray vision in video games, like Detective Mode in Batman: Arkham Origins. There are two types of object layering: 2D and 3D. 3D layering works just like our world: when the sun is eclipsed by the moon, you can no longer see the sun. The same goes for a 3D scene.
2D layering is much like stacking a deck of cards: when you place the ace of spades on top of the ace of diamonds, all you see is the ace of spades.
But when we use 2D layering in a 3D world, we can cheat the order of appearance.
Using our previous example of 3D layering: even though the sun is farther away from the viewer than the moon and is obstructed by it, we can specify that the sun's 2D layer is in front of the moon's. The result is that you can see the sun through the moon.
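To make the "deck of cards" idea concrete, here is a minimal Unity sketch using 2D sprites as a stand-in (the object names are hypothetical): a SpriteRenderer's sorting order decides draw order regardless of which sprite is closer to the camera, so the farther "sun" can still be drawn on top of the "moon."

```csharp
using UnityEngine;

// Minimal sketch, not production code: two overlapping sprites on the same
// sorting layer. Even though the "sun" sits farther from the camera than the
// "moon", a higher sortingOrder makes it draw later, so it appears on top.
public class SunThroughMoon : MonoBehaviour
{
    [SerializeField] private SpriteRenderer sun;   // hypothetical references,
    [SerializeField] private SpriteRenderer moon;  // assigned in the Inspector

    private void Start()
    {
        moon.sortingOrder = 0;
        sun.sortingOrder = 1; // higher value = drawn later = rendered on top
    }
}
```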
When dealing with the HoloLens, you can use this trick to your advantage, and you should! You can utilize layering not only to reveal… but also to hide.
We created a HoloLens app called Shroomy to show students at the USC HoloJam how to build and deploy to the HoloLens. Shroomy is an anthropomorphic mushroom creature that plays fetch with you in the HoloLens.
The following pictures were taken from an MRC (Mixed Reality Capture) session of Shroomy. Usually in the HoloLens you use the color black for occluding, because on the device black does not show up. However, in MRC and the Unity editor this is not the case, which gives us an excellent use case for layering.
[Before image: the opaque black box blocks the hole]
[After image: the hole shows, its contents hidden]
In the before picture, the black box blocks (say that 10 times) the hole simply because it is opaque. This is unsightly in MRC, so we change the black material to a non-renderable (transparent) one. This results in displaying not only the hole, but also the contents of the hole, and we don't want to see the contents of the hole!
So, for the objects behind the box (the contents of the hole) to no longer be seen, we specify through a script (a body of code) that they sit on a layer behind the box and can therefore be hidden from view, as shown in the after picture.
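Here is one way such a script might look. This is a minimal sketch rather than the actual Shroomy code, and it assumes a layer named "HoleContents" has been added under the project's Tags and Layers settings:

```csharp
using UnityEngine;

// Minimal sketch (not the actual Shroomy script): put the contents of the
// hole on their own layer, then remove that layer from the camera's culling
// mask so those objects are never rendered.
public class HideHoleContents : MonoBehaviour
{
    [SerializeField] private GameObject[] holeContents; // objects inside the hole

    private void Start()
    {
        // Assumes a "HoleContents" layer exists; NameToLayer returns -1 if not.
        int hiddenLayer = LayerMask.NameToLayer("HoleContents");

        foreach (GameObject go in holeContents)
        {
            go.layer = hiddenLayer;
        }

        // Clear that layer's bit in the culling mask so the camera skips it.
        Camera.main.cullingMask &= ~(1 << hiddenLayer);
    }
}
```

On the device itself the black box already reads as invisible, so this kind of layer hiding mostly earns its keep in MRC and the Unity editor.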
Though this is only one example, in general I am thoroughly impressed and excited by how well old Unity tricks carry over to the new hardware.
Holo-Moments - “ARP and HCAP”
Hi, I’m Meredith Zohar, a Product Manager at Ayzenberg focused on experiential products including Mixed Reality, Augmented Reality and Virtual Reality. With my background in digital and integrated production, I’ve dreamed of working at the intersection of digital, video, gaming, emerging technology, and physical production for my whole working life. This is finally it!
While we are working on all versions of 'reality,' we are particularly excited about the idea of Mixed Reality with its broad spectrum of potential uses and technical capabilities. To that end, we're developing products for the Microsoft® HoloLens, the gold standard in MR. But how do you DO that? What kinds of applications make sense in MR? How do you ideate, prototype, design, build, test, optimize, and deploy a Mixed Reality experience? In some ways, we simply leverage existing production models. In others, this is an entirely new space with novel problems to solve as a community (and can I just say, I LOVE that!).
As part of rolling out the HoloLens device, Microsoft has put together an Agency Readiness Program to help jumpstart the capabilities of companies like ours that dive into new technologies head first. We've always been known as a future-forward company with incredible strength in both creative and technology skills, so it's no surprise that Ayzenberg is one of 16 lucky agencies to participate in the program.
The Agency Readiness Program consists of a handful of multi-day training sessions at Microsoft with guided hands-on development exercises, informational panels led by subject matter experts, app demos, and presentations digging into how they were made. Our team is about 80 percent done with the program and we've learned a lot. The first session introduced the team to the HoloLens, its core capabilities and benefits, and the idea of envisioning for HoloLens. The second dove into developing for the HoloLens and its unique capabilities, including spatial mapping, spatial audio, gesture, gaze, and voice. The third session focused on optimization and troubleshooting. The most recent session focused particularly on holographic capture.
The first question we discussed to kick off the session was when to build from scratch vs. capture. While we explored a number of scenarios around environments, objects, and humans, the resounding answer was 'if authenticity is critical, then capture.' To offer a few examples, this is the difference between capturing a generic office vs. the Oval Office in the White House; a generic coffee cup vs. the mug I hand painted for my mom when I was eight; a few people to fill out a bustling crowd vs. a historical figure like Buzz Aldrin (in NASA's Destination: Mars experience, featured at the Kennedy Space Center) or a famous actor with a lot of character like George Takei (featured in Microsoft's Actiongram app).
When authenticity is required, there are a handful of capture techniques available: 2D (controlled frame), 360 (spherical, uncontrolled frame), motion capture (using tracking markers to create a skeleton rig to which a 3D character model can be applied), scanning/photogrammetry (capturing and merging photographic data with geometry), and volumetric capture (a combo of mo-cap and scanning to capture motion and image data), which includes Microsoft’s Holographic Capture (HCAP) system. In essence, HCAP merges synchronized input from over a hundred RGB and infrared (IR) cameras, strategically placed around a calibrated 360 green screen stage, to accurately capture both image and 3D geometry data. That data is then processed, starting as a dense 3D point cloud depth map, which is converted into a high-poly mesh, then decimated to the appropriate level of quality for the intended usage (with perceptually important details such as faces and hands left at a higher quality level), and finally compressed and exported as a single streamable 3D asset. And that can take 24 hours for just a short clip – wow!
While all of that is super cool and helps immeasurably to determine the right capture method(s) for any given scenario, some of the most interesting takeaways for me were the best practices the Microsoft team has developed. One of the most memorable suggestions was to think about production from a theater perspective; 3D capture is most akin to theater in the round, where there is no central camera and a director can’t control the viewers’ PoV, so production needs to be considered from all angles. I’ve heard this challenge mentioned in various 360 capture-related circles, and this was the first time I’d heard the theater comparison. Also, interestingly, shots should start and finish in a neutral position to allow the most flexibility for integration into the final product. However, looping is extremely challenging because of the difficulty involved in starting and stopping in the exact same position. This is an important point to consider when sorting out a concept and scripting the experience. Finally, it is important to understand the goals of the shoot and what’s possible with the technology going into it, then plan and rehearse to be able to make the most of the shoot time. That allows you to be flexible because sometimes the most amazing shots are spontaneous!
Holo-Moments - “Learnings”
Hi! I'm Izzy. My title at Ayzenberg is "Broadcast Administrator", but within the Holonauts, I think of myself as a production coordinator. I help support the 'Nauts by pushing projects forward with schedules and meetings, by being a guinea pig for new developments, and in any other way I can be of assistance. I graduated with a BFA in Acting from USC, but during my senior year I became involved with storytelling in VR. Since then I have been immersed in immersive media in some way or another.
Already I've learned a lot in my month with the Holonauts, not simply about the device itself, but about how to share it, how to talk about it, how to create in it, and how to embrace that we are exploring as pioneers into uncharted territory. For those of you who are just like me (a month ago), here are some things about the HoloLens I've learned:
1. It is not VR, it's MR. Virtual Reality replaces the real world with a virtual one; the HoloLens adds holograms to the existing world. MIXED REALITY, yeah, pretty cool.
2. Heads up, hands free. Our focus with the HoloLens is to use it as a tool. Though the opportunities for entertainment in the 'Lens are great, we want to reach those who would benefit from having a holographic computing system with them anywhere and everywhere.
3. Everyone is a beginner. There are very few HoloLens experts; we are all learning how to create for it, together. When we went to USC for the Jam, I watched Philippe Lewiki (CEO of AfterNow) become so captivated by a student's question that he went off to a little corner of the packed room and started playing in the 'Lens, trying to find the solution right then and there. For me, that space of free exploration and even playing field is tremendously exciting.
Introducing Holographic Moments
Ever since the lightbulb was new, people have been calling the moment when a new idea suddenly comes to mind a "lightbulb moment." But lightbulbs aren't so new anymore.
Our experience at Ayzenberg as early developers of content for the Microsoft HoloLens is that our insights often begin, not with a light bulb, but with the sudden introduction of an actual hologram mixed into the real world. Only when we’re flooded with the sensation of seeing it, experiencing it and learning from it, are we able to make real progress toward the promise of mixed reality.
We started this blog series to document these moments of sudden realization about where we could go, what we might do, and how this new frontier is really shaped. We're calling them "Holographic Moments."
We’re a small team and we all have different responsibilities and insights, so you’ll hear from each of us in this blog series whenever we are moved to write. Each post will contain a different perspective about what we do.
My name is Matt Bretz and I’m one of three directors working in the Microsoft HoloLens mixed reality research lab. My background is in writing and directing for the theatre, film, video and advertising. My partners in leadership are a Silicon Valley software engineer and a CG director from Columbus, Ohio. With a small team of game developers, designers and digital producers, we are winding our way along the frontier of storytelling in a world that unites our ever-evolving digital inventions with the immutable physical reality in which we find ourselves.
Recently, we decided to partner with the Interactive Media and Games Division at USC's School of Cinematic Arts for a 24-hour HoloLens hackathon or "HoloJam." With HoloLens dev kits running a cool $3,000 a pop, one of the biggest barriers we face in creating a sustainable developer community is a lack of access to the devices. At a HoloJam, we share our devices and participants are able to share their ideas. By doing so, we learn as much as, or even more than, what we are there to teach.
This HoloJam's theme, inspired by some of the passions of Alex Kipman, co-creator of the HoloLens, was "Through the Looking Glass: Super Powers, Strange Places and Shared Hallucinations." Participants created a narrow vertical slice of an application that benefits from being untethered from a computer or other device, using the real world around us and pouring digital assets over this world to transform it into something entirely new.
We know how to write code to connect us. We know how to design assets we can all share. What we don’t know yet is how to think collaboratively about this brave new world. This is what will be required of us as we head into this new frontier of storytelling, inspired by holographic moments we create together. Let’s develop.