Sep 08, 2014

Jiri Wehle as depicted in Timefire by TimefireVR in Scottsdale, Arizona

The long quiet here at PSOIH is due to the fact that things changed along the way as we were building for VR. Virtual reality is a huge undertaking and, it turns out, of some interest to others. As the Game Developers Conference (GDC) concluded and April played out, not only was Sony getting into the game with Morpheus, but Facebook grabbed Oculus and gave them enough money to start doing very serious work. Simultaneously I started a conversation with an old friend, Jeffrey Rassas, who took an immediate liking to what Joe and I were doing and also saw what we might accomplish with greater resources. In just a couple of weeks we were on our way to hiring others and inviting them to join us in our new office.

One thing I learned between January (Steam Dev Days) and March (GDC) was that nearly everyone was having problems pronouncing PSOIH, and so we changed the name to TIMEFIRE. Our new domain is over at – someone I cannot seem to contact already owns the domain of plain old timefire dot com.

As TimefireVR, or simply Timefire, we migrated away from UDK and fully embraced UE4 (Unreal Engine 4 from Epic), and Joe now has two others helping him explore the possibilities UE4 has to offer. One of those new people here at Timefire is Ariana Alexander, who created this characterization of Czech musician Jiri Wehle using MakeHuman, Blender, and UE4. We’ve also brought on Brinn Aaron, who is working with Blender, UE4, FMOD, Bitwig, Circle Synth, GlitchMachines tools, and of course Allegorithmic’s Substance Designer. Luis Chavez joined us from a video background but has quickly adapted to Blender and UE4, and has shown great strength in mastering SVERCHOK – a Blender addon used primarily as a parametric architectural tool that also offers many options for creating art. Also on board are Rainy Heath and Dani L’Heureux, who are working their way through Blender, Substance Painter and Substance Designer, some Illustrator, and MakeHuman.

Check out what we’re creating over at, join us on Facebook, or send us a Tweet about your own experiments in VR.

Mar 18, 2014

Sixense STEM prototype being shown at GDC (Game Developers Conference) in San Francisco, CA

Being in the right place at the right time – score! We met Danny Woodall and Amir Rubin from Sixense at GDC this afternoon. These are the guys behind the STEM wireless motion tracking system that was successfully funded this past October on Kickstarter. Today, though, they are at GDC – the Game Developers Conference being held in San Francisco, California. Lucky us – we were shown a new prototype that NO ONE else has seen yet. As a matter of fact, it was just assembled from parts printed by the guys over at Shapeways. The new unit is incredibly light; there should be little if any fatigue holding these controllers for extended periods of time. The older molded plastic unit is on the left for comparison.

Sixense STEM positional electronics in their raw form attached to the Oculus Rift DK1 at GDC - Game Developers Conference in San Francisco, CA

When the GDC Expo opens to the public tomorrow, these will be the electronic components (seen in their raw state prior to further engineering miniaturization magic) that will be attached to the Oculus Rift Development Kit 1 to facilitate head tracking… hmmm, if only Oculus were showing new units with head tracking, wouldn’t that be sweet?

Mar 09, 2014

Unreal Engine 4 girl made in Blender painted with Allegorithmic's Substance Painter Beta by Joe Cunningham

Soon Joe and I will leave for our first Game Developers Conference, better known simply as GDC. It’s held annually in San Francisco, California, and we’ll be road-tripping our way up on a 1,532-mile (2,466 km) adventure. I’m writing this blog entry as a two-part story to get the planning details, expectations, and sense of excitement leading into this out of the way. That way my next post can focus on the event itself.

GDC is the world’s largest and longest-running professionals-only game industry event; their description, not mine. How do they know it’s pros only? Probably because of the expense to attend – this isn’t cheap. Exposition floor tickets alone are $195 each, while a full pass will cost you between $1,475 and $2,100 depending on when you register. Add transportation, food, and hotels, and a small developer will quickly approach $1,000 in costs even for the minimal attendance option – per person!

Last summer I started considering a trip to GDC. Earlier in the year it had been impossible not to see the announcements coming from the conference regarding Oculus and the “Infiltrator” demo showcasing Epic’s Unreal Engine 4 (UE4). So I went over to the GDC site, but prices and options to purchase were not posted yet. Then in September or October the information went live, and for a brief moment I had the opportunity to buy admission to the Independent Games Summit (IGS) for only $695. I was also reading on the internet that first-time attendees shouldn’t worry about the full conference and summits – that just visiting was enough, even overwhelming. While Joe and I progressed in our own VR project, I finally returned to the registration page. The IGS passes were sold out; that settled which part of the event we’d be attending. The lesson here is to plan well in advance – better than six months out if you are thinking of going. Read everything you can about other people’s experiences and start saving money.

So what’s the attraction for a couple of guys just getting started in the game industry? This is our Super Bowl, our World Cup; it is the Olympics and the Oscars all rolled into one giant geek fest celebrating the developers who are changing how we see and interact with our world. Gaming is not just some teen sitting in isolation killing zombies; it is an evolving phenomenon that alters humanity’s relationship to education, entertainment, social interaction, medicine, work, war, and soon how we travel. The companies presenting workshops or holding booth space often save their major product announcements for GDC because they know the world is listening during this event. In large part, those announcements and demonstrations played a role in our deciding to attend.

First and foremost among those hoped-for announcements will be from Epic – or maybe Oculus. When the doors open I suppose we’ll have to flip a coin to decide whose booth we’ll bolt to first. Both companies are likely to make announcements so large that, in retrospect, this week may be one of those defining moments in history we as a society look back on to recognize when we first learned the world was changing in such dramatic fashion.

For about 10 years Epic has been working on their next-generation game authoring engine. There is no guarantee that it is ready to make a public appearance, but the signs that Epic may be about to unleash this super-charged tool are many. Over the past seven months Epic has rolled out five videos that let us see the software and put its user interface on display. Their booth space is huge in comparison to the previous year’s. At CES (Consumer Electronics Show) in Vegas they allowed Nvidia to run a UE4 demo and had prepared a special presentation for the guys at Oculus to showcase their Crystal Cove prototype, which Valve was also enthusiastically supporting. With Sony and Avegant breathing down Oculus’s neck in the race to reach the consumer market with VR, it would appear to be in Epic’s own interest, as an early supporter of the technology, to help push out what could be the premiere VR authoring environment.

Then we have Oculus themselves, who have an amazing amount of floor space reserved at GDC – three separate booths, as a matter of fact, on two floors. Previous statements said they would be ready with a consumer model once they figured out positional tracking – and that’s exactly what they were showing in early January at CES. Another 60 days have passed, and I’m certain they have not slowed down in refining how their technology works. If they were to announce Dev Kit 2 (DK2) at GDC, my guess is they would give themselves enough lead time to prepare the units, so they could tempt us with an announcement that new units for developers will ship within a relatively short time after GDC. They are certainly not going to GDC with 3,000 square feet of space simply to show us what we have all already seen. I expect something HUGE!

Next up in importance is yet another encounter with Allegorithmic. Just this past week they released a public beta of Substance Painter on the Steam Early Access program operated by Valve. Last month they sent out invites to those who might be interested in a Substance User Group meeting to be held during GDC; we wasted no time RSVPing, which in turn is forcing us to leave a day early. We are looking forward to learning about their plans for Painter and what they have in mind for improving Substance Designer.

Marmoset will be present, but Toolbag 2.0 was recently released, so I think they’ll be there mostly to meet with their user base, gather feedback, and let some lucky few know what they have in mind for future versions. Speaking of software on the horizon, Quixel has a booth, and there is no way they are there only to show the current version of nDo2, which is having problems operating with Photoshop CC for a number of users. I’m pretty certain we’ll see new versions of nDo and dDo, and their new Megascans service.

These are just our major areas of interest at GDC. There are also Simplygon, Nvidia, Valve, SpeedTree, Perforce, the Belgian Trade Commission, and dozens of other vendors we have yet to discover – all of which have us thrilled at the prospect of being on hand for this amazing conference.

Then there are the parties.

Tuesday after our user group meeting we’ll head down the street to the NativeX party. While neither of us is a Corona Labs user, we are interested in learning all we can about mobile gaming, and with NativeX cosponsoring the party, we might learn something more about mobile ads. Hey, we’re new to GDC and can’t turn away from any opportunity to learn – and be entertained.

Come Wednesday night we have two parties to attend. First up, we’ll head to the Novela Bar for a party sponsored by Kontagent+PlayHaven and VentureBeat. I’m starting to see a theme here: after we’ve hung out all day with the developers of the tools we work with, the guys who handle the all-important “after-the-game-has-been-released” job are there to talk to us. From there we will walk over to AT&T Park – home of the San Francisco Giants – for a giant party hosted by YetiZen. A couple of live acts will be performing, tournament video gaming will run all night, and people on the VIP list will be able to test drive a new Ferrari.

Thursday following GDC, an event is taking place at swissnex San Francisco titled “Spatial Storytelling: Augmented and Virtual Realities,” where we are promised an evening of immersive games and installations incorporating virtual and augmented reality. Some of the participants are Disney, the Stanford Virtual Human Interaction Lab, OuterBody Labs, apelab, and BandFuse. Sadly, the festivities are planned to stop at 10:00 p.m.

Finally, on Friday, the last day of the conference, there is a party hosted by 8bitSF and Pow Pow Pow called Band Saga LIVE. It will be over at the DNA Lounge with a bunch of bands, including Metroid Metal and An0va, performing. We’re still waiting to see if Kiip responds to our RSVP to attend their party on Wednesday night. This will be tight, as I’m sure a “things wrapped in bacon” party is going to be a popular one. And it seems the most difficult party to get into is Notch’s .party( ). Last year the inventor of Minecraft had Skrillex play his party; this year Nero, Kill The Noise, and Feed Me are scheduled to perform. The event seems to have filled minutes after it was made available. There’s always next year.

Image: Woman’s form from MakeHuman. Other 3D elements created in Blender 2.70. Model and sign painted using Allegorithmic’s Substance Painter Beta 5.0. Photo from Jesse Knish Photography Creative Commons Attribution 2.0 Generic license. Rendered in Unreal Engine 4.

Mar 01, 2014

A 3D interpretation created in Blender of a flower by Luigi Serafini from his book Codex Seraphinianus

Creating content for virtual reality when your aspirations are set high can be a daunting task. To move beyond the low-poly, small-bitmap-textured environments that have typified video game art for the past couple of decades, a small indie team must focus on mastering a wide number of tools. The principles of this creativity are the same as they are for Triple-A titles, but instead of 300 people creating the world, we are but two guys.

Since this past summer, when Joe and I agreed to start collaborating on building something for VR, a lot has changed as the game industry undergoes an amazing evolutionary convulsion. What stands out is that hardware, software, and competition from the indies are at a critical stage of altering the business model, with no sign of letting up. The convergence of GPU rendering power, high-quality mobile display tech (the backbone of the VR headset), procedural and physically based textures, the Steam network and various other platforms for content distribution, along with myriad advancements in general PC hardware – such as inexpensive RAM and fast SSDs that allow us to run Blender, Unreal, Substance Designer, Photoshop, and DayZ all simultaneously – is giving the individual an incredible amount of leverage.

With this power and opportunity we face steep learning curves every single day. At first we struggled with simply learning how to make seamless textures. Then the long, slow arc of learning UV unwrapping showed up like Godzilla to crush our idea that we would just throw these materials onto our meshes. Because we were looking to bring these skills to the Unreal Development Kit (UDK), we focused only on low-poly meshes. Neither of us had any game-creation experience, so as we understood things, we needed to be super conservative with the geometry and the size of our materials.
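For anyone wrestling with that same first hurdle, one sanity check would have saved us hours of squinting at tiled previews: a texture tiles seamlessly when the wrap-around jump between its opposite edges is no bigger than the jumps between neighboring interior pixels. This is purely an illustrative sketch of my own – the `seam_score` helper and its threshold are not part of any tool mentioned here:

```python
import numpy as np

def seam_score(tex):
    # Compare the wrap-around jump (last row/column back to the first,
    # which is what the eye sees where the texture repeats) against the
    # typical jump between neighboring interior pixels.
    # A ratio near 1 means no visible seam; a large ratio means a hard edge.
    tex = np.asarray(tex, dtype=float)
    interior = (np.mean(np.abs(np.diff(tex, axis=0))) +
                np.mean(np.abs(np.diff(tex, axis=1))))
    wrap = (np.mean(np.abs(tex[0] - tex[-1])) +
            np.mean(np.abs(tex[:, 0] - tex[:, -1])))
    return wrap / max(interior, 1e-12)
```

A plain gradient scores terribly because its opposite edges differ by the full value range, while a pattern built from whole periods of a sine wraps smoothly.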

Then Epic started teasing more details about the much-anticipated Unreal Engine 4, and we started dreaming that it must be around the corner. Maybe we should build things with the idea that we’d migrate our work when Epic finally made the update from UDK to whatever they would end up calling the consumer version of UE4? Just as we were making this transition, I learned about Allegorithmic’s Substance Designer and its database of over 700 procedural textures, and a new jumble of highly technical stuff was thrown at us. As quickly as we were adapting to this new paradigm of working with textures and materials, Allegorithmic added Physically Based Rendering (PBR) to the mix. We’d seen hints of this in one of the UE4 demos, which only solidified the sense that this was going to be something important. Sure enough, here comes Marmoset with Toolbag 2.0 and a heavy emphasis on PBR too. Better bone up on what this will bring to the workflow.
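For the curious, the heart of a PBR specular model is surprisingly small. Here is a rough Python sketch of two of its standard ingredients – the GGX microfacet distribution and Schlick’s Fresnel approximation – just to show the flavor of the math these tools bake into their materials. The constants (like the ~0.04 head-on reflectance of dielectrics) are general PBR conventions, not anything specific to Substance or Toolbag:

```python
import math

def fresnel_schlick(cos_theta, f0):
    # Schlick's approximation: reflectance rises from f0 at head-on
    # viewing toward 1.0 at grazing angles.
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def ggx_ndf(n_dot_h, roughness):
    # GGX / Trowbridge-Reitz normal distribution function, using the
    # common alpha = roughness^2 remapping; n_dot_h is the cosine of
    # the angle between the surface normal and the half vector.
    a2 = roughness ** 4
    denom = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom * denom)

# A dielectric such as plastic reflects about 4% of light head-on,
# yet becomes mirror-like at grazing angles:
print(fresnel_schlick(1.0, 0.04))  # 0.04
print(fresnel_schlick(0.0, 0.04))  # 1.0
```

The appeal of the workflow is exactly this: artists paint physically meaningful roughness and reflectance values, and the lighting math does the rest consistently under any light.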

If that wasn’t enough, David Green of LilChips continued to offer updates to his still-alpha landscape-building software, TerreSculptor, so we had to investigate those changes too. Then an email pops into my mailbox from Allegorithmic inviting us to Los Angeles for a sneak peek at Substance Painter; we can’t resist. The first version is primitive – too primitive to work with, but great for stealing a glimpse of what those guys have in mind. Luckily we couldn’t deep-dive into Painter yet and instead held our focus on our other tasks while the software matures, but we now know that 3D painting is certainly going to be another tool in helping us achieve the kind of results that were once exclusive to the Triple-A guys.

By late December we were fully immersed in building high-poly models, learning about retopology, and discovering that auto-unwrapping UVs was not going to cut it. Displacement maps looked appealing for our modeling techniques, and from what we were seeing from Surface Mimic, the maps being used in Substance Painter performed great in Blender and in our sculpting tool of choice, 3D-Coat.

The problem was we were not making a lot of progress building out our world; all we were doing was learning, all the time. With the New Year came a date on the calendar I needed to tend to: Steam Dev Days, hosted by Valve. With a large portion of the Seattle-based conference focused on VR, I thought it might be worth the expense to learn more about this platform we are so fascinated with developing for. As I wrote in a previous article, Gabe Newell made it possible for me to get a peek at their VR headset, where I instantly knew that not only were we on the right path, but this is going to be more epic than I imagined back when I first put on the Oculus Rift Dev Kit 1.

From Palmer Luckey, Joe Ludwig, Michael Abrash, and the rest of the people talking about VR during those days in the Pacific Northwest, it was made abundantly clear that large-scale environments weren’t ready for prime time yet. I didn’t want to acknowledge this, because my hope was that Nvidia would rectify poor frame rates with their new Maxwell line of GPUs, but slowly I’ve given in to the idea that we would have to shelve part of our world for a time and find a different focus. Luckily we found a compromise that temporarily pushes our earlier efforts to the side while we work on a “corner” that allows for a tighter focus on the intimate instead of the massive.

Riding on the elation of what I’d seen in Seattle and how far Joe and I had already come, I thought it was time to write an email – to Epic. Within 24 hours of that missive, I learned that we would be in the next cycle when new devs were brought on. I don’t think I could have been any more dumbfounded than I was at that moment. In about a week we were both downloading our own personalized (for security purposes?) versions of UE4 (not its code name, but I’m not authorized to speak ‘its’ name :).

Shiny-toy-itis comes on like a fever: Joe disappears into deciphering just what it is we are wrestling with in UE4, while I continue trying to understand the disparate pieces of what we are assembling and nudge our original plans.

It has now been a couple of weeks since our heads were spun around their axes, and things are starting to normalize, if that is really possible. We are again making progress in the creative department, but weaving all of this together is a gargantuan task. In three weeks we’ll leave for San Francisco to make our first visit to GDC (the Game Developers Conference), where we will likely once again be overwhelmed by the rapid evolution occurring in this industry. None of this is a complaint; on the contrary, we are too astonished right now to see any of this work as a burden. We stand humbled by the gravity of what is coming and are excited to see what will be shared in the near future.

Feb 26, 2014

Antonin Artaud wearing the Oculus Rift Virtual Reality device

The term Virtual Reality may seem dated; after all, it has been over 70 years since poet and actor Antonin Artaud first penned the words “la réalité virtuelle.” Although his 1938 definition may seem far removed from what our technologically advanced world is about to deliver, his ideas were far from off base. He envisaged the theater as a place where the alchemical mythologies of man would become the “incandescent edge of the future.” Well, that is the verge of where we are today, with a light-emitting headset called Oculus Rift that will allow us to peer not only at the future but across time and reality in ways the mass of humanity has yet to fathom.

The transmutation of lead into gold was for the alchemists what the crafting of story and image into content is for our time. Storytelling is an ancient art, perhaps 40,000 years old as dated by cave paintings in Spain. Plays and dances have existed for thousands of years, but it wasn’t until the 15th century that the narrative became a tradeable currency. The printing press brought with it the ability to distribute information to the masses. For the next 400 years, printing would dominate our communication channels until the arrival of the telegraph, motion picture film, and the telephone. Together with television, the moving picture would show people across the globe an unknown world previously only written about in tomes or passed along in an oral tradition.

It has taken us tens of thousands of years to reach the juncture where information is ubiquitous and drives nearly all human activity. For example, the global publishing industry is now worth $108 billion, which would make it roughly the 60th-largest economy if compared to countries. The global movie industry is puny in comparison at approximately $35 billion, while video games generate about $92 billion in revenue. All of these industries are being wrapped up by the new kid on the block: the Internet. That virtual department store/library/theater currently facilitates over $3 trillion of business – the Internet is the essential utility of our future.

What this all has to do with Virtual Reality is that we are at a point in time that marks the beginning of the future of humanity, just as art, printed language, and advanced communication did in their time. It is the convergence point where we enter hyperdrive. I make this prediction as though it were as easy as identifying a cave painting where the artist drew a horse and we all know it’s a horse.

All of humankind tells stories, we all have histories, we all celebrate our past; most of us have dreams of the future. In VR all creative and consumptive lines converge. We meld together and share the written word, the image, the game, the transaction – and we do it in an environment that speaks to and puts on display the dreams that live on the “edge of the future.”

We are all about to be thrust into new roles as architects of this future. This will be a place of alchemical experimentation where mythologies come to life, not as two-hour celluloid epics, but in places where we dwell and create new myths. Except we are neither intellectually prepared nor technologically advanced enough for what we must start preparing for – now.

While knowledge is everywhere and readily accessible, how many of us revel in acquiring the abstract and intricate? Most of those I see are more interested in the trivial, mass-produced, banal culture doled out by faceless corporations concerned with shareholder wealth and executive salaries than in the evolutionary intellectual vitality of their fellow people.

Our next point of embarkation must be the vehicle of high-level exploration of the mind. The technology to show each other our dreams is soon upon us, though right now it leaves much to the imagination, as it can deliver only a fraction of the aesthetic fidelity we are fast approaching.

To return to my statement in the first paragraph about what we understand or fathom: Virtual Reality will be a magnifying glass, a kind of tunneling electron microscope that peels back the layers of the onion to expose things for what they are. We have always been visual learners, quick to pick up on what an image holds. Within VR, the image will become ever more intoxicating as the technology advances to render greater beauty and detail out of the abstraction of pixels. Humans give order to chaos: we set letters in sequence to form words, we align and contrast colors to create art, we capture fleeting images of light in movies, and then we stand back in awe and sometimes cry at what we’ve created. Here at the edge of the future we will continue our traditions of making sense of things, and while I am still uncertain what VR will ultimately look like, what I do know is that at the other end of its trajectory we will see a global society finally achieving its Magnum Opus; we are on the verge of discovering the elusive philosopher’s stone.

Original image is available from Gallica Digital Library under the digital ID /ark:/12148/btv1b8539368j. This image is in the public domain because its copyright has expired.

Feb 25, 2014

Complacency breeds stagnation; it makes us wonder what to do instead of forcing us to do.

Conflict throws us into action. We must run from or fight what confronts us; what we cannot do is stand still, for if we do, it will be at our own peril.

In the former situation we will sit passively and entertain ourselves to death while the latter forces us into a warlike stance.

When poised to go into battle, we scour our minds for solutions: what are the paths that lie before us?

This then raises the question: given the relative ease of life in the first world, where do we find this conflict?

Most waste this resource on the mental masturbation of arguing with one’s self, pulling in inconsequential nonsense to occupy a mind that requires daily gymnastics to remain plastic and ready for action. The problem herein is that the drivel many people allow in has no relevance to their own lives; it is fodder for a complacency brought on by the inane.

So where do we find the kind of internal conflict that allows our creativity to build our own intellectual Trojan Horse, the kind that helps liberate us from complacency? We manifest it in desire and disappointment. When I was a young man my conflict came from my inability to find the relationship I felt I needed. During that time I was a voracious reader – I didn’t have anyone to date, so what else was I to do? The words I read stewed in my mind, and I continued reading to find answers, but there were none; there was just more conflict. With the fire of the mind blazing like a forest fire in its quest for knowledge and companionship, I started making stuff – digital stuff.

I was a child of the emergent personal computer revolution. Just as words painted pictures in my mind, having the computer make pictures using its processing capability seemed a natural progression, and so I embarked on dabbling with digital arts in the mid-1980s. All of the reading, studying, and cultural exploration of my early 20s had produced a great curiosity; I had a million questions. It was here that I discovered, as so many others have over the course of their lives, that the mind abhors a vacuum devoid of answers, and given enough inputs it will soon start to disgorge its contents by producing something that explains things to the curious mind. Soon I was shooting video and making record cover art with the wealth of knowledge I was refining.

And here begins the problem: success in one’s endeavors attracts like-minded others and validates the knowledge that produced the fruits of our labor by imbuing us with the currency of achievement: money. Time to get happy and lazy – and to start getting older.

But I don’t want to get old, nor do I want to be back in teenage, soul-ripping conflict.

So how do we balance the yin and yang of the two hemispheres of self, and manage them so that they might remain a healthy, self-governing police force for our cognitive maturity? We break the rules – our own rules. To put it more simply, we step out of our own habits.

I did not suggest we break laws; as a matter of fact, I would like to reiterate: we break our own rules. What are those rules? Maybe they were the knowledge that we should exercise and the realization that obesity is killing us, so we break out of routine and find a path to tackling our complacency about diet and our lack of running around. Or maybe we have been taking something for granted and have our reality shaken by a dramatic change in our situation that rattles our complacency and forces us to race against stagnation to fix ourselves.

In my case I recently reignited my creative engine by dousing a healthy bed of embers with gasoline. While nothing is easily pinpointed as the sole flashpoint that sparked the conflict, what helped me was a binge into guilt. I hadn’t played games in a seriously long time – too busy traveling and writing; and being complacent. Throw caution aside, dive into far too many hours of playing games instead of maintaining my status quo, and I found the inflection point.

You see, I thought hours of mindless entertainment were a kind of complacency I couldn’t afford. Once something becomes normal and routine, we tend to fall into a kind of mindless existence, so when I fell into playing over 100 hours of a game to the exclusion of my “healthy” routine, I started falling into some extreme guilt. I didn’t realize it at first, but I was breaking my rules.

In a crescendo of regret, I had to justify losing face by playing the game to excess, and my mind started racing for answers. I had to find a path across this battlefield to conquer my conflict. What I had going for me was that I am in the process of creating a virtual environment, which might itself be considered a game. I had recently attended Steam Dev Days and returned with a brain full of ideas about the valuable strategic lessons learned from Valve and their experience of how gaming is evolving. I was being inspired by the game I was lost in too. Just as I’m cresting and about to cross this Rubicon, my creative mind is rescued by my conflicting mind, and I find an important answer to a vexing monetization question regarding the VR environment I am working on.

A huge problem has been solved, but at what expense? The conflict alienated my relationships, because for 12 hours a day I cast my normal aside and appeared to be lost. I suppose I could have stayed in my new complacency, but fortunately for me, my brain enjoys a good round of gymnastics, even when thrown to the mat. I bounced up feeling invigorated to tackle the hard issues of how to implement my genius. Oh yeah, did I tell you I like flirting with self-delusional ideas of grandeur too?

Jan 18, 2014

Steam Dev Days main room in the Seattle Convention Center hosted by Valve

Flew up to Seattle, Washington a few days ago to attend the Steam Dev Days event hosted by Valve. I was desperate to go, as it appeared there was going to be an emphasis on Virtual Reality, so I reached out and requested an invite code, and much to my surprise I was sent one. Being the over-enthusiastic zealot I am when I focus my attention on something I’m interested in, getting to listen in on Valve’s ideas of where VR is heading fit my need to satisfy, even amplify, my already off-the-chart curiosity. I visited the convention center the night before to find out just where I needed to be the next morning. I arrived Wednesday morning while they were finishing setting up the hall and our catered hot breakfast. I was trying to be polite when the guys who created the Black Mesa remake of Half-Life strode in and sat where I “was” going to sit. We had about a two-hour wait before things got underway. As it turned out, I sat at the end of the center row; little did I know this would work out PERFECTLY. Gabe Newell, the co-founder of Valve, was the first speaker.

Gabe Newell of Valve and John Wise at Steam Dev Days in Seattle, Washington

As Gabe finished speaking, I was not one of the lucky people able to ask him a question. Not a problem – I saw an opportunity. I got out of my seat, and as he made his way out of the hall through the seated crowd of 2,000 attendees, I followed him. Once in the hallway, I asked my question. As we walked along and he answered, he asked if we could sit down a moment and continue. Wow, I’m kind of blown away by now. After a short while he asked if I’d had the chance to see their implementation of VR. “Nope.” He got up and said, “Come on.” At the end of the hallway is the man behind the curtain, seriously. Gabe asks Atman Binstock, a senior engineer who’s been working on Valve’s VR project, if he can get me in. There are a limited number of spots open to demo this prototype headset during these two days, and they are already reserved for the likes of people from Rockstar, Ubisoft, Intel, etc. Atman says he can offer me an abbreviated demo. I am indebted to Gabe, but in my excitement I’m not even sure I offered him my profound thanks.

Valve's demo room where they were showing a prototype VR headset to select attendees at Steam Dev Days

Now I too am behind the curtain and in the magic room. The walls are covered with “fiducial markers” – they are what the camera on the headset I’ll put on will see so that positional tracking is maintained. For the Oculus Rift demo at CES the week before in Las Vegas, the guys turned the concept around and covered the headset with markers while a camera on the wall kept track of the user’s head movements; that prototype system was called Crystal Cove.

Valve's prototype VR headset being used at Steam Dev Days

This is Valve’s prototype VR headset I had the opportunity to put on. There are no words to describe the next series of events and impressions except to sincerely say that Virtual Reality is going to CHANGE EVERYTHING! EVERYTHING!!! Valve and Oculus call it “Presence” – when any idea that you are in something artificial fades away and you are transported into this “other” place. I stood among cubes, simple cubes stretching off into the distance all around me. I could have been in Tron; it was the most amazing thing I’d ever seen. Until I was standing in a giant cube with the Yahoo home page on its six sides – now this was the most amazing thing I had ever seen. You should know where this is going. Until amazement gave way to epiphany and near-religious experience – I was in the universe of CDAK. This environment that buckled my knees and brought me to tears is a tiny 4k program that originated out of the demoscene. Back in the nascent days of the personal computer industry, “crackers” would create small demos that usually preceded a game they had cracked – very common back when I owned a Commodore 64 and then my Amiga. My breath was taken away; I seriously had to gasp at how beautiful this universe was. At this moment, if there had been any uncertainty, I was convinced that VR is going to steamroll humanity. Exploring intricacy and beauty that is impossible to realize in our physical world is what’s going to make this ubiquitous. By this time I don’t know what to say to Atman as I remove the headset; I’m stunned. Telling anyone what this was or how it felt will never compare to how the individual is going to find their world view permanently altered by just one encounter with VR – assuming it isn’t going to be in an environment shooting people.

Tom Forsyth of Oculus and Atman Binstock of Valve who are both working on Virtual Reality at Steam Dev Days

I emerge back into the real world. Gabe is gone. On the other side of the curtain I’m introduced to Tom Forsyth, who now works with Oculus but had been with Valve. On his right in this photo is Atman Binstock, who works with Valve. I’m tongue-tied. I want to ask questions but my mind is reeling. I concede defeat – my thoughts are not easily unscrambled – and I say bye, though I want to go back in, and head back to the conference floor. I walk away with a smile that will have me looking like an idiot for the next three days.

A slide of some of the inside info Valve shared with attendees at Steam Dev Days regarding making better games

Now I have to focus on the other reason I’m here in Seattle: learning from Valve. When Gabe Newell gave his introductory talk he said he was only going to focus on two things – the two most important things Valve sees for the future: Open Platforms and Virtual Reality. With that in mind, the people here to talk with us are spilling their corporate guts about what’s working and how things have changed. They point out the economic benefits of in-game trade and commerce, and how they can no longer make all the content – the user wants an active role in creating UGC, User Generated Content. Two examples: they were surprised when users made more than 20,000 skins that were being traded in Counter Strike, but that didn’t prepare them for what happened with Portal. Users created more than 318,000 maps, a feat Valve would never have been able to match on their own. We hear over and over about openness, that we must evolve and learn, and that economic connectivity between developers and gamers is of absolute strategic importance. After some amazing talks about the success of Counter Strike, Team Fortress, Portal, and Dota 2 it was time for me to attend the afternoon sessions. I chose the talks about Music in Games, Collaborative Coworking, and Steam’s Early Access program, though I would have liked to attend them all.

Alexis Khouri and Jeremie Noguer of Allegorithmic at Steam Dev Days

At the end of the sessions was a get-together in the main hall with a DJ and drinks. There was also the matter of a give-away that was part of our swag bag. At the start of the conference attendees were handed a bag of goodies and a t-shirt. In that bag was a prototype Steam Controller, a notebook, a pen, some stickers, and of course the really cool Steam Dev Days canvas bag it all came in. But we were also given a card that was to be turned in this evening. Before I collected my free gift I ran into Alexis Khouri and Jeremie Noguer of Allegorithmic. I had just met these two guys the month before in Los Angeles when their CEO was over from France to make the official announcement of their new software called Substance Painter. They told me of something totally cool coming to Substance Designer 4.1, but I can’t say anything about it yet.

A Gigabyte Steam Box handed out at Steam Dev Days

Btw, that card we were given earlier in the day to be traded in at the evening social event – it was this i7-4770R-based, Gigabyte-built Steam Box called the Brix Pro. While I was happy to hear about this and receive my very own, nothing was overshadowing my experience in Valve’s VR room. If it hadn’t been for all the great content of the sessions and how open Valve was being about how content is being sold, I could have easily gone home at around 10:30 on this first day. You can be sure, though, that this mini powerhouse of a computer had me giddy as a kid – the swag bag didn’t disappoint.

Michael Abrash of Valve talking about virtual reality at Steam Dev Days

The next day we heard talks in the main room about ARGs – Alternate Reality Games. What I was waiting for, though, were the afternoon sessions, as there was a big focus on VR. There was so much interest that the room this was supposed to be held in was expanded to handle 700 participants. Michael Abrash gave a great talk about how close we are to commercial virtual reality. Instead of me describing it you should just go to his slide show and read about it yourself – click here.

We also listened to Joe Ludwig who talked about VR and Steam, check out his presentation – click here.

Palmer Luckey was here with his crew from Oculus, including CEO Brendan Iribe, co-founder Nate Mitchell, and Tom Forsyth. Palmer was coming off a successful couple of weeks that saw the company raise $75 million with Andreessen Horowitz and demonstrate the Crystal Cove at CES in Vegas. Palmer leapt on stage and dove into an hour of material compressed into a 30-minute presentation.

Joe Ludwig of Valve, Devin Reimer of Owlchemy, Palmer Luckey of Oculus, and Michael Abrash of Valve at Steam Dev Days

After the individual talks the guys sat down to answer questions from the audience. From left to right are: Joe Ludwig of Valve, Devin Reimer of Owlchemy, Palmer Luckey of Oculus, and Michael Abrash of Valve. The conversation lasted about an hour, after which we broke for a catered dinner in the main room along with more music and drinks.

Palmer Luckey of Oculus Rift fame and John Wise VR pioneer at Steam Dev Days

As I was leaving I ran into Palmer – not for the first time; that was at SIGGRAPH just this past July. I said hello and took the blurriest selfie ever, hence the small black-and-white version I’m posting. Over the two days here I was able to see that an older generation and decades of digital advancements had given a 20-year-old the opportunity to change the world. The people who will make the content that drives that advancement are twenty- and thirty-somethings who are harnessing complexity, are still able to learn new tricks, and will hopefully bend with an entertainment industry that is about to go through the greatest contortions it has ever faced. Get ready world, you ain’t seen nothing yet.

Jan 13, 2014

View from the Hauptwache Cafe in Virtual Reality

As Joe stays busy building elements for our growing virtual reality environment, I’m about to head up to Seattle, Washington. Valve is hosting Steam Dev Days this Wednesday and Thursday and I’m lucky enough to be attending. On the off chance I’ll be able to share a view of what we’ve been working on with Gabe Newell or Palmer Luckey, we worked especially hard today throwing a few things together, including Joe’s newest building, the Hauptwache Cafe from Frankfurt, Germany.

View up the canal from Timefire - a Virtual Reality environment being created for the Oculus Rift

Up in Seattle I’ll be listening to Valve’s plans for VR, along with being introduced to SteamOS and likely getting a peek at their Steam Machines. Some of the talks will address building games for these high-performance computers, which will mostly be running a flavor of Linux, though I have read some will host dual operating systems so Windows can be an option. The highlight, though, will have to be all the information regarding virtual reality. Rumor has it that Valve will be distributing a VR SDK; no idea what to expect from that yet. Palmer Luckey of Oculus Rift fame will be giving a half-hour talk on Thursday, preceded by a talk from Michael Abrash of Valve, one of the instrumental developers behind many of the advancements in VR made this past year.

Looking south in Timefire - a virtual reality environment being developed by PSOIH who are Joe Cunningham and John Wise

The images in today’s post are from the environment Joe and I are developing for the Oculus Rift. They are rough drafts we are playing with while we get our bearings for how things fit and stumble through learning the Unreal Development Kit. Being a two-person team with neither of us ever having built a video game, let alone a virtual reality environment (though not many people on earth could claim to have built much VR), we are enthusiastic, thrilled even, to be able to take things as far as we have.

Over the next few days I’ll be collecting and sharing what information and photos I can, while Joe works in Blender, though he would much rather be studying his brains out delving into UNREAL ENGINE 4!

Jan 11, 2014

As we are in a new year and so much is unfolding regarding the state of VR and the roll-out of the Oculus “Crystal Cove” Rift at the Consumer Electronics Show in Vegas this past week, we have been evaluating and re-evaluating the way we are doing things. One of the big lessons we have learned so far is that we know very little about how things are going to evolve, because it’s all moving much faster than we could have anticipated – probably faster than the guys at Oculus dreamt, too.

What hasn’t changed is that we have to create content – lots of content. This post is a recap of the tools and processes we are fairly certain are going to be the most essential to our workflow. While we have featured the individual tools and occasionally combinations of software that are helping us realize our VR aspirations, this is the first grand overview of the entire process so far, though some tools and processes will be left out as this is already going to be a lengthy post. So let’s jump right in.

Building a virtual Arctic landscape with Terresculptor

Everything starts with a concept, but when it comes down to building a world, we start with TerreSculptor Pro. This software is still in Alpha, yet it is already a powerful contender for the best landscape/terrain-building package available. Today we are going to demonstrate building a frosty arctic environment. First off we choose a 2km by 2km landscape because it’s small and fast to work with. In the Generators we apply a Ridged Noisemap, and then head to Transform where we apply some Hydraulic Erosion. In the Extractors module we create a couple of Weightmaps using the Slope parameter for the upper and lower areas of the ice field. Happy with the appearance of the landscape, we export it as a Raw16 Heightmap that will be imported into Epic’s UDK (Unreal Development Kit) – as opposed to UE4, which we are “patiently” waiting for.
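TerreSculptor does all of this through its GUI, but the core idea – a ridged noisemap written out as a 16-bit raw heightmap – can be sketched in a few lines. To be clear, this is our own illustration and not TerreSculptor’s algorithm: the `ridged_noise` and `export_raw16` names are made up, and the nearest-neighbour octave upscaling is a crude stand-in for proper value noise.

```python
import numpy as np

def ridged_noise(size, octaves=4, seed=0):
    """Cheap ridged-noise stand-in: sum absolute-valued noise
    octaves and invert so the creases become sharp ridges."""
    rng = np.random.default_rng(seed)
    height = np.zeros((size, size))
    amplitude, total = 1.0, 0.0
    for octave in range(octaves):
        cells = 2 ** (octave + 2)            # coarse grid resolution per octave
        grid = rng.standard_normal((cells, cells))
        # upscale the coarse grid to full size (nearest-neighbour here;
        # real value-noise would interpolate smoothly)
        reps = size // cells
        layer = np.kron(grid, np.ones((reps, reps)))
        height += amplitude * (1.0 - np.abs(layer))   # ridged: invert |noise|
        total += amplitude
        amplitude *= 0.5
    return height / total

def export_raw16(height, path):
    """Normalize to 0..65535 and write little-endian unsigned 16-bit,
    the layout a Raw16 heightmap import expects."""
    h = height - height.min()
    if h.max() > 0:
        h = h / h.max()
    (h * 65535).astype('<u2').tofile(path)

heightmap = ridged_noise(256)
export_raw16(heightmap, 'arctic_256.r16')
```

The ridged trick is the inversion step: taking `1 - |noise|` turns the zero-crossings of the noise into sharp crests, which is why the result reads as wind-carved ice rather than rolling hills.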

Creating game assets with Blender for Virtual Reality

The next step is building some assets to populate this new landscape. We start with Blender to make a boulder. We delete the default cube and add a sphere. In the Modifiers tab we add a Subdivision Surface and a Displacement. In Edit mode we enable Proportional Editing and, with vertex select, drag vertices around until the sphere starts to take on the rough shape of a boulder. In the Displacement Modifier we use a Voronoi-type texture to change our smooth boulder into a rough-hewn one. Once happy with the general appearance of the boulder, we duplicate the mesh and put it in a new layer; this will be our low-poly model. From here we can reduce the subdivision amount to something more reasonable for a game asset. With the mesh in our first layer we adjust the subdivisions upwards; this will be the model we bake a Normal Map from over in Substance Designer. We UV unwrap the low-poly model and save out the two meshes, making sure to add Smoothing and Triangulate the mesh before exporting as FBX files.
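The Displace-modifier-with-Voronoi-texture trick is worth understanding on its own: each vertex is pushed along its normal by the texture value sampled at that point. Here is a rough stand-alone sketch of the idea – our own simplified `uv_sphere` builder and F1-distance Voronoi, not Blender’s actual implementation:

```python
import numpy as np

def uv_sphere(rings=16, segments=32):
    """Vertex positions of a unit UV sphere, analogous to Add > UV Sphere."""
    theta = np.linspace(0, np.pi, rings)
    phi = np.linspace(0, 2 * np.pi, segments, endpoint=False)
    t, p = np.meshgrid(theta, phi, indexing='ij')
    return np.stack([np.sin(t) * np.cos(p),
                     np.sin(t) * np.sin(p),
                     np.cos(t)], axis=-1).reshape(-1, 3)

def voronoi_displace(verts, sites=20, strength=0.25, seed=1):
    """Push each vertex outward along its normal by the distance to the
    nearest random 'site' -- the same idea as a Displace modifier driven
    by a Voronoi texture (F1 distance)."""
    rng = np.random.default_rng(seed)
    cells = rng.uniform(-1, 1, size=(sites, 3))
    # F1 Voronoi value: distance from each vertex to its closest site
    d = np.linalg.norm(verts[:, None, :] - cells[None, :, :], axis=2).min(axis=1)
    # on a unit sphere the normal is simply the normalized position
    normals = verts / np.linalg.norm(verts, axis=1, keepdims=True)
    return verts + normals * (d[:, None] * strength)

boulder = voronoi_displace(uv_sphere())
```

Because the F1 distance is continuous but creased along cell boundaries, the displaced sphere picks up the faceted, chipped look that reads as rock.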

Sculpting a mesh in 3D-Coat for a Virtual Reality project

Our next game asset we build in 3D-Coat. We have chosen 3D-Coat as our sculpting tool because Pilgway sells a non-commercial/educational version for only $99 compared to $450 for the educational version of ZBrush; the professional version is also a lot more expensive than 3D-Coat. Starting with a cube, we stretch it into a rectangular form – the rough proportions of the glacier face we are about to create. Using a simple Draw brush on the Surface model we shape the rectangle into a more organic form. Then we select a Mask from one of the Displacement Maps we’ve purchased from Surface Mimic. This Mask will allow us to “displace” the surface of our model.

Sculpting the mesh is definitely one of the most fun aspects of building stuff, and using Masks and Displacement Maps is incredible, as very quickly we are able to bring out an amazing amount of detail that would be difficult to sculpt using other methods. After completing the model we are able to “Autopo” – retopologize – our high-poly glacier in 3D-Coat with pretty good results, even though we haven’t fully learned the details of the process. With a low-poly model in tow, we return to Blender.
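Stripped of the brushwork, the mask-gated displacement 3D-Coat performs reduces to a simple per-vertex operation: offset each vertex along its normal by the displacement-map value, scaled by the mask. A minimal sketch of that idea follows – the `masked_displace` helper and the flat test grid are our own illustration, not 3D-Coat code:

```python
import numpy as np

def masked_displace(verts, normals, disp_map, mask, strength=1.0):
    """Offset each vertex along its normal by the sampled displacement
    value, gated by the mask (0 = untouched, 1 = full effect)."""
    amount = (disp_map * mask * strength)[:, None]
    return verts + normals * amount

# a flat 4x4 grid of vertices in the XY plane, all normals pointing +Z
xs, ys = np.meshgrid(np.linspace(0, 1, 4), np.linspace(0, 1, 4))
verts = np.stack([xs, ys, np.zeros_like(xs)], axis=-1).reshape(-1, 3)
normals = np.tile([0.0, 0.0, 1.0], (16, 1))

disp_map = np.random.default_rng(2).random(16)  # per-vertex displacement sample
mask = np.zeros(16)
mask[:8] = 1.0                                  # only half the surface unmasked
glacier_face = masked_displace(verts, normals, disp_map, mask)
```

The masked half of the grid stays perfectly flat while the unmasked half takes on the displacement detail, which is exactly why a good mask lets a purchased displacement map do most of the sculpting work.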

UV Unwrapping a mesh in Blender for a Virtual Reality project

Back in Blender we import the meshes. This step is here because we are most familiar with UV Unwrap in Blender. After unwrapping our model we export both versions as FBX; time to start applying textures.

Creating textures in Substance Designer for meshes made in Blender for a Virtual Reality project

As we have written before, we have chosen Allegorithmic’s Substance Designer to do our texture/material work. Again, here we have a great bargain, with the company offering a non-commercial/educational version for only $95. We need a Normal Map baked off the high-poly mesh; that’s easy here in Substance with its baking tools. With the Normal dragged into the Graph view we add a procedural Ice texture, tile it, blend its Normals with the mesh Normal, add a Specularity channel, and publish the .SBSAR file. Now, with a tiny 3MB procedural file as opposed to a multi-dozen-megabyte bitmap, we can take this custom texture into UDK.
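Baking aside, the basic height-to-normal conversion underlying this kind of Normal work is straightforward: take the gradient of a heightmap and pack it into the usual RGB encoding. Here is a rough approximation of what a height-to-normal filter does – our own sketch, not Substance’s actual (and more sophisticated) implementation:

```python
import numpy as np

def height_to_normal(height, strength=1.0):
    """Convert a heightmap to a tangent-space normal map via finite
    differences on the height values."""
    # gradient of height along rows (y) and columns (x)
    dy, dx = np.gradient(height.astype(float))
    # per-pixel normal = normalize(-dH/dx, -dH/dy, 1)
    n = np.dstack([-dx * strength, -dy * strength, np.ones_like(height, dtype=float)])
    n /= np.linalg.norm(n, axis=2, keepdims=True)
    # remap from [-1, 1] into the 0..255 RGB encoding normal maps use
    return ((n * 0.5 + 0.5) * 255).astype(np.uint8)

normal_map = height_to_normal(np.random.default_rng(0).random((64, 64)))
```

A perfectly flat heightmap comes out as the familiar uniform light-blue normal map (all normals pointing straight up), which is a handy sanity check when debugging a baking graph.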

Working with Substance Painter by Allegorithmic on a Virtual Reality project

We are going to fork out of the workflow for a second. A few weeks ago we drove to Los Angeles for a presentation on Allegorithmic’s new creative software called Substance Painter. Not only did we see it in action, a few days later we were able to download the first Alpha of this 3D painting package. This is a fork because we can’t actually use it in our workflow yet. The ability to paint on 3D meshes promises to make our characters a lot more interesting than if we were just to texture them with bitmaps. Not only will we be using procedural textures out of Substance Designer, we’ll be painting directly on the models to add more layers of customization. It’s still really rough around the edges, but we can already tell that this is going to make a lot of people super happy. Last word about Painter: it was with the announcement of this software that we came to learn about Surface Mimic’s Displacement Maps, and about PopcornFX, which makes the incredible particle effects used to paint our rock.

Previewing a mesh in Marmoset Toolbag 2.0 using a texture from Substance Designer

Back to the flow. While working in Substance Designer we get a pretty good idea of what a texture is going to look like on a material, but if we want to see our mesh in various lighting scenarios and make some quick lighting adjustments that will give us a better idea of how an asset will look in game, we bring our mesh and material into Marmoset’s Toolbag 2.0. Not only does Toolbag work with a PBR (Physically Based Rendering) process, we can choose from a host of included Skyboxes, bring in our high-poly meshes, set things up for rendered proofs, record animations, and even preview our Substances.

Bringing together assets from Terresculptor, 3D-Coat, Blender, and Substance Designer in Unreal Development Kit for a Virtual Reality project

Now it’s time to bring all the pieces together in UDK. We first import the Heightmap along with the Weightmaps. Then we import the meshes and the Substances. The boulders become icebergs, a plane with a water material becomes a glacial lake. Load up a robot and add a boat from another scene and in just a few hours a basic environment is starting to take shape. We started from scratch and improvised along the way to see how long it would take us to prototype something out of nothing. If this were a project intended for inclusion in our world, a lot more time would be given to working out details in every aspect of the terrain, meshes, and textures.


Jan 2, 2014

A view of town as it is being built by PSOIH

Like the days and year ahead, there are things that cannot be seen until they present themselves. There’s a lot of work that goes on behind the scenes, behind the curtain, and within the mind when magic is being manifested. We are aware that this is the year when a new reality is going to unfold, a place familiar, yet different; it is popularly known as Virtual Reality. With this reinterpretation of our brave new world, the doors of perception will surely be knocked from their hinges. This new food of the gods is being built using the very electrons that manifest and alter our reality; we are the engineers of tomorrow and a billion futures. Happy New Year, world.