The biggest news coming out of the Game Developers Conference (GDC) in San Francisco might be the next-gen Oculus Rift dev kit or Sony's new “Project Morpheus.” However, the true sleeper surprise is currently little more than a footnote in the announcement of the Unity 5 engine. That small but vitally important announcement was the new partnership between Mozilla and Unity to create a plugin-free browser experience that uses Unity as the content controller. This is not the only partnership between Mozilla and a game engine; Unreal Engine 4 has also been ported to the browser.
People often miss the implications of stronger browser integration for AAA-level browser content. Put into the context of next-generation immersive technologies, though, you need to consider what types of interactions users will want as they browse the web. Will a static 2D page remain the standard in a future where augmented- and virtual-reality devices permeate the interaction space? I suspect that 2D browsing won't vanish, but that 3D web experiences will feel more natural to people from an HCI perspective, at least for applications where word processing isn't vitally important.
Creating a heavy-client (local content) WebGL system might sound counter-intuitive in this apex age of the “cloud.” However, utilizing local resources rather than a streaming service such as OnLive or Nvidia GRID is actually starting to make sense. We live in a world where Moore's law continues for chip design and GPU parallelism, but the non-commercial bandwidth provided by ISPs has remained stagnant for the last five years. Recently though, monolithic Comcast has annexed Time Warner, and Verizon has effectively killed net neutrality.
These big self-serving monopolies no longer need to innovate on residential speeds; they can instead focus their resources on the important business task of killing off all content competitors who are reliant on their services (Netflix, Google TV, Amazon Prime, P2P, maybe all of WebRTC). We are headed into a “cloud-service” dark age. Low-bandwidth content won't suffer, but video and web gaming are being forced into an arena where they either pay up or won't work. This might lift a decade from now if Google Fiber or gigabit WiMAX/LTE appears at a reasonable price… but otherwise, we should settle in, because winter is coming.
Google Chrome Experiments and Firefox helped get some of these initiatives started, but truly, the advent of git & mercurial and the generosity of superbrains like “Mr.doob” helped shape the popular ThreeJS engine. However, despite the “awesome factor” of these exciting new technologies, none has yet emerged with the full capabilities of a modern game engine. Unity had its plugin-based web player, but it wasn't the same render environment as the primary engine itself. This new iteration of Unity looks to be essentially the full engine, perhaps with a lower polygon budget.
Here are the most popular of the WebGL experiments to date:
- ThreeJS: The favorite of WebGL devs everywhere. It's free, open source, well documented, and allows for lower-level shader integration. The /r/Simulate team used it for our WebHexPlanet app. Everyone loves ThreeJS! A COLLADA-to-JSON pipeline exists for loading models, but animations are still challenging.
- BabylonJS: Originally created as a Microsoft CodePlex project, but eventually released under the Apache 2.0 license. It handles very similarly to ThreeJS in terms of scene library calls and animation. It hasn't been around as long as Three, though, so it has fewer extensions at the moment.
- Goo Engine: Proprietary software, but with a lot of animation-focused libraries. The idea is that Goo would like to be interaction-focused instead of scene-focused. I imagine only time will tell.
- SceneJS: The implementation of the SceneJS API includes a scene graph engine, which uses JSON to create and manipulate nodes in the graph. This is similar to the architecture designed by Aaron for MetaSim, which worked on top of ThreeJS.
- Virtual World Framework: The VWF was originally funded with DoD money, but is now open under the Apache license. VWF utilizes NodeJS with WebSockets for a messaging layer. There is also an impressive Virtual Sandbox that includes authoring tools and instance storage. This project is very powerful and probably the most overlooked for its capabilities.
- More are listed at WebGL-Game-Engines.com
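The JSON scene-graph idea behind SceneJS (and MetaSim's architecture on top of ThreeJS) can be sketched in a few lines of plain JavaScript. Note that the node types, ids, and helper functions below are illustrative assumptions, not SceneJS's actual API: the point is that the scene is just a tree of JSON objects, and "animation" is mutating their properties each frame before a render pass walks the tree.

```javascript
// A JSON scene graph: each node is a plain object with a type, an id,
// some properties, and a list of child nodes. (Illustrative, not SceneJS's API.)
const sceneGraph = {
  type: "scene", id: "root", nodes: [
    { type: "rotate", id: "spin", angle: 0, axis: { y: 1 }, nodes: [
      { type: "material", id: "redMat", color: { r: 1, g: 0, b: 0 }, nodes: [
        { type: "geometry", id: "cube", primitive: "box" }
      ]}
    ]}
  ]
};

// Depth-first search for a node by id.
function findNode(node, id) {
  if (node.id === id) return node;
  for (const child of node.nodes || []) {
    const found = findNode(child, id);
    if (found) return found;
  }
  return null;
}

// Per-frame update: mutate JSON properties; a renderer would then
// traverse the graph and issue WebGL calls for each node type.
function tick(graph, dtSeconds) {
  const spin = findNode(graph, "spin");
  spin.angle = (spin.angle + 90 * dtSeconds) % 360; // 90 deg/sec
}

tick(sceneGraph, 0.5);
console.log(findNode(sceneGraph, "spin").angle); // 45
```

Because the whole scene is serializable JSON, it can be shipped over HTTP, diffed, or synchronized over WebSockets, which is essentially the angle VWF takes with its NodeJS messaging layer.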
However, this has all been the how and the what; I need to elaborate on the why. WebGL is important because of the next-generation web discussed earlier. Augmented- and virtual-reality hardware is starting to proliferate among consumer devices, as in the newest clash between Sony and Oculus. Whereas the aughts were focused on the console hardware wars, this decade will begin to focus on the peripheral wars. Visual immersion (Oculus), tactile sensing (touch screens), and full-body motion (Kinect) have already become part of the entertainment experience. These technologies are only going to improve as new types of devices enter the market every 6-18 months. The mobile market is sluggishly toying with Google Glass, but once contact-lens AR is fully commercialized, it will be difficult for the public to resist the utility of full AR immersion.
Which leads us back to WebGL. Once we have undergone what Kurzweil describes as the transition from mobility to ubiquity, the web will not be something that just exists on a pocket device; it will be everywhere. Our world is 3D, so we will need fast-deploy web standards that operate in a 3D space. Building this infrastructure on the existing technology of the web will mean that augmented locations can be visited as easily as a web page is today. It will feel more intuitive than reading a four-inch screen, and may very well become the most common method of human interaction. Certainly screen resolution has been trending upward much faster recently than it ever has before.
Imagine Skype/FaceTime on steroids: cameras and lidar pick up the room around you, generate a 3D model of your friends, and then display them as if they were there, with interpolation for smooth animation. All the current signs indicate that this will evolve from web standards, not some other universally compiled set of rendering standards. It will probably even utilize HTTP for asset streaming and latency-agnostic communications.