OK, it wasn’t me exactly. It was more like some of my software-based representatives.
Hi, I’m Ed (@edhjones). And I’m not one of the AppsLab folks. But I’m always interested in the work they do, so I try to hang out with them whenever we’re co-located. I was intrigued, then, when, a couple of months prior to OOW16, I got an email from Jake (@jkuramot) CC’ing Raymond (@yuhuaxie). It said, simply:
“Ed did something cool w APEX and Minecraft that he showed at Kscope15 … you two should talk and share notes.”
What was this cool something that I did? For starters, whilst undoubtedly cool (even if I do say so myself!), it wasn’t really Minecraft. Although, to be fair, it did look quite a bit like it, thanks to my rather basic 3D modeling skills, and because I borrowed some textures from BDCraft. It’s actually something that I whipped up for the APEX Open Mic night at the Kscope15 conference. It was just an experiment at the time, so I was very excited that the AppsLab wizards might be able to put it to some use.
The Original
It’s a web application running on Oracle’s internally hosted APEX (#orclapex) instance. It uses Three.js to present an interactive 3D visualization of information in the Oracle database, and that visualization just so happens to look somewhat like a blocky character walking around in a low-poly world. The data in question comes from back-end service calls to the US Geological Survey’s point query service; the results are cached in the database and served to clients as streamed chunks of JSON. In the case of the demo, the elevation data was used to simulate a scaled-down version of Hawaii.
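To give a flavor of how little it takes to get the blocky look, here’s an illustrative sketch (my reconstruction, not the original code): once a chunk of cached elevation data reaches the browser, each sample simply becomes a box. The `scene` object and the `elevations` grid are assumed to come from a standard Three.js setup and a prior data fetch.

```javascript
// Sketch: turn a grid of elevation samples into low-poly terrain.
// Assumptions: `scene` is a THREE.Scene, `elevations` is a 2D array of metres.
const TILE = 10;    // world units per grid cell
const SCALE = 0.05; // vertical exaggeration for the scaled-down island

const material = new THREE.MeshLambertMaterial({ color: 0x7cfc00 });

for (let x = 0; x < elevations.length; x++) {
  for (let z = 0; z < elevations[x].length; z++) {
    const height = Math.max(1, elevations[x][z] * SCALE);
    // One box per sample is all it takes for the "Minecraft" look.
    const block = new THREE.Mesh(new THREE.BoxGeometry(TILE, height, TILE), material);
    block.position.set(x * TILE, height / 2, z * TILE);
    scene.add(block);
  }
}
```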
Other service calls reach out to Clara.io, the browser-based 3D modeling and animation software, from which some of the character models are loaded on the fly. Other scenery, like rocks and trees, is generated procedurally from pseudo-random seeds calculated from each object’s geographical location in the virtual world. No Man’s Sky, eat your heart out.
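The procedural part is worth a sketch of its own. In hedged form (my own illustration, with the well-known mulberry32 PRNG standing in for whatever the original used): hash each cell’s grid coordinates into a seed, and every client deterministically grows the same rocks and trees, with no scenery data to transmit at all.

```javascript
// mulberry32: a tiny, deterministic 32-bit PRNG.
function seededRandom(seed) {
  return function () {
    seed = (seed + 0x6d2b79f5) | 0;
    let t = Math.imul(seed ^ (seed >>> 15), seed | 1);
    t ^= t + Math.imul(t ^ (t >>> 7), t | 61);
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

// Combine the cell coordinates into one seed using two spatial-hashing primes.
function sceneryFor(gridX, gridZ) {
  const rand = seededRandom((gridX * 73856093) ^ (gridZ * 19349663));
  if (rand() < 0.1) {
    return { type: rand() < 0.5 ? 'rock' : 'tree', scale: 0.5 + rand() };
  }
  return null; // most cells stay empty
}
```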
The “game” aspect of the demo is implemented as (yet more!) service calls to Oracle’s Social Network. Conversations that you create in the Social Network, and tag in a certain way, appear in the visualization as chests. You open a chest and see a “parchment” containing the related OSN conversation, which gives you a clue as to where the next chest might be, and so on, until you complete the treasure hunt.
It’s also multiplayer, so many people can be hunting together. And they can see each other, unlike in No Man’s Sky. And it’s integrated with our internal OraTweet micro-blogging platform, built many years ago by Noel (@noelportugal) and still going strong, to allow those players to talk to each other from within the game.
But, why? As I say, it was an experiment: an experiment into the amazing capabilities that today’s modern browsers provide, specifically hardware-accelerated 3D graphics and HTML5 audio. It’s also a demonstration of how seamlessly Oracle Application Express (APEX) can interface with a multitude of external services and efficiently handle large volumes of data; there are a lot of data points in a map of Hawaii. It was (IMHO) a cool experiment. I’d moved on to other things, but now it was about to get a new lease of life.
The Remix
The discussions kicked off with Jake and Raymond mentioning that they were investigating some interesting experimental control schemes and devices, but they needed something (fun, preferably) to control. Exactly what those control schemes are will be the subject of a future post from Raymond but, suffice it to say, if we could resurrect my experiment and connect it up to these devices, then that would surely make a cool demo for Oracle OpenWorld.
Since I didn’t know what environment I’d be running in (it might not have access to Oracle’s internal network, or any network at all, for example), I wanted to make it a bit easier to move the application around, and I wanted to reduce the dependencies on other systems. So, here’s what I did:
- Modified the original APEX application so that, instead of serving up an interactive web application, it just serves up the data that application relied upon. That let me create a new (simplified) version of the application running more or less from static files.
- Ported (a simplified version of) the back-end logic from the original PL/SQL to Node.js, running a locally hosted Express-based server. (A minimal sketch of this server follows the list.)
- Added gamepad support via the HTML5 Gamepad API (also sketched below), because I have a PS4 controller that’s color-coordinated with my Mac. 😉
- Made it prettier by adding a reflective, animated water shader, showcasing even more how powerful browser-based 3D has become.
- Added decent collision detection (in the original demo I just cunningly avoided running through anything!) by integrating the Matter.js library; the last sketch after this list shows the approach. Modern JavaScript is more performant than you might imagine.
- Since I’d removed many of the external service calls, including those to the Oracle Social Network, we needed something for the players to do: the chests now shower you with treasure, courtesy of GPU-driven particle effects!
- And there are also pigs! Initially they’re trapped in pens around the island, but you can push the barriers out of the way and set them free. Then they just wander around aimlessly until they, rather sadly, fall into the sea.
- But all is not lost because, fortunately, you are equipped with a “Magic Ball” which looks strangely familiar. You can launch this toward any pigs you’ve freed, capture them, and teleport them to a place of safety.
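Here’s roughly what the simplified server from the first bullet could look like. A minimal sketch under my own assumptions: `loadTile` is a hypothetical lookup into the locally cached elevation data, and the port and paths are placeholders.

```javascript
// Minimal Express server: static game files plus cached elevation data.
const express = require('express');
const app = express();

app.use(express.static('public')); // the "more or less static" game files

app.get('/elevation/:tile', (req, res) => {
  // loadTile is a hypothetical helper that reads the locally cached USGS data.
  res.json(loadTile(req.params.tile));
});

const httpServer = app.listen(3000); // reused below for the socket layer
```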
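Gamepad support needs surprisingly little code. A hedged sketch, assuming the browser’s standard gamepad mapping; `movePlayer` and `throwBall` are hypothetical game functions.

```javascript
// The HTML5 Gamepad API has no button events, so state is polled each frame.
function pollGamepad() {
  const pad = navigator.getGamepads()[0]; // first connected controller
  if (pad) {
    movePlayer(pad.axes[0], pad.axes[1]);    // left stick: x and y
    if (pad.buttons[0].pressed) throwBall(); // "cross" on a PS4 pad
  }
  requestAnimationFrame(pollGamepad);
}

let polling = false;
window.addEventListener('gamepadconnected', () => {
  if (!polling) { polling = true; pollGamepad(); }
});
```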
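And the collision detection. Matter.js is a 2D engine, but because the characters never leave the ground, the island can be flattened onto a plane, with Matter’s y-axis standing in for the world’s z-axis. This is my reconstruction of the approach, not the original code; `playerMesh` is a hypothetical Three.js mesh.

```javascript
const { Engine, Bodies, Composite } = Matter;

const engine = Engine.create();
engine.world.gravity.y = 0; // top-down plane: no gravity, just collisions

const player = Bodies.circle(0, 0, 5);          // x, z, radius
const barrier = Bodies.rectangle(40, 0, 20, 4); // a pushable pen barrier
Composite.add(engine.world, [player, barrier]);

function step(deltaMs) {
  Engine.update(engine, deltaMs);
  // Each frame, copy the physics positions back onto the Three.js meshes,
  // mapping Matter's (x, y) onto the world's (x, z).
  playerMesh.position.x = player.position.x;
  playerMesh.position.z = player.position.y;
}
```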
All fun and games. But we still needed some kind of controls. And, at this point, I had no concrete idea of exactly what scheme Raymond was dreaming up. So, we needed a “loose” way of providing bi-directional communications between the game and something.
The browser client was then connected to the server using Socket.IO to facilitate real-time communication between the two. When certain events happen in the client (you rescue a pig, for example), messages are sent to the server; when the server sends certain messages (for example, a command to “push” something), the client performs a pre-determined action, like pushing a barrier out of the way.
At the server end, I added functionality to listen for messages sent to specific MQTT topics, interpret them, and pass appropriate actions on through the WebSocket connection to the browser client. The theory being: we can now connect up any input device, even remote ones and multiple different ones, as long as they’re able to send the right messages to the right topic on an MQTT broker somewhere.
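In code, the bridge can be as small as this. A sketch assuming the `mqtt` and `socket.io` npm packages; the broker URL and topic names are placeholders, and `httpServer` is the Express server from the earlier sketch.

```javascript
const mqtt = require('mqtt');
const io = require('socket.io')(httpServer);

const broker = mqtt.connect('mqtt://broker.example.com');
broker.subscribe('game/controls');

// Inbound: anything that can publish to the topic can drive the game.
broker.on('message', (topic, payload) => {
  const action = JSON.parse(payload.toString()); // e.g. { command: 'push' }
  io.emit('control', action);                    // fan out to browser clients
});

// Outbound: game events flow back the other way.
io.on('connection', (socket) => {
  socket.on('pig-rescued', (event) => {
    broker.publish('game/events', JSON.stringify(event));
  });
});
```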
To test this out, before we had the real control devices available, I used jQuery Mobile to whip up a quick interface for my phone (served from the same Node.js server as the main application) which sends messages to the broker, which are then passed on to the client. It’s a pretty cool experiment: controlling a 3D world that’s hosted on my MacBook (but deployable to any Node.js application container platform; I used Modulus), running in Chrome on a gaming PC displayed on the TV in your lounge, from an interface on your phone whilst standing on the sidewalk at the opposite end of the street, with messages bouncing from my tiny home town in Australia via an MQTT topic on the other side of the planet.
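For completeness, the server side of that phone interface only needs a tiny endpoint that republishes taps onto the MQTT topic. An illustrative sketch (the route and payload shape are my own placeholders, not the original code):

```javascript
// Republish control taps from the jQuery Mobile UI onto the MQTT topic.
app.use(express.json());
app.post('/control', (req, res) => {
  broker.publish('game/controls', JSON.stringify(req.body)); // e.g. { command: 'push' }
  res.sendStatus(204);
});
```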
At this point, I made a final push to GitHub and was done. Now it was up to Raymond to weave his Maker Magic and connect up his innovative control devices. Happy that my little 3D people and pigs would be off on their own to Oracle OpenWorld 2016, I simply left it in the more than capable hands of our beta-testers.
@Jake: did you guys do any usability testing? Or did you consider the event at OOW to be “usability field testing”?
Thao is going to perma-ban me from commenting on your blog LOL
@Joyce: No, this is purely research for now, exposing a concept in a fun, engaging way to elicit feedback and responses from people.