Kscope15 Impressions

July 6th, 2015 2 Comments

As per Jake’s post, we got to spend a few days in Florida to support the Scavenger Hunt that we created for the Kscope15 conference.  Since it ran pretty smoothly, we were able to attend a few sessions and mingle with the attendees and speakers. Here are my impressions of the event.


This was my first time at Kscope.  Jake hyped it up as a not-to-miss conference for Oracle developers, and despite my high expectations of the event, it did not disappoint.  The actual conference started Sunday, but we arrived Saturday to set up everything for the Scavenger Hunt, dot a few i’s and cross some t’s.

We also ran a quick training session for the organizers helping with the administration of the Scavenger Hunt, and later that night we started registering players for the hunt.  We signed up about 100 people on the first evening.  Registration continued Sunday morning, and we picked up about 50 more players for a grand total of 150, not bad for our first Scavenger Hunt.


The number of sessions was a bit overwhelming, so I decided to focus on the Database Development and Application Express tracks and picked a few sessions from them.  The first one I attended was called “JSON and Oracle: A Powerful Combination,” where Dan McGhan (@dmcghan) from Oracle explained how to produce JSON from data in an Oracle Database, how to consume JSON in the Oracle Database, and even how to use it in queries.

It turns out that Oracle 12.1.0.2 has some new, really cool features to work with JSON, so be sure to check those out.  Interestingly, our Scavenger Hunt backend is using some of these techniques, and we got some great tips from Dan on how to improve what we were doing. So thanks for that, Dan!
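As a flavor of what that looks like, here is a minimal sketch of querying JSON stored in a 12.1.0.2 table from Node.js with the node-oracledb driver. The table and column names (hunt_events, payload) and the connection details are made-up placeholders, not our actual backend.

```javascript
// Minimal sketch: querying JSON stored in Oracle 12.1.0.2 from Node.js.
// Assumes node-oracledb is installed; hunt_events/payload are hypothetical names.
var oracledb = require('oracledb');

oracledb.getConnection(
  { user: 'hunt', password: 'secret', connectString: 'localhost/orcl' },
  function (err, connection) {
    if (err) { console.error(err); return; }

    // JSON_VALUE extracts a scalar from a JSON document stored in a column
    // that carries an IS JSON check constraint.
    connection.execute(
      "SELECT json_value(payload, '$.player.name') AS player " +
      "FROM hunt_events " +
      "WHERE json_value(payload, '$.points' RETURNING NUMBER) > 10",
      [],
      function (err, result) {
        if (err) { console.error(err); return; }
        console.log(result.rows); // e.g. [['Mark'], ['Raymond']]
        connection.release(function () {});
      }
    );
  }
);
```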

Next I went to “A Primer on Web Components in APEX,” presented by my countryman Dimitri Gielis (@dgielis).  In this session, Dimitri demonstrated how you can easily integrate Web Components into an APEX application.  He showed an impressive demo of a camera component that took a picture right from the web application and stored it in the database.  He also demoed a component that integrated voice control into an APEX application; this allowed him to “ask” the database for a row, and it would retrieve that row and show it on the screen. Very cool stuff.

That night also featured the infamous “APEX Open Mic,” where anybody can walk up to the mic and get five minutes to show off what they’ve built with APEX: no judging, no winners or losers, just sharing with the community. I must say, some really impressive applications were shown, not the least of which was one by Ed Jones (@edhjones) from Oracle, who managed to create a Minecraft-like game based on Oracle Social Network (OSN) data where treasure chests in the game represent OSN conversations. Opening the chest opens the conversation in OSN. Be sure to check out his video!

The next day, I attended two more sessions, one by our very own Noel Portugal (@noelportugal) and our Group Vice President, Jeremy Ashley (@jrwashley). I am sure they will tell you all about it through this channel or another, so I am leaving that one for them.


The other session was called “An Introduction to JavaScript Apps on the Oracle Database,” presented by Dan McGhan.  Dan demonstrated how you can use Node.js to enhance your APEX application with, among other things, WebSocket functionality, something not natively offered by APEX.  Here I also learned that Oracle 12c has a feature that allows you to “listen” for particular changes in the database and then broadcast these changes to interested parties (Node.js and then WebSockets in this case). This is for sure something that we are going to be using in the future in some of our demos.
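The sketch below is only my rough approximation of that pattern: watch for table changes (here faked with polling, standing in for a real change-notification subscription) and push them to browsers with socket.io. The table name and the watch function are assumptions, not Dan’s code.

```javascript
// Rough sketch of the pattern from Dan’s session: detect table changes in the
// database and push them to browsers over WebSockets with socket.io.
// watchOrdersTable() is a hypothetical stand-in for a real change-notification
// subscription (Oracle CQN via a driver, or simple polling as faked here).
var http = require('http').createServer();
var io = require('socket.io')(http);

function watchOrdersTable(onChange) {
  setInterval(function () {
    // ...query the database here and call onChange() only when something changed
    onChange({ table: 'ORDERS', changedAt: new Date() });
  }, 5000);
}

io.on('connection', function (socket) {
  console.log('browser connected:', socket.id);
});

watchOrdersTable(function (change) {
  // Broadcast to every connected browser, e.g. an APEX page that loaded
  // the socket.io client and listens for 'table-changed'.
  io.emit('table-changed', change);
});

http.listen(3000, function () {
  console.log('WebSocket bridge listening on :3000');
});
```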

The third day was Hands-On day, and I attended two more sessions: first “Intro to Oracle REST Data Services” by Kris Rice (@krisrice) from Oracle, and then “Soup-to-Nuts of Building APEX Applications” by David Peake (@orcl_dpeake) from Oracle.

In the first one we were introduced to ORDS, which lets you create REST services straight on top of the Database, no middle tier required!  I’ve seen this before in MySQL, but I did not know you could also do this with an Oracle DB. Again, this is a super powerful feature that we will be using for sure in future projects.
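To show why that’s handy, here is a minimal sketch of consuming an ORDS-published endpoint from Node.js. The host and the /ords/hunt/players/ path are hypothetical; they assume you have already defined the module and template in ORDS.

```javascript
// Minimal sketch: consuming a REST endpoint published by ORDS straight from the
// database. The host and /ords/hunt/players/ path are hypothetical; substitute
// whatever module and URI template you defined in ORDS.
var https = require('https');

https.get('https://example.com/ords/hunt/players/', function (res) {
  var body = '';
  res.on('data', function (chunk) { body += chunk; });
  res.on('end', function () {
    // ORDS returns JSON, typically with the rows under an "items" array.
    var payload = JSON.parse(body);
    console.log(payload.items);
  });
}).on('error', function (err) {
  console.error(err);
});
```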

The second, two-hour session was a walk-through of a full-fledged APEX application from start to finish by the always entertaining David Peake.  I must admit that by that time I was pretty much done, and I left the session halfway through building my application. However, Raymond (@yuhuaxie) managed to sit through the whole thing, so maybe he can give some comments on this session.

All I can say is that it was extremely easy to get started with APEX 5.0 and build a nice web application.

And that was Kscope15 in a nutshell for me.  It was an awesome, exhausting experience, and I hope I can be there again in 2016.

Cheers,

Mark.

Seven Weeks with the Fitbit Surge

July 2nd, 2015 3 Comments

As my wearables odyssey continues, it’s time to document my time with the Fitbit Surge.

I ended up wearing the Surge a lot longer than I’d worn the Nike+ Fuelband, the Basis Peak and the Jawbone UP24 because June was a busy month, and I didn’t have time to switch.

For comparison’s sake, I suggest you read Ultan’s (@ultan) review of the Surge. He’s a hardcore fitness dude, and I’m much more a have-to-don’t-like-to exercise guy, which makes for a nice companion read.

As usual, this isn’t a review, more loosely-coupled observations. You can find lots of credible reviews of the Surge, billed as a “Super Watch” by the recently IPO’ed Fitbit, e.g. this one from Engadget.

Here we go.

The watch

As with most of the other wearables I’ve used, the Surge must be set up from software installed on a computer. It also requires the use of a weird USB doohickey for pairing, after which the watch firmware updates.


I get why they provide ways for people to sync to software installed on computers, but I wonder how many users really eschew the smartphone app or don’t have a smartphone.

Anyway, despite Fitbit Connect, the software you have to install, saying the firmware update process will take five to ten minutes, my update took much longer, like 30 minutes.

Physically, the Surge is chunky. Its shape reminds me of a door-stop, like a wedge. While this looks weird, it’s really a nice design idea, essentially tilting the display toward the user, making it easier to read at a glance.


I found wearing the device to be comfortable, although the rubber of the band did make my skin clammy after a while, see the Epilogue for more on that.

The display is easy to read in any light, and the backlight comes on automatically in low light conditions.

The Surge carries a water-resistance rating of 5 ATM, which amounts to a depth of 50 meters, but for some reason, Fitbit advises against submerging it. Weird, right?

Not one to follow directions, I took the Surge in a pool with no ill effects. However, once or twice during my post-workout steam, the display did show some condensation under the glass. So, who knows?

The device interface is a combination of touches and three physical buttons, all easy to learn through quick experimentation.

The watch screens show the day’s activity in steps, calories burned, miles, and floors climbed. It also tracks heart rate via an optical heart rate sensor.

In addition, you can start specific activity tracking from the device including outdoor running with GPS tracking, which Ultan used quite a lot, and from what I’ve read, is the Surge’s money feature. I only run indoors on a treadmill (lame), so I didn’t test this feature.

The Surge does have a treadmill activity, but I found its mileage calculation varied from the treadmill’s, e.g. 3.30 miles on the treadmill equated to 2.54 on the Surge. Not a big deal to me, especially given how difficult it would be for a device to track mileage accurately through sensors alone.

Speaking of, the Surge packs a nice array of sensors. In addition to the aforementioned GPS and optical heart rate sensor, it also sports a 3-axis accelerometer and a 3-axis gyroscope.

The Surge tracks sleep automatically, although I’m not sure how. Seemed to be magically accurate though.

Fitbit advertises the Surge’s battery life as seven days, but in practice, I only got about four or five days per charge. Luckily, Fitbit will inform you when the battery gets low via app notifications and email, both of which are nice.

Happily, the battery charges very quickly, albeit via a proprietary charging cord. Lose that cord, and you’re toast. I misplaced mine, which effectively ended this experiment.

The app and data

As Ultan mentioned in his post, the Fitbit Aria wifi scale makes using any Fitbit device better. I’ve had an Aria for a few years, but never really used it. So, this was a great chance to try it with the Surge.

Fitbit provides both mobile and web apps to track data.

I mostly used the mobile app which shows a daily view of activity, weight and food consumption, if you choose to track that manually. Tapping any item shows you details, and you can swipe between days.

It’s all very well-done, easy to use, and they do a nice job of packing a lot of information into a small screen.

From within the app, you can set up phone notifications for texts and calls, a feature I really liked from wearing the Basis Peak.

Noel, send me a text message.


Unfortunately, I only got notified about half the time, not ideal, and I’m not the only one with this issue. Danny Bryant (@dbcapoeira) and I chatted about our Surge experiences at Kscope, and he mentioned this as an issue for him as well.

Fitbit offers Challenges to encourage social fitness competition, which seems nice, but not for me. There are badges for milestones too, like walking 500 miles, climbing 500 floors, etc. Nice.

Sleep tracking on the mobile app is pretty basic, showing number of times awake and number of times restless.

Fitbit’s web app is a dashboard showing the same information in a larger format. They hide some key insights in the Log section, e.g. the sleep data in there is more detailed than what the dashboard shows.

Fitbit Dashboard

Fitbit Log: Track My Sleep on Fitbit

Fitbit Log: Track My Activities on Fitbit

I have to say I prefer the Jawbone approach to viewing data; they only have a mobile app which dictates the entire experience and keeps it focused.

Fitbit sends weekly summary emails too, so yet another way to view your data. I like the emails, especially the fun data point about my average time to fall asleep for the week, usually zero minutes. I guess this particular week I was well-rested.


I did have some time zone issues when I went to Florida. The watch didn’t update automatically, and I did some digging and found a help article about traveling with your Fitbit with this tip:

Loss of data can occur if the “Set Automatically” timezone option in the app’s “Settings” is on. Toggle the “Set Automatically” timezone option to off.

So for the entire week in Hollywood, my watch was three hours slow, not a good look for a watch.

And finally, data export out of Fitbit’s ecosystem is available, at a cost. Export is a premium feature. “Your data belongs to you!” for $50 a year. Some consolation though: they offer a free trial for a week, so I grabbed my data for free, at least this time.

Overall, the Surge compares favorably to the Basis Peak, but unlike the Jawbone UP24, I didn’t feel sad when the experiment ended.

Epilogue

Perhaps you’ll recall that Fitbit’s newer devices have been causing rashes for some users. I’m one of those users. I’m reporting this because it happened, not as an indictment of the device.

I wore the Surge for seven weeks, pretty much all the time. When I took it off to end the experiment, my wife noticed a nasty red spot on the outer side of my arm. I hadn’t seen it, and I probably would never have noticed.

It doesn’t itch or anything, just looks gnarly. After two days, it seems to be resolving, no harm, no foul.

The rash doesn’t really affect how I view the device, although if I wear the Surge again, I’ll remember to give my skin a break periodically.

One unexpected side effect of not wearing a device as the rash clears up is that unquantified days feel weird. I wonder why I do things if they’re not being quantified. Being healthy for its own sake isn’t enough. I need that extra dopamine from achieving something quantifiable.

Strange, right?

Find the comments.

The Week That Was Kscope15

June 30th, 2015 Leave a Comment

Noel (@noelportugal), Raymond (@yuhuaxie), Mark (@mvilrokx) and I traveled to sunny Hollywood, Florida last week to attend Kscope15 (#kscope15), the annual conference of the Oracle Development Tools User Group (@odtug).

Check out some highlights of our week.

If you read here, you probably know that this year, Noel had cooked up something new and different for the conference, a scavenger hunt.

This year was my fourth Kscope, and as we have in past years, we planned to do something fun. At the end of Kscope14, Monty Latiolais (@monty_odtug), the President of the ODTUG Board of Directors, approached us to collaborate on something cool for Kscope15.

We didn’t know what exactly, but we all wanted to do something new, something fun, something befitting of Kscope, which is always a great conference. So, we spent the next few months chatting with Crystal (@crystal_walton), Lauren (@lprezby) and Danny (@dbcapoeira) intermittently, developing ideas.

We eventually settled on a scavenger hunt, which would allow attendees to experience all the best parts of the conference, almost like a guided tour.

Once we had a list of tasks, Noel developed the game, and with Mark and Raymond pitching in, they built it over the course of a few months. Tasks were completed one of three ways: by checking in to a Raspberry Pi station via NFC, by staff confirmation, or by tweeting a picture or video with the right hashtags.
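To give a flavor of that first path, here is a toy sketch of the kind of endpoint a Raspberry Pi station could POST to after reading a badge’s NFC tag. The route, field names and in-memory store are all hypothetical; this is not the actual game code.

```javascript
// Toy sketch of a check-in endpoint a Raspberry Pi NFC station might call.
// Everything here (route, fields, in-memory store) is hypothetical.
var express = require('express');
var bodyParser = require('body-parser');

var app = express();
app.use(bodyParser.json());

var completions = []; // stand-in for the real database

app.post('/api/checkin', function (req, res) {
  var tagId = req.body.tagId;         // NFC tag read by the Pi
  var stationId = req.body.stationId; // which hunt station it was
  if (!tagId || !stationId) {
    return res.status(400).json({ error: 'tagId and stationId are required' });
  }
  completions.push({ tagId: tagId, stationId: stationId, at: new Date() });
  res.json({ ok: true, completed: completions.length });
});

app.listen(3000);
```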

We arrived in Hollywood unsure of how many players we’d get. We didn’t do much promotion in advance, and we decided to limit the game to 500 players to ensure it didn’t get too crazy.

Over the first few days, we registered nearly 150 players, and of them, about 100 completed at least one task, both well above my conservative expectations.

During the conference, we had a core of about 10-20 dedicated players who made the game fun to watch. They jockeyed back and forth in the top spots, trolling each other on Twitter and holding off on completing tasks to allow fleeting hope to the other players.

In the end, we had a tie that we had to break at the conference’s closing session. Here are the final standings:

winners

Congratulations winners, and thank you to everyone who played for making the game a success.

And finally an enormous thank you to ODTUG and the Kscope15 organizers for allowing us this opportunity. We’re already noodling ways to improve the game for Kscope16 in Chicago.

Stay tuned for other Kscope15 posts.

QS15: Measurement with Meaning

June 28th, 2015 3 Comments

Walking into something as a newcomer is always an adventure of reality interacting with expectations. Though I wasn’t quite sure what to expect at the Quantified Self conference, it wasn’t what I expected. But in a good way.

Tweet-painting robot at QS15

The conference was structured around three main activities: talks given on the main stage, breakout sessions, which took place at different smaller areas during the talks, and break times, where one might check out the vendors, grab a snack, or chat with fellow attendees.

The talks, about ten minutes each, were mostly about the speaker’s successes in changing some aspect of their life via quantifying and analyzing it. This is partly what I wasn’t expecting—the goal-focused and very positive nature of (most) everyone’s projects.

True, some of the presenters might be tallied on the obsessive side of the spectrum, but by and large, it was all about improving your life, and not recording everything as a method of self-preservation.

On this last point, one presenter even provided this quote from Nabokov, which generated a touch of controversy: “the collecting of daily details … is always a poor method of self-preservation.”

One important theme I saw, however, is the role of measuring itself—that the very act of quantifying your behaviors, whether it’s diet, exercise, TV watching, or your productivity, can change your behavior for the better.

Granted, there can also be profound personal insights from analyzing the data, especially when combining multiple sources, but it’s possible some of these benefits come from simply tracking. Especially when it’s done manually, which takes a great deal of persistence, with many people petering out after a few weeks at the most.

This presents an interesting question about technology’s increasing proficiency at passive tracking, and the aim to provide insights automatically. For instance, the Jawbone UP platform’s Smart Coach is supposed to look at your exercise and activity data along with your sleep data and give you advice about how to get better sleep.

If someone had tracked this manually, and done the analysis themselves, they may not only be a lot more familiar with the facts about their own sleep and exercise, but any insights derived might be more likely to be absorbed and translate to genuine change.

When insights are automatically provided, will they lead to just as much adoption?

Probably not, but they could reach a lot more people who may not be able to keep up with measuring. So it’s probably still a good thing in the end.

The other important theme was something that I’ve also been encountering in other areas of my work—the importance of good questions.

For most of the QS projects, this took the form of achieving a personal goal, but sometimes it was simply a specific inquiry into a realm of one’s life. Just looking at data can be interesting, but without a good question motivating an analysis, it’s often not very useful.

In the worst case, you can find spurious connections and correlations within a large set of data that may get you off in the wrong direction.

And while at the beginning of the conference it was made clear that QS15 was not a tech conference, there was plenty of cool technology in the main hall to check out and discuss.

There are too many to cover in much detail, but here are a few that intrigued me:

The conference had a lot to offer—some inspiration, some cool technologies, surprisingly good lunches, and quite a bit to think about.

More About Me at QS15

June 27th, 2015 2 Comments

I always thought of myself as a control freak, Type A, self-aware (flaws and all) person, but then I attended the Quantified Self Conference last week in San Francisco.

Image from QS15


There is so much more one can do to learn about one’s self. The possibilities for what I can quantify (measure about myself) are endless, and there are so many people capturing surprising things.

Quantified Self, if you haven’t heard, is “a collaboration of users and tool makers who share an interest in self knowledge through self-tracking,” as described by Gary Wolf and Kevin Kelly. I’ve also been an admirer of Nicholas Felton, who has beautiful visualizations of his data.

The two-day conference consisted of morning and afternoon plenary sessions, and in between, the day was filled with ten-minute talks on the main stage (where practitioners shared their own QS work, tools, and personal data), with breakout sessions for group discussions and office hours for hands-on help happening concurrently. There were plenty of topics for a newbie QS-er like me or a longtime enthusiast.

My conference experience in numbers:

Videos and presentations should be posted in the coming weeks, but until then, here is a summary from Gary Wolf.

Beyond the numbers, I was surprised, inspired and learned a few lessons. It is amazing what quantified self-ers are capturing, the extent and effort they go to, and the life-changing impacts. There is plenty of fitness, diet, and health tracking happening, but others are tracking things such as:

The list goes on but this sampling gives you a sense of the range of self tracking.

While lots of recording was being done with commonly available sensors, devices, and apps, there was also a lot of data being recorded manually through pen-and-paper journals and spreadsheets.

There are endless measures (and many low and high tech tools) but recording is not the end goal. The measures help inform our goals and the actions to achieve those goals. There were several talks about the importance of self-tracking to understand your numbers, your similarities and your differences to population normals.

In “Beyond Normal: A Conversation,” Dawn Nafus (@dawnnafus) and Anne Wright (@annerwright) discussed the importance of self-tracking to gain awareness on whether the standards, baselines, and conventions apply to you. Population normals are a good starting point but they shouldn’t define your target as you are unique and the normals may not be right for you (#resistemplotment).

Image from QS15


My takeaway: don’t worry about getting the perfect device or tool. Start with finding a goal or change that is important to you. Record, measure, and analyze – glean insights that move you along to being your best self. It is not about the Q but the S.

Mid-June Roundup

June 16th, 2015 Leave a Comment

A busy June is half over now, but we still have miles to go before July.

We’ve been busy, which you know if you read here. Raymond went to Boston. Tony, Thao (@thaobnguyen), Ben and I were in Las Vegas at OHUG 15. John and Thao were in Minneapolis the week before that. Oh, and Anthony was at Google I/O.

The globetrotting continues this week, as John and Anthony (@anthonyslai) are in the UK giving a workshop on Visualizations at the OUAB meeting. Plus, Thao and Ben are attending the QS15 conference in San Francisco.

And next week, Noel (@noelportugal), Raymond, Mark (@mvilrokx) and I head to Hollywood, FL for Kscope15 (#kscope15).

Did you hear we’re collaborating with the awesome organizers (@odtug) to put on a super fun and cool Scavenger Hunt? If you’re going to Kscope15, you should register.

You can do it now, I’ll wait.

Back? Good. Check out the sweet infographic Tony C. on our team created for the big Hunt:

posterLayout

Coincidentally, one of the tasks is to attend our OAUX session on Tuesday at 2pm, “Smart Things All Around.” Jeremy Ashley (@jrwashley), our GVP, and Noel will talk about the Scavenger Hunt, IoT, new experiences, design philosophies, all that good stuff.

Speaking of philosophies, VoX has a post on glance-scan-commit, the design philosophy that informs our research and development, and more importantly, how glance-scan-commit trickles into product. You should read it.

And finally, Ultan (@ultan) and Mark collaborated on a post about partners, APIs, PaaS and IoT that you should also read, if only so you can drop a PaaS4SaaS into your next conversation.

If you’re attending any of these upcoming events, say hi to us, and look for updates here.

A Framework for Wearables, Glance

June 12th, 2015 Leave a Comment

Not long ago, Ultan (@ultan) wrote about our framework for wearable, and other, devices. We’re calling it Glance to reflect the OAUX glance-scan-commit design philosophy.

Noel (@noelportugal) produced a video highlighting Glance on several smartwatches as well as in the car, on Android Auto.

It’s pretty sweet. Check it out:

Glance has been in the works for more than a year now, and it arose out of our collective frustration with the effort involved in developing for multiple device SDKs.

The goal of Glance is to do the 75-80% of the work that overlaps across devices: calling Oracle Cloud Applications APIs, working with required cloud services like Apple Push Notifications and Google Cloud Messaging, and deploying a companion mobile application, built in Oracle’s Mobile Application Framework, of course.
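To make the cloud-services part concrete, here is a rough sketch of sending a push through Google Cloud Messaging from Node.js, as the GCM HTTP API worked at the time (to my recollection). The API key, registration token and payload are placeholders, and this is not Glance’s actual implementation.

```javascript
// Rough sketch: sending a push through Google Cloud Messaging (HTTP API, circa 2015).
// GCM_API_KEY and the registration token are placeholders; not the Glance code.
var https = require('https');

var body = JSON.stringify({
  to: 'DEVICE_REGISTRATION_TOKEN',
  data: { title: 'Expense approved', line: 'Trip to Kscope15' }
});

var req = https.request({
  hostname: 'gcm-http.googleapis.com',
  path: '/gcm/send',
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    'Authorization': 'key=GCM_API_KEY',
    'Content-Length': Buffer.byteLength(body)
  }
}, function (res) {
  res.on('data', function (chunk) { process.stdout.write(chunk); });
});

req.on('error', function (err) { console.error(err); });
req.write(body);
req.end();
```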

With all that done, we can build for and plug in new devices (ahem, Pebble Time) much more easily and with much less effort. Initially, we built Glance to support the original Pebble and Android Wear smartwatches, and the Apple Watch was our first proof-point for it.

IMG_0098

IMG_0875

We’re happy with the results so far, and Glance has made it much easier for us to build prototypes on new devices. Now, if only we could get access to CarPlay.

Some Wearable Tech Glances Beat Others

June 12th, 2015 Leave a Comment

This idea of the glanceable user experience of wearable technology is now everywhere.

They’re all at it.

Testing out the Apple Watch in Oracle HQ’s Fitness Center spin class. On the road is another matter.


There is the OG Misfit Wearables Shine, Apple’s Glances, and of course the Oracle Applications Cloud User Experience (@usableapps) concept of glance on the smartwatch, part of our Glance, Scan, Commit design philosophy.

But, not all glances are equal. How well a glance works for the wearer depends on the user experience notion of context of use: the wearer, the type of device, what the wearer’s up to at the time, the information they need, the connectivity, et cetera.

Take road cycling for example. I find the Google Glass cards-based glance as applied to the Strava Glassware totally awesome when cycling. Still.

Strava glassware. Heads-up performance details on a card.


Glass is a heads-up device, so that means eyes on the road. Combined with the audio updates on my cycled segments and so on, it’s a fantastic UX. It’s convenient. It doesn’t distract me. And, it’s safe. I don’t have to look down at my wrist and take my eyes off the road even for a second to glance at the important stuff.

Ain’t no stoppin’ us now. Except to look at smartwatches.


Looking down at my wrist or changing hand position to glance at my progress on a smartwatch such as my Apple Watch Activity or Workout built-in apps, at my Fitbit Surge Bike stats, or at my Motorola Moto 360 Android Wear Google Fit analytics while hammering along on a bike at 30 mph on a public road is just too risky for me.

Glancing at these smartwatches’ UIs later, of course, is great, whether it’s for progress on miles, calories, duration, or even to ensure that important data’s actually being sent to the cloud where I can do more with it.

Fight for pink. Moto 360 glance notification of Strava bike ride data heading to the cloud.


I have the same opinion about heads-up glance on devices like Google Glass when I am running, though the durability of Google Glass, battery life, and still having to pair it with another device is a pain.

No sweat. Context of use for sporting devices needs to bear in mind personal weather conditions.


Running in cities requires you to keep your wits about you: be sharp and look ahead. Glancing down from the upcoming path even for a second might mean going home with an injury or worse. Generally, with my smartwatches, when I’m out running, I’ll glance at the data or analytics when I stop at a traffic signal, or rely on the audio update from my paired smartphone (although it ruins the music) on occasion.

Somewhere to run in London with the Apple Watch Glances (see what I did there?)


The ability to glance at performance statistics in heads-up mode, combined with those audio progress reports in your ear, is the way to go when cycling and running with wearable tech IMHO. Arguably, too, an audio component is “glance for the ears”. Glance should be multi-mode and not just about the visual, not least for accessibility reasons. We can’t all see as well as each other.

Activity wearable tech designers and developers take note. Eyes on the prize, or road in this case, please. It’s a good reminder about the importance of context of use when gathering user requirements.

Sweet Dreams at the EyeO Festival

June 10th, 2015 Leave a Comment

 



I often tell people that you need both a left brain and a right brain to be a software designer: a left brain to analyze and understand, a right brain to dream and wonder. The EyeO Festival, which Thao and I just attended in Minneapolis, was food for our right brains.

EyeO is about the intersection of art and code: generative artists (who use data and algorithms to produce works of art), art installations (which often require sophisticated coding), and those who see coding itself as an art form. It is not so much about knowledge transfer as it is about building a community, meeting world-class data artists and hearing their back stories.

I attended fourteen talks in all and saw many wonders.

The JPL crew controlling the Mars rover use Microsoft HoloLens goggles to create an augmented reality, allowing scientists in remote locations to stand together on the surface of the planet. Each person sees their own desk, chair and monitor sitting in a crater with the rover just a few feet away. As their eyes scan the area, little dots of light show where each person is looking; when they use their mouse to teleport to a nearby ridge, others see their avatars walk to the new location. They can even walk around the rover and point to where it should go next.

The design team at nervo.us (she’s a biologist, he’s a physicist) is interested in how complex forms arise in nature from cells growing at different rates. Using their own custom software, they create spectacular simulations and turn these into 3-D printed art objects. One of their most stunning creations is a kinematics dress, made supple using thousands of tiny interlocking plastic hinges perfectly fitted to the laser-scanned image of a customer’s body. With scary-hard math, they generalize a moving body from a single scan, compute not just how the dress will look but how it will hang and twirl, and even prefold it so that it will fit in today’s small 3-D printers.

Perhaps the most jaw-dropping demonstration was a sneak preview of “Connected Worlds,” an installation that will be opening soon at the New York Hall of Science. Three years in the making, it creates a Star Trek style holodeck with a 50-foot waterfall and six distinct biomes populated by whimsical plants and animals. Children move physical logs to redirect virtual water into the various biomes; if they make the right decisions wonderful trees will grow and attract ever more magical animals. The team at Design I/O described technical challenges and lessons learned, some of which might be applicable to future AppsLab motion-tracking projects.

One of the topics I found most stimulating was new and improved coding languages. I have used Processing, a language developed specifically for artists, to create some of the interactive visualizations we show in our cloud lab. It was a thrill to meet and talk with Processing’s co-inventors and hear their plans for new evolutions of the language, including P5.js, Processing.py, and the upcoming Processing 3.0.

But the most interesting talk about languages was by a guy named Ramsey Nasser. Ramsey is an uber-coder who creates new computer languages for fun. He argues that most coders are stuck using alienating, frustrating, brittle languages created decades ago for a world that no longer exists. He wants to create languages that facilitate “post-human creativity,” new forms of creativity not possible before computers. Some of his languages, like god.js (which makes code look like biblical text) and Emojinal (made completely out of emoji), are just for fun. Others, like Alb (the first entirely Arabic coding language), Arcadia (for Unity 3D VR game development), Zajal (inspired by Processing), and Rejoice (a stack language based on Joy), are practical and mind-expanding. I plan to talk more about why coding languages should matter to designers in a future blog post.

As with any conference there were countless small discoveries, too many to report in full. Here are just a few…

Amanda Cox of the New York Times talked about making data more relatable by using geocoding to default the initial view of a large geographical dataset to the user’s own locale. Another interesting technique was having users guess what a plotted curve would look like by drawing it before showing the actual curve.

One clever flourish I noticed was the use of tiny single-value pie charts placed beneath each year in the X axis of a time-driven line chart to add an extra dimension of data about each year without distracting from the main point of the chart.

Sprint, the telephone company, started out as a railroad company that used their existing right of way to plant cell towers. Sprint stands for Southern Pacific Railroad Internal Networking Telephony.

Reza Ali is an amazing generative artist who turns data and music into images, animations, and tangible objects. One of his secret weapons is ofxPro. Check out his music videos for the band OK Go.

Into LED arrays and Raspberry Pi? Check out Fadecandy.

Timescape is a visualization-based storytelling platform, currently in beta. Looks interesting.

How long does it take the New York Times team to create world-class infographics? As long as you have, plus one half hour.

What kind of music do coding language nerds listen to? The Lisps of course!

My right brain is full now. Time to dream!

Asteroid Explorer Launched

June 10th, 2015 Leave a Comment

On Monday, we launched Asteroid Explorer at the Harvard-Smithsonian Center for Astrophysics.

Jeremy (@jrwashley), DJ, Kris Robison and I attended the launch event, which you can watch here. My part of the presentation begins at 36:36.

This event was the culmination of NASA’s Asteroid Hackathon event back in October. Remember that?

Here’s the event abstract:

Abstract: In October 2014, NASA’s Asteroid Hackathon event was hosted (with several other NASA partners) at the SETI institute in Mountain View, California. Team NOVA’s overall winning solution for this hackathon allowed users to explore relationships among the Minor Planet Center’s asteroid data. The elegant interface was not just eye-catching, the repeated learning that hackathon participants experienced in the “science fair” portion of judging greatly impressed the judges. More than once, people discovered relationships among asteroid data parameters that they didn’t previously know about. A perfect outcome for one of the primary goals: to increase public knowledge of asteroids. Dr José Luis Galache (Acting Deputy Director, Minor Planet Center) and DJ Ursal (Director, Product Management at Oracle) teamed up together through the Oracle Volunteering initiative to implement the winning entry from the Asteroid Hackathon on the Minor Planet Center website. On June 8th they will be launching the website as part of the Harvard-Smithsonian Center for Astrophysics’s Solar, Stellar and Planetary Sciences division seminar series. The team will be discussing this project as it relates to cooperation between the Minor Planet Center, NASA, Oracle Volunteering, and its goal to inform and involve the general public.

This volunteer effort was a great success, and the result was well received and appreciated by the astrophysicists attending the launch event.


Jeremy Ashley, GVP Oracle Applications User Experience, speaking at the launch of Asteroid Explorer

The NASA Grand Challenge program executive Jason Kessler (@soughin) was at the White House, talking up the Asteroid Hackathon and this volunteer work there, before calling into the event via Skype.


The event was broadcast live through the Minor Planet Center’s YouTube channel, and the audience at the Center was mostly astrophysicists.

On the rooftop of the Harvard-Smithsonian Center for Astrophysics, there are several telescopes, including the famous Harvard Great Refractor. But we liked this cute mini Astro Haven.


A bit about Asteroid Explorer: the main part of the web tool uses Crossfilter, D3.js and Highcharts. I processed the asteroid data into the proper slices and groups to feed into Crossfilter, which renders the interactive filter bar charts and tables. I also created a bubble chart that renders series of property data, for looking into the correlation of any pair of properties, and that reacts dynamically to the filter bar charts’ range sliders.
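For the curious, the Crossfilter side of that pattern looks roughly like the sketch below. The asteroid fields (diameter, albedo) are stand-ins for whatever properties the real tool slices on.

```javascript
// Rough sketch of the Crossfilter slicing/grouping pattern described above.
// The asteroid fields (diameter, albedo) are stand-ins, not the real schema.
var crossfilter = require('crossfilter'); // or a <script> include in the page

var asteroids = [
  { name: 'Ceres',  diameter: 939, albedo: 0.09 },
  { name: 'Vesta',  diameter: 525, albedo: 0.42 },
  { name: 'Pallas', diameter: 512, albedo: 0.16 }
];

var cf = crossfilter(asteroids);

// One dimension per filterable property; groups drive the filter bar charts.
var byDiameter = cf.dimension(function (d) { return d.diameter; });
var diameterBins = byDiameter.group(function (v) { return Math.floor(v / 100) * 100; });
var byAlbedo = cf.dimension(function (d) { return d.albedo; });

// Moving a range slider on the albedo chart becomes a filterRange call; every
// other chart then redraws from its group (a dimension's own filter is
// excluded from its own groups, which is what keeps the charts coordinated).
byAlbedo.filterRange([0.1, 0.5]);
console.log(diameterBins.all());        // diameter bin counts after the albedo filter
console.log(byDiameter.top(Infinity));  // filtered rows to feed the bubble chart
```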


Google I/O 2015 & User Experience

June 3rd, 2015 Leave a Comment

Google I/O 2015 has just ended.  There are lots of aspects to Google I/O.  Let’s have a taste of it from the user experience perspective.

There were lots of features announced at Google I/O, and you may find that a lot of the focus has been on user experience this year.

First, for the pre-show, there were planets and a whale flying across multiple screens surrounding the keynote room, showcasing the latest advancements in VR and animation.  Below is the whale that someone recorded.

Next, here is the visualization of Android mobile adoption.

This visualization is pretty impressive.  It uses the screen asset efficiently, it is easy to understand, and it shows a lot of key information in an elegant way.

As Dave Burke pointed out, “the central theme of M is improving the core user experience of Android.”  To name a few, Android M now provides simpler and more granular control for app permissions, better web experience, app links, and fingerprint support.

Before M, users had to allow all the permissions that an app requested in order to install the app.  I had a lot of issues with this approach in the past.

For example, I did wish to install the Facebook app on my phone, and I did at one point.  To my dismay, the Facebook app immediately scanned through all my contacts and suggested friends to me from my contact list.  But hey, I do not wish to become a Facebook friend with my banker, please.  With the new app permissions, it is promising that I will no longer need to disclose my contacts to Facebook just to have the Facebook app installed.

Google Now, other than showing you your traveling cards and parking cards, now drills into the user’s current app to derive the current user context. Being able to retrieve context will not only allow Google to provide more relevant results, but more appropriate actions as well. Users will no longer need to switch to another app (such as the browser), look for some information, and get back to the previous app. Now they can simply go to Google Now and ask questions like “who is the author of this song,” and it will return the results they need without losing context. This is a huge time savings for users.  Privacy is a potential concern for lots of people, but let us defer that discussion to another place.

IoT is certainly the next big thing, and everyone is looking for a winner.  Google is doing the same and introduced Project Brillo and Weave as their step in this direction.  With almost everything now moving to the cloud and becoming easily accessible, IoT development is easier than ever before.  This is the era of IoT, and it is part of user experience as well.  How can you claim a great user experience when the user cannot even pick and choose the device they like?

Last but not least, having Cardboard for education is such a great idea.  I now wish that I had been born in this era and been able to experience and learn in geography, biology, and chemistry classes in this new interactive way.

To close this off, here is me lying in a hammock and watching one of the completely packed sessions from the cafeteria.


Did you go to Google I/O?  Please feel free to comment below if you have anything you would like to share.

Kscope15 Scavenger Hunt

May 29th, 2015 2 Comments

Are you attending Kscope15 (#kscope15)?

If yes, then you should join the Kscope15 Scavenger Hunt. We have partnered with the Oracle Development Tools User Group (aka @odtug) and created a fun way to get points and win prizes!

Why? The Oracle AppsLab (@theappslab) team wanted to find a user-friendly way to start a conversation with the ODTUG membership about this whole “Internet of Things” thing. Plus, we love Kscope and ODTUG!

This scavenger hunt is a fun way to start dipping your toes into an emerging technology that is going to be a major focus area for Oracle on the road ahead.

So join us, and you’ll be able to go back to work on Monday with some cool IoT talking points. Like the fact that you now know what “IoT” stands for.

Visit http://kscope15.com/scavenger and register now!

posterLayout

Stay tuned to hear more on how we built this and how we are leveraging mobile, IoT and wearable technologies for this fun activity.

Oh, and be sure to attend our Tuesday session at 2pm, “Smart Things All Around,” to hear Jeremy Ashley (@jrwashley), our GVP, and me present a deep-dive discussion and wax philosophical on what it all means.

But wait, there’s more. Make sure to stop by our AppsLab table to chat us up during the conference.

Busy Times Are Afoot

May 27th, 2015 1 Comment

Lots going on here in AppsLab land and in Oracle Applications User Experience (@usableapps). This here is a recap post.

Showing the Oracle Applications User Experience Roadmap to Oracle’s Asia Partners

At the beginning of May, Anthony (@anthonyslai) and Raymond joined a large contingent of the OAUX team on a two-stop tour of Asia. The first stop was Singapore.

Here’s the dynamic duo in action, setting up our demos to show to a large group of Oracle partners.


For a full download of the event, be sure to read Misha’s (@mishavaughan) recap post.

Laying out the Oracle Applications User Experience Strategy for Partners in Beijing

After Singapore, the team headed to Beijing for more partner events, and as you can see, the turnout was phenomenal.


Misha has a full debrief of the Beijing leg of the tour as well.

IoT Hackathon in Guadalajara

Even though Noel (@noelportugal) was bummed he didn’t get to go to Singapore and China, his spirits brightened when Laurie (@lsptahoe) asked him to serve as a mentor for her Internet of Things (IoT) hackathon in Guadalajara.


That’s his happy face, as he sits in his mobile IoT lab. Check out his pet Amazon Echo there in the front. That case is full of goodies. Note the soldering iron.

Laurie has a full review of the event, and you can read about the AppsLab team’s entry here.

Coming Soon

We’ll get a brief respite, then come several conferences.

Next week, Thao (@thaobnguyen) and John will be attending Eyeo (@eyeofestival), as in the festival, not the Google conference (@googledevs). Anthony will be at that one, i.e. Google I/O, so look for his recap here next week.

On June 8, OHUG 2015 begins, and several of us will be attending, doing research and testing. Gozel (@gozelaamoth) has a full rundown.

And hey, I’ll be presenting with Aylin Uysal (@aylinuysal); our session is called Oracle HCM Cloud User Experiences: Designed for Work Styles across Devices, and it’s Tuesday, June 9 at 1 PM. So, come by if you’ll be at the show.

In mid-June, Anthony and John head to the UK for the OUAB meeting, specifically to present and demo some of the team’s visualizations work.

Near the end of the month comes Kscope15 (#kscope15), and several of us will be going. We have something special planned for ODTUG’s (@odtug) annual get-together. Stay tuned for details.

Consider yourself current.

Nymi Band Impressions

May 26th, 2015 2 Comments

Editor’s note: Here’s the first post from Osvaldo Villagrana (@vaini11a), one of our AppsLab Mexico team members. Enjoy.

Over the last week I’ve been playing with the Nymi Discovery Kit I got from our AT&T hackathon participation, and here are my impressions, both from a Nymi SDK developer’s point of view and as a user.

For those who don’t know, this band is a wearable biometric identity device that lets you use your heart’s unique signature (a.k.a. electrocardiogram or ECG) to authenticate and validate your identity.

The main problem they want to solve is to save users from having to remember all the passwords, PINs and security codes we use on a daily basis.

First off, the Discovery Kit includes the band, a Bluetooth dongle for Windows and a USB cable for charging the band. The Bluetooth dongle is included because, at the beginning, the Nymi band could only be paired with Windows, but now it can be paired with OS X and Android as well.


The Nymi band material at first feels cheap, like it would be easy to bend and break, but it really fits very well on my wrist. The band’s connection terminals at both ends of the cord are very exposed to water and dust, but they say it is water resistant, though not waterproof.


The band is adjustable and can accommodate wrist sizes up to 7.5” in circumference. A full charge takes approximately two hours from a wall outlet or computer, and the battery lasts three days.

Setting up the band requires some steps; the band must be enrolled and authenticated with your own ECG using the NCA (Nymi Companion App), available on Windows, OS X and Android. I decided to use the Android app this time. I tried OS X and Windows, but it’s the same. Once the band is clasped on your wrist, it will confirm the charge level and immediately enter broadcast mode.

I found this step a bit confusing, as there’s no feedback once the band is in broadcast mode, so you are not quite sure whether your band is ready to be discovered. The funny thing is there’s no way to turn it off.

After the band is clasped, the Android app asks you to put your finger over the sensor on the band. It takes about a minute for the app to analyze and save your ECG info. After that, you’re ready to pair your Nymi with any NEA (Nymi Enabled App, i.e. a third-party app). The band supports only up to seven different app profiles (they say more will be supported in coming updates).

Any time the clasp is opened, the band must be authenticated once again, but with the same NCA app as before. If you want to use any other NCA app (OS X or Windows), the band has to be reset and set up all over again. This is not ideal.

NEAs must provision a unique key-value pair (a profile) that is saved in the band for future use, and this happens only once for each NEA. The NEA should store the provision returned from the band for future communication. On subsequent usage, the NEA validates against the provisioned Nymi band. Once validation is successful, the NEA can assume an authenticated user. All those steps must be implemented by the developer using the SDKs for the different platforms.
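Since I can’t reproduce the SDK calls here, the following is only pseudocode of that flow in JavaScript, with made-up stubs (provisionWithBand, validateBand) standing in for the real Nymi SDK APIs.

```javascript
// Pseudocode of the NEA flow described above. provisionWithBand() and
// validateBand() are made-up stubs, NOT the real Nymi SDK; replace them
// with the actual SDK calls for your platform.
var fs = require('fs'); // stand-in for wherever the NEA persists its provision

function provisionWithBand(bandId) {
  // Hypothetical: ask the band for its one-time provision (key-value profile).
  return { bandId: bandId, key: 'opaque-provision-key' };
}

function validateBand(bandId, provision) {
  // Hypothetical: validate the nearby band against the stored provision.
  return provision.bandId === bandId;
}

function firstRun(bandId) {
  // Happens only once per NEA: provision and persist the result.
  var provision = provisionWithBand(bandId);
  fs.writeFileSync('provision.json', JSON.stringify(provision));
}

function subsequentRun(bandId) {
  // Every later launch: validate against the stored provision.
  var provision = JSON.parse(fs.readFileSync('provision.json', 'utf8'));
  if (validateBand(bandId, provision)) {
    console.log('Validation succeeded; assume an authenticated user.');
  } else {
    console.log('Validation failed; treat as an unknown user.');
  }
}

firstRun('my-nymi-band');
subsequentRun('my-nymi-band');
```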


To complete the exercise, I wrote an Android app that runs the provisioning and validation flow and finally authenticates the user if they are close enough to the device, in this case a phone or tablet. After I got authenticated, my wife wore the band and tried to get authenticated, but authentication failed every time, as expected.

The SDK is good but needs some enhancements, though. Even at Nymi, they are having a hard time with problems in their own NEAs, like the unlock app for Mac OS X that currently is not working, and I have posted a couple of issues and bugs I found.

As a first attempt in this new authentication automation niche, I like it, and I think it is good enough.

I see a lot of potential and possible use cases for this band in the enterprise. I would definitely use it, but what I would really love is a band that can handle authentication, sport and motion tracking, notifications and time, all in the same device. That’s probably too much for now, but I’m looking forward to seeing that device soon.

IoT Hackathon Field Report: Mexico Edition

May 25th, 2015 Leave a Comment

I recently ventured down to Mexico to participate in an Internet of Things (IoT) hackathon organized by Laurie Pattison’s (@lsptahoe) Apps UX Innovation Events team with some of my fellow AppsLab members, Luis Galeana, Tony Orciuoli, and Osvaldo Villagrana.


Being the lone non-developer, I wasn’t sure how much I would be able to contribute—but I had done some research pertaining to our use case, so I felt I had at least that much to offer.

Our rather illustrious use case pertained to a perennial workplace problem—lines to use the bathroom. In MDC, there is a preponderance of men, and so apparently waiting can be an issue. Some of my research has found that elsewhere, where there are more women than men, lines to use the women’s bathroom in the workplace can be a serious annoyance.

Thus was born what was originally playfully titled Bathroom Management (BM), though we ended up with Presence, which would work more generally as a presence management system that could also handle conference room reservations, among other things.

I had never been part of a hackathon, but I definitely discovered the appeal. As a lover of deadlines, with my own experience of coding at night (definitely the best time for coding), it seemed just right for this sort of thing. With free snacks and beverages, food carts for lunch and dinner, and a beautiful view from the 9th floor of the MDC office, it was an excellent setting.

I was able to help intermittently with thinking through some of the logic of our scheduling system, and with our pitch at the end, so I did feel I added something, even if the lion’s share of the work was done by the other three. Being a two-day hackathon, we had one late night, which I stuck around for, and ended up reading about and playing with Python, in the hopes it might come in handy. It didn’t, but there’s always next time.

Our presentation of Presence garnered some good laughs, which we didn’t quite expect, but at least everyone was engaged. We had a great demo showing our scheduling system for bathroom stalls, which included proximity sensors, sounds, and displays in the stall, and a web interface for scheduling, as well as IM, phone, and watch notifications when the stall you reserved becomes free.

We came in third, after two other solid entries, and took home the People’s Choice award, perhaps because our solution filled a real need in the office! I did learn a lot from the other winners, particularly on how we could have pitched it better to highlight the enterprise applicability. So again, there’s always next time.

All in all I found it highly favorable, and hope I have another chance to do it again in the future.


Another Take on Twilio Signal 2015

May 22nd, 2015 1 Comment

Editor’s note: Mark (@mvilrokx) and Raymond are at it again. Earlier in the week, they each provided a take on last weekend’s Bay Area Maker Faire, and this week, they both attended Twilio’s (@twilio) first developer conference, Signal. Mark’s take is here; now, it’s Raymond’s turn. Enjoy.

Twilio is no stranger to us at AppsLab. We have embedded Twilio Voice and SMS in applications such as the Taleo Interview Evaluations demo, the IoT call at Maker Faire 2014, and the Daily Asteroid report. It is a simple yet powerful way to add some really useful communication to some interesting projects.

But I never imagined Twilio was so big, big enough to host a conference and draw thousands of enthusiastic attendees.

They have come a long way – at the conference, they announced a slew of new products, and some of them are truly timely and empowering. A couple of samples:

I think Twilio played to its strengths to position itself really well. They strive to provide composable APIs as building blocks (just like Lego), to make it easy for developers to embed communication capabilities, and to keep things friction-free for users (no need to install anything).

In the current world, you pretty much have one app for each thing: one app for ordering pizza, one app for calling a taxi. Let Magic, a service built on Twilio, help you: just text your desire to a number, and “hopefully” your wish is fulfilled :) That’s what friction-free means!

Another use case is Code for America: users can text a number and get their card balance. Such a quick, easy way to get at simple information provides real “accessibility to information.”

And one more use case is the American Red Cross for disaster response, where they can form and coordinate ad-hoc groups of volunteers, even when the group is fluid.

In retrospect, our Taleo Interview Evaluation demo can be thought of as a very good use case for providing easy access to information and transactions.

With Twilio’s new releases and capabilities, I look forward to building new contextual enterprise applications for easy access and interaction.

Now here is a fun bit:

As usual, the conference gave every attendee a backpack, and this time with a twist. It has littleBits powering an 8×8 LED panel which can be attached to the backpack.


And during the $Bash event, they had cloudBits to give out as prizes. Mark and I were determined to win a cloudBit, so that we could extend the LED panel display on the backpack to be controlled remotely over the Internet! We found the most efficient way to win points, which was playing pinball games. We worked together, and of course, we got what we aimed for.


And by the way, I became the pinball champion of the night on the floor, scoring over 430,000 points in one game. That was a nice surprise to me too; it turns out I have a talent for pinball!

Twilio Signal Conference 2015

May 22nd, 2015 Leave a Comment

Editor’s note: If you read here, you know we heart Twilio, especially Noel (@noelportugal). Remember the Rock ’em Sock ’em robot build?

This week, Twilio (@twilio) held its first Signal conference, and Raymond and I were there to see what’s new in the world of web-enabled communications and the like.


For those of you not familiar with Twilio, here’s their spiel from their About page:

Twilio powers the future of business communications.  Enabling phones, VoIP, and messaging to be embedded into web, desktop, and mobile software.

For example, they provide REST APIs that can send and receive phone calls and text messages (SMS), allowing you, as a user of their services, to implement these extremely complex features in your applications, whether they are mobile, web or desktop apps, with very little effort.  They provide many more features and announced a bunch of new ones at the conference; see their website for more details on those features.
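For instance, sending a text from Node.js with their helper library is about this much code; the account SID, auth token and phone numbers below are placeholders.

```javascript
// Sending an SMS with Twilio's Node.js helper library.
// The account SID, auth token, and phone numbers are placeholders.
var twilio = require('twilio');
var client = twilio('ACxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx', 'your_auth_token');

client.messages.create({
  to: '+15558675309',
  from: '+15017250604',
  body: 'Greetings from Signal!'
}, function (err, message) {
  if (err) { return console.error(err); }
  console.log('Queued message', message.sid);
});
```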

I had no idea that Twilio is as big as it is: there were 2,000 attendees at the conference and apparently, Twilio is the second largest provider of phone numbers in the US, right behind T-Mobile.

The conference started off with a pretty impressive magician’s act in which actual Twilio APIs were used, very original I thought.  It then proceeded with a bunch of keynotes, led by the CEO of Twilio, Jeff Lawson.  He stressed the importance of services, comparing them to Lego blocks that, in the right hands, allow you to build anything by composing these services, just like you would do with Lego.

Among the lineup of key speakers was Werner Vogels, CTO of Amazon, who gave a history of how Amazon moved from a monolithic architecture to a more Service Oriented Architecture, then towards Micro Services and finally towards an architecture that now aggregates these Services into useful components.  They had to build an infrastructure to support these changes, which eventually led to what we now know as AWS. Very interesting talk.

One other interesting topic I remember from the opening presentations was Jeff Lawson mentioning that the next big evolution in communication will be for it to become context-aware, i.e. rather than you having to enter your 17-digit account number on your phone and then having to identify yourself again and again to the agent that you get transferred to with some weird question about the street you grew up in, this information should be available when a call gets made, leading to much better quality of service and a much higher throughput of calls.

The rest consisted of product announcements and partners getting to explain how they use Twilio in their business.  We then attended a bunch of sessions, some more interesting than others, I’ll limit myself here to the more interesting ones.

4Y6A3701-640x265

Image from Twilio

I’m a huge fan of ngrok, so I was delighted to attend a session by the maker of this tool, Alan Shreve.  It turns out that it was written in Go, and Alan gave a few examples of how this language made it easier to build these types of tools.  He also mentioned that rewriting an existing tool in a new language is a great way to learn that new language, as you limit the scope and can focus purely on the language itself.  He also stressed that you shouldn’t be discouraged if you discover that a tool already exists; competition is a good thing and it validates the business case.

Also very informative was a talk from Guillermo Rauch, the creator of socket.io, of which I am also a huge fan.  The talk didn’t focus on socket.io itself, but on the challenges you will face when you start building realtime applications (something socket.io allows you to do): conflict resolution, throughput, diffing, etc.
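
To show how little code it takes to get started (the hard parts come later, as Guillermo pointed out), here is a minimal sketch of a socket.io server that relays edits between connected clients; the “edit” event name and payload are made up for illustration.

// npm install socket.io
const io = require('socket.io')(3000);

io.on('connection', function (socket) {
  console.log('client connected:', socket.id);

  // Re-broadcast every edit to all other connected clients. In a real
  // application, this is where the hard problems from the talk (conflict
  // resolution, diffing, throughput) would need to be handled.
  socket.on('edit', function (change) {
    socket.broadcast.emit('edit', change);
  });

  socket.on('disconnect', function () {
    console.log('client disconnected:', socket.id);
  });
});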

Kate Heddleston gave a talk about one-click deploy for service-oriented architectures, a project she worked on that allows you to deploy, with one click, a fully operational environment on Amazon EC2 using Docker, including load balancers, database servers, etc.  It seemed like an excellent alternative to the likes of Heroku, and I will definitely check it out in the near future to see if it could be leveraged somewhere for our work in the AppsLab.

Probably the most interesting talk of the whole conference, for me at least, was by Neil Mansilla from Runscope about API testing and debugging.  He didn’t just give a sales pitch about Runscope, but laid out a whole bunch of tools that you can use to test APIs, from Apache Benchmark to Charles and Wireshark.  I am definitely going to check out Runscope!
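
In the same spirit, even before reaching for a dedicated tool, you can hand-roll a crude API check in a few lines of Node.js.  This is just a sketch with a hypothetical endpoint, not how Runscope itself works.

const https = require('https');

// Hypothetical endpoint; point this at whatever API you want to check.
const url = 'https://api.example.com/health';
const started = Date.now();

https.get(url, function (res) {
  const elapsed = Date.now() - started;
  const ok = res.statusCode >= 200 && res.statusCode < 300;
  console.log((ok ? 'PASS ' : 'FAIL ') + url + ' -> ' + res.statusCode + ' in ' + elapsed + 'ms');
  res.resume(); // drain the response so the connection can close
}).on('error', function (err) {
  console.error('FAIL ' + url + ' -> ' + err.message);
});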

What I took away most from this conference, though, is that APIs are the future: IT infrastructure is turning into APIs (AWS), electronics is turning into APIs (littleBits) and telecommunication is turning into APIs (Twilio, of course, but also switch).  I am convinced that enterprise apps will also evolve in this direction, and enterprise APIs will enable developers to compose and integrate easily with other, non-enterprise APIs, allowing them to build new and exciting applications, just as developers started doing with telecommunications when Twilio appeared.

Another Take on Maker Faire 2015

May 20th, 2015 Leave a Comment

Editor’s note: Here’s another Maker Faire 2015 post, this one from Raymond. Check out Mark’s (@mvilrokx) recap too for AppsLab completeness.

I went to the Maker Faire 2015 Bay Area show over the weekend. There was a lot of similarity to last year, but also a few new things.

In place of our spot from last year, there were HP Sprout demo stations. I guess HP was the main sponsor this year.

hp-sprout

Sprout by HP pairs a large touch mat and projector as an attachment to an HP computer. It is a kind of combination of projector, extended screen, touch screen, and work surface that blends physical things with virtual computer objects, for example capturing physical objects as 3D graphics.

TechHive’s Mole-A-Whack was quite a good station too; it is a reversal of the classic Whack-A-Mole.

mole-a-whack

Here’s a video of it in action:

They use an Arduino-controlled mole to whack kids who hide in the mole holes; the kids need to raise their heads out of the hole cover (which is Arduino-monitored) and reach out to push a button (connected via MaKey MaKey) to earn points.

The signals go into a Scratch program on a computer to tally the winner.

This pipe organ is an impressive build:

fire-pipe-organ

As usual, lots of 3D printers, CNC mills, etc. and lots of drones flying.

Also, I saw many college groups attending the event this year, bringing all kinds of small builds for various applications.

Maker Faire 2015

May 19th, 2015 Leave a Comment

This weekend the 10th Annual Maker Faire Bay Area took place in my backyard, and rather than fighting traffic for 2 days with the 130,000+ attendees, I decided, as I have for the last 9 years, to join them.

Unlike last year, Oracle had no presence at the Maker Faire itself, so I had plenty of time to walk around the grounds and attend sessions.  This post is an overview of what I saw and experienced in the two-day madness that is called the Maker Faire.

For those of you who have never been to the Maker Faire, the easiest way to describe it is as a mix of Burning Man and a completely out-of-control hobbyist’s garage, where the hobbyist’s hobbies include, but are not limited to, everything tech related, everything food related, everything engineering related, and everything art related, all wrapped up in a family-friendly atmosphere.  My kids love the Maker Faire.

You can find the tech giants of the world next to the one-person startup, beer brewers next to crazy knitting contraptions, bus-sized, fire-breathing rhinos next to giant cardboard robots, etc.  And nobody takes themselves too seriously, e.g. Google was handing out Google Glasses to everybody … Google Safety Glasses that is :-)

Google Safety Goggles

My new Google Glasses :-)

The first thing I noticed was that the Faire expanded . . . again.  A huge tent was erected on what was a parking lot last year, housing the Make:Labs.  I didn’t actually get to spend any time in there, but it contained an exploratorium, startup stuff, and a section for Young Makers.

Which brings me to the first trend I observed, makers are getting younger and younger and the faire is doubling down on these young folk.

Don’t get me wrong, the faire has always attracted young kids, and some of them were making stuff, but there seem to be more and more of them, the projects they bring are getting more and more impressive, and the faire’s expansions all seem to cater to these younger makers.

One of the sessions I attended was called “Meet Some Amazing Young Makers,” where a 14-year-old girl showed off a semi-autonomous robot that could map the inside of caves.  She was showing us the second iteration; she built the first version . . . when she was 8!  Another young man, 13, built a contraption that solved a Rubik’s cube in under 90 seconds.  It wasn’t just that they built these things; they gave solid presentations to a majority-adult audience, talking about their builds and future plans.

Another trend that was hard to ignore is that the Internet of Things (IoT) is getting huge and it’s definitely here to stay.  Not only were there many, many vendors promoting their brand of IoT hardware, but a whole ecosystem is developing around them.

From tools that let you visualize all the data collected by your “things” to remote configuration and customization.  This trend will not just Cross the Chasm, it’s going to rocket right past it.

I attended a panel discussion with Dominic Pajak (Director IoT Segments, ARM), Paul Rothman (Director of R&D at littleBits Electronics), Andrew Witte (CTO, Pebble), Alasdair Allan (scientist, tinkerer) and Pierre Roux (Atmel) about the current state of IoT and the challenges that lay ahead.

One of the interesting points raised during the discussion is that there currently is no such thing as the Internet of Things!  All these “things” have to be tethered to a phone or other internet-capable device (typically using BLE); they cannot connect to the internet directly.

Furthermore, they cannot communicate with each other directly.  So it’s not really an Internet of Things, but rather the regular “human internet” with regular computers and phones connecting to it, which in turn happen to have some sensors attached to them that use the internet as a communication vehicle; but that doesn’t really roll off the tongue as well.

There is no interoperability standard at the moment, so you can’t really have one device talk to a random other device.  This is one of the challenges the panel felt has to be solved in the short term.  It could happen with the adoption of IP in BLE or some other mechanism like Fog Computing.

Another challenge brought up was securing IoT devices, especially given that some of the devices could be broadcasting extremely personal information.  This will have to be solved at the manufacturing level as well as at the application level.

Finally, they also mentioned that lowering power consumption needs to be a top priority for these devices.  Even though they have already come a long way, there still is a lot of work to be done.  The ultimate goal would be self-sufficient devices that need no external power at all but can harvest the energy they need from their environment.

One such example mentioned is a button/switch that, when pressed, uses the energy you put into pressing it to generate enough power to send an on/off signal to another device.

Massimo Banzi, co-founder of the Arduino Project, also gave a talk (as he does every year) about the State of Arduino.  It seems that a lot of that state is in legal limbo at the moment, as there are now seemingly two Arduino companies (arduino.cc and arduino.org) with different views of the future of the project.

As part of his vision, Massimo introduced a partnership with Adafruit to let them produce Arduinos in the USA.  Also, as a result of the legal issues with the Arduino brand name, he introduced a new “sister” brand called Genuino (get it? genuine Arduino), which will allow them to keep producing, at least in the US.

Other announcements included the release of the Arduino Gemma, the smallest Arduino ever; the Modulino, an Arduino-like product designed and produced in their Bangalore, India office; and a focus on online tools to manage and program Arduinos.

I also attended a few sessions that talked about the BeagleBone board.  I am interested in this board because it bridges the gap between the Raspberry Pi and the Arduino: on the one hand it runs a Linux OS, but on the other hand it also has real-time GPIO pins, making it interesting for IoT projects that require them.

It can also be easily programmed using JavaScript (it comes with a Node server built in), which is something I am currently working with; a small sketch of what that looks like is below.  I’ll probably write up another blog post about my findings with that board when I get some time to play with it (yes, I got one at the Maker Faire :-).
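
As a taste of that JavaScript programming, here is a minimal blink sketch using the BoneScript library that ships on the board.  The pin name and API details are from memory, so verify them against the BoneScript docs before trusting this.

var b = require('bonescript');

var led = 'USR3';      // one of the on-board user LEDs (assumed pin name)
var state = b.LOW;

b.pinMode(led, b.OUTPUT);

// Toggle the LED once a second.
setInterval(function () {
  state = (state === b.LOW) ? b.HIGH : b.LOW;
  b.digitalWrite(led, state);
}, 1000);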

And finally, some other things you can find at the Maker Faire:

Game of Drones:

Fire and Art:

IMG_5591

Robots that solve Rubik’s cubes:

Cheers,

Mark.

Design Time @ Run Time: Apple Watch Put Through Its Paces in Beijing

May 18th, 2015 2 Comments

Observations on UX research and road-testing wearable tech in the wild. The vehicle for today’s message is Ultan O’Broin (@usableapps), taking advantage of Oracle Applications User Experience events and outreach to evaluate the fitness and health option on the Apple Watch—and to continue his Fitbit Surge exploration—this time in China.

The Watch Ethnography (say what?)

All the warnings about running in Beijing proved wrong: that my clothes would turn black; my skin would turn grey; I’d need a facemask; I wouldn’t see any other runners; I’d attract the attention of security personnel with my blue hair.

None of this happened.

I shoulda guessed. Running is one of the most “unasked-for-advice” activities out there, usually from non-runners or “joggers.”

Instead, I saw lots of other runners in Beijing’s parks and streets, mostly locals, with a small number of “ex-pats.” At times there were so many runners—and power walkers—early in the morning that I had to weave hard to get by them. On the long, straight streets of Beijing, I saw hardcore runners in action, percentage-wise more than, say, in Dublin.

Running in Beijing. Scene from Temple of Sun Park.

I saw lots of runners sporting colorful running gear; more than I’ve seen in San Francisco, though the styling was far short of the effortless funky co-ordination of the lemons, oranges, and blacks of the Nordic scene. Yes, I’m a running fashion snob. It was kinda hard to tell what fitness devices the Beijing crowd was packing, but I did see some Garmins: a sure sign of serious runners.

I did one run to the Forbidden City and Tiananmen Square, a 10 miler; hauling myself around the Central Business District and diplomatic zones on other days. The eyes of Chinese security guards swiveled to follow me as I strode by, but generally they seemed nonplussed with my blue hair and obvious Apple Watch. I was kinda disappointed I didn’t end up on CNN.

Running to the Forbidden City. Alas, selfie sticks were not forbidden.

The best time to run in Beijing is clearly in the early morning. Public parks were open by 5:30 AM and full of runners and walkers by the time I arrived. There is very bad air pollution in Beijing, but growing up in pre-smokeless-coal-carbon-fuel-ban Dublin, it really didn’t seem that menacing. However, I did detect a markedly poorer air quality later in the day. Your mileage may vary on that one, I guess.

The Device Findings

These runs in Beijing were another opportunity to test out the Fitbit Surge but really to try out the newer Apple Watch in another location. There are other comparisons between these two devices.

Both performed flawlessly, though I preferred the Apple Watch for its outstanding build quality, its UX with configurable glances, and its superior styling. Henry Ford’s “Any Color As Long As It’s Black” as applied to smartwatches and fitness bands is #fashtech #fail by this stage.

Again, I was particularly impressed with the rapid GPS acquisition and holding capability of the Surge. I’ve used it on three continents now, and I love its robustness and long battery life.

Fitbit Surge GPS recording from Tiananmen Square run (on iOS)

The Apple Watch’s built-in Workout app proved easy to use for my runs. It has indoor and outdoor options for other activities too, whether with target metrics (distance, time, or calories) or for an “open” hustle. I was a little disappointed that the watch app doesn’t enable wearers to recall more basic run details from the last activity, but being able to see real-time progress was great. I also enjoyed using the Apple Watch built-in Activity app. Its simple and colorful progress analytics for exercise, moving, and standing were fun to glance at throughout the day, though the data is not for any serious runners or QS fanbois out there.

Using both of these Apple Watch apps together provided a compelling health and fitness experience.

Apple Watch Activity App

Being able to use both devices without carrying a smartphone with me on a run was the UX joy. Being freed from dodgy Bluetooth pairing and GPS signal worries, and that tricky music selection procedure required by a smartphone, saved me 5 mins (about three quarters of a mile distance at my speeds) at the start of each run. Being able to see my performance in real time—on the go—without having to fish out a smartphone, was awesome.

That’s what a smartwatch glance UX is all about: being kept in the moment.

The Apple Watch battery didn’t make it past 10 hours on days with runs like these, though without that kind of exertion it seemed to last most of my waking day, which is reasonable.

What’s Next?

I normally carry a smartphone when running as my music platform, but increasingly to take Instagram images during my journey. The Strava app GPS integration with Instagram is a fave running experience. I did carry my Apple iPhone 5 in Beijing, to take pictures—no, I don’t really carry a selfie stick—and to try out the Strava app for comparison. The Instagram integration seemed to be DOA though.

So, my thoughts on wearable tech super watch evolution, and the emergence of the standalone wearable device as the way to go for smartwatches, were reinforced from my Beijing experience.

However, a super watch UX needs to be flexible and offer more capability. I’d like to see onboard music and image capture capability on the watches themselves somehow. Audio notifications for time, speed, distance, and geographic points would also enhance the experience immensely. However, what such enhancements would mean for the bane of wearable tech UX right now—battery life—let alone device size, remains just another challenge to be solved. And it will be.

And what UX research methodology lessons might be gleaned from running in Beijing with wearable tech? Firstly, don’t assume anything about your ethnographic experience upfront. Try it yourself on a dry run first to iron out any possible kinks. Run at different times of the day, over different distances and routes, in varying weather conditions, and, of course, with different devices along the way. Most importantly, find real native runners to follow around, and record what they do from start to finish, what they do offline as well as online, and with what tools, on their runs.

Running, just like user experience, is about the complete journey, a total contextual experience, not just where your rubber meets the road.