I don’t particularly like protective cases for phones because they ruin the industrial design aesthetics of the device.
Here at the ‘Lab, we’ve had spirited debates about cases or not, dating back to the original team and continuing to our current team.
I am not careful with phones, and the death of my Nexus 5, which I’d had only since October 2014, was my fault. It was also very, very bad luck.
I usually run with a Bluetooth headset, the Mpow Swift, which I quite like (hey Ultan, it’s green), specifically because I had a few instances where my hand caught the headset cord and pulled my phone off the treadmill deck and onto the belt, causing the phone to fly off the back like a missile.
Yes, that happened more than once, but in my defense, I’ve seen it happen to other people too.
However, on July 8, I was running on the treadmill talking to Tony on the phone, using a wired headset. I’ve found the Mpow doesn’t have a very strong microphone, or maybe I wasn’t aiming my voice in the right direction. Whatever the reason, the Mpow hasn’t been good for talking on the phone.
While talking to Tony, possibly mid-sentence, I caught the cord and pulled the phone off the deck.
Unlike the other times, this time, the phone slipped under the treadmill belt, trapping it between the belt and housing, sliding it the length of the belt, and dragging it over the back drum.
I stopped the treadmill and looked under, but it was trapped inside the machine. After sheepishly asking for help, we were able to get the machine to spit up my mangled phone.
Interestingly, the screen is completely intact, which gives an idea of how tough it really is. The phone’s body is sadly bent in a way that describes its journey over that drum. Luckily, its battery hasn’t leaked.
The device didn’t die right away. While it wouldn’t boot, when I connected it to my Mac via USB, it was recognized, although it wouldn’t mount the storage like it normally would. Something about the device consuming too much power for USB.
I tried with a powered USB hub, but I think the battery gave up the ghost.
Happily for me, I had recently bought a second generation Moto X on sale, and I’d been postponing the switch.
Unhappily, every time I switch phones, I lose something, even though I keep backups. When my Nexus 4 died mysteriously, I lost all my photos. This time, I lost my SMS/MMS history.
Like I said, I’m careless with phones.
If you’re involved in enterprise user experience (UX), it will come as no surprise that the humble pen and paper remain in widespread use for everyday business.
Sales reps, for example, are forever quickly scribbling down opportunity info. HR pros use them widely. Accountants? Check. At most meetings you will find both pen and paper and digital technology on the table.
That’s what UX is all about: understanding all the tools, technology, job aids, and the rest that the user touches along the journey to getting the task done.
Although Steve Jobs famously declared that the world didn’t need another stylus, innovation in digital styli, or digital pens (sometimes called smartpens), has never been greater.
Microsoft is innovating with the device, h/t @bubblebobble. Apple is ironically active with patents for styli, and the iPen may be close. Kickstarter boasts some great stylus ideas such as the Irish-designed Scriba (@getbscriba), featured in the Irish Times.
It is the tablet and the mobility of today’s work that have reinvigorated digital pen innovation, whether it’s the Apple iPad or Microsoft Surface.
I’ve used digital pens, or smartpens, such as the Livescribe Echo for my UX work. The Echo is a great way to wireframe or create initial designs quickly and to communicate the ideas to others working remotely, using a pencast.
Personally, I feel there is a place for digital pens, but that the OG pen and paper still takes some beating when it comes to rapid innovation, iteration, and recall, as pondered on UX StackExchange.
An understanding of users demands that we not try to replace pen and paper altogether but enhance or augment their use, depending on the context. For example, using the Oracle Capture approach to transfer initial strokes and scribbles to the cloud for enhancement later.
You can read more about this in the free Oracle Applications Cloud User Experience Innovations and Trends eBook.
Sure, for some users, a funky new digital stylus will rock their world. For others, it won’t.
And we’ll all still lose the thing.
The pen is back? It’s never been away.
Yesterday, our entire organization, Oracle Applications User Experience (@usableapps), got a treat. We learned about Oracle’s corporate citizenship from Colleen Cassity, Executive Director of the Oracle Education Foundation (OEF).
I’m familiar with Oracle’s philanthropic endeavors, but only vaguely so. I’ve used the corporate giving match, but beyond that, this was all new information.
During her presentation, we learned about several of Oracle’s efforts, which I’m happy to share here, in video form.
First, there’s the OEF Wearable Technology Workshop for Girls, which several of our team members supported.
Next up was Wecyclers, an excellent program to promote recycling in Nigeria.
And finally, we learned about Oracle’s 26-year-old, ongoing commitment to the Dian Fossey Gorilla Fund.
This was an eye-opening session for me. Other than the Wearable Technology Workshop for Girls, I hadn’t heard about Oracle’s involvement in these other charitable causes, and I’m honored that we were able to help with one.
I hope we’ll be able to assist with similar charitable events in the future.
Anyway, food for thought and possibly new information. Enjoy.
Kscope15 (#kscope15) was hosted at the Diplomat resort along beautiful Hollywood Beach, and the Scavenger Hunt from OAUX AppsLab infused a hint of fun and excitement between the packed, busy, and serious sessions.
The Scavenger Hunt was quite a comprehensive system that let people win points in various ways and keep track of events, points, and a leaderboard. And of course, we had one Internet of Things (IoT) component that people could search for and tap to win points.
And here is the build, with a powerful battery connected to it, complete with an anti-theft feature: double-sided duct tape. Altogether, it is a stand-alone, self-contained, and definitely mobile, computer.
Isn’t it cool? On multiple occasions, I overheard people saying it was the coolest thing at the conference.
One of the bartenders at the Community Night reception wanted to trade me the “best” drink of the night for my Raspberry Pi.
I leased it to him for two hours, and he gave me the drink. The fact is that I would have put the Raspberry Pi on his table anyway for the Community Night event, and he would have given me the drink anyway if I had known how to order it.
On the serious side, APEX (Oracle Application Express) had a good showing with many sessions. Considering our Scavenger Hunt web admin was built on APEX, I am interested in learning it too. After two hands-on sessions, I did feel that I’d use it for a quick web app in the future.
On the database side, the most significant development is ORDS (Oracle REST Data Services) and the ability to call a web endpoint from within the database. This opens up the possibility of monitoring data/state changes at the data level and triggering events in a web server, which in turn can trigger client reactions via WebSocket.
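The data-change-to-client flow described here can be sketched outside the database, too. Below is a minimal Python sketch (all names are made up, and this is not ORDS code) of the relay piece: the database tier POSTs a change event to the web tier, which fans it out to every connected client. In a real deployment the fan-out would happen over WebSocket connections; here each client is modeled as a simple queue.

```python
import json
import queue
from typing import Dict


class ChangeBroadcaster:
    """Fan database change events out to connected clients.

    Sketch of the pattern: the database (via a trigger calling a
    web endpoint) posts a change notification to the web tier,
    which relays it to every connected client.
    """

    def __init__(self) -> None:
        self.clients: Dict[str, queue.Queue] = {}

    def connect(self, client_id: str) -> queue.Queue:
        # Each "WebSocket connection" is modeled as a queue.
        q = queue.Queue()
        self.clients[client_id] = q
        return q

    def on_db_event(self, payload: str) -> int:
        """Handle a change notification posted by the database tier.

        Returns the number of clients notified.
        """
        event = json.loads(payload)
        for q in self.clients.values():
            q.put(event)
        return len(self.clients)


# A player's score changes; the database fires a notification and
# every connected leaderboard client receives the update.
hub = ChangeBroadcaster()
inbox = hub.connect("leaderboard-1")
hub.on_db_event('{"player": "jdoe", "points": 120}')
print(inbox.get_nowait())  # {'player': 'jdoe', 'points': 120}
```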
Again, Kscope15 was a very fruitful event for us, as we demonstrated the Scavenger Hunt game and provoked lots of interest. It has potential for large events and enterprise applications, so stay tuned while we add some twists to it in the future.
Editor’s note: Raymond (@yuhuaxie) forgot to mention how much fun he had at Kscope15. Pics because it happened:
ODTUG (@odtug) commissioned a short film, which was shot, edited and produced during the week that was Kscope15. It debuted during the Closing Session, and they have graciously shared it on YouTube. It’s 10 minutes, but very good at capturing what I like about Kscope so much.
Noel appears to talk about the Scavenger Hunt at 7:29. Watch it here.
This here is a wrap-up of that content, but let’s be honest. If you like OAUX content, you really should follow the official blogs of OAUX: Usable Apps in the Cloud, VoX, user experience assistance: cloud design & development.
Oh and follow @usableapps too. That’s done, so let’s get recapping.
Over on VoX, you can read all about Oracle’s Cloud Application user experience strategy in three short posts.
In the first part, read about how we apply Simplicity, Mobility, Extensibility to Cloud Applications. In part two, read about big-picture innovation and how it drives our Glance, Scan, Commit design philosophy. Finally, in the big finish, read about how we apply all this to designing and building experiences for our cloud users.
More Apple Watch
You’ve read our takes, and Ultan’s, on the Apple Watch, and now our GVP, Jeremy Ashley (@jrwashley) has shared his impressions. Good stuff in there, check it out if you’re looking for reasons to buy a smartwatch.
Not convinced of the value? Longtime friend of the ‘Lab, David Haimes (@dhaimes) might have what you need to go from cynic to believer.
We Heart APIs
We love us some APIs, especially the good ones. Developers are users too.
Speaking of APIs and developers, check out two videos that tie developer use cases with PaaS4SaaS.
Big Finish, ERP Cloud and Cake
Told you they’ve been busy.
As per Jake’s post, we got to spend a few days in Florida to support the Scavenger Hunt that we created for the Kscope15 conference. Since it ran pretty smoothly, we were able to attend a few sessions and mingle with the attendees and speakers. Here are my impressions of the event.
This was my first time at Kscope. Jake hyped it up as a not-to-miss conference for Oracle developers, and despite my high expectations of the event, it did not disappoint. The actual conference started Sunday, but we arrived Saturday to set up everything for the Scavenger Hunt, dot a few i’s and cross some t’s.
We also ran a quick training session for the organizers helping with the administration of the Scavenger Hunt, and later that night we started registering players for the hunt. We signed up about 100 people on the first evening. Registration continued Sunday morning, and we picked up about 50 more players for a grand total of 150, not bad for our first Scavenger Hunt.
The number of sessions was a bit overwhelming, so I decided to focus on the Database Development and Application Express tracks and picked a few sessions from those. The first one I attended was called “JSON and Oracle: A Powerful Combination,” where Dan McGhan (@dmcghan) from Oracle explained how to produce JSON from data in an Oracle Database, how to consume JSON in the Oracle Database, and even how to use it in queries.
It turns out that Oracle Database 12c has some new, really cool features for working with JSON, so be sure to check those out. Interestingly, our Scavenger Hunt backend uses some of these techniques, and we got some great tips from Dan on how to improve what we were doing. So thanks for that, Dan!
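The produce/consume round trip Dan covered can be sketched in miniature outside the database. Here is a hedged Python analogue (the table and column names are invented for illustration); the second step is roughly the job that SQL/JSON functions like JSON_TABLE perform when projecting a JSON document into relational columns:

```python
import json

# "Table" rows, as a query might return them.
rows = [
    {"id": 1, "name": "Jake", "team": "AppsLab"},
    {"id": 2, "name": "Noel", "team": "AppsLab"},
]

# Produce JSON from relational rows (what a JSON-generating
# query would hand back to a web client).
doc = json.dumps({"employees": rows})

# Consume JSON back into rows: flatten the document into
# (id, name) tuples, roughly what JSON_TABLE does when it
# projects JSON into relational columns.
flattened = [(e["id"], e["name"]) for e in json.loads(doc)["employees"]]
print(flattened)  # [(1, 'Jake'), (2, 'Noel')]
```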
Next I went to “A Primer on Web Components in APEX,” presented by my countryman Dimitri Gielis (@dgielis). In this session, Dimitri demonstrated how you can easily integrate Web Components into an APEX application. He showed an impressive demo of a camera component that took a picture right from the web application and stored it in the database. He also demoed a component that integrated voice control into an APEX application, which allowed him to “ask” the database for a row; it would retrieve that row and show it on the screen. Very cool stuff.
That night also featured the infamous “APEX Open Mic,” where anybody can walk up to the mic and get five minutes to show off what they’ve built with APEX. No judging, no winners or losers, just sharing with the community. I must say, some really impressive applications were shown, not least one by Ed Jones (@edhjones) from Oracle, who managed to create a Minecraft-like game based on Oracle Social Network (OSN) data, where treasure chests in the game represent OSN conversations. Opening the chest opens the conversation in OSN. Be sure to check out his video!
The next day, I attended two more sessions, one by our very own Noel Portugal (@noelportugal) and our Group Vice President, Jeremy Ashley (@jrwashley). I am sure they will tell you all about it through this channel or another, so I am leaving that one for them.
The third day was hands-on day, and I attended two more sessions: first “Intro to Oracle REST Data Services” by Kris Rice (@krisrice) from Oracle, and then “Soup-to-Nuts of Building APEX Applications” by David Peake (@orcl_dpeake) from Oracle.
In the first one, we were introduced to ORDS, a feature in the Oracle DB that allows you to create REST services straight on top of the database, no middle tier required! I’ve seen this before in MySQL, but I did not know you could also do this in an Oracle DB. Again, this is a super powerful feature that we will be using for sure in future projects.
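To give a feel for the core idea, here is a tiny, hypothetical Python sketch of the mapping ORDS performs: a URI template with bind parameters gets matched against an incoming path and paired with a SQL statement. The endpoint, template, and SQL below are invented for illustration; this is not ORDS code.

```python
import re


def compile_template(template: str):
    """Compile a URI template like 'players/:id' into a regex
    that extracts bind parameters as named groups."""
    pattern = re.sub(r':(\w+)', r'(?P<\1>[^/]+)', template)
    return re.compile('^' + pattern + '$')


# Template -> SQL handler pairs (made-up example data).
ROUTES = {
    compile_template('players/:id'):
        'select * from players where id = :id',
}


def route(path: str):
    """Return (sql, binds) for the first matching template,
    or (None, {}) if nothing matches."""
    for rx, sql in ROUTES.items():
        m = rx.match(path)
        if m:
            return sql, m.groupdict()
    return None, {}


sql, binds = route('players/42')
print(binds)  # {'id': '42'}
```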
The second, two-hour, session was a walk-through of a full-fledged APEX application from start to finish by the always entertaining David Peake. I must admit that by that time I was pretty much done, and I left the session halfway through building my application. However, Raymond (@yuhuaxie) managed to sit through the whole thing, so maybe he can give some comments on this session.
All I can say is that APEX 5.0 was extremely easy to get started with and build a nice Web Application with.
And that was KScope15 in a nutshell for me. It was an awesome, exhausting experience, and I hope I can be there again in 2016.
As my wearables odyssey continues, it’s time to document my time with the Fitbit Surge.
For comparison’s sake, I suggest you read Ultan’s (@ultan) review of the Surge. He’s a hardcore fitness dude, and I’m much more a have-to-don’t-like-to exercise guy, which makes for a nice companion read.
As usual, this isn’t a review, more loosely-coupled observations. You can find lots of credible reviews of the Surge, billed as a “Super Watch” by the recently IPO’ed Fitbit, e.g. this one from Engadget.
Here we go.
As with most of the other wearables I’ve used, the Surge must be set up from software installed on a computer. It also requires the use of a weird USB doohickey for pairing, after which the watch firmware updates.
I get why they provide ways for people to sync to software installed on computers, but I wonder how many users really eschew the smartphone app or don’t have a smartphone.
Anyway, despite Fitbit Connect, the software you have to install, saying the firmware update process will take five to ten minutes, my update took much longer, like 30 minutes.
Physically, the Surge is chunky. Its shape reminds me of a door-stop, like a wedge. While this looks weird, it’s really a nice design idea, essentially tilting the display toward the user, making it easier to read at a glance.
I found wearing the device to be comfortable, although the rubber of the band did make my skin clammy after a while, see the Epilogue for more on that.
The display is easy to read in any light, and the backlight comes on automatically in low light conditions.
The Surge carries a water-resistance rating of 5 ATM, which amounts to 50 meters deep, but for some reason, Fitbit advises against submerging it. Weird, right?
Not one to follow directions, I took the Surge in a pool with no ill effects. However, once or twice during my post-workout steam, the display did show some condensation under the glass. So, who knows?
The device interface is a combination of touches and three physical buttons, all easy to learn through quick experimentation.
The watch screens show the day’s activity in steps, calories burned, miles, and floors climbed. It also tracks heart rate via an optical heart rate sensor.
In addition, you can start specific activity tracking from the device including outdoor running with GPS tracking, which Ultan used quite a lot, and from what I’ve read, is the Surge’s money feature. I only run indoors on a treadmill (lame), so I didn’t test this feature.
The Surge does have a treadmill activity, but I found its mileage calculation varied from the treadmill’s, e.g. 3.30 miles on the treadmill equated to 2.54 on the Surge. Not a big deal to me, especially given how difficult tracking mileage would be for a device to get right through sensors.
Speaking of, the Surge packs a nice array of sensors. In addition to the aforementioned GPS and optical heart rate sensor, it also sports a 3-axis accelerometer and a 3-axis gyroscope.
The Surge tracks sleep automatically, although I’m not sure how. Seemed to be magically accurate though.
Fitbit advertises the Surge’s battery life as seven days, but in practice, I only got about four or five days per charge. Luckily, Fitbit will inform you when the battery gets low via app notifications and email, both of which are nice.
Happily, the battery charges very quickly, albeit via a proprietary charging cord. Lose that cord, and you’re toast. I misplaced mine, which effectively ended this experiment.
The app and data
As Ultan mentioned in his post, the Fitbit Aria wifi scale makes using any Fitbit device better. I’ve had an Aria for a few years, but never really used it. So, this was a great chance to try it with the Surge.
Fitbit provides both mobile and web apps to track data.
I mostly used the mobile app which shows a daily view of activity, weight and food consumption, if you choose to track that manually. Tapping any item shows you details, and you can swipe between days.
It’s all very well-done, easy to use, and they do a nice job of packing a lot of information into a small screen.
From within the app, you can set up phone notifications for texts and calls, a feature I really liked from wearing the Basis Peak.
Unfortunately, I only got notified about half the time, not ideal, and I’m not the only one with this issue. Danny Bryant (@dbcapoeira) and I chatted about our Surge experiences at Kscope, and he mentioned this as an issue for him as well.
Fitbit offers Challenges to encourage social fitness competition, which seems nice, but not for me. There are badges for milestones too, like walking 500 miles, climbing 500 floors, etc. Nice.
Sleep tracking on the mobile app is pretty basic, showing number of times awake and number of times restless.
Fitbit’s web app is a dashboard showing the same information in a larger format. They hide some key insights in the Log section, e.g. the sleep data in there is more detailed than what the dashboard shows.
I have to say I prefer the Jawbone approach to viewing data; they only have a mobile app which dictates the entire experience and keeps it focused.
Fitbit sends weekly summary emails too, so yet another way to view your data. I like the emails, especially the fun data point about my average time to fall asleep for the week, usually zero minutes. I guess this particular week I was well-rested.
I did have some time zone issues when I went to Florida. The watch didn’t update automatically, and I did some digging and found a help article about traveling with your Fitbit with this tip:
Loss of data can occur if the “Set Automatically” timezone option in the app’s “Settings” is on. Toggle the “Set Automatically” timezone option to off.
So for the entire week in Hollywood, my watch was three hours slow, not a good look for a watch.
And finally, data export out of Fitbit’s ecosystem is available, at a cost. Export is a premium feature. “Your data belongs to you!” for $50 a year. Some consolation though: they offer a free trial for a week, so I grabbed my data for free, at least this time.
Overall, the Surge compares favorably to the Basis Peak, but unlike the Jawbone UP24, I didn’t feel sad when the experiment ended.
Perhaps you’ll recall that Fitbit’s newer devices have been causing rashes for some users. I’m one of those users. I’m reporting this because it happened, not as an indictment of the device.
I wore the Surge for seven weeks, pretty much all the time. When I took it off to end the experiment, my wife noticed a nasty red spot on the outer side of my arm. I hadn’t seen it, and I probably would never have noticed.
The rash doesn’t really affect how I view the device, although if I wear the Surge again, I’ll remember to give my skin a break periodically.
One unexpected side effect of not wearing a device as the rash clears up is that unquantified days feel weird. I wonder why I do things if they’re not being quantified. Being healthy for its own sake isn’t enough. I need that extra dopamine from achieving something quantifiable.
Find the comments.
Noel (@noelportugal), Raymond (@yuhuaxie), Mark (@mvilrokx) and I traveled to sunny Hollywood, Florida last week to attend Kscope15 (#kscope15), the annual conference of the Oracle Development Tools User Group (@odtug).
Check out some highlights of our week.
If you read here, you probably know that this year, Noel had cooked up something new and different for the conference, a scavenger hunt.
This year was my fourth Kscope, and as we have in past years, we planned to do something fun. At the end of Kscope14, Monty Latiolais (@monty_odtug), the President of the ODTUG Board of Directors, approached us to collaborate on something cool for Kscope15.
We didn’t know what exactly, but we all wanted to do something new, something fun, something befitting of Kscope, which is always a great conference. So, we spent the next few months chatting with Crystal (@crystal_walton), Lauren (@lprezby) and Danny (@dbcapoeira) intermittently, developing ideas.
We eventually settled on a scavenger hunt, which would allow attendees to experience all the best parts of the conference, almost like a guided tour.
Once we had a list of tasks, Noel developed the game, and with Mark and Raymond pitching in, they built it over the course of a few months. Tasks were completed one of three ways, by checking in to a Raspberry Pi station via NFC, by staff confirmation, and by tweeting a picture or video with the right hashtags.
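The scoring logic implied here just needs to record a completion once per player and task, whichever of the three verification methods was used. A minimal Python sketch (the method names and point values are invented, not the actual game code):

```python
from enum import Enum, auto


class Method(Enum):
    """The three ways a task could be verified."""
    NFC_CHECKIN = auto()    # tapping a Raspberry Pi station
    STAFF_CONFIRM = auto()  # a staff member confirms it
    TWEET = auto()          # photo/video tweet with hashtags


def complete_task(scores, player, task, method, points=10):
    """Record a completed task once per (player, task) pair,
    regardless of which method verified it. Returns True on the
    first completion, False if the task was already done."""
    key = (player, task)
    if key in scores:
        return False
    scores[key] = (method, points)
    return True


scores = {}
complete_task(scores, "jdoe", "visit-booth", Method.NFC_CHECKIN)  # True
complete_task(scores, "jdoe", "visit-booth", Method.TWEET)        # False, already done
```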
We arrived in Hollywood unsure of how many players we’d get. We didn’t do much promotion in advance, and we decided to limit the game to 500 players to ensure it didn’t get too crazy.
Over the first few days, we registered nearly 150 players, and of them, about 100 completed at least one task, both well above my conservative expectations.
During the conference, we had a core of about 10-20 dedicated players who made the game fun to watch. They jockeyed back and forth in the top spots, trolling each other on Twitter and timing their task completions to give the other players fleeting hope.
In the end, we had a tie that we had to break at the conference’s closing session. Here are the final standings:
- Justin Powell (@justincasefacts)
- Joe Aultman (@1708_396)
- Bjoern Rost (@brost)
- Peter Koutroubis (@PeterKoutroubis)
- Cory Dresher (@cdresche)
Congratulations winners, and thank you to everyone who played for making the game a success.
And finally an enormous thank you to ODTUG and the Kscope15 organizers for allowing us this opportunity. We’re already noodling ways to improve the game for Kscope16 in Chicago.
Stay tuned for other Kscope15 posts.
Walking into something as a newcomer is always an adventure of reality interacting with expectations. Though I wasn’t quite sure what to expect at the Quantified Self conference, it wasn’t what I expected. But in a good way.
The conference was structured around three main activities: talks given on the main stage, breakout sessions, which took place at different smaller areas during the talks, and break times, where one might check out the vendors, grab a snack, or chat with fellow attendees.
The talks, about ten minutes each, were mostly about the speaker’s successes in changing some aspect of their life via quantifying and analyzing it. This is partly what I wasn’t expecting—the goal-focused and very positive nature of (most) everyone’s projects.
True, some of the presenters might fall on the obsessive side of the spectrum, but by and large, it was all about improving your life, not recording everything as a method of self-preservation.
On this last point, one presenter even provided this quote from Nabokov, which generated a touch of controversy: “the collecting of daily details … is always a poor method of self-preservation.”
One important theme I saw, however, is the role of measuring itself—that the very act of quantifying your behaviors, whether it’s diet, exercise, TV watching, or your productivity, can change your behavior for the better.
Granted, there can also be profound personal insights from analyzing the data, especially when combining multiple sources, but it’s possible some of these benefits come from simply tracking. Especially when it’s done manually, which takes a great deal of persistence, with many people petering out after a few weeks at the most.
This presents an interesting question about technology’s increasing proficiency at passive tracking, and the aim to provide insights automatically. For instance, the Jawbone UP platform’s Smart Coach is supposed to look at your exercise and activity data alongside your sleep data and give you advice about how to get better sleep.
If someone had tracked this manually, and done the analysis themselves, they may not only be a lot more familiar with the facts about their own sleep and exercise, but any insights derived might be more likely to be absorbed and translate to genuine change.
When insights are automatically provided will they lead to just as much adoption?
Probably not, but they could reach a lot more people who may not be able to keep up with measuring. So it’s probably still a good thing in the end.
The other important theme was something that I’ve also been encountering in other areas of my work—the importance of good questions.
For most of the QS projects, this took the form of achieving a personal goal, but sometimes it was simply a specific inquiry into a realm of one’s life. Just looking at data can be interesting, but without a good question motivating an analysis, it’s often not very useful.
In the worst case, you can find spurious connections and correlations within a large set of data that may get you off in the wrong direction.
And while at the beginning of the conference it was made clear that QS15 was not a tech conference, there was plenty of cool technology in the main hall to check out and discuss.
There are too many to cover in much detail, but here are a few that intrigued me:
- Spire, a breath tracking device that says it can measure focus by analyzing your breathing pattern. If someone is interested in examining their productivity, this could be a promising device to check out. Also, it can let you know when you need a deep breath, which has various physiological and emotional benefits.
- Faurecia manufactures seats for automobiles, and they were showing off a prototype that uses piezoelectric bands within the chair itself to measure heart rate and breathing patterns. This is great because it can do this through your clothing, and detect when you’re falling asleep, and possibly institute some countermeasures. The data could also sync up with your phone, say through Apple’s Healthkit, if you want to add it to your logs.
- Another device is an activity and sleep tracker that uses a ring form factor, which for some people may be easier to sleep with than a wrist band. Their focus is on sleep and measuring how restorative your rest is. I look forward to seeing how this one develops.
The conference had a lot to offer—some inspiration, some cool technologies, surprisingly good lunches, and quite a bit to think about.
I always thought of myself as a control freak, Type A, self-aware (flaws and all) person, but then I attended the Quantified Self Conference last week in San Francisco.
There is so much more one can do to learn about one’s self. The possibilities for what I can quantify (measure about myself) are endless, and there are so many people capturing surprising things.
Quantified Self, if you haven’t heard, is “a collaboration of users and tool makers who share an interest in self knowledge through self-tracking,” as described by Gary Wolf and Kevin Kelly. I’ve also been an admirer of Nicholas Felton, who has beautiful visualizations of his data.
The two-day conference consisted of morning and afternoon plenary sessions, and in between, the day was filled with ten-minute talks on the main stage (where practitioners shared their own QS work, tools, and personal data), with breakout sessions for group discussions and office hours for hands-on help happening concurrently. There were plenty of topics for a newbie QS-er like me or a longtime enthusiast.
My conference experience in numbers:
- 4 plenary session talks
- 30 session talks
- 1 breakout session on “The Quantified Self at Work”
Videos and presentations should be posted in the coming weeks, but until then, here is a summary from Gary Wolf.
Beyond the numbers, I was surprised, inspired, and learned a few lessons. It is amazing what quantified self-ers are capturing, the extent and effort they take, and their life-changing impacts. There is plenty of fitness, diet, and health tracking happening, but others are tracking things such as:
The list goes on but this sampling gives you a sense of the range of self tracking.
While lots of recording was being done with commonly available sensors, devices, and apps, there was a lot of data being recorded manually through pen-paper journals and spreadsheets.
There are endless measures (and many low and high tech tools) but recording is not the end goal. The measures help inform our goals and the actions to achieve those goals. There were several talks about the importance of self-tracking to understand your numbers, your similarities and your differences to population normals.
In “Beyond Normal: A Conversation,” Dawn Nafus (@dawnnafus) and Anne Wright (@annerwright) discussed the importance of self-tracking to gain awareness on whether the standards, baselines, and conventions apply to you. Population normals are a good starting point but they shouldn’t define your target as you are unique and the normals may not be right for you (#resistemplotment).
My takeaway: don’t worry about getting the perfect device or tool. Start with finding a goal or change that is important to you. Record, measure, and analyze – glean insights that move you along to being your best self. It is not about the Q but the S.
A busy June is half over now, but we still have miles to go before July.
We’ve been busy, which you know if you read here. Raymond went to Boston. Tony, Thao (@thaobnguyen), Ben and I were in Las Vegas at OHUG 15. John and Thao were in Minneapolis the week before that. Oh, and Anthony was at Google I/O.
The globetrotting continues this week, as John and Anthony (@anthonyslai) are in the UK giving a workshop on Visualizations at the OUAB meeting. Plus, Thao and Ben are attending the QS15 conference in San Francisco.
You can do it now, I’ll wait.
Back? Good. Check out the sweet infographic Tony C. on our team created for the big Hunt:
Coincidentally, one of the tasks is to attend our OAUX session on Tuesday at 2pm, “Smart Things All Around.” Jeremy Ashley (@jrwashley), our GVP, and Noel will talk about the Scavenger Hunt, IoT, new experiences, design philosophies, all that good stuff.
Speaking of philosophies, VoX has a post on glance-scan-commit, the design philosophy that informs our research and development, and more importantly, how glance-scan-commit trickles into product. You should read it.
And finally, Ultan (@ultan) and Mark collaborated on a post about partners, APIs, PaaS and IoT that you should also read, if only so you can drop a PaaS4SaaS into your next conversation.
If you’re attending any of these upcoming events, say hi to us, and look for updates here.
It’s pretty sweet. Check it out:
Glance has been in the works for more than a year now, and it arose out of our collective frustration with the effort involved in developing for multiple device SDKs.
The goal of Glance is to do 75-80% of the overlapping work: calling Oracle Cloud Applications APIs, working with required cloud services like Apple Push Notifications and Google Cloud Messaging, deploying a companion mobile application, built in Oracle’s Mobile Application Framework, of course.
With all that done, we can build for and plug in new devices (ahem, Pebble Time) much more easily and with much less effort. Initially, we built Glance to support the original Pebble and Android Wear smartwatches, and the Apple Watch was our first proof-point for it.
We’re happy with the results so far, and Glance has made it much easier for us to build prototypes on new devices. Now, if only we could get access to CarPlay.
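The shared-core-plus-plugins idea behind Glance can be pictured as a simple adapter pattern. Here’s a minimal JavaScript sketch of that shape; to be clear, all names here are hypothetical illustrations, not the actual Glance internals:

```javascript
// Hypothetical sketch of a Glance-style device abstraction; not the real framework.
// A shared core does the common ~75-80% of the work (in the real thing: calling
// Oracle Cloud Applications APIs and push services); each device only plugs in
// a small renderer.
class GlanceCore {
  constructor() { this.adapters = []; }
  register(adapter) { this.adapters.push(adapter); }
  // Fan a payload out to every registered device adapter.
  notify(payload) {
    return this.adapters.map(a => a.render(payload));
  }
}

// Supporting a new device means writing only this much.
const pebbleAdapter = {
  name: 'pebble',
  render: p => `[Pebble] ${p.title}: ${p.detail}`,
};
const wearAdapter = {
  name: 'androidwear',
  render: p => `[Wear] ${p.title} - ${p.detail}`,
};

const core = new GlanceCore();
core.register(pebbleAdapter);
core.register(wearAdapter);
const out = core.notify({ title: 'Expense approved', detail: '$42.00' });
console.log(out);
```

The point of the pattern is that the `notify` path (and everything behind it) is written once, so adding a Pebble Time or Apple Watch is a matter of one small adapter rather than a new app.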
This idea of the glanceable user experience of wearable technology is now everywhere.
They’re all at it.
There is the OG Misfit Wearables Shine, Apple’s Glances, and of course the Oracle Applications Cloud User Experience (@usableapps) concept of glance on the smartwatch, part of our Glance, Scan, Commit design philosophy.
But, not all glances are equal. How well a glance works for the wearer depends on the user experience notion of context of use: the wearer, the type of device, what the wearer’s up to at the time, the information they need, the connectivity, et cetera.
Glass is a heads-up device, so that means eyes on the road. Combined with the audio updates on my cycled segments and so on, it’s a fantastic UX. It’s convenient. It doesn’t distract me. And, it’s safe. I don’t have to look down at my wrist and take my eyes off the road even for a second to glance at the important stuff.
Looking down at my wrist or changing hand position to glance at my progress on a smartwatch such as my Apple Watch Activity or Workout built-in apps, at my Fitbit Surge Bike stats, or at my Motorola Moto 360 Android Wear Google Fit analytics while hammering along on a bike at 30 mph on a public road is just too risky for me.
Glancing at these smartwatches’ UIs later, of course, is great, whether it’s for progress on miles, calories, duration, or even to ensure that important data’s actually being sent to the cloud where I can do more with it.
I have the same opinion about heads-up glance on devices like Google Glass when I am running, though the durability of Google Glass, battery life, and still having to pair it with another device is a pain.
Running in cities requires you to keep your wits about you: be sharp and look ahead. Glancing down from the upcoming path even for a second might mean going home with an injury or worse. Generally, with my smartwatches, when I’m out running, I’ll glance at the data or analytics when I stop at a traffic signal, or rely on the audio update from my paired smartphone (although it ruins the music) on occasion.
The ability to glance at performance statistics in heads-up mode, combined with those audio progress reports in your ear, is the way to go when cycling and running with wearable tech IMHO. Arguably, too, an audio component is “glance for the ears”. Glance should be multi-mode and not just about the visual, not least for accessibility reasons. We can’t all see as well as each other.
Activity wearable tech designers and developers, take note. Eyes on the prize, or road in this case, please. It’s a good reminder of the importance of context of use when gathering user requirements.
I often tell people that you need both a left brain and a right brain to be a software designer: a left brain to analyze and understand, a right brain to dream and wonder. The Eyeo Festival, which Thao and I just attended in Minneapolis, was food for our right brains.
Eyeo is about the intersection of art and code: generative artists (who use data and algorithms to produce works of art), art installations (which often require sophisticated coding), and those who see coding itself as an art form. It is not so much about knowledge transfer as it is about building a community, meeting world-class data artists, and hearing their back stories.
I attended fourteen talks in all and saw many wonders.
The JPL crew controlling the Mars rover use Microsoft HoloLens goggles to create an augmented reality, allowing scientists in remote locations to stand together on the surface of the planet. Each person sees their own desk, chair and monitor sitting in a crater with the rover just a few feet away. As their eyes scan the area, little dots of light show where each person is looking; when they use their mouse to teleport to a nearby ridge, others see their avatars walk to the new location. They can even walk around the rover and point to where it should go next.
The design team at nervo.us (she’s a biologist, he’s a physicist) is interested in how complex forms arise in nature from cells growing at different rates. Using their own custom software, they create spectacular simulations and turn these into 3-D printed art objects. One of their most stunning creations is a kinematics dress, made supple using thousands of tiny interlocking plastic hinges perfectly fitted to the laser-scanned image of a customer’s body. With scary-hard math, they generalize a moving body from a single scan, compute not just how the dress will look but how it will hang and twirl, and even prefold it so that it will fit in today’s small 3-D printers.
Perhaps the most jaw-dropping demonstration was a sneak preview of “Connected Worlds,” an installation that will be opening soon at the New York Hall of Science. Three years in the making, it creates a Star Trek style holodeck with a 50-foot waterfall and six distinct biomes populated by whimsical plants and animals. Children move physical logs to redirect virtual water into the various biomes; if they make the right decisions wonderful trees will grow and attract ever more magical animals. The team at Design I/O described technical challenges and lessons learned, some of which might be applicable to future AppsLab motion-tracking projects.
One of the topics I found most stimulating was new and improved coding languages. I have used Processing, a language developed specifically for artists, to create some of the interactive visualizations we show in our cloud lab. It was a thrill to meet and talk with Processing’s co-inventors and hear their plans for new evolutions of the language, including P5.js, Processing.py, and the upcoming Processing 3.0.
But the most interesting talk about languages was by a guy named Ramsey Nasser. Ramsey is an uber-coder who creates new computer languages for fun. He argues that most coders are stuck using alienating, frustrating, brittle languages created decades ago for a world that no longer exists. He wants to create languages that facilitate “post-human creativity,” new forms of creativity not possible before computers. Some of his languages, like god.js (which makes code look like biblical text) and Emojinal (made completely out of emoji), are just for fun. Others, like Alb (the first entirely Arabic coding language), Arcadia (for Unity 3D VR game development), Zajal (inspired by Processing), and Rejoice (a stack language based on Joy), are practical and mind-expanding. I plan to talk more about why coding languages should matter to designers in a future blog post.
As with any conference there were countless small discoveries, too many to report in full. Here are just a few…
Amanda Cox of the New York Times talked about making data more relatable by using geocoding to default the initial view of a large geographical dataset to the user’s own locale. Another interesting technique was having users guess what a plotted curve would look like by drawing it before showing the actual curve.
One clever flourish I noticed was the use of tiny single-value pie charts placed beneath each year in the X axis of a time-driven line chart to add an extra dimension of data about each year without distracting from the main point of the chart.
Sprint, the telephone company, started out as a railroad company that used their existing right of way to plant cell towers. Sprint stands for Southern Pacific Railroad Internal Networking Telephony.
Into LED arrays and Raspberry Pi? Check out Fadecandy.
Timescape is a visualization-based storytelling platform, currently in beta. Looks interesting.
How long does it take the New York Times team to create world-class infographics? As long as you have, plus one half hour.
What kind of music do coding language nerds listen to? The Lisps of course!
My right brain is full now. Time to dream!
Jeremy (@jrwashley), DJ, Kris Robison and I attended the launch event, which you can watch here. My part of the presentation begins at 36:36.
Here’s the event abstract:
Abstract: In October 2014, NASA’s Asteroid Hackathon event was hosted (with several other NASA partners) at the SETI Institute in Mountain View, California. Team NOVA’s overall winning solution for this hackathon allowed users to explore relationships among the Minor Planet Center’s asteroid data. The elegant interface was not just eye-catching; the repeated learning that hackathon participants experienced in the “science fair” portion of judging greatly impressed the judges. More than once, people discovered relationships among asteroid data parameters that they didn’t previously know about. A perfect outcome for one of the primary goals: to increase public knowledge of asteroids. Dr José Luis Galache (Acting Deputy Director, Minor Planet Center) and DJ Ursal (Director, Product Management at Oracle) teamed up through the Oracle Volunteering initiative to implement the winning entry from the Asteroid Hackathon on the Minor Planet Center website. On June 8th they will be launching the website as part of the Harvard-Smithsonian Center for Astrophysics’s Solar, Stellar and Planetary Sciences division seminar series. The team will be discussing this project as it relates to cooperation between the Minor Planet Center, NASA, Oracle Volunteering, and its goal to inform and involve the general public.
This volunteer effort is a great success, and the result was well received and appreciated by the astrophysicists attending the launch event.
The NASA Grand Challenge program executive Jason Kessler (@soughin) was at the White House, talking up the Asteroid Hackathon and this volunteer work there, before calling into the event via Skype.
The event was broadcast live on the Minor Planet Center’s YouTube channel, and the audience at the Center was mostly astrophysicists.
On the roof-top of the Harvard-Smithsonian Center for Astrophysics, there are several telescopes, including the famous Harvard Great Refractor. But we liked this cute mini Astro Haven.
A bit about Asteroid Explorer: the main part of the web tool uses Crossfilter, D3.js, and Highcharts. I processed the asteroid data into slices and groups to feed into Crossfilter, which renders the interactive filter bar charts and tables. I also created a bubble chart that renders series of property data, so you can look into the correlation of any pair of properties, and it reacts dynamically to the filter bar charts’ range sliders.
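To give a flavor of the slice-and-group step that feeds those filter bar charts, here is a small plain-JavaScript sketch. The field names are illustrative rather than the Minor Planet Center’s actual schema, and the real tool uses Crossfilter’s dimension/group API to do this incrementally; this just shows the idea:

```javascript
// Plain-JS sketch of the slicing and grouping behind the filter bar charts.
// Field names (diameterKm, albedo) are hypothetical, not the MPC schema.
const asteroids = [
  { name: 'A', diameterKm: 1.2, albedo: 0.10 },
  { name: 'B', diameterKm: 3.7, albedo: 0.25 },
  { name: 'C', diameterKm: 3.9, albedo: 0.05 },
  { name: 'D', diameterKm: 9.4, albedo: 0.30 },
];

// "Dimension + filter": keep a value range, like dragging a range slider.
const inRange = (rows, key, lo, hi) =>
  rows.filter(r => r[key] >= lo && r[key] < hi);

// "Group": bin the filtered rows into counts for a bar chart (1 km bins here).
const binBy = (rows, key, width) =>
  rows.reduce((bins, r) => {
    const bin = Math.floor(r[key] / width) * width;
    bins[bin] = (bins[bin] || 0) + 1;
    return bins;
  }, {});

const filtered = inRange(asteroids, 'diameterKm', 1, 5); // A, B, C survive
const bars = binBy(filtered, 'diameterKm', 1);           // { 1: 1, 3: 2 }
console.log(bars);
```

In the actual tool, Crossfilter keeps all the dimensions coordinated, so moving one chart’s range slider re-bins every other chart and the bubble chart at once.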
Google I/O 2015 has just ended. There was a lot to take in, so let’s have a taste of it from the user experience perspective.
Lots of features were announced at Google I/O, and you may find that much of the focus this year was on user experience.
First, for the pre-show, there were planets and a whale flying across multiple screens surrounding the keynote room, showcasing the latest advancements in VR and animation. Below is the whale that someone recorded.
Next, here is the visualization of the Android mobile adoption.
This visualization is pretty impressive. It uses the screen asset efficiently, it is easy to understand, and it shows a lot of key information in an elegant way.
As Dave Burke pointed out, “the central theme of M is improving the core user experience of Android.” To name a few, Android M now provides simpler and more granular control for app permissions, better web experience, app links, and fingerprint support.
Before M, users had to grant all the permissions an app requested in order to install it. I had a lot of issues with this approach in the past.
For example, I did wish to install the Facebook app on my phone, and I did at one point. To my dismay, the Facebook app immediately scanned through all my contacts and suggested friends from my contact list. But hey, I do not wish to become Facebook friends with my banker, please. With the new app permissions, I should no longer need to disclose my contacts to Facebook just to have the app installed.
Google Now, beyond showing your travel and parking cards, can now drill into the app you’re using to derive your current context. Being able to retrieve context allows Google to provide not only more relevant results, but more appropriate actions as well. Users no longer need to switch to another app (such as the browser), look for some information, and get back to the previous app. Now they can simply go to Google Now and ask questions like “who is the author of this song,” and it will return the results they need without losing context. This is a huge time savings for users. Privacy is a potential concern for lots of people, but let’s defer that discussion to another time.
IoT is certainly the next big thing, and everyone is looking for a winner. Google is doing the same and introduced Project Brillo and Weave as its first steps in this direction. With almost everything now moving to the cloud and becoming easily accessible, IoT development is easier than ever before. This is the era of IoT, and it is part of user experience as well. How can you claim a great user experience when users cannot even pick and choose the devices they like?
Last but not least, Cardboard for education is such a great idea. I now wish I had been born in this era, able to experience geography, biology, and chemistry classes in this new interactive way.
To close this off, here I am, lying on a hammock and watching one of the completely packed sessions from the cafeteria.
Did you go to Google I/O? Feel free to comment below if you have anything you would like to share.
Why? The Oracle AppsLab (@theappslab) team wanted to find a user-friendly way to start a conversation with the ODTUG membership about this whole “Internet of Things” thing. Plus, we love Kscope and ODTUG!
This scavenger hunt is a fun way to start dipping your toes into an emerging technology that is going to be a major focus area for Oracle on the road ahead.
So join us, and you’ll be able to go back to work on Monday with some cool IoT talking points. Like the fact that you now know what “IoT” stands for.
Visit http://kscope15.com/scavenger and register now!
Stay tuned to hear more on how we built this and how we are leveraging mobile, IoT and wearable technologies for this fun activity.
Oh, and be sure to attend our Tuesday session at 2pm, “Smart Things All Around,” to hear Jeremy Ashley (@jrwashley), our GVP, and me present a deep-dive discussion and wax philosophical on what it all means.
But wait, there’s more. Make sure to stop by our AppsLab table to chat us up during the conference.
At the beginning of May, Anthony (@anthonyslai) and Raymond joined a large contingent of the OAUX team on a two-stop tour of Asia. The first stop was Singapore.
Here’s the dynamic duo in action, setting up our demos to show to a large group of Oracle partners.
After Singapore, the team headed to Beijing for more partner events, and as you can see, the turnout was phenomenal.
Misha has a full debrief of the Beijing leg of the tour as well.
Even though Noel (@noelportugal) was bummed he didn’t get to go to Singapore and China, his spirits brightened when Laurie (@lsptahoe) asked him to serve as a mentor for her Internet of Things (IoT) hackathon in Guadalajara.
We’ll get a brief respite, then come several conferences.
Next week, Thao (@thaobnguyen) and John will be attending Eyeo (@eyeofestival), as in the festival, not the Google conference (@googledevs). Anthony will be at that one, i.e. Google I/O, so look for his recap here next week.
And hey, I’ll be presenting with Aylin Uysal (@aylinuysal); our session is called Oracle HCM Cloud User Experiences: Designed for Work Styles across Devices, and it’s Tuesday, June 9 at 1 PM. So, come by if you’ll be at the show.
Consider yourself current.
Editor’s note: Here’s the first post from Osvaldo Villagrana (@vaini11a), one of our AppsLab Mexico team members. Enjoy.
For those who don’t know, this band is a wearable biometric identity device that lets you use your heart’s unique signature (a.k.a. electrocardiogram or ECG) to authenticate and validate your identity.
The main problem they want to solve is freeing users from remembering all the passwords, PINs, and security codes we use on a daily basis.
First off, the Discovery Kit includes the band, a Bluetooth dongle for Windows, and a USB cable for charging the band. The dongle is included because at the beginning the Nymi band could only be paired with Windows, but now it can be paired with OS X and Android as well.
The Nymi band’s material at first feels cheap and easy to bend and break, but it fits very well on my wrist. The band’s connection terminals at both ends of the cord are very exposed to water and dust; they say it is water-resistant but not waterproof.
The band is adjustable and can accommodate wrists up to 7.5” in circumference. A full charge takes approximately two hours from a wall outlet or computer, and the battery lasts three days.
Setting up the band requires some steps: the band must be enrolled and authenticated with your own ECG using the NCA (Nymi Companion App), available on Windows, OS X, and Android. I decided to use the Android app this time; I tried OS X and Windows, and it’s the same. Once the band is clasped on your wrist, it confirms the charge level and immediately enters broadcast mode.
I found this step a bit confusing, as there’s no feedback once the band is in broadcast mode, so you are not quite sure whether your band is discoverable. The funny thing is there’s no way to turn it off.
After the band is clasped, the Android app asks you to put your finger over the sensor on the band. It takes about a minute for the app to analyze and save your ECG info. After that, you’re ready to pair your Nymi with any NEA (Nymi-Enabled App, i.e. a third-party app). The band supports up to 7 app profiles (they say more will be supported in coming updates).
Anytime the clasp is opened, the band must be authenticated again, using the same NCA app as before. If you want to use a different NCA app (on OS X or Windows), the band must be reset and set up from scratch. This is not ideal.
An NEA must provision a unique key-value pair (a profile) that is saved in the band for future use; this happens only once per NEA. The NEA should store the provision returned by the band for future communication. On subsequent use, the NEA validates against the provisioned Nymi band; once validation succeeds, the NEA can assume an authenticated user. All of these steps must be implemented by the developer using the SDKs for the different platforms.
To complete the exercise, I wrote an Android app that implements the provisioning and validation flow and finally authenticates the user if they are close enough to the device, in this case a phone or tablet. After I got authenticated, my wife wore the band and tried to get authenticated, but authentication failed every time, as expected.
The SDK is good but needs some enhancements, though. Even at Nymi, they are having a hard time with problems in their own NEAs, like the unlock app for Mac OS X, which currently is not working; I have posted a couple of issues and bugs I found.
As a first attempt in this new authentication-automation niche, I like it, and I think it is good enough.
I see a lot of potential and possible use cases for this band in the enterprise. I would definitely use it, but what I would really love is a band that handles authentication, sport and motion tracking, notifications, and time in the same device. That’s probably too much for now, but I’m looking forward to seeing such a device soon.
I recently ventured down to Mexico to participate in an Internet of Things (IoT) hackathon organized by Laurie Pattison’s (@lsptahoe) Apps UX Innovation Events team with some of my fellow AppsLab members, Luis Galeana, Tony Orciuoli, and Osvaldo Villagrana.
Being the lone non-developer, I wasn’t sure how much I would be able to contribute—but I had done some research pertaining to our use case, so I felt I had at least that much to offer.
Our rather illustrious use case pertained to a perennial workplace problem: lines to use the bathroom. In MDC, there is a preponderance of men, so apparently waiting can be an issue. Some of my research found that elsewhere, where there are more women than men, lines to use the women’s bathroom in the workplace can be a serious annoyance.
Thus was born what was originally playfully titled Bathroom Management (BM), though we ended up with Presence, which would work more generally as a presence management system that could also handle conference room reservations, among other things.
I had never been part of a hackathon, but I definitely discovered the appeal. As a lover of deadlines, with my own experience of coding at night (definitely the best time for coding), it seemed just right for me. With free snacks and beverages, food carts for lunch and dinner, and a beautiful view from the 9th floor of the MDC office, it was an excellent setting.
I was able to help intermittently with thinking through some of the logic of our scheduling system, and with our pitch at the end, so I did feel I added something, even if the lion’s share of the work was done by the other three. Being a two-day hackathon, we had one late night, which I stuck around for, and ended up reading about and playing with Python, in the hopes it might come in handy. It didn’t, but there’s always next time.
Our presentation of Presence garnered some good laughs, which we didn’t quite expect, but at least everyone was engaged. We had a great demo showing our scheduling system for bathroom stalls, which included proximity sensors, sounds, and displays in the stall, and a web interface for scheduling, as well as IM, phone, and watch notifications when the stall you reserved becomes free.
We came in third, after two other solid entries, and took home the People’s Choice award, perhaps because our solution filled a real need in the office! I did learn a lot from the other winners, particularly on how we could have pitched it better to highlight the enterprise applicability. So again, there’s always next time.
All in all, I found it a highly favorable experience, and I hope I have another chance to do it in the future.