I’m finally harvesting my open browser tabs, sharing some of the cool stuff that has been patiently waiting to be shared with you, gentle reader.
Chrome Remote Desktop for Android
Chrome Remote Desktop has been quietly awesome for a few years, and Google recently extended it to Android. So now, I can troubleshoot my parents’ computer from the road. Yay.
Project Ara Gets Closer
Squee! I’m not really sure why, but I’m so geeked for Project Ara phones, i.e. Google’s upcoming modular smartphones. Design your phone from the sensors up to the screen size, or something like that.
Everything is DIY now, so why not?
Speaking of everything being DIY now, some clever bloke built a Raspberry Pi smartphone.
The UX Drum
Longtime friend of the ‘Lab, Floyd Teter (@fteter) wrote a post about the importance of UX. I concur.
And finally, because everyone gets excited about networking gear, especially this time of year, I give you my latest bit of nerd pr0n, the Linksys WRT1900AC. Short version, it’s a really fast wifi router, something every telecommuter should covet. Want the long version? TechCrunch did a review.
Speed comes at a price, namely $250, but I’m asking myself why pay for a big pipe when wifi has always been the choke point?
How about you? Care to share your open browser tab nuggets?
You know what to do.
On this team, we all carry Android devices, lots of them, including phones. Even Noel (@noelportugal) has finally been converted.
Everyone on the team, minus me, is an Android developer, and as they build for new devices like Google Glass and the upcoming Android Wear watches, the ability to project screen images becomes more essential.
Case in point, at a recent sales event, I was showing a Glass app and companion tablet app that Anthony (@anthonyslai) and Raymond built as a concept demo for Taleo interview evaluations.
Using Glass for the first time requires training, so I typically use the screencast option of the MyGlass app to see what the wearer sees. In this case, I was also showing an app on the tablet, so I couldn’t keep the screencast running.
Similarly, when I’m showing Glass or any Android apps to a room of people, projecting the screen images is a bit of an adventure.
Necessity being the mother of invention, Anthony decided to address our collective need for better Android projecting by modifying Android Projector, an open source Java project, to support projecting from multiple Android devices.
You can find his code on GitHub.
Android Projector requires adb, part of the Android SDK. If you have adb, run:
adb devices
And copy the device ID you want to project. Then from the directory where you downloaded Anthony’s version of Android Projector, run:
./android-projector <device ID>
Want to show two devices? Open another terminal session, copy the other device ID, rinse, repeat.
And voila, you can see both devices’ screens. If you’re giving a demo, you can now project your laptop’s screen to show all the screens.
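The open-a-terminal-per-device dance can also be scripted. Here’s a minimal Python sketch (my own illustration, not part of Anthony’s project) that parses the `adb devices` output and spawns a projector per attached device:

```python
import subprocess

def parse_adb_devices(output):
    """Pull device IDs out of `adb devices` output.

    The first line is the 'List of devices attached' header; each
    following line looks like '<device-id>\t<state>'. Only devices
    in the 'device' (i.e. ready) state are returned.
    """
    ids = []
    for line in output.strip().splitlines()[1:]:
        parts = line.split()
        if len(parts) == 2 and parts[1] == "device":
            ids.append(parts[0])
    return ids

def project_all():
    # Run this where adb and Anthony's android-projector script
    # are both on hand; it opens one projector per device.
    out = subprocess.run(["adb", "devices"],
                         capture_output=True, text=True).stdout
    for device_id in parse_adb_devices(out):
        subprocess.Popen(["./android-projector", device_id])
```

Same rinse-and-repeat, minus the extra terminal sessions.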
Pretty cool, eh? Find the comments.
Unfortunately for him, Ultan won’t be around to attend, so when the Java team came looking for ideas for this year’s Faire, he asked if we’d help. Noel (@noelportugal), a longtime maker as well as a past Maker Faire attendee, jumped at the chance to represent Java and Applications User Experience at this year’s installment.
But wait, there’s more. This year, on May 13 and 14, there will be a conference the week before the Maker Faire, aptly named, MakerCon. This two-day conference focuses on the business of making and will be hosted at the Oracle Conference Center.
Our fearless leader, Jeremy Ashley (@jrwashley), himself an avid maker and tinkerer, will be delivering a keynote on May 13 to kick off the event.
So, for us at least, next week is Maker Week.
Noel has been feverishly assembling a DIY activity for the Java Embedded Playground at the Maker Faire involving some Internet of Things things and a bunch of Raspis. He teased these pictures to give a taste.
Not to spoil the fun entirely, but what he’s building is a set of triggers and results (a la IFTTT), all automated. Visitors will choose an input (e.g. a sensor), a condition (e.g. keyword, hashtag) and an output (e.g. robot arm, Sphero) and watch the magic of IoT happen.
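As a sketch of that trigger-condition-result pattern (names and wiring hypothetical; Noel’s actual build surely differs), the rule engine idea boils down to something like this:

```python
# Minimal IFTTT-style rule: an input event passes through a condition
# check, and if it matches, the output action fires.
def make_rule(condition, action):
    def rule(event):
        return action(event) if condition(event) else None
    return rule

# Hypothetical wiring: a tweet with the right hashtag drives the robot arm.
has_hashtag = lambda event: "#makerfaire" in event.get("text", "")
wave_robot_arm = lambda event: "waving at " + event["user"]

rule = make_rule(has_hashtag, wave_robot_arm)
```

The visitor’s chosen sensor produces the event, and the chosen condition and output plug into the two slots.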
I’m excited to try this myself, especially the Sphero, which looks like outrageous fun, h/t to Tony for that one.
Update: Worth noting that longtime friend and honorary member of the ‘Lab, David Haimes (@dhaimes) will be joining us in the Maker Faire tent to help over the weekend. Come by and see us in all our IRL glory.
Remote Presence Devices, or RPDs, are finally becoming mainstream with products such as the Beam from Suitable Technologies. Today I kicked the tires of one (virtually, of course) thanks to friend of the ‘Lab Dan Kildahl. I toured the newly renovated marketing offices at Oracle HQ. My first impression was really good. Around family and friends, I am known as the clumsy game player. Yeah, I’m the one that gets constantly stuck against walls during first-person video games. But with the Beam interface I was able to easily navigate around the floor. I didn’t hit any walls, and that is good news.
I asked Dan how it was received around the office. He mentioned mixed opinions, which is completely understandable. All these new technologies are for sure changing social norms (see Google Glass). But as a technologist I just can’t help but feel excited.
What are your thoughts?
If you haven’t talked to me IRL in the past 10 months, then I haven’t had the chance to pester you about the wonders of BLE and micro-location. My love affair with BLE (Bluetooth Low Energy) beacons became clear when I heard at WWDC 2013 that Apple was implementing BLE beacon detection in its CoreLocation framework. Apple showed how a small BLE beacon sending a constant signal (UUID + Major + Minor *) at a given interval could help with what is now known as micro-location.
At the time, I just happened to be experimenting with wifi and bluetooth RSSI to accomplish similar results. I was prototyping a device that sniffed MAC addresses from surrounding devices and triggered certain interactions based on our enterprise software (CRM, HCM, etc.). You can find more on this topic in the white paper “How the Internet of Things Will Change the User Experience Status Quo” (sorry, but it’s not free) that I presented last year at the FiCloud conference.
The BLE beacon, or iBeacon, proved to be a better solution after all, given its user opt-in nature and low power consumption. Since then, I have been prototyping different mobile apps using this technology. The latest of these is a Google Glass + iBeacon (GitHub link: GlassBeacon) example. I’m claiming to be the first to do this implementation, since the ability to integrate BLE on Glass just became available on April 15, 2014 :)
Stay tuned for more BLE beacon goodness. We will be showing more enterprise related use cases with this technology in the future.
*UUID: a unique id to distinguish your beacons. Major: used to group related sets of beacons. Minor: used to identify a beacon within a group
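To illustrate that UUID/Major/Minor hierarchy (a simplified model for this post, not Apple’s CoreLocation API), matching a detected beacon against a region might look like:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class BeaconRegion:
    uuid: str                    # identifies your set of beacons
    major: Optional[int] = None  # optional: a group within the set
    minor: Optional[int] = None  # optional: one beacon in the group

    def matches(self, uuid, major, minor):
        """Leaving major/minor as None matches any value, mirroring
        how an iBeacon region widens or narrows the match."""
        if self.uuid != uuid:
            return False
        if self.major is not None and self.major != major:
            return False
        if self.minor is not None and self.minor != minor:
            return False
        return True
```

A region with only a UUID catches every beacon you own; adding the major narrows it to one group, and adding the minor pins it to a single beacon.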
It’s been a busy month around these parts.
Noel (@noelportugal) and I went to the Netherlands, specifically Amsterdam, Utrecht, and Nieuwegein, to visit AMIS and show some of the cool stuff Applications UX has been doing. By all accounts the event was a massive success, with something like 450 people visiting AMIS during the day.
Here’s some press coverage in Dutch if you’re so inclined.
AUX had about ten different stations showing various demos, including the newly-minted Release 8 Simplified UI for HCM and Sales Cloud, Mobilytics, Oracle Voice, UX Design Patterns, UX Direct, the almighty eye-tracker and our Glass, Pebble and robot stuff, including the hot new robot arm, which we’re now controlling remotely, more on that to come.
As if that weren’t enough, there was also a Secret Chamber that required a non-disclosure for entrance with cutting edge stuff.
I spent my day locked away in the Secret Chamber, while Noel and Patrick (@patch72) handled the crush of people eager to get their hands on Google Glass. The beginning of the event was exclusively for students, and at one point, Noel and Patrick were swarmed by about 60-70 people trying to get a turn with Glass.
Sidebar, Glass elicited some very curious reactions around the NL. People seemed genuinely interested, a different reaction than you get here in the States, where people can be outwardly aggressive about Glass infringing on their privacy. Noel wore his most of the time, and several people stopped him to ask about it. Several times the exchange went like this:
Is that Google Glass?
Is it real?
Strange follow-up question, maybe there’s a market for bogus tech there.
Anyway, the event was awesome, and everyone at AMIS was so friendly and accommodating and generous to us. Everything about the trip was fantastic.
Speaking of trips, Noel and I will be at COLLABORATE 2014 in Las Vegas, April 7-11, as will other members of AUX. Check out all of the AUX activities over at VoX. Noel and I will be working the booth, so stop by and say hello if you’re attending the conference.
That’s my month so far.
Find the comments.
After years of hounding from me, Noel (@noelportugal) made the jump to Android usage and development about a year ago. He started with a Nexus 7, the first generation one, but it wasn’t until he got a phone, the Moto X, that the transformation was complete.
So now, Noel is a mixed-ecosystem guy.
We’ve had Chromecasts since they were announced last year (they’re awesome), and with the recent release of the Google Cast SDK, Noel has been kicking the tires and experimenting with the $35 streaming dongle.
All his tinkering led to his very first app in the Play Store, Newscaster. Newscaster uses voice search to find and read news headlines on your TV via Chromecast. That’s it.
It’s not very functional, more just a proof of concept, but it’s noteworthy, given how long Noel has been an iOS guy. Plus, now that he has some experience with the Google Cast SDK, Noel’s creative juices will start to flow. I hope to see some interesting Chromecast features soon. Stay tuned.
Every time I see him, he manages to work that feature into casual conversation at least once. Unfortunate naming aside, Tony knows his users; his app has a rating of 4.9 stars from 463 reviews, not too shabby.
Anthony’s (@anthonyslai) Moovy has been in the Play Store for nearly four years, since it was called the Android Market. Originally called Happy Feet, this app won the Move Your App! Developer Challenge at Health 2.0 in 2010.
So, there you have it, the collected Android apps of our humble team, at least the ones I can talk about on the intertubes. Stay tuned and maybe someday you’ll read about the others.
Find the comments.
Update: Turns out I missed our most prolific app developer, Raymond. My bad. Check out his three apps.
This one has a story. Raymond’s daughter was asking Santa for a Magic 8 Ball for Christmas, and she got one. Raymond and his son decided to make an app for her.
His son created the 3D ball with its center carved out, added the light and shade effects, and created the background graphics. Raymond created the database to hold standard and user-entered answers, and added the animation to roll the ball.
Then Raymond’s son did the I18N to include Chinese and Japanese, knowing that Chinese and Japanese speakers don’t tend to download apps from the English version of the Play Store.
Raymond liked building Android apps so he went ahead and built two more, because, why not?
He and a group of Taleo guys went out for lunch every day, and they got tired of picking a place. So, he made an app for that, i.e. for making choices.
A colleague at Taleo complained to Raymond about always losing receipts when traveling. So, he built an expense and receipts app to help people record their expenses and receipts.
So yeah, Raymond is the guy who builds apps to make you stop complaining about first world problems. Helpful dude.
Editor’s note: Another cross-post from VoX, this one from Julian Orr. User experience works best when you understand your users. So, help us understand you and your mobile strategy by completing this painless and fun questionnaire.
What is your perspective on enterprise mobility? Tell us!
By Julian Orr, Oracle Applications User Experience
Is there a certain device capability, such as the ability to capture mobile signatures or remotely wipe a device, that is so important to your mobile workflow that it has influenced your enterprise mobility strategy?
When it comes to making decisions about your organization’s enterprise mobility strategy, there are a few inescapable themes:
- Allowing people to use their own devices vs. having to use company-supplied devices
- Using browser-based vs. native applications
- Optimizing your apps for smart phones vs. tablets
- Whether to include or exclude a particular mobile platform
That businesses are committing resources to create and execute a mobile strategy is a given. The permutations of approaches to mobile strategies are endless, and the reasons behind them are varied and nuanced.
These approaches and their justifications are well understood from a generic enterprise perspective, but what are the common themes of an Oracle customer’s mobile strategy? How does it vary from that of the marketplace as a whole?
If one thing is clear, it is that Oracle customers want to do big things with mobility.
At Oracle, we are committed to using customer feedback to continually improve our products and services, and to help you realize exceptional business outcomes.
As such, Oracle has created a survey to capture and understand enterprise mobility from an incredibly important perspective, that of an Oracle customer.
We want to know what our customers are doing now, what you plan to do in the near future, and most importantly, what are the key influences to your strategy — employee engagement, security, cost, or something we have yet to hear about.
Please take our enterprise mobility survey. The survey will remain open until March 28, and will take about 15 minutes to complete. The survey also includes a follow-up option to become more involved in Oracle applications research.
To learn more about the Applications User Experience team, please visit the UsableApps web site.
About a month ago, I walked off a plane into the terminal at SJC. As I hiked that long walk to the exit, I heard a familiar and annoying sound, the Emergency Alert System sound.
The sound was muffled, and it took me a few more steps to realize it was my own phone bleating in my pocket. Somewhat embarrassed, I took it out and quickly dismissed the alert, which happened to be an AMBER Alert.
I dismissed the notification so quickly that I didn’t get much information from the actual alert itself. I walked a bit farther before stopping to see if I could recall the notification and actually read it, nope.
I did get another chance about an hour later, as I stood at the front desk of a hotel, but again, the bleating of the alert, coupled with the social awkwardness, made it nigh impossible for me to read the alert. I tried again to find it, without success.
And so it goes with every emergency alert I’ve received on my phone. They have a knack for coming at inopportune moments, like the time one came through in the middle of the night while we were all asleep. Good times.
So, we’ve got a major usability problem here. On the one hand, smartphones are an enormous boon for emergency officials who need to notify the general public. However, on the phone side, how do you make the alerts more usable without forcing users to resort to turning them off completely?
At the very least, the alert should be retrievable so I can review it in peace after turning off that awful noise.
Interesting problem, thoughts?
Editor’s note: Here’s a cross-post from VoX by Friend of the ‘Lab, Kathy Miedema, about a Raspberry Pi Hackday Noel (@noelportugal) organized and ran a couple weeks ago. The basic idea was to get developers up and running on the Pi quickly and have some fun.
New Oracle developers get a taste of Raspberry Pi
By Kathy Miedema, Oracle Applications User Experience
There is a team within the Oracle Applications User Experience (UX) group that basically plays with interesting technology. We call them the AppsLab (@theappslab). That technology may include fuzzy ears (@ultan) that interact with your brain waves, robot arms, or Google Glass.
Recently, it included Raspberry Pi. And a day of hacking.
My team — the Communications & Outreach arm of the Applications UX group — sometimes works closely with this team. My boss has her own set of fuzzy ears. I’ve tried out the robot arms (I totally suck at moving them). And recently, I was introduced to Raspberry Pi.
Now, I’m a word person – if this small computer had been named anything else, my eyes might have glazed over. But the chance to tell folks about the creative ways that Oracle investigates and explores technology that can evolve the Oracle user experience … well, I’m much better at doing that. Especially if I’ve got a visual place from which to start the story.
Raspberry Pi, above, is actually an inexpensive computer that was originally made for kids. It was intended to give kids a device that would help them learn how to program computers. (Neat story there from the U.K. creators.)
Noel Portugal (@noelportugal), the developer who led the January training and hackday, said the credit-card-sized computer can do anything that a Linux computer can do. It’s easy to hook up and, because it costs about $35, easy to replace. So it’s a perfect starting point for kids, and it has an Oracle connection: Oracle’s Java evangelists worked with the Raspberry Pi creators directly to make sure Java runs natively on the device.
Noel’s one-day event included about 15 developers who also work for the Oracle Applications User Experience team. Many were from Oracle’s Mexico Development Center; others came from the Denver area or the Northwest. AppsLab talking head Jake Kuramoto said the idea was to provide a shortcut to the technology and tap into Noel’s experience with it, then get everyone up and running on it. The day was a way to investigate something new in a collaborative session.
This hackathon took place at Oracle headquarters in Redwood Shores, inside the Oracle usability labs. By the end of the day, I was hearing random, sometimes crazy noises as network hook-ups took hold and programming began.
Our developers were using the Raspberry Pi with their laptops and smart phones to create sounds, issue commands, and send signals through various devices. Noel said the maker community uses Raspberry Pi to control robotics, control a server, switch lights on and off, and connect sensors, among other things.
Here’s a look at our developers at work.
OK, so some of this stuff was over my head. But it was fun to watch really focused, talented people do something they thought was fun. The creative bursts that come through while investigating and exploring are motivational. Technology, in any form, is fascinating. When applied to everyday objects in ways that evolve the user experience – it’s like watching science fiction unfold. But on the Oracle Applications User Experience team, it’s real.
The Applications UX team’s mission is to design and build “cool stuff,” as Jake puts it. Team members look at all kinds of technologies, because we know through research that this is what our users are also doing.
Stay tuned to VoX to learn more about the new, interesting, and creative ways we are evolving the user experience of enterprise software with similar methods of exploration. Be the first to see what’s coming!
Shortly after they finished that, they began to complain that the OWI 535 wasn’t good enough. They conspired to convince me that we needed the Lynxmotion AL5A because it was “better.”
Everyone loves the robot arm demo, and it’s quite memorable. So, I made them a deal; get the Lynxmotion, but make it controllable by Leap over the intertubes.
A couple weeks ago, Noel took the box of parts that is the unassembled AL5A and made it into a robot arm, no easy task. There are 21 steps in the assembly instructions for the base alone.
This week, he, Anthony and Raymond have attached the arm to a Raspberry Pi, which controls it, and they’re working on all the other pieces, including the Leap code, written in Python, and the other bits.
If all goes well, I should be able to open a browser to show a Dropcam streaming video of the AL5A. I’ll be able to attach my Leap Motion, run a script, and then control that arm remotely by waving my hand over the Leap. Pretty cool stuff.
I had high hopes that the old and busted robot arm would get a post describing the technical workings that went into bringing it to life, and I’ve been promised a post on the details of the new hotness as well. We’ll see. I’m not holding my breath.
Even so, there’s a lot of ingenuity that should be documented. Noel says he thinks this might be the first documented use of a Raspberry Pi to control this particular arm, so there’s that.
So, if you see any of us at a conference this year, ask for a demo of the new hotness.
Find the comments.
So, it’s all David (@dhaimes) all week, or something, because here comes another post about him.
This time, it’s his walking meetings that caught my eye, quite literally.
I was sitting in the lobby of the Oracle Convention Center in January during the IOUC Summit, looking out the window at the 70-degree day, and David and another dude walk by on the trail across the street, deep in conversation. At the time I thought, that looks like a 1-1 meeting, given that David was gesticulating wildly and doing all the talking.
Now I’m pondering the Bad Ideas that would accompany this one. Related, someone has graciously collected all the Good Idea/Bad Idea segments into one epic video.
In other news, we’ve grown again, adding a dude named Bill. If you’re keeping score at home, we’re now a tiny team of eight.
Sometimes, I get to share something awesome.
A post about how Oracle Social Network can be used by Oracle ERP Cloud users happens to be something awesome. Let me explain.
Despite assumptions to the contrary, accounting is a very collaborative exercise in many enterprises. Ultan (@ultan) does a great job explaining why in his post “How to Chat Up an Accountant Safely: Social Networking in the Finance Department.”
If you read that post, you’ll see that honorary ‘Lab member, David Haimes (@dhaimes) is the nexus for this OSN+ERP Cloud feature. Check out his post, “Socializing the Finance Department” on how accountants can use OSN to streamline period close for more background.
Rather than rewrite what David and Ultan have already said, I’ll provide some awesome, at least IMO, backstory.
Back in 2012, when I was a WebCenter evangelist, David and I chatted about a brilliant idea he had to integrate OSN into Finance. WebCenter included OSN at the time, not sure if that is still true or not. If you read the above links, you’ll know what David’s idea was. If not, you should read them, or feel free to proceed with incomplete context.
Deep backstory, I’ve known David for more than a decade and worked with him for several years on E-Business Suite Financials way back in the day. Anthony (@anthonyslai) was with us then as well.
Anyway, I liked David’s idea and coordinated the resources he needed. I didn’t do much, just connected him with the right people within OSN and watched the magic happen.
As Ultan mentions, this is a user experience win. OSN does exactly what the users need, nothing more, in just the right context, i.e. during period close, ERP Cloud users can use OSN conversations to communicate, exchange information and get work done, all in a traceable, easy to consume stream of relevant information.
I like telling stories, and this is a success story, spawned from a phone call from David that I took at the San Francisco Airport Marriott while attending a product management training that was a complete waste of my time.
Maybe someday I’ll get to tell this story to a user. That’s always a hoot for me, humble beginnings and all.
So, here are some conceptual screenshots of what this looks like.
And in conclusion, I give you Hannibal and a victory cigar.
Find the comments.
On March 18, AMIS will be hosting an OAUX (or Oracle Applications User Experience if you’re not into the whole brevity thing) Expo. The purpose of an expo, the brainchild of Misha (@mishavaughan), is to provide a showcase for all the work AUX has been cooking up in one place.
What work is that, you say?
Stuff like Simplified UI for the Sales Cloud and HCM Cloud, cutting-edge technology like Voice and even the research projects we’ve been doing like Glass, the Leap-controlled robot arms, Pebble and geo/wifi-fencing and other stuff still under wraps.
Expos have been a big hit so far. Don’t believe me, check out what WIPRO had to say about the one we had at OpenWorld last year.
I, for one, am really looking forward to this particular expo, not only because this will be my first trip to the Netherlands, but because I’m stoked to meet the good people at AMIS and swap insights about user experience and development.
Oh yeah, and I’m hoping to run into Patrick Barel (@patch72) on his home turf. I’m a giant baby about international travel, so it’ll be nice to have a local point me to the must-see places in Nieuwegein, Utrecht and Amsterdam.
Hackathons are a great way to stay up to date on the latest technologies, as well as to keep your coding chops fresh. At the beginning of this month, Anthony Lai, Raymond Xie, Mark Vilrokx (honorary AppsLab member) and I participated in the AT&T Developer Summit Hackathon. The event was held at the Palms Casino and Resort in Las Vegas. The Internet of Things played an important role. New technologies that allow us to pinpoint our indoor location, aka micro-location, were all the rage. I am a huge advocate for the technology that powers this: Bluetooth Low Energy, or BLE. Both winning teams used BLE beacons in their projects.
This was the second year that I attended, and this time I came back with (almost) the whole team. We had a blast. During the hackathon there were two main tracks, a Wearables track and a Mobile track. A lot of sponsors provided APIs for their products, and between AT&T and the sponsors, there were ample devices and gizmos available for us to hack with.
I won’t go into much detail about our project, but you will hear more about it later. Our team decided to participate in the Wearables track with an emphasis on public safety. We decided to use AT&T’s M2X platform, AT&T’s cloud solution for the Internet of Things, branded as “Machine to Everything.” It includes a basic REST interface that lets devices input and output data. We also used a Freescale FRDM-KL46Z micro-controller with ARM libraries provided by mbed. And if that weren’t enough, our project included our beloved Google Glass and a couple of awesome Philips Hue lights for visual notifications.
For some reason, Noel (@noelportugal) posted a picture of his IoT gadget inventory.
Click to embiggen.
I’m not entirely sure why I liked the hockey puck so much.
It’s a beautiful little piece of technology, absolutely. Some users report that it pays for itself in energy savings rather quickly, sweet. It learns your home/away behaviors and sets itself accordingly, cool. You can control it from anywhere via the mobile apps and web app, great.
I finally settled on the fact that it’s a disruptive innovation in an area where you don’t expect innovation. Companies have been incrementally improving thermostats, but the overall experience has not been rethought. They just keep making them bigger, with more features and harder to use.
Anyway, I did my diligence first to make sure my system was compatible with Nest; they do a really nice job supporting the pre-purchase and installation phases. Installation was easy enough, and the initial setup went quickly. The entire process from unboxing to working thermostat probably took three hours, and I went very slowly.
The Nest itself has a very clever interface with minimal interactions, really only two, pressing the hockey puck into its housing, which makes a satisfying thud, and rotating the dial, which makes a click. Despite having limited interface capabilities, I found entering a wifi password much less annoying on the Nest than it is on a smart TV with a standard remote.
In addition to controlling the heat and air conditioning, the Nest has a motion sensor, so it will light up when you pass it, and presumably, it will set itself to Away after some interval without motion.
The Nest includes a subtle game mechanic, the leaf, which appears when, according to the Nest, you’re saving energy. My brain got a nice shot of dopamine the first time I saw that leaf, and now, I’m compelled to earn it daily and disappointed if I don’t. Nice touch.
The mobile and web apps are where most users will spend the majority of their time interacting with Nest. Similar to the hockey puck’s OS, these apps are simple and not overloaded with features. Here’s the Android app:
Aside from one issue, I’ve been very happy with the Nest so far. Less than a day after installing it, I wasn’t able to control the Nest from any of its accompanying apps because the wifi receiver was off to conserve battery. This struck me as odd, given the device is directly connected to my home’s electricity.
I did some digging and went down a wrong path, but ultimately, all I had to do was upgrade my router’s firmware. Nest’s customer support was quite helpful and responsive.
Other nice features, Nest sends a monthly energy report, which gets more useful over time, and they recently bought a company called MyEnergy that tracks utility usage and offers energy saving tips.
Overall, the Nest provides an excellent experience, well thought out from pre-purchase all the way through continued usage. It’ll really rock if it can pay for itself in 12-18 months.
The other home automation gadget I got for Christmas was a Roomba 770. I’ve always been skeptical about the ability of these robots, but interested in the technology and the potential.
As with the Nest, I did my diligence, and it seems like most of the negative reviews center around people who expected the robot to replace a traditional vacuum cleaner. Luckily, I didn’t have that expectation; I just want it clean enough so I can walk around in bare feet without collecting miscellaneous debris on my feet.
The Roomba does this quite well, and it’s amusing to watch it navigate a room. I keep trying to see patterns, but I can’t discern any. It’s really a marvel of hardware and software technology.
It does take a while to finish a room, and it’s a bit loud. Neither really matters to me though. I’ve found the best time to run it is when we’re away. Otherwise, we bump into each other a lot.
I know less about the Roomba than the Nest, given it requires virtually no setup and configuration. The Roomba does have a long list of features, but I haven’t been curious enough to look at them all yet. So far, it does exactly what I want, and that’s perfect.
So, did you get home automation gadgets for Christmas, or semi-related, see anything at CES that interested you?
Find the comments.
Lots of news coming out of Applications UX lately, so thought I’d share it here.
Over at Misha’s (@mishavaughan) Voice of User Experience blog, you can read about Simplified UI for Oracle Human Capital Management Cloud and Simplified UI for Oracle Sales Cloud. Both versions went live in September when Oracle Cloud Applications, Release 7 was released.
Longtime followers of Jeremy’s AUX team may know Simplified UI as Fuse, which I test-drove for giggles about a year ago on some Android devices and other gadgets. Just FYI, I will never refer to it as that again, for it is the name that cannot be spoken.
Also at VoX, Kathy has a Q&A with one of the partners who attended the OAUX Expo at OpenWorld. Incidentally, Anthony (@anthonyslai), Noel (@noelportugal) and I were there, showing the Leap Motion-controlled robot arms, the Google Glass Sales Cloud concept app Anthony built, and another special project that I am not at liberty to divulge.
Finally, Ultan (@ultan) has a brief roundup of an Oracle partner event held last week in Manchester, specifically focusing on Noel’s demo of Google Glass’ look and translate feature. Noel’s been busy, presenting at UKOUG Tech 13 and trotting Glass around the UK. He has posted some pictures of his travels to our Facebook page.
Maybe he’ll post them to our G+ page so everyone can see, maybe not, we’ll see.
I’m cleaning up all the open tabs for the holidays, so here are some nuggets I found that may or may not be interesting.
Hinkmond Wong of the Java Embedded Technology team did a fun Thanksgiving project, a Turkey that tweets as it cooks.
It’s time for the Internet of Things (IoT) Thanksgiving Special. This time, we are going to work on a special Do-It-Yourself project: an Internet of Things temperature probe that connects your Turkey Day turkey to the Internet. By writing a Thanksgiving Day Java Embedded app for your Raspberry Pi, you can send out tweets as the turkey cooks in your oven.
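To make the idea concrete, here’s a minimal sketch of how a project like this hangs together. It’s not Hinkmond’s actual code (his is Java Embedded); this assumes a DS18B20 1-wire probe, which the Raspberry Pi kernel exposes as a `w1_slave` file under `/sys/bus/w1/devices/`, and it stops short of actually posting to Twitter.

```python
# Hypothetical sketch of a tweeting turkey probe (not Hinkmond Wong's code).
# Assumes a DS18B20 1-wire temperature sensor; the Pi kernel exposes its
# readings as a small text file under /sys/bus/w1/devices/<sensor-id>/.

W1_SLAVE_PATH = "/sys/bus/w1/devices/28-000005e2fdc3/w1_slave"  # example ID


def parse_temp_c(raw: str) -> float:
    """Extract degrees Celsius from the kernel's w1_slave output.

    The second line of the file ends with 't=<millidegrees>', e.g. 't=85250'.
    """
    millideg = int(raw.strip().splitlines()[1].rsplit("t=", 1)[1])
    return millideg / 1000.0


def status_tweet(temp_c: float, done_c: float = 74.0) -> str:
    """Compose a status update; 74 C (165 F) is the USDA-safe poultry temp."""
    temp_f = temp_c * 9 / 5 + 32
    state = "Turkey is done!" if temp_c >= done_c else "Still cooking..."
    return f"{state} Current temp: {temp_f:.1f}F ({temp_c:.1f}C) #IoT"


if __name__ == "__main__":
    # On a real Pi you'd read W1_SLAVE_PATH and post via a Twitter client
    # library; here we just parse a sample sensor reading and print it.
    sample = ("3f 01 4b 46 7f ff 01 10 2d : crc=2d YES\n"
              "3f 01 4b 46 7f ff 01 10 2d t=19937\n")
    print(status_tweet(parse_temp_c(sample)))
```

The fun part is that the hardware side is just file reads; all the cleverness lives in a timer loop and whatever tweeting library you bolt on.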
MIT’s Dynamic Shape Display
On Telepresence Robots
For reasons I can’t explain, I love robots. So, of course, Ars’ in-depth review of Suitable Technologies’ Beam telepresence robot caught my attention. We toyed with a similar idea, using the HEXBUG Spider XL, but it required a lot of hacking and other parts, namely a smartphone. I think the guys were just humoring me.
And Finally, Helvetica: The Perfume
I love this, even at $62 for 2 ounces of distilled water, h/t Kottke.
Since Google announced the Chromecast earlier this year, I’ve been stoked to see how it developed.
The little device has a ton of potential, and even though Google has been a little slow to push its adoption, even slowing down the efforts of some curious developers, they do seem committed to the device.
After four months using the Chromecast every day, I still love it.
There’s a lot to like about the Chromecast, even if you set aside the price. One thing I’ve noticed over four months is there’s a ton of content available in the Google Play Store. The Play Movies & TV app supports the Chromecast, obviously, and before the Chromecast, I never really considered the Play Store as an alternative to Amazon or iTunes for movies and TV.
Unsurprisingly, it’s very easy to buy content and cast it, all from the device. I’ve never used an iOS device with AirPlay to do this with an Apple TV, but my guess is that it’s similarly easy. I don’t know whether Amazon has anything like this for the Kindle Fire, but I have to assume that if they don’t, they soon will.
If you don’t subscribe to Netflix and/or HuluPlus, you’re probably not in the market for a Chromecast. But if you do, it offers a great way to get content onto your TV, smart or otherwise. This is a plus, given the relative size of smart TV ecosystems when compared to Android and iOS.
Maybe it’s just me, but I’d much rather cast Netflix to my TV than use an app built for a smart TV OS.
Speaking of the TV, the Chromecast turns on the TV to the appropriate HDMI input when you cast, which is nice. You can also control volume from the player, if the app supports it, but alas, to turn off the TV, you’ll have to find the remote.
A final unexpected plus I’ve noticed is that having my device at hand while I’m watching and controlling programming means more second screen activity. So, I find myself looking up stuff on IMDB that I would have tabled for never in the past. On the downside, I end up reading email too.
Not many apps have adopted Chromecast yet, which seems to be a combination of Google’s desire to keep development tight and perhaps a wait-and-see approach from content owners.
Right now, only a handful of apps support it, Netflix, HuluPlus, Pandora, Play Movies & TV, YouTube, HBO GO and Play Music. That’s a lot of content, but depending on what you watch, wider support probably matters.
Update: Today, Google announced a slew of new apps that support Chromecast.
Of course, one major feature of the Chromecast is its ability to cast from any computer with Chrome and the Google Cast extension. Although the majority of my experience has been casting from a device, I have done so from Chrome, with mixed results. You can also cast local content from the computer.
Like anything over a network, speed matters. The faster your wifi, the better the casting experience.
Turns out that router placement relative to the device that’s casting matters too.
I’ve found that if my device has low (one bar) connectivity to my router, the device often loses connectivity to the Chromecast. This results in an uncontrollable stream, i.e. the player on the device disappears, and I can no longer pause or stop playback. A bit of a bummer, but not a fatal flaw.
It turns out that not all players are created equal. Most of them do offer the ability to pause from the lock screen on Android, which is very nice, but from a consistency perspective, each player implements casting differently.
For example, Netflix offers a stop button from their player, which is full screen, while HuluPlus does not, because HuluPlus doesn’t seem to offer stop at all in their app.
For comparison, here are the players for Play Movies & TV and YouTube.
So, yeah, each one is different, even those produced by Google for its own apps. Minor complaints, and to be expected.
Google seems poised to expand support for the Chromecast, which is great news. Rumors suggest that the media center app Plex will soon add support for it. Personally, I’m looking forward to casting from the native Android Gallery so I can cast pictures and video of my daughter.
Anyway, those are all my thoughts on the Chromecast after using it for four months. At $35 apiece, it was a no-brainer to buy one for each of my TVs. I may even buy it as a holiday gift.
Find the comments.