It’s been a busy month around these parts.
Noel (@noelportugal) and I went to the Netherlands, specifically Amsterdam, Utrecht, and Nieuwegein, to visit AMIS and show some of the cool stuff Applications UX has been doing. By all accounts the event was a massive success, with something like 450 people visiting AMIS during the day.
Here’s some press coverage in Dutch if you’re so inclined.
AUX had about ten different stations showing various demos, including the newly-minted Release 8 Simplified UI for HCM and Sales Cloud, Mobilytics, Oracle Voice, UX Design Patterns, UX Direct, the almighty eye-tracker and our Glass, Pebble and robot stuff, including the hot new robot arm, which we’re now controlling remotely, more on that to come.
As if that weren’t enough, there was also a Secret Chamber that required a non-disclosure for entrance with cutting edge stuff.
I spent my day locked away in the Secret Chamber, while Noel and Patrick (@patch72) handled the crush of people eager to get their hands on Google Glass. The beginning of the event was exclusively for students, and at one point, Noel and Patrick were swarmed by about 60-70 people trying to get a turn with Glass.
Sidebar, Glass elicited some very curious reactions in the Netherlands. People seemed genuinely interested, a different reaction than you get here in the States, where people can be outwardly aggressive about Glass infringing on their privacy. Noel wore his most of the time, and several people stopped him to ask about them. Several times the exchange went like this:
Is that Google Glass?
Is it real?
Strange follow-up question, maybe there’s a market for bogus tech there.
Anyway, the event was awesome, and everyone at AMIS was so friendly and accommodating and generous to us. Everything about the trip was fantastic.
Speaking of trips, Noel and I will be at COLLABORATE 2014 in Las Vegas, April 7-11, as will other members of AUX. Check out all of the AUX activities over at VoX. Noel and I will be working the booth, so stop by and say hello if you’re attending the conference.
That’s my month so far.
Find the comments.
After years of hounding from me, Noel (@noelportugal) made the jump to Android usage and development about a year ago. He started with a Nexus 7, the first generation one, but it wasn’t until he got a phone, the Moto X, that the transformation was complete.
So now, Noel is a mixed-ecosystem guy.
We’ve had Chromecasts since they were announced last year (they’re awesome), and with the recent release of the Google Cast SDK, Noel has been kicking the tires and experimenting with the $35 streaming dongle.
All his tinkering led to his very first app in the Play Store, Newscaster. Newscaster uses voice search to find and read news headlines on your TV via Chromecast. That’s it.
It’s not very functional, more just a proof of concept, but it’s noteworthy, given how long Noel has been an iOS guy. Plus, now that he has some experience with the Google Cast SDK, Noel’s creative juices will start to flow. I hope to see some interesting Chromecast features soon. Stay tuned.
Every time I see him, he manages to work that feature into casual conversation at least once. Unfortunate naming aside, Tony knows his users; his app has a rating of 4.9 stars from 463 reviews, not too shabby.
Anthony’s (@anthonyslai) Moovy has been in the Play Store for nearly four years, since it was called the Android Market. Originally called Happy Feet, this app won the Move Your App! Developer Challenge at Health 2.0 in 2010.
So, there you have it, the collected Android apps of our humble team, at least the ones I can talk about on the intertubes. Stay tuned and maybe someday you’ll read about the others.
Find the comments.
Update: Turns out I missed our most prolific app developer, Raymond. My bad. Check out his three apps.
This one has a story. Raymond’s daughter was asking Santa for a Magic 8 Ball for Christmas, and she got one. Raymond and his son decided to make an app for her.
His son created the 3D ball with its center carved out, added the light and shade effects, and created the background graphics. Raymond created the database to hold standard answers and user-entered answers, and added the animation to roll the ball.
Then Raymond’s son did the I18N to include Chinese and Japanese, knowing that Chinese and Japanese speakers do not download apps from the English version of the Play Store.
Raymond liked building Android apps so he went ahead and built two more, because, why not?
He and a group of Taleo guys went out for lunch every day, and they got tired of picking a place. So, he made an app for that, i.e. for making choices.
A colleague at Taleo complained to Raymond about always losing receipts when traveling. So he built an expense and receipts app to help people record their expenses and receipts.
So yeah, Raymond is the guy who builds apps to make you stop complaining about first world problems. Helpful dude.
Editor’s note: Another cross-post from VoX, this one from Julian Orr. User experience works best when you understand your users. So, help us understand you and your mobile strategy by completing this painless and fun questionnaire.
What is your perspective on enterprise mobility? Tell us!
By Julian Orr, Oracle Applications User Experience
Is there a certain device capability, such as the ability to capture mobile signatures or remotely wipe a device, that is so important to your mobile workflow that it has influenced your enterprise mobility strategy?
When it comes to making decisions about your organization’s enterprise mobility strategy, there are a few inescapable themes:
- Allowing people to use their own devices vs. having to use company-supplied devices
- Using browser-based vs. native applications
- Optimizing your apps for smart phones vs. tablets
- Whether to include or exclude a particular mobile platform
That businesses are committing resources to create and execute a mobile strategy is a given. The permutations of approaches to mobile strategies are endless, and the reasons behind them are varied and nuanced.
These approaches and their justifications are well understood from a generic enterprise perspective, but what are the common themes of an Oracle customer’s mobile strategy? How does it vary from that of the marketplace as a whole?
If one thing is clear, it is that Oracle customers want to do big things with mobility.
At Oracle, we are committed to using customer feedback to continually improve our products and services, and to help you realize exceptional business outcomes.
As such, Oracle has created a survey to capture and understand enterprise mobility from an incredibly important perspective, that of an Oracle customer.
We want to know what our customers are doing now, what you plan to do in the near future, and most importantly, what the key influences on your strategy are: employee engagement, security, cost, or something we have yet to hear about.
Please take our enterprise mobility survey. The survey will remain open until March 28, and will take about 15 minutes to complete. The survey also includes a follow-up option to become more involved in Oracle applications research.
To learn more about the Applications User Experience team, please visit the UsableApps web site.
About a month ago, I walked off a plane into the terminal at SJC. As I hiked that long walk to the exit, I heard a familiar and annoying sound, the Emergency Alert System sound.
The sound was muffled, and it took me a few more steps to realize it was my own phone bleating in my pocket. Somewhat embarrassed, I took it out and quickly dismissed the alert, which happened to be an AMBER Alert.
I dismissed the notification so quickly that I didn’t get much information from the actual alert itself. I walked a bit farther before stopping to see if I could recall the notification and actually read it, nope.
I did get another chance about an hour later, as I stood at the front desk of a hotel, but again, the bleating of the alert, coupled with the social awkwardness, made it nigh impossible for me to read the alert. I tried again to find it without success.
And so it goes with every emergency alert I’ve received on my phone. They have a knack for coming at inopportune moments, like the time one came through in the middle of the night while we were all asleep. Good times.
So, we’ve got a major usability problem here. On the one hand, smartphones are an enormous boon for emergency officials who need to notify the general public. However, on the phone side, how do you make the alerts more usable without forcing users to resort to turning them off completely?
At the very least, the alert should be retrievable so I can review it in peace after turning off that awful noise.
Interesting problem, thoughts?
Editor’s note: Here’s a cross-post from VoX by Friend of the ‘Lab, Kathy Miedema, about a Raspberry Pi Hackday Noel (@noelportugal) organized and ran a couple weeks ago. The basic idea was to get developers up and running on the Pi quickly and have some fun.
New Oracle developers get a taste of Raspberry Pi
By Kathy Miedema, Oracle Applications User Experience
There is a team within the Oracle Applications User Experience (UX) group that basically plays with interesting technology. We call them the AppsLab (@theappslab). That technology may include fuzzy ears (@ultan) that interact with your brain waves, robot arms, or Google Glass.
Recently, it included Raspberry Pi. And a day of hacking.
My team — the Communications & Outreach arm of the Applications UX group — sometimes works closely with this team. My boss has her own set of fuzzy ears. I’ve tried out the robot arms (I totally suck at moving them). And recently, I was introduced to Raspberry Pi.
Now, I’m a word person – if this small computer had been named anything else, my eyes might have glazed over. But the chance to tell folks about the creative ways that Oracle investigates and explores technology that can evolve the Oracle user experience … well, I’m much better at doing that. Especially if I’ve got a visual place from which to start the story.
Raspberry Pi, above, is actually an inexpensive computer that was originally made for kids. It was intended to give kids a device that would help them learn how to program computers. (Neat story there from the U.K. creators.)
Noel Portugal (@noelportugal), the developer who led the January training and hackday, said the credit-card-sized computer can do anything that a Linux computer can do. It’s easy to hook up and, because it costs about $35, easy to replace. So it’s a perfect starting point for kids, and it has an Oracle connection: Oracle’s Java evangelists worked with the Raspberry Pi creators directly to make sure Java runs natively on the device.
Noel’s one-day event included about 15 developers who also work for the Oracle Applications User Experience team. Many were from Oracle’s Mexico Development Center; others came from the Denver area or the Northwest. AppsLab talking head Jake Kuramoto said the idea was to provide a shortcut to the technology and tap into Noel’s experience with it, then get everyone up and running on it. The day was a way to investigate something new in a collaborative session.
This hackathon took place at Oracle headquarters in Redwood Shores, inside the Oracle usability labs. By the end of the day, I was hearing random, sometimes crazy noises as network hook-ups took hold and programming began.
Our developers were using the Raspberry Pi with their laptops and smart phones to create sounds, issue commands, and send signals through various devices. Noel said the maker community uses Raspberry Pi to control robotics, control a server, switch lights on and off, and connect sensors, among other things.
Here’s a look at our developers at work.
OK, so some of this stuff was over my head. But it was fun to watch really focused, talented people do something they thought was fun. The creative bursts that come through while investigating and exploring are motivational. Technology, in any form, is fascinating. When applied to everyday objects in ways that evolve the user experience – it’s like watching science fiction unfold. But on the Oracle Applications User Experience team, it’s real.
The Applications UX team’s mission is to design and build “cool stuff,” as Jake puts it. Team members look at all kinds of technologies, because we know through research that this is what our users are also doing.
Stay tuned to VoX to learn more about the new, interesting, and creative ways we are evolving the user experience of enterprise software with similar methods of exploration. Be the first to see what’s coming!
Shortly after they finished that, they began to complain that the OWI 535 wasn’t good enough. They conspired to convince me that we needed the Lynxmotion AL5A because it was “better.”
Everyone loves the robot arm demo, and it’s quite memorable. So, I made them a deal; get the Lynxmotion, but make it controllable by Leap over the intertubes.
A couple weeks ago, Noel took the box of parts that is the unassembled AL5A and made them into a robot arm, no easy task. There are 21 steps in the assembly instructions for the base alone.
This week, he, Anthony and Raymond have attached the arm to a Raspberry Pi, which controls it, and they’re working all the other pieces, including the Leap code, written in Python, and the other bits.
If all goes well, I should be able to open a browser to show a Dropcam streaming video of the AL5A. I’ll be able to attach my Leap Motion, run a script, and then control that arm remotely by waving my hand over the Leap. Pretty cool stuff.
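The team hasn’t published their code, but the heart of the scheme, mapping a Leap hand position to servo angles that the Raspberry Pi can apply to the arm, might look something like the minimal Python sketch below. The function name, axis mapping, and ranges are my own assumptions, not the actual implementation.

```python
# Hypothetical sketch: convert a normalized Leap Motion palm position
# into servo angles for an AL5A-style arm. A Raspberry Pi on the other
# end would receive these angles and drive the servos.

def palm_to_servo_angles(x, y, z):
    """Map palm coordinates normalized to -1..1 into 0-180 degree angles."""
    def to_angle(v):
        v = max(-1.0, min(1.0, v))          # clamp out-of-range readings
        return int(round((v + 1.0) * 90))   # -1..1 -> 0..180
    return {
        "base": to_angle(x),       # left/right sweep
        "shoulder": to_angle(y),   # raise/lower
        "elbow": to_angle(z),      # reach forward/back
    }

# A hand held dead center leaves every joint at its midpoint:
print(palm_to_servo_angles(0.0, 0.0, 0.0))
# {'base': 90, 'shoulder': 90, 'elbow': 90}
```

From there, a WebSocket or a simple HTTP POST from the Leap script to the Pi would close the loop over the intertubes, with the Pi feeding the angles to its PWM outputs.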
I had high hopes that the old and busted robot arm would get a post describing the technical workings that went into bringing it to life, and I’ve been promised a post on the details of the new hotness as well. We’ll see. I’m not holding my breath.
Even so, there’s a lot of ingenuity that should be documented. Noel says he thinks this might be the first documented use of a Raspberry Pi to control this particular arm, so there’s that.
So, if you see any of us at a conference this year, ask for a demo of the new hotness.
Find the comments.
So, it’s all David (@dhaimes) all week, or something, because here comes another post about him.
This time, it’s his walking meetings that caught my eye, quite literally.
I was sitting in the lobby of the Oracle Convention Center in January during the IOUC Summit, looking out the window at the 70-degree day, and David and another dude walk by on the trail across the street, deep in conversation. At the time I thought, that looks like a 1-1 meeting, given that David was gesticulating wildly and doing all the talking.
Now I’m pondering the Bad Ideas that would accompany this one. Related, someone has graciously collected all the Good Idea/Bad Idea segments into one epic video.
In other news, we’ve grown again, adding a dude named Bill. If you’re keeping score at home, we’re now a tiny team of eight.
Sometimes, I get to share something awesome.
A post about how Oracle Social Network can be used by Oracle ERP Cloud users happens to be something awesome. Let me explain.
Despite assumptions to the contrary, accounting is a very collaborative exercise in many enterprises. Ultan (@ultan) does a great job explaining why in his post “How to Chat Up an Accountant Safely: Social Networking in the Finance Department.”
If you read that post, you’ll see that honorary ‘Lab member, David Haimes (@dhaimes) is the nexus for this OSN+ERP Cloud feature. Check out his post, “Socializing the Finance Department” on how accountants can use OSN to streamline period close for more background.
Rather than rewrite what David and Ultan have already said, I’ll provide some awesome, at least IMO, backstory.
Back in 2012, when I was a WebCenter evangelist, David and I chatted about a brilliant idea he had to integrate OSN into Finance. WebCenter included OSN at the time; not sure if that’s still true. If you read the above links, you’ll know what David’s idea was. If not, you should read them, or feel free to proceed with incomplete context.
Deep backstory, I’ve known David for more than a decade and worked with him for several years on E-Business Suite Financials way back in the day. Anthony (@anthonyslai) was with us then as well.
Anyway, I liked David’s idea and coordinated the resources he needed. I didn’t do much, just connected him with the right people within OSN and watched the magic happen.
As Ultan mentions, this is a user experience win. OSN does exactly what the users need, nothing more, in just the right context: during period close, ERP Cloud users can use OSN conversations to communicate, exchange information and get work done, all in a traceable, easy-to-consume stream of relevant information.
I like telling stories, and this is a success story, spawned from a phone call from David that I took at the San Francisco Airport Marriott while attending a product management training that was a complete waste of my time.
Maybe someday I’ll get to tell this story to a user. That’s always a hoot for me, humble beginnings and all.
So, here are some conceptual screenshots of what this looks like.
And in conclusion, I give you Hannibal and a victory cigar.
Find the comments.
On March 18, AMIS will be hosting an OAUX (or Oracle Applications User Experience if you’re not into the whole brevity thing) Expo. The purpose of an expo, the brainchild of Misha (@mishavaughan), is to provide a showcase for all the work AUX has been cooking up in one place.
What work is that, you say?
Stuff like Simplified UI for the Sales Cloud and HCM Cloud, cutting-edge technology like Voice and even the research projects we’ve been doing like Glass, the Leap-controlled robot arms, Pebble and geo/wifi-fencing and other stuff still under wraps.
Expos have been a big hit so far. Don’t believe me? Check out what WIPRO had to say about the one we had at OpenWorld last year.
I, for one, am really looking forward to this particular expo, not only because this will be my first trip to the Netherlands, but because I’m stoked to meet the good people at AMIS and swap insights about user experience and development.
Oh yeah, and I’m hoping to run into Patrick Barel (@patch72) on his home turf. I’m a giant baby about international travel, so it’ll be nice to have a local point me to the must-see places in Nieuwegein, Utrecht and Amsterdam.
Hackathons are a great way to stay up to date on the latest technologies, as well as to keep your coding chops fresh. At the beginning of this month, Anthony Lai, Raymond Xie, Mark Vilrokx (honorary AppsLab member) and I participated in the AT&T Developer Summit Hackathon. The event was held at the Palms Casino and Resort in Las Vegas. The Internet of Things played an important role. New technologies that allow us to pinpoint our indoor location, aka micro-location, were all the rage. I am a huge advocate for the technology that powers this: Bluetooth low energy, or BLE. Both winning teams used BLE beacons in their projects.
This was the second year that I attended, and this time I came back with (almost) the whole team. We had a blast. During the hackathon there were two main tracks: a Wearables track and a Mobile track. A lot of sponsors provided APIs to their products, and between AT&T and the sponsors, there were ample devices and gizmos available for us to hack with.
I won’t go into much detail about our project, but you will hear more about it later. Our team decided to participate in the Wearables track with an emphasis on public safety. We decided to use AT&T’s M2X platform, AT&T’s cloud solution for the Internet of Things, which they have branded as “Machine to Everything.” This includes a basic REST interface that allows devices to input and output data. We also used a Freescale FRDM-KL46Z micro-controller with ARM libraries provided by mbed. And if that weren’t enough, our project included our beloved Google Glass and a couple of awesome Philips Hue lights for visual notifications.
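To give a feel for what “a basic REST interface” means in practice: posting a sensor reading to an M2X-style device stream boils down to a URL, an API-key header, and a small JSON body. The endpoint shape in this Python sketch is illustrative, from memory, so treat the specifics as assumptions rather than the official API.

```python
import json

API_ROOT = "https://api-m2x.att.com/v2"  # illustrative base URL

def build_stream_post(device_id, stream_name, api_key, values):
    """Assemble a 'post values to a device stream' request.

    values: list of {"timestamp": ..., "value": ...} dicts.
    Returns (url, headers, body) ready to hand to any HTTP client.
    """
    url = "{}/devices/{}/streams/{}/values".format(API_ROOT, device_id, stream_name)
    headers = {"X-M2X-KEY": api_key, "Content-Type": "application/json"}
    body = json.dumps({"values": values})
    return url, headers, body

url, headers, body = build_stream_post(
    "device123", "temperature", "secret-key",
    [{"timestamp": "2014-01-08T12:00:00Z", "value": 72.5}])
```

Any HTTP client on the micro-controller or the Pi can then fire that request off; the same stream can be read back with a GET to drive notifications like the Hue lights.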
For some reason, Noel (@noelportugal) posted a picture of his IoT gadget inventory.
Click to embiggen.
I’m not entirely sure why I liked the hockey puck so much.
It’s a beautiful little piece of technology, absolutely. Some users report that it pays for itself in energy savings rather quickly, sweet. It learns your home/away behaviors and sets itself accordingly, cool. You can control it from anywhere via the mobile apps and web app, great.
I finally settled on the fact that it’s a disruptive innovation in an area where you don’t expect innovation. Companies have been incrementally improving thermostats, but the overall experience has not been rethought. They just keep making them bigger, with more features and harder to use.
Anyway, I did my diligence first to make sure my system was compatible with Nest; they do a really nice job supporting the pre-purchase and installation phases. Installation was easy enough, and the initial setup went quickly. The entire process from unboxing to working thermostat probably took three hours, and I went very slowly.
The Nest itself has a very clever interface with minimal interactions, really only two, pressing the hockey puck into its housing, which makes a satisfying thud, and rotating the dial, which makes a click. Despite having limited interface capabilities, I found entering a wifi password much less annoying on the Nest than it is on a smart TV with a standard remote.
In addition to controlling the heat and air conditioning, the Nest has a motion sensor, so it will light up when you pass it, and presumably, it will set itself to Away at some interval without motion.
The Nest includes a subtle game mechanic, the leaf, which appears when, according to the Nest, you’re saving energy. My brain got a nice shot of dopamine the first time I saw that leaf, and now, I’m compelled to earn it daily and disappointed if I don’t. Nice touch.
The mobile and web apps are where most users will spend the majority of their time interacting with Nest. Similar to the hockey puck’s OS, these apps are simple and not overloaded with features. Here’s the Android app:
Aside from one issue, I’ve been very happy with the Nest so far. Less than a day after installing it, I wasn’t able to control the Nest from any of its accompanying apps because the wifi receiver was off to conserve battery. This struck me as odd, given the device is directly connected to my home’s electricity.
I did some digging and went down a wrong path, but ultimately, all I had to do was upgrade my router’s firmware. Nest’s customer support was quite helpful and responsive.
Other nice features, Nest sends a monthly energy report, which gets more useful over time, and they recently bought a company called MyEnergy that tracks utility usage and offers energy saving tips.
Overall, the Nest provides an excellent experience, well thought out from pre-purchase all the way through continued usage. It’ll really rock if it can pay for itself in 12-18 months.
The other home automation gadget I got for Christmas was a Roomba 770. I’ve always been skeptical about the ability of these robots, but interested in the technology and the potential.
As with the Nest, I did my diligence, and it seems like most of the negative reviews center around people who expected the robot to replace a traditional vacuum cleaner. Luckily, I didn’t have that expectation; I just want it clean enough so I can walk around in bare feet without collecting miscellaneous debris on my feet.
The Roomba does this quite well, and it’s amusing to watch it navigate a room. I keep trying to see patterns, but I can’t discern any. It’s really a marvel of hardware and software technology.
It does take a while to finish a room, and it’s a bit loud. Neither really matters to me though. I’ve found the best time to run it is when we’re away. Otherwise, we bump into each other a lot.
I know less about the Roomba than the Nest, given it requires virtually no setup and configuration. The Roomba does have a long list of features, but I haven’t been curious enough to look at them all yet. So far, it does exactly what I want, and that’s perfect.
So, did you get home automation gadgets for Christmas, or semi-related, see anything at CES that interested you?
Find the comments.
Lots of news coming out of Applications UX lately, so thought I’d share it here.
Over at Misha’s (@mishavaughan) Voice of User Experience blog, you can read about Simplified UI for Oracle Human Capital Management Cloud and Simplified UI for Oracle Sales Cloud. Both versions went live in September when Oracle Cloud Applications, Release 7 was released.
Longtime followers of Jeremy’s AUX team may know Simplified UI as Fuse, which I test-drove for giggles about a year ago on some Android devices and other gadgets. Just FYI, I will never refer to it as that again, for it is the name that cannot be spoken.
Also at VoX, Kathy has a Q&A with one of the partners who attended the OAUX Expo at OpenWorld. Incidentally, Anthony (@anthonyslai), Noel (@noelportugal) and I were there, showing the Leap Motion-controlled robot arms, the Google Glass Sales Cloud concept app Anthony built, and another special project that I am not at liberty to divulge.
Finally, Ultan (@ultan) has a brief roundup of an Oracle partner event held last week in Manchester, specifically focusing on Noel’s demo of Google Glass’ look and translate feature. Noel’s been busy, presenting at UKOUG Tech 13 and trotting Glass around the UK. He has posted some pictures of his travels to our Facebook page.
Maybe he’ll post them to our G+ page so everyone can see, maybe not, we’ll see.
I’m cleaning up all the open tabs for the holidays, so here are some nuggets I found that may or may not be interesting.
Hinkmond Wong of the Java Embedded Technology team did a fun Thanksgiving project, a Turkey that tweets as it cooks.
It’s time for the Internet of Things (IoT) Thanksgiving Special. This time we are going to work on a special Do-It-Yourself project to create an Internet of Things temperature probe to connect your Turkey Day turkey to the Internet by writing a Thanksgiving Day Java Embedded app for your Raspberry Pi which will send out tweets as it cooks in your oven.
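The gist of such a probe app is to tweet on milestones rather than on every reading. Hinkmond’s original is Java Embedded; this little Python sketch is entirely my own guess at the logic, not his code.

```python
# Hypothetical logic for a tweeting turkey probe: emit a status message
# only when the internal temperature crosses a milestone.
SAFE_TEMP_F = 165  # USDA-recommended internal temperature for poultry
MILESTONES_F = [100, 125, 150, SAFE_TEMP_F]

def tweets_for_reading(last_temp, temp):
    """Return status messages for milestones crossed between the
    previous probe reading and this one."""
    tweets = []
    for m in MILESTONES_F:
        if last_temp < m <= temp:
            if m >= SAFE_TEMP_F:
                tweets.append("Gobble gobble, I'm done! ({}F)".format(temp))
            else:
                tweets.append("Turkey update: {}F and climbing".format(temp))
    return tweets
```

Wire the probe reading loop to that function, hand the strings to a Twitter client, and the turkey does the rest.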
MIT’s Dynamic Shape Display
On Telepresence Robots
For reasons I can’t explain, I love robots. So, of course, Ars’ in-depth review of Suitable Technologies’ Beam telepresence robot caught my attention. We toyed with a similar idea, using the HEXBUG Spider XL, but it required a lot of hacking and other parts, namely a smartphone. I think the guys were just humoring me.
And Finally, Helvetica: The Perfume
I love this, even at $62 for 2 ounces of distilled water, h/t Kottke.
Since Google announced the Chromecast earlier this year, I’ve been stoked to see how it developed.
The little device has a ton of potential, and even though Google has been a little slow to push its adoption, even slowing down the efforts of some curious developers, they do seem committed to the device.
After four months using the Chromecast every day, I still love it.
There’s a lot to like about the Chromecast, even if you set aside the price. One thing I’ve noticed over four months is there’s a ton of content available in the Google Play Store. The Play Movies & TV app supports the Chromecast, obviously, and before the Chromecast, I never really considered the Play Store as an alternative to Amazon or iTunes for movies and TV.
Unsurprisingly, it’s very easy to buy content and cast it, all from the device. I’ve never used an iOS device with AirPlay to do this with an Apple TV, but my guess is that it’s similarly easy. I don’t know if Amazon has anything like this for the Kindle Fire, but I have to assume that if they don’t, they soon will.
If you don’t subscribe to Netflix and/or HuluPlus, you’re probably not in the market for a Chromecast. But if you do, it offers a great way to get content onto your TV, smart or otherwise. This is a plus, given the relative size of smart TV ecosystems when compared to Android and iOS.
Maybe it’s just me, but I’d much rather cast Netflix to my TV than use an app built for a smart TV OS.
Speaking of the TV, the Chromecast turns on the TV to the appropriate HDMI input when you cast, which is nice. You can also control volume from the player, if the app supports it, but alas, to turn off the TV, you’ll have to find the remote.
A final unexpected plus I’ve noticed is that having my device at hand while I’m watching and controlling programming means more second screen activity. So, I find myself looking up stuff on IMDB that I would have tabled for never in the past. On the downside, I end up reading email too.
Not many apps have adopted Chromecast yet, which seems to be a combination of Google’s desire to keep development tight and perhaps a wait-and-see approach from content owners.
Right now, only a handful of apps support it, Netflix, HuluPlus, Pandora, Play Movies & TV, YouTube, HBO GO and Play Music. That’s a lot of content, but depending on what you watch, wider support probably matters.
Update: Today, Google announced a slew of new apps that support Chromecast.
Of course, one major feature of the Chromecast is its ability to cast from any computer with Chrome and the Google Cast extension. Although the majority of my experience has been casting from a device, I have done so from Chrome, with mixed results. You can cast local content from the computer too.
Like anything over a network, speed matters. The faster your wifi, the better the casting experience.
Turns out that router placement for the device that’s casting matters too.
I’ve found that if my device has low (one bar) connectivity to my router, the device often loses connectivity to the Chromecast. This results in an uncontrollable stream, i.e. the player on the device disappears and I can no longer pause or stop playback, bit of a bummer, but not a fatal flaw.
It turns out that all players are not created equally. Most of them do offer the ability to pause from the lock screen on Android, which is very nice, but from a consistency perspective, each player implements casting differently.
For example, Netflix offers a stop button from their player, which is full screen, while HuluPlus does not, because HuluPlus doesn’t seem to offer stop at all in their app.
For comparison, here are the players for Play Movies & TV and YouTube.
So, yeah, each one is different, even those produced by Google for its own apps. Minor complaints, and to be expected.
Google seems poised to expand support for the Chromecast, which is great news. Rumors suggest that the media center app Plex will soon release support for it. Personally, I’m looking forward to casting from the native Android Gallery so I can cast pictures and video of my daughter.
Anyway, those are all my thoughts on the Chromecast after using it for four months. At $35 apiece, it was a no-brainer to buy one for each of my TVs. I may even buy it as a holiday gift.
Find the comments.
The holiday season is in full swing now, and I’ll bet a lot of people out there will be getting some form of wearable device as a gift.
Personally, I’d like to see more fashion innovation, e.g. intelligent clothing. I’m not a fan of encumbrances, but since clothes aren’t optional, they might as well be smart.
As wearables gain momentum, I find myself increasingly frustrated because there are so many options, so many form factors and price points, and no clear leader where I feel comfortable investing time, money and development effort.
And that’s without even looking at the technical aspects of each device, sensors, SDKs and/or APIs, openness of data, associated ecosystems.
What a hot mess, but hey, it’s exciting too.
Bonus, Misha’s post includes an interview with our very own Anthony (@anthonyslai) about his adventures as a Google Glass Explorer. Noel (@noelportugal) is now in that club too, so expect more Glass content soon, e.g. I heard those two mad scientists got the Glass working to control the robotic arm.
Stay tuned, and as always, find the comments.
Our new team members, Raymond and Tony, have been busy in their short time with us, and they’re embracing the AppsLab way.
What way is that you ask? Since the beginning, we’ve always started with an idea and moved quickly to build something conceptual to see how and if the idea works.
Connect began life as the IdeaFactory, which Rich (@rmanalan) put together in 24 hours to give life to our idea about enterprise social networking. More recently, Anthony’s (@anthonyslai) new toy, the Google Glass, begat the Fusion CRM Glass app.
To be clear, none of this is product. It’s not even really project work, although we do sometimes launch projects based on the initial concept work. This is just smart developers, messing around with ideas, trying to see what works.
Over the years, we’ve built lots of these demos, which I’m calling concept demos lately. Some have evolved into full-scale projects. Others have been moth-balled into our Git repo, which I’m told holds 40-odd projects in various states of completeness.
I like to think that code never dies. It just waits around for the right circumstances.
Sorry about that, won’t happen again.
Anyway, with Anthony and Noel (@noelportugal) tied up with travel and other projects, Raymond and Tony have taken the baton and cranked out a couple of cool concept demos.
First, they collaborated to build a working geo-fencing demo. The idea here is that data on a device should be subject to physical location, e.g. patient data in a hospital, customer-sensitive bank data. If the device is within the fence, data exist and can be accessed; when the device leaves the fence perimeter, the data are removed from the device and cannot be retrieved from the server.
Here are some shots of the concept demo at work.
Tony did the groundwork development for this one, and Raymond polished it so it demos more cleanly. The toughest part was spoofing the GPS with a fake location to fool the app into believing it was inside or outside the geo-fence.
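For the curious, the core of a geo-fence check is simple enough to sketch. This is a minimal Python illustration under my own assumptions, not the demo's actual code: it assumes a circular fence defined by a center point and radius, and a hypothetical `purge_local_data` callback that wipes the device's cached records.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    earth_radius_m = 6371000
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * earth_radius_m * math.asin(math.sqrt(a))

def inside_fence(lat, lon, fence_lat, fence_lon, radius_m):
    """True if the device's position falls within the circular fence."""
    return haversine_m(lat, lon, fence_lat, fence_lon) <= radius_m

# Hypothetical fence: 500 m around a hospital campus (coordinates made up)
FENCE = (37.4220, -122.0841, 500)

def on_location_update(lat, lon, purge_local_data):
    """Called on each GPS fix; purge sensitive data once outside the fence."""
    if not inside_fence(lat, lon, *FENCE):
        purge_local_data()
```

In the real demo the interesting work is on the server side too, since the data can't be retrieved again once the device leaves the perimeter; this sketch only covers the client-side check.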
Second, Jeremy, our overlord, owns a Pebble watch, so we’ve been messing about with one for giggles. Possibly as a joke, Jeremy said we should build a watchface app for sales reps that showed “motivational” metrics like days to quarter close and percentage of sales quota achieved.
So, Raymond did that.
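The watchface itself is built against the Pebble SDK, but the two "motivational" metrics are easy to sketch in a few lines. Here's a Python version under my own assumptions (calendar quarters, names mine, not Raymond's):

```python
from datetime import date, timedelta

def quarter_end(today):
    """Last day of the calendar quarter containing `today`."""
    end_month = ((today.month - 1) // 3) * 3 + 3  # 3, 6, 9, or 12
    if end_month == 12:
        return date(today.year, 12, 31)
    return date(today.year, end_month + 1, 1) - timedelta(days=1)

def days_to_quarter_close(today):
    """Metric #1: days left until the quarter closes."""
    return (quarter_end(today) - today).days

def quota_attainment(closed_deals, quota):
    """Metric #2: percentage of sales quota achieved."""
    return round(100.0 * closed_deals / quota, 1)
```

On the watch, values like these would typically be computed on the phone and pushed to the watchface over Bluetooth, rather than calculated on the Pebble itself.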
I guess the lesson is that it’s not always a good idea to joke around with developers.
Why do we do stuff like this?
Aside from proving out ideas, projects like these, the Glass app and the Leap Motion-controlled robot arm allow the guys to go hands-on with the SDKs and APIs of devices we may actually build for in the future. These experiences are incredibly valuable because when it comes time to do a full-scale project, they have a baseline understanding of what we can reasonably do and how easy or difficult it will be.
That experience leads to much better estimates of development times, and it removes some of the uncertainty involved. Oh, and it helps control the scope early in a project, which makes execution and timely delivery achievable.
If you’re counting, that’s a win-win-win-win-win, or something.
Yeah, concept demos are usually rough around the edges, but they’re baked enough to give an idea of what’s possible. Plus, concept demos get done quickly, so ideas can be vetted and move on or be tabled without spending a ton of time and effort, e.g. Raymond and Tony banged out the geo-fencing concept demo in less than two weeks, and Raymond built the Pebble concept in under a week.
And that’s real time. They were doing other things too.
A while back, I promised some details on the Google Now TV Card I found accidentally.
I was watching TV via an HDTV antenna and happened to pop open Google Now for some reason or another. Now showed me this card:
Freaky, right? I dismissed it, but my curiosity was piqued. So, I did some digging about the TV Card and went back to give it a whirl.
The card only works on broadcast TV, which makes sense when you reverse engineer it a little. Google Now knows where you are, and based on that, can determine the shows that are being broadcast. That helps narrow down the possibilities, but even given that information, I found the card a bit tough to trigger.
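That narrowing step is easy to imagine in code. Here's a purely speculative Python sketch of how location plus the current time could shrink the candidate list before any audio matching happens; the schedule data and names are made up for illustration:

```python
from datetime import datetime

# Hypothetical over-the-air schedule: (market, station, start_hour, end_hour, title)
SCHEDULE = [
    ("San Francisco", "FOX", 17, 21, "MLB Playoffs: Tigers at Red Sox"),
    ("San Francisco", "NBC", 15, 16, "The Ellen DeGeneres Show"),
    ("Detroit", "FOX", 17, 21, "MLB Playoffs: Tigers at Red Sox"),
]

def candidate_shows(market, now):
    """Shows currently on the air in the viewer's broadcast market."""
    return [title for mkt, station, start, end, title in SCHEDULE
            if mkt == market and start <= now.hour < end]
```

Matching audio against a handful of candidates is a much easier problem than matching against everything on TV, which may also explain why the card is hit-or-miss.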
I did my testing during daytime TV, and it failed to detect The Ellen DeGeneres Show and another show I tried. It did finally work for the Fox broadcast of the MLB playoff series between the Tigers and Red Sox.
Here is the card it showed:
If I remember correctly, the announcers were talking about Torii Hunter.
Interesting stuff, if not quite mind-blowing. It’s a pretty powerful example of what Google wants to do, though: integrate everything it knows about the world and about you, a.k.a. its knowledge graph, and surface whatever it thinks might be useful to you in the moment.
Find the comments.
If you hurry, you can watch their episode on Hulu. If you decide to wait, they appear in Season 5 Episode 10. Paul and family are the very first segment, so you won’t have to watch the entire episode, although I did because I’ve never seen Shark Tank. It’s an interesting show.
The premise is simple: companies seeking investment pitch a panel of investors, who, if they’re interested, commit a sum of money in exchange for a stake in the business.
Now for the background. Paul founded this little team back in 2007, along with Rich (@rmanalan) and me. Many of you may know Paul, but you might not know that he and his wife started a little lunchbox business called Yubo in their spare time. I think that was in 2009.
Using their savings, they set out to solve a common problem for families, the lunchbox and the jumble of containers and baggies that go into it. Yubo comes with BPA-free, dishwasher-safe containers that fit snugly inside, along with a reusable cold pack.
Plus, the Yubo’s faceplate is customizable and replaceable. It’s an ingenious product. I bought one for my daughter; she loves it; I’ve known Paul for years, etc. Consider that your disclaimer.
I remember the inception days of Yubo. Paul told me about working with an industrial designer and taking late calls with manufacturers overseas, all in his free time. It all seemed very draining, but like every small business, they soldiered through it because they believed in the idea.
Anyway, it was oddly gratifying for me to see Paul and family on national TV, successfully pitching this panel of luminaries. I can only imagine how elated they felt when they struck a deal.
If you’re wondering, Paul recently left Oracle, again; this time for a small company called Achievers.
Good luck dude. Without you, we wouldn’t be doing cool stuff here.
Busy times lately here at the ‘Lab. We’ve grown from a small band of three to six in the past six weeks.
Joining our happy little crew are Osvaldo, whom we were lucky to find on our adventure to Mexico, Raymond, a friend of Anthony’s from Taleo, and Tony, whom I’ve known for many years.
Our ‘Lab veterans have been road warriors lately. Earlier this week, Anthony spoke at the OTN China Tour in Beijing, showing off the Glass concept app he built, as well as the Leap Motion-controlled robotic arms he and Noel hacked together right before OpenWorld.
Noel was in Mexico, and soon, he’ll be heading to UKOUG Tech 13 to speak. His session is called “Oracle Fusion & Cloud Applications: A Platform for Building New User Experiences” at the happy hour friendly time of 17:45 on Tuesday, December 3.
If you’re attending Tech 13, drop by and say hi, or just look for Noel. He’ll be hanging around the show all week.
Anyway, I have a backlog of posts, just not a backlog of time to push them. Stay tuned.