A couple days ago, I was preparing to show some development work Luis (@lsgaleana) did for Android Wear using the Samsung Gear Live.
One of the interesting problems we’ve encountered lately is projecting our device work onto larger screens to show to an audience. I know, bit of a first world problem, which is why I said “interesting.”
At OpenWorld last year, I used an IPEVO camera to project two watches, the Gear Live and the Pebble, mounted on jewelry display felt. That worked OK, but the contrast difference between the watches made it tough to see both equally well through the camera.
Plus, any slight movement of the table, and the image shook badly. Not ideal.
Lately, we haven’t been showing the Pebble much, which actually makes the whole process much easier because . . . it’s all Android. An Android Wear watch is just another Android device, so you can project its image to your screen using tools like Android Screen Monitor (ASM) or Android Projector.
Of course, as with any other Android device, you’ll have to put the watch into debugging mode first. If you’re developing for Android Wear, you already know all this, and for the rest of us, the Android Police have a comprehensive how-to hacking guide.
For my purposes, all I needed to do was get adb to recognize the watch. Here are the steps (h/t Android Police):
- Tap on Wear’s watch face to get a menu of options. Be sure to hit the watch face instead of a notification card.
- Scroll down the list of options and select Settings.
- Open About, which is the last option in the list.
- Find Build number and tap on it seven times, and you’ll get the “You are now a developer!” message.
- Swipe right (to go back) to the Settings menu.
- Open Developer options, which is now the last option in the list.
- Find and set ADB debugging to Enabled.
- Tap the checkmark button to confirm.
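Once ADB debugging is enabled, a quick sanity check from the desktop confirms the watch is visible. This is a minimal sketch, assuming the Android SDK platform-tools are installed and on your PATH; it exits gracefully if adb isn't available.

```shell
# Confirm adb is installed before doing anything else.
command -v adb >/dev/null 2>&1 || { echo "adb not found; install the Android SDK platform-tools"; exit 0; }

# List attached devices. With ADB debugging enabled, the watch
# should appear here alongside any phones or tablets, with a
# serial number and the state "device" (not "unauthorized").
adb devices
```

If the watch shows up as "unauthorized," check its screen for the RSA fingerprint confirmation prompt and tap the checkmark again.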
Now, when I need to show a tablet app driving the Wear watch, I can use adb and ASM to show both screens on my Mac, which I can then project. Like so.
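The two-screen setup above can be sketched as a couple of commands. This assumes adb and Java are installed and that `asm.jar` is the Android Screen Monitor jar sitting in the current directory; the file path is illustrative, not canonical.

```shell
# Bail out politely on machines without the Android platform-tools.
command -v adb >/dev/null 2>&1 || { echo "adb not found; install the Android SDK platform-tools"; exit 0; }

# Both the tablet and the watch should be listed once each is in
# debugging mode and attached.
adb devices

# Launch one ASM window per device; when more than one device is
# attached, ASM asks you to pick which one to mirror.
java -jar asm.jar &   # first instance: select the tablet
java -jar asm.jar &   # second instance: select the watch
```

Each ASM window can then be resized and arranged on the desktop before projecting.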
Bonus points: the iPod Touch in that screenshot is projected using a new QuickTime feature in OS X Yosemite that works with iOS 8 devices.
I hear you – demoing stuff now requires a huge salad of cables, devices, hacks, and calendar entries to make sure everything’s batteries are charged, etc.
If only we could just use Chromecast to project…
@Ultan: Actually, we do use Chromecast sometimes to project. Google added screen mirroring a while back, and it is a better option, assuming the network is good enough.
My problem lately has been multiple screens, i.e. tap on one device and something happens on another.