Design is Fun
This week was about staring at the design choices we have and saying things like, “Sounds fine, but everyone does it. Why can’t we do better?” or “Hey, this design doesn’t fit that sensor!” or “Why would you ever want a user to do *that*? It totally breaks our design philosophy!” Design is hard. Whoever said it was fun?
This week was also about some harmful fun.
ALS & WinRT – No light at the end of the tunnel?
Varying the screen brightness during video playback in line with the ambient light sounded like a no-brainer: it would reduce eye fatigue for our users. The Windows.Devices.Sensors LightSensorReading class gives ambient illuminance in lux units. Tying the app logic to absolute lux values exposes us to calibration errors across devices, but there’s little we could do about that. So we set off on ALS measurements in varying light conditions: night, no lights, closed room at one extreme; bright midday, open to the sky at the other. (Need to re-check: does the sensor saturate? Maybe not, but we need to take more readings before deciding.) The ALS reading range was spread out well enough for fine-grained brightness control.
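The mapping we had in mind can be sketched like this (Python for brevity; the 1 lux and 10,000 lux endpoints are illustrative stand-ins for our measured extremes, not calibration values). Since brightness perception is roughly logarithmic, interpolating on a log-lux scale gives smoother steps than a linear map:

```python
import math

def lux_to_brightness(lux, min_lux=1.0, max_lux=10000.0):
    """Map an ambient-light reading (lux) to a 0.0-1.0 brightness level.

    Interpolates on a log scale between the darkest and brightest
    conditions measured, since perceived brightness is roughly
    logarithmic in illuminance.
    """
    lux = max(min_lux, min(lux, max_lux))  # clamp to the measured range
    return (math.log10(lux) - math.log10(min_lux)) / (
        math.log10(max_lux) - math.log10(min_lux)
    )

# A typically lit room (~100 lux) lands at the midpoint of the scale:
lux_to_brightness(100.0)  # -> 0.5
```

Of course, with no WinRT API to actually set the brightness, this mapping had nowhere to go.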
And that’s when we got stuck. While WinRT does allow ALS readings, it does not seem to have an API for adjusting screen brightness. That was a lot of work, just to end up looking silly. Frantic mails to our Intel and Microsoft contacts are awaiting responses, but we are not very hopeful; brightness control seems to need native code.
Status: Park this aside.
NFC – So Near, Yet So Far
Shufflr is well known for its multi-screen experience. We have added more screens since the Grand Central Demo last year. Our users carry the same experience from their phones to their tablets, to their laptops and then to their TVs. (This is where WiDi would’ve been awesome, but we’ve said enough.) So, is there something we can do for our users who have Shufflr on more than one of their screens? NFC seems to be a good candidate for this sort of hack. Bring the screens together, say abracadabra and lo!
Our experiments turned out fine with an HP G62 laptop running Windows 7 and a can’t-reveal-the-maker-Win8-surface-tablet-prototype loaned to us by our Microsoft friends. Time to put it together on the Ultrabook.
Okay, given the kind of awesome luck we’ve been having, you probably saw this coming: *there is a problem*. The NFC sensor on our Ultrabook (if there is one) is not responding. It refuses to identify the other two NFC devices sitting beside it.
Status: Keep trying. There must be some way to make it talk.
Bad times are quite boring. No wonder they don’t last long.
Fast forward. We pulled out the main screen from the Shufflr app and got back to design. In some sense, with WiDi, AOAC, ALS and even NFC shutting us out, the design choices started becoming clear. The screen should now let the user navigate the Daily Fix presentation using touch controls (taps, swipes) along with tilt controls.
From the Windows 8 sensor fusion documentation, we initially thought the gyroscope was the right sensor for integrating tilt gestures into the app. But (thankfully) we decided to check out the accelerometer first. The readings it gave were stable enough for us to lift the Ultrabook, tilt it to an angle and have the next video show up on the screen. A small but important step, as the terrain had been pretty slippery thus far.
After taking many measurements from the accelerometer, we picked the g-force values on the X-axis for detecting the left-tilt, at-rest and right-tilt states of the Ultrabook. This, you will know when you try it yourself, is quite tricky. The navigation shouldn’t be quite-slow-but-very-sure and test the user’s patience. Nor should it be overly reactive and scare the user away. It should let the user get to the next and previous videos in the list without exaggerated movement. Yet it should let the user stand in a hallway, talking to a friend, holding the Ultrabook, shifting weight from one leg to the other, and not surprise him when he comes back to the app (imagine: he paused a video on the screen, started walking, finished talking to his friend and returned to the app, only to find the video gone because he tilted the Ultrabook a few times).
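The tuning problem above — an entry threshold, a lower exit threshold, and a short dwell so a shifted stance doesn’t flip a video — can be sketched roughly like this (Python for clarity; the thresholds and dwell count are made-up placeholders, not the values we actually tuned):

```python
LEFT, REST, RIGHT = "left", "rest", "right"

class TiltNavigator:
    """Classify X-axis g-force into left-tilt / at-rest / right-tilt.

    Two guards keep casual movement from triggering navigation:
    - hysteresis: a tilt must exceed `enter_g` to register, and the
      device must return below `exit_g` before another tilt counts;
    - dwell: the tilt must persist for `dwell_samples` consecutive
      readings before it fires a gesture.
    """

    def __init__(self, enter_g=0.35, exit_g=0.15, dwell_samples=3):
        self.enter_g = enter_g
        self.exit_g = exit_g
        self.dwell_samples = dwell_samples
        self.state = REST
        self.held = 0  # consecutive tilted samples, signed by direction

    def update(self, x_g):
        """Feed one accelerometer X reading; return 'next', 'prev', or None."""
        if self.state != REST:
            if abs(x_g) < self.exit_g:  # must settle back to rest first
                self.state = REST
            return None
        if x_g > self.enter_g:
            self.held += 1
        elif x_g < -self.enter_g:
            self.held -= 1
        else:
            self.held = 0  # any rest sample resets the dwell count
        if self.held >= self.dwell_samples:
            self.state, self.held = RIGHT, 0
            return "next"
        if self.held <= -self.dwell_samples:
            self.state, self.held = LEFT, 0
            return "prev"
        return None
```

Feeding it a stream of readings, a deliberate sustained tilt fires exactly one navigation event, while a single jolt (one tilted sample followed by a rest sample) resets the dwell count and fires nothing.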
Status: For next and previous video navigation, we completed two designs and implementations with the relatively stable X-axis (relative, that is, to the user’s movements). It may not be as easy to create a clean user experience from the Y-axis measurements, which would provide the tilt equivalents of the timeline controls (touch: swipe down) and the app bar controls (touch: swipe up). But we will get there.
The week ahead
With tilt navigation almost done, we might move on to multi-touch gestures. But we are also tempted to spend a few more hours on the old flames. We still have a little bit of hope for AOAC and NFC. And there is some (early) talk here about getting WinRT to work with DLNA, since it couldn’t with WiDi.
Right now, it is time for a break, a mug of hot chocolate and some music.
– sAgar, signing off for the folks @ Shufflr.