Sunday, September 26, 2010

In this post I will take a moment to reflect on some elements of the design of the Emerson remote control that came with the TV I currently use.
(1) and (2): The two buttons at (1) are the channel up / channel down buttons; the two buttons at (2) are the volume up / volume down buttons. I find these two sets of buttons unfortunately close to one another. Imagine the following: you are watching a DVD with your friends in a dimly lit room. Suddenly the sound gets unusually loud (fight scene) or awfully quiet (dramatic whispering). In either case the volume must be adjusted, and quickly, lest you suffer hearing loss or miss plot developments. You recall the volume is on the right side of the remote. Is it the top or the bottom? You take a valiant stab, but alas, you've changed the channel and suddenly you are watching a local high school wrestling match in grainy home video! Cries of despair erupt around you. Not fun.
I think these buttons (which must be some of the most commonly used buttons on any remote) deserve space away from other buttons (not in the midst of a 6x4 rectangular grid) and definitely space away from one another.
(3) is the fast-forward button. One click jumps a full scene, while holding the button down eventually causes the film to fast-forward at a decent rate. This, in my experience, is also a source of sofa angst. Once the hero and heroine start moving in for the kiss, I'd like to keep moving forward, but I don't want to jump to the credit sequence. I think the scene jump and the simple fast-forward are different enough in functionality to merit buttons of their own. Also, the time spent recovering from a skipped scene is non-trivial (and non-fun).
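As an aside, the tap-versus-hold overloading is easy to express in code, which makes it clear how much behavior is hiding behind this one button. Here is a minimal sketch in Python; the `player` methods are hypothetical names of mine, not anything from the actual remote's firmware:

```python
import time

HOLD_THRESHOLD = 0.5  # seconds before a press counts as a hold

class ForwardButton:
    """Toy model of one button overloaded with two behaviors."""

    def __init__(self, player):
        self.player = player       # hypothetical player interface
        self.pressed_at = None
        self.fast_forwarding = False

    def press(self):
        self.pressed_at = time.monotonic()

    def poll(self):
        """Called repeatedly while the button is held down."""
        if self.pressed_at is None:
            return
        held = time.monotonic() - self.pressed_at
        if held >= HOLD_THRESHOLD and not self.fast_forwarding:
            self.fast_forwarding = True
            self.player.fast_forward()   # long hold: scrub forward

    def release(self):
        if self.fast_forwarding:
            self.player.resume_play()    # stop scrubbing
        else:
            self.player.skip_scene()     # quick tap: jump a whole scene
        self.pressed_at = None
        self.fast_forwarding = False
```

Note the asymmetry: the tap's action can only fire on release, since the remote can't know what a press "meant" until it ends, so a moment's hesitation turns a scrub into a scene jump. Two buttons would dissolve the ambiguity entirely.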
(4) is the pause button. This is not a big deal, but I'm used to the single play/pause button paradigm. I also expect (from prior experience) the pause button to be closer to the play/stop/fast-forward cluster.
(5) is the set of menu arrow buttons. This setup does not cause problems like some of the issues above, but I think these arrow buttons deserve their own real estate (like the play/stop buttons). Embedding them in a grid of identical black buttons is the short path to destruction.
(6) These buttons (except for the down arrow of (5)) I have never used. I really have no idea what they do, and yet I don't find myself wishing the remote could do more (just that it could do the basics better). I think removing these buttons would free up real estate for some of the changes suggested above and would lessen the clutter.
Thanks for reading!

Sunday, September 19, 2010

For my blog post this week I am supplying some comments on the paper entitled Noncontact Tactile Display Based on Radiation Pressure of Airborne Ultrasound by Takayuki Hoshi, Masafumi Takahashi, Kei Nakatsuma, and Hiroyuki Shinoda. In their paper they present results from a holographic system that uses airborne ultrasound to provide tactile feedback.

It was not clear to me what sort of pulses could be generated by the ultrasound emitter. The examples of the raindrops falling and the elephant walking swiftly across the user’s palm seemed to only require short pulses of ultrasound. Could a sustained pressure be exerted (if for example the elephant stood still)? Would the experience of a continuous pressure seem less realistic than the short pulses?
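For reference, my understanding (treat the details as my paraphrase of the paper) is that the perceived force comes from acoustic radiation pressure, which grows with the energy density of the sound field:

```latex
% Radiation pressure of airborne ultrasound (my paraphrase of the model):
%   P     : radiation pressure on the skin [Pa]
%   E     : energy density of the ultrasound [J/m^3]
%   p     : RMS sound pressure [Pa]
%   \rho  : density of air,  c : speed of sound in air
%   \alpha: constant between 1 and 2, depending on how much the skin reflects
P = \alpha E = \alpha \, \frac{p^2}{\rho c^2}
```

If that model is right, nothing seems to forbid a sustained pressure in principle: drive the emitters continuously instead of pulsing. But the static force would be small at safe sound pressures, and our skin is far more sensitive to vibration than to constant pressure, which may be why the compelling demos (raindrops, elephant footsteps) are all brief pulses.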

The paper pointed out problematic areas with the current setup. First, the users must wear a marker on the tip of their finger to allow hand tracking (though this is far superior to many alternatives as discussed below). There is also the problem of the user’s hand getting in the way of the projection of the holographic image (this is not a problem with the tactile ultrasound emissions, just the holography). Ultrasound emissions can be damaging to human subjects in two ways: first, the emissions can damage skin tissue, limiting the strength of the signals that can be used. Second, the signals may damage ears, so users are required to wear ear protection. Also, the ultrasound emitter is loud, which detracts from the user experience.

Despite the drawbacks it seemed clear that ultrasound was preferable in many respects to alternative approaches. Other approaches include wearing gloves with stimulators adjacent to the skin at all times, using robotic arms, or wearing thin membranes while submerged in water.

This setup does give users the sensation of physically interacting with an object, unlike the Microsoft Kinect. It seems to me this rings truer with our experience of the world. Even when we are typing on a keyboard the world pushes back on us.

At the far end of the spectrum is a setup like the holodecks of Star Trek: complete immersion in a holographic world which admits of interaction. Is equipping a room size space with ultrasound emitters feasible? Or would the blasts of ultrasound from all walls disorient the user, break their eardrums, and damage their skin?

Sunday, September 12, 2010

Swype


[Video: Swype demo]

The above video demos Swype, a new way to interact with digital keyboards. This reminds me of what I've heard about the shift from typewriters to the first computers. Apparently the first computers closely modeled the typewriter, including its limitations: like a typewriter, you could only edit the line at the bottom; you couldn't move the cursor up and down between rows.
I think Swype shows us that just because we've kept the keyboard interface on touch phones, we don't need to keep the limitations inherent in plastic buttons. You could not slide your finger over your plastic keyboard. Also, digital keyboards are often significantly smaller than their plastic counterparts, making typing harder. Keyboards were designed for typing with all ten fingers over a large (multiple-inch) area; it's no wonder that shrinking the keyboard down to phone size causes problems. One hopes the autocorrection is very smart, otherwise swyping would be very frustrating.
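To see why the correction has to be smart, here is a toy sketch of my own (not Swype's actual algorithm) that scores dictionary words against the sequence of keys a swipe slides across. Many words can share one trace, so the decoder must rank candidates rather than find a single exact match:

```python
from itertools import groupby

def squeeze(s):
    """Collapse runs of repeated letters: 'hello' -> 'helo'."""
    return "".join(ch for ch, _ in groupby(s))

def is_subsequence(word, trace):
    """True if the letters of `word` appear in order within `trace`."""
    it = iter(trace)
    return all(ch in it for ch in word)

def decode_swipe(trace, dictionary):
    """Rank dictionary words against the keys a swipe slid across.

    A swipe passes over a doubled letter only once, so words are
    'squeezed' before matching. A word is a candidate if it starts and
    ends where the swipe did and its squeezed letters appear, in order,
    in the trace. This is a toy ranking -- the real product surely also
    weighs key geometry, timing, and word frequency.
    """
    trace = squeeze(trace)
    candidates = [
        w for w in dictionary
        if w[0] == trace[0] and w[-1] == trace[-1]
        and is_subsequence(squeeze(w), trace)
    ]
    # Prefer longer words: they explain more of the trace.
    return sorted(candidates, key=len, reverse=True)

# Example: a swipe for "hello" on QWERTY might register these keys.
print(decode_swipe("hgfdsertyuklo", ["hello", "ho", "help", "halo", "hero"]))
# -> ['hello', 'hero', 'ho']
```

Even in this tiny dictionary, "hero" and "ho" survive alongside "hello", which is exactly the ambiguity that makes word frequency and sentence context so important to getting the correction right.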