Today in our software architecture class we talked about simple sensors: IR sensors, accelerometers and the like. That got me thinking about two iPhone bugs I've run into recently.
First, the iPhone has an elaborate system for detecting device orientation and setting screen orientation based on it. It works well, is noise-resistant, gives good feedback, etc. BUT it makes an important assumption: that the user is vertically oriented, e.g. standing or sitting. When the user is horizontally oriented, e.g. lying in bed, the whole thing breaks down. You lie down in bed and try to look at your phone while lying on your side… but the screen orientation is perpendicular (rotated 90°) to what it should be. Very disruptive and annoying. So you can't really read your email in bed with this thing. :P
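To see why the vertical-user assumption matters, here's a minimal sketch of gravity-based orientation detection. This is not Apple's actual algorithm, and the coordinate convention (+x toward the device's right edge, +y toward the top of the screen) is my own assumption for illustration:

```python
# Hypothetical sketch: pick a screen orientation from the accelerometer's
# gravity vector, expressed in device coordinates (+x right, +y up-screen).
def screen_orientation(gx: float, gy: float) -> str:
    """Map the dominant gravity axis to a screen orientation."""
    if abs(gy) >= abs(gx):
        return "portrait" if gy < 0 else "portrait-upside-down"
    return "landscape-right" if gx < 0 else "landscape-left"

# User sitting up, phone held upright: gravity pulls toward the
# device's bottom edge, and the guess is correct.
print(screen_orientation(0.0, -1.0))   # portrait

# User lying on their left side, holding the phone "upright" relative
# to their face: gravity now pulls toward the device's left edge, so
# the phone rotates the screen even though, from the user's point of
# view, nothing has rotated at all.
print(screen_orientation(-1.0, 0.0))   # landscape-right
```

The accelerometer only knows where gravity is relative to the device; it has no idea where the user's head is. When you lie on your side, both you and the phone have rotated 90°, so the "correct" answer relative to gravity is exactly wrong relative to your eyes.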
Secondly, the iPhone has a system where, during a call, touchscreen controls let you mute your mic, use the keypad for DTMF tones, etc. It's supposed to work so that when you're holding the phone away from your face you can use the controls, but when the device is next to your head, i.e. you're talking, brushing it against your face won't activate them. Well, it failed on me this morning. I was on the phone with someone when the other party suddenly started shouting "where did you go???" I tried speaking, but to the other party it sounded like I wasn't saying anything. Then I looked at the controls: turns out that somehow "mute" had been activated. I think I brushed the phone against my face and the device incorrectly registered that as a touchscreen touch.
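The intended behavior boils down to one gate: ignore touches while the proximity sensor says "near". Here's a hypothetical sketch (not iOS's implementation; all names are made up) showing both the intended behavior and the failure mode I hit:

```python
# Hypothetical in-call screen: touch input is gated on a proximity flag.
class InCallScreen:
    def __init__(self):
        self.muted = False
        self.near_face = False  # fed by the proximity sensor

    def on_proximity(self, near: bool):
        self.near_face = near

    def on_touch_mute_button(self):
        # The whole scheme rests on this check. If the proximity
        # reading is stale or wrong, a cheek brush toggles mute.
        if self.near_face:
            return
        self.muted = not self.muted

screen = InCallScreen()
screen.on_proximity(True)      # phone raised to the ear
screen.on_touch_mute_button()  # cheek brush: correctly ignored
assert not screen.muted

screen.on_proximity(False)     # sensor wrongly reports "far" mid-call
screen.on_touch_mute_button()  # the same cheek brush now toggles mute
assert screen.muted            # the failure mode from the story
```

My guess is that something like the second scenario happened: the proximity reading was momentarily wrong (or lagged behind the actual motion), so the cheek touch slipped through the gate.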
One mitigating factor in the latter case, at least: the touchscreen controls are really obvious and salient, so a single glance made it apparent that "mute" was on, and I could simply turn it off and get back to normal.