Killing the mouse. And no, it’s not touchscreens.
For some time now, people (analysts?) have been predicting the death of the computer mouse, first at the hands of the touchscreen and lately of multi-touch. I’m, however, going to claim that both of those technologies are absolutely outgunned by a simple Logitech. And as a cherry on top, I’m going to give my take on what will actually kill the computer mouse.
Why touch-technology has lost already
The claim that plain touch technology is going to replace the mouse is simply dimwitted. Why? Because the two are used in completely different contexts: whereas touchscreens rock the handheld world, the mouse reigns as the sole emperor of desktop computing. Don’t get me wrong, I absolutely love touchscreens and even multi-touch; however, neither of them has the power or the ergonomics of even the simplest 5€ mouse when it comes to desktop computing. Just think of the time you use your computer daily; 2 hours? 4 hours? For me it would be something like 8-10 hours. Now, hold your hand pointed at the screen, as if you were about to click a button on it, and see how long you can keep it up. Then do the same test with a mouse. See the difference? In continuous use, the mouse wins hands down. Sure, touchscreens are intuitive, but when it comes to desktop use, they just suck. Unless, and this is a big if, the whole paradigm of desktop computing changes to something drastically different.
If not touchscreens, then what?
Well, here’s my take: eyetracking. Eyetracking, in its most basic form, means a camera tracking your eyes, software figuring out where on the screen you are actually looking, and a cursor moving on the screen, following your gaze. The technology is becoming more mundane and easier to obtain, meaning it gets cheaper as the software advances every day. There are even a couple of projects dedicated to complete “OSes” for disabled people that can be used with eyetracking alone (gaze: move the cursor, stare: click). Once you try it, you’ll notice the problem for everyday use: with a mouse, just as you are about to click something, your eyes already dart off in another direction, looking for the next target. With eyetracking alone you have to stop on the button and stare to produce a click, while your brain is going “next target! next target!”. Of course, this behaviour fades over time as you get used to the new UI, but it is also the reason such a UI won’t be capable of dethroning the computer mouse: the learning curve is too steep (not to mention it’s slower than what we are used to). But there is something we can do.
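The stare-to-click scheme those gaze-driven “OS” projects use can be sketched as a small dwell-click loop. This is a minimal illustration, not any project’s real code; the 0.5-second dwell threshold, the 40-pixel jitter radius, and the `(time, x, y)` gaze-sample format are all assumptions:

```python
import math

DWELL_TIME = 0.5    # seconds the gaze must rest before a click fires (assumed)
DWELL_RADIUS = 40   # pixels of jitter tolerated while "staring" (assumed)

def dwell_clicks(samples):
    """Yield (x, y) click positions from a stream of (t, x, y) gaze samples.

    A click fires when the gaze stays within DWELL_RADIUS pixels of an
    anchor point for at least DWELL_TIME seconds; darting away resets it.
    """
    anchor = None    # (t, x, y) where the current dwell started
    clicked = False  # suppress repeat clicks during one long stare
    for t, x, y in samples:
        if anchor is None or math.hypot(x - anchor[1], y - anchor[2]) > DWELL_RADIUS:
            anchor = (t, x, y)  # gaze darted away: restart the dwell timer
            clicked = False
        elif not clicked and t - anchor[0] >= DWELL_TIME:
            clicked = True
            yield (anchor[1], anchor[2])

# A 0.6-second stare near (100, 100), then a dart to (500, 300):
gaze = [(0.0, 100, 100), (0.3, 105, 98), (0.6, 102, 103), (0.7, 500, 300)]
print(list(dwell_clicks(gaze)))  # [(100, 100)]
```

Notice the tension the text describes: the loop only fires after the gaze has held still for the full dwell time, which is exactly the “stop and stare” your brain fights against.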
Eyetracking is a natural pointing device for us because of the way our eyes work: peripheral vision is good at spotting targets (it excels at recognizing movement, works in lower light, etc.), whereas actual focusing is needed to see a target’s details. Thus, we use peripheral vision to scan our environment quickly, but once we find something interesting we focus on it. However, the fact that we focus on something only signals that we might find the thing interesting or dangerous, not that we want to interact with it. So we need a way to signal to the computer that we actually want to interact with the target. This is where physical, real-life buttons sit at the top of the food chain: their tactile feedback is just unbeatable (we feel the button go down, feel the actual click of the switch, and hear the audible ‘click’ sound), not to mention that you can rest your finger on one without activating it (unlike touch-driven devices). So here’s my solution in full: use eyetracking for cursor control* and add physical button(s) for clicking (and throw in a physical scroll wheel, just for good measure). Now the learning curve all but disappears; you could work just like you always have and happily keep clicking away, with the obvious exception that you no longer have to chase your gaze with your mouse.
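The hybrid scheme, gaze for pointing and a physical switch for committing the click, can be sketched as a tiny event loop. The event names and tuple shapes here are invented for illustration; a real implementation would read the eyetracker and the button from device drivers:

```python
def handle_events(events):
    """Process an interleaved stream of gaze and button events.

    Each event is either ("gaze", x, y) or ("button_down",). The gaze
    drives the (possibly invisible) cursor; a click lands wherever the
    gaze happens to rest at the moment the physical button goes down.
    """
    cursor = (0, 0)
    clicks = []
    for event in events:
        if event[0] == "gaze":
            cursor = (event[1], event[2])  # eyetracker moves the cursor
        elif event[0] == "button_down":
            clicks.append(cursor)          # physical switch commits the click
    return clicks

# Glance at a target, press the switch, then dart away; the click lands
# where the gaze was at press time, and no dwelling is ever required:
events = [("gaze", 120, 80), ("button_down",), ("gaze", 640, 400)]
print(handle_events(events))  # [(120, 80)]
```

The design point is that resting on a target is never interpreted as intent; only the button press is, which is what removes the “stop and stare” learning curve.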
* By cursor I mean the actual code representation of the gaze coordinates, not necessarily a visible cursor on the screen. You could even toggle the cursor’s visibility depending on whether or not you are touching the button(s).

Posted by Mikko Tikkanen | 0 comments