Technology, Nov 2008
Touch and go at interface

New ways to navigate

While experts at the intersection of computing, behavioural sciences and design look into how hardware, software and wetware (us) can work together, the reality is that most computer users remain totally reliant on advances from a much earlier age.

Despite billions of dollars of research and development, the humble ‘qwerty’ keyboard layout, patented in 1874 by Milwaukee newspaper editor Christopher Sholes, still dominates the way we input data into PCs, laptops and smartphones.

While keyboards come in all shapes and sizes, often for ergonomic reasons, attempts to re-order the keys or even do away with them have been fiercely resisted. However, advances in pointing and clicking, speaking, touching and animated interaction are beginning to break new ground.

While some cellphones respond to simple voice commands, clever voice recognition programmes like Dragon NaturallySpeaking from Nuance, claiming up to 99 percent accuracy, have brought great relief to many who are tired of letting their fingers do the talking.

Voice recognition has proved ideal for some disabled people, legal offices, writers and others who want to bypass the double handling that dictation machines require. The software can not only translate voice into text but also issue commands to the computer to open, close and control certain programs.
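For the technically curious, the command side of this boils down to matching a recognised phrase against a table of actions. The short Python sketch below is purely illustrative and is not the Dragon software’s API; the recognise() stub and the command names are hypothetical stand-ins.

    # Illustrative sketch only -- not Dragon NaturallySpeaking. The recognise()
    # stub and the command table below are hypothetical stand-ins.
    import subprocess

    COMMANDS = {
        "open notepad": ["notepad.exe"],      # assumes a Windows desktop
        "open calculator": ["calc.exe"],
    }

    def recognise() -> str:
        """Stand-in for a speech engine; reads typed text so the sketch runs."""
        return input("Say (type) a command: ").strip().lower()

    def dispatch(phrase: str) -> None:
        """Launch a mapped program, or treat the phrase as dictated text."""
        if phrase in COMMANDS:
            subprocess.Popen(COMMANDS[phrase])   # issue a command to the computer
        else:
            print("Dictated text:", phrase)      # fall back to voice-to-text

    if __name__ == "__main__":
        dispatch(recognise())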

Of course you have to train the software to recognise your particular way of speaking, and there’s still the embarrassment of appearing to talk to yourself, as well as interference from background noise.

Pointing the finger

Over the years, text recognition packages that work with your scanner to transform words from the page onto the screen have become extremely useful for researchers and archivists. And handwriting recognition is no longer the trial and mostly error process it was in the days of the Apple Newton or early tablet computers.

Now apparently ‘touching is the new seeing’ when it comes to relating to computers. The revolutionary Nintendo Wii gaming console, for example, works on motion-sensing principles. The maturity of touch screen multimedia smartphones, such as the iPhone and its imitators, is further evidence of the trend.

In this more tactile approach, fingers become the pointers, selecting applications, movies and music, flipping through images and gesturing to stretch and enlarge what appears on the screen.
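Behind the pinch-to-stretch gesture the arithmetic is surprisingly simple: the zoom factor is just the ratio of how far apart the fingers end up to how far apart they started. A minimal Python sketch with made-up coordinates (not any phone maker’s actual framework):

    # Illustrative sketch only: the arithmetic behind a "pinch to stretch" gesture.
    from math import hypot

    def pinch_scale(start_pts, end_pts):
        """Return the zoom factor implied by two fingers moving apart or together."""
        (x1, y1), (x2, y2) = start_pts
        (x3, y3), (x4, y4) = end_pts
        start_dist = hypot(x2 - x1, y2 - y1)   # finger separation at touch-down
        end_dist = hypot(x4 - x3, y4 - y3)     # finger separation now
        return end_dist / start_dist           # >1 enlarges, <1 shrinks

    # Fingers start 100 px apart and finish 150 px apart: the image scales by 1.5x.
    print(pinch_scale([(0, 0), (100, 0)], [(0, 0), (150, 0)]))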

The touch screen or touch pad has supplanted, or at least complemented, the mouse in some environments, particularly on notebook computers. And Microsoft's Vista operating system has improved support for tablet computers, making pen input accurate enough to be a serious contender to keyboard supremacy, particularly when used in conjunction with a projector and whiteboard in the lecture theatre or sales presentation.

HP’s new TouchSmart all-in-one PC, available with a 19-inch or 22-inch touch screen, uses hardware designed by Kiwi company Next Window and is intended to change people’s behaviour for short transactions in a shared environment. At the touch of an icon you can go directly to email or to websites for weather, news, TV, recipes or games. It has a virtual keyboard for text input and admirably accurate handwriting recognition as well.

While it’s meant as a home unit, it’s quickly finding favour with businesses using it for public or kiosk-style installations, including Te Papa museum in Wellington, where it showcases items that remain in storage.

Below the surface

And the line between application and touch-sensitive screen can’t get much thinner than in the device Microsoft is flaunting around the trade shows. The Surface coffee table computer features a 30-inch touch-screen panel, can interact with credit cards and is controlled by hand gestures.

It’s being used in select hotels, restaurants and casinos where patrons can order drinks, make dinner reservations, book shows, watch YouTube videos, play touch screen games and split the bill by dragging menu items onto separate cards. Surface computers, based on Windows Vista, cost around NZ$14,000 each.

And interaction with the environment around us can be enabled through smart navigational devices that can identify services and points of interest along the way.

Local company Navman has been a pioneer in the field of navigation software for marine and fleet vehicles but is increasingly finding a consumer niche helping drivers and pedestrians reach their destination more quickly and accurately. It’s even programmed to help you avoid red light cameras.

Its new Navman S-Series Platinum range (starting at $450), housed in a 13.5mm brushed metal case, has a scrollable "glide" touch screen and a new intelligent search interface. You can input an address or search for a location and it’ll give you verbal ‘turn left, turn right’ style instructions.
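Roughly speaking, those spoken cues come from comparing the bearing of the road you are on with the bearing of the next leg of the route. A toy Python sketch of the idea follows; the threshold and function name are invented for illustration and are not Navman’s actual software.

    # Illustrative sketch only: turning a change of bearing into a spoken-style cue.
    def turn_instruction(current_bearing: float, next_bearing: float) -> str:
        """Bearings in degrees clockwise from north; returns a driving cue."""
        delta = (next_bearing - current_bearing + 540) % 360 - 180  # signed turn angle
        if abs(delta) < 20:                   # small change: no turn announced
            return "continue straight ahead"
        return "turn right" if delta > 0 else "turn left"

    print(turn_instruction(0, 90))    # heading north, next leg heads east -> "turn right"
    print(turn_instruction(90, 0))    # heading east, next leg heads north -> "turn left"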

Visual computing era

Button-based gaming consoles, mostly about controlling motion and making quick-click decisions, don’t lend themselves to the input of coherent language, although lessons learned from gamers are likely to have a big impact on future interfaces.

At the first NVision conference in Silicon Valley, computer graphics researchers claimed we are entering the era of visual computing, with more lifelike graphics increasingly shaping the way we interact with machinery and computers.

The rising tide of digital videos, photos, films and television shows on the Internet is forcing graphics card and chip manufacturers to lift their game and process still and moving images in new ways that provide greater depth and interactivity.

Car makers are using 3D graphics and animation software to help buyers customise new vehicles and take them on virtual test drives, while clothing designers can replicate fabric and fashion so finely they can see what a garment looks like on the client before it’s even made.

Graphics processing increasingly underpins financial modelling and weather forecasting, while computer-aided design is being used by manufacturers to help build aircraft and buildings, and for training purposes in a range of professions.

It’s thought 3D animation may soon make an appearance in next-generation social networking sites, and that computer simulations will become so realistic that virtual activities will mirror physical experiences.

Next-generation touch screen computing could make the mouse obsolete, and speech-to-text researchers are exploring "subvocalisation", where sensors pick up signals from the nervous system as you talk silently to yourself.

Handwriting recognition remains far from mainstream, and handwriting itself is relatively slow, which is why we needed keyboards in the first place. The problem is that most of us just stumble along until we find our own hunt-and-peck method, which stays with us for life.

If the keyboard is to continue as a pervasive input device, some experts are asking, why isn’t touch-typing a mainstream course in our schools? Until there are significant advances towards a commercially viable ‘thought interface’, it looks like we’re stuck with 134-year-old technology for a few more years.
