
AI comes of age

By Lezette Engelbrecht, ITWeb online features editor
Johannesburg, 23 Feb 2010

The focus of artificial intelligence (AI) research has undergone a shift - from trying to simulate human thinking, to specific “intelligent” functions, like data mining and statistical learning theory.

Steve Kroon, computer science lecturer at the University of Stellenbosch, says, in the past, people were enthusiastic about machines that could think like people. “Now, many researchers figure the challenges of the present day are things we need 'alternative intelligence' for - skills that humans can make use of, but don't have themselves.”

The best examples of these “alternative intelligence” fields, notes Kroon, are data mining and machine learning - using advanced statistical analysis to find patterns in the vast amounts of data we're confronted with today.

“That's not to say research isn't being done on human-like AI, but even tasks like speech recognition and computer vision are more and more being seen as tasks that will yield to statistical analyses.”
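The pattern-finding Kroon describes can be illustrated with one of the simplest statistical learning techniques: clustering. The sketch below, in plain Python with made-up one-dimensional readings, groups raw measurements with a basic k-means loop; it is an illustration of the idea, not any specific research system.

```python
def kmeans_1d(values, k, iterations=20):
    """Cluster 1-D values into k groups by iteratively refining centres."""
    # Seed the centres with evenly spaced sorted values.
    centres = sorted(values)[::max(1, len(values) // k)][:k]
    for _ in range(iterations):
        groups = [[] for _ in centres]
        for v in values:
            # Assign each value to its nearest centre.
            nearest = min(range(len(centres)), key=lambda i: abs(v - centres[i]))
            groups[nearest].append(v)
        # Move each centre to the mean of its assigned values.
        centres = [sum(g) / len(g) if g else c
                   for g, c in zip(groups, centres)]
    return sorted(centres)

readings = [1.0, 1.2, 0.9, 5.1, 5.3, 4.8, 9.9, 10.2, 10.0]
print(kmeans_1d(readings, 3))  # three cluster centres near 1, 5 and 10
```

No expert told the program where the groups lie; the structure emerges from the data itself, which is the shift Kroon is pointing to.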

In the 1950s, researchers began exploring the idea of artificially intelligent systems, with mathematician Alan Turing establishing some of the characteristics of intelligent machines. Early work on AI spanned a wide range of fields, but soon developed an emphasis on programming computers.*

“In the past, there was a lot of research on rule-based systems and expert systems,” notes Kroon. “But now, we're faced with areas where there's so much data that even the experts are at a loss to explain it.”

The triumph of these methods, according to Kroon, is that they're discovering things the experts aren't aware of. “Bio-informatics is a great example of this; analysing the data leads to hypotheses, which the biochemists and biologists can attempt to verify, so this new knowledge can help in the development of new medication.

“We're seeing the shift from computation as simply a tool for use by the researcher to validate his hypotheses, to computation being used to generate sensible hypotheses for investigation - hypotheses humans would probably never have found by manually looking at the data.”

Information generation

Artificial intelligence has become so ingrained in people's daily lives that it has become ordinary, says professor Tshilidzi Marwala, executive dean of the Faculty of Engineering and the Built Environment, at the University of Johannesburg.

“Fingerprint recognition is now a common technology. Intelligent word processing systems that guess words about to be typed are now common. Face recognition software is used by security agencies - the situation has shifted to more advanced and realistic applications,” he states.

“We're already seeing automated systems for so many things in our daily lives,” notes Kroon. He adds that people are reading and remembering facts less than they did a generation ago, relying instead on being able to look up information on the Internet, whenever they need it.

“Combine that kind of technology with things like recommender systems and location-aware tools, and soon you'll have a constant stream of information relevant to you, available for your consumption as you need it.”

A project exemplifying this trend is the Massachusetts Institute of Technology's (MIT's) SixthSense prototype, a gestural interface that projects digital information onto physical surfaces, and lets users interact with it via hand gestures.

“When I look around and see how many people are now using mobile smartphones instead of the desktop computers of a couple of years back, and this SixthSense technology they've been prototyping at MIT, I get the impression that 'augmented intelligence' is going to be a big thing in coming years,” says Kroon.

Web search technologies are widely seen as an application of AI, he adds, with Wolfram Alpha being a prominent example. The “knowledge engine” answers user queries directly by computing information from a core database, instead of searching the Web and returning links.

“Its premise was that people want answers to questions, not just a list of links. And I think they're right, but there's a long way to go before this is powerful enough to dethrone the classical search engine approach,” states Kroon.

“Understanding the question being asked, and trying to infer context for that question, are difficult challenges in AI before one can even start to construct an answer to the question,” he adds.

Another development in this direction is the new “social search engine” Aardvark, recently acquired by Google. “Aardvark uses machine-learning techniques to understand social networks, and then provides answers to a user's query by passing it on to the people its system believes are best placed to answer it,” explains Kroon.

“So, in this model, Aardvark's 'AI' is simply responsible for teaming up a person with a question and someone who can give that person a good answer. This sort of system works well when you're looking for more personalised responses, like hotel and restaurant recommendations, as opposed to the impersonal information typically served up by a regular search engine.”
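The matchmaking model Kroon outlines can be sketched in a few lines: score each candidate answerer by how well their declared interests overlap the question, and route the question to the best match. The names, topics and scoring below are purely illustrative, not Aardvark's actual algorithm.

```python
def route_question(question, candidates):
    """Pick the candidate whose topic keywords best cover the question."""
    words = set(question.lower().split())
    def score(person):
        # Count how many of the question's words appear in the person's topics.
        return len(words & person["topics"])
    return max(candidates, key=score)["name"]

people = [
    {"name": "Thandi", "topics": {"hotels", "travel", "restaurants"}},
    {"name": "Pieter", "topics": {"python", "databases", "linux"}},
]
print(route_question("any good restaurants near the airport?", people))
```

The “intelligence” here is confined to the matching step; the answer itself still comes from a person, which is exactly the division of labour Kroon describes.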

Mind over matter

Clifford Foster, CTO at IBM SA, says AI offers significant ways of handling the explosion of data in the world. This follows from using computers to simulate intelligent processes and understand information in context, notes Foster. “This can be applied in a number of areas, such as a recent system to predict and understand the impact of anti-retrovirals on HIV patients.”

The vast amount of information being generated must be processed in near real-time to prevent problems before they happen - far faster than humans can compute, explains Foster. “So, if you give machines the ability to analyse and respond to this data, it fundamentally changes the way we manage and use information.”

He points to applications, such as orchestrating traffic lights according to traffic flow at various times of the day, or medical diagnoses for people in remote areas.

“One of the biggest healthcare challenges in Africa is limited access to medical professionals. But if a person could present their problem to a computer capable of understanding the symptoms, it could search medical data banks for related content, ask additional questions for greater accuracy, and provide an informed diagnosis, which could then be passed on to a professional.”
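The diagnosis workflow Foster sketches - match reported symptoms against medical data, then ask follow-up questions to sharpen the result - can be illustrated with a toy rule-based matcher. The conditions and symptoms below are invented for illustration; a real system would draw on vetted medical databases and a clinician's review.

```python
# A tiny, illustrative symptom knowledge base (not medical advice).
CONDITIONS = {
    "flu": {"fever", "cough", "fatigue", "aches"},
    "common cold": {"cough", "sneezing", "sore throat"},
    "malaria": {"fever", "chills", "sweating", "headache"},
}

def rank_conditions(symptoms):
    """Order conditions by the number of reported symptoms they match."""
    symptoms = set(symptoms)
    scored = [(len(symptoms & signs), name)
              for name, signs in CONDITIONS.items()]
    return [name for score, name in sorted(scored, reverse=True) if score]

def follow_up(symptoms):
    """Suggest one unreported symptom of the top-ranked condition to ask about."""
    ranking = rank_conditions(symptoms)
    if not ranking:
        return None
    missing = CONDITIONS[ranking[0]] - set(symptoms)
    return min(missing) if missing else None

print(rank_conditions({"fever", "cough"}))  # flu ranks first
print(follow_up({"fever", "cough"}))        # a flu symptom to ask about next
```

Even this toy version shows the loop Foster has in mind: match, ask, refine, and only then hand the result to a professional.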

Foster believes investment will continue in areas where AI improves people's lives. “It's become less about trying to replicate the brain and more about complementing the way we connect with the world.”

For example, replicating the processing involved in catching a ball - calculating its trajectory, activating the correct muscles to catch it, and absorbing the impact of the catch - requires a phenomenal amount of computing power, notes Foster, but it's not very useful.

“For a long time, people were confined by the idea that AI must simulate the human brain, but where's the value in that? A programme that can aid people in solving problems, whether it be running their data centre, or managing traffic flow, or reducing the mortality rate, is much more valuable.”

Marwala argues that intelligent machines will always be created to perform a particular function, or a handful of them. “An intelligent machine that performs many tasks is as elusive a concept as 'the theory of everything', but the adaptation and evolution of a machine performing a specific task is perfectly possible.”

Foster believes the intersection of technology, business and people is where AI research is going, and where it can have most impact. “Using intelligent systems to deliver help quicker in areas such as healthcare and education could have a profound impact on society, and change people's lives in ways they never thought of.”

Source:
* What is artificial intelligence?
