Artificial intelligence (AI), hyperimaging, macroscopes and smart sensors are some of the biggest innovations that will help change our lives within five years.
This is according to IBM, which yesterday unveiled its annual "IBM 5 in 5" - a list of ground-breaking scientific innovations with the potential to change the way people work, live and interact during the next five years.
The IBM 5 in 5 is based on market and societal trends as well as emerging technologies from IBM's research labs around the world.
"The scientific community has a wonderful tradition of creating instruments to help us see the world in entirely new ways. For example, the microscope helped us see objects too small for the naked eye and the thermometer helped us understand the temperature of the Earth and human body," says Dario Gil, VP of science and solutions at IBM Research.
"With advances in artificial intelligence and nanotechnology, we aim to invent a new generation of scientific instruments that will make the complex invisible systems in our world today visible over the next five years."
AI and mental health
IBM says brain disorders, including developmental, psychiatric and neurodegenerative diseases, represent an enormous disease burden in terms of human suffering and economic cost. For example, today, one in five adults in the US experiences a mental health condition such as depression, bipolar disorder or schizophrenia, and roughly half of individuals with severe psychiatric disorders receive no treatment. The global cost of mental health conditions is projected to surge to $6 trillion by 2030, the company notes.
"If the brain is a black box that we don't fully understand, then speech is a key to unlock it. In five years, what we say and write will be used as indicators of our mental health and physical well-being. Patterns in our speech and writing analysed by new cognitive systems will provide tell-tale signs of early-stage developmental disorders, mental illness and degenerative neurological diseases that can help doctors and patients better predict, monitor and track these conditions," Gil says.
In the future, similar techniques could be used to help patients with Parkinson's, Alzheimer's, Huntington's disease, PTSD and even neurodevelopmental conditions such as autism and ADHD, IBM adds.
Cognitive computers can analyse a patient's speech or written words to look for tell-tale indicators found in language, including meaning, syntax and intonation. Combining the results of these measurements with data from wearable devices and imaging systems, collected in a secure network, can paint a more complete picture of the individual for health professionals to better identify, understand and treat the underlying disease.
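To make the idea concrete, the toy Python sketch below shows how a writing sample and a single wearable reading might be turned into a combined feature profile for a downstream screening model. It is purely illustrative: the feature choices, function names and wearable signal are assumptions, not a description of IBM's cognitive systems.

```python
# Illustrative sketch only: a toy pipeline that turns a writing sample and a
# wearable reading into a feature vector for a downstream screening model.
# Feature choices and names are hypothetical, not IBM's actual system.
import re
from statistics import mean

def linguistic_features(text: str) -> dict:
    """Extract simple proxies for the 'meaning, syntax and intonation' cues
    a real cognitive system would model far more deeply."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-zA-Z']+", text.lower())
    first_person = sum(1 for w in words if w in {"i", "me", "my", "mine"})
    return {
        "avg_sentence_length": mean(len(s.split()) for s in sentences) if sentences else 0.0,
        "type_token_ratio": len(set(words)) / len(words) if words else 0.0,  # lexical variety
        "first_person_rate": first_person / len(words) if words else 0.0,    # self-focus proxy
    }

def combined_profile(text: str, resting_heart_rate: float) -> dict:
    """Merge language features with one wearable signal, echoing the article's
    idea of combining speech analysis with wearable and imaging data."""
    profile = linguistic_features(text)
    profile["resting_heart_rate"] = resting_heart_rate
    return profile

if __name__ == "__main__":
    sample = "I can't sleep. I keep thinking about everything I did wrong."
    print(combined_profile(sample, resting_heart_rate=78.0))
```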
Superhero vision
IBM says more than 99.9% of the electromagnetic spectrum cannot be observed by the naked eye. Over the last 100 years, it notes, scientists have built instruments that can emit and sense energy at different wavelengths.
"Today, we rely on some of these to take medical images of our body, see the cavity inside our tooth, check our bags at the airport, or land a plane in fog. However, these instruments are incredibly specialised and expensive and only see across specific portions of the electromagnetic spectrum," Gil points out.
He notes that in five years, new imaging devices using hyperimaging technology and AI will help people see broadly beyond the domain of visible light, combining multiple bands of the electromagnetic spectrum to reveal valuable insights or potential dangers that would otherwise be unknown or hidden from view.
Most importantly, IBM says, these devices will be portable, affordable and accessible, so superhero vision can be part of our everyday experiences.
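As a rough illustration of the fusion step behind hyperimaging, the sketch below blends several normalised spectral bands into one composite view. The band names, weights and random arrays standing in for real captures are assumptions for illustration, not a description of IBM's devices.

```python
# Illustrative sketch only: fusing several spectral bands into a single image,
# the basic idea behind 'hyperimaging'. Random arrays stand in for visible,
# infrared and millimetre-wave captures; weights are arbitrary.
import numpy as np

def fuse_bands(bands: dict, weights: dict) -> np.ndarray:
    """Normalise each band to [0, 1] and blend them with per-band weights."""
    fused = np.zeros_like(next(iter(bands.values())), dtype=float)
    for name, band in bands.items():
        lo, hi = band.min(), band.max()
        norm = (band - lo) / (hi - lo) if hi > lo else np.zeros_like(band, dtype=float)
        fused += weights.get(name, 0.0) * norm
    return fused / max(sum(weights.values()), 1e-9)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    shape = (4, 4)
    bands = {
        "visible": rng.random(shape),
        "infrared": rng.random(shape),  # e.g. heat signatures through fog
        "mmwave": rng.random(shape),    # e.g. objects hidden behind surfaces
    }
    view = fuse_bands(bands, {"visible": 0.5, "infrared": 0.3, "mmwave": 0.2})
    print(view.round(2))
```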
Infinite detail
According to the computing company, today the physical world gives people only a glimpse into its interconnected and complex ecosystem.
"We collect exabytes of data - but most of it is unorganised. In fact, an estimated 80% of a data scientist's time is spent scrubbing data instead of analysing and understanding what that data is trying to tell us," says Gil.
"Thanks to the Internet of things, new sources of data are pouring in from millions of connected objects - from refrigerators, light bulbs and your heartrate monitor to remote sensors such as drones, cameras, weather stations, satellites and telescope arrays. There are already more than six billion connected devices generating tens of exabytes of data per month, with a growth rate of more than 30% per year. After successfully digitising information, business transactions and social interactions, we are now in the process of digitising the physical world."
He explains that in five years, people will use machine-learning algorithms and software to help organise information about the physical world. This will bring the vast and complex data gathered by billions of devices within the range of human vision and understanding.
"We call this a 'macroscope' - but, unlike the microscope to see the very small, or the telescope that can see far away, it is a system of software and algorithms to bring all of Earth's complex data together to analyse it for meaning."
Early detection
Early detection of disease is crucial, says IBM, adding that in most cases, the earlier the disease is diagnosed, the more likely it is to be cured or well controlled.
However, it says diseases like cancer can be hard to detect - hiding in the body before symptoms appear. Information about the state of our health can be extracted from tiny bioparticles in bodily fluids such as saliva, tears, blood, urine and sweat. Existing scientific techniques face challenges in capturing and analysing these bioparticles, which are thousands of times smaller than the diameter of a strand of human hair.
In the next five years, IBM says new medical labs "on a chip" will serve as nanotechnology health detectives - tracing invisible clues in our bodily fluids and letting us know immediately if we have reason to see a doctor. The goal is to shrink down to a single silicon chip all of the processes necessary to analyse a disease that would normally be carried out in a full-scale biochemistry lab.
Smart sensors
Most pollutants are invisible to the human eye, until their effects make them impossible to ignore, says IBM.
The company explains that methane, for example, is the primary component of natural gas, commonly considered a clean energy source. But if methane leaks into the air before being used, it can warm the Earth's atmosphere. Methane is estimated to be the second largest contributor to global warming after carbon dioxide.
In five years, new, affordable sensing technologies deployed near natural gas extraction wells, around storage facilities, and along distribution pipelines, will enable the industry to pinpoint invisible leaks in real time. Networks of IoT sensors wirelessly connected to the cloud will provide continuous monitoring of the vast natural gas infrastructure, allowing leaks to be found in a matter of minutes instead of weeks, reducing pollution and waste and the likelihood of catastrophic events.
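As a simple illustration of that continuous-monitoring idea, the sketch below flags candidate leaks when a sensor reading rises sharply above its recent rolling baseline. The window size, threshold and readings are hypothetical values chosen for the example, not industry figures or IBM's method.

```python
# Illustrative sketch only: flagging a possible methane leak from a stream of
# sensor readings using a rolling baseline and a fixed rise threshold.
# The window size and ppm threshold are hypothetical, not industry values.
from collections import deque

def detect_leaks(readings_ppm: list, window: int = 10, rise_ppm: float = 5.0) -> list:
    """Return indices where a reading exceeds the recent baseline by rise_ppm."""
    baseline = deque(maxlen=window)
    alerts = []
    for i, value in enumerate(readings_ppm):
        if len(baseline) == window:
            avg = sum(baseline) / window
            if value - avg > rise_ppm:
                alerts.append(i)  # candidate leak: sharp rise over baseline
        baseline.append(value)
    return alerts

if __name__ == "__main__":
    stream = [2.0] * 12 + [2.1, 9.5, 10.2, 2.2]  # brief spike around index 13
    print("alert indices:", detect_leaks(stream))
```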