Perhaps the coolest thing about IBM’s 9th “Five Innovations that will Help Change our Lives within Five Years” predictions is that none of them sound like science fiction.
“With advances in artificial intelligence and nanotechnology, we aim to invent a new generation of scientific instruments that will make the complex invisible systems in our world today visible over the next five years,” said Dario Gil, vice president of science & solutions at IBM Research, in a statement.
The five areas IBM sees as key over the next five years include artificial intelligence, hyperimaging and small sensors. Specifically, according to IBM:
1. In five years, what we say and write will be used as indicators of our mental health and physical wellbeing. Patterns in our speech and writing analyzed by new cognitive systems will provide tell-tale signs of early-stage mental and neurological diseases that can help doctors and patients better predict, monitor and track these diseases. At IBM, scientists are using transcripts and audio inputs from psychiatric interviews, coupled with machine learning techniques, to find patterns in speech to help clinicians accurately predict and monitor psychosis, schizophrenia, mania and depression.
Today, it takes only about 300 words to help clinicians predict the probability of psychosis in a user. Cognitive computers can analyze a patient’s speech or written words to look for tell-tale indicators found in language, including meaning, syntax and intonation. Combining the results of these measurements with those from wearable devices and imaging systems (MRIs and EEGs) can paint a more complete picture of the individual for health professionals to better identify, understand and treat the underlying disease.
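To make the idea concrete, here is a minimal sketch of how linguistic features might be pulled from a transcript and combined into a score. The features, weights and sample text are entirely illustrative and are not IBM's actual model, which the article does not describe in detail.

```python
# Illustrative sketch: extract simple linguistic features from a
# transcript and combine them into a linear score, as a trained
# classifier might after learning weights from labeled interviews.
# All feature choices and weights here are invented for illustration.

def speech_features(transcript: str) -> dict:
    words = transcript.split()
    sentences = [s for s in transcript.replace("?", ".").split(".") if s.strip()]
    return {
        # proxy for syntactic complexity
        "avg_sentence_len": len(words) / max(len(sentences), 1),
        # proxy for lexical diversity
        "type_token_ratio": len(set(w.lower() for w in words)) / max(len(words), 1),
        "n_words": len(words),
    }

def risk_score(features: dict, weights: dict) -> float:
    # Linear combination of features; a real system would learn these
    # weights from clinical data rather than hand-set them.
    return sum(weights.get(k, 0.0) * v for k, v in features.items())

sample = "I went to the store. I bought bread. The bread was good."
feats = speech_features(sample)
score = risk_score(feats, {"avg_sentence_len": -0.1, "type_token_ratio": 1.0})
```

A production system would add semantic-coherence measures over word embeddings and acoustic features such as intonation, which this toy sketch omits.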
2. In five years, new imaging devices using hyperimaging technology and AI will help us see broadly beyond the domain of visible light by combining multiple bands of the electromagnetic spectrum to reveal valuable insights or potential dangers that would otherwise be unknown or hidden from view. Most importantly, these devices will be portable, affordable and accessible, so superhero vision can be part of our everyday experiences.
A view of the invisible or vaguely visible physical phenomena all around us could help make road and traffic conditions clearer for drivers and self-driving cars. For example, using millimeter wave imaging, a camera and other sensors, hyperimaging technology could help a car see through fog or rain, detect hazardous and hard-to-see road conditions such as black ice, or tell us if there is some object up ahead and its distance and size. Embedded in our phones, these same technologies could take images of our food to show its nutritional value or whether it’s safe to eat. A hyperimage of a pharmaceutical drug or a bank check could tell us what’s fraudulent and what’s not.
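The fog-driving example above can be sketched at toy scale: an obstacle that is washed out in the visible band but reflects strongly at millimeter wavelengths survives a simple per-pixel fusion of the two bands. The band data and fusion rule below are invented for illustration, not a real hyperimaging pipeline.

```python
# Illustrative sketch: fuse two spectral bands so an object hidden from
# visible light (e.g. by fog) but visible to millimeter-wave imaging
# still appears in the combined image. Values are toy data.

visible = [  # fog washes the visible band out to uniform gray
    [0.5, 0.5, 0.5],
    [0.5, 0.5, 0.5],
]
mmwave = [   # millimeter-wave return: strong reflection from an object
    [0.0, 0.9, 0.0],
    [0.0, 0.9, 0.0],
]

def fuse(bands, weights):
    """Per-pixel weighted maximum across bands - one simple fusion rule."""
    rows, cols = len(bands[0]), len(bands[0][0])
    return [[max(w * band[r][c] for band, w in zip(bands, weights))
             for c in range(cols)] for r in range(rows)]

fused = fuse([visible, mmwave], weights=[1.0, 1.0])
# Pixels standing out above the fog background mark the obstacle.
obstacle = [(r, c) for r in range(2) for c in range(3) if fused[r][c] > 0.8]
```

Real systems would register the bands to a common geometry and use learned fusion rather than a fixed max, but the principle of cross-band complementarity is the same.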
3. In the next five years, new medical labs on a chip will serve as nanotechnology health detectives, tracing invisible clues in our bodily fluids and letting us know immediately if we have reason to see a doctor. The goal is to shrink down to a single silicon chip all of the processes necessary to analyze a disease that would normally be carried out in a full-scale biochemistry lab.
The lab-on-a-chip technology could ultimately be packaged in a convenient handheld device to let people quickly and regularly measure the presence of biomarkers found in small amounts of bodily fluids, sending this information streaming into the cloud from the convenience of their home. There it could be combined with data from other IoT-enabled devices, like sleep monitors and smart watches, and analyzed by AI systems for insights. When taken together, this data set will give us an in-depth view of our health and alert us to the first signs of trouble, helping to stop disease before it progresses.
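The screening step of such a device reduces, at its simplest, to comparing measured biomarker levels against reference ranges before the readings are uploaded. The marker names and ranges below are invented for illustration; real panels and thresholds would come from clinical validation.

```python
# Hypothetical sketch: a handheld lab-on-a-chip reader flags biomarker
# readings that fall outside a reference range before streaming them to
# the cloud. Marker names, units and ranges are illustrative only.

REFERENCE_RANGES = {  # (low, high) in arbitrary units
    "marker_a": (0.0, 1.2),
    "marker_b": (0.5, 3.0),
}

def screen(reading: dict) -> list:
    """Return the markers whose measured level is out of range."""
    flags = []
    for marker, level in reading.items():
        low, high = REFERENCE_RANGES[marker]
        if not (low <= level <= high):
            flags.append(marker)
    return flags

alerts = screen({"marker_a": 2.4, "marker_b": 1.1})
```

In the scenario IBM describes, out-of-range readings like these would be the trigger for cloud-side AI to correlate the result with sleep-monitor and smart-watch data before alerting the user.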
4. In five years, new, affordable sensing technologies deployed near natural gas extraction wells, around storage facilities, and along distribution pipelines will enable the industry to pinpoint invisible leaks in real-time. Networks of IoT sensors wirelessly connected to the cloud will provide continuous monitoring of the vast natural gas infrastructure, allowing leaks to be found in a matter of minutes instead of weeks, reducing pollution and waste and the likelihood of catastrophic events. Scientists at IBM are working with natural gas producers such as Southwestern Energy to explore the development of an intelligent methane monitoring system as part of the ARPA-E Methane Observation Networks with Innovative Technology to Obtain Reductions (MONITOR) program.
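The core of continuous leak monitoring is anomaly detection: flag a sensor whose reading climbs well above its own recent baseline. The sketch below shows that idea with invented readings and thresholds; it is not the MONITOR program's algorithm, and a deployed system would also localize leaks by correlating neighboring sensors.

```python
# Illustrative sketch: raise a leak alert when a methane sensor's latest
# reading far exceeds its recent baseline. Readings (ppm) and the
# threshold factor are invented for illustration.

from statistics import mean

def leak_alert(history, latest, factor=3.0):
    """Alert when `latest` exceeds `factor` times the baseline mean."""
    baseline = mean(history)
    return latest > factor * baseline

normal = [2.0, 2.1, 1.9, 2.0]       # background methane readings
quiet = leak_alert(normal, 2.2)     # small fluctuation: no alert
spike = leak_alert(normal, 9.0)     # sharp rise: possible leak
```

Minutes-scale detection comes from running a check like this continuously in the cloud over every sensor's stream, rather than waiting for periodic manual surveys.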
5. In five years, we will use machine-learning algorithms and software to organize information about the physical world, bringing the vast and complex data gathered by billions of devices within the range of our vision and understanding. IBM calls this a "macroscope": unlike the microscope, which sees the very small, or the telescope, which sees the far away, it is a system of software and algorithms that brings all of Earth's complex data together to analyze it for meaning.
By aggregating, organizing and analyzing data on climate, soil conditions, water levels and their relationship to irrigation practices, for example, a new generation of farmers will have insights that help them determine the right crop choices, where to plant them and how to produce optimal yields while conserving precious water supplies. Beyond our own planet, macroscope technologies could handle, for example, the complicated indexing and correlation of various layers and volumes of data collected by telescopes to predict asteroid collisions with one another and learn more about their composition.
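At toy scale, the farming example reduces to joining per-field datasets and deriving a decision from the combined view. The field names, values and decision rule below are invented to illustrate the aggregation idea, not any actual macroscope product.

```python
# Hypothetical sketch of the "macroscope" idea: join soil-moisture and
# rain-forecast data per field and derive an irrigation decision.
# All values and thresholds are illustrative.

soil_moisture = {"field_1": 0.12, "field_2": 0.35}   # volumetric fraction
rain_forecast_mm = {"field_1": 1.0, "field_2": 12.0}

def needs_irrigation(field, moisture_threshold=0.20, rain_threshold_mm=5.0):
    """Irrigate only when the soil is dry AND little rain is expected."""
    dry = soil_moisture[field] < moisture_threshold
    rain_coming = rain_forecast_mm[field] >= rain_threshold_mm
    return dry and not rain_coming

plan = {field: needs_irrigation(field) for field in soil_moisture}
```

The value of the macroscope framing is that each added data layer (climate history, water levels, crop prices) refines the same per-field join without changing the decision structure.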
IBM has had some success with its “five in five” predictions in the past. For example, in 2012 it predicted computers would have a sense of smell. IBM says “sniffing” technology is already in use at the Metropolitan Museum of Art, working to preserve and protect priceless works of art by monitoring fluctuations in temperature, relative humidity, and other environmental conditions. “And this same technology is also being used in the agricultural industry to monitor soil conditions, allowing farmers to better schedule irrigation and fertilization schedules, saving water and improving crop yield,” IBM said.
In 2009 it predicted that buildings would sense and respond like living organisms. IBM said it is working with the U.S. General Services Administration (GSA) to develop and install advanced smart building technology in 50 of the federal government’s highest energy-consuming buildings. “Part of GSA’s larger smart building strategy, this initiative connects building management systems to a central cloud-based platform, improving efficiency and saving up to $15 million in taxpayer dollars annually. IBM is also helping the second largest school district in the U.S. become one of the greenest and most sustainable by making energy conservation and cost savings as easy as sending a text message,” IBM stated.
This story, "IBM: Next 5 years AI, IoT and nanotech will literally change the way we see the world" was originally published by Network World.