Five innovations that, according to IBM, will change our lives within five years

IBM has unveiled its seventh annual "5 in 5" (#ibm5in5), a list of innovations that, according to the company, have the potential to change the way people work, live and play over the next five years.

5 in 5 is based on market and social trends, as well as on emerging technologies from research and development laboratories around the world that can make these transformations achievable. This year the list explores the innovations that will form the foundation of the next era of information technology, which Big Blue describes as "the era of cognitive systems". This new generation of machines will be able to learn, adapt, perceive and begin to experience the world as it really is. This year's forecasts focus on one element of the new era: the ability of computers to imitate, in their own way, the human senses: sight, smell, touch, taste and hearing.

These perception skills will help us become more aware and productive and will help us think, but they will not think for us. Cognitive computing systems will help us untangle complexity, keep up with the speed of information, make more informed decisions, improve our health and standard of living, enrich our lives and break down all kinds of barriers, including geographic distance, language, cost and inaccessibility.

Touch: we will be able to touch through the phone
Imagine using your smartphone to buy your wedding dress and being able to touch the satin or silk of the dress, or the lace of the veil, all through the surface of the screen; or to feel the decorations and texture of a blanket made by a craftsman on the other side of the world. In five years, sectors such as retail will be transformed by the ability to "touch" a product through a mobile device. Applications are being developed for retail, healthcare and other sectors that use haptic, infrared or pressure-sensitive technologies to simulate touch, such as the texture of a fabric, while the buyer runs a finger over the image of the item on a device's screen. By taking advantage of the phone's vibration functions, each object will have its own series of vibration patterns representing the experience of touching it: short and rapid pulses, or longer and more intense series of vibrations. The vibration pattern will differentiate silk from linen or cotton, helping to simulate the physical sensation of actually touching the fabric. Current uses of haptic and graphic technology in gaming transport the end user into a simulated environment. The opportunity and challenge here are to make the technology so present and intertwined with everyday experience that it provides greater context for our lives, placing technology in front of and around us. It will become omnipresent in our daily lives, turning mobile phones into tools for natural and intuitive interaction with the world around us.
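The idea of encoding textures as vibration patterns can be illustrated with a minimal sketch in Python. The fabric names come from the article, but the pattern table and timing values are purely hypothetical, for illustration only; this is not IBM's implementation.

```python
# Hypothetical sketch: mapping fabric textures to vibration patterns.
# A pattern is a sequence of (pulse_ms, pause_ms) pairs; smoother fabrics
# get short, rapid pulses, rougher ones longer, more intense bursts.
# All timing values here are invented for illustration.

VIBRATION_PATTERNS = {
    "silk":   [(20, 10)] * 8,   # short and rapid pulses
    "linen":  [(60, 40)] * 4,   # longer, spaced-out bursts
    "cotton": [(40, 20)] * 6,
}

def pattern_for(fabric: str) -> list[tuple[int, int]]:
    """Return the vibration pattern simulating a fabric's feel."""
    return VIBRATION_PATTERNS.get(fabric, [(30, 30)] * 5)  # generic fallback

def total_duration_ms(pattern: list[tuple[int, int]]) -> int:
    """Total playback time of a pattern, in milliseconds."""
    return sum(pulse + pause for pulse, pause in pattern)
```

A real phone would hand such a pattern to its vibration API; the point is only that each material maps to a distinct, reproducible tactile signature.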

Sight: a pixel will be worth a thousand words
We take 500 billion photos a year, and 72 hours of video are uploaded to YouTube every minute. The global diagnostic imaging market is predicted to grow to $26.6 billion by 2016. Computers today understand images only through the text we use to tag or title them; most of the information, the actual content of the image, remains a mystery. In the next five years, systems will not only look at and recognize the content of images and visual data, but will turn pixels into meaning, beginning to understand an image just as a human being sees and interprets a photograph. In the future, brain-like capabilities will let computers analyze features such as color, texture patterns or edge information and extract knowledge from visual material. This will have a profound impact on sectors such as healthcare, retail and agriculture. Within five years, these capabilities will be put into practice in healthcare, making sense of huge volumes of medical data, such as MRI, CT, X-ray and ultrasound images, to extract information tied to a particular anatomical area or pathology. The critical elements in these images can be subtle or invisible to the human eye and require careful measurement. Trained to discriminate what to look for in images, for example by distinguishing healthy from diseased tissue, and to correlate it with the patient's medical record and the scientific literature, systems that can "see" will help doctors detect problems with greater speed and accuracy.
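A toy Python sketch can show what "analyzing features such as color" means in practice: extract a simple feature from raw pixels and match it against labeled references. The labels and pixel values are invented for illustration; real medical image analysis uses far richer features and trained models.

```python
# Hypothetical sketch: turning pixels into a simple color feature and
# classifying by nearest labeled reference. Labels and values are invented.
from math import dist

def color_feature(pixels):
    """Average (r, g, b) of an image given as a list of (r, g, b) tuples."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) / n for i in range(3))

# Reference features "learned" from tiny labeled example images.
REFERENCES = {
    "healthy tissue": color_feature([(200, 120, 120), (210, 130, 125)]),
    "lesion":         color_feature([(90, 40, 40), (80, 35, 45)]),
}

def classify(pixels):
    """Label new pixels by the closest reference feature."""
    feat = color_feature(pixels)
    return min(REFERENCES, key=lambda label: dist(feat, REFERENCES[label]))
```

The sketch captures only the skeleton of the idea: features are computed from the image itself, not from its text tags, and new images are interpreted by comparison with what the system has already seen.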

Hearing: computers will hear what matters
Have you ever wished you could interpret all the sounds around you and understand what is not being said? Within five years, a distributed system of intelligent sensors will detect elements of sound such as sound pressure, vibrations and sound waves at different frequencies. Interpreting these inputs, it will predict when trees will fall in a forest or when a landslide is imminent. Such a system will "listen" to the environment and measure movements, or the stress in a material, to warn us of imminent danger. Raw sounds will be detected by sensors, a bit as in the human brain. A system receiving this data will take into account other "modes", such as visual or tactile information, and will classify and interpret sounds based on what it has learned. When new sounds are detected, the system will draw conclusions based on previous knowledge and its ability to recognize patterns. For example, the babble of infants will be understood as a language, telling parents and doctors what children are trying to communicate. Sounds can be a cue for interpreting a newborn's behavior or needs. Once a sophisticated speech recognition system has been taught what the sounds made by a newborn mean, whether the cries indicate that a child is hungry, hot, tired or in pain, that system will correlate the sounds and babbling with other sensory or physiological information, such as heart rate, pulse and temperature. Over the next five years, by learning about emotion and perceiving moods, systems will highlight aspects of a conversation and analyze pitch, tone and hesitation to help us have more productive dialogues, improving call-center interactions with customers or allowing us to interact better across different cultures.
Today IBM scientists are starting to record underwater noise levels in Galway Bay, Ireland, to understand the sounds and vibrations of wave-energy conversion machines and their impact on marine life, using underwater sensors that capture sound waves and transmit them to a receiving system for analysis.
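The "warn us of imminent danger" idea can be sketched in a few lines of Python: a monitor that watches a stream of vibration readings and raises an alert when levels climb steadily past a threshold. The thresholds, units and rules are invented for illustration and have nothing to do with IBM's actual sensor systems.

```python
# Hypothetical sketch: a sensor-feed monitor that flags imminent danger
# (e.g. a landslide) when vibration levels rise steadily past a threshold.
# Baseline and threshold values are invented for illustration.

def rising_trend(readings, window=3):
    """True if the last `window` readings are strictly increasing."""
    recent = readings[-window:]
    return all(b > a for a, b in zip(recent, recent[1:]))

def assess(readings, baseline=1.0, threshold=3.0):
    """Classify the current state of the monitored material."""
    level = readings[-1]
    if level > threshold and rising_trend(readings):
        return "alert: imminent danger"
    if level > baseline:
        return "watch"
    return "normal"
```

A real system would of course fuse many sensors and learned sound signatures rather than a single threshold, but the structure, continuous listening plus pattern-based interpretation, is the same.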

Taste: digital taste buds will help people eat smarter
What if we could make healthy foods delicious using a different kind of computing system, one built for creativity? Researchers are developing a computing system that actually perceives flavor, to be used by chefs to create the tastiest and most innovative recipes. It will break ingredients down to the molecular level and combine the chemistry of food compounds with the psychology behind the flavors and smells humans prefer. By comparing these data with millions of recipes, the system will be able to create new flavor combinations that pair, for example, roasted chestnuts with other foods such as cooked beetroot, fresh caviar and raw ham. Such a system could also help us eat healthier, creating new flavor combinations that make us prefer boiled vegetables to chips. The computer will use algorithms to determine the precise chemical structure of food and why people prefer certain flavors. These algorithms will examine how chemicals interact with each other, the molecular complexity of aromatic compounds and their bonding structure, and will use this information, together with models of perception, to predict the level of taste satisfaction. Not only will it make healthy foods more palatable, it will also surprise us with unusual food combinations designed to maximize our experience of taste and flavor. For people with special dietary needs, such as diabetics, the system could develop flavors and recipes that keep blood sugar under control while satisfying the palate.
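One simple heuristic behind computational flavor design is pairing ingredients that share aromatic compounds. The sketch below uses the ingredients named in the article, but the compound lists are invented for illustration; a real system would draw on actual flavor chemistry databases.

```python
# Hypothetical sketch: scoring ingredient pairings by shared aromatic
# compounds. The compound sets below are invented for illustration.

COMPOUNDS = {
    "roasted chestnut": {"furfural", "pyrazine", "vanillin"},
    "cooked beetroot":  {"geosmin", "pyrazine"},
    "caviar":           {"trimethylamine", "hexanal"},
    "raw ham":          {"hexanal", "furfural"},
}

def pairing_score(a: str, b: str) -> int:
    """Number of aromatic compounds two ingredients share."""
    return len(COMPOUNDS[a] & COMPOUNDS[b])

def best_match(ingredient: str) -> str:
    """The other ingredient sharing the most compounds."""
    others = [k for k in COMPOUNDS if k != ingredient]
    return max(others, key=lambda o: pairing_score(ingredient, o))
```

The full approach described in the article would weight such chemistry scores with models of human taste perception, rather than counting shared compounds alone.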

Smell: computers will have a sense of smell
Over the next five years, tiny sensors built into your computer or mobile phone will detect whether you are coming down with a cold or another illness. By analyzing odors, biomarkers and thousands of molecules in a person's breath, doctors will be helped to diagnose and monitor the onset of ailments such as kidney and liver disease, asthma, diabetes and epilepsy, by detecting which odors are normal and which are not. Scientists are already measuring environmental conditions and gases to preserve works of art. This innovation is beginning to be applied to the problem of clinical hygiene, one of the biggest challenges in today's healthcare sector. For example, antibiotic-resistant bacteria such as methicillin-resistant Staphylococcus aureus (MRSA), which in 2005 was responsible for nearly 19,000 hospital-related deaths in the United States, are usually found on the skin and can easily be transmitted wherever people are in close contact. One way to combat exposure to MRSA in healthcare facilities is to ensure that medical staff follow clinical hygiene guidelines. Over the next five years, IBM plans to exploit a technology that will "sniff" surfaces to detect disinfectants and determine whether rooms have been sanitized. Using new wireless mesh networks, data on various chemicals will be collected and measured by sensors that constantly learn and adapt to new smells over time. Thanks to advances in sensor and communications technologies, combined with deep learning systems, sensors will be able to measure data in previously unimaginable places. For example, computer systems could be used in agriculture to "smell" or analyze the soil conditions of crops. In urban environments, this technology will be used to monitor problems with waste, sanitation and pollution, helping municipal services identify potential problems before they get out of hand.
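The breath-analysis idea reduces, at its simplest, to comparing measured molecule levels against learned normal ranges. The biomarker names below are real compounds found in breath, but the ranges and values are invented for illustration and are not clinical figures.

```python
# Hypothetical sketch: flagging out-of-range biomarker levels in a breath
# sample, as an "electronic nose" might. Ranges are invented, not clinical.

NORMAL_RANGES = {            # illustrative ranges in arbitrary units
    "acetone":  (0.0, 1.0),
    "ammonia":  (0.0, 0.5),
    "isoprene": (0.0, 0.8),
}

def screen(sample: dict[str, float]) -> list[str]:
    """Return the biomarkers whose measured level falls outside range."""
    return sorted(
        name for name, level in sample.items()
        if not (NORMAL_RANGES[name][0] <= level <= NORMAL_RANGES[name][1])
    )
```

A deployed system would learn these ranges per person and over time, as the article describes, rather than using fixed tables.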

Source: IBM

(By Mauro Notarianni)