April 27, 2018

Newsletter - April 2018 - Artificial intelligence augmented by human intelligence

What is Artificial Intelligence?

Artificial Intelligence (AI) is the implementation of a set of techniques that allow machines to imitate a form of human intelligence. The notion appeared in the 1950s thanks to the mathematician Alan Turing, who, in his paper "Computing Machinery and Intelligence", raised the question of giving machines a form of intelligence. He described a test, now known as the "Turing Test", in which a subject converses blindly with another human, then with a machine programmed to formulate sensible answers. If the subject cannot tell the difference, the machine has passed the test and, according to the author, can genuinely be considered "intelligent". (Source 1)

Artificial Intelligence has existed for over 60 years. John McCarthy, one of the pioneers of AI, created an evaluation algorithm that plays a major role in AI programming and is found in particular in chess programs (computer programs that can play chess). He is also one of the pioneers of cloud computing, a set of computing services (servers, storage, databases, network components, software, analytics tools, etc.) delivered over a remote network such as the Internet.
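The game-tree search behind those chess programs can be sketched in a few lines. This is a minimal, illustrative minimax search over a toy tree of pre-evaluated positions, not McCarthy's actual code or a real chess engine:

```python
# A minimal sketch of minimax search over a game tree, the kind of
# algorithm early chess programs combined with a position-evaluation
# function. The tree below is a toy example, not a chess position.

def minimax(node, maximizing):
    """Return the best score reachable from `node`.

    Leaves are plain numbers (the evaluation function's score for
    that position); inner nodes are lists of child nodes.
    """
    if isinstance(node, (int, float)):   # leaf: an evaluated position
        return node
    scores = [minimax(child, not maximizing) for child in node]
    return max(scores) if maximizing else min(scores)

# Toy tree: two moves for us, each answered by two opponent replies.
tree = [[3, 5], [2, 9]]
print(minimax(tree, maximizing=True))  # 3: the best score we can guarantee
```

The opponent is assumed to answer with the reply that is worst for us, which is why the maximizer can only guarantee 3 here, not 9.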

The digital boom and the digital revolution have allowed Artificial Intelligence to become more and more present across all industries. It is now applied in many areas: health, with medical treatments adapted to each patient's genetic code; the automobile industry, with autonomous cars equipped with assisted-navigation systems and voice and visual recognition software (notably for parking); and many others.

The digital transformation at the heart of every company's development strategy today is pushing them to look closely at Artificial Intelligence, which is tightly linked to Big Data, another phenomenon that has exploded since 2012. Companies are trying, for example, to apply AI to specific areas: artificial neural networks running on server farms to handle heavy computations over gigantic databases, chatbots for customer service, cryptocurrencies for FinTech companies, and so on.

Fascinating technological innovations are now possible thanks to the combined emergence of computing power, Big Data and Artificial Intelligence.

• Computing power

According to Gordon Moore (1965), co-founder of Intel, the processing power of computers would double every two years. His prediction proved accurate, and today's computers have highly capable systems. Nvidia and Alphabet, for example, use AI to build detailed real-time maps that their test vehicles exploit to visualize the world.
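Moore's observation is simple compound growth, which a one-line formula makes concrete:

```python
# Moore's law as compound growth: capacity multiplies by
# 2 ** (years / 2) when it doubles every two years.

def moore_factor(years: float) -> float:
    """Growth factor after `years`, assuming a two-year doubling period."""
    return 2 ** (years / 2)

print(moore_factor(2))    # 2.0  -> doubled after two years
print(moore_factor(10))   # 32.0 -> five doublings in a decade
```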

Google, IBM Cloud and Microsoft have launched offerings (Tensor Processing Units, quantum systems, quantum computers) that aim to speed up processors so they can better handle data flows and the training computations of neural networks.

• Digital data boom

Every day, enormous quantities of data are created and exchanged around the world, and the volume is growing exponentially. According to an IBM study, 90% of the data in the world today was created in the last two years. By 2025, global data will reach 163 zettabytes according to International Data Corporation (IDC), an international firm specializing in market intelligence.
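The IBM figure implies a striking growth rate, which is worth spelling out. If 90% of today's data is new, the older stock is only 10% of the total, so the total grew roughly tenfold over those two years:

```python
# Growth factor implied by the share of data created recently:
# if the old stock is 10% of the total T, the total grew by T / 0.1T = 10x.

def growth_factor(share_created_recently: float) -> float:
    """Overall growth factor implied by the share of data that is new."""
    return 1 / (1 - share_created_recently)

print(growth_factor(0.90))  # ~10x growth in two years
```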

Artificial Intelligence: better algorithms

Machine learning is an Artificial Intelligence technology that allows computers to learn without having been explicitly programmed to do so. Thanks to machine learning, computers can learn to solve problems by themselves.

"A machine learning algorithm receives a set of learning data and then uses that data to answer a question. For example, a learning set might contain several photographs, some labelled as showing a cat and others labelled as not showing a cat. Looking at a new set of photos, the computer will be able to recognize which ones show a cat, and at the same time it will add these new photos to its learning set. In this way the program becomes more intelligent and more able to perform the same task."
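The cat/not-cat loop above can be sketched with a 1-nearest-neighbour classifier. The 2-D feature vectors here are hand-made toy numbers standing in for real image features; the point is only the mechanism, labelled examples plus a growing learning set:

```python
# Minimal sketch of learning from labelled examples: classify a new
# point by the label of its nearest neighbour, then add the new
# point back into the learning set, as the quoted passage describes.

import math

def nearest_label(training, point):
    """Label of the training example closest to `point`."""
    return min(training, key=lambda ex: math.dist(ex[0], point))[1]

# (features, label) pairs: pretend [ear_pointiness, whisker_score]
training = [((0.9, 0.8), "cat"), ((0.1, 0.2), "not cat")]

new_photo = (0.85, 0.9)
label = nearest_label(training, new_photo)
training.append((new_photo, label))      # the learning set grows

print(label)          # cat
print(len(training))  # 3
```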

Today, machine learning is used in many sectors: health, with the assessment of disease risks such as cancer or diabetes; high tech, with smart applications such as Siri or Google Now, voice assistants able to understand natural human language; and other industries such as automotive, with self-driving cars. In theory, computers and machines will keep improving and gaining autonomy, to the point of "competing" with human intelligence.

"Without Big Data, machine learning and Artificial Intelligence would be nothing. Data is the instrument that allows AI to understand and learn how humans think. It is Big Data that accelerates the learning curve and makes the automation of data analysis possible. The more data a machine learning system receives, the more it learns and the more precise it becomes."

The Deep Learning revolution: learning by example
"Machine learning is a subfield of Artificial Intelligence, and deep learning is itself a subcategory of machine learning. The most common application example is visual recognition: an algorithm is programmed to detect certain faces in images coming from a camera. Depending on the database it is given, it will be able to spot a wanted individual in a crowd, measure the satisfaction rate at the exit of a store by detecting smiles, and so on. A set of algorithms can also recognize the voice, the tone, and the expression of a question, an affirmation, and individual words.

To do this, deep learning relies mainly on the reproduction of a neural network inspired by the brain systems found in nature. Depending on the target application, developers decide what type of learning to set up: supervised; unsupervised, in which the machine feeds on unlabelled data; semi-supervised; reinforcement learning (driven by an observed reward); or transfer learning, in which the algorithms apply a learned solution to a situation never seen before.
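The supervised case can be illustrated with the smallest possible "neural network": a single artificial neuron (a perceptron), the unit that deep networks stack into layers. This is a pedagogical sketch, not a deep network; here it learns the logical AND function from labelled examples:

```python
# A single artificial neuron trained by supervised learning.
# It learns logical AND: output 1 only when both inputs are 1.

def step(x):
    """Threshold activation: fire (1) if the weighted sum is positive."""
    return 1 if x > 0 else 0

# labelled training data: (inputs, expected output)
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]   # weights
b = 0.0          # bias
lr = 0.1         # learning rate

for _ in range(20):                       # training epochs
    for (x1, x2), target in data:
        out = step(w[0] * x1 + w[1] * x2 + b)
        err = target - out                # supervised error signal
        w[0] += lr * err * x1             # perceptron update rule
        w[1] += lr * err * x2
        b += lr * err

print([step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in data])  # [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron converges after a few epochs; deep learning stacks many such neurons and replaces the hand-written update rule with backpropagation.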

On the other hand, this technique needs a great deal of data to train on and to reach success rates high enough for real use. A data lake is essential for training deep learning algorithms, and deep learning also requires much higher computing power to do its job.

"In the last 10 years, deep learning has dramatically changed the landscape of computer science," recalls Emmanuel Mogenet. Previously, that is in the 1980s and 1990s, teaching something to a computer meant programming it. But a program can be complicated to write, give rise to bugs... and above all, it must contain explicit instructions, which are not always easy to provide for certain complex subjects. "When I walk, I am unable to accurately describe the movement of each of my muscles," the researcher notes. "The real revolution of deep learning is learning by example," he explains. Enough to allow machines to reproduce cognitive behaviors that human beings themselves do not know how to explain! "For example, by showing the algorithm images with a cat, then others without, and pointing out the correct expected answer each time, the computer learns to recognize cats from an image library. After a certain number of images, the magic works: the system begins to generalize, and can recognize the animal in photos it has never seen."

A search engine that really understands queries

Semantics, or understanding the meaning of words, is essential to Google in perfecting its search engine.
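One common way to give software a handle on word meaning is to represent words as vectors and compare them with cosine similarity. The 3-D vectors below are hand-made for illustration and this is not Google's actual system; real engines learn embeddings with hundreds of dimensions from large text corpora:

```python
# Toy illustration of word semantics as vectors: words with related
# meanings get nearby vectors, and cosine similarity measures that
# closeness (1 = same direction, 0 = unrelated).

import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

vectors = {
    "car":  (0.9, 0.1, 0.0),
    "auto": (0.8, 0.2, 0.1),
    "cat":  (0.0, 0.1, 0.9),
}

print(cosine(vectors["car"], vectors["auto"]))  # close to 1: similar meaning
print(cosine(vectors["car"], vectors["cat"]))   # close to 0: unrelated
```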

How does Artificial Intelligence need to feed on human intelligence?

In a world where Artificial Intelligence develops from year to year, human intelligence must continually evolve. It will always play an important role in the development and optimization of Artificial Intelligence.

Human intelligence can remain competitive by:

  • Finding ways to use Artificial Intelligence in the various industries
  • Working in collaboration with Artificial Intelligence to gain efficiency in the world of AI
  • Developing its technical skills alongside the development of new technologies

Machines can perform repetitive tasks such as planning and assembling parts in a factory, but abstract, creative thinking is a human task essential to the success of a company. Know your human strengths, and continue to learn and acquire new skills.
Humans are critical thinkers who can innovate, create, solve problems and build relationships with other people. People who make this important distinction will adapt to an increasingly AI-powered world.

Isahit, a smart platform for socially responsible digital tasks, or the complementarity of artificial intelligence and human intelligence?

Isahit is a socially responsible startup that links Artificial Intelligence and human intelligence. This French tech-for-good company offers businesses a programming interface and an impact-sourcing digital platform for the processing of digital tasks that Artificial Intelligence cannot handle.

The community of HITers at isahit (HIT = Human Intelligence Task) is mainly made up of women based in Africa who are looking for additional income to finance their studies or an entrepreneurial project. Isahit is part of a co-development approach: it participates in the deployment of French companies' CSR strategies, gives new opportunities to people in emerging countries, and demonstrates the impact of the complementarity between Artificial Intelligence and human intelligence.

Sources, for more information:
Big Data: Big Data, Machine Learning
Artificial Intelligence: TechTarget, Forbes, Cloud Factory, Science and the Future
