Earlier we talked about the NEON project, which Samsung is preparing to present tomorrow at CES 2020. However, the surprise was spoiled before the exhibition even began: the head of the project, Pranav Mistry, published AI-generated images of a person on Twitter.
Samsung Neon – an artificial man of the future
It looks like this:
Flying to CES tomorrow, and the code is finally working? Ready to demo CORE R3. It can now autonomously create new expressions, new movements, new dialog (even in Hindi), completely different from the original captured data. pic.twitter.com/EPAJJrLyjd
– Pranav Mistry (@pranavmistry) January 5, 2020
Later, one of the users on Reddit posted videos from the NEON website showing the work of avatars:
How were they generated? Suggest your ideas in our Telegram chat. NEON avatars are copies of real people who can easily be found on the web. At the same time, Pranav stated on Twitter that the copies are capable of demonstrating gestures and facial expressions that the real people have never shown.
How were the avatars created?
Most likely, the company used high-quality cameras to film many tens of hours of conversations, facial expressions, and hand movements of real people in a special studio. All of that footage was then run through the Core R3 engine. It probably includes a complex system of several neural networks trained on the original videos, which is now able to generate the characters' various emotions from text input.
Most likely, each neural network is responsible for a separate task: one may handle facial expressions, another the movement of body parts. The most interesting part is the second capability of Core R3. How does a neural network manage to generate hand movements? It can recognize different parts of the body, build the character's skeleton, and generate movements based on it.
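To make the speculation above concrete, here is a minimal sketch of how such a split pipeline could look: a text input is routed to two independent generators, one for facial expression and one for skeleton keypoints, and their outputs are combined into a single animation frame. Every class and method name here is invented for illustration; this is not Samsung's actual Core R3 implementation, and the "networks" are stand-in rules rather than trained models.

```python
# Hypothetical sketch of the speculated pipeline: separate modules for
# face and body, both driven by the same text input. Names are invented.
from dataclasses import dataclass


@dataclass
class Frame:
    expression: str  # e.g. a facial blendshape label
    pose: list       # e.g. 2D skeleton keypoints as (x, y) tuples


class ExpressionNet:
    """Stand-in for a network mapping text to a facial expression."""

    def generate(self, text: str) -> str:
        return "smile" if "!" in text else "neutral"


class MotionNet:
    """Stand-in for a network producing skeleton keypoints for a gesture."""

    KEYPOINTS = ["head", "shoulder_l", "shoulder_r", "hand_l", "hand_r"]

    def generate(self, text: str) -> list:
        # A raised right hand for greetings, a rest pose otherwise.
        wave = text.lower().startswith("hello")
        return [(0.5, 0.9), (0.4, 0.7), (0.6, 0.7),
                (0.3, 0.5), (0.7, 0.9 if wave else 0.5)]


def render_frame(text: str) -> Frame:
    """Combine the two generators' outputs into one frame, as the
    article speculates Core R3 might do with its separate networks."""
    return Frame(ExpressionNet().generate(text),
                 MotionNet().generate(text))


frame = render_frame("Hello there!")
print(frame.expression)  # smiling face for an exclamation
print(frame.pose[4])     # right hand raised for the greeting
```

The point of the split is that each module can be trained and improved independently, which would explain how the system could combine gestures and expressions that never co-occurred in the original footage.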
Many companies are already working on similar projects; one of them is Magic Leap's Mica. In that case, however, a full 3D model is used, which is easy to distinguish from a real person.
In the case of NEON, one gets the impression that this is a real person filmed on video. And this effect is, of course, amazing.
Pranav Mistry, who heads STAR Labs, the Samsung-owned division developing the avatars, said in a recent interview that an artificial human could expand its role and become part of our lives. An artificial human could serve as a virtual host, a virtual movie star, or a pop artist. And it is hard to disagree with him.
As an example, the film 'Blade Runner' was cited: the main character owned an artificial girl whose emotions were indistinguishable from real ones, and she could love. If we stop regarding people as beings endowed with a soul and the ability to love (the reason many so often place humans above all other animals, which, of course, is absurd), we will understand that even robots will one day be able to love and empathize, because it is not an invisible soul that is responsible for all this but the neural network in the human brain. The behavior of a living person is the output of a trained neural network. By the way, genetics and evolution allowed the brain's neural network to arrive partially trained already at birth. We suggest discussing this in the Hi-News chat.
People think that their feelings and emotions make them individuals and distinguish them from animals, but this is not so. In 2017, scientists managed to digitize a worm's brain and load it into a robot. In theory, the same could be done with a human brain; we are stopped only by the insufficient capacity of modern technology.
When that time comes, people will be able to transfer their minds into digital space and gain immortality. Those times are not as distant as many might think.
What do our readers think about this? Did you like Samsung's avatars, and how do you see our shared future? Share your opinion in the comments below.