You may be familiar with thispersondoesnotexist.com, a site that generates fake portraits using NVIDIA artificial intelligence. Google is working on a similar but much more advanced concept called Imagen. Its operation is simple: you enter a description of a few words, and the AI takes care of concocting an image for you. The company released a few examples in a blog post, and the results are stunning. Judge for yourself:
Google is not the first in this segment: there is already DALL-E, developed by OpenAI, whose second version was released last month. According to the Mountain View firm, its own tool is more powerful. The company compared its results with those of DALL-E, and its study shows that human evaluators clearly preferred Imagen to other methods.
While these results are impressive, they should be taken with a grain of salt: the teams undoubtedly cherry-picked the best outputs and omitted the blurry or wide-of-the-mark images. DALL-E, for example, struggles with negations ("a bowl of fruit without apples"), with faces, and with text. Google offers a small demo on its site, which lets you play with a limited version of the AI using only a handful of selectable words.
It must be said that the sometimes striking results produced by this AI do not encourage releasing its code as open source at a time when fake news is spreading with a vengeance. "Potential risks of misuse raise concerns about responsibly open-sourcing code and demos," Google's teams specify.
In addition, the researchers explain that they fed their algorithm a large amount of unsorted data from the web. In other words, it ingested just about anything, from pornography to hateful content. "These datasets tend to manifest social stereotypes, oppressive viewpoints, and disparaging, or otherwise harmful, associations to marginalized identity groups," the text specifies.
Moreover, the AI reportedly has a general bias toward generating images of people with lighter skin tones, as well as a "tendency to align images portraying different professions with established gender stereotypes". Imagen's competitors raise the same concerns: DALL-E tends to depict flight attendants as women and CEOs as men.
Google also points out that its AI has serious limitations when generating images of people. All of this leads the company to conclude that its product "is not suitable for public use without additional safeguards in place".