
editing human faces • ENTER.CO

In case you don’t know, OpenAI is the company behind DALL-E, the Artificial Intelligence (AI) tool that creates images from textual descriptions. Now, the company has announced to its users in a letter that the “forbidden” feature has finally been integrated into the AI. We are talking about editing human faces.

Until now, the company had argued that editing of human faces and bodies would not be implemented for fear of misuse. In the letter, however, OpenAI reports that access to the new feature is being enabled after a number of improvements to its filters. With these improvements, the company says the AI is now able to filter out images with “sexual, political and violent” content.

The new feature will let users edit their photos with options such as generating a new variation of an image, editing specific features, or changing someone’s hairstyle or clothing. The company expects the new DALL-E feature to be especially useful for users in the creative industry: photographers, graphic designers, filmmakers, and others.


“With enhancements to our security system, DALL E is now ready to support these lovely and important use cases, while minimizing the potential harm of deepfakes,” OpenAI told its customers in the letter. This is clearly part of the company’s effort to expand its art-generating tool while limiting the damage it could cause through image impersonation.

DALL-E has been characterized as a tool that takes care of its users, which is why it maintains various restrictions. In contrast, rivals such as Stable Diffusion impose fewer restrictions on users. This translates into faster AI development, but it also makes malicious uses easier. In fact, Stable Diffusion has been used to generate pornographic celebrity deepfakes.

Image: OpenAI
