OpenAI, the Microsoft-backed AI company, has announced Media Manager, a tool intended to give artists more control over how their data is used in AI development. However, questions remain about how the tool will work and to what extent it will actually benefit artists.
The new development appears to be a response to lawsuits filed by artists, authors, and publishers alleging that OpenAI improperly used their works to train its models.
With the advent of AI, artists, designers, and writers have split into different camps. Some believe AI is killing originality, while others see it as a new spark for the imagination. One thing is clear, however: AI cannot be trained without the work of artists themselves. So how will the world's creative works be accessed to train AI?
The OpenAI Media Manager: An initiative to protect artists' works?
Scheduled for release in 2025, the OpenAI Media Manager will let content creators decline the use of their work in AI development. OpenAI says the tool will allow creators to identify the works they own and freely choose how those works may be used in machine-learning research and training. However, many details about the tool have not been disclosed, and its effectiveness will largely depend on them.
Early reactions have been far more critical. Some argue that the burden should not fall on artists to opt out at all: if a company wants its AI to learn from a work, it should obtain the artist's permission first, since the artist is the one who created the work. This line of thinking could lead to further lawsuits between artists and AI companies in the future.
Setting industry standards: A step forward or backward?
Alongside artists, there are also other companies to consider. OpenAI aims to set industry standards with its Media Manager tool. However, it is not clear whether other companies will be able to use the tool, or how requests concerning models that have already been trained and deployed will be handled. These uncertainties raise questions about whether the tool can establish an industry-wide standard for how data is used in training.
Other technology companies also offer tools that let artists and content creators control how their data is used in AI projects. However, the proliferation of such tools can increase complexity and create additional burdens for artists. Jordan Meyer of the startup Spawning emphasizes the importance of harmonizing the various opt-out systems and says he is open to collaborating with OpenAI to simplify the process.
Long-term impact: what artists expect from OpenAI Media Manager
Steps that give artists and creators control over their data are critically important as AI applications and algorithms develop. However, the practical effectiveness of these tools and their impact on the industry will only become apparent over time. Whether these initiatives succeed in protecting artists' rights and establishing ethical standards for data use will depend on the details of implementation and on cooperation across the industry.
AI regulation will play an increasingly important role in this matter. The EU has already adopted an AI law, the AI Act, and AI companies are gradually beginning to operate in accordance with it. It is unclear whether steps like this would have been taken without such legislation, but OpenAI's Media Manager appears to be, at least in part, a product of this regulatory pressure. Whether it ends up creating more difficulties than conveniences also remains to be seen.
Featured image credit: Basil James / Unsplash