OpenAI introduces voice cloning tool

(San Francisco) OpenAI, the generative artificial intelligence (AI) giant and publisher of ChatGPT, presented on Friday a voice cloning tool whose use will be restricted to prevent fraud and crimes such as identity theft.

Called “Voice Engine,” this AI model can reproduce a person’s voice from a 15-second audio sample, according to a press release from OpenAI on the results of a small-scale test.

“We recognize that the ability to generate human-like voices carries serious risks, which are particularly significant in this election year,” the San Francisco-based company said.

“We are working with U.S. and international partners from government, media, entertainment, education, civil society and other sectors, and are taking their feedback into account as we develop the tool.”

In a year of crucial elections around the world, disinformation researchers fear misuse of generative AI applications (automated production of text, images, etc.), and in particular of voice cloning tools, which are cheap, easy to use and difficult to trace.

OpenAI said it had adopted “a cautious and informed approach” before any wider release of the new tool, “due to the potential for misuse of synthetic voices.”

This cautious rollout comes after a major political incident in which a consultant working for the presidential campaign of a Democratic rival of Joe Biden developed an automated program that impersonated the American president, who is campaigning for re-election.

The voice, imitating Joe Biden’s, urged voters to abstain in the New Hampshire primary.

The United States has since banned calls that use AI-generated cloned voices in order to combat political or commercial scams.

OpenAI said that the partners testing “Voice Engine” have agreed to rules requiring, among other things, the explicit and informed consent of anyone whose voice is duplicated, as well as transparency for listeners: it must be made clear to them that the voices they hear are generated by AI.

“We have implemented a range of security measures, including a watermark to trace the origin of any audio generated by Voice Engine, as well as proactive monitoring of its use,” OpenAI said.

Last October, the White House unveiled rules and principles to govern the development of AI, including the principle of transparency.

Joe Biden expressed concern at the idea of criminals using the technology to trap people by posing as members of their family.


