Meta Platforms: Artificial Intelligence (AI) News

== AI Developments ==
August 3, 2023: Meta launches AudioCraft, an open-source AI tool that enables users to generate audio and music from text prompts. AudioCraft comprises three models: MusicGen, AudioGen and EnCodec. Meta said the three models will be available to researchers and practitioners, who will be able to train them on their own datasets. MusicGen was trained on Meta-owned and licensed music, while AudioGen was trained on public sound effects. EnCodec enables high-quality music generation with fewer artifacts. Meta also released pre-trained AudioGen models, which let users generate environmental sounds and sound effects such as a dog barking or cars honking. "While we’ve seen a lot of excitement around generative AI for images, video, and text, audio has seemed to lag a bit behind," Meta said in a blog post.<ref>https://about.fb.com/news/2023/08/audiocraft-generative-ai-for-music-and-audio/</ref>
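
AudioCraft is distributed as a Python library with pre-trained checkpoints. The snippet below is a minimal sketch of text-to-music generation following the MusicGen interface documented in the public facebook/audiocraft repository; the checkpoint name, prompt text, and output filename are illustrative, not taken from Meta's announcement.

<syntaxhighlight lang="python">
# Minimal sketch: generate music from a text prompt with AudioCraft's MusicGen.
# Requires: pip install audiocraft
from audiocraft.models import MusicGen
from audiocraft.data.audio import audio_write

# Load a pre-trained MusicGen checkpoint released with AudioCraft.
model = MusicGen.get_pretrained('facebook/musicgen-small')
model.set_generation_params(duration=8)  # generate 8 seconds of audio

# One waveform is generated per text prompt in the batch.
descriptions = ['upbeat acoustic folk with hand claps']
wav = model.generate(descriptions)  # tensor of shape [batch, channels, samples]

for idx, one_wav in enumerate(wav):
    # Write a .wav file with loudness normalization, as in the repository examples.
    audio_write(f'sample_{idx}', one_wav.cpu(), model.sample_rate, strategy='loudness')
</syntaxhighlight>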