
Adobe’s New AI Tools are Designed for Easier Audio Production

As AI continues to anchor itself within the consumer tech landscape, we’ve seen developers build some impressive (and genuinely helpful) applications with the AI tools at their disposal. In that vein, Adobe Research recently announced that it is working on an AI-based tool for music generation and editing.

Known as “Project Music GenAI Control,” the software lets creators generate music from text prompts and then gives them fine-grained control to edit that audio to their precise needs. It’s an interesting application of AI, given that most AI tools so far revolve around text and image generation.


Commenting on the AI tool, Nicholas Bryan, a co-creator of the project and Senior Research Scientist at Adobe Research, states:

“With Project Music GenAI Control, generative AI becomes your co-creator. It helps people craft music for their projects, whether they’re broadcasters, or podcasters, or anyone else who needs audio that’s just the right mood, tone, and length.”

This isn’t the first time Adobe has worked with AI – the company’s “Firefly” image generation model, for example, has become one of the go-to tools for AI image generation, at least according to the company.

Project Music GenAI Control works by responding to text prompts fed into the generative AI model – for example, a user might input a prompt such as “powerful rock,” “happy dance,” or “sad jazz” to generate music. Once the music is generated, editing is integrated directly into the workflow.

The tool also comes with a simple user interface. Users can transform their generated audio based on a reference melody; adjust the tempo, structure, and repeating patterns of a piece of music; fine-tune audio intensity; extend the length of a clip; re-mix a section; and so on – it seems that Adobe has designed the tool to be as in-depth as possible.

Adobe says that Project Music GenAI Control is being developed in collaboration with a sizeable team at the University of California, San Diego, including Zachary Novack, Julian McAuley, and Taylor Berg-Kirkpatrick, and colleagues at the School of Computer Science, Carnegie Mellon University, including Shih-Lun Wu, Chris Donahue, and Shinji Watanabe.

Source: Adobe
