
Apple is giving us a sneak peek into its AI-powered future

If you’re curious about what Apple’s implementation of AI could look like, you can actually find out right now. Apple released a new open-source AI model called “MGIE,” which it developed in collaboration with the University of California, Santa Barbara.

MGIE stands for MLLM-Guided Image Editing. It uses multimodal large language models (MLLMs) to interpret natural-language commands and carry out the corresponding image edits. In other words, you tell the AI how you want an image to be changed, and it makes those edits according to your instructions.

AI-powered image editing isn’t new. In fact, it’s one of the new AI features in the Samsung Galaxy S24. But the problem with current AI editing is that it makes the decisions for you, trying to guess what a “good” edit looks like. With this model, users have more control over the editing process.

For example, say you want the sky in a photo to look bluer: MGIE will edit just that part of the image. In some ways this is more user-friendly, since many people don’t know how to make such an edit themselves. The approach is similar to how a client might brief a designer, giving instructions to reach a desired look.
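To get a feel for what this instruction-driven workflow looks like in practice, here is a minimal sketch using InstructPix2Pix, an openly available instruction-based image editor accessible through Hugging Face’s diffusers library. To be clear, this is a related model, not Apple’s MGIE, and the file paths are placeholders, but the interaction pattern is the same: you hand the model an image plus a plain-language edit instruction.

```python
# Minimal sketch of instruction-based image editing using the open
# InstructPix2Pix pipeline (a related model, not MGIE itself) from
# the Hugging Face diffusers library.
import torch
from diffusers import StableDiffusionInstructPix2PixPipeline
from diffusers.utils import load_image

# Load a pretrained instruction-following editing pipeline.
pipe = StableDiffusionInstructPix2PixPipeline.from_pretrained(
    "timbrooks/instruct-pix2pix", torch_dtype=torch.float16
).to("cuda")

# "photo.jpg" is a placeholder path; substitute any local image.
image = load_image("photo.jpg")

# The edit is expressed as a plain-language instruction, exactly the
# interaction the article describes: say what you want changed.
edited = pipe(
    "make the sky more blue",
    image=image,
    num_inference_steps=20,
    image_guidance_scale=1.5,  # how closely to preserve the input image
).images[0]

edited.save("photo_edited.jpg")
```

The key design idea, in both InstructPix2Pix and MGIE, is that the instruction targets only the requested change (here, the sky) rather than regenerating or globally restyling the whole photo.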

If you are interested in how MGIE works, the model is available on GitHub. We’re not sure when Apple might bring this AI to its iPhones, but it offers a glimpse of what we might expect.
