
Apple is giving us a sneak peek into its AI-powered future


If you’re curious about what Apple’s implementation of AI could look like, you can actually find out right now. Apple released a new open-source AI model called “MGIE,” which it developed in collaboration with the University of California, Santa Barbara.

MGIE stands for MLLM-Guided Image Editing. It uses multimodal large language models to interpret natural-language commands and perform image edits. In other words, you can tell the AI how you want an image to be edited, and it will make those changes according to your instructions.

AI-powered image editing isn't new. In fact, it's one of the headline AI features in the Samsung Galaxy S24. But the problem with current AI editing is that it makes the decisions for you: it tries to guess what a "good" edit looks like. With this model, users have more control over the editing process.

For example, say you want the sky in a photo to look bluer. You can simply tell the model so, and it will edit just that part of the image. In some ways this is more user-friendly, since many people don't know how to edit a photo themselves. The approach is similar to how a client might brief a designer, giving instructions to reach a desired look.
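MGIE's actual pipeline runs the instruction through a multimodal LLM to decide what to change. As a purely illustrative toy sketch of the instruction-driven idea (not Apple's implementation; the function and keyword matching here are hypothetical stand-ins), the "make the sky bluer" request above could be imagined as text mapping to a targeted pixel operation:

```python
# Toy sketch of instruction-driven editing (NOT MGIE's real pipeline).
# MGIE interprets the instruction with a multimodal LLM; here a trivial
# keyword match stands in for that step, purely for illustration.

def apply_instruction(pixels, instruction):
    """pixels: list of (r, g, b) tuples; instruction: free-form text."""
    if "blue" in instruction.lower():
        # Boost the blue channel, clamped to the 8-bit maximum.
        return [(r, g, min(b + 40, 255)) for r, g, b in pixels]
    # Unrecognized instruction: leave the image unchanged.
    return pixels

image = [(120, 160, 200), (90, 140, 180)]
edited = apply_instruction(image, "make the sky more blue")
```

The point of the real system is that the user describes the outcome in plain language, and the model decides which pixels to touch, rather than the user hand-tuning sliders.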

If you are interested in how MGIE works, it is available on GitHub. We don't know when, or whether, Apple might bring this AI to the iPhone, but it offers a glimpse of what we might expect.

Tyler Lee
A graphic novelist wannabe. Amateur chef. Mechanical keyboard enthusiast. Writer of tech with over a decade of experience. Juggles between using a Mac and Windows PC, switches between iOS and Android, believes in the best of both worlds.
