🔗 Apple AI paper details how its multi-modal ‘Ferret-UI’ LLM can interpret phone UIs. — theverge.com // Wes Davis

The company has been steadily releasing research papers on its AI work for years. With Ferret-UI, Apple seems to be looking at how AI can help with smartphone navigation.

This isn’t the same thing, but it sounds quite similar to the Rabbit R1’s approach of teaching computers how to use human interfaces. I think it’s super smart.