So, we just witnessed one of the biggest Apple WWDC keynotes ever, possibly because of the current tech landscape and everyone’s expectations for AI in 2024. But interestingly, Apple saved the AI talk for the latter half of their presentation. The first half? Not a single mention of AI. Then, in the second hour, they unloaded all their AI updates, and of course, they branded it because they’re Apple. AI for them is now Apple Intelligence.
I watched the entire WWDC24 keynote and gathered my thoughts on all the major announcements. Here’s everything you need to know, along with some of my takes on it.
VisionOS 2.0: More Like 1.2?
Let’s start with VisionOS. We’re already on VisionOS 2.0, barely four months after the Vision Pro first shipped. There are some cool new features, though I’d argue this feels more like a 1.2 update. But okay, they’re calling it 2.0. There’s a new wrist-turn gesture to quickly see the time and battery percentage, then jump right into Control Center without looking up at the ceiling like before. It’s reminiscent of the Oculus Quest, and it’s definitely an upgrade.
There’s a bunch of immersive media features too. VisionOS 2 can now go through your old photos and use machine learning to turn them into spatial photos, which is really interesting if it works. There are also new tools to help people create spatial videos and immersive videos on cameras other than the iPhone, though of course you can still only view them on a Vision Pro, so who knows how valuable this will actually turn out to be.
Mac mirroring is also getting a big resolution bump this year, because the foveated rendering is now done on the Mac itself. That frees up a ton of resources and enables a double-resolution, ultra-wide view, which is way more room for windows and activities. Love that! And that’s pretty much it.
Again, it’s not the biggest “2.0 update” we’ve ever seen, but there are some nice additions in there, including some new environments and the ability to rearrange your home screen. Believe it or not, you couldn’t do that before, but now you can!
iOS 18: Finally, Customization Galore
Moving on to iOS, this update is one of the most interesting in a long time. Android users, try not to laugh, but yes, iOS now lets you place icons anywhere on the home screen grid. This got an applause moment at the keynote, as if they had just invented a new feature. But fine, iPhone users, you’ve been waiting a long time for this. Have your fun!
But they didn’t stop there. They added a whole bunch of other customization features to the iPhone home screen too, and that’s where it gets a little more interesting. There’s a new theme engine that lets you tint every icon and widget on your home screen the same color, so you can match your icons to your wallpaper and have a coordinated look everywhere. It’s a simple thing, and we’ve been doing it on Android with launchers for years, but until this moment, basically every iPhone home screen looked the same. We’ve been saying that for ages. Apple has finally given you the ability to go crazy and kind of ruin your home screen if you want to, and people seem to absolutely want to.
iOS 18: New Features, Hidden Apps, and More – A Major Overhaul
Apple redesigned the Control Center to be fully customizable across multiple pages, so there’s a lot more going on in there. There are also many more customizable shortcuts in more places, including, finally, the ability to replace the flashlight and camera shortcuts on the lock screen without jailbreaking. And the list keeps going.
Pardon me if you’ve heard these before, but you can now hide apps inside your App Library, and you can lock apps into a hidden folder, which makes it easier than ever to hide Tinder from your wife! There are also scheduled text messages in iMessage. Great, finally, they’re having some fun over there. There’s text formatting, there’s Messages via satellite, and RCS support was very casually, and briefly, mentioned.
The Photos app is redesigned. There’s also a Game Mode that minimizes background activity and reduces Bluetooth latency for peripherals. Great. And there’s new automatic categorization in the redesigned Mail app. So as you can tell, there’s been a whole bunch of stuff.
AirPods, Apple TV, and WatchOS: Minor but Sweet Updates!
AirPods got minor updates like voice isolation during calls, spatial audio in games, and head-nod/shake controls. Apple TV now lets you swipe down to see all the actors, characters, and songs on screen in real time, and it can boost voices over background music, which is nice! WatchOS, meanwhile, has a training mode that evaluates workout intensity over time, which is sweet if you’re a fitness freak.
iPadOS 18: The Calculator Story
Coming to iPadOS: this was supposed to be Apple’s big chance to convince us that the iPad is something special, that there’s a reason to get the newest one, the M4 model that just came out, over just your phone or a Mac. I’ll say it did get one incredible feature, but it didn’t do much more than that. The iPad gets all the stuff I talked about on the iPhone: full home screen customization, the customizable Control Center, the new Photos app, et cetera. It’s all great. There’s also a SharePlay feature that lets you remotely control someone’s iPad when they allow it, which seems like a small thing, but I promise you it’s massive for family tech support. You know who you are.
But the one massive, actually impressive new feature in iPadOS 18, and you’ll think I’m joking, but I’m dead serious when I say this, is the Calculator app. It’s the new calculator. I swear, I was so ready to dunk on this that I already had a short but taunting note drafted up as they were talking about it.
They finally brought the calculator to the iPad, and the screenshots looked just like the iPhone app. I was like, “Really? This is what you were so hyped about?” I had that clip of Craig Federighi queued up, the one where he’s like, “Yes, we had to do something super special to finally bring that calculator to the iPad.” But then they pulled the Apple Pencil off the side and opened up something called Math Notes, and it was pretty sick.
So now, on the iPad, you can write equations in your own handwriting, it will understand what you wrote, and it will answer the question in your handwriting. If you adjust the equation or add new information, it automatically updates the answer, which is so sick. I don’t know why I’m so impressed with this, but it was really cool to play with and look at.
And if that wasn’t impressive enough, it also supports variables. You can have variables written all over the page; in the demo there were g, x, a, and h, with y labeled as the height. You can then ask for an equation using those variables, and it gives you an answer. And if you write a y = equation, it will draw you a graph with real-time adjustment of any of the variables in your equation.
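The behavior described above, variables in one shared scratchpad and answers that update whenever a variable changes, can be sketched in a few lines. This is purely a toy illustration of the idea, not Apple's implementation; the `MathNotes` class and its method names are invented, and plain `eval` over a restricted namespace stands in for the handwriting-recognition-plus-solver pipeline.

```python
# Toy sketch of the Math Notes idea: variables live in one namespace,
# and any expression that references them is re-evaluated on demand,
# so changing a variable "updates the answer" automatically.

class MathNotes:
    def __init__(self):
        self.vars = {}    # variable name -> current value
        self.exprs = {}   # label -> expression string

    def set_var(self, name, value):
        self.vars[name] = value

    def define(self, label, expression):
        self.exprs[label] = expression

    def evaluate(self, label):
        # Restricted eval stands in for the real recognition/solver step.
        return eval(self.exprs[label], {"__builtins__": {}}, dict(self.vars))

notes = MathNotes()
notes.set_var("g", 9.8)
notes.set_var("h", 10)
notes.define("y", "h - 0.5 * g * 1.0 ** 2")  # height after 1 s of free fall
print(notes.evaluate("y"))   # roughly 5.1

notes.set_var("h", 20)       # tweak one variable...
print(notes.evaluate("y"))   # ...and the answer updates to roughly 15.1
```

The key design point mirrors what made the demo feel magical: the equation isn't a one-shot calculation but a live expression bound to the current values of its variables.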
AI stands for Apple Intelligence now!
Finally, Apple got to the AI part. They avoided saying “AI” outright, calling it Apple Intelligence instead. Apple has had AI features on its devices before: the Neural Engine inside their chips, smartly cutting subjects out of photos, autocomplete on the keyboard. That already existed. But Apple Intelligence is basically a bucket of new generative models and what they do on your devices: diffusion models, generative models, and large language models, all built by Apple, that bring new functionality to supported devices.
Actually, let me get this out of the way right off the bat: these features are only supported on the highest-end versions of currently available Apple silicon. That means the iPhone 15 Pro and Pro Max, any iPad with an M1 or later, and any Mac with an M1 or later. So what does that look like? Well, starting with these new OSs, there’s a small suite of tools sprinkled across everything. It’s not like there’s one Apple Intelligence app; they’re just scattered throughout.
Here’s an example: Writing Tools. We all know how powerful large language models can be, so anytime you’re writing in Pages, Keynote, or basically anywhere there’s a cursor, you can use Writing Tools to summarize or rewrite something you’ve already written, change the writing style, or just proofread. Basic, useful stuff. Here’s another one: remember the Magic Eraser tool in Google Photos and on the Android phones that use it? Apple’s finally doing that too. Built into their Photos app is a Clean Up tool, and it’s basically the same idea. It identifies background items, lets you circle things you don’t like in your photos, and removes them, filling in the background with generative fill, super quick.
iOS 18 also includes Genmoji, meaning generative emoji. This made me feel really old, because I would never use it, but apparently there’s a whole group of people who go searching for the perfect emoji and can’t find it. For those people, yeah, now you can generate a new one. You literally type a prompt, and a diffusion model creates that new emoji from scratch in the style of all the other emojis.
Then there’s an entire thing called Image Playground that’s built into a bunch of Apple’s apps, but also has its own separate app, and it lets you create nice little square images from prompts in three styles: sketch, illustration, and animation. And yes, there are also Siri improvements that use the large language models to understand context better and generally be a more natural assistant that isn’t garbage. Later in the year, it will also be able to pull info from inside apps and take actions inside apps for you.
I remember when this was a huge Bixby feature on Samsung phones, so that’s cool to see. Plus, there’s a big, pretty, new full-screen animation when you trigger Siri to tie it all together. Its voice is apparently slightly updated, and you can now type to Siri instead of talking to it out loud every time, which is long overdue but nice. There are Notes app summaries, there are phone call summaries, all kinds of features sprinkled around, like I said.
I think I’ll end up making an article just summarizing, testing, or reviewing all of the Apple Intelligence stuff, because it’s kind of all over the place, but now you know what it is.
But one question that’s been floating around the internet is: what happens on device versus what has to go to the cloud? This has come up a lot partly because you may have heard about Apple partnering with OpenAI to integrate ChatGPT (powered by GPT-4o) into Apple Intelligence.
Apple’s Approach to AI and Privacy
It’s the big question everyone’s wondering about, so I got the official answers from Apple. The answer is that almost everything intentionally happens on device, using Apple-built models, so that should be the fastest path. But when something is too complex, or just outside the area of expertise of Apple’s models, it can go one of two ways.
One: it goes to a larger server-based model running on servers Apple has built with Apple silicon, using what they call Private Cloud Compute. The info is never stored, and Apple can’t access it. It still has the downsides of leaving the device for the cloud: it might take a bit longer if you’re in an area with terrible internet, or it might not work at all. But when a big, complex request could benefit from those larger models, that’s what it’ll try to do.
The other path is when you specifically ask for something that ChatGPT would be good at. If that happens, it will explicitly ask you,
“Hey, is it cool if I ask ChatGPT for an answer to this question?”
Then you can say yes or no on the spot. Anytime it wants to do this, whether it’s uploading a photo you’re asking about or running a complex prompt in general, that little dialog box pops up and you have to say yes every time. Then you can tap into everything OpenAI’s model is capable of, or generate more realistic or varied styles of images that Apple’s diffusion models would never make.
This is all free and doesn’t require an account. OpenAI is never allowed to store any of these requests, and Apple has also said it will obscure your IP address.
So OpenAI can’t even connect multiple requests together to form a profile of you. A lot of thought has gone into making this as secure and private a version of cloud-based AI as they possibly can.
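The decision flow described above, prefer on-device, escalate to Private Cloud Compute for complexity, and require explicit per-request consent before anything touches ChatGPT, can be sketched as a simple routing function. The tier names mirror Apple's public description, but the function, its parameters, and the complexity threshold are all invented here for illustration.

```python
# Hypothetical sketch of the three-tier Apple Intelligence routing
# described above. The scoring logic and names are illustrative only.

from enum import Enum

class Tier(Enum):
    ON_DEVICE = "on-device model"
    PRIVATE_CLOUD = "Private Cloud Compute"
    CHATGPT = "ChatGPT (per-request consent)"

def route(complexity: int, needs_world_knowledge: bool,
          user_consents_to_chatgpt: bool) -> Tier:
    """Pick where a request runs, always preferring on-device."""
    if needs_world_knowledge:
        # Outside the local models' expertise: the user is asked
        # for permission every single time, never silently.
        if user_consents_to_chatgpt:
            return Tier.CHATGPT
        return Tier.ON_DEVICE  # declined: best-effort local answer
    if complexity <= 5:
        return Tier.ON_DEVICE
    # Too heavy for the local model: Apple's own servers.
    return Tier.PRIVATE_CLOUD

print(route(3, False, False).value)  # on-device model
print(route(9, False, False).value)  # Private Cloud Compute
print(route(4, True, True).value)    # ChatGPT (per-request consent)
```

The notable design choice is that consent gates only the third tier: Private Cloud Compute is treated as an extension of the device's own models, while ChatGPT is opt-in on every single request.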
My view
It’s really interesting; there’s a lot going on here. My overall take on these WWDC24 announcements, and the Apple Intelligence stuff in particular, is that the Humane Pin and the Rabbit R1 were doomed from the start. Even if they had been good up to this point, there’s just no way they could compete with features this personalized and integrated, living on your phone with all the info it already knows about you. So, yeah, it’s a lot to take in. This WWDC felt like a new era for Apple, with AI taking center stage.
Overall, the non-AI updates are solid too, like home screen customization, but AI advancements are the highlight. I’m excited to see how this Apple Intelligence stuff pans out. Stay tuned for more detailed reviews and summaries!