author     Xe Iaso <me@xeiaso.net>    2025-01-04 13:27:50 -0500
committer  Xe Iaso <me@xeiaso.net>    2025-01-04 13:33:37 -0500
commit     cc3cae82866f17abfc507ee5db7ddaf6b76c4259 (patch)
tree       57b11b6bf2bb0fe30f213f5136017187354ee3c9
parent     98be592d41381b23b09ff463d6cf1c61adb25348 (diff)
download   xesite-cc3cae82866f17abfc507ee5db7ddaf6b76c4259.tar.xz
           xesite-cc3cae82866f17abfc507ee5db7ddaf6b76c4259.zip
stage they squandered the holy grail
Signed-off-by: Xe Iaso <me@xeiaso.net>
-rw-r--r--   go.mod                                          3
-rwxr-xr-x   lume/src/blog/2025/squandered-holy-grail.mdx    318
2 files changed, 320 insertions(+), 1 deletion(-)
diff --git a/go.mod b/go.mod
index 7ca77be..f8ba1f1 100644
--- a/go.mod
+++ b/go.mod
@@ -1,6 +1,7 @@
module xeiaso.net/v4
-go 1.23.0
+go 1.23.1
+
toolchain go1.23.4
require (
diff --git a/lume/src/blog/2025/squandered-holy-grail.mdx b/lume/src/blog/2025/squandered-holy-grail.mdx
new file mode 100755
index 0000000..09bb18a
--- /dev/null
+++ b/lume/src/blog/2025/squandered-holy-grail.mdx
@@ -0,0 +1,318 @@
+---
+title: "They squandered the holy grail"
+desc: "Why Apple Intelligence failed even though everything it's built upon is nearly perfect"
+date: 2025-01-06
+index: false
+patronExclusive: true
+hero:
+ ai: "Photo by Suliman Sallehi, found on Pexels"
+ file: "shaka-walls-fell"
+ prompt: "A crumbling ruin of a once-mighty building on a hill in Afganistan"
+ social: false
+---
+
+A while ago, I got really frustrated with my Samsung S7. It was failing to hold a battery charge, having issues with Wi-Fi or DNS over LTE or something, and I reached a breaking point where I bussed over to Bellevue Square and bought an iPhone 7. It was the first Apple product I'd ever bought with my own money and my first non-Android phone since I used Windows Mobile 6 on a T-Mobile Dash in high school.
+
+Needless to say, I loved it at first sight, and all my phones since have been iPhones. The camera is good enough that I have to go out of my way to make the photos from my actual cameras look different from what you can get on an iPhone. Hell, the iPhone is a fully capable cinema camera these days. It's easily been one of the best technology moves I've ever made for my creative career. The device enables me to do things and create memories of them to share with others.
+
+## Bicycles for the mind
+
+Way back in 1981, Steve Jobs (one of the co-founders of Apple) described the vision of Apple computers like this:
+
+<blockquote>
+ I read a study that measured the efficiency of locomotion for various species
+ on the planet. The condor used the least energy to move a kilometer. And,
+ humans came in with a rather unimpressive showing, about a third of the way
+ down the list. [...] But, then somebody at Scientific American had the insight
+ to test the efficiency of locomotion for a man on a bicycle. And [...] a human
+ on a bicycle, blew the condor away. [A computer is] the most remarkable tool
+ that we’ve ever come up with, and it’s the equivalent of a bicycle for our
+ minds.
+</blockquote>
+\-[Steve Jobs](https://www.goodreads.com/quotes/9281634-i-think-one-of-the-things-that-really-separates-us)
+
+Apple computers aim to make it easier for people to be creative while spending less energy to do it. One of the big things that Apple made the Macintosh for was typography. With that, they made [MacWrite](https://en.wikipedia.org/wiki/MacWrite), one of the two programs that shipped with every Macintosh computer for free. If you were used to having to write documents out by hand or using a typewriter to make them, the leap to something like a word processor is so amazingly vast that it's difficult for anyone younger than me to comprehend it. We've had them all our lives.
+
+<Picture
+ path="blog/2025/squandered-holy-grail/macwrite"
+ desc="A screenshot of an emulated Macintosh running MacWrite with the first paragraph of The Bee Movie script in it."
+/>
+
+Imagine not being able to reliably use the backspace key when you're writing something. Imagine a world where all you could do was just write more text. Sure, there were ways to "cover up" a mis-typed letter, but they were vastly more inconvenient than just ignoring it or re-typing the word and crossing the wrong one out by hand.
+
+Word processors let you use the backspace key to delete text and then look at the screen to get a reasonable approximation of what the printed document would look like. Before you print it.
+
+To say that this enables a vastly different kind of creative process is like saying that water makes things damp. Word processors like MacWrite absolutely transformed the ways that everyone used computers. They were bicycles for the mind, and without them our world would be starkly different. I shudder to imagine what [NaNoWriMo](https://nanowrimo.org/) would look like without word processors.
+
+Many companies want to make computers that you can use to do computer things. Apple makes tools that you use as an extension of your body in order to do creative things. They don't just sell computers; they sell things that help you create, things that just so happen to be computers.
+
+This is the big vision difference that puts Apple in its own class. They sell bicycles for the mind.
+
+## Intelligence as a faucet
+
+In June 2024, Apple announced [Apple Intelligence](https://www.apple.com/apple-intelligence/): a set of features aimed at making your smartphone smart. The biggest thing that stood out to me was this example of what Apple Intelligence was going to enable in Siri:
+
+<Picture
+ path="blog/2025/squandered-holy-grail/podcast-other-day"
+ desc="An Apple Keynote slide saying 'Play that podcast my wife sent the other day'."
+/>
+
+If they could really just correlate relationships, categorize links, and make all of that context visible to Siri, that would be fundamentally transformative, the same way the word processor was for people who had never had one before. Everything else done with it would be an added bonus or a party trick on the side. The real benefit would be being able to search through all of your digital life across every app with simple queries and then have your phone do things for you.
+
+Sure, Craig's example was playing a podcast, but the basic idea holds for other types of media too. "Share those pics from San Francisco to Instagram." All the context that everything is building up would finally be useful to the users instead of just being useful to the companies making all of the apps we use.
+
+They wanted to make every Apple device able to tap into _intelligence_ as a faucet, the same way that Spotify lets you tap into music as a faucet and the AWS API lets you tap into compute as a faucet. This is massive, and if it were pulled off right, it would become the new standard that companies like Samsung and Google would clone, the same way they cloned the hardware and software design of the iPhone.
+
+In that keynote, they spelled out the vision that computers should work _with_ you in order for you to do what you want to do. They should enable you to be creative. They should be bicycles for the mind. The fact that they are computers should only be a footnote in an appendix titled "implementation details".
+
+Then they casually dropped the holy grail of trusted compute, but in order to understand why it's so big we need to take a little detour into the modern Internet user's view of the Internet.
+
+## Apps as thin as reception
+
+One of the biggest problems with modern applications is that they are thin shells around web services. When you open the Instagram or Bluesky apps, your phone makes a request to their servers and then shows you posts when it gets a response. You don't know or care how those responses are getting made, you just know that when you open the app, you get content and that makes you happy.
+
+However, when you don't have signal, you don't have the app. Go on an airplane and once you run out of reception the app is worthless. You can't queue posts to be made when you get back into signal. You can't view posts that were available before you lost connection. You can't even view things that you just posted in some cases. The app breaks and you are slowly alienated from your data one photo at a time.
+
+This is the way nearly every single app on my phone works, with only two exceptions: Signal and everything Apple makes. If you want to read more about what the modern user's experience of the Internet is like, check out Ed Zitron's [Never Forgive Them](https://www.wheresyoured.at/never-forgive-them/).
+
+<blockquote>
+ As every single platform we use is desperate to juice growth from every user,
+ everything we interact with is hyper-monetized through plugins, advertising,
+ microtransactions and other things that constantly gnaw at the user
+ experience. We load websites expecting them to be broken, especially on
+ mobile, because every single website has to have 15+ different ad trackers,
+ video ads that cover large chunks of the screen, all while demanding our email
+ or for us to let them send us notifications.
+</blockquote>
+\-[Ed Zitron](https://www.wheresyoured.at/never-forgive-them/)
+
+Not to mention, you don't know how the services that power your apps work. The market at large does not want to pay for chat programs or social media. Running chat programs and social media apps is mind-bogglingly expensive. Venture capital only lasts so long and the companies involved have to make money somehow. The big pile of user data starts to look like a really good thing to mine in order to make a profit.
+
+## The holy grail of trusted compute
+
+This stands in stark contrast to the goal of something like Apple Intelligence. When possible, Apple Intelligence will run on your device. Apple [went out of their way](https://arxiv.org/abs/2312.11514) to make it possible and easy to run large language models and other AI models on your device without having to make too many compromises in the process. If something is done on your device (or at least on hardware that you can look at, like a Mac mini in your office), then the computation is _infinitely_ more private than anything involving making a request to the outside world.
+
+In Apple's WWDC keynote they claimed that they had a system called Private Cloud Compute that would enable users to have the same privacy guarantees (or more) when making requests out over the network as they did for computations running on their local devices.
+
+This seemed impossible to me. From what I know about how the web service sausage is made, you shouldn't be able to have all of these guarantees at the same time:
+
+- User data is only used to fulfill requests and then erased.
+- The load balancing infrastructure doesn't know who is making a request and what server it is going to.
+- Researchers are able to inspect and verify the Private Cloud Compute system and simulate it on their laptops.
+- Apple site reliability staff does not have privileged access to Private Cloud Compute nodes and logging is minimized at the compiler level.
+- An attacker cannot reliably figure out which node is being used to make any request from any user.
+
+If you have any modicum of site reliability experience, this seems like an unsatisfiable set of constraints. It seems literally impossible, yet here they are claiming that they have done it.
+
+The [technical details](https://security.apple.com/documentation/private-cloud-compute) of how they pulled this off are well worth reading, if only because it is the first time I have ever seen any company's AI product team put together a cogent security model and release that security model to the public. TL;DR:
+
+- They X-ray the hardware at every step of the assembly process and compare that to reference images in order to combat threats from factory workers adding unapproved hardware to the boards of the servers.
+- You can set up your own local copy of a Private Cloud Compute node and punish it with all the hellfire you want to see if you can break it and get root. Apple will pay you a lot of money if you can.
+- The hardware certification process involves a lot of unrelated people in unrelated departments of Apple.
+- Every Private Cloud Compute node not only decertifies itself when power is removed, the main power for the board is also wired through the chassis intrusion switch. Open the server? Power gets cut and the node is de-certified.
+- Every time your devices make a request to Private Cloud Compute, they record the node ID that was used to fulfill it and you can go in and verify that all of the nodes your device used are still certified.
+- The production OS images are free for the public to download and not encrypted in any way.
+- Every package that makes up the important parts of the OS is one of two types: code or data. You cannot mix code into a data package or vice versa.
+
+This is literal madness in comparison to how most other AI products are run. Most of the time, an AI product runs on some GPUs you got somewhere, running firmware that you probably haven't tested or verified (even though everyone with access to the GPU can reflash that firmware from software), with bog-standard nginx or something routing your requests to a service running somewhere, with no real guarantee that the service isn't logging and storing literally everything you put into it. From a user privacy standpoint, it's basically the same as using Instagram. You assume that everything is being logged and used to make money somehow.
+
+Apple is standing in stark opposition to this and saying "no, we ain't doing that" and then backing it all up with code as well as detailed documentation for how they pulled it all off. They also released the source code for the security-critical parts of Private Cloud Compute [openly on GitHub](https://github.com/apple/security-pcc).
+
+This is the holy grail for remotely attested trusted compute. This OS is the kind of thing that Richard Stallman was warning about in [The Right to Read](https://www.gnu.org/philosophy/right-to-read.en.html). You don't get root there. You don't get a compiler. You don't get a debugger. You don't get anything but the ability to run software that was shipped with the OS image. If this OS were shipped to consumers, you would have a nearly unhackable system that would be basically impossible to tinker with. There are many reasons why you would want such a thing in [the era of phone scamming the elderly](https://www.youtube.com/watch?v=dWzz3NeDz3E), but it would make it difficult for people like me to develop with it.
+
+However, for something like Private Cloud Compute, it's a perfect match. Everything the computer can do is known in advance and nothing else is allowed to happen. This makes it a lot easier to ensure that the privacy guarantees are just that: guarantees.
+
+It's really frustrating that this foundation of trusted compute is being squandered. I wish I had an OS like Private Cloud Compute's as an option for building production systems.
+
+## What we got
+
+We got the first batch of Apple Intelligence features at the end of October 2024. They've been advertised as if the whole suite had already shipped. With that we got Writing Tools to help you summarize and rewrite text; summaries for notifications, webpages, and emails; Clean Up in case you want to remove things from photos; the ability to search for photos based on their contents; Siri being able to search through the documentation for your device; and Math Notes to let you solve equations in the Notes app. Later we got Image Playground and email categorization. That mythical personal context is omnipresent in the advertising, yet somehow it still hasn't launched.
+
+I'm gonna break down my feelings about each of these features in their own little sections after having used them somewhat extensively.
+
+### Math Notes
+
+I just want to start out by saying that out of all of these features the one that I love the most is Math Notes. Holy crap, Math Notes is incredible. It lets you type out things like this:
+
+```bash
+Rent = 2300
+FamilySize = 2
+Rent / FamilySize =
+```
+
+And then the Notes app will just insert `1150` after that last equals sign. It's fantastic. I end up using this to do basic calculations with variables, including terrible estimates of how much money I'm making versus spending. It's probably one of the best features ever made for any Apple device.
+
+I have zero complaints about Math Notes.
+
+### Writing Tools
+
+I'm not the target audience for Writing Tools. I've written a bunch of articles on my blog and frankly, I don't really need help from an AI model in order to write; I have a bigger problem not writing than I do writing. With that in mind, Writing Tools is basically useless for me. It's just a way to get a slightly worse version of what I can already make better myself.
+
+Using Writing Tools makes me feel the worst side effects of existing AI tools and how they replace the creative process instead of adding to it. When I put text through Writing Tools, I get an opaque answer. I don't get layers into the answer or steerability beyond "make this professional" or "turn this into a table"; I just get a blob of text that can be painful to read.
+
+For example, I asked Writing Tools to summarize that last paragraph and here's what I got:
+
+<Picture
+ path="blog/2025/squandered-holy-grail/writing-tools-fail"
+ desc="a modal dialog over the previous paragraph in Apple Notes titled 'Writing Tools Unavailable: Certain capabilities are unavailable at this time. Try again later.'"
+/>
+
+You can't make this up. I asked it to make it friendlier and I got this:
+
+> Using Writing Tools has some serious drawbacks. It feels like it’s taking away the creative process instead of enhancing it. When I input text into Writing Tools, I get a generic response that lacks depth and customization options. It’s like getting a one-size-fits-all answer that’s hard to follow and doesn’t allow for much creativity.
+
+It can be useful for taking a bunch of ranting I do on stream and turning it into an outline that I can use as a starting point for writing, but it does a terrible job of doing the writing for me. I imagine it'd be a lot more useful for people who don't have extensive experience writing in English, but it's not very useful for me.
+
+### Notification, Message, and Email summaries
+
+This is the biggest feature that sounds like a good idea until you actually implement it. The core idea is that when you get a bunch of notifications from your apps, you have a stack of things that can be tedious to go through. A summary is easier to digest and gets the point across much more easily.
+
+This works great until it doesn't. Here's the summary of a scam text message I got as I was writing this post:
+
+<Picture
+ path="blog/2025/squandered-holy-grail/sms-scam-fail"
+ desc="A message summary: 'Package delivery delayed due to incomplete address information...'"
+/>
+
+This phrases a _literal scam message_ in ways that make me think immediate action is required. You can see how this doesn't scale, right? It's gotten to the point where the news has reported on how notification summaries [made people think a suspect in custody killed themselves](https://www.bbc.com/news/articles/cd0elzk24dno).
+
+Even worse, if you have Apple Intelligence enabled for some of the other features but disable notification summaries because you find them worthless, your notifications can be delayed by up to _five seconds_. It's kind of depressing that telling your computer to do _less work_ makes the result take longer than doing _more work_.
+
+Additionally, none of the summarization features work on my iPhone and I can't be bothered to figure out why and fix it. I personally don't find them useful. I just leave them enabled on my MacBook so that notification delivery is not impacted.
+
+<Conv name="Cadey" mood="percussive-maintenance">
+ Even though it has decent "Apple polish", it just feels half-baked somehow.
+ It's almost like it's not done yet but they were made to just ship whatever
+ they had in order to meet some arbitrary deadline made up by someone that
+ doesn't understand the details. This feels like it's happening across the
+ industry though, especially as companies try to milk the money generator for
+ more money.
+</Conv>
+
+### Clean Up
+
+I don't like Clean Up from a philosophical standpoint. I'm a photographer. When I frame a shot and take it, I want the data coming off of the sensor to be the data that makes up the image. I want to avoid as much processing as possible and I want the photo to be a reflection of reality as it is, not reality as it should have been. Sure, sometimes I'll do some color correction or cropping in post, but that doesn't change the _content_ of the image, only its presentation.
+
+Clean Up is best explained by this famous photo editing example:
+
+<Picture
+ path="blog/2025/squandered-holy-grail/stalin-photo"
+  desc="A picture of Joseph Stalin, former Prime Minister of the Soviet Union, next to Nikolai Yezhov, shown before and after Yezhov was removed from the photo and from Soviet history after being purged."
+/>
+
+This tool allows you to capture a moment in time as you wish it happened, not as it actually happened. I don't like this from a philosophical standpoint. I'd much rather capture things as they were. As such, I haven't used Clean Up and can't talk about it much more.
+
+### Image Playground
+
+I have a lot of thoughts about Image Playground. I've used a lot of image generation models and I'm currently working on experiments with conveyance (images that convey feelings or moods that would take many words to explain) in generative AI. Here's one of my successful examples:
+
+<Picture
+ path="blog/2025/squandered-holy-grail/sakura-flower-field"
+  desc="A picture of a brown-haired anime woman smiling in a field of blooming pink flowers, with a shallow depth of field so only the woman and a couple of flowers are in focus. Made with Stable Diffusion 1.5 and ComfyUI."
+/>
+
+I made this using a stack of about 11 or 12 models in a complex diffusion flow built around a Stable Diffusion 1.5 finetune from late 2022. Let's call this the upper bound of how good you can get outputs from techniques of that era. There are some glaring flaws (mostly involving the continuity of the fence, but that could be explained away by fence construction methods).
+
+In comparison, here's one of my Image Playground generations of the East Berlin TV tower at sunset:
+
+<Picture
+ path="blog/2025/squandered-holy-grail/berlin-tv-tower"
+ desc="An AI-generated illustration of the East Berlin TV tower at sunset."
+/>
+
+This is also pretty good. There are problems with how the sky is half midday and half sunset, and the windows and decks have a lot of issues with their many straight lines, but it'd be mostly passable at a casual glance, especially on a phone screen. I'm able to see a lot more of the flaws due to my extensive experience with AI tools, but in a pinch you probably wouldn't blink too hard at this.
+
+I hate to admit that this is heavily cherry-picked. Most of the time, you will get horrors beyond mortal comprehension like this:
+
+<Picture
+ path="blog/2025/squandered-holy-grail/hoof-taco"
+ desc="An AI-generated illustration of a taco smoking beer at a party. The taco has hooves for feet and hands. It uses a placid corporate artstyle and communicates nothing."
+/>
+
+This is horrifying. I don't even know where to begin in talking about all of the things that are off or wrong with this image. I also don't think you would need special training or experience to understand what is wrong with this image.
+
+Mind you, both of those images were generated with plain text prompts. You can add people to these images. Using a photo of yourself is a great way to experience what dysmorphia feels like. Here's one of [Corey Quinn](https://bsky.app/profile/quinnypig.com) doing his typical gremlin smile:
+
+<Picture
+ path="blog/2025/squandered-holy-grail/oompa-loompa"
+ desc="An AI-generated illustration of a man smiling. The proportions are disturbing. The soulless eyes peer into you and make you contemplate where the effort into AI image generation has gone and what good for humanity could have been done with that money and effort. His pupils are square like his teeth."
+/>
+
+I cannot believe that this is a shipped product from Apple. I genuinely am stunned. What the hell is going on over there?
+
+This is from the company that refused to ship so many things that we'll never hear about. This is from the company that _defined_ the idea of having a vision-based product. Of having a product vision so strong that they were willing to accuse people of _holding the device wrong_ rather than admit they messed up.
+
+<Conv name="Cadey" mood="coffee">
+ I feel like Image Playground (and Genmoji, which isn't talked about here due
+ to the fact that it's difficult to extract emoji from chat messages without
+ losing quality) creates results that are just as soulless and empty as it is.
+ This is the complete opposite of the level of care and quality that I've come
+ to expect from Apple over the years. It's like they've been forced to just
+ ship something due to either investor pressure or not wanting to be behind on
+ the curve; and nobody at the product team was able to stop it from hitting the
+ market.
+</Conv>
+
+And now every company out there is going to copy this with open-weights models and make things that don't look like horrifying monsters. While you get the oompa-loompas of doom staring into your soul on iPhones, the rest of the industry is going to be able to make things like this:
+
+<Picture
+ path="blog/2025/squandered-holy-grail/flux-pro-berlin-tv-tower"
+ desc="A cartoon illustration of the Berlin TV tower at sunset. The sky has many shades of gold and red as it fades to twilight."
+/>
+
+It's frustrating. It'd be better if there were an IntelligenceKit for developers to be creative with the models or something, but there isn't. It all just feels half-baked, like they were forced to release it out of obligation to shareholders, not out of a choice to meet the product vision.
+
+## Generative AI is not a product
+
+Back in September, I had a strange dream. If you know me well enough, you know that when I have a "strange dream", that usually means something wild happened. In this dream, I had a conversation with Steve Jobs about product design and the philosophy of Apple enabling people to be creative, but the most salient point we discussed was this:
+
+> The real way that technology can change lives is by acting as a bicycle for the mind, a way to take humans' latent creativity and allow them to focus it and employ it on something that makes their lives better. Imagine picking up a guitar and creating a song by purely feeling out the notes and working them into a melody just from what feels "right". Based on what you've described, most generative AI is useless for that because it removes all the creative control when going from A to B.
+>
+> If anything, the human cost seems like it would outweigh any process gains from being able to draw a cat on the moon faster. Generative AI is completely useless as a product unto itself, but it could be part of a larger product in some way. It should never be the selling point.
+
+\-"Steve Jobs" in a dream, September 2024
+
+Breaking this apart, what does being able to make a terrible illustration of [the Berlin TV Tower](https://cdn.xeiaso.net/file/christine-static/blog/2025/squandered-holy-grail/berlin-tv-tower.jpg) in a second or two really net us in terms of enabling creativity? You get a single final output. You don't get the layers to edit things like the color grading of the sky. Sure, it'd be useful for low-effort social media posts, but this is not a product. This is a tech demo, and not even a good one. It would have been amazing if it had been released three years ago, but it is 2025, not 2022.
+
+If generative AI is not a product, then what is it really useful for? I know how to use it in creative flows because I already have the training needed to be an artist. I know how to use it in research environments from years of experience throwing science at the wall to see what sticks. I understand these tools and what they are good and bad at (this is why all of the AI illustrations I put effort into end up with an anime-inspired artstyle: recreating humans photorealistically gets you inhuman monsters 7 times out of 10).
+
+I think that it's better to view generative AI as an implementation detail, not the critical identity of the product. One of the best ways to understand a product is to start taking things away. If you take color out of a word processor, you still have a word processor. If you take bold or italic formatting out of a word processor, you still have a word processor. If you take font selection out of a word processor, you still have a word processor.
+
+If you take away the display output from a word processor, you have a typewriter instead of a word processor. Thus, the core of a word processor is being able to see on the screen what you would see on the page before you hit print.
+
+The core of why ChatGPT works as a product isn't the AI. It's the experience of each word being typed one at a time by the AI and saving your conversations with the AI for later.
+
+### Where should we use generative AI?
+
+In terms of where I think generative AI is actually useful, it's in places that are not as flashy or exciting. Think data analysis, [qualitative data coding](https://www.simplypsychology.org/qualitative-data-coding.html), data entry, reading data out of images, and things of that nature. I've been working with a fellow redditor on a study involving people's experiences of meditation and the difficult-to-describe sensations that come up. We want to use generative AI to try to categorize those sensations and see if we can get an effective result without as much drudgery as you'd get doing it by hand.
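+
+To make that concrete, here's a minimal sketch of the kind of first-pass coding I mean, assuming a local [Ollama](https://ollama.com) instance with a small model pulled. The categories, descriptions, and model name are all made up for illustration; nothing here is from the actual study.
+
+```go
+// A hypothetical sketch of LLM-assisted qualitative coding against a local
+// Ollama instance. Everything stays on the machine running it.
+package main
+
+import (
+	"bytes"
+	"encoding/json"
+	"fmt"
+	"net/http"
+	"strings"
+)
+
+// Made-up category list and survey responses, purely for illustration.
+var categories = []string{"warmth", "tingling", "pressure", "lightness", "other"}
+
+var descriptions = []string{
+	"It felt like my hands were buzzing with static.",
+	"A heavy weight settled over my chest, not unpleasant.",
+}
+
+// categorize asks the local model to pick exactly one category for a description.
+func categorize(text string) (string, error) {
+	prompt := fmt.Sprintf(
+		"Assign exactly one of these categories to the sensation described below: %s.\nRespond with only the category name.\n\nDescription: %s",
+		strings.Join(categories, ", "), text)
+
+	body, _ := json.Marshal(map[string]any{
+		"model":  "llama3.2", // assumption: any locally pulled model works here
+		"prompt": prompt,
+		"stream": false,
+	})
+
+	resp, err := http.Post("http://localhost:11434/api/generate", "application/json", bytes.NewReader(body))
+	if err != nil {
+		return "", err
+	}
+	defer resp.Body.Close()
+
+	var out struct {
+		Response string `json:"response"`
+	}
+	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
+		return "", err
+	}
+	return strings.TrimSpace(out.Response), nil
+}
+
+func main() {
+	for _, d := range descriptions {
+		label, err := categorize(d)
+		if err != nil {
+			panic(err)
+		}
+		fmt.Printf("%q -> %s\n", d, label)
+	}
+}
+```
+
+A human still has to check the labels, but that tedious first pass is exactly the kind of unglamorous work these models are decent at.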
+
+I'll have more news about this by June. It'll involve publishing a paper or two in actual journals.
+
+## Conclusion
+
+I think that Apple Intelligence is a failure of a product from an implementation standpoint. This is frustrating because the foundation they are building on top of is nearly invincible. As much data as possible is processed on device. Everything that can't be processed on your device is handled with frontier-grade security practices to make sure it's as private and encrypted as possible.
+
+The thing that sucks about it is that they made the holy grail of remotely attested trusted compute and then made the end result so much worse to use than manually making your own integrations with [Ollama](https://ollama.com) on the _same device_. Using Ollama lets you pick models that are so much better than what you get with Apple Intelligence. And it'd be just as private.
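+
+For reference, here's a minimal sketch of what such a manual Ollama integration could look like, assuming Ollama is listening on its default local port and you've already pulled a model (the model name below is just an example). The request never leaves your machine.
+
+```go
+// A hypothetical minimal Ollama integration: one prompt, one answer,
+// all on localhost.
+package main
+
+import (
+	"bytes"
+	"encoding/json"
+	"fmt"
+	"net/http"
+)
+
+func main() {
+	body, _ := json.Marshal(map[string]any{
+		"model":  "llama3.2", // assumption: swap in whatever model you have pulled
+		"prompt": "Explain why on-device inference is more private than a cloud API.",
+		"stream": false,
+	})
+
+	resp, err := http.Post("http://localhost:11434/api/generate", "application/json", bytes.NewReader(body))
+	if err != nil {
+		panic(err)
+	}
+	defer resp.Body.Close()
+
+	var out struct {
+		Response string `json:"response"`
+	}
+	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
+		panic(err)
+	}
+	fmt.Println(out.Response)
+}
+```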
+
+<Conv name="Cadey" mood="coffee">
+ I just can't help but imagine what it could have been. I know that the Apple
+ we have would never do that, but I just can't help but wonder. Apple spends
+ untold amounts of money trying to create things and they get beaten by a bunch
+ of people in caves with boxes of scraps and consumer GPUs.
+</Conv>
+
+<Conv name="Aoi" mood="coffee">
+ You know, maybe that's why the open-source community will always win here.
+ Apple has no real limits. The open-source community has to milk everything
+ they can get out of the hardware they have. Their withered hardware requires
+ them to use [lateral
+ thinking](https://newsletter.bijanstephen.blog/lateral-thinking-with-withered-technology/)
+ to get what they want. And they'll pretty much always win because then they
+ can deploy their creations into production. With zero modifications.
+</Conv>
+
+Needless to say, they did not give us bicycles for the mind. They gave us marginal improvements that feel like tech demos. The potential was so infinite, and it all just feels wasted.
+
+Except for Math Notes. Holy crap. I love Math Notes so much. I wish other note-taking apps had it. It's easily the best feature they've ever come up with.
+
+I have a lot of complicated and nuanced thoughts about all this, and probably still have another 5-10k words left in me. Wish me luck.