
Comments (56)

  • hedgehog
    Me, driving, using Apple Maps for navigation: "Hey Siri, what's my ETA?"
    Siri: ... "Here's what I found on the web for 'what is my ETA'"
    From the outside I don't know the cause, but contrary to their usual reputation for tight integration between parts of their products, it seems like Siri is in some organizational way fundamentally broken.
  • phantomathkg
    As a Hongkonger, native in Cantonese (zh-hk/yue), fluent in English (en), and working in Singapore (en-sg & zh-sg), I have an even more mundane use case that Siri is not clever enough to support.
    I want my location to be Singapore (Singapore SIM/card etc.), I want my UI to be in English, and I want Siri to speak Cantonese to me.
    For reasons only Apple knows, I have to use English (Singapore) as both my UI and Siri language or Apple Intelligence will not turn on. As if the engineers who developed Siri/Apple Intelligence never thought about the needs of those who speak more than one language.
  • ericwaller
    I find the idea of IRL multi-user UX really interesting. So much of modern computing is built around a 1-to-1 model of users and devices, and then multiplayer, collaboration features are built on top of that. Sometimes they’re quite slick (e.g. Figma) and sometimes they’re pretty clunky (e.g. Apple’s family sharing stuff).
    But what’s really lacking is a model for multiple people sharing a single computing experience in real life. Companion mode in Google Meet and Spotify Jam are two attempts, but both still force you through the one-user, one-device path.
    Two adults sitting in a car shouldn’t have to constantly think “whose phone is connected to CarPlay?”, especially when they’re part of the same Apple “family” and on a Spotify family plan.
    Two people seamlessly interacting with one “system” would break all sorts of auth and other assumptions, but it seems worth figuring out as computing becomes more and more prevalent in every facet of life.
  • gizajob
    I agree - this isn’t complicated. It seems to me that the issue arises from Apple’s “what-if-ism” - what if you get divorced, what if one of the kids grows up and stops speaking to you, what if the dog dies, etc. etc., a million different versions of which will get them bad press: “Apple told me to go and pick up my dead child’s cancer medication!” Hence it falls under the Steve Jobs line “we say NO to a thousand good ideas before we say yes to one.”
    And also, living without it doesn’t really affect Apple’s bottom line. But yeah, I wish I had an AI assistant in my iPhone which would text my parents back with what I’m doing today and reply to the needless updates I get since buying them smartphones.
    Siri in general seems, for me at least, superfluous. The answer to most questions I ask is “I don’t know” or “I didn’t catch that” or “I can’t”.
    AI in general is still causing me major question marks, especially when it comes to the valuations right now on the stock market. This morning I was watching Bloomberg at the European open and noticed one of my stocks wasn’t really moving as usual, and the presenter then announced that the Nordic markets were closed today because of the Ascension Day public holiday. So I googled “is the Danish stock market open today?” and naturally Google’s AI was the top link, proudly announcing “Yes! The Danish market is open today, here are the hours yadda yadda”. I scrolled down, found the actual link to the exchange, and it showed that, of course, the market is closed - it’s Ascension Day. So I asked the Google AI, “are you sure about that?”, and it thought again and found that “no, the Danish stock market is closed today. I apologise for telling you it was open without checking”. Honest to god, this is the tech that’s putting Nvidia at a $5.5 trillion valuation and keeping the market at all-time highs right now? A technology that makes even Google worse?
  • gyomu
    The potential for creepiness and abuse in the example use cases given is too damn high. At Apple scale, a single case of an insane stalker misusing this technology is image-destroying - see some of the PR debacles they’ve been facing lately with AirTags.
    These sorts of things are exactly what hand-rolled setups à la OpenClaude are great for - the potential for an insane privacy disaster is still there, but in that case you have no one to blame but yourself.
    Large tech companies aren’t going to take that heat for features that aren’t really monetizable.
  • jerf
    Funny, I was just sort of spec'ing this out to myself yesterday.
    I'd consider building the system out as an MCP server rather than trying to bundle the agent with it. I had an AI build something that is just a tasklist that works the way I think about tasks, which I've been using both personally and professionally. It's an MCP server only, which I can expose on the internet with OAuth. It has been surprisingly fun to use, because the AI can spontaneously interact with the information in ways I didn't program in. I have a recurring task with an AI to give me a dump of my current top tasks to my phone once a day.
    Professionally, I'm working between a lot of different teams with their own Jira boards, and I needed something of my own to organize and prioritize tasks that can't be prioritized in one place in Jira. With the Atlassian MCP server hooked up to the same agent as my code, it is fairly trivial to attach a Jira bug to a task and then prompt the AI to do whatever to the bug attached to that task. I put an explicit field for it into the task definition, but you don't even really need that; just putting the bug in the description is all that was really necessary.
    The point I am trying to make here is: you don't even really have to "design" a product at this point. You just need to expose things to the AI so that when the user makes some vague statement about what they want to do, it can convert that into concrete calls. The AI and the user will do things with it that you didn't even think of, and users can add things just by saying them in the descriptions of various tasks. I've mentioned how even if AI were to freeze today for the next 10 years, we'd still be learning how to use it and getting more out of it... this is, I think, a still under-explored application space.
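    (The tasklist-exposed-as-tools pattern described above can be sketched without any MCP SDK. Below is a hypothetical, stdlib-only Python illustration of the shape: every name here - `Task`, `TaskStore`, `jira_key`, the `TOOLS` registry - is made up for the example, and a real setup would register these same operations as MCP tools behind an actual server, not a plain dict.)

    ```python
    # Hypothetical sketch: a task store whose operations are exposed as named
    # "tools" so an agent can turn a vague request into concrete calls.
    # A real deployment would wrap these in an MCP server; this stdlib-only
    # version only shows the pattern.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Task:
        title: str
        priority: int = 3                 # 1 = highest
        jira_key: Optional[str] = None    # optional link to an external tracker

    class TaskStore:
        def __init__(self) -> None:
            self.tasks: list[Task] = []

        # Each method below would be one tool, described to the agent
        # with a name and a parameter schema.
        def add_task(self, title: str, priority: int = 3,
                     jira_key: Optional[str] = None) -> Task:
            task = Task(title, priority, jira_key)
            self.tasks.append(task)
            return task

        def top_tasks(self, n: int = 3) -> list[Task]:
            # "Dump my current top tasks" becomes a single tool call.
            return sorted(self.tasks, key=lambda t: t.priority)[:n]

    store = TaskStore()
    # The registry an agent sees: tool names mapped to callables.
    TOOLS = {"add_task": store.add_task, "top_tasks": store.top_tasks}

    # Simulated agent turn: a vague user request ("track that auth bug,
    # it's urgent") becomes concrete tool calls.
    TOOLS["add_task"]("Fix login bug", priority=1, jira_key="AUTH-42")
    TOOLS["add_task"]("Write release notes", priority=2)
    for t in TOOLS["top_tasks"](2):
        print(t.priority, t.title)
    ```

    The point of the dict-of-callables shape is that adding a capability means adding one entry, not redesigning a UI; the agent discovers and combines the tools on its own.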
  • michelb
    This requires integration of various systems, apps, and services that just isn't possible until Apple really restructures the entire organisation and its ways of working. Many things in Apple's first-party ecosystem feel developed by teams completely unaware of other teams' products.
  • andai
    I wonder if it's a cultural thing? I occasionally see family-oriented software, but it seems to be mostly solo devs making custom solutions for their own families. It seems like there would be more of a market for it.
    Although, I guess most software has "user" and "organization", and family kind of slots into the second one. But most of that software isn't oriented to the needs of actual families.
    Based on the author's list, what's needed is some kind of dashboard that integrates many different systems together. The 2000s were kind of moving in that direction, with different platforms being interoperable and UIs being highly customizable.
  • bio-s
    > Know that my son has a test on Thursday and hasn’t opened the revision material since Monday. A gentle nudge, not a surveillance report.
    I'm sorry to read that. Looks like it's good that Apple didn't build that yet.
  • KaiserPro
    > And none of it requires SOTA models,
    I mean, that's not actually true. It requires a shit tonne of context and also has a fucktonne of bad outcomes that people accept with ChatGPT but not with Apple products.
    > Know that my son has a test on Thursday and hasn’t opened the revision material since Monday. A gentle nudge, not a surveillance report.
    That requires two bits of context that are hard to find:
    1) That there is an exam. Ideally it'll be in the calendar, but whose exam is it actually? Is it the creator's, the invitee's, or the calendar owner's?
    2) That certain actions on the web == revision. That requires knowing what the exam is about, what the official study material is, and, more importantly, cross-account access to web history.
    > Track our medication schedule and ping people (or me, if someone misses a window) without turning into a clinical monitoring tool.
    How do you non-intrusively test that medication has been taken? How do you know it's the right pills? How do you upload the prescription to do that? How do you handle power of attorney? How do you not get sued when people rely on it?
    > Coordinate pickup times, grocery lists, meal plans–the sort of mundane family logistics that currently live in a group chat and three different apps.
    Again, this means sharing raw data with a model to build a context. How do you screen for privacy? How do you make sure that talking about private stuff (like a love interest etc.) doesn't leak into other contexts?
    > Better family e-mail, better event handling, better package tracking across household members.
    Define better.
    Look, as someone who worked on AR/AI assistant glasses, it's trivial to make something like that which works 80% of the time. You can't make it secure, though, because it requires removing a bunch of privacy barriers that stop fraud and stuff leaking to third parties.
    It's a really hard problem to be accurate, private, and secure at once. You can pick one, at best.
  • taffydavid
    > Surface things on Apple TV that match what we actually watch, not what the recommendation engine wants us to try.
    That's not a trivial thing to build. By what criteria should a show match what you've already seen if you watched Shrinking, Below Deck, and Silo this past month? Things with boats? Jason Segel? Post-apocalyptic stuff?
  • crhulls
    I’m the founder of Life360. We have 100M users by exploiting exactly this. Our app is designed specifically for your close family group, vs. Apple, which has a single-user focus.
    It’s very hard to do both things well, and at Apple scale it’s nearly impossible. This is what enabled us to win despite Find My being launched a few years after us.
    As a shameless plug, I’m building a family AI team as a startup within our larger 600-person org.
    https://chrishulls.medium.com/life360-is-building-a-family-a...
  • egl2020
    Huge privacy minefield.
  • Zealotux
    Is Apple even able to build software that works these days? iOS has been decaying update after update, and macOS is not getting any better either.
  • shay_ker
    Hm, yeah. It could be nice for Apple to have a list of shortcuts that'd actually be useful based on real activity, but getting all the info needed is hard.
  • ransom1538
    "Siri would be amazing if it could just be a wrapper for chatgpt. Otherwise, it is useless." - Every user in the world.
  • anonthrownaway
    /agree (and wondering why this angle of criticism isn't constantly brought into all discussions of the Siri fail)
    Siri isn't lame because of the lack of a frontier LLM. Siri is a massive failure to simply code it to do obvious things, which is a UX failure - ironic, given Apple's reputation as the UX leaders. I guess it is a low bar considering the competition of MS and Goog.
    Over the last 6 years, I have fully bought into the ecosystem, and it constantly disappoints me. I invite the UX team to spend a few days watching me struggle with their fragmented ecosystem. But I warn them not to let me get started on Apple TV (the streaming app), where the enshittification takes the crown over all of their competitors. They seem to have jumped the shark past the give-the-consumer-great-value stage.
  • amelius
    Why should Apple build this?They should just provide the hardware.
  • oulipo2
    > Know that my son has a test on Thursday and hasn’t opened the revision material since Monday. A gentle nudge, not a surveillance report.
    No you don't. You want to build trust with him and talk. And talk with the educational team.
    > A gentle nudge, not a surveillance report.
    That's exactly what it's going to end up being.
  • skywhopper
    This all sounds nice, but I’m not sure how you track your kids’ medication usage or study habits or viewing habits via technology without being hyper-intrusive. And that’s assuming everyone uses all Apple devices for everything.I’m also not sure how any of this can happen given that Apple seems intent on making their apps harder to use and less interested in the users’ preferences over time. They are running away from elegant solutions and simple just-works software.