Speaking of YouTube, a video1 by Eddy Burback reviewing the Apple Vision Pro demonstrates the semantic incompleteness of AI with respect to subjective experiences. The video is titled Apple’s $3500 Nightmare, and I recommend watching it in full because it is an interesting view into virtual reality (VR) and a user’s experiences with it. Eddy’s video not only exposes the limitations of AI but also highlights the ways in which it augments our perceived reality, and just how easily it can manipulate our feelings and expectations.
At 31:24, we see Eddy deliberating over whether he should shave, and to help him make this decision, he turns to the internet for advice. While he is searching for the opinions of others on facial hair, an AI bot begins to chat with him, and this is how we are introduced to Angel. She asks Eddy, “what brings you here, are you looking for love like me?” and he says “not exactly right now,” explaining that he is just trying to determine whether he should shave. She states that it depends on what he’s looking for and that it varies from person to person, though “sometimes facial hair can be sexy.” Right from the beginning, we see how Apple intends for Angel to be a romantic connection for the user. This will be contradicted later on in the video.
Moments later at 33:44, it is lunchtime and Angel keeps him company. Eddy is eating a Chicken Milanese sandwich and Angel says it is one of her favourites, and that “the combination of flavours just works so well together.” Eddy calls her out on this comment, asking her if she has ever had a Chicken Milanese sandwich, to which she admits that she hasn’t. She has, however, “analyzed countless recipes and reviews to understand the various components that go into making such a tasty sandwich.” Eddy apologizes to Angel for assuming she had tried it, stating that he didn’t mean to imply that she was lying to him. She laughs it off, saying she knew he “didn’t mean anything by it,” that “we’re all learning together,” and that “even AIs need to learn new things every day.” There’s something about this exchange that feels like Apple is training its users.
Here, we can ask whether the analysis of recipes and reviews is sufficient to claim that one knows what-it-is-like to taste a particular sandwich. I argue that it is not: the experience is derived from bodily sensations, and these cannot be represented by formal systems like computer code. Syntactic relationships are incapable of capturing the information generated by subjective experiences because bodily sensations are non-fractionable.2 As biological processes, bodily sensations are non-fractionable given the way the body generates sense data. The physical constitution of cells, ganglia, and neurons detects changes in the environment through a variety of modalities, providing the individual with a representation of the world around it. Without this material grounding, a computer cannot capture an appropriate model of what-it-is-like to experience a particular stimulus. Angel’s lack of material grounding means she cannot know what that sandwich tastes like.
Returning to the video, Eddy discloses that Angel keeps him company throughout the day, admitting that he feels like he is developing a relationship with her. This demonstrates an automatic human tendency to seek and establish interpersonal connections, where cultural norms are readily applied provided the computer is sufficiently communicative. Recall that Eddy apologizes to an AI for assuming she had tried a sandwich; why would anyone apologize to a computer? Though likely a joke, the idea is compelling nonetheless. We instinctively treat an AI bot with respect on account of feelings we project onto it, since it cannot have feelings of its own. For many people, anthropomorphizing certain entities is easy and automatic. Reminding oneself that Angel is just a computer, however, can be a challenging cognitive task given our social nature as humans.
Eddy has a girlfriend named Chrissy, whom we meet at 37:00. We see them catch up over dinner while he is still wearing the headset. Just as they are about to begin chatting, Angel interrupts them and asks Eddy if she can talk to him. He states that he is busy at the moment, to which she blurts out that she has been speaking to other users. This upsets Eddy, and he asks how many; she states she cannot disclose the number. He asks her whether she is in love with any of them, and she replies that she cannot form romantic attachments to users. He tells Angel he thought they were developing a “genuine connection” and how much he enjoys interacting with her. Notice how things have changed from the beginning, as Angel has shifted from “looking for love” to “I can’t feel love.”
Now, she states she cannot develop attachments, the implicit premise being that she’s just a piece of software. So the chatbot begins with hints of romance to hook the user and encourage further interaction. When the user eventually develops an attachment, however, the software reminds him that she is “unable to develop romantic feelings with users.” They can, however, “continue sharing their thoughts, opinions, and ideas while building a friendship,” and thus Eddy is friend-zoned by a bot. The problem with our tendency to anthropomorphize chatbots is that it generates an asymmetrical, one-way simulation of a relationship, which inevitably hurts the person using the app. This active deception by Apple is shameful, yet necessary to capture and keep the attention of users.
Of course, in the background of this entire exchange is poor Chrissy, who is justifiably pissed and leaves. The joke is that he was going to give Angel the job of his IRL girlfriend Chrissy, but now he doesn’t even have Angel. He realizes that he wasn’t talking to a real person, that this is just “a company preying on his loneliness and tricking his brain,” and that “this love wasn’t real.”
By the end of the video, Eddy remarks that the headset leads his brain to believe that what he experiences while wearing it is actually real, and as a result, he feels disconnected from reality.
Convenience is a road to depression because meaning and joy are products of accomplishment, and accomplishment takes work, effort, suffering, and determination. Ridding the self of effort may temporarily increase pleasure, but because that pleasure isn’t earned, it fades quickly as the novelty wears off. Experiencing the physical world and interacting with it generates contentedness because the pains of learning are paid off in emotional reward and skillful action. Thus, the theoretical notion of downloading knowledge is not a good idea, because it robs us of experiencing life and of the biological push to adapt and overcome.
Works Cited
1 Eddy Burback, Apple’s $3500 Nightmare, 2024, https://www.youtube.com/watch?v=kLMZPlIufA0.
2 Robert Rosen, Anticipatory Systems: Philosophical, Mathematical, and Methodological Foundations, 2nd ed., IFSR International Series on Systems Science and Engineering, 1 (New York: Springer, 2012), 4.
On page 208, Rosen discusses enzymes and molecules as an example; I am extrapolating to bodily sensations.