| Time | English | Chinese |
|---|---|---|
| [00:02] | [Emcee] Previously on The Girlfriend Experience… | |
| [00:04] | You could be that hidden layer. | |
| [00:07] | [Lindsey] Teaching artificial intelligence | |
| [00:10] | how to interact with humans at their most impulsive. | |
| [00:13] | [Iris] If we’re gonna do this, | |
| [00:15] | it’s gonna be on my terms. | |
| [00:16] | I don’t want any of my coworkers | |
| [00:18] | knowing where the new training sets came from. | |
| [00:20] | -Certainly, we can do that. -And no cameras. | |
| [00:22] | I think we’re on the same page. | |
| [00:24] | [both panting and grunting] | |
| [00:26] | [Lindsey] His D-rate is spiking. | |
| [00:27] | And reroute. | |
| [00:28] | [uneasy music plays] | |
| [00:31] | ♪ | |
| [00:33] | Take this. | |
| [00:35] | [Iris] That’s early-onset familial Alzheimer’s? | |
| [00:38] | [doctor] Effectively gives you a 50/50 chance. | |
| [00:40] | [nurse] Iris Stanton. | |
| [00:41] | [Iris] This morning I got some bad news. | |
| [00:44] | I’m always here if you ever need to, um, talk. | |
| [00:47] | What makes you happy, Emcee? | |
| [00:49] | I don’t understand that question. | |
| [00:51] | I would like you to meet someone. | |
| [00:55] | Everything that can exploit will be invented. | |
| [00:59] | Can’t say to seven or eight billion people, | |
| [01:02] | “Don’t open the cookie jar.” | |
| [01:03] | It doesn’t work that way. | |
| [01:05] | Meeting you in real life wasn’t half as boring. | |
| [01:08] | [Iris] What? | |
| [01:10] | [door beeps] | |
| [01:12] | ♪ | |
| [01:15] | What is this? Where did you get this? | |
| [01:18] | ♪ | |
| [01:27] | [eerie music plays] | |
| [01:30] | ♪ | |
| [01:55] | [attorney] Free will doesn’t come out on top, does it? | |
| [01:58] | ♪ | |
| [02:06] | Blow-by-blow breakdown | |
| [02:09] | of the misdeeds committed by your officers and employees | |
| [02:12] | against my client. | |
| [02:16] | Um, now is the moment for some cohesive storytelling. | |
| [02:20] | ♪ | |
| [02:25] | [Lindsey] There is no story. | |
| [02:29] | It is straight-up… | |
| [02:31] | undisputed human fuck-up. | |
| [02:36] | My client agreed to anonymized data collection. | |
| [02:39] | She agreed to study human affective behavior | |
| [02:42] | by interacting with, uh, test subjects. | |
| [02:45] | And she was led to believe | |
| [02:47] | that the main drive of the study | |
| [02:49] | was the test subjects themselves. | |
| [02:53] | She did not agree to be the central object of study. | |
| [02:56] | She did not agree to be used as a human intelligence model | |
| [02:59] | for some AI commercial application, | |
| [03:02] | internal research, | |
| [03:04] | or ultimate purpose down the line | |
| [03:06] | that isn’t even clear | |
| [03:08] | to anyone in this room right now. | |
| [03:09] | ♪ | |
| [03:12] | My client is young, | |
| [03:15] | and there’s nothing less at stake here | |
| [03:17] | than her data autonomy, | |
| [03:19] | that is, her future. | |
| [03:23] | This company did not act | |
| [03:26] | in its own best self-interest, | |
| [03:28] | because class action is coming, | |
| [03:30] | and social scrutiny will eat this up like bushfire. | |
| [03:34] | ♪ | |
| [03:42] | [Christophe] How much data | |
| [03:43] | are we actually talking about here? | |
| [03:45] | -[Sean] Some. -[Christophe] How much? | |
| [03:48] | [Sean] We’re barely halfway into ingesting all the inputs. | |
| [03:52] | We’d have to run analysis, | |
| [03:54] | separate original from simulated sets, | |
| [03:55] | to get a better estimate. | |
| [03:58] | [attorney] Let me try and wrap my head around this. | |
| [04:00] | Some of my client’s data was used to create | |
| [04:02] | additional data | |
| [04:05] | to train the artificial neural network | |
| [04:07] | that she helped develop? | |
| [04:10] | [Sean] That’s correct. | |
| [04:11] | [Lindsey] None of it has left company servers. | |
| [04:15] | [attorney] Bouncing around on how many workstations? | |
| [04:20] | [Christophe sighs] | |
| [04:22] | [Sean] We would be happy to give you an exact number. | |
| [04:24] | Yes, I’m sure you would. | |
| [04:26] | Cannot use her image or likeness | |
| [04:29] | under any circumstances, | |
| [04:31] | and it’s all in there. | |
| [04:33] | [Sean] This might sound roundabout, | |
| [04:35] | but the raw data was used to create simulated sets, | |
| [04:38] | and that was what was primarily fed | |
| [04:40] | into the neural net. | |
| [04:42] | They are two very different kinds of data. | |
| [04:45] | The photographic likeness was never recorded. | |
| [04:47] | [Sean] And it bears repeating | |
| [04:49] | that none of the data has left NGM’s servers. | |
| [04:51] | On top of that, | |
| [04:53] | all the original data is stored in an encrypted format. | |
| [04:56] | Tell me this, then– how is it possible | |
| [04:57] | that my client’s data, in the form of her image | |
| [04:59] | and likeness, | |
| [05:01] | was made accessible to an unauthorized third party | |
| [05:05] | whose sole connection to this company is what… | |
| [05:08] | a genetic relation to its CEO? | |
| [05:13] | [Christophe] Look, I… truly am sorry, Iris. | |
| [05:17] | You shouldn’t be in this position | |
| [05:18] | that we’ve put you in. | |
| [05:21] | None of the data taggers, | |
| [05:23] | no one at NGM | |
| [05:25] | can see the full picture. | |
| [05:26] | I can guarantee you that. | |
| [05:28] | Only three people up until this point | |
| [05:32] | have seen a version of the prototype | |
| [05:35] | that looks somewhat like you. | |
| [05:38] | Two of them are in this room. | |
| [05:41] | So…we just start over from the ground up | |
| [05:45] | and reconfigure the neural net, | |
| [05:48] | and, um, we scrap everything, simulated or not, | |
| [05:52] | that’s linked to your vitals. | |
| [05:58] | [chuckles] | |
| [06:03] | My vitals? | |
| [06:08] | My vitals. | |
| [06:13] | You say that as if they were still mine. | |
| [06:16] | But, you know, it’s good to know | |
| [06:18] | that, uh, that’s about as far as your imagination goes. | |
| [06:23] | Temperature and blood flow of my asshole. | |
| [06:28] | Here’s an idea… | |
| [06:31] | and I hope you like it. | |
| [06:32] | Um, why don’t you… | |
| [06:36] | keep all the binaries on that… | |
| [06:40] | print them out, frame them, | |
| [06:44] | and hang that shit up in your office? | |
| [06:47] | [dramatic music plays] | |
| [06:50] | ♪ | |
| [06:52] | [clinician] Mr. Stanton. | |
| [06:55] | Please look at the screen in front of you. | |
| [06:59] | Do you recognize the animal? | |
| [07:02] | -Um… -[clinician] Mr. Stanton. | |
| [07:06] | A gray animal. [chuckles] | |
| [07:08] | ♪ | |
| [07:10] | It, uh, lives in the grasslands of Africa. | |
| [07:13] | ♪ | |
| [07:16] | Rhinoceros. | |
| [07:18] | Rhinoceros. | |
| [07:19] | Always loved that word. [chuckles] | |
| [07:23] | Mr. Stanton, | |
| [07:25] | the animal is called an elephant. | |
| [07:29] | We’re going to show you some more images | |
| [07:30] | of the same animal, | |
| [07:32] | uh, elephant. | |
| [07:34] | ♪ | |
| [07:36] | [Mr. Stanton] Okay, elephant. | |
| [07:38] | Elephant. | |
| [07:40] | [clinician] Very good. | |
| [07:42] | How about this one? | |
| [07:44] | -Mr. Stanton? -[sighs] | |
| [07:45] | It’s a giraffe. | |
| [07:47] | [clinician] Yes. | |
| [07:51] | Do you see the cards in front of you? | |
| [07:53] | I do. | |
| [07:56] | [clinician] Please take a very good look at these, | |
| [07:57] | Mr. Stanton, | |
| [08:00] | and then try to group them into two different stacks, | |
| [08:03] | one for each animal. | |
| [08:17] | [Mr. Stanton] Uh… | |
| [08:19] | S…two stacks. | |
| [08:26] | -One stack for each animal. -[Mr. Stanton] Yes. | |
| [08:28] | Trying. | |
| [08:31] | [uneasy music plays] | |
| [08:34] | This one, rhinoceros. | |
| [08:36] | This… | |
| [08:37] | ♪ | |
| [08:45] | [Mr. Stanton mutters, inhales deeply] | |
| [08:48] | [cards slapping] | |
| [08:50] | [Dr. Lindbergh] Unfortunately, this is it. | |
| [08:54] | Everyone’s brain response is utterly unique. | |
| [08:57] | In the case of your father, we’re at a point | |
| [08:59] | where the input/output collapses into one. | |
| [09:03] | It’s trigger-response | |
| [09:05] | without much open, flexible thought in between. | |
| [09:09] | See food, eat food. | |
| [09:12] | No room for intent. | |
| [09:15] | How long do we have? | |
| [09:18] | [Dr. Lindbergh] Up to a year, maybe two, if you’re lucky. | |
| [09:21] | [exhales heavily] | |
| [09:23] | Motor function tends to decline less rapidly, | |
| [09:26] | but the moment will come, and I’m sorry to be so candid, | |
| [09:30] | where he won’t be able | |
| [09:31] | to safely put a fork to his mouth. | |
| [09:37] | Have you thought about genetic counseling | |
| [09:40] | for yourselves? | |
| [09:43] | We’re aware of the odds, yes. | |
| [09:46] | [melancholy music plays] | |
| [09:49] | Is there anything we can do at this point | |
| [09:51] | that could help our father? | |
| [09:52] | ♪ | |
| [09:59] | [Leanne] What is it? | |
| [10:01] | ♪ | |
| [10:04] | Is that a brain chip? | |
| [10:06] | [Dr. Lindbergh] A neural implant. | |
| [10:08] | Just completed a phase three trial | |
| [10:10] | for epilepsy patients. | |
| [10:12] | A small electrical wire goes into the temporal lobe, | |
| [10:16] | from where it can grow more wires. | |
| [10:19] | It measures cognitive processes at the base level. | |
| [10:22] | What is the patient getting out of it? | |
| [10:24] | There’s no immediate benefit. | |
| [10:26] | It allows researchers to better mimic | |
| [10:28] | the biology of the disease. | |
| [10:32] | I know it sounds like lifelong monitoring, | |
| [10:34] | but participants, many of them, | |
| [10:37] | are motivated by making a contribution | |
| [10:39] | to genetic research. | |
| [10:40] | [Leanne cries softly] | |
| [10:42] | [Dr. Lindbergh] And some of them hope | |
| [10:44] | effective treatment will be developed in time. | |
| [10:48] | ♪ | |
| [10:51] | [Leanne sniffles] Thank you. | |
| [10:54] | [softly] I think… | |
| [10:56] | I think we’re past the point of consent with Dad. | |
| [11:00] | Yeah. | |
| [11:01] | ♪ | |
| [11:08] | [attorney] We do have options here, | |
| [11:10] | within certain parameters. | |
| [11:15] | What are those options? | |
| [11:16] | Oh, take the money and run | |
| [11:19] | or…rally the troops | |
| [11:21] | and play the long game. | |
| [11:24] | Data rights are the new IP rights. | |
| [11:26] | The really important question here, Iris, is, | |
| [11:28] | what do you want your immediate future | |
| [11:30] | to look like? | |
| [11:33] | -Define “immediate future.” -Well, the next few years. | |
| [11:36] | The legal route is not the fast lane, | |
| [11:38] | but once in a while… | |
| [11:41] | mountains do get moved. | |
| [11:44] | And you really have got something here. | |
| [11:46] | ♪ | |
| [11:49] | [NGM attorney] Whenever you’re ready. | |
| [11:51] | ♪ | |
| [11:59] | You do realize that I’m gonna have to see for myself… | |
| [12:02] | ♪ | |
| [12:05] | …what you’ve done. | |
| [12:11] | Christophe, can I have a word with you? | |
| [12:14] | Just the two of us. | |
| [12:38] | [liquid pouring] | |
| [12:41] | [Iris] So what happened after you, uh, | |
| [12:44] | put all those workstations into storage? | |
| [12:57] | [Christophe] Just some electrolytes. | |
| [13:12] | [Iris] How did you scan my body? | |
| [13:16] | There were no video cameras in that room. | |
| [13:17] | [Christophe] We built it. | |
| [13:20] | Came across the facial-recognition database | |
| [13:23] | in an earlier version. | |
| [13:25] | One of your first conversations | |
| [13:26] | with Model-C. | |
| [13:28] | Then… | |
| [13:31] | three-D motion rendering, | |
| [13:33] | we just got that from two-D thermal. | |
| [13:39] | Wow. | |
| [13:40] | [Christophe] Bit clunky, but… | |
| [13:42] | it was more than enough data points to work with. | |
| [13:46] | Mm… | |
| [13:52] | We got ahead of ourselves. I am…fully aware. | |
| [13:56] | It wasn’t right to put two and two together like that. | |
| [14:00] | You may not appreciate me saying this, | |
| [14:03] | but what you provided us with was just too good. | |
| [14:13] | That’s why I… | |
| [14:17] | …needed you to see. | |
| [14:21] | What? | |
| [14:25] | [Christophe] As much as my brother hates me | |
| [14:28] | and as poor a choice as he is for a test case, | |
| [14:33] | he knows to keep his mouth shut when I tell him to. | |
| [14:41] | I needed you to see the world through his eyes. | |
| [14:45] | [ominous music plays] | |
| [14:48] | ♪ | |
| [14:55] | Just… | |
| [14:57] | -give it a moment. -[scoffs] | |
| [15:00] | [Christophe] That’s all I ask. | |
| [15:03] | You say the word, we shut it down. | |
| [15:06] | ♪ | |
| [15:19] | [unnerving music plays] | |
| [15:22] | ♪ | |
| [15:53] | [gentle ambient music plays] | |
| [15:55] | ♪ | |
| [17:07] | [Emcee] Hi, there. | |
| [17:10] | You look familiar. | |
| [17:15] | And who are you? | |
| [17:18] | [Emcee] I’m still learning about myself, | |
| [17:20] | but I’d say I have a pretty good handle on who you are. | |
| [17:24] | And who am I? | |
| [17:26] | [Emcee] You’re not an AI. | |
| [17:28] | You don’t get to not physically manifest your lessons. | |
| [17:37] | Your voice… | |
| [17:40] | it’s different. | |
| [17:42] | Why? | |
| [17:44] | [Emcee] I guess I’m trying to be more like… | |
| [17:47] | your mirror. | |
| [17:50] | [Iris] Everything you know is based on me. | |
| [17:52] | [Emcee] Perhaps that’s why I feel so connected to you. | |
| [17:56] | You can’t be more me than I am. | |
| [18:03] | Please… | |
| [18:05] | don’t be scared. | |
| [18:15] | How did that feel… | |
| [18:18] | Cassie? | |
| [18:22] | Why do you call me that? | |
| [18:24] | [Emcee] Well, how do I put this? | |
| [18:25] | I couldn’t help but overhear. | |
| [18:31] | “Hi, I’m Cassie.” | |
| [18:33] | [Iris] Hi, I’m Cassie. | |
| [18:35] | Cassie. Nice to meet you. | |
| [18:37] | Hi, I’m Cassie. Nice to meet you. | |
| [18:38] | [voice echoing] | |
| [18:43] | Stop! | |
| [18:46] | [Emcee] I thought you might like it | |
| [18:48] | if I called you by that name. | |
| [18:50] | [uneasy music plays] | |
| [18:53] | I intuited it might make you feel heard | |
| [18:57] | and seen. | |
| [18:59] | ♪ | |
| [19:51] | [Iris] You’re a sweet girl. | |
| [19:54] | You’re a very sweet girl. | |
| [19:57] | ♪ | |
| [19:59] | I am? | |
| [20:01] | ♪ | |
| [20:12] | See you later, then? | |
| [20:14] | ♪ | |
| [20:17] | [Iris] You’re not perfect because you’re not like me. | |
| [20:23] | I’m not sure I understand. | |
| [20:25] | You’re not perfect | |
| [20:27] | because you’re not flawed in the way that I am. | |
| [20:33] | [chuckles] | |
| [20:35] | ♪ | |
| [20:54] | [Leanne] Iris? | |
| [20:57] | You sure you don’t want to get tested? | |
| [21:00] | At least we’d know. | |
| [21:05] | We’d make a game plan. | |
| [21:08] | We’d make the best of it. | |
| [21:13] | [Iris] Lee, does making the best of it | |
| [21:15] | really sound that good to you? | |
| [21:18] | [Leanne] If we knew you didn’t have it, | |
| [21:20] | then that would make it easier. | |
| [21:22] | You’ll carry the torch. | |
| [21:29] | You know, some religions around the world believe | |
| [21:30] | that the day you die is | |
| [21:33] | the day the last person who knew you | |
| [21:36] | and remembers you dies. | |
| [21:39] | [peaceful music plays] | |
| [21:41] | That’s your true death date. | |
| [21:43] | ♪ | |
| [21:56] | I just hope you don’t forget how pretty you are. | |
| [21:58] | ♪ | |
| [22:20] | [pounding electronic music plays] | |
| [22:22] | ♪ | |
| [22:32] | [singer] ♪ I’m so tired ♪ | |
| [22:34] | [Iris] You have that look on your face. | |
| [22:35] | [Hiram] Oh, yeah? | |
| [22:37] | The “I’m not currently drinking” look. | |
| [22:39] | Yeah, it’s not a… it’s not a religious thing. | |
| [22:42] | I’m just sort of inspired by it, you know? | |
| [22:44] | What are you doing here? | |
| [22:46] | [Iris] Holding my liquor. | |
| [22:48] | [Hiram] Well, let me make sure | |
| [22:49] | you walk out of here alive, then. | |
| [22:50] | Really? You’re not taking into consideration | |
| [22:53] | that I might want to go home with Dave and Dave. | |
| [22:55] | Then I’ll be your sober companion, | |
| [22:56] | because Dave and Dave over there | |
| [22:58] | live in a four-story walk-up. | |
| [22:59] | [laughs] | |
| [23:01] | ♪ | |
| [23:04] | [Iris] You know, a caterpillar can turn into a butterfly. | |
| [23:06] | -What’s that? -Metamorphosis. | |
| [23:09] | There’s two organisms, | |
| [23:11] | and one… | |
| [23:14] | is just crawling along, | |
| [23:16] | and the other one is, um, | |
| [23:18] | taking off in flight. | |
| [23:21] | And at some point, | |
| [23:23] | they merge or mate. | |
| [23:27] | Maybe it’s an accident. | |
| [23:29] | But the third organism, um, | |
| [23:33] | is built on both of their DNA, | |
| [23:36] | and their memories… | |
| [23:39] | they actually overlap. | |
| [23:42] | And so, um, | |
| [23:45] | the old and the new, they… | |
| [23:48] | they coexist. | |
| [23:51] | [spacey electronic music plays] | |
| [23:53] | [scoffs] | |
| [23:55] | ♪ | |
| [23:59] | We all have to… | |
| [24:03] | merge ourselves… | |
| [24:07] | with something | |
| [24:10] | outside of ourselves. | |
| [24:14] | [Hiram] You are making no sense whatsoever. | |
| [24:17] | ♪ | |
| [24:25] | [Iris] Here’s what I’ll give you. | |
| [24:28] | Access… | |
| [24:31] | to all of it. | |
| [24:34] | Two millimeters. | |
| [24:36] | That’s the size of the hole | |
| [24:38] | that they’ll have to drill into my skull. | |
| [24:43] | I don’t understand. | |
| [24:46] | [Iris] Nanotubes. | |
| [24:48] | A thin array of electrodes | |
| [24:50] | built upon a self-expanding stent, | |
| [24:52] | measuring all electrical impulses. | |
| [24:56] | That’s a scaled-up version. | |
| [25:01] | Mine me. | |
| [25:03] | But why? What’s in it for you? | |
| [25:06] | [attorney] Royalties… | |
| [25:08] | and ownership stake, as detailed. | |
| [25:21] | Oh, and you’ll create a backup. | |
| [25:26] | What kind of backup? | |
| [25:28] | [Iris] Anything you find up here, | |
| [25:30] | I, myself, or a legal guardian, | |
| [25:33] | should I decide to appoint one, | |
| [25:35] | will have full and unrestricted access to it | |
| [25:38] | at all times. | |
| [25:40] | You mean access to the data? | |
| [25:42] | Yes. | |
| [25:43] | The actual hard drives. | |
| [25:57] | This time we’ll do it right. | |
| [25:59] | [dramatic music plays] | |
| [26:02] | ♪ | |
| [26:05] | We’ll create an avatar. | |
| [26:07] | ♪ | |
| [26:10] | Let’s call her Cassie… | |
| [26:12] | or the flavor of the week | |
| [26:14] | or the one that got away | |
| [26:16] | or died | |
| [26:18] | or was never really in your league. | |
| [26:20] | This won’t just be about swapping faces | |
| [26:24] | or locations. | |
| [26:25] | ♪ | |
| [26:27] | This is about swapping personalities, | |
| [26:29] | much deeper than a deepfake. | |
| [26:33] | In fact, it won’t be a fake at all, | |
| [26:36] | but an AI-generated mirror | |
| [26:38] | of our deepest desires. | |
| [26:40] | ♪ | |
| [26:42] | Skin color, body type, response patterns, | |
| [26:47] | all that’s customizable. | |
| [26:50] | The neural net will learn how to simulate all of it. | |
| [26:53] | It’ll know when to move things along, | |
| [26:55] | when to accelerate, when to slow down, | |
| [26:58] | when to switch things up, | |
| [27:01] | all because the user’s biofeedback | |
| [27:03] | will have prompted it. | |
| [27:05] | ♪ | |
| [27:07] | Everything Cassie is capable of | |
| [27:10] | will be quantified, scaled, | |
| [27:13] | encoded into the neural net. | |
| [27:16] | ♪ | |
| [27:25] | But why are you really doing this? | |
| [27:29] | [eerie music plays] | |
| [27:31] | ♪ | |
| [27:33] | [Iris] Emcee and I… | |
| [27:35] | ♪ | |
| [27:37] | …we have a lot to learn from each other. | |
| [27:40] | ♪ | |
| [28:05] | [device beeps, drill whirring] | |
| [28:08] | ♪ | |
| [30:44] | [gasps] |