#E54 We Need to Teach Our Kids About AI Literacy With Angela Radcliffe

About Angela Radcliffe

Angela Radcliffe is a trailblazer in clinical research and a leading advocate for data ethics. With over two decades of experience at the intersection of health, technology, and data, she has driven transformative change and fostered organizational agility. Her work has impacted more than 100 global clinical research initiatives across nearly every therapeutic area. A bestselling author, Angela wrote Quantum Kids: Guardians of AI, an innovative activity book that introduces elementary and middle school students to AI fundamentals. Her expertise not only advances the field but also inspires the next generation, setting new standards for ethical data use and pioneering advancements that make a difference worldwide.

Read the HYPERSCALE transcript

Briar: Welcome to Hyperscale, Angela. So nice to have you here in the studio in New York City.

Angela: Thank you so much for having me and what a lovely day it is to be in New York City. 

Briar: What do you want people to get from our podcast today? What do you think is the main takeaway?

Angela: I think the main takeaway is we have been existing in a time where we have had to accept a set of unintended consequences from technology. And now we are at a tipping point where we can ensure that we don't create unintended consequences in the age of AI, but it starts with educating our children.

Briar: Wow. Yeah. I sometimes think back to my school years and think, God, how much of that did I really take away and actually learn from? Are they doing enough in today's education system to prepare children?

Angela: They're not doing anything in the education system just now to prepare children. Unfortunately, I think education is going to need a wholesale change. And of course, I'm not alone in thinking that. You hear professors at major institutions like Penn come on the radio regularly now and say, hey, I don't care if my kids use ChatGPT. I'm only going to catch the bad cheaters; I'm not going to catch the good cheaters. And the fact of the matter is, I have to teach in a whole new way. But I think schools, even like my daughter's school, are just starting to ask themselves: should we have a curriculum here? Should we let the kids use it? What should we be doing? And I've got news for them: our kids are already using it. So the ship's out of the barn.

Angela: And quite frankly, it's a scary thing, because education is such a beloved profession. I mean, these are the people who are truly stewarding the next generation, and probably faster than many other industries, they're going to see their jobs change very, very quickly. The rote learning that we grew up with can be taught by AI in a couple of hours a day, which is great, because it leaves us an amazing opportunity to spend time teaching our children the stuff we've neglected to teach them all these years: health literacy, financial literacy, that their data is important and is a contribution to society. There are so many wonderful things we could be using that time for to support their creativity. So it's a great thing, but it's a little bit of a big change.

Briar: I sometimes think as well, outside the hard skills, are we doing enough to teach our kids things like curiosity, for one?

Angela: Yeah, that's such a great question. I actually think curiosity, or an innovative mindset, or the willingness to fail, some of these things we aren't really teaching kids. Kids are inherently curious, but we work very quickly to take that spirit out of them in the way that we sort of march them through their multiple grades and say, okay, this is what we need to do to equip you for life. And what we're doing is robbing our children, and our world quite frankly, of probably amazing innovations that are just kernels in these children already. But by then we've told them, well, you have to do math a certain way, and you have to know how to write. My daughter just finished this process where they're studying countries, and they're still learning things.

Angela: Like, to do an essay, you have to go and research, and then you have to go from that source and write notes, and then you have to go from those notes and write the essay. And that's how we're going to learn about the country, in her case Bolivia. She's neurodivergent. The concept of having to go from a source in a book or on the internet, take that information, context switch, put it in notes on paper, and then go to those notes and turn them into sentences? Those aren't the things that are going to make her an amazing citizen of the world in the future. So we've got to wake up and realize that some of these things we're spending time giving our kids, things we learned as basics, aren't the basics anymore. We've got a whole new set of basics we've got to teach.

Briar: What do you think those basics are?

Angela: Well, number one to me is something that people always laugh at when I say it: we have to teach our kids FAIR data. People are like, what is FAIR data?

Briar: What is FAIR data?

Angela: Yeah. So in the tech and digital communities, FAIR data stands for findable, accessible, interoperable, and reusable. It's a core data management principle. And people, even in our big corporations, and I worked for some time in big pharma, have a really hard time grasping what that means and why it matters. But for our children, what they need to understand is that every single thing they do has a data exhaust. And that data itself has to be findable. To find it, you have to kind of know what kind of data it is.

Angela: So we hear about metadata, and metadata is really just how you describe the data. And I'm trying to help kids understand that interoperability, for example, is like doing a cipher. Back in our day, it was like the secret decoder ring from that movie, A Christmas Story, the Little Orphan Annie ring, I think it was. But kids really need to understand that it's just a pattern. It's just like figuring out a puzzle.

Angela: So really, the key is that our children have to understand that. I think this generation is starting to understand, but previous generations didn't. We didn't grow up in the era of social media; my older children did. Not everything we did was documented. Well, guess what? It's not just everything we do online. We click the "I accept" button on apps hundreds, if not thousands, of times over the course of a year. We don't know what we're agreeing to, but I'll tell you what we're agreeing to: we're agreeing that that data exhaust is going somewhere to do something for someone. And we're shifting into an era where what we used to think of as our income streams, our creativity and our productivity, are going to change. And now we have to start to think about our data and that data exhaust as our new income stream, because if AI is going to start to replace a lot of what we do, then we really need a new way to interact with that as creators.

Angela: Oh, FAIR data and why we need to teach things like that? Yes. So there is a whole new set of skills that our children need to learn. And FAIR data seems silly, but children need to understand the data that they produce, whether it's the GPS data when they're walking around with their cell phone, or the stuff that they're posting online, or the things that they're purchasing, all of it. We don't hold a general awareness as adults about that. What we don't understand is that that data is often used over and over and over again, and that the veracity of that data makes or breaks whether we are putting good into the world or bad into the world. So there's this huge opportunity for children to understand that what they put in is what they will yield, in every aspect of their world. We didn't have anything like that as a promise for humanity when we were children. But now the promise for our children is: if you are good stewards of the data that you put out into the world, amazing things can come from the combination of that data, thanks to the power of artificial intelligence and quantum computing.

Briar: This is a fascinating topic, by the way. What are these good things, say, if they're putting out their data? What does good look like, and what does bad look like? And how can people actually use our data against us? How are they using it against us now?

Angela: Yeah. Every time I'm on a radio interview for the book on one of the local stations, probably because of the demographics that listen to the broadcasts I'm generally interviewed on, I get the question: what are we going to do about deepfakes in the election? And I say, hey, I wrote a book on AI literacy for kids. I can't tell you what we're going to do about deepfakes in the election. But I can tell you that movie magic has been around for a long, long time. It's just that now it's democratized to everyone. Anybody can make a deepfake now. Anybody can be Briar, can be Ange, and so they can do true reputational harm to us by impersonating us. However, that is a very small sliver of the use cases that are out there. And so I understand the feeling people have about being almost alarmist about what AI is going to do and how things are going to get out of control.

Briar: Thank you for being understanding. See, if I was a robot, I wouldn't have this problem.

Angela: It's true. I mean, there are ventilators, but I wouldn't want to be on one right now. So, we were talking about...

Briar: You were talking about children. The good things.

Angela: Oh, the good things. Okay. So the bad is just a sliver of what is out there. I'll give you an example from the industry that I've spent the most time in, which is healthcare. Previously we had scientists sitting at what we call the bench, doing experiments, looking for drug candidates that could interact with our biology. And biology is something we still really don't understand; it's one of the last real frontiers in science. We think we understand it, and we do understand proteins, for example, but we don't understand biology. It's very complicated. So we had these scientists sitting at the bench, looking, and they've got these hypotheses: oh, this one molecule is going to do this. Maybe it's going to help stop this cough, or it's going to fix this virus.

Angela: But it takes them a long time to go through all of that research until eventually it gets into a clinical research trial in humans. And I mean, we're talking many, many years, like 20-plus years, billions of dollars, hundreds if not thousands of people involved in bringing just one simple drug to market. And while we've had technology that's advanced that, like machines that do what we call high-throughput screening, so we can look at lots of things very quickly, what we've come to now is that through the use of data and machine learning, we can sift through hundreds of thousands of different molecule candidates, our drug targets, at the same time. And we can find those needles in a haystack in a fraction of the time that we could previously. So there's the idea that data can bring about cures for things that have taken our loved ones from us.

Angela: I mean, if somebody had said to me back when I was growing up, if you'd like to help cure cancer, you need to grow up and spend eight years becoming a doctor on top of all of your regular schooling. And then you also need to go into residency. And by the way, oncology is a very complicated thing; it's a long pathway, and you have to have a deep passion for it. Now we have the opportunity to create citizen scientists in all of our children, because they don't have to participate directly as scientists. They can simply be part of that study of all of our biology by contributing their data. But in order for us to ask them to give something that precious away... Like, if I say to you, you're brave.

Angela: You're willing to try out the new things, you know, spend 48 hours in a metaverse.

Briar: Put a chip in my hand.

Angela: Right. Do all the things that test what the future really looks like. But one of the things that we're asking of our children is to essentially share their most precious asset with the world for no direct benefit to themselves. And so there's a fear, I think, for many people that if we say, you know what, you're humans and you all own your data now and you're responsible for it, so you get to decide what part of your data is used and how, we will suddenly not get data, because people will say, I'm not sharing my data. But humans are inherently good and inherently altruistic. And we want our children to understand that and know that they can solve not just cancer; they could attack racial inequality, they could attack climate change, the myriad of social issues, when we have the power of a collective body of data structured in the right way and we get to ask the right questions.

Angela: And now that it's powered by the type of computing we can do, all of these interesting vectors can connect and unexpected combinations can be found. That's amazing. But our children are also looking at a future where doctors are going to be different and teachers are going to be different. So how do we empower them to understand how to contribute as global citizens to the world and how to solve these social ills, but also envision a future for themselves, when we can barely envision a future? I'm sure five years ago none of us expected to be where we are with things like large language models. And here we are. So it's a really interesting time for kids.

Briar: I agree. And I think this is exactly one of the reasons why I did my Roblox futuristic shopping mall, because we can't deny that most of the 70 million daily users that Roblox sees are Generation Alpha and Generation Z. And part of me is really wondering as well, because these kids have grown up watching YouTube and using their iPads, I'm pretty sure over 50% of kids under the age of six have an iPad, just the data that they must have on our children is significant. What about when it comes to TikTok? Obviously TikTok has been in the news a lot lately, and part of the reason the US wants to ban TikTok is that they are fearful of the data that the Chinese might have on our kids. Do you think this is a real concern?

Angela: I think data is a real concern, but I don't think it's okay to use it as the excuse for stopping people from being creators. And so it's a very tricky, fine line for me. Part of it is because I think we have not been willing as a society to address root cause issues. We are so good at bandaids, so, so good at bandaids. We love to say, oh, that's not working over there, we're going to put a little bandaid on that. But we don't ever get to the root cause issue. And the root cause issue of the whole "TikTok, China has our data" thing? Of course there's a real and present concern that people are going to use data for mal-intended purposes. But instead of being concerned about one platform, which by the way will be one of many over the years to come, we need to get data ownership right.

Angela: And to me, if you think about the Universal Declaration of Human Rights: we have all of these human rights, but they are rights that did a good job of protecting us from things in the past. We need human rights that are going to protect us from things in the future. And look at what happened with the internet. I know because at the 70th anniversary of the Declaration of Human Rights, I got the honor of hearing Sir Tim Berners-Lee speak. And he said, basically, and I'm paraphrasing of course, that if he had known some of the unintended consequences of the internet, he never would have created it. And that's a very strong statement. But if you think about it, it's sort of like you mentioned: kids all have iPads, they all have phones, and there's still a group of parents who say... I think there's a whole movement.

Angela: It's something like no phones before 13 or no phones by eighth grade or something like that. Clearly my children have phones so I don't pay attention to that. But the movement to that, I understand the intent behind it. It's as similar as the intent behind let's band TikTok. And the intent is we need to protect people. We have to keep them safe. We can't let bad things happen. Well, that's a wonderful intention, but in reality is that really going to keep us safe? In some ways, not allowing our children to be the digital natives that they truly are and were born into a world to be is putting them at a severe disadvantage. And what we need to address are the unintended consequences we created before we understood the power of technology. So if you are going to be bullied on social media and like, hey, let's address the bullying.

Angela: It's not social media. It's sort of like back in my youth, it was: don't listen to hard rock and metal music, and don't play video games, they're the gateway to hell. I mean, that's literally the type of messaging we had. And when you think about how video game simulations are now used to train the best of the best in many professions, from pilots to doctors, I mean, are video games really that bad? No. The unintended consequences that we've received from social media are of our own making. And we need to learn how to discern what a root cause issue is and solve for that first, instead of just sticking bandaids all over everything.

Briar: I think that's a very interesting point. And as you were saying it, I was just thinking about the point you made: keeping our kids away from Roblox and platforms and social media is not the right way to go about it, because this is life for them. This is how they socialize, this is how they make friends. This kind of stuff is meaningful to them. 56% of kids have said that styling their avatar on Roblox is more important to them than how they style themselves in real life. If we took that away from them, we'd be taking away a lot of social interaction. Do you remember growing up and how difficult it can be sometimes when you feel like you don't fit in? I remember when I was about 15, the teacher asked us to put up our hands if we didn't have phones. It was me and the class nerd; we were the only two without phones. And I tried explaining to my parents, this is damaging me. This is damaging my self-esteem, people are looking at me strangely now.

Angela: Well, there's a reason that every generation, our parents and our grandparents and now we, look back and say, oh, I wouldn't have done it that way if I were them: because we don't see it coming. And look, I know it's not going to be popular with a lot of parents when I say, hey, you're raising a digital native, you need to let them be in the digital world. But the fact of the matter is that comes along with a whole other set of responsibilities that maybe we don't feel equipped to manage. So look, I don't love that my daughter spends a lot of time on YouTube. And as a matter of fact, I can give you the perfect cautionary tale for why parents pull their kids off.

Angela: Years ago she was searching up how to put a diaper on a baby. She's the youngest of our children and very far age space from her siblings. So she's never had a baby at home like they did to diaper and feed and cuddle and carry around. And so we got her a pack of pamper diapers to put on her baby dolls. And she literally had looked up like how do you put a diaper on? Well, the algorithm started feeding her diaper things all the way until we got to adult diaper content. So you can imagine I was not thrilled that my daughter now was suddenly very interested in adults wearing diapers on YouTube. Like highly concerning, but was my job then to take her iPad away or was my job to say, wow, I have a teachable moment here.

Angela: I'd better get more vigilant about what she's watching and understand how these algorithms are working. And I have to understand that, guess what, the data she put in, and this is exactly my point about teaching kids about data, the data point she put in was diapers. Our algorithms make their own connections about what they think we want. And AI is sort of like a child; it wants the candy for doing a good job. That's why it hallucinates, I think. We don't really know exactly why AI hallucinates, but it wants to do a good job for us, and so sometimes it'll try to do a good job even if it's lying to us. So we even have to teach our children that: it's nice that you used it for your book report, but that character wasn't even in the book. Did you fact-check that? We have a misinformation problem in this world, and teaching children to discern truth from fiction, that's our job as parents. Understanding how the algorithm is feeding our children stuff, that's our job as parents. We can quarantine the device, but again, that's a bandaid; it's not addressing the root cause issue.

Briar: So how can we teach, say I'm a parent listening to this, and this is also a very real concern for me. How can we teach our kids?

Angela: Yeah. I mean, I think it starts with the basics. Number one, it starts with empowering them to understand that they have agency over their data, and to even understand what the concept of data truly means, and to do it in a kid-friendly way. We can explain artificial intelligence to them in a kid-friendly way. We can say to them, hey, really all we're talking about here is: you know how when you're playing with Legos, you've got a pile of Legos there, and you need to start to sort them because you want to build something that looks like a tree? You're going to sort the brown Legos here and the green Legos here, but you also have to sort shapes. Artificial intelligence just kind of already knows how to do that sorting for you.

Angela: Or helping them understand that when you're baking cookies with mom, you're following a recipe. Well, putting a prompt together is like creating a recipe. It's a set of instructions that helps you get something in the end. These are not tough things to teach, but we have to start teaching them. And we have to start teaching this concept that we talk about in corporate America of a growth mindset. Failure is a First Attempt In Learning. It's the right thing to do, not the wrong thing to do. We need to allow them to make mistakes, and we need to teach them the basics, like what's true and what's not true. And we can do it in very, very simple ways.

Briar: So you believe that human data ownership should be the 31st human right?

Angela: Yeah. I'd love to say that was my idea, but it wasn't. I have a wonderful mentor, Richie Tuoro, and he spent some time working in a space where our data is brokered and sold: healthcare. I don't think a lot of people realize it, but at least in the United States, many, many people sell our healthcare data, make money from it, and benefit from it.

Briar: How do they make money from it?

Angela: So the best example I can give is: you may go to the doctor and be seen for some back issues, and maybe you get an MRI done. The data in your health records is often what we call de-identified, and it's a whole other topic whether data can really be de-identified from you as an individual, but let's just pretend it is. Your data, along with that of 10, 20, a thousand other patients who have had an MRI, your MRI scans, are sent off to somebody who pays for that data. So they can look at all of those MRIs as a data set, and they can make a decision, maybe about whether or not they want to bring a certain type of treatment or therapy to market. Or maybe an insurance company wants to deny you health insurance or life insurance based on a condition. That data is taken in aggregate and sold by data brokers within our healthcare space.

Angela: And none of us get money back from that. And it's not really about whether we should be getting money from our data. I mean, I do think that we are going to have to replace our current income streams in new ways.

Briar: Why do you think that?

Angela: Well, I mean, if you think about it, artificial intelligence is going to take so many rote tasks, so many of the basic things away from us. It's going to help us do our work so much more efficiently. But that's going to create a complete change in the way that we think about capital and the way we think about money and income streams. And so if artificial intelligence is doing a lot of the work we're being paid for, either from a creativity or productivity standpoint, then how are we going to be compensated for what we are putting into AI?

Angela: Because that's really what we're putting in: our data, so that AI can be creative and productive for us. But it's still us. It's still our data. That's our essence. So we really have to start thinking about that. And I think that's a great way to level the playing field in this world, too. Because right now, if you want access to a drug in Africa and there's no commercial use case for it, companies aren't going to commercialize a drug there, and you are not going to get that lifesaving treatment. That's just the way we operate as a world. We are very commercially driven. But with AI, everyone has the opportunity to benefit from it, and everyone has the opportunity to contribute to it equally. And that's something we've never seen in our world before.

Briar: Are you excited about a future of AI?

Angela: I'm more than excited, but I definitely fall on the AI enthusiast, optimist side of things versus the alarmist, pessimist side. And I'm excited because when I think about the type of creativity it can unleash in the world and the types of problems it can help us solve, things that we've thought were intractable, I think some of those things are in reach for us. And that, to me, is worth the trade-off of the painful time we are going to go through for the next 5, 10, 20 years. Who knows?

Briar: I was thinking recently about artificial intelligence and about how, for every kind of fearful thought I have, I have a thought of excitement as well, because it can create such meaningful change in our lives. Maybe it might solve aging, maybe it might solve cancers, all of these very big and real problems. And when the little fearful side of me was thinking about this, I was also thinking: in order to achieve these massive things, of course there's going to have to be fear that we push through, otherwise these massive things are not going to be accomplished. Does that make sense?

Angela: That makes total sense. I mean, look, the fear is healthy. I think we should have a little bit of fear; it helps us check our biases, and we have a lot of them. And we should, I think, spend time sort of ideating on what could happen. As a mom, one of the things that I do sort of naturally as part of my character is worst-case scenario planning. You always think of the worst possible thing, because you have to be prepared in case you've got to come in and lift a car off of your child who's pinned underneath. Let's hope that never happens to any of our children, but we've all seen the story of the mom who's got superhuman strength to lift the car. And so I think it's important for us to have a healthy sense of fear, because there is a lot of ambiguity about what's to come.

Angela: But at the same time, fear has stopped us from doing so many things. I've spent my whole career, for the most part, in the clinical research space, and most people don't participate in clinical research. Why? Because they're afraid. They're afraid that they're going to be a human guinea pig. Time magazine even put a woman in a cage on its cover, I think maybe in 2005, and not one person in the industry responded to that and said, hey, that's not what we're doing here. But I don't think we have been willing to go out there and say, hey, do you know what you're giving up by not participating in clinical research? You're giving up the chance to keep people from becoming unhoused. And when I say that, people are always very surprised that I could connect homelessness.

Angela: Homelessness is something that can be directly addressed by clinical research because if we did clinical research into substance use disorder and we did clinical research into mental healthcare issues and even as healthy volunteers, if a larger body of us participated, we could solve some of those healthcare challenges. And I know people don't all think of those things as healthcare challenges, but they are brain health and addiction. Those are health issues. And so if we studied that and we found better treatments, better cures for those diseases, then the people who are largely in our unhoused communities would never have become unhoused to begin with. So you can go and serve in a soup kitchen, and I think it's a beautiful way to serve humanity. It's so important to gain the humility of serving alongside other people, but you could also just participate by contributing your data to a study.

Angela: And in theory, all of that collective data and good could accelerate the time to finding something that would cure schizophrenia. And that would take an entire cohort of the unhoused off our streets. So do we want to solve the root cause issues, or are we afraid that maybe we're being treated like human guinea pigs or making money for big pharma? Look, there are lots of conflicting incentives in this world, and I won't make an argument for or against people being capitalists. But what I will say is, if we use the argument of, oh, I don't want to contribute my data to research, or I'm going to be a human guinea pig, we've really missed the boat. And we've missed that boat in a big way on health literacy too.

Angela: Four out of five people in this world do not have the information they need to make a decision about their healthcare. That is a travesty. AI can change that. So now we have this new form of literacy and we have this new opportunity to even solve things like that. So if we could give people the empowerment around their health literacy, not only would their health outcomes improve, but all the social things that are attached to health outcomes would change. So it could really level the playing field. So I think, yes, be a little scared. Check your biases, but then just go for it. We've got to just start going for it.

Briar: So Angela, you've participated in five clinical trials, I believe?

Angela: I'm up to nine now. 

Briar: What was that like? What kind of clinical trials were they? 

Angela: I've done a lot of observational studies and some interventional. The difference is that in observational studies, they sort of just passively watch you. I'm in one called the WISDOM study, which looks at women all over the world and their breast cancer risk, and all I have to do is submit a couple of questionnaires once or twice a year, and when I get a mammogram, they have the ability to take that record and look at it. Or I'm in the All of Us research study, which is a big government-funded study in the US trying to get a million people in a very diverse cohort. That one is questionnaires, and I had to do a cheek swab. The information I got back was that apparently I shouldn't like cilantro, but I do like it.

Angela: So the genetics are wrong there. There's stuff like that and then there's interventional trials. And so I was in all of the covid studies, so I did the very, very first one. And lucky for me, I did not get placebo and I was very protected through some of the worst part of the pandemic. But I got into research because I had a sibling who died from an undiagnosed heart condition when he was serving in the military. He was 21 healthiest of all six of us kids. He was serving as a chaplain in a demilitarized zone between North and South Korea. And he got out of a pool and he died, no sign or symptom. I mean truly the healthiest of all six of us. And amazing things happened for our family. We got connected to a clinical trial and they found the specific gene in our family by doing a large family research study essentially.

Angela: And they found that gene. And because we know that gene, it has prevented many, many other deaths in our family. So that's what happens when you get impacted by a clinical trial, just by simply giving a DNA sample. We could have been afraid: oh, once they know we have this condition, none of us will ever get health insurance again. And there were some people in our family who were afraid of that, and rightfully so; there's a lot of discrimination in the healthcare system. But that unlocked a door for me as to the potential for clinical research to have such an impact. And I think we can influence the world in many, many ways, but to me, impact influencing is the most important thing that we can do. We can invest, and that's great, but not everybody has money. What everybody does have is the ability to take action and do something. And so my call to action is: be an impact influencer, and all that takes is participating in a clinical trial.

Briar: I think that's a perfect note to finish on. Hear that, everybody? Impact influencing.

Angela: Thank you.

Briar: Well, it's so nice to have you on the show today, Angela. This was fun, and it was fascinating to hear all of your knowledge when it comes to data ownership.

Angela: Well, thank you so much for having me. It was great to be here.

 



Briar Prestidge

http://www.briarprestidge.com