Redefining Society Podcast

The TikTok Ban: What Does It Mean for Social Media and the Whole Internet? Seriously, isn’t it time to redefine privacy once and for all? | A conversation with Theresa Payton | Redefining Society with Marco Ciappelli

Episode Summary

Dive into the debate on TikTok's data privacy with ITSPmagazine's Redefining Society podcast, featuring Marco Ciappelli and cybersecurity expert Theresa Payton.

Episode Notes

Guest: Theresa Payton, CEO of Fortalice® Solutions LLC [@FortaliceLLC]

On LinkedIn | https://www.linkedin.com/in/theresapayton/

____________________________

Host: Marco Ciappelli, Co-Founder at ITSPmagazine [@ITSPmagazine] and Host of Redefining Society Podcast

On ITSPmagazine | https://www.itspmagazine.com/itspmagazine-podcast-radio-hosts/marco-ciappelli
_____________________________

This Episode’s Sponsors

BlackCloak 👉 https://itspm.ag/itspbcweb

Bugcrowd 👉 https://itspm.ag/itspbgcweb

_____________________________

Episode Introduction

In the latest episode of the Redefining Society Podcast, host Marco Ciappelli takes us through a fascinating conversation about the pervasive influence of technology in our lives, with a particular focus on the privacy implications of social media platforms. Joined by cybersecurity expert Theresa Payton, the discussion dives into the complexities of data privacy in the age of social media, using TikTok as a case study to explore broader privacy concerns.

Marco kicks off the episode by highlighting the blurred lines between our online and offline lives and the significant role social media plays in shaping our digital society. Theresa Payton, with her rich background in cybersecurity and her role as the first female Chief Information Officer at the White House, shares her insights into how data privacy concerns are not just about technology but deeply embedded in the fabric of our society.

The conversation turns to the critical topic of social media's data collection practices, highlighted by a recent study that points out the surprisingly extensive amount of personal data these platforms gather. Payton sheds light on how such data collection, while critical for app functionality, raises significant privacy concerns. This is especially pertinent in today's environment, where not just social media platforms but also shopping and food delivery apps engage in aggressive data tracking and collection.

A focal point of the podcast is the discussion around TikTok, set against the backdrop of recent legislative efforts in the U.S. seeking to curb the influence of foreign-controlled tech companies over American data privacy. The dialogue extends beyond TikTok to address the broader implications for social media governance and the need for a global privacy framework.

Theresa Payton's perspective on creating a 'privacy bill of rights' and establishing consumer-centric guardrails for data privacy provides a ray of hope. She suggests practical tools and behaviors individuals can adopt to safeguard their privacy, emphasizing the power of informed choices in the digital age.

This episode serves as a compelling narrative on the intersection of technology, security, and privacy, offering both alarming insights and practical solutions. It reminds us that in the ever-evolving landscape of social media and digital technology, staying informed and proactive about privacy is not just advisable—it's essential.

_____________________________

Resources

Breaking Barriers in Cybersecurity: Meet The First Woman CIO At The White House: https://www.forbes.com/sites/nancywang/2024/03/21/breaking-barriers-meet-the-first-woman-cio-at-the-white-house/?sh=4fff8eaa63ef

2nd edition of Manipulated: https://www.amazon.com/Manipulated-Inside-Cyberwar-Elections-Distort/dp/1538188651

____________________________

To see and hear more Redefining Society stories on ITSPmagazine, visit:
https://www.itspmagazine.com/redefining-society-podcast

Watch the webcast version on-demand on YouTube: https://www.youtube.com/playlist?list=PLnYu0psdcllTUoWMGGQHlGVZA575VtGr9

Are you interested in sponsoring an ITSPmagazine Channel?
👉 https://www.itspmagazine.com/advertise-on-itspmagazine-podcast

Episode Transcription

The TikTok Ban: What Does It Mean for Social Media and the Whole Internet? Seriously, isn’t it time to redefine privacy once and for all? | A conversation with Theresa Payton | Redefining Society with Marco Ciappelli

Please note that this transcript was created using AI technology and may contain inaccuracies or deviations from the original audio file. The transcript is provided for informational purposes only and should not be relied upon as a substitute for the original recording, as errors may exist. At this time, we provide it “as is,” and we hope it can be helpful for our audience.

_________________________________________

[00:00:00] Marco Ciappelli: Hello, everybody. Welcome to another episode of Redefining Society, where we talk about all the ways that technology affects our lives, the good and the bad. We often talk about social media. We often talk about artificial intelligence, generative AI. You have to. It's the law. You have to talk about that.
 

And, uh, there are other things that we need to talk about when it comes down to society, because we live our lives in a hybrid way: we are online in a digital world, but we are also in the real world. And yeah, where is the line? It's really blurry, I think. Everything we do online, as well as in the real world, certainly affects one another.
 

So today we're going to talk about something we use quite a bit, which is social media, and the way that it gets into politics, it gets into privacy, it gets into our lives. And oftentimes lately, we hear a lot about this TikTok ban, how it has become a political topic, how we talk about privacy and sharing data with some other country, and a lot of things that may sound a little bit confusing.
 

So today we have Theresa Payton with us. She is going to explain some of those things. Theresa has been out there talking about cybersecurity for a very long time, and she worked for the White House. And we had a conversation or two in the past. I'm so happy to have you back.
 

[00:01:43] Theresa Payton: Hey, thanks for welcoming me back.
 

And, uh, I can't wait to see how our conversation unfolds today.  
 

[00:01:50] Marco Ciappelli: There are a lot of places we can go. And I think we could start at the beginning, for the people that don't know about you: a little introduction about yourself. I know it could take the entire podcast, so maybe you can keep it short.
 

And then we dive into the conversation.  
 

[00:02:07] Theresa Payton: Absolutely. Thank you so much. And, uh, you know, I would just say, just real briefly: I spent 16 years in the financial services industry, focusing on customer-centered design and delivering technology platforms that a banking customer like you might want to use, Marco, while at the same time making sure they supported the business units and, oh by the way, keeping them secure from fraudsters.
 

And we didn't even call them cybercriminals yet; just, I don't know, digital criminals. Um, I had the honor and the opportunity to work for the United States as the first female Chief Information Officer to ever hold that role at the White House, working for President George W. Bush from 2006 to 2008.
 

And it was there that I hit a real tipping point, as somebody who has always focused on the human user story and on technology enabling human user stories, when I saw the capabilities of the adversary. Um, if you think about 2006 to 2008, the first-ever iPhone was released in 2007, so it was just a tremendous time for the consumerization of technology.
 

And it was a tipping point for me, realizing that security really wasn't designed from a human-centered design standpoint. Case in point: I have yet to meet somebody who's not in cybersecurity who loves strong passwords, Marco. Like, the longer the better, the more complicated they have to be, and the fact that they can't look like any of the last 15 passwords that you had.
 

I've never had somebody thank me for that. So I thought about that a lot. And I thought: I really would like to take the training that the country had invested in me, helping me understand the tactics, techniques, and procedures that the adversary had, and use it to protect people and businesses and nations.
 

And so, uh, with the blessing and tremendous support of my family, I founded Fortalice. And I've got two fantastic women executives who I actually met during my time at the White House, Bridget and Melissa, and, um, we now focus on human-centered design and securing the digital transformation efforts of some of the world's largest companies, privately held and publicly traded, as well as individuals.
 

We also like to do things like give back. And so we, um, we do pro bono training and fundraising and support for the National Center for Missing and Exploited Children. And, uh, that's just a little bit about us.
 

[00:05:02] Marco Ciappelli: And then you find time for your personal life.  
 

[00:05:06] Theresa Payton: Yes. Yeah. You have to, you have to, um, be very focused and deliberate about making the right calls on your calendar.
 

Um, it's funny you say that, Marco, 'cause I'll have people say to me, you know, how do you find work-life balance? And I often tell people I would like to remove that phrase from the conversation, because when you say work-life balance, it assumes that everybody else has it figured out and you're the only loser who can't.
 

And so what I want to tell people is: life is perfectly imperfect, and that's okay. What's balanced for somebody isn't balanced for somebody else, and balance will change daily. So I always say it's work-life choices and priorities, and you just have to have the right system for you to, eyes wide open, look at your calendar.

Schedule things on your calendar, and then actually give yourself a performance review and ask yourself: are you skewed, you know, too much in one direction? And what are you going to do to course-correct that? Just like you would, um, in your work life. You know, you look at goals, you look at dates, you look at deadlines, you look at how much time you spend on a project.
 

You have to do the same thing based on your priorities in your life.  
 

[00:06:26] Marco Ciappelli: Yeah, I agree. I agree. It's all about balance and change and adapting to things. And this comes up a lot on my podcast, because we talk about society and technology. I make the joke, because it's called Redefining Society, that every time I redefine it, I have to start all over again, because it keeps changing.
 

So do our topics, especially now with working from home, using technology, virtual apps, APIs, and all those kinds of things where, most of the time, we don't need to know how they work in the background. But we use them, and sometimes a little bit of knowledge can help us do these things and stay, you know, afloat with the changes of life, with the changes in technology, and with the way we work. Um, but also the way we play, because today we're going to talk about apps that entertain people, and many times people don't understand that in order to get stuff from them, we need to give stuff to them, and that's called data.
 

And sometimes that may be a little too much. Um, and what does too much mean? So there was a research study that you pointed out. Let's talk about that. Let's dive in. What really got your attention in this research?
 

[00:07:50] Theresa Payton: Sure. Absolutely. Yeah. There was a great study. 
 

And if anybody wants to dig into the study after this podcast, it was done by Surfshark. So if you're looking for it, you can, you know, just look for their app privacy checker study. It was done in December of 2023, so it's fairly current. And what I loved about this study was that it really provides a little bit of context, uh, into the data collection practices of many of the popular mobile applications that we all know and love.
 

I do have some people push back on me and say, I don't use social media. I'm not on it. But guess what? Your family members have probably texted you or emailed you a video that you love or a meme that you love. And you think it's super funny. So guess what? Based on trackers, you are on social media, whether you know it or not. 
 

And so this is very interesting. Um, when you look at the key findings of this study, and I think this is what's really important to cover here, Marco, one of the first things is looking at all these different popular social media platforms. I know TikTok is trending in the headlines right now, but we really have to look at all of social media.
 

Um, and what they found was that most social media apps collect a lot of data, and that data is essential for their overall functionality. So it's not even just: I'm collecting data and then I'm going to hand it off to Procter & Gamble, and they're going to send you a coupon for toothpaste. It's actually things like GPS location and personal details, such as your device ID, which, a lot of people may not realize, is almost like a passport number or a social security number for your phone.

It uniquely identifies your phone, your tablet, your laptop to you as the buyer. Um, and then once they track you, they don't just track you on the platform that you happen to be logged in to; they look at where you came from, what tabs you had open, and where you went to next.
 

And they do have the ability to track that and to store it. Now you may ask yourself, why is that important to you and me? And so what I love about the basis of their research is that they basically said: okay, there are a lot of Apple users out there, you know, a lot of people with iPhones.
 

So they went to Apple's privacy policy, and they benchmarked all of these apps against it. Maybe you've read it, maybe you haven't; I think, other than lawyers, I might be the only non-lawyer that reads privacy policies. Um, but they actually ranked these apps against the policy.
 

They found that the most data-intensive apps out there are shopping and food delivery apps. If there are, like, 32 possible data points based on the Apple privacy policy, they collect on average 21. So I guess they need to know when you're going to be hungry, Marco, so they can anticipate sending you a little ad: like, it's time for you to get a little snack, and I can just deliver it for you.
 

Um, what's really, really interesting was that DoorDash has significant data collection and tracking practices compared to all the different types of apps that are out there. I'm not picking on them; they just happened to get noticed. And, you know, sometimes when somebody uses more trackers than others, they might just be really good at it, and really good at anticipating what you're going to need next.

I know we talk a lot about TikTok, but the study also found that two Meta companies, Facebook and Instagram, are actually considered the most privacy-invasive apps when you rank them against the Apple privacy policy. Following in a close third is TikTok. So those are things that we all need to be thinking about, whether you say: I'm on these apps, I'm not on these apps, I deleted the app, I don't use the app. Just be thinking about what you can do, whether you are an active user, an inactive user, or not a user at all, to keep your data safe and secure, and also to have the opportunity, from a privacy perspective, to opt in and opt out wherever possible.
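To make the benchmarking idea described here concrete, below is a minimal sketch of that kind of tally: comparing each app's declared data collection against the 32 possible data points in Apple's privacy labels. The app names, category labels, and counts are hypothetical placeholders, not figures from the Surfshark study.

```python
# A toy version of the app-vs-privacy-label benchmark described above.
# Apple's privacy labels define 32 possible data-point categories; an app's
# "data intensity" is how many of those it declares. All names and counts
# below are hypothetical, not figures from the Surfshark study.

POSSIBLE_CATEGORIES = {f"category_{i}" for i in range(1, 33)}  # 32 in total

declared_by_app = {
    "hypothetical_food_delivery_app": {f"category_{i}" for i in range(1, 22)},  # 21 declared
    "hypothetical_social_app": {f"category_{i}" for i in range(1, 19)},         # 18 declared
}

for app, declared in sorted(declared_by_app.items()):
    collected = declared & POSSIBLE_CATEGORIES  # only count recognized categories
    share = len(collected) / len(POSSIBLE_CATEGORIES)
    print(f"{app}: {len(collected)}/{len(POSSIBLE_CATEGORIES)} data categories ({share:.0%})")
```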
 

[00:12:36] Marco Ciappelli: So let's stay a little bit on this overview and the way that people may think about it, right? When we talk about cybersecurity, most of the time we have to get to the point where, well, it's a trade-off between convenience and security, right? We say that a lot in the industry, but maybe the regular user doesn't think of it that way, because they mostly want the convenience.
 

So yeah, you want to get the deal when it's dinner time, wherever you are (they know where you are), and the deal on travel, the deal on entertainment. And you like to scroll that Instagram with all the videos there, magically. You like them all because, guess what, they know what you like, right?
 

So why do you think the perception of the user doesn't really get to the safety, the security, and the privacy? It's all about, uh...
 

[00:13:42] Theresa Payton: Well, these apps are not really pitched that way.
 

[00:13:47] Marco Ciappelli: They don't tell.  
 

[00:13:48] Theresa Payton: They don't tell you. So, you know, they don't really, truly say up front: this is going to be an incredible productivity gain for you, but in return, because you get to use this app for free, I'm going to require some of your data to be able to make it free for you to use. And that's, I think, the biggest thing for many people: if the product is free, how does the company make revenue?
 

You have to be the product that is for sale. So that's kind of the little hidden agreement behind the scenes, which is: you are the product, and by opting in and using the platform, you need to know you're the one who's for sale. You know, Marco, I often tell people: when you're clicking through that privacy policy that's in about a four-point font, but you can't use the app until you click "Yes, I agree," you're often opting into a level of privacy that you're giving up. So privacy policies typically spell out the lack of privacy you have when you use the app.
 

[00:15:04] Marco Ciappelli: It's kind of funny. It's a privacy policy against your privacy, mostly. It's more there to legally protect them.
 

[00:15:12] Theresa Payton: Think about it as a disclaimer and disclosure. 
 

That's what I say. I'm like: I know we call it a privacy policy, but is that really what it is? Because a privacy policy, in my mind, should read something like this: We believe everybody has a right, an individual right, to privacy, safety, and security. And because of that, we're going to give you an opportunity to opt in and opt out of certain features.
 

Please understand that if you put your privacy settings very high, some of the features and functionality and conveniences may not be available to you. But because we respect your individual right to privacy, safety, and security, we want to make these features available to you. That, to me, would be the language for a privacy policy. 
 

That's not how they read. I mean, tell me, Marco, have you found a privacy policy that sounds like that?  
 

[00:16:05] Marco Ciappelli: No. And also, I'm a big fan of opt-out by default, not opt-in by default. You shouldn't buy a new TV, if we go into IoT, or download any other app, and then have to go and opt out if you want to.

Like, I'm not gonna name the insurance company that I use for my car, but they specifically said that if you don't want us to sell your information to advertisers, you need to log in to your account and opt out. So, well, at least they let me know, right? Now, let's talk about the TikTok case, because from this research, TikTok is not even the worst.

The worst are actually, you know, owned by the same company: Instagram and Facebook. But that's apparently okay. I mean, yeah, sure, they've been going to Congress to talk about privacy, and there's the GDPR, the European community, blah, blah, blah. But lately, this TikTok thing: what's the politics behind that?
 

[00:17:18] Theresa Payton: Well, I mean, there is something to be said when you think about a technology company, and we're talking about social media apps, but yeah, Marco, you brought up a great point, which is our Internet of Things: any smart devices, smart buildings, and, oh by the way, our cars. Most cars built within the last 10 to 15 years are really just computers on wheels, and at this point we just get the opportunity to ride in them.
 

So technology is ubiquitous. I think, you know, one of the things we have to think through here is: what are we trying to accomplish with the recent bill that was passed? So let's unpack TikTok a little bit here. Um, a lot of people may not realize it, but in the United States alone, we have over 170 million Americans who use TikTok.
 

We have over 7 million small businesses in America for whom that's how they actually reach their customers. It is an incredible platform for a lot of people. Now, that's not to say that they don't have a responsibility to look at the addictive qualities and to do better as it relates to some of the mental health issues that have been traced back to the platform.
 

All of the social media platforms have some issues. But when you look at TikTok, some of the concerns come from the origin story. The origin story, different from Meta, which is headquartered in the United States, is that it was born out of ByteDance, which is a company headquartered in China.
 

I'm not a politician; I was a political appointee. So I don't want this to be taken as a political opinion, because it's not, but I think the best way I would describe our current relationship with China would be frosty, um, maybe ice cold, but it's fairly frosty right now. And there are proven, legitimate concerns in other parts of the technology ecosystem where data could have, or did, get sent somehow surreptitiously to China from American businesses and American people. So I want people to understand the bill that was passed, because I know in the media it's often talked about as the TikTok ban, but the bill that was passed is not called the TikTok ban bill.
 

It's called the Protecting Americans from Foreign Adversary Controlled Applications Act. There is precedent for analyzing technology that has an origin story outside the United States. And so there's a process: you have to apply if you want United States businesses or individuals to use your technology.
 

There are frameworks you're supposed to follow. You could be subject to both planned and surprise on-site inspections, from a physical security perspective but also a cybersecurity perspective. This bill still has not cleared to be signed fully into law as of you and I talking right now, Marco. However, assuming it is going to be fully passed into law, TikTok is first in line to be considered under this particular bill.

We do have precedent, though, with other companies. For example, people may recall China's Huawei Technologies and ZTE, who both got sideways with the U.S. government and were rated as posing an unacceptable risk to U.S. national security. But banning an app, versus a technology, is fairly unprecedented in American history. We do know that there was a short period of time where there were bans on TikTok in other countries, but that was fairly early on. And so it'll be interesting to see: once this is enacted, will ByteDance agree to a sale?
 

That remains to be seen. They'll be given a short period of time to divest themselves. Who has the money? Does ByteDance... like, I mean, I have so many more questions than I have answers. But let's just say the clock runs out, and they don't decide to sell, or they don't find a buyer. I don't know if all the technology platforms have done their scenario planning here.
 

So, one, there will be an impact on small business; there's no doubt in my mind there will be an impact on those small businesses that use it to reach their customers. There will be an impact, um, emotionally, for a lot of people who count on it for, you know, rolling pandas, for example. I mean, who doesn't love a rolling panda?
 

And I don't even have a TikTok account. But because I did some research on TikTok and looked at rolling pandas on their website without an account, I now have rolling pandas in my feed constantly. And I have to say, I really adore it, and it really makes my day. So I've got so many questions around whether the app stores have all done scenario-based planning to say: worst case, I might be told not just to remove it from the store.

What if I'm told I have to search for it on phones and delete it? I mean, we don't know where this is going to go. This is unprecedented, right? And so I have another question: all of you internet service providers in America, are you going to have to block traffic to TikTok, and have you playbooked that? And then all the social media platforms:
 

If I'm a user on Instagram in Germany, and I love this TikTok video of rolling pandas, and I cross-post the TikTok video onto Instagram from Germany, and my pals in the United States see it on Instagram, is that a violation of the ban? I don't know. Marco, these are questions I have. Do you have answers for me?
 

I just changed the interview process. I'm interviewing you now.  
 

[00:24:04] Marco Ciappelli: First of all, I love it when we create more questions than answers, because that makes people think. And I think that, as you're clearly outlining here, there's not an easy solution. You know, you can call it a ban, but what does it mean, right?
 

I mean, you just outlined a few of the things that may be consequences of this, and I think that, given the nature of the internet, you can't really wall it off. We would become what we don't want to be, in a way, right? And I have issues sometimes even when I go to Europe and I look at the GDPR and how, you know, I get pop-ups. It doesn't matter if I'm European, American, or from another country; depending on where you are, things are different. But it's not clear-cut, and I think that's the thing. So my question back to you is: wouldn't there be a better way to handle a situation like this? Because as a government, I may see, you know, okay, it's an adversary.
 

It's a little bit of a frozen relationship, as you mentioned. But as a user, if a company gets a lot of stuff from me and affects my life, and knows what I do, what I don't do, and things that maybe I wouldn't want to share if I knew I was sharing them, does it really make a difference to me whether it's based in the United States or in the UK or in Italy or in China?
 

And I think that's why the users are coming out, you know, the influencers and the people that create content there, saying: you can't do that, you can't ban it. So maybe what we need is a redefinition of privacy here, a redefinition of how we handle these things, because we can't just build a wall.
 

[00:26:05] Theresa Payton: Yeah, we would be better served, globally, all citizens would be better served, if countries would really, truly enact an individual privacy bill of rights, and have that bill of rights be in plain sight, in easy-to-understand consumer language. Have a place for arbitration when you find issues with your data. And if you're going to monetize data in a way where it's easily linked back to me and my digital footprint, then I'd like a piece of that action. Why can't I, if you're going to make 50 cents off me in that moment, have 25 cents, or 49 cents, of that? You know, whatever it is.
 

So businesses do need the ability to collect and monetize information that they collect as part of their operations. There are good reasons for doing that. However, there should be a win-win in that data collection. For example, I came from financial services. Marco, I don't think you would appreciate it if I had said to you: you know, now that you've opened up a deposit account, I'm going to have a banker follow you around, and every time they notice an event in your life, they're going to tap you on the shoulder and give you an application for the next banking product.

You would find that pretty creepy, having somebody following you around. However, I can tell you, even in the days before social media, we had predictive models that would say: Marco just bought a house.
 

It's kind of this size. He doesn't have kids yet, but it's kind of interesting to us. And so, based on demographics, based on other customers we have that are a lot like Marco, he might need a new car soon. Why don't we let Marco know that we really value and treasure him, beat the other banks to the chase, and offer him the absolute best financing possible? Because we love Marco, and we just have a feeling that now that he's bought a house, he might need to upgrade his car.

We may find that you open up a custodial savings account when the child's about five years old. And in banking, we would have models to say: guess what? Kids typically go to driving school at age 15.
 

Typically, parents want to upgrade a car and give the used car to the kid. Let's offer that. So there are all these different patterns of life and behaviors where, if you're just doing good customer service and customer experience, you want to be able to look at that data and model it in a way that is not obtrusive, that is not creepy, but does give you conveniences and does offer you things during these different milestones of your life.
 

But fast-forward to the data aggregation we have going on today. Taking your data as a banker and selling it to a third party, even if I anonymized you, I had to disclose that to you, Marco, and you got to look at it and opt in or opt out of those third-party marketing services. Now, based on how data collection works in the digital space, for social media, they're not regulated like the banks are.
 

Um, so we need to do something different, and it's not about hitting each company on the head and saying, shame on you. Why don't we set the guardrails at the consumer level, just like we've done for protecting consumer credit cards, just like we've done in other areas of our lives? We're past due, but you know what? We can get caught up here, and it really needs to happen globally. And then we'd have a duty-of-care standard that all big tech and social media platforms know they must follow.
 

[00:30:23] Marco Ciappelli: Yeah, there are a few things that I like here. One is the fact that you're referring to the pre-internet age and how marketing, advertising, and political research groups would go and decide: where am I going to campaign next year? How do I address this target audience, this social demographic, and so on? And coupons in the supermarket: at least you get a discount there, but with the amount of information you're giving about the things you buy, you're giving away everything about your life, whether you buy a medicine, or one product versus another.
 

And this has been going on for the longest time, right? But the dimension now is a little bit different, I guess. And I want to ask you one last question. You mentioned guardrails, and we hear that a lot, like with the AI Act: the guardrails. There are people that don't want AI on the highway at all, and people that don't want any guardrails at all. But I think the guardrail is a good concept, meaning: put in the limits and the safety, but let people drive through, right? So are we finding ourselves in the same situation with everything that is online at this point, from generative AI to all the new connected technology, to the car?
 

I did an episode about the information that the car gets about you; it was based on Mozilla research from a few months ago. They even know your sexual orientation. What does that have to do with driving a car, right? So, in your opinion, is the guardrail a good model?
 

[00:32:24] Theresa Payton: Yeah, I, I think having that discussion around guardrails is very good because they're gonna change over time. 
 

You know, as technology becomes enhanced and there are new innovations that we don't know about today, having those guardrails, having laws on the books that are flexible, that focus on guardrails, matters, because we can't just focus squarely on one technology. What's hot today is not going to be what's hot tomorrow. Um, you know, you can ask a platform like MySpace how things are going right now.
 

Right. And so if we had written all of our laws around concerns about MySpace, we'd be overcome by events. And so I think we need to focus more on: regardless of a company's origin story, or what country they're in, what are the guardrails that we all should follow? And those need to be the standard duty of care.
 

If you want to do business on the internet, if you want to collect data, if you want to sell data, if you want to be a business, you just have to follow these rules.  
 

[00:33:30] Marco Ciappelli: And  
 

[00:33:31] Theresa Payton: if we start to do that country by country, then social media, big tech, devices, cars, you know, everything that's got technology built in will have to follow that duty of care.
 

And I know that this can be done, because we have a framework and a construct for it in the physical world. So, for example, I don't feel great if I get into an elevator and I don't see a certificate in there, or it's, like, five years old. I'm like, okay, I think I'll take the stairs, thank you very much.
 

And that's, like, an international standard: you typically see a certificate of the last inspection in an elevator. Um, so we have a construct for doing this, of saying, you know, here's the duty of care if you're going to be in a certain type of business, here's an inspection, and here's a certification.
 

You can still, you know, have challenges, but at least there's a minimum set of guardrails around safety, resiliency, and reliability, and we're really lacking that. Most things today are a recommendation and a suggestion. Maybe it gets found out, if you're in a heavily regulated industry, during an audit.
 

But for social media, there are only a few guardrails: are you taking payment data? What are you doing with somebody's profile? You shouldn't be storing people's social security numbers or things like that in the clear; they should be encrypted. Other than those basics, it is still very much a wild, wild west as it relates to data.
 

And even when our social media companies tell us, Marco, that, well, I'm not sending data that says it's Marco, it's anonymized, I've removed his name: that's actually not enough anymore. Based on the computing processing power that we have, big data analytics, and incredible algorithms, we can correlate with great accuracy and narrow it down.

Um, so Ted and I wrote about this in our book, Privacy in the Age of Big Data; he's a brilliant lawyer. And we saw research studies say that you could take anonymized data from a social media platform, purchase data from a couple of different third-party marketing brokers, correlate the data, overlay what we believe your current zip code to be, and get it down to about 35 people when guessing whether it's you, Marco, or somebody else.

That's pretty weak anonymization, if you agree, to say it's either Marco or 34 other people in his neighborhood. Um, so you're starting to see that even if the data is anonymized in one place and sent somewhere else, the aggregation de-anonymizes you pretty quickly.
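To make the correlation step concrete, here is a minimal sketch of the re-identification idea described above: joining an "anonymized" record with purchased broker data on quasi-identifiers such as zip code, age band, and gender. All of the records and field names are invented for illustration; this is not the methodology of any specific study.

```python
# Illustrative only: re-identification by correlating an "anonymized" record
# with third-party broker data on quasi-identifiers. All rows are made up.

anonymized_record = {"zip": "28202", "age_band": "40-49", "gender": "M"}

broker_rows = [
    {"name": "Resident A", "zip": "28202", "age_band": "40-49", "gender": "M"},
    {"name": "Resident B", "zip": "28202", "age_band": "40-49", "gender": "M"},
    {"name": "Resident C", "zip": "28202", "age_band": "30-39", "gender": "F"},
    # ...a real broker file would have thousands more rows
]

QUASI_IDENTIFIERS = ("zip", "age_band", "gender")

candidates = [
    row for row in broker_rows
    if all(row[key] == anonymized_record[key] for key in QUASI_IDENTIFIERS)
]

# The smaller the candidate pool (the research cited above got it down to
# roughly 35 people), the weaker the "anonymization" really is.
print(f"Anonymized record matches {len(candidates)} known identities")
```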
 

[00:36:36] Marco Ciappelli: Yep. And you're jumping straight to George Orwell.
 

[00:36:39] Theresa Payton: Well, if we have another moment, Marco, I do want to give people hope, because there are some things we can do. So while we're waiting for Superman, and while we're waiting for this privacy bill of rights and the guardrails, there are some incredible, mostly free tools that people should just try out and think about using.
 

So, for example, I love browsers that were built with privacy in mind first, versus privacy bolted on later. For example, the privacy browser Brave. Oh my gosh, it's one of my favorites. B-R-A-V-E, Brave. You can actually turn it so tightly towards privacy that you will see websites break before your eyes. Um, so you can set all of your different privacy settings, and then it'll let you know if there's a favorite website that you use that literally will not function unless you loosen them.

You can let just that website have just enough functionality to work. And it tells you, while you're sitting there, we've blocked this many trackers, and you'll be surprised how many different places you visit and the trackers that are on there. So Brave can be a great way to grab some of your privacy back.
 

Think about having more than one email account. So for the email account that you're using for, you know, maybe conversations with the family, that you don't really want tied to marketing, and maybe it's between you and your family CPA or your doctor or whatever it is, ProtonMail can be a great way to sort of firewall off part of your life with a free product.
 

Um, so be thinking about these different ways. Another one is: don't always give your real cell phone number out, especially if you're just trying to get coupons and things like that. Get a burner number from, for example, Google Voice or Talkatone; you can get a free burner number. You can still forward it to your actual cell phone, but that can be a way to, again, create some different privacy barriers for yourself.
 

Reading privacy policies isn't always practical for everybody, but when you get an opportunity, definitely take a look. Mozilla does great research on the lack of privacy you have on a lot of platforms. And Canada publishes a lot of great research reports across different technologies and where you may be uncomfortable about the privacy settings for those technologies.
 

So those are just a few things that everybody can do. They're not super technically complicated, they don't cost a lot of money, and they can be very easily integrated into how you already live your life online.
 

[00:39:27] Marco Ciappelli: Yep. Great advice, great advice for sure. And I think some companies, just to go to the free market, are actually embracing this.

Like, you know, what ProtonMail is doing, and Apple: if you have an Apple ID, they suggest you use a different email with Hide My Email. It's a really, really good thing, as you said. And it may not be the solution to the internet, and to the manipulation of our desires that advertising has been doing for many, many years anyway, but I think we need to empower ourselves a little bit. I think that's the message, too.
 

It's not our job to be cybersecurity people, but it is our job, you know, before you buy a product, to maybe read the ingredients and the label, and that goes for security as well. There's a lot more we could talk about, but we're going to end it here. I want to thank you for this. I think it's been very helpful for understanding the whole TikTok ban situation, what we can do to get better privacy, and maybe how the world should look at these things moving forward and redefine what we need, as technology expands and grows and changes so quickly that it sometimes gives you vertigo. Like, what's new today? What am I going to talk about today? So with that in mind, I know that you're coming out with the second edition of your book, Manipulated, which you presented years ago on my show.
 

And I would love for you to come back and talk about that, and about the more psychological approach to all of this. So Theresa, thank you so much.
 

[00:41:20] Theresa Payton: Thanks for having me on. And I'd love to come back and talk to you about Manipulated. It's available for pre-order right now on Amazon for anybody who's interested, and it really unpacks what is happening. Social media is not to blame; really, I like to describe it as the misuse of technology by people with ulterior motives. And it kind of helps people think about how to spot and stop these manipulation campaigns.
 

[00:41:50] Marco Ciappelli: Very good. A very important tool in our arsenal, for sure. And for everybody else: stay tuned, subscribe, and there will be many more interesting conversations coming up that hopefully will make you think.
 

So stay tuned. Thank you again, Theresa. Thank you, everybody.
 

[00:42:08] Theresa Payton: Thanks everybody. Thank you, Marco.  
 

[00:42:11] Marco Ciappelli: Bye.