Debra Farber: Shifting Privacy Left With Privacy by Design

April 08, 2024 Tim Freestone and Patrick Spencer

Debra Farber, a globally recognized privacy, security, and ethical tech advisor with nearly two decades of experience, discusses data privacy, privacy by design, and the growing field of privacy engineering in this Kitecast episode. As the host of the Shifting Privacy Left podcast, Farber is dedicated to building a community of privacy engineers and bridging the silos between various industries and research areas.

In this Kitecast episode, Farber emphasized the importance of embedding privacy into product development from the outset. She highlighted the role of privacy engineers in assessing risks, minimizing data collection, and ensuring compliance with regulations such as GDPR. Farber also discussed the challenges organizations face in hiring privacy engineers due to the high demand and limited supply of qualified professionals in this relatively new field.

Farber explained the distinction between privacy by design and privacy-enhancing technologies (PETs). Privacy by design is a set of high-level principles focused on integrating privacy into systems from the beginning, while PETs are specific tools and techniques that help achieve compliance with data protection principles. Some examples of PETs include anonymization, homomorphic encryption, secure multi-party computing, and differential privacy.
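
To make one of these PETs concrete, differential privacy releases aggregate statistics with calibrated random noise so that no individual's presence in the data set can be inferred from the output. Below is a minimal sketch of the Laplace mechanism for a count query; the function name and data are illustrative, not from the episode:

```python
import math
import random

def dp_count(values, predicate, epsilon):
    """Epsilon-differentially-private count via the Laplace mechanism.

    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so Laplace noise with scale
    1/epsilon satisfies epsilon-differential privacy.
    """
    true_count = sum(1 for v in values if predicate(v))
    # Sample Laplace(0, 1/epsilon) noise by inverse-CDF sampling.
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise

ages = [34, 29, 41, 52, 38, 27, 45]
print(dp_count(ages, lambda a: a >= 40, epsilon=1.0))
```

Smaller epsilon means more noise and stronger privacy; the analyst trades accuracy for a formal guarantee.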

The conversation also touched on the potential return on investment for organizations that prioritize privacy. By minimizing data collection and addressing privacy concerns early in the development process, companies can reduce downstream compliance costs, legal expenses, and the risk of fines associated with data breaches or privacy violations.

In addition to the above, Farber shared her thoughts on artificial intelligence and its impact on personal privacy. While acknowledging the potential risks, she emphasized that the real threat lies in the unchecked powers of those bringing AI to market without appropriate safety measures and testing. Farber advocates for the ethical development and deployment of AI technologies, ensuring that privacy standards are applied correctly to mitigate risks and protect individuals’ rights.


Shifting Privacy Left Media: 

Check out video versions of Kitecast episodes on YouTube.

Transcript

Patrick Spencer (00:02.203)
Hey everyone, welcome back to another Kitecast episode. Tim, my co-host, is joining us. Tim, how are you doing today?

Tim Freestone (00:08.595)
Yeah, I'm good, Patrick. You know, just counting the days till our AI overlords take over. But until then, all right.

Debra Farber (00:15.288)
I'm going to go ahead and turn it off.

Patrick Spencer (00:17.056)
Are you real? Are you an avatar? Well, we have a real treat today. We've talked to a number of folks in previous episodes about data privacy, but we haven't had someone who is almost exclusively focused on data privacy. And that's what we have today. She does other things as well, but she's a real expert. Debra Farber is joining us.

Tim Freestone (00:20.617)
Let's see, yeah I'm good, I'm real.

Patrick Spencer (00:43.103)
And she's going to talk about, as I said, data privacy, data privacy shifting left. What's that mean? Privacy by design. She's a globally recognized privacy, security, and ethical tech advisor and principal. She's the host of the Shifting Privacy Left podcast. So at the end of the program, we'll have her point our audience to where you can find that podcast so you can subscribe.

She has almost two decades of experience managing privacy and data protection issues and serves as a strategic advisor to a number of organizations. There's a long list here; we're gonna talk a little bit about some of them: Privacy Tech, the Institute of Operational Privacy Design, Privado, SecureV. We'll go through the rest later on in the podcast.

She's held a lot of different privacy and security roles at places like Amazon Prime Video, which I'm sure is going to be interesting to our audience, AWS, BigID, Visa, TrustArc, IBM. Man, my resume doesn't look very good, Tim, after introducing her. Debra, thanks for joining us.

Debra Farber (01:51.908)
Yeah, it's a pleasure to be here. I always love coming on an adjacent-topic podcast, because then there's already a foundation for discussion, and it's a matter of filling in gaps and kind of just rounding out the discussion. So thank you for having me.

Patrick Spencer (02:10.079)
If you're like me, I'll hear someone interviewed on one podcast, and then they were also interviewed on some other podcasts, and I'll go check those out. And while I'm listening to podcasts, I'm Googling the person's background. I was listening to one last night, a guy that just published a book, an interesting character, as is often the case. But talking about interesting characters, let's talk about you and your background. And let's start by discussing the Shifting Privacy Left podcast.

When did you launch it? What's it about? Who do you typically interview on these podcasts?

Debra Farber (02:42.42)
Yeah, sure. So the Shifting Privacy Left podcast is basically serving privacy engineers, and I'm going to loosely define privacy engineers as developers, researchers, data scientists, designers, architects, and

anyone that's working on product development where the product or service is going to be processing personal data. You really need to build with privacy by design, but you can't just use paper to protect privacy like we have for so many years; laws and policies and contracts are just not enough. You need to engineer privacy into your products and services. And so privacy engineering is a space that's growing

rapidly, and we could talk about that a little later. But Privado, the sponsor of my podcast, basically was like, look, we're selling to privacy engineers, but there's no place to go to get in front of them right now. I'd been an advisor for them for a year at that point. And I started the podcast about 15 months ago, so back at the end of 2022. And

Tim Freestone (04:09.602)
Oh wow.

Debra Farber (04:10.744)
Yeah, there have been about 54 episodes so far; it comes out around weekly. And it has been successful. It's been my desire to build a community of privacy engineers. And, you know, there are even silos amongst data scientists,

data scientists that are working on one privacy-enhancing technology versus data scientists working on another, right? There are silos across research, there are silos across industry. So my goal is to have a smattering of different technical topics around privacy that you can't find in most other podcasts that are focused on

operationalizing privacy. And so far it's been a really great ride. I get to have conversations with really intelligent people working on cutting-edge stuff, or who've done large-scale deployments of anonymization and come to tell the story about it. And I'm hoping that there are lessons learned, no matter what size company the listener belongs to, that they could pick up some insights about

what's gone wrong, what's gone right, and use that in their own practice.

Tim Freestone (05:29.526)
That's interesting. I've never heard the term privacy engineer. It makes total sense, though. Yeah, it makes absolute sense. It's kind of interesting.

Debra Farber (05:35.368)
Yeah, corollary to the security engineer, right?

Patrick Spencer (05:39.335)
The big companies have those positions, Debra. So you go to, like, an Amex or a Visa, there's a privacy engineer or engineers on staff.

Debra Farber (05:47.964)
Yeah, you know, they're really coming out of, I would say first and foremost, you see them in big tech companies, because let's just say there's been lots of scrutiny, regulatory scrutiny. There's been some re-posturing of how the big tech companies are gonna be

Debra Farber (06:10.369)
complying with different EU regulations like GDPR and ePrivacy, and now the digital services ones, the DMA, I forget what this stands for, but

it is now trickling down into other, more highly regulated spaces, like financial services, payments, things like that. But they're being hired from the big tech firms that trained them on what privacy engineering is and what it looks like within an organization. And so, you know, it's this weird thing where some of the biggest offenders are the big tech companies. But

they are also training up privacy engineering capabilities, and people are cutting their teeth on large-scale problems there. And so, you know, it's like you have to go work in the belly of the beast in order to do the good work, right? But, you know, I think it's great that at least there's investment in privacy engineering these days. The problem is that there's a huge gap. There are those who've now been in the space

and have been privacy engineers and are getting paid a crap ton of money; not saying that's bad, I'm just saying that's the situation. And then you've got companies trying to hire into that role that don't even know what they need or what they want, or what a privacy engineer does, right? Because in security you've got, you know, a ProdSec engineer versus a DevSecOps engineer; it's been hyper-segmented, and privacy is about 15 years behind security when it comes to

maturity. And so there's a lot less of a curriculum; there's no privacy engineering textbook out there, right? This is in development right now. There are courses, and Carnegie Mellon has the very first privacy engineering master's program, and they might even have a PhD program now. I think it's still just the master's, and pretty robust. I mean, I think it's like 10 years going, if not maybe 15 at this point.

Debra Farber (08:14.883)
But still, there's not enough privacy engineers for the demand. And then there's also this wild fluctuation in the salaries they're demanding, since there's such high demand and low supply. And so those are challenges.

Tim Freestone (08:28.586)
What's driving the demand? All companies have a cost-to-value calculus that they're going through on every hire and every capex and opex expense. What fear is driving the demand right now for this role, and privacy in general?

Debra Farber (08:48.3)
Yeah, I would think, just like security; security has used this shift-left mantra as well, where if you address problems before you ever ship code, and you address them early, then you're reducing your risk, and the cost of that risk, and the cost of litigation. So right now it's kind of like, look, for years we've been doing this privacy stuff: we've got our breach program, we've got our

pieces of compliance, but we've never really embedded privacy into the business, right? And so now there are two things I'm seeing. One, in the data science world, it's: we have all this data, and we want to be able to use this data for analytics, and maybe even for the benefit of the people whose data it is. But there are restrictions around secondary uses of data. You collect it for one purpose, you can't go and use it for another; you'll get fined, you'll get in trouble,

Debra Farber (09:41.948)
without consent or other appropriate legal means of processing. But if you deploy certain privacy-enhancing technologies, a lot of them are in cryptography, right? So homomorphic encryption, or even just encryption itself; these are tools that you need technical acumen to deploy.

We won't stick with differential privacy for now. So, data scientists want to open up the use of their data. And one way that you can do that is if you have anonymized data sets. Because anonymized data, if it is truly anonymous, and it's very hard to have that guarantee be met, but if you can guarantee, through a mathematical process, that it's going to be assured

that the data is anonymous, then it's no longer considered personal data or personal information under any of the regulations. And so that means that you can now use that data for all of these things that can make money. Maybe you could sell insights, maybe you can share insights for research. So you could show how you're reducing the cost of the data that you're holding, and the cost of the risks around it, by

the data minimization principle: minimize your footprint of personal data, and anonymization is one of the ways to do that. I'm just giving an example there for anonymization. Anonymization has its drawbacks as well. You can't keep the relationships; you can't do as many of the insights as you want, because you don't have as much of the raw data to be able to

use, I guess; I'm not a data scientist myself. So in one respect, it's like, well, as AI is taking off, with data-hungry LLMs that you need to feed to train, there are tools out there where you could suppress data from going into your models. But first you have to identify what's your personal data. And from

Tim Freestone (11:38.879)
I understand what you're doing.

Debra Farber (12:04.36)
It's the corollary in privacy: what's my personal data? And instead of looking at the confidentiality, integrity, and availability of that data, you're looking at the potential privacy harms that can happen. And one of the big things is linkability. So you don't have anonymous data if you can link anything to that person's identity. So anyway, the other area is in just bringing products and services to market. So
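
The linkability risk described here is easy to demonstrate: a data set with names stripped can still be re-identified by joining its remaining quasi-identifiers against a public record. A toy sketch with made-up data, in the spirit of the classic ZIP/birth-year/sex linkage studies:

```python
# "Anonymized" medical records: names stripped, quasi-identifiers kept.
medical = [
    {"zip": "02139", "birth_year": 1975, "sex": "F", "diagnosis": "flu"},
    {"zip": "02139", "birth_year": 1982, "sex": "M", "diagnosis": "asthma"},
]

# A public roll with the same quasi-identifiers plus names.
voters = [
    {"name": "Alice Smith", "zip": "02139", "birth_year": 1975, "sex": "F"},
    {"name": "Bob Jones", "zip": "02140", "birth_year": 1982, "sex": "M"},
]

def link(records, roll):
    """Re-identify records whose (zip, birth_year, sex) is unique in the roll."""
    matches = []
    for rec in records:
        key = (rec["zip"], rec["birth_year"], rec["sex"])
        hits = [v for v in roll if (v["zip"], v["birth_year"], v["sex"]) == key]
        if len(hits) == 1:  # a unique quasi-identifier combination is linkable
            matches.append((hits[0]["name"], rec["diagnosis"]))
    return matches

print(link(medical, voters))
```

Stripping direct identifiers is not anonymization; any unique combination of remaining attributes re-links the record to a person.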

Debra Farber (12:31.796)
You know, it's not good enough to just say, well, we have a product we're selling to people, and it has personal data. Look at the cloud space: it's the shared responsibility model that has been so prevalent in cloud, where, as a cloud company, we're responsible for privacy and security of the cloud, but you're responsible for privacy in the cloud, whatever you put in the repositories. The challenge with that is that

a lot of the cloud companies haven't been thoughtful about what the customer's need is. Like, how do you delete data? And when you delete data from the system, how do you know it's really deleted, and not just marked for deletion, or sitting in a garbage collector where it could be pulled back out? A lot of that has not been thought through. And so there's a lot of the technical...

There are a lot of privacy risks that we know today, but we need technical acumen to look at integrating into the different tech stacks, to be able to do code reviews. Privacy engineers are looking at the appropriate architecture for the flow of personal data, and making sure that data minimization is key when you're architecting. If the data is plutonium and you need hazmat controls

to make it safe within your organization, then if you don't collect the data in the first place, you don't inherit that risk, right? So there's a lot more risk management up front, before you're ever shipping the code. And that's what privacy engineers are really focused on. And like security, different organizations want different things, so it's actually hard to hire in this area. Some people want, like, I just want an engineer that's going to go build me a data deletion tool, right?

That's a privacy engineer, sure, but that's an engineer that's working on creating a privacy product within an organization, as opposed to a privacy engineer that's thinking with privacy by design, deploying with repeatable, manageable system design thinking and systems thinking. So think the corollary of a security engineer.

Patrick Spencer (14:50.031)
How many organizations, playing off of what you just talked about, have shifted privacy left, right? Play on words, from an AppSec standpoint, shifting security left in the development cycle. For a lot of organizations, privacy is still a very reactive process. They do it after the fact. It's maybe one of the final steps when you roll out an application or roll out a program.

Patrick Spencer (15:17.167)
How many are actually doing it upfront today? And second of all, how do you get organizations to shift it left?

Debra Farber (15:24.188)
Yeah. So I would say that there are definitely more companies addressing privacy way before the product is actually built,

and they're opining on it when it's still getting built. But I'm not seeing as many engineers being embedded into that process. Maybe they'll have privacy counsel, or the CPO or someone from the CPO's office, come in, and we'll have a meeting talking about the product we want to build and whether there are any issues there. I'm seeing a lot of that. What I'd like to see more of, and it's still a work in progress, is more privacy engineers literally doing the work:

going to stand-up meetings and working in product development, sitting with those folks and being kind of the privacy representative on the development team. I think there's a lot more work that needs to be done there. And I think a lot of it is that there weren't tools that could help with the automation of some of the problems in this space.

We could talk about it in a little bit, but for instance, Privado is a company that I advise, and they sponsor my podcast, and they are a static code analysis tool for privacy. We've seen that in the security space for years, but it's a completely different posture and completely different questions you're asking about the data, and it works in the engineer's workflow. And so these things didn't exist. So how could we expect engineers to

hit the ground running with solving some of these problems?
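
The kind of delta check a privacy static-analysis tool performs can be sketched in miniature: compare the fields an endpoint sends before and after a code push, and flag newly added fields that look like personal data. Everything below, including the field names and the personal-data list, is a hypothetical illustration of the idea, not any vendor's actual rules:

```python
# Hypothetical list of field names treated as personal data.
PERSONAL_DATA_FIELDS = {"email", "phone", "ssn", "location", "ip_address"}

def new_personal_fields(old_payload: dict, new_payload: dict) -> set:
    """Return fields in the new payload, absent from the old, that look like personal data."""
    added = set(new_payload) - set(old_payload)
    return added & PERSONAL_DATA_FIELDS

# A code push starts sending two extra fields from the same endpoint:
before = {"user_id": 1, "plan": "pro"}
after = {"user_id": 1, "plan": "pro", "email": "a@example.com", "location": "NYC"}
print(sorted(new_personal_fields(before, after)))
```

In practice such checks run in CI, surfacing the diff to the engineer so a decision is made before the change ships.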

Tim Freestone (16:56.698)
So that's interesting. Because shift-left in code is about vulnerabilities, and you've got the OWASP Top 10 and all that kind of stuff. But is there going to be, I can't think of a phrase for it now, an OWASP Top 10 version for privacy? And does it look for PII violations and things like that?

Debra Farber (17:19.636)
Yeah, great question. So there was, for years, an OWASP Top 10 Privacy Risks, but it wasn't vulnerabilities, right? It wasn't something you can test to. It was high level, based on privacy principles, and wasn't very useful as a result.

Debra Farber (17:38.948)
This is something that I've been talking about with Privado and their chief scientist, who came from the company called ShiftLeft Security and is a vulnerability specialist, and he's giving his attention now to the privacy problem. So he's someone who really understands how that was done in security and is now doing it in privacy. While there's no list of privacy vulnerabilities right now, what Privado does is look at

Debra Farber (18:08.384)
the APIs and see: oh, all of a sudden, these data elements are now being included where they weren't previously. So a lot of it is deltas between the code pushes, and being able, in the engineering workflow, to surface that there's a potential risk and make you make a decision before you proceed. I know there have been some new additions to the product, but

generally, it's just as you would think: it's in the engineering workflow; you're doing your security vulnerabilities, then you would go and take a look at this, and it will show you where there are different risks. Now, I know the chief scientist, Suchakra, wants to work with OWASP to create a top 10 list for privacy. I also want to say that there is an amazing privacy threat framework out there called LINDDUN,

L-I-N-D-D-U-N; it's an acronym, and each one of those letters is a privacy vulnerability category, or a threat category. So you can work backwards from: these are the threats. It came out of academia, and it's the best one for privacy that's out there, and people are iterating on it. So I really suggest looking at the LINDDUN model. Even security threat

researchers have opined on the LINDDUN model for privacy and think it's really good.
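
For reference, LINDDUN's seven letters stand for the threat categories below; a threat-modeling session walks each data flow against every category. The category names are from the framework itself; the checklist helper is just an illustration:

```python
# The seven LINDDUN privacy threat categories, one per letter of the acronym.
LINDDUN_CATEGORIES = [
    "Linkability",
    "Identifiability",
    "Non-repudiation",
    "Detectability",
    "Disclosure of information",
    "Unawareness",
    "Non-compliance",
]

def threat_checklist(data_flow: str) -> list:
    """Pair a data flow with every LINDDUN category, as prompts for a review session."""
    return [f"{data_flow}: assess {category}" for category in LINDDUN_CATEGORIES]

for prompt in threat_checklist("signup form -> user DB"):
    print(prompt)
```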

Patrick Spencer (19:35.383)
Is it similar to, like, the NIST Privacy Framework, Debra, or is it more comprehensive?

Debra Farber (19:39.824)
No, no, it's probably closer to a security threat modeling framework, which I'm not in depth on. And so with this threat modeling framework, I'm like, well, this is great. I mean, it's threat modeling, but it's a vulnerability-like framework. So if we work backwards from that, it's: what are the controls that we need to achieve

Tim Freestone (19:47.01)
Yeah, for the shift left approach, right?

Patrick Spencer (19:48.995)
Yeah, that makes sense.

Debra Farber (20:03.116)
prevention of these vulnerabilities being exploited? And so I'm working right now, I'm a member of the Institute of Operational Privacy Design, on their Risk Controls Subcommittee, and we are working to create basically an equivalent of the NIST 800-53 controls list for security, which has actual controls, not high-level risk statements, and create something similar

Debra Farber (20:33.276)
for privacy, and that'll include things like maybe some privacy-enhancing technologies you can use and deploy, and certain actions you can take. That can be helpful. So there's going to be a lot of working backwards from work that's out there already. I bring this up because I think we can also use LINDDUN to work backwards, to create: here are the risks that you could see, like linkability, or, you know,

I don't want to go through those now; I'm going to open up a can of worms if I start listing out the risks. But I think there was still a question you asked me about privacy engineering that I haven't answered, and I don't remember what it was, but I'd like to circle back to it if you remember.

Tim Freestone (21:04.578)
That's okay.

Tim Freestone (21:17.07)
No, it was: what's the driver of the investment? Because somewhere, somehow, there are bean counters in all these companies going, well, if we invest this, we will either save this or gain this. The equation has to be there. So I'm curious what it is.

Debra Farber (21:35.336)
Right, it's true. And I mean, I could talk about external factors: different regulators in the EU affecting large US companies and their behavior. But that could get into a really deep, long discussion. What I am seeing a lot of, as a driver to get this shift-left mindset,

is that you could save a crap ton of money if you just stop making the mistakes up front, right? Think about not only how much compliance work you have to do if you collect all the data just for the sake of it: you have to do more data privacy impact assessments, you have to do more compliance. And compliance isn't privacy; compliance isn't security. Compliance is just getting the docs together to prove you are doing that stuff. And so what I'm seeing is,

Debra Farber (22:31.324)
if you minimize your data up front, you have fewer risks, fewer fines, fewer potential breaches, and then, not most importantly, but most expensively, outside counsel. You spend a lot less money on outside counsel and outside firms, like your large consulting firms, because you are narrowing the scope of what you need to protect.

Patrick Spencer (22:59.191)
dollars are huge to your point.

Debra Farber (23:01.04)
Yeah, and so I think the ROI is, it's just like security: save money by being more mindful up front and doing the risk assessments up front.

Patrick Spencer (23:10.063)
Do you think we'll see an equivalent in the privacy area like we have in security with the IBM data breach report that's been published, what, 15 years with Ponemon? Do you think we'll see something like that in the data privacy realm?

Debra Farber (23:26.9)
Well, you know, the funny thing is, data breaches were the privacy realm. Back when I started in privacy in 2005, it was literally the only driver we had to get change in organizations. When it was like, what's a cookie? I mean, everything was so new back then, right? We didn't have the breadth of the space we have today. And breaches were kind of the main driver of change for privacy. Once that moved over to security, where it was like,

Tim Freestone (23:46.562)
That's a cookie.

Debra Farber (23:56.712)
a breach of a system most likely equals a breach of personal data, it became the realm of the security officer, and then it became the privacy officer's realm of just messaging it and reporting it, because privacy laws require it. So I would argue that Ponemon was a privacy institute; Ponemon research is a privacy institution, but then it just became the realm of security.

And so I wouldn't say we'd have a separate breach report, but there might be additional metrics that are things we'd be able to measure. I haven't had an episode on the show yet on metrics, but a lot of the time the metrics that are created in companies are just spinning our wheels, like, oh, we addressed 500 DSARs today. And it's like, okay, but how has that helped privacy?

So there are metrics for the sake of metrics, and then there are metrics for what's going to help move the needle in your org. And while the breach report does a great job of scaring teams in a good way, giving you real data that's like, oh, crap, there might be a lot of risk here that we now need to address in our org, and we don't want this to be us, I don't find that that's what moves the needle in privacy.

I find, especially when what we're trying to do now is get more involved in product development, being thoughtful about the repositories we're putting data in and the use cases for it. And so I'm focused more on how we can get into engineers' workflows, and how we can have product owners see us as enablers as opposed to, you know, blockers. And I think those are the areas that are going to speed things up,

and put everybody on the same page: here are the downstream repercussions if we do these things wrong. I don't think any engineers out there are like, no, we just want to violate privacy for the sake of it. I think a lot of the time these privacy violations in organizations come from a lack of knowledge; nobody trained them. And so it's: how do we embed privacy into the organization so that everyone's on board, that this is core, mission critical?

Debra Farber (26:12.916)
You know, we're obsessed with the customer. Because privacy is not about the data, right? In the end, privacy is about preventing harms to people. And the data that we have on them, they entrust that to us, and we want to make sure that there are no downstream harms to them. And that should be the thinking, because who could argue with that, right? I think there just needs to be a reframe in organizations. We need to get out of this:

well, legal needs to own privacy. I don't think that's the right approach. I mean, we don't have legal owning security. I'm going to say a controversial statement I say often: I actually think that lawyers owning privacy early on, and holding onto it, like, this is ours, we determine the privacy stuff, has been one of the blockers to shifting privacy left for the last decade plus.

Tim Freestone (27:10.898)
Yeah, that makes sense. You know, I wonder if the privacy industry will follow a similar trajectory of, of security. You know, like if we think back, security kind of started in the cybersecurity, let's say like the seventies with the mainframes and where the passwords were, you know, there was an admin password and the password was admin. And from there we've gotten to where we are today. But in that process, security has kind of gone through this.

like you said, being an inhibitor, which drove shadow IT. And then the dynamic became, well, how do we be kind of an invisible security layer so we don't hinder productivity and business needs — with all the complexities that come with being a business driver and not a business inhibitor. Privacy is probably gonna face the same challenges, just five to 10 years behind, I'd imagine.

Debra Farber (28:07.62)
Exactly. Yeah, I'm thinking more like 10, 15 years behind. And it's also funny, because people think, oh, all right, privacy is just this simple thing: just encrypt stuff, there's your control, you're good, you don't have to think about it anymore. And I think GDPR — for the better, I think — has really changed that, because GDPR covers data protection, not

Tim Freestone (28:10.026)
or 10, 15, I guess, yeah.

Debra Farber (28:36.028)
not privacy specifically, but data protection. And data protection includes many, many rights that EU citizens enjoy — privacy is one of them, security is another. So data protection is also something under the purview of the CISO. And if the CISO has to be in charge of data protection, which in the EU includes a lot of the security components,

they had to become more knowledgeable about GDPR and learn the distinctions between the privacy requirements and the security requirements, and that really educated them. I think they realized privacy is bigger than they thought it was, because in the past it would be like, yeah, security and privacy work together, but security was the technical side and privacy was just, you know, maybe operations. And now we're getting to the point where security teams are realizing they have privacy obligations as well.

But privacy is larger than security — it's not a subset of it. So I think it's been more of an educational tool in organizations. Instead of, that's not my problem, someone in privacy needs to deal with it, it's definitely becoming more collaborative in organizations big and small, because there's more understanding of what privacy is. Oh my gosh, please don't ask me what it is right now, because that could be its own episode. Yeah.

Tim Freestone (29:58.876)
Yeah, that's right.

Patrick Spencer (30:02.739)
You know, on your LinkedIn profile, I think you've done a podcast or two on this topic. I don't know anything about this, so I'm looking forward to your answer. You have privacy by design versus privacy-enhancing technologies, abbreviated PETs. You guys are very good in the privacy realm, like we are in the security realm, at using acronyms — we have hundreds, right? But talk a little bit about the differences between those two.

Debra Farber (30:25.)
Right. We do.

Yeah, actually, there are significant differences. So privacy by design is originally a set of seven high-level principles about how you go about designing privacy into your systems, created by Dr. Ann Cavoukian, who was the former privacy commissioner of, I think it was Ontario — one of the Canadian provinces. And...

It's been around for a long time now. People talk about it like, yeah, you gotta do this. Basically, it means embed privacy into your business. Like, one of the principles is aim for a positive-sum game: it's not privacy versus security, right? It's, how could we bring them together to effectuate the goal of whatever we're trying to do? So they're real high-level principles, but...

I do want to point out that GDPR has taken those seven privacy by design principles and made them mandatory. It is in the legislation that you must build with privacy by design. So what does that mean exactly, and when have you failed at it? That, right now, is going to be a matter of court filings and such. But in principle, it's embedding privacy into the business: you design it in from the beginning. You can't wait until the end and just bolt a box of privacy on

with just contracts and terms of service — you have to build the capabilities in. So it's really that high-level thing. What we're trying to do at the IOPD, the Institute of Operational Privacy Design, is put rigor behind that. We've already created a process standard: if you go through this process, you have built with privacy by design. And we're currently working on a

Debra Farber (32:20.204)
certification standard, so you can then prove — get a certification for — having done that process right. Because there are companies out there that say, we really want to prove that we've done this and have some sort of public statement on it. And so this is that. Yeah.

Patrick Spencer (32:38.511)
So this will be like a SOC 2, and it will be driven by the private sector.

Debra Farber (32:44.056)
It'll be — so it's open-sourced, and it'll be driven by the private sector, agreed. So the idea is you'd get a seal you could display on your website. But there will be auditors that will be trained on the standard. That doesn't exist yet — all we have now is the process standard; we don't have the certification standard. So it's shaping up. But right now, I want to make the distinction between

the need to embed privacy by design into your organization, and then privacy-enhancing technologies. They're called PETs, and they really can help you achieve compliance with the data protection principles, particularly data minimization, purpose limitation, and security. They can help you protect individuals' privacy and effectively implement data protection by design.

So whether a specific PET, or a combination of them, is appropriate for your processing of personal data is going to depend on your use case, your set of circumstances. You should really consider implementing PETs at the design phase of any project, when you're doing a data protection impact assessment, which is mandated for certain high-risk uses of personal data.

And PETs are particularly suitable in contexts involving large-scale collection and analysis of personal data — think AI applications, IoT devices, cloud computing services. And there are so many of them; I'm just going to list a few. Anonymization we all know about; pseudonymization we've heard of. But homomorphic encryption, secure multi-party computation,

private set intersection, federated learning, trusted execution environments, zero-knowledge proofs, differential privacy, and synthetic data are just some of the PETs that are currently deployed — anonymization has been deployed at large scale, whereas homomorphic encryption, even though it's commercially viable, is still gaining its sea legs. And then there are obviously other ones that are

Debra Farber (35:05.772)
still in the labs, being worked on. So it's this really robust area where there are so many tools coming out that can enable you to achieve whatever your business goals are in a privacy-preserving way. And I think that's exciting, because again, this goes to ROI: you could really say, if we're deploying these tools, not only can we say we're protecting privacy better, and message it, and be able to demonstrate

our compliance — which is the accountability principle under GDPR; you've got to demonstrate that you're compliant — but it also reduces downstream compliance costs, and the costs of lawyers, and the costs of the paper chasing, right? Pushing around the paper.
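As a concrete illustration of one of the PETs mentioned above, here is a minimal differential-privacy sketch in Python — a noisy counting query using the Laplace mechanism. The dataset, query, and epsilon value are illustrative assumptions, not anything from the conversation:

```python
import random

def dp_count(records, predicate, epsilon: float) -> float:
    """Differentially private count: true count plus Laplace(1/epsilon) noise.

    A counting query has sensitivity 1 (adding or removing one person changes
    the count by at most 1), so Laplace noise with scale 1/epsilon yields
    epsilon-differential privacy. The difference of two Exponential(epsilon)
    draws is distributed Laplace(0, 1/epsilon).
    """
    true_count = sum(1 for r in records if predicate(r))
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# Hypothetical usage: count users over 40 under a privacy budget of epsilon = 0.5.
users = [{"age": 34}, {"age": 51}, {"age": 47}]
print(dp_count(users, lambda r: r["age"] > 40, epsilon=0.5))  # true count 2, plus noise
```

The point, as in the conversation, is that the analyst still gets a useful aggregate while no individual record is exposed exactly.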

So even things like: if you tag your data upfront with good data governance practices — you tag it at the development stage, which is something that Privado enables — then you don't have to go search for your data later with data discovery tools. Instead you can just use search to pull up the data you're looking for and find the information around it. So it's really just about being thoughtful early on so that you can reduce risk,

reduce costs, increase privacy, increase your privacy posture.
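The tagging idea described here — labeling data at the development stage so you never need an after-the-fact discovery scan — can be sketched very simply. The field names and tag categories below are hypothetical, not from any real product's schema:

```python
# Hypothetical privacy tags attached to schema fields at design time.
PII = "pii"
SENSITIVE = "sensitive"

SCHEMA_TAGS = {
    "email": {PII},
    "ssn": {PII, SENSITIVE},
    "page_views": set(),  # non-personal telemetry, no privacy tag
}

def fields_tagged(tag: str) -> list:
    """Return the schema fields carrying a given privacy tag.

    Because tagging happened at development time, answering "where is
    our PII?" is a lookup, not a data discovery scan over live stores.
    """
    return sorted(name for name, tags in SCHEMA_TAGS.items() if tag in tags)

print(fields_tagged(PII))        # ['email', 'ssn']
print(fields_tagged(SENSITIVE))  # ['ssn']
```

Real tooling does this against code and data stores at scale, but the principle is the same: the metadata exists from day one, so downstream compliance questions become cheap queries.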

Tim Freestone (36:31.734)
Got it. Cool. Yeah, yeah. Well, I found it amazing that you were able to rattle off all of those subgenres of privacy.

Debra Farber (36:32.6)
Exciting stuff, right? I mean, I'm excited. I don't know if you could tell how jazzed I get about talking about privacy.

Debra Farber (36:42.664)
I mean, I had notes. I had notes here. I mean, I'm only like telling folks, because I do have ADHD and I want to make that normalized. And so, well, I always have for my own podcast, I have like my script of questions up there just to make sure I'm staying on topic. I wanted to do the same being a guest on your show.

Tim Freestone (37:01.838)
Nice. I started a thing and then I stopped the thing, but I wanna start the thing again, which is: I like to rattle off a few statements and have the guest just say agree or disagree. You can elaborate if you want, but you don't have to. Just kind of compelling, intriguing statements to kind of close out

Patrick Spencer (37:02.407)
Well that's three of us, but we're in trouble.

Tim Freestone (37:27.702)
the podcast. So I've got, I don't know, four or five, if you're ready. Just agree or disagree. All right. First one: most consumers willingly trade their privacy for convenience without understanding the true cost.

Tim Freestone (37:47.666)
Okay. Governments should have backdoor access to private encrypted communications to ensure national security. Okay, good. Anonymous data collection is a myth. Nearly all data can be de-anonymized with the right tools.

Patrick Spencer (37:58.663)
Ha ha.

Tim Freestone (38:08.214)
I knew that would be a tough one.

Debra Farber (38:08.912)
It's probably — it's an agree. I'm still doing my own research around that, and it tends to really support agree. If it's anonymous today, it still might not be anonymous tomorrow — that's basically the way I see it. You know, it's kind of like, if something's secure today, it might not be secure tomorrow. So with anonymity, it feels a little more like an arms race.
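A tiny sketch makes the re-identification point concrete: pseudonymized data that looks anonymous can be re-linked by anyone who holds the key material and a list of candidate identities. The salt and email addresses below are hypothetical examples:

```python
import hashlib

def pseudonymize(email: str, salt: str) -> str:
    """Salted hash pseudonym: stable, but reversible by dictionary attack
    for anyone who obtains the salt and can enumerate candidate inputs."""
    return hashlib.sha256((salt + email).encode()).hexdigest()[:12]

salt = "example-salt"  # hypothetical; in practice a closely held secret
token = pseudonymize("alice@example.com", salt)

# An attacker with the salt and a candidate list simply re-hashes and matches:
candidates = ["bob@example.com", "alice@example.com"]
relinked = [e for e in candidates if pseudonymize(e, salt) == token]
print(relinked)  # ['alice@example.com']
```

This is why pseudonymized data is still personal data under GDPR, and why "anonymous today" may not stay anonymous as auxiliary data and compute accumulate.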

Tim Freestone (38:18.547)
Yeah, okay.

Tim Freestone (38:36.978)
Oh, that's a good way to put it. Yeah. Um, so another one, social media platforms are more invested in user privacy than most people think.

Debra Farber (38:49.848)
I agree now, today. That wasn't always the case. However — I mean, the largest numbers of privacy engineers are at the social media companies, and that's because now they're trying to demonstrate that they can bring trustworthy systems to market. But...

Tim Freestone (38:54.602)
Yeah, because I got in trouble.

Patrick Spencer (38:56.795)
Ha ha ha.

Debra Farber (39:12.52)
I still think that a lot of their business model is privacy-violative. So even though they might have all this great privacy research and expertise, and bring other products to market, their ad-supported surveillance business approach is one giant surveillance machine. So...

Apparently we only kick and scream when it's TikTok in China, but not when it's, you know, billionaires in the United States.

Tim Freestone (39:42.492)
Yeah. Okay. So last one, we didn't talk about this at all, but I can't close the podcast without it. Artificial intelligence poses the biggest threat to personal privacy in the next decade.

Debra Farber (39:56.724)
Disagree. Honestly, I think the people bringing it to market have way more unchecked power.

Bringing it to market without having the appropriate safety measures and testing done is treating us, right now, as human guinea pigs. And we've never done that in any other area. We don't do that in healthcare and research; we don't do that in any other area of research. And I truly think the people screaming about how it's going to be, you know, the end of humanity, and it's going to, I don't know, violate privacy or whatever, are the ones that are actually going to bring that about. Because...

It's like, if you think this thing's gonna eat your face, don't go and put it on your face. You know, I don't understand. I'm so for AI — but bring it to market in a way that is ethical. And I think it's a ruse, a red herring, the way they're screaming about potential future annihilation from it. All AI is, is stuff that humans have programmed and trained. So...

Debra Farber (40:58.152)
It's a matter of what we're programming it and training it on. As for that statement: I think AI could be used in ways that are dangerous and bad for security and privacy — I mean, we're already seeing new security threats as a result of AI. So again, it's not some parade of horribles.

Patrick Spencer (41:15.299)
So these data privacy standards you talked about — if they're applied correctly, they would encompass AI.

Debra Farber (41:21.756)
If what was applied correctly?

Patrick Spencer (41:23.019)
If those data privacy standards we talked about are applied correctly, they would cover the AI risk, in addition to all the other risks.

Debra Farber (41:28.661)
It's true. It's true. But if they were applied correctly, then the business model that OpenAI came to market with wouldn't have happened, right? So it was really a disregard that I believe was on purpose. There's no way he didn't know — I mean, he was the freaking Y Combinator CEO. It's not like he hasn't heard about other privacy issues that have gone on. This was, in my opinion, just a decision: we want to get to market as fast as possible to win the AI tech race, and we'll do whatever it takes to get there. And they did.

Tim Freestone (41:36.543)
Wouldn't have happened yet.

Debra Farber (41:58.916)
And now it is kind of hard to put the genie back in the bottle — but they're the ones that did that, you know? So to scream, oh no, there are existential harms from AI that we need to prevent, takes, you know, a lot of chutzpah. You're the one that's actually causing the problems, right? So I am scared, but I'm not scared of AI as a technology per se. I'm scared of the unchecked powers that we're allowing to just

Debra Farber (42:29.024)
use us as guinea pigs as they're bringing this out to market.

Tim Freestone (42:30.642)
Yeah, that makes sense. That's an interesting perspective — I like it. I'm all good, Patrick. No, that's it. All right.

Debra Farber (42:34.86)
Thank you.

Patrick Spencer (42:39.747)
I was going to ask, Tim, do you have any more? I think I counted four — you missed one; usually you do five. So, Debra, this has been a fascinating conversation. For those in our audience who want to check out your podcast, where should they go?

Debra Farber (42:55.932)
Yeah, so you could go to wherever you get your podcasts and just type in the Shifting Privacy Left podcast. But you could also go to shif— I have to do a better job of keeping up that site, though. So.

Debra Farber (43:11.724)
I've had about 54 episodes out. Most of them are pretty technical, but it's for anybody that wants to learn more about privacy tech and privacy engineering. I myself am not an engineer, so it's really me, with the experience of having seen a lot over 18 years in privacy, asking more nuanced, hard questions — where the rubber meets the road, not the high-level

stuff, you know, or unpacking a new regulation, right? It's going to be something a little more technical. And I think anyone could benefit from that, but I always keep privacy engineers as the market I'm trying to serve — if I'm in service to that market, it will flourish, and so far that's been the case. So if you are somebody that would like to be a guest, reach out to me at Debra at

Shif— or, if you're interested in sponsoring, we are open to that as well. So, thank you.

Patrick Spencer (44:10.151)
That's great. And I know you're a strategic advisor for four, five, six different entities — we didn't talk about some of them. You can find those on Debra's LinkedIn profile; simply click on them to get more information, and you can engage with those organizations if you're interested in knowing more.

Debra Farber (44:27.368)
Yeah. Could you give me like a minute so that I could just list them? So Privado, which we talked about, is static code analysis for privacy — building privacy into products at scale. You've got Secuvy, which does data discovery, mapping, and rights management and fulfillment. They also do privacy posture management, because they're doing continuous monitoring of your data stores and internal privacy risk. And then Privaini, which is doing

Debra Farber (44:56.92)
privacy posture management externally. So it's different from, but think conceptually of, a BitSight or a SecurityScorecard — but for privacy, and with a lot more data, like a thousand data points they're looking at to determine your privacy posture and that of your entire business network. So it's great for managing third-party risk. And then PrivacyQuest, which is a gamified platform — the playground for privacy engineers.

Think CTF — capture-the-flag-style competitions — but for privacy engineering, really upskilling both lawyers and engineers to understand privacy engineering. So if you're interested in any of those, check them out as well.

Patrick Spencer (45:46.503)
That's great. Those are some really interesting endeavors, so I encourage our audience to check each of those out. And if you would like to watch other Kitecast episodes, you can go to slash Kitecast. Thanks for joining us, Debra — we appreciate your time today.

Debra Farber (46:02.996)
My pleasure, thanks for having me.
