Transcript

Alison Dean (00:10):
Theorem is the leading innovation and engineering firm for the Fortune 1000. We design, build and deliver enterprise-scale technology solutions and are very excited to present the Breakthrough Podcast, an ongoing series where we interview technology leaders to share their experiences and perspectives on what's next in tech. 

Alison Dean (00:37):              
Welcome to the Breakthrough, I'm Alison Dean, VP of operations at Theorem, and today we are talking with Kelly Huang, the queen of breaking down data silos and currently VP of product at Ethyca, a venture-backed, automated data privacy SaaS company. Kelly sent me this quote from Confucius, “Our greatest glory is not in never falling, but in rising every time we fall.” So hello Kelly. 

Kelly Huang (01:06):
Hello, Alison. Thank you for having me.

Alison Dean (01:09):
Of course, of course. What does that quote mean to you? 

Kelly Huang (01:12):
Honestly, in my personal and professional career, it's been really hard to, first off, have the personality I have in software. When I started off in software, I was doing engineering, then I moved from engineering into product before it was kind of en vogue to do so. Having the cards stacked against you, starting all over from the bottom, that's basically what I've had to do through software engineering and then through product management. I feel like I've finally gotten my rhythm, but things happen in software all the time, things don't always go as planned, and you've just got to pick yourself up and keep going.

Alison Dean (01:47):
Forge ahead.

Kelly Huang (01:48):
Forging forward, that's what it's about. 

Alison Dean (01:50):
All right. I want to take things back to your college years. Since you graduated with a biomedical engineering degree, I want to talk about how that led to your current career in product. 

Kelly Huang (02:03):
So I actually started in biomedical engineering because I thought that I wanted to go to med school. I thought I was really smart with a big brain. I wanted to go to med school, be a doctor, but I recognized that my real skill was solving problems, not necessarily practicing medicine. So I took the engineering part of that biomedical engineering degree and went to work in software, in hospitals. I was gathering requirements for electronic medical records, or EMRs. I would go to the floor, watch doctors and nurses and techs working really hard at what they do, and figure out how we could make the electronic medical record work for them rather than against them.

A lot of times when folks go from paper and make a real digital transformation into electronic medical records, they think the odds are stacked against them. So yeah, I went from collecting all the requirements on the floor, watching physicians, nurses, and techs work at what they do, to creating software as an extension of them, helping them document patient care, make sure they're providing the best possible patient care, and transition care to other providers as well.

Alison Dean (03:08):
You've been responsible for the product vision and monetization strategy for B2C and B2B digital products at multinational Fortune 100 companies. So what is the most crucial aspect of a company's IT strategy in your opinion? 

Kelly Huang (03:25):
I would say the most crucial aspect of a company's IT strategy is actually intentional planning: being intentional about deciding what to build, and looking down the road at potential needs for your company. Not everyone can see around corners; who can see into the future? But being intentional about what you're building today helps you make a plan for the future, and it truly future-proofs some aspects of the software you design and other IT decisions you make for your organization.

Alison Dean (04:02):
What does digital transformation mean to you? 

Kelly Huang (04:07):
From my time doing hospital EMRs, I got to see firsthand an entire hospital in Upstate New York go from all paper to fully digital. And now, today in 2021, you walk into a hospital and you have a doctor sitting at their computer, sometimes not making eye contact with you, but we take it for granted that hospitals are no longer on paper. In my career, I've seen it, and we actually moved them fully digital. And I would say that moving things digital is an art and a science.

One does not simply turn a business, an organization, an enterprise from all paper to a super modernized, hyper forward-thinking technology company. Truly, I think that folks have the wrong idea when they think that they want to modernize their tech stack or they want to transform their tech stack because they think it's just going to happen overnight. They can snap their fingers and all of a sudden they're digitally transformed, but they're not. It takes planning with intention and ensuring that you're doing the right thing for your business in the way that your business lives and operates today and then looking down the road a little bit to make sure that you're planning for tomorrow. 

Alison Dean (05:15):
So to piggyback on that, do you feel like that was the first large-scale project that you were involved with?

Kelly Huang (05:22):
Yeah. That was definitely the first of the large-scale projects that I worked on, for sure. 

Alison Dean (05:28):
What were the most intriguing aspects in your opinion of that EMR transformation project?

Kelly Huang (05:35):
Watching physicians, nurses, techs, MAs, everybody in their natural element, watching them deliver the best quality care that they could. This is the earliest and most primitive form of user research: just gathering, watching, observing, standing by as a bystander and watching them go through their normal workflow, then figuring out how automation could be an extension of them, how software could help them deliver the best patient care possible within their normal workflows, and thinking critically about what their needs might be next.

Because when you think about anticipating the needs of providers that are on the floor, treating patients every single day, you have to think about the known unknowns as opposed to the unknown unknowns. They're going to need a list of medications, right? They don't know that they need a list of medications, but you know that they might, in order to provide patients some follow-up care.

Being able to anticipate those needs gave me one of the biggest competitive edges, frankly, as a product leader: seeing live how people use a product, getting instant real-time feedback, being able to go back to my office and program a change instantly, and being able to go back the next day and see it all over again. Instant feedback loops. It was the OG agile methodology, but without a scrum team.

Alison Dean (06:57):
What were the big problems that you recall from that project? 

Kelly Huang (07:01):
There was no actual development team; it was basically a development team of one per hospital unit. I would say that a lot of the challenge we faced was that you were the decider, the developer, and the QA all in one. Wearing all of those hats honestly made it incredibly difficult. It was a 300-bed hospital. And when you think about the patient volume for a hospital like that, you have to think about the quality of care. These are real humans' lives that these people are taking care of, and you have to think about how your work actually impacts that. 

And being responsible for the full end-to-end solution gave me a very rude awakening about how quickly I could move, but it also helped me really learn teamwork. I had peers working on other ambulatory units or other hospital floors. And even though they weren't working directly on my section, I could consult with them and they could confer with me on best practices for implementing the medical units. 

Alison Dean (08:06):
So now let's shift to the most recent large-scale project that you worked on. What did that involve and what was most intriguing to you about that? 

Kelly Huang (08:15):
So when it comes to large-scale projects, I've worked on a lot. I worked at Activision Blizzard on the global accounts and e-commerce platform. We served almost 1 billion clear accounts worldwide. After that, I worked at Capital One. I was working on the developer exchange, which was the first open banking API platform in the country, basically providing services to the likes of Expensify, mint.com, Abacus, all these third-party products that want to know and understand your bank transactions so that they can help you make better financial decisions, or they can help your business run smoother.

But I will say my most recent large project was probably when I was consulting. We were working with a really large multinational company and they were looking for a full-on digital transformation for an entire business unit. We were very interestingly positioned to be a third party that came in and assessed the landscape of what their current book of business was and how they were running their existing operations. 

And honestly, it felt like my time at the hospital all over again, where we got the privilege to observe folks doing their jobs in their natural habitat, doing the best they could with the software they were currently operating with. These folks were doing an incredible job. They work hard, they come in, they work overtime, nights; these folks are hustling. 

But we were able to observe them, interview them, talk to them about their existing solutions and truly understand what their pain points were, and we realized that in order for them to do their best work, they needed their software to be an extension of them. Does this sound familiar, maybe? They needed their software to work with them and not against them. And what it took was understanding what those pain points were in the existing software solution, understanding that sometimes the data they need isn't at their fingertips, and sometimes the data they want is at their fingertips, they just don't know how to sort through it.

And the software had no opinion. And I use this term lightly, because in software sometimes you don't want your software to be too opinionated. But in their case, the software was exactly what you didn't want it to be: not opinionated at all. It served too much information and made it difficult to digest and comprehend, to find the information you need quickly, especially when time is of the essence. Frankly, it felt like Groundhog Day, because you get to watch people doing the best they can with the tools that they have. 

And it was honestly one of the most fulfilling digital transformation projects I've done, being able to modernize an entire business unit. Honestly, we didn't change the workflows, we just enhanced them with the new software we rolled out. And really it was a change in mentality, an acceptance of being able to rebuild and still be excited every step of the way. 

Alison Dean (10:57):
And so moving on from that project, what did you take away that you then applied into your role at Citizen?

Kelly Huang (11:07):
At Citizen, we were building from the ground up a marketplace for cancer patients to come to, to find clinical trials. Since my time at the company, they've actually pivoted to handling folks with not just cancer but also rare diseases. They were using a very underutilized part of the law called patient right of access. And that is the principle that you, as a patient of the American healthcare system, have the right to access your personal data from any place that you've received care. 

You are entitled to it. If a hospital has information on you, you deserve to know what information they have on you. It's a very interesting dichotomy, because patients don't know that they have this right, and we were exposing that to them. Because in the sea of legal jargon, patients that are the sickest of the sick don't want to go find a lawyer, seek counsel and figure out how they can access their data; they just want it. That's all they know. 

And in every digital transformation product, folks on the front lines want to do the best possible job they can. They know their job, they're so darn good at their job, that they just want to figure out how, using the tools they have, they can just do it without having to scour a manual, read instructions or hunt for data. It's about understanding those end-user needs, their never-ending quest for their data and the data they need, and surfacing that as a software solution, rather than doing a top-down thing of saying, "We need digital transformation," without being intentional as to why you need it.

Alison Dean (12:42):
So in your experience, what has made for the most effective cross-functional teams? Because you've certainly worked with many departments. So has there been a theme that you've seen that really makes for the most effective version of that?

Kelly Huang (12:58):
I will say high communication is probably one of the best enablers of working cross-functionally, but this is kind of just the theme of my talk: intent matters. There are certain things that you need to share on a consultative basis. So do you give the other team the firehose of things they maybe don't need to know? Be intentional about what you communicate. 

And one of the things, especially at Theorem, that I've carried with me through the years after Theorem is the concept of extreme ownership. It sounds so silly and crazy when you think about the two words alone, extreme ownership, but it's funny, the book Extreme Ownership actually kind of dispels that and makes the title more approachable, honestly, because it's not like, "Oh, gung-ho, yes, own everything." It's really about knowing and understanding where your borders are and owning the crap out of them. 

Because when you know and understand what your borders are, and you own them to the best that you can, other cross-functional teams level up as well. They're not just motivated to do the same; you're also broadcasting the work that you're doing to them, broadcasting communication, broadcasting quality. And I think that keeps cross-functional teams healthy and keeps a great collaborative spirit, especially on small teams. On large teams, in highly matrixed organizations, it also shows that business units are all leveling up the organization as a whole, and that, I think, is the most important thing in working cross-functionally, for sure. 

Alison Dean (14:42):
Okay. So Kevin Gray, who is the CIO for the city of Burbank, California, was recently on the podcast. 

Kelly Huang (14:50):
Cool. 

Alison Dean (14:51):
And he has this question for you. “What are the most critical skills and most critical knowledge needed to be successful in digital product management?” 

Kelly Huang (15:00):
Digital product management has a lot of definitions these days. It spans from creating a product strategy and a product vision, and selling and pitching that vision, all the way down to the tactical execution level of being a scrum product owner. I wear that hat sometimes; sometimes I put on the scrum master hat. 

So no matter what hat you wear, I will say the most critical part of the role is selling and pitching a vision: getting buy-in from your stakeholder, your customer, your peers, your engineering team, a client, whomever; yourself grokking what that vision and pitch is, distilling it down to the most comprehensible version, and then pitching it and truly getting buy-in. I would say that goes far beyond hard skills like knowing how to do mock-ups or write a great user story. If you can understand the vision and the bigger picture, you are going to be successful as a product manager. 

Alison Dean (16:00):
Good answer. You shared a stat in a UX presentation that you did for Neuron: 97% of Americans are asked to approve privacy policies, but only one in five adults say that they always or often read them before agreeing.

Kelly Huang (16:15):
Horrifying, isn't it? 

Alison Dean (16:16):
Yes. And also I think everyone's lying and they don't actually read it so.

Kelly Huang (16:20):
Same. 

Alison Dean (16:21):
Like who is that one? 

Kelly Huang (16:22):
It's a lawyer, it's definitely a lawyer. 

Alison Dean (16:23):
Yeah. It's all the lawyers of the world. So now that you're in the data privacy realm every day, I want you to explain data privacy for all the people, and why people should care about this if they don't already. 

Kelly Huang (16:41):
So what is data privacy? Let's start with the basics. And I do want to start here, because whenever people think and say the word privacy, they think of security. The two definitely go hand in hand; they are tightly related, but they are not the same. Security is basically what we hear about in the news a lot, which is, "Has this company breached my information?" We conflate security with privacy because what's being breached is personal information. 

Now that breach could have been just a number of likes, and that would still be a breach of information, and still a threat to cybersecurity for those companies. The way I think about it is that security provides the bars on windows and the locks on doors so nobody can get into your house or your apartment or your flat, whatever it is. Privacy, however, is putting curtains on those windows.

Or maybe I'm just very, very hyper-aware that someone's looking into my windows because I put those bars on them, and they think that looks funny, so I'm going to put a mask on; no one's going to know it's me. So in my mind, the difference between data privacy and security is that security truly prevents people from getting in, whereas privacy prevents people from knowing and understanding who you are as a human, as a person, as a digital citizen, an internet citizen. That's what data privacy is. So what was the second part of your question? 

Alison Dean (18:10):
Why should people care about this if they don't already?

Kelly Huang (18:13):
That's the million-dollar question, right? You go online, you whisper something near your iPhone, you Google Search something and all of a sudden that ad is following you around Google, the internet, Reddit, Instagram, TikTok, wherever you are, that word has been whispered into the ether and now it's just following you everywhere. Has anyone ever thought about how that happens? 

People are like, "Oh, my phone must be listening to me." Right? But no, it's much more low-level than that. Your phone isn't actually listening to you, although with certain microphone settings it might actually be, so, yeah. But data privacy matters because if you care about where your data goes, how it's being sold every single minute to data brokers, advertisers, you name it, then data privacy should matter to you.

And just knowing that your information is being sold, that advertisers know without even knowing who Alison Dean is, that they can target you for ads of people that look like you, have the same shopping and buying habits as you, live in the same area as you, that information, that audience building came from a whole bunch of other people that look like you feeding information to those advertisers so they could build those audiences to serve you ads. And guess what? Those businesses, they're paying for it. They're paying for it and they're buying that data so that they can build audiences for people that look like Alison Dean. And that's why data privacy should matter because it's not just that oh, I think these ads are annoying.

Alison Dean (19:45):
Right.

Kelly Huang (19:45):
With that privacy policy stat, without even reading it, you are signing away the rights for an online company to use your personal information: your name, phone number, sometimes date of birth, sometimes buying habits, whatever it is. You're allowing these online companies to have this information about you and allowing them to sell it to advertisers. 

And I'm not going to get all Andrew Yang or anything on this, but the fact is, some folks are demanding accountability for that. Whether it's accountability in cost, like I should be paid or otherwise compensated for selling my data to these third-party companies, or accountability via laws and regulations. And obviously, we have a number of legal and regulatory guidelines popping up all over the world: LGPD in Brazil, GDPR in the EU, CCPA in California. Virginia just passed a law, and Florida's got one on the docket now too. 

I mean, there are over a dozen states in the United States looking to pass these laws, because they also agree that consumer data privacy matters. And if these online, privately held companies aren't going to demand accountability from each other, then lawmakers feel like they have to intervene as well. 

Alison Dean (20:55):
Is there anything we as consumers can do, or is it just the hopes that the law is going to intervene and protect us? 

Kelly Huang (21:03):
Yes, there are things you can do. Demand your information; know and understand what these companies store on you and what they do with your data. Just be an informed consumer. People will do more research on buying a couch. Don't get me wrong, couches are expensive, I get it. But people will do more research buying a couch, a book, a water bottle, than they will before opening an online account and handing over their phone number, first name, last name, email address, physical address.

So all I ask of online citizens is to be informed and stay informed, because if you know and understand what a company is doing with your data, you can take action. Maybe that comes in the form of exercising your rights as a California state resident under the CCPA, or maybe it just comes from reading, or even skimming, the privacy policy. To me, it's all about being an informed consumer. And frankly, as I said before, tying it all back to healthcare: patients just want care.

They don't care where they get it, they don't care how they get it. They just want the best possible medical care they can get for whatever condition they're seeking care for. They don't want to sift through legal paperwork. They don't want to try to figure out their rights in their most vulnerable moments. That's not to say internet citizens don't deserve a similar right; internet citizens should be equally as informed, and it's a question of how we educate ourselves about it. 

Alison Dean (22:29):
It sounds like this is an opportunity for a new app, I'm just going to say, some kind of privacy policy CliffsNotes creator for you. 

Kelly Huang (22:39):
Auto summarizes the privacy policy. Right when you're about to press create an account it says, "Whoa, whoa, whoa, whoa, are you sure you want to create this account?"

Alison Dean (22:47):
Here are all the things that you need to know, here you go. One, they will sell all of your banking information. So it sounds like the moral of this story is people need to pay attention, perhaps treat their info a little bit more sacredly. Certainly, I definitely am someone that puts information in certain places and-

Kelly Huang (23:08):
Yeah, I mean, it's crazy, because if you buy a bad water bottle on Amazon, you can go and leave a nasty review, and maybe someone will read it and maybe they won't go buy that water bottle, right? But if a company breaches the information of 150 million users, what accountability can you demand from that? What recourse is there? You can't just go and leave a bad review about Facebook on Facebook. I mean, maybe you can, but who's going to quit Facebook because of that? Don't get me wrong, some people will, but it won't cost Facebook enough for them to take any meaningful action. It's unfortunate.

Alison Dean (23:40):
100%. And actually I think that this segues well into user trust center design. You had an example in one of your presentations, and I want to say it was Facebook. You could highlight certain things that popped up to give you some context around the information that they were asking for, but it wasn't great. Let's talk about a better example of user trust-centered design and why this matters for us as the internet users of the world. 

Kelly Huang (24:09):
And honestly, California is doing a great job, and the Data Protection Commission over in the EU is doing a great job, of advocating for this principle of privacy by design. Ann Cavoukian, an excellent academic in the privacy space, basically pioneered this concept of privacy by design: building user experiences around exposing, and being very transparent with the user about, what data you are collecting, and truly creating user trust in the systems that you've built. 

And so when I think about an account creation experience: they ask you for certain information, maybe first name, last name, email address, phone number. Do you ever wonder what they're going to do with your phone number? Why do you need my phone number if I'm going to give you my email address? Why do you need my phone number at all? Why do you need my name? Maybe the service I'm signing up for doesn't need to call me by my first name; no one's going to pick up the phone or write me a personal email calling me Kelly.

So I really think that user trust-centered design is the way forward, and regulators like the EU and California are demanding accountability from online companies, saying, "Hey, guess what? We're banning what are called UX dark patterns." And this is truly the future of data privacy. An intentionally misleading design effectively coerces or misleads the user through a flow where they might accidentally buy a product, because maybe the buy button is nice and bright and green, and the cancel button is just an outline that says cancel or no. Certain UX patterns erode user trust, and some people don't even recognize that. 

In fact, some people actually steer in the exact opposite direction, where they get hooked. The concept stems from a very deeply rooted gamification model. I also worked in games, and the in-game experience latches onto the serotonin centers in your brain, driving addiction and all sorts of gambling habits and stuff like that. Basically, some of these design patterns are training their users to say, "I always want to hit that green button, I always want to hit that green button."

And then when it's time to go another round, because you died in the game or you've run out of jewels or jelly beans or whatever, you press the green button and it spends an extra dollar to buy you 10 more lives or something. Those micro-conditioning user experiences are truly what's starting to erode user trust, because then you find yourself in a checkout flow pressing the same button and all of a sudden you're buying something, or you're just surfing the internet and all of a sudden you're giving your information away and you had no idea. These design patterns are about intentionally tricking and manipulating the user into doing what the company wants them to do, like buy or convert.

Alison Dean (27:01):
Right. Do you think that there's a company that's doing this well? 

Kelly Huang (27:05):
Everyone can be doing more, let's just put it that way. 

Alison Dean (27:10):
Word, okay. Privacy governance in the software development process, what are best practices in that realm? 

Kelly Huang (27:17):
We started this conversation talking about security and data privacy. And it's so interesting, because we've now seen best practices in cybersecurity start at day one. Whereas 10 years ago, if you thought about cybersecurity and about best practices for keeping malicious actors out of your systems, people weren't thinking, "Maybe we shouldn't keep passwords in our code." People were thinking, "Oh crap, someone is actively trying to break into our systems right now. What the heck do we do?"

So cybersecurity has taken an excellent stance in what the software development industry calls shifting left. What that means is taking cybersecurity, which 15 years ago was thought about only after software was released into the wild, after all the web application firewalls, subnets, and DMZs were up and all the protections were in place to say, "Hey, we won't let anyone in now," and moving it to the start.

They weren't previously designing their software for security breaches. But you know what? Now they are. Over the last 15 years, the cybersecurity industry has truly embraced building software with cybersecurity top of mind. And they've done a great job, especially with this latest batch of engineers we've been hiring; they've done a great job instilling the truths of why cybersecurity is so important in software engineering. 

And that resonates all the way through to when you actually deploy your software to production, when you have millions of users using your product. Those practices translate very well once it's been deployed. Now in data privacy, we haven't really gotten there yet, because data privacy right now is for the woke, for the folks at the cutting and bleeding edge of understanding where their data is and how it's being used. 

In my opinion, that's why decentralized currencies and decentralized models are coming up very strong now because people don't want everything centralized and staying in one place where one malicious actor could potentially access hundreds of millions of people's data. So data privacy needs to shift left. And building software with privacy in mind is where this industry needs to be moving. I am proud to say Ethyca, we are on a mission to build the trust infrastructure of the internet. 

And the only way you're going to do that is to build software with data privacy top of mind, because if you think about data privacy from the start, then after you deploy your software, your lawyers aren't trying to clean up after you. They're not trying to figure out how to reduce your risk once your software has already been released into the wild. And I'm not talking about sitting down and having a beer summit with your legal counsel, with your general counsel; I'm talking about truly and honestly creating a set of policies about how to access your customers' personally identifiable data: access policies about who can access that data, what applications can access that data, registering your applications. 

I can't even tell you how many companies in the world don't even know where all the PII of their customers lives. It's crazy. Personally identifiable information should be your company's most sacred asset, because if you don't hold it sacred, you lose the trust of your customers, and not just that, you get security breaches, right? Anyway, so think about involving your legal counsel at the beginning, and then, in implementation, about how you can build in gates and policies around accessing personally identifiable data all the way through the deployment life cycle, understanding your risk at deployment, understanding how much PII you're collecting and how much risk you're exposing yourself to. 
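The access policies Kelly describes, deciding up front which registered applications may touch which categories of PII, can be sketched as a simple declarative registry checked at the gate. This is a hypothetical illustration; the application names and data categories below are invented, not from Ethyca or any real system:

```python
# Hypothetical registry mapping each registered application to the
# categories of personally identifiable information it may access.
PII_ACCESS_POLICY = {
    "billing-service": {"name", "email", "payment_details"},
    "analytics-pipeline": {"postal_code"},  # coarse, low-risk data only
}

def check_access(app: str, categories) -> bool:
    """Allow the request only if the app is registered and every
    requested PII category is in its allow list."""
    allowed = PII_ACCESS_POLICY.get(app)
    if allowed is None:
        raise PermissionError(f"unregistered application: {app}")
    denied = set(categories) - allowed
    if denied:
        raise PermissionError(f"{app} may not access: {sorted(denied)}")
    return True
```

Checking requests against a registry like this before release, rather than after, is one concrete form of the "shift left" she describes for privacy.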

Alison Dean (30:46):
Right. And also to piggyback, I'd like to talk about my new favorite word of the day, pseudonymization.

Kelly Huang (30:54):
Pseudonymization. Yes, ma'am.

Alison Dean (30:56):
It just flows, doesn't it? I want you to talk about this as it relates to governance and how companies are doing this effectively if they're doing it effectively.

Kelly Huang (31:06):
Here's the dry stuff, right? You, Alison Dean, have the right to request that your data be deleted from an online company because you are a resident of California. So as a resident of California, you could go to any place that you shop online, given that more than 10% of their site traffic comes from residents of California. You should be able to go to their privacy policy and request that they delete your information: permanently, gone.

Kelly Huang (31:27):
Because you, as an informed internet citizen, don't want them to sell that data out the back door to advertisers to continue to monetize your personal information. You should be able to go there and exercise those rights. Now, how that company does that is a completely different question, because you basically have to take them at their word that they deleted it. You've just got to believe them, right?

There are obviously ways that you could find out. You could attempt to log back in, if you had an account with them, and see if they still have your information there. You would know if you're still receiving marketing emails from them, getting promotions and stuff: well, they've definitely got my information somewhere, right? But what most folks don't really understand about deleting information from a database or from a system is that sometimes it actually causes a lot of detriment to a very fragile ecosystem.

Kelly Huang (32:18):
So imagine this: you take all the plankton out of the ocean. The ecosystem that feeds on plankton, that relies on plankton, that has symbiotic relationships with plankton, there's a disturbance in that ecosystem, basically. Effectively, that's what most people's data ecosystems are like. You delete a row from a database, you delete information from an upstream application, or from downstream applications, a data warehouse, or an analytics process, and you could potentially be breaking the ecosystem that relies on that email address to take another action downstream, or on a date and timestamp that was just deleted and is now breaking some sort of recurring process.

Pseudonymization is a way that we can potentially avoid this kind of brute-force, delete-things, wipe-them-off-the-face-of-the-earth approach that most companies would generally take when someone wants to delete their data. Pseudonymization is effectively taking that personally identifiable information and masking it, setting it to some sort of fixed value. Like instead of saying Alison Dean, it could say Smalison Smean.

Kelly Huang (33:19):
No one would know that Smalison Smean is actually Alison Dean, because it could be Kelly Huang. I don't know, no one knows what the original values were. Additionally, they could take your email address and, with a cryptographically secure hash, hash it so it would no longer be comprehensible to anyone looking at the database in plain text.

And pseudonymization done this way is actually more helpful, because if you pseudonymize consistently across the database, then Smalison Smean is going to be found throughout the database, and those relationships to different parts of the system are no longer broken. I didn't just wipe Alison Dean off a row in a database, I instead made it Smalison Smean, so other parts of the database can still refer to those attributes.
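A minimal sketch of the consistent pseudonymization Kelly is describing, using a keyed hash (this is an illustration of the general technique, not Ethyca's implementation; the key, table shapes, and values are all hypothetical):

```python
import hashlib
import hmac

# Secret key kept outside the database; without it, the original
# values cannot be recovered from the pseudonyms. (Hypothetical
# illustration of the technique, not any vendor's implementation.)
SECRET_KEY = b"rotate-me-and-store-me-in-a-vault"

def pseudonymize(value: str) -> str:
    """Map a PII value to a stable, irreversible pseudonym.

    HMAC-SHA-256 keyed with a secret: the same input always yields
    the same output, so cross-table relationships keep working, but
    the plain text is no longer readable in the database.
    """
    return hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]

# Two "tables" that reference the same customer by email address.
users = [{"email": "alison@example.com", "name": "Alison Dean"}]
orders = [{"email": "alison@example.com", "order_id": 42}]

# Pseudonymize consistently in both places...
for row in users:
    row["email"] = pseudonymize(row["email"])
    row["name"] = pseudonymize(row["name"])
for row in orders:
    row["email"] = pseudonymize(row["email"])

# ...and the relationship survives: the order still joins to its
# user, even though no plain-text PII remains anywhere.
assert users[0]["email"] == orders[0]["email"]
```

Because the same input always maps to the same pseudonym, downstream systems that join on the email column keep working, which is exactly the referential integrity a raw row deletion would break.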

Alison Dean (34:06):
How come everyone isn't doing this? 

Kelly Huang (34:10):
Because it's hard Alison.

Alison Dean (34:12):
Sounds like the greatest idea ever Kelly and everyone needs to be doing it yesterday. 

Kelly Huang (34:17):
Yeah, it is crazy. You talk to a lot of data architects, and I talk to a lot of engineers every day at a lot of our customer sites, and they are thinking about it. These people are thinking about this problem. And privacy engineering is actually an up-and-coming discipline in the software engineering world. Everyone should be doing this.

And frankly, whether or not you have to abide by the law, this might actually be a great practice for any company whenever they need to safely delete data from their data ecosystem, because it maintains what we call referential integrity: all the other downstream systems can still find Smalison Smean, because they have also pseudonymized her row as Smalison Smean. But doing that consistently, in a replicable process, is hard.

Alison Dean (35:01):
I just had another app idea, so we'll have to talk about that after.

Kelly Huang (35:05):
I can't wait, I can't wait. 

Alison Dean (35:08):
So I want to talk leadership now and I want to ask you, what are the biggest lessons that you've learned from being a leader in technology? 

Kelly Huang (35:17):
I would say for me personally, I've had a pretty untraditional path up through technology. I started, as I said, in biomedical engineering working in hospitals, and then I just decided I wanted to try software on for size. And something I've never believed in, just from seeing other leaders and organizations that I've been in, is this kind of top-down, thou-shalt or thou-shalt-not leadership.

I really believe what it means to lead is what it means to serve. And frankly, when you lead, you don't feel like you're leading, you feel like you're working in service to everybody else. And that's what I love about what I do. I'm not even going to lie, I don't feel like I'm leading anything. I feel like I am working in service to the folks that work at my company, the folks that I interact with on a daily basis, my customers, and all the internet citizens of the world right now. Because it doesn't take a self-proclaimed leader to actually lead; it takes someone that wants to work in service to everybody else to be a leader.

Alison Dean (36:18):
I love that. So it sounds like you have a servant leader approach. Has that been a thread throughout your career that you've noticed or has your style evolved through the years? 

Kelly Huang (36:32):
I would say that's probably the most consistent thread, from my early days in school through my career. Working through school, also being very active in organizations, you learn that stuff very quickly: it's not about telling people what to do all the time, it's about working in service to them, so that naturally people will want to follow, and people will naturally want to give, because they see you giving 110%, staying up till 11:00 to make signs for the next day, or working for a charity and volunteering and still hustling your ass off.

They see that and naturally they want to do that as well. It's not some weird culty type of thing, it's really just the nature of human belonging, in my opinion. I would say that's the most consistent trait I've had over my career, and honestly, I'm proud of that; it never changed. But product management hard skills are the things that have evolved for me the most. It's hard. I mean, product management is changing every single day.

I personally believe that product management is one of the hardest disciplines, because you have to know how to speak so many different languages: the language of business, the language of engineering, the language of design, the language of pitching. You have to know all of those, because you work in service to all of these other organizations. And truly, that's why I believe my success in product has really been based on being in service to everybody else.

Alison Dean (37:57):
Got to love it Kelly. All right. How do you encourage innovation within your teams?
 
Kelly Huang (38:02):
Generally speaking, I like to foster a blameless culture amongst teams. I've not had to be directly responsible for HR-managing the engineering team or anything, but you get the most out of engineering teams, software development teams, business teams, and marketing teams when you give everybody the freedom to think outside of the box. And the only way for people to go there is if they know that they can. I've been at big companies, I've been at super small companies.

It's kind of a punch in the gut to be told, "Stay in your lane, do only this one thing. If you get out of that, I'm going to wag my finger at you and write you up in a bad performance review." It's really easy to fall into the stay-in-your-lane trap once you realize there are repercussions if you don't. So give people the freedom to explore other opportunities and other ideas. In the world of software development, PoCs, proofs of concept, are a really great way to do discovery on an organic idea that's come up. The same goes for discovery projects.

Discovery projects are a great way to understand how you can evolve your business processes or your company. They're a way to understand where your opportunities are to grow, as a company, a leadership team, a management team, a software development team, or a multinational enterprise. Doing those discoveries, and maybe investing some time and energy in third-party agencies to do them, means you can get a completely objective outside opinion on how to better optimize and innovate as a company.

Alison Dean (39:35):
What do you want your direct reports to remember you for? 

Kelly Huang (39:41):
Ooh, all my direct reports already remember me for my air horn. I'm looking at you, Theorem software engineering team, who all remember me for my air horn. But more seriously, I want my direct reports to remember me for the things that I taught them. Honestly, I think that's the greatest gift you can give anyone: teaching a new skill, whether consciously or unconsciously, other folks learning from you. And I just hope that all of my directs someday will say that they learned something from me.

Alison Dean (40:18):
I love that. What are the most important lessons that you've learned from your mentors? 

Kelly Huang (40:23):
Gosh, so many things. I truly attribute everything I know today to some excellent mentors. It's a very interesting time to be a woman in tech right now, I will say, and also a very interesting time to be a BIPOC in tech. I stand on the shoulders of giants. The folks that I've learned from, men, women, purple, green, snail, whatever, everyone has taught me that in order to succeed in this industry, you need to be true to who you are, be a delight to work with, and just be yourself. And I'll never not do that, because they taught me that you can succeed doing that.

Alison Dean (41:01):
I agree with all of them. Okay, couple more things. I want to know what future innovations you're excited about. 

Kelly Huang (41:10):
I don't know if you've been following this, but with the most recent Apple updates, they are banning apps from their App Store that they believe have, as we were talking about before, misleading consent design patterns. And again, privately held software companies are going mostly unregulated insofar as being able to be the judge and the jury of what's fair for consent. But I think it's really interesting that with the recent update, they're saying you can't have apps with certain misleading consent patterns on their App Store.

Kelly Huang (41:42):
Additionally, with the latest iOS update, they're now blocking certain personal information from being used in apps. This is obviously detrimental to folks like Facebook, who rely on that information to drive ad revenue for the company. Now Apple and Facebook are at war. Don't get me wrong, seeing Zuck sweat is a very interesting thing to me, but what's even more interesting is what the future of ad tech will bring: the advertisers, and the technology behind the ads. I'm talking hands to keyboard, coding, and true innovation there.

They're going to have to get real creative real fast, because as more government agencies demand accountability from these third-party and privately held internet companies, man, it's going to be a hard battle to pry that personal data out of the hands of these internet companies, and I am very interested to see where ad tech goes after this. It's spicy, it's spicy. I love it.

Alison Dean (42:41):
Can you talk about a breakthrough that you've had recently? 

Kelly Huang (42:45):
Wow. Yeah, I will say, in a similar vein to what we were talking about earlier, I think that data privacy is finally coming to the mainstream, and that we are on the rising tide of a time in internet history, if you will, and in civilization, where people will begin to care about their data, and future generations ought to begin to care about their data. The major breakthrough for me was that recently, we started talking to very large multinational companies as our customers.

Kelly Huang (43:19):
I never would have thought I'd see the day where some of these very large, billion-plus-user platforms are coming to us and asking for help. People are definitely becoming aware. And as people become more aware, I think that's going to pose challenges, not just for the current generation of people building the internet, but for the next generation that will build it.

Because folks coming up in Gen Z and below don't know what life is like when you don't give your information away, when data privacy still has its sanctity. So I think a real challenge is going to be convincing some of these newer generations that data privacy could be a problem for them, though maybe there won't be any convincing needed, given that some of these companies are now making privacy a feature of their own platforms.

Alison Dean (44:12):
Do you have any final thoughts?

Kelly Huang (44:14):
At what length does a chicken finger become a chicken tender? That's what I would really like to know. No, I mean, I've talked a lot about data privacy today because it's what I think a lot about. And frankly, the future of data privacy, it's scary and exciting all at the same time. I just think it's so funny that we're all very concerned about these problems that we know about, but we're not really concerned about the things that we don't know about.

Data privacy is a problem that we know about, we're just not concerned about it. So if there's one thing I hope folks will take away from this, it's this: be informed, be an informed internet user, care about your customers, and care about data privacy. Be a good neighbor, and follow the golden rule: just as you treat others how you yourself want to be treated, treat your customers’ data how you would want your data to be treated. You don't want that data floating around anywhere, and neither do your customers. That's what I would part with.

Alison Dean (45:10):
It's so simple, right? 

Kelly Huang (45:12):
So simple. 

Alison Dean (45:13):
Thank you, Kelly. It's good having you with us.

Kelly Huang (45:15):
So good to be here. 

Alison Dean (45:16):
As I predicted, that was a very thought-provoking conversation. So thank you, thank you. 

Kelly Huang (45:21):
Thank you. 

Alison Dean (45:22):
Thank you for tuning into the Breakthrough, brought to you by Theorem. Make sure to hit that subscribe button and leave us a comment. You can find us wherever you listen to podcasts. And for more great content, follow us on Twitter and Instagram at breakthrupod, that's break T-H-R-U-P-O-D. I'm your host, Alison Dean, until next week.


About the Guest

Kelly Huang

VP of Product

Ethyca

Kelly is the VP of Product at Ethyca, a venture-backed startup building the trust infrastructure of the internet. Ethyca’s platform powers data privacy for businesses all over the world facing consumer privacy regulations such as GDPR and CCPA. She's also an alumna of Theorem, with over 12 years of software development and product strategy experience in fintech, ecommerce, triple-A game development, and healthcare IT. She has been responsible for the product vision and monetization strategy for B2C and B2B digital products at multinational Fortune 100 companies. Kelly is passionate about building great products with responsible intent and is an amateur pizza connoisseur.
