Larry Cashdollar, senior security response engineer at Akamai, talks about the craziest stories he's faced, reporting CVEs since 1994.
Larry Cashdollar, senior security response engineer at Akamai, has been finding CVEs since the 1990s, around the time MITRE's CVE list was first being established. Since then, he's found 305 CVEs, along with various other security discoveries, such as an IoT-bricking malware called Silex and cybercriminals targeting poorly secured Docker images.
Cashdollar shares his craziest bug-finding stories, including his first flaw (CVE-1999-0765), found while he was working as a Unix systems administrator, which accidentally threw a wrench into a demo for a Navy admiral at a shipyard that builds Aegis-class destroyers.
Beyond his own personal stories, Cashdollar shares the top pieces of advice he would impart to today's security researchers and those hunting for vulnerabilities. Listen to more on the Threatpost podcast.
For the full podcast, listen below or download here.
Below is a lightly edited transcript of the podcast.
Lindsey O'Donnell-Welch: This is Lindsey O'Donnell-Welch, and welcome back to the Threatpost Podcast. I am joined today by Larry Cashdollar, who is the senior security intelligence response engineer at Akamai. Larry has been conducting security research and finding vulnerabilities since 1994, so he can really give a sense of what has changed in the industry in terms of finding and reporting bugs, as well as the threat landscape. So Larry, thank you so much for joining me today. How are you doing?
Larry Cashdollar: Good. How are you?
LO: I’m good. Good. I know we were just talking about this. But we’re getting some strange weather here in the northeast, very warm for fall.
LC: Yeah, it's been wacky.
LO: Definitely. Well, so Larry, just to start, can you tell us a little bit about yourself and how you first got into the security space?
LC: So I was studying computer science at the University of Southern Maine back in the 1993 timeframe. And I had a friend who was in the Linux users group with me back then, and he told me that this company was hiring what they called at the time "internet analysts" to work on security stuff. And I'm like, okay, I could work there part time, make some money. And the company I joined was a small consulting company in Portland, Maine. This company did security for a couple of companies in southern Maine, but also a large bank that was out of Manhattan. And what we did was build firewalls, or what we called bastion hosts back then. So we would handle these firewalls, and we would put in rules to allow certain services, like POP mail and sendmail and web browsing, things like that, while keeping the company secure. We built these systems to keep these companies connected to the internet, but also keep them secure. And that's where I first really sank my teeth into the security industry.
LO: Was that where you were consulting when you discovered your first CVE back then?
LC: No, actually, I had left that company to join Computer Sciences Corporation. Computer Sciences Corporation is a consulting firm that is subcontracted by large corporations to handle their IT infrastructure. And the company was subcontracted by Bath Iron Works, which is a shipbuilding company up in Bath, Maine. They build Aegis destroyers. It's a pretty large facility. And when I was hired on there, the group that I worked for was in charge of 3,000 Unix systems. And these Unix systems varied from SGI IRIX to Solaris to AIX to HP-UX. So I had this enormous playground of servers and various Unix flavors that I could play with. So that's where I really started honing my skills on security vulnerabilities.
Actually, there's a funny story where, on my first day on the job, my new manager took me to the IRIX, or the SGI, computer room. This room was a locked room that had a bunch of SGI Indigo boxes. So imagine these purple SGI boxes with the 3D creator cards in them, and this room was used for CAD drawings that they were developing at the time for a new submarine system. And my manager sat down at one of the consoles, and he said, "In a couple of years, once you've proved yourself, I'll give you a login on these systems." And me, you know, coming from a security company, I knew that SGI IRIX, by default, had no password on the lp user account. So you could literally walk up to any IRIX system that hadn't been secured, type in lp, hit Enter, and it would log you into the desktop windowing system. So I strolled up to one of the SGI systems, typed in lp, hit Enter and said, "Thanks, I don't need a login. I already got one." My manager swung around on his chair and he goes, "How did you do that?" And I said, "I know IRIX systems have an lp account that has no password by default on all the systems." And he looked at me and he says, "Would you be willing to do security for us?" And I said, "I was hoping you'd say that." So from then on, I was the unofficial penetration tester for this network that was literally untouched by any security person, I think ever. There were so many systems that were misconfigured, or configured insecurely, or just wide open. And I really honed my skills on this network, breaking into all sorts of systems that I would have never had the opportunity to break into if I had been on a smaller network. And that's where I discovered my first vulnerability.
So another story is that we had this SGI Onyx2, and for folks who don't know, an SGI Onyx2 back then was this $250,000-to-half-million-dollar computer system, depending on how you configured it, that was about the size of a refrigerator. This system was used for 3D modeling, rendering 3D images, and it was sort of the "creme de la creme" of computer graphics back then, right. And this SGI sat in its own private data center that was locked; you had to enter a PIN code to get into the data center, and only a select few Unix administrators had it. I didn't have it, my team didn't have it, it was only the guys who were like Unix admin level three, and I was just a level one. And these guys on this other team would always taunt us about, "Oh, maybe someday you'll get access to the SGI Onyx2, and maybe we'll give you a login and the key code for the door." And it always sort of aggravated me. I'm like, I know that I can log into that system with the lp account, but I really want to get root on that system to show that it's owned; once you have administrative access to a system, in hacker speak, you've "owned" the system. That's what I wanted to do to this thing. So I thought to myself, well, how can I get root on this system? I need another system that's just like it, where I can examine the operating system and look for vulnerabilities. And I realized I had access to an IRIX system; in my office, I had one nearby that I could tinker with. So I logged into it, and I'm like, if I'm looking to elevate my privileges from lp, I should look at setuid-root binaries. And for folks who don't know, a setuid-root binary is a binary that has a bit set on it so that it executes as root when a normal user runs it. So it executes with root privileges. Back then, this was pretty sketchy stuff, to have a binary just running around with root privileges. This was the first thing that you would look at to abuse on a system that you were trying to hack. This was the starting point for looking at ways to get higher access as an administrator.
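For readers who haven't gone hunting for these themselves, the enumeration step Cashdollar describes, finding programs that are owned by root and carry the setuid bit, can be sketched in a few lines. The snippet below is purely illustrative (the paths and output format are assumptions, not anything from the interview), but it is the same idea as walking /usr/sbin by hand on that IRIX box.

```python
#!/usr/bin/env python3
"""Minimal sketch: list root-owned setuid binaries under a directory.

Illustrative only -- this mirrors the classic first step described in
the interview (looking for setuid-root programs that might be abused),
but the default path and output format are assumptions.
"""
import os
import stat
import sys

def find_setuid_root(root_dir="/usr/sbin"):
    """Yield paths of regular files owned by root (uid 0) with the setuid bit set."""
    for dirpath, _dirnames, filenames in os.walk(root_dir):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.lstat(path)
            except OSError:
                continue  # unreadable or vanished; skip it
            is_regular = stat.S_ISREG(st.st_mode)
            is_setuid = bool(st.st_mode & stat.S_ISUID)
            if is_regular and is_setuid and st.st_uid == 0:
                yield path

if __name__ == "__main__":
    target = sys.argv[1] if len(sys.argv) > 1 else "/usr/sbin"
    for binary in find_setuid_root(target):
        print(binary)
```

On a modern Linux system the rough one-liner equivalent is find / -perm -4000 -user root; a manual sweep of /usr/sbin like this is how midikeys turned up in the story that follows.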
LO: Right. Wasn't abusing setuid binaries a pretty big security trend at the time, too?
LC: Yes, yeah. And so I started looking in /usr/sbin, which is where a lot of these binaries lived. And I noticed this binary called midikeys. And I'm like, what is midikeys? What can this thing be? So I ran it, and on my windowing system, this little keyboard popped up. It was like a little piano keyboard, and if you clicked the keys, it played a little tune; each key was a different note. So I'm like, oh, this is kind of neat. I don't see who would spend their time making little songs on this, but it's a neat little thing, right? And I'm like, you know, if this thing's
running as root, what can it do? As root, it looks like it can read and write files. So can I write to files owned by root? So I opened up /etc/passwd, added an account for myself and saved it. And sure enough, it was able to write to /etc/passwd. So I thought to myself, well, here's a way I can get root on the Onyx2. So I log into the Onyx2, I forward an X window back to my desktop, and I launch midikeys, and sure enough midikeys is on there. It's setuid root, and I edit /etc/passwd, and I add an entry, Larry. I enter a user ID of zero and a group ID of zero, I leave the password blank, and then I save it. And I'm like, cool. I log back in as Larry. I'm in, I have a root prompt. I'm like, cool, I've got root on the Onyx2. So then I'm like, well, I should set a password on my account, so that it's not a password-less account. So I go to change the password. I type the passwd command, and I suddenly realize my account's user ID is zero; I'm changing the password for the root account, not for the Larry account. So I'm like, shoot, I've got to back out. So I hit Ctrl-D, Ctrl-D, and what happens is the SGI, or the IRIX, system saves the root password as Ctrl-D. So I had just changed the root password on this Onyx2 to Ctrl-D. And at this point I'm like, oh no, I just messed up the system, I need to tell the administrator of the system that, by the way, the password is now Ctrl-D. And I was afraid of the system administrator. I'm a small guy, I'm five foot six, and at the time I was probably 125 pounds, and the sysadmin, I think his name was Dave, didn't like me very much, because I was this security dork who was running around trying to make systems more secure, and it was a big hassle. So I'm like, okay, I'm going to have to face this guy. But instead, I talked to my friend Donovan. Donovan is six foot three, six foot four; he's a big guy. And I'm like, Donovan, can you do me a favor and go tell Dave that I changed the root password on the Onyx2 to Ctrl-D, and that I messed it up? He's like, sure, I'll be right back. And I wait, and Donovan comes back a few minutes later. He's like, dude, they were so pissed. I was like, what happened? He said, you changed the password at the exact same time that they were about to log in and do a 3D demo for the U.S. Navy. And he was like, they couldn't log into the system, and they had to scrub the demo.
And I'm like, are you kidding me? He's like, no, and there was a Navy admiral in there, there was all this top brass from the Navy, there were senior executives from Bath Iron Works, and he's like, they couldn't log in. And I'm like, oh, come on. This is really bad, I'm going to lose my job. I've been on this job for maybe a month and I screwed it up. So I'm sitting there, and I'm like, all right, I should start thinking about how I'm going to leave and pack up my stuff, and I don't have much stuff here, so it should be easy. And half an hour later, my manager comes down. He's like, Larry, we need to talk. And I'm like, crud. He's like, let's go to my manager's office. And I'm like, oh, man. So I go to his manager's office, and I go in, and I see this SANS security poster on the wall. And I'm like, cool, this guy knows about security; he follows security stuff. And he and I sit down. I'm like, hi, and he's like, I heard you messed up the Onyx2. And I'm like, yeah. And he's like, well, I get security. You understand that security is important. But he said, from now on, I'm going to need you to notify the system administrator that you're going to do a penetration test, or that you're testing the system, and you need to get permission from them on when it's a good time to test it. And I'm like, okay, I can do that. And he's like, then I want you to document everything you find, and I want you to send it to me in a report, so that I can make sure the things that you found are documented, that we know about them, and that I can task people to fix them. And I'm like, great.
So I walked out of there, and I let out a sigh of relief, like, all right, I'm not going to get fired. And it also gave me a sort of legitimacy: now that my boss and my boss's boss knew that I do this sort of thing regularly, and now that I had a process for it, it gave me more standing to say, "Hey, your system's being audited next week, I'm going to be doing it at this time. Is it available for testing, or do you have another system I can test on?" And it was actually a win for me.
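A quick aside on why the midikeys trick above worked at all: on a Unix system, "root" is simply whatever account has user ID 0, so any /etc/passwd entry with a UID of 0 and an empty password field is a passwordless root login, whatever the username says. A small, hypothetical audit script (not from the interview) makes the point:

```python
#!/usr/bin/env python3
"""Tiny /etc/passwd audit: flag any account with UID 0 besides 'root'.

Illustrative sketch only. An entry along the lines of
    larry::0:0:Larry:/:/bin/sh
(the kind of line described in the interview, written via the setuid
midikeys editor) is a passwordless root account regardless of its name.
"""

def uid_zero_accounts(passwd_path="/etc/passwd"):
    """Return (username, uid) pairs for every UID-0 entry in the file."""
    hits = []
    with open(passwd_path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            fields = line.split(":")  # name:password:UID:GID:GECOS:home:shell
            if len(fields) >= 3 and fields[2] == "0":
                hits.append((fields[0], fields[2]))
    return hits

if __name__ == "__main__":
    for name, uid in uid_zero_accounts():
        note = "" if name == "root" else "  <-- unexpected UID-0 account"
        print(f"{name} (uid {uid}){note}")
```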
LO: Right.
LC: And I went on from there; I can't tell you how many systems I broke into, exploiting, you know, shell scripts that these guys were creating in /tmp to elevate my privileges to root. I did all sorts of fun things back then. It was just a huge playground of hacking, and I was the only one in it, so it was a lot of fun. That was the first vulnerability, and back then MITRE was
sort of this mysterious entity to me; I didn't really know who they were. They just sort of assigned CVE IDs to worthy vulnerabilities. So I didn't even ask them to assign a CVE ID, I just dropped the details of the vulnerability on Bugtraq. And then I found out later that SGI had issued a fix for it, and that MITRE had issued a CVE ID for it. And I was so excited that I had my own CVE, that I had contributed to the hacking community by finding a vulnerability, a zero-day that nobody knew about. And I thought to myself, you know, if I ever find 10 of these, I'm going to make myself a T-shirt. And here I am years later: I never made myself a T-shirt, but I'm way past 10 now.
LO: You've got to make that T-shirt. I'm sure the landscape was different back then, but going into that office expecting to get a slap on the wrist, and instead being met with more legitimacy, as you say, I really like that story. And you were talking a little about reporting CVEs to MITRE, and I know the whole concept of CVEs was introduced in the 1990s to really identify these public software flaws, especially as they were starting to increase in number and frequency. But now this system is really kind of an essential foundation for managing and patching these flaws. How have you seen that landscape shift over the years, since you discovered that very first flaw, or very first CVE?
LC: When I was first a sysadmin, or an internet analyst, we would make sure that our systems were patched before we sent them out. And you just knew that if there was a buffer overflow for sendmail, there was a patch for that sendmail code somewhere that you needed to apply. And I remember downloading patches from the internet to apply to my sendmail source tree, looking at each patch to make sure it didn't do anything weird or nefarious, then applying each patch and calling it good. But you weren't able to identify which patch addressed which specific vulnerability. There could be two buffer overflows in sendmail, on two different variables, but you didn't know, unless you looked at the patch, which one applied to which; you'd have to figure out which patch applied to which variable, and whether it was the right one for that version of the source code. With CVE, you could say, oh, this patch addresses this CVE, and this CVE and this CVE. You had a unique identifier for that specific vulnerability, and it just made things a lot easier. So it was a nice thing to have once it was developed, and you kind of thought, well, why didn't this happen earlier? But it's just an evolution, I guess, of developing practices for the security industry.
LO: Right, right. And kind of following up on that point, have you also seen the whole concept of responsible disclosure change, and what that means from your standpoint?
LC: Yeah, I wasn't very good at responsible disclosure until about 2012, 2013. I would just literally find a vulnerability and throw it on Bugtraq. And I finally learned my lesson. It was December of, I think, 2012. I was sent to Manhattan to take a class on Centrify. Centrify is a software product that unifies desktop logins across Linux and Unix systems, and it ties them to, I think it was either LDAP or AD, where you could use the credentials from an LDAP server or an AD server. So I was sent down to Manhattan to take the Centrify class, and I remember sitting in class, and I'm not that interested in desktops; I like servers. So I'm like, this isn't that interesting, it's pretty much just telling me how to click through and set this up. Does this software have any vulnerabilities in it? So I'm like, I'm going to start poking around as I install the software and run it, and watch simple things: does it create new accounts in /etc/passwd, does it do any weird stuff in /tmp? And I'm looking around and I see that one of the products is creating these temp files, and I'm like, these temp files are being written insecurely. If I write this file first, I own the file, so I can add to it. And look, the software is executing the shell script that it created in /tmp as root. So if I can add a command to that temp file before it creates the temp file, I'll get it to execute commands as root as a local user. So I spent my time in class documenting this vulnerability and writing an exploit for it, instead of actually paying attention to the class. And I remember I had to smuggle my source code out of the class, because the class didn't have any network connectivity; it was a local network, there was no internet. I remember they gave everyone a branded Centrify USB stick; I think it was 16 or 32 gigabytes, which was pretty big for that time. I was pretty impressed with the size of the USB stick. I remember copying my source code and my advisory to it, going back to the hotel that night, typing it up and dumping it on Bugtraq. And then I just went to bed, and the next morning I got back to class. And once everybody settles in, the teacher is like, "I hope everybody had a good evening last night. You know, folks got dinner and had a relaxing evening at your hotel." And he's like, "I know Larry had a good evening last night." I'm like, what? And he's like, "Is there something you want to tell the class, Larry?" And I'm like, no, I don't think I have something to tell. And he's like, I think you do. And he's like, "Weren't you up late last night, doing some hacking and putting things on Bugtraq?" And I'm like, oh, that. Oh, yeah. He's like, "Would you care to come to the front of the class, take this dry-erase marker, and tell us what you did last night and why it's significant?" And I'm like, okay. So I walk up to the whiteboard, and I start explaining to these guys about writing files in /tmp, time of creation versus time of use, and race conditions, and half the class didn't know what the hell I was talking about. The other half of the class was like, "Wow, that's really cool. I never thought of that.
That's neat." And then the teacher said, "Okay, so last night, Larry found a security vulnerability in Centrify. And rather than tell me about it, he threw it on Bugtraq." And he's like, "Larry, I work for a security company. We're all on Bugtraq. I wish you would have told me about this first, so that I could have told my bosses about it, rather than all of us reading about it last night and this morning." And I'm like, "I'm so sorry. I really didn't realize how much of an impact this would have." And I said, "It was pretty stupid of me not to think that you guys would be on Bugtraq, because you are a security company."
So I finished the class, and I got home, I think, a day or two later, and my cell phone rings. And I'm like, gee, why is my cell phone ringing with this number? And I pick it up, and it's a product manager at Centrify; I'm guessing I put my cell phone number on the class signup sheet, because that's how he got it. And he's like, Larry, we really appreciate what you found. But next time, could you just give us a call to let us know what you found, instead of making it public? He's like, I had to call in my entire software development team over the weekend. We're a week away from Christmas. They're not happy. They worked all weekend to fix this vulnerability that you found, because we had to get a patch out immediately, because you put this exploit out in public.
LO: Right.
LC: He's like, I had customers calling me, asking what the fix was and if we knew about it. He's like, it was a nightmare. And he's like, just call me, just let me know if you find anything again. And I said, I'm so sorry, sir. I'm an idiot. That was really stupid of me. I didn't realize, or I didn't think about, how this could impact software developers and how they would be yanked into work on this. And I said, I'll never do this again. So I learned my lesson. From then on, I give software vendors two weeks to four weeks to respond, and if they don't respond, then I'll make a public disclosure of it. But for the most part, I think 99 percent of the vendors I've ever approached have been really great and supplied me with a patch. So it was a learning experience, for sure.
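The Centrify bug Cashdollar describes above is the classic insecure-temp-file race, often called a TOCTOU (time-of-check to time-of-use) bug: a privileged process writes a script to a predictable path in the world-writable /tmp directory and then executes it, so a local user who plants or appends to that file first gets their own commands run as root. The sketch below is hypothetical (the file names and commands are invented, not Centrify's), but it contrasts the unsafe pattern with the usual mitigation.

```python
#!/usr/bin/env python3
"""Sketch of an insecure-temp-file race and the usual mitigation.

Hypothetical example only; file names and commands are invented
and do not reflect any real product's code.
"""
import os
import subprocess
import tempfile

def unsafe_run_as_root():
    # ANTI-PATTERN: predictable path in a world-writable directory.
    # A local attacker who creates /tmp/setup-task.sh first (or appends
    # to it between write and execute) gets their commands run as root.
    script = "/tmp/setup-task.sh"
    with open(script, "w") as fh:
        fh.write("#!/bin/sh\necho doing privileged setup\n")
    os.chmod(script, 0o755)
    subprocess.run(["/bin/sh", script], check=True)

def safer_run_as_root():
    # Mitigation: unpredictable name, created with O_EXCL and mode 0600,
    # so an attacker-planted file cannot be reused or modified.
    fd, script = tempfile.mkstemp(suffix=".sh")
    try:
        with os.fdopen(fd, "w") as fh:
            fh.write("#!/bin/sh\necho doing privileged setup\n")
        subprocess.run(["/bin/sh", script], check=True)
    finally:
        os.unlink(script)

if __name__ == "__main__":
    safer_run_as_root()
```

The key differences are an unpredictable file name, exclusive creation so a pre-planted file cannot be reused, and restrictive permissions, which together close the window between writing the script and executing it as root.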
LO: Yeah, that's definitely a lesson learned. But that's definitely positive news, though, that it sounds like 99 percent, as you said, are mostly engaging and trying to push out a patch. That's always good. Before we wrap up, I also want to ask you about the most unique story you have about finding a CVE. Over the years, what has really stuck out to you?
LC: Yeah, back in 2000, I chased the dot-com bubble out to California. I tripled my income by going from Maine to California in the security industry. And I was working for this small consulting company that was full of a bunch of mathematicians; they were doing all this statistical analysis of, I don't remember what. And I was a sysadmin there with another guy, my friend Kennedy, who was a new sysadmin, and I was sort of the senior sysadmin. Back then, because it was the dot-com bubble and I had more than a year of experience with Unix, people looked at me like I was the inventor of Unix, which was funny, kind of silly, right? And the director of the company approached me and she said, hey, Larry, could you do us a big favor? We're looking to find some log analysis software for our web server. Now, back then, if you were trying to get money from venture capitalists, they were like, well, how much traffic does your website get? And that was their metric for deciding if your company was worth looking at. So everything revolved around how many people were hitting your website, how long they were there for, what links they clicked on, whether they bought your product or looked into your product, things like that. So I'm like, okay, I can evaluate some software for you. And the first software I looked at was Sawmill. Sawmill is this log analysis tool; I think nowadays it'll pretty much analyze any log file you want it to, but back then it was really specific to Apache web server logs and syslogs and things like that. And I'm like, well, I'll take a look at this, I'll see if there are any security vulnerabilities. So I start it up and I start poking around, and it's got this interface where you can either browse it as a normal, unauthenticated user, or you can browse it as an administrator and actually make changes to the software, execute commands and do things like that. And I noticed that when you went to read a log file, you could click a link, and it would pull the log file path into your GET request as a query parameter. Like, what if I put in a dot-dot and try to do path traversal? And sure enough, if I put in ../../etc/passwd, it read me the first line of the /etc/passwd file. Like, oh wow, this thing actually does path traversal. So I'm like, cool, I can read the first line of any file on the system. So then I was looking at it locally on my machine, and I noticed it created this admin.db file. I'm like, well, what's this thing? So I open the file up, and it looks like a hash, but it's really short. It looks like some sort of encrypted password, but it's only eight characters long. And I'm like, that's really small; it's not a crypt hash or an MD5 or anything like that. So I type in my password, and I notice that the password I created is the same length as this "encrypted," in quotes, hash in the admin file. I'm like, that's really weird. Is this a substitution cipher? So I delete my password, and I set the password to A, and I look at the admin.db file, and it contains a capital Z. And I'm like, okay. So then I change my password again to AB, and it has a capital Z and a lowercase c. And I'm like, okay, so is this thing just substituting stuff?
So I put in A, C, A, C, and the admin.db file has capital Z, lowercase c, capital Z, lowercase c, and I'm like, it is a substitution cipher. So I ran the strings command on the Sawmill binary, and it spit out the actual substitution cipher, stored in an array. And I'm like, I can take this and write a program to decrypt any password that's put in that file. So this is 2000. So I took my Palm IIIxe, and I wrote a program in Palm C on it to decrypt the password hash and give me the cleartext. And I tested it on my own setup, and it worked: I put in the password "password," decrypted the cipher string, and it came out as "password." So I'm like, neat. So what I end up doing, instead of telling the vendor about it, the actual Sawmill guys, is I go to their website, use the path traversal bug to grab the first line of the admin.db file, take their cipher string, stick it in my little decrypter (the password was actually "Wookie"), and log in as the administrator on the admin panel of their website, where I then could execute commands on the system as root. So then I tell the owner of the company, or the guy who was developing the software, hey, I found this hack in your Sawmill product, version 5.x, where I can decrypt your password and I can use path traversal to grab your administrative database file. And by the way, I logged into your website and decrypted your password file; it's "Wookie." Here are all the details. And the guy was happy. He's like, that's great hacking, that's awesome hacking, dude. Thanks for telling me. And he's like, I'm going to give you a free software license.
And I'm like, great, thanks. So yeah, he writes a patch, releases a new software version, and sends me the key for the license. I go back to the director of the company and I say, hey, I got you this software for free because I found a vulnerability, and here's a license; why don't you buy me lunch someday? And she's like, oh, thank you so much, great work. Looking back on it, I was in my 20s and I was reckless and stupid. You can't just log into someone's system and decrypt their password. That's illegal. What was I thinking?
LO: Definitely got lucky there.
LC: Yeah, definitely got lucky there. So that's where responsible disclosure really comes in. Now it's, okay, you do not use a vulnerability you found against the company that developed the software. You tell them about it responsibly, give them time to respond to your report, and then work with them on it. Don't actually demonstrate it on their own infrastructure.
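Both halves of the Sawmill bug are simple to picture. The path traversal was a log-file query parameter that accepted ../ sequences (something like ?logfile=../../etc/passwd, using a hypothetical parameter name), and the password "encryption" was a fixed per-character substitution table that could be recovered by running strings on the binary; the giveaway was that the stored value was exactly as long as the password, unlike a crypt(3) or MD5 hash. Below is a hypothetical Python sketch of the kind of decrypter Cashdollar describes writing on his Palm device; the mapping table is invented, not Sawmill's real one.

```python
#!/usr/bin/env python3
"""Sketch of a simple substitution-cipher password decrypter.

The cipher table here is invented for illustration (the plaintext
alphabet mapped to itself reversed); the real table, recovered by
running `strings` on the binary, is not reproduced here.
"""

# Plaintext alphabet and an invented, stand-in substitution table.
PLAIN = "abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789"
CIPHER = PLAIN[::-1]  # stand-in mapping; the real one came from the binary

ENCRYPT = dict(zip(PLAIN, CIPHER))   # plaintext char -> stored char
DECRYPT = dict(zip(CIPHER, PLAIN))   # stored char -> plaintext char

def encrypt(password: str) -> str:
    """Mimic the storage scheme: substitute each character independently."""
    return "".join(ENCRYPT.get(ch, ch) for ch in password)

def decrypt(stored: str) -> str:
    """Reverse the substitution to recover the cleartext password."""
    return "".join(DECRYPT.get(ch, ch) for ch in stored)

if __name__ == "__main__":
    cleartext = "Wookie"
    stored = encrypt(cleartext)
    print(f"stored form: {stored}")
    print(f"recovered:   {decrypt(stored)}")  # -> Wookie
```

Because each character maps independently, recovering the table once is enough to decrypt any stored password of any length.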
LO: Right, that's good advice and kind of a good takeaway from this. But do you have any other advice for other security researchers who are looking for vulnerabilities and reporting flaws, anything that you've really learned over the past few decades?
LC: Oh, yeah. For me, it wasn't for monetary gain; it was for the thrill of the hunt. I really enjoyed finding security vulnerabilities, they were puzzles for me, and exploiting them just gave me the warm fuzzies. I wasn't really looking for monetary gain. And I think nowadays there are a lot of bug bounty hunters out there who are looking to make money doing this, and it makes sense, I understand that. But I think you have to be careful: if the company doesn't have a bug bounty program, and they say, hey, look, we're sorry, we don't have a bug bounty program, we can send you a couple of T-shirts and a water bottle and some swag, some stickers, but that's about it, don't get upset or offended that they don't have a bug bounty program. It will probably happen someday, but don't get upset with them. And you can't say, well, then I'm not going to give you the details of this vulnerability unless you give me money, because then that turns into extortion. So you just have to be careful and walk the line of "do no harm." And then for software vendors: if somebody approaches you and says, hey, I found a vulnerability, don't get mad at them, and don't threaten to sue them. Work with them, because they've literally found something that you probably didn't notice, and they're trying to help your software become better. So don't get angry or upset with them. I haven't experienced that too much myself, but it does happen; I've seen researchers who have been sued, or had legal action threatened against them, for finding security vulnerabilities.
LO: Right, right. Well, Larry, thank you so much for coming on to share some of your experiences over the years and talking a little bit about what you found.
LC: Yeah.
LO: And to all of our listeners, once again, this is Lindsey O'Donnell-Welch with Threatpost, here with Larry Cashdollar of Akamai. Thank you for tuning in today, and catch us next week on the Threatpost podcast.