It can be daunting to think about the amount of data held by the Federal Government, and specifically the DoD. A full zero trust implementation would mean tagging every piece of data, whether it lives in a file or in a database, and it is hard to see how we could ever get there. Throw into the mix that every day more data is generated than humans have the capacity to tag effectively. What is there to do? How can we clear the backlog and create new methodologies to tag data effectively, inline? Listen this week for some ideas and directions on the matter.
Drowning in Data
[Tom Tittermary]
Hey, everybody, this is Tom Tittermary back with another episode of Zero Trusts Given, your source, in podcast form, for zero trust conversations around the DoD, the Defense Industrial Base, and the civilian government agencies. Our goal every week is to bring you about an hour's worth of content that gives you one or two or three things, relative to a pillar or the topic at large around zero trust, that you can use, and that hopefully isn't too painful to listen to. So again, my name is Tom Tittermary.
I have my co-host, Tom Gianelos, with me. Say hi, Tom. Hey, everybody.
And also with us, super excited, been meaning to set this one up for a while, I have AJ Forsyak from Varonis. And we're going to be diving deep into the data pillar today, which, if you've listened to the show at all, you know is near and dear to my heart.
AJ, you want to give us a quick intro on yourself?
[AJ Forsyak]
Hey, good afternoon, everyone. AJ Forsyak, I work with Varonis on the public sector side. I've got about 30 years of experience inside the federal government, as a military intelligence officer and then as a cyber professional inside of places like U.S. Army INSCOM. I was at Army Cyber, was a director of capabilities there under General Cardon and General Nakasone. And then I had the privilege of establishing, as a senior executive, the executive agent for cyber training for the Department of Defense.
That was a really fun experience, which actually led to me leaving the government. Since about 2018, I've been outside of the government doing consulting and then supporting Varonis in the public sector, focusing on the Department of Defense and the Intel community. I've worked with Tom for a couple of years now, I do my own podcast as well, and I talk a lot about data. So data is near and dear to my heart, and I'm looking forward to this discussion.
[Tom Tittermary]
Yeah. And we've talked about this on the show before, that data pillar, it's a potentially controversial statement, right? If we're talking about locking down Zero Trust for the DoD by 2027, this is the one where I have some concern, right?
Because you look at the amount of data out there, you look at the number of individuals out there, and you get into that notion of: all right, for this to work effectively, I have to have a very clean sense of identity, a very clean sense of context, a very clean sense of posture. I also need very clean tagging and policy around every unstructured and structured piece of data for the whole DoD, which is exabytes, zettabytes; the amount of data is alarming.
And in a lot of cases, I would argue that this data is getting produced at a rate faster than individual humans can tag it effectively, right? So this is a huge problem. AJ, do you want to get into, like, where do you start?
Like when you come to work from a Varonis perspective, and you're looking at this problem, like where do you start where you're looking to have the most impact?
[AJ Forsyak]
So when we start looking at data, this challenge exists not only in the commercial space; it's obviously prevalent in the department. The idea of data discovery and data classification is something that plagues the department today. You mentioned the amount of data that's being produced; it's almost incomprehensible.
I mean, it's such a large amount of data. So we start looking at these applications like ServiceNow, and what we're doing with SharePoint, and OneDrive, and email, and you know, all of these chat functions, everything we're doing is producing just an enormous amount of data. And that data is being stored somewhere.
And organizations don't have the ability today to discover that data in an automated fashion, and then classify that data. When I say classify, it could mean unclass, secret, top secret, but it could also mean something as simple as: is it sensitive, or is it not sensitive? The government is a relatively highly regulated industry.
So we have to have the ability to classify that data and identify whether it's sensitive or not sensitive, based on rule sets that we apply. And that's basically what Varonis does, in an automated fashion. We focus a lot on unstructured data, which is the most difficult data to work with. Structured databases, that's pretty simple.
[Tom Tittermary]
But just to break it down, because you and I have this parlance, right? Structured is literally field data that lives in a database, in a structured mechanism. And then unstructured would be any file format that you could find, right? Which gets weird sometimes, because there's PSTs, which are a file that... right, right.
[AJ Forsyak]
Human generated. Yeah, sorry. But yeah, keep going on that one.
[Tom Tittermary]
No, no, no. And then you get the whole, there used to be a term called semi-structured, which is what we'd call SharePoint, which is like unstructured data that lives in a structured format. But at the end of the day, you end up having to treat the individual components as unstructured data one by one.
[AJ Forsyak]
Correct. Yeah. Yeah.
So that's the backbone of it. Then we start looking at the next step: what's out there? Can we discover it first off, and then can you classify it? So the second aspect is the continuous monitoring or analysis of that data. And once again, today, the mechanisms don't necessarily exist.
So I keep going back to a discussion I had with Raj Iyer, the former CIO of the Army. We had a long discussion. This was down in Huntsville, at one of the cybersecurity conferences down there. He had just finished talking, and he mentioned RMF and CCRI.
And he said, the real issue there is it's a snapshot in time. The challenge is that snapshot changes the minute it's done. Yeah.
[Tom Tittermary]
And by the way, it could have taken two months to build the snapshot. That's correct. So it might be dated the day that it comes out.
[AJ Forsyak]
Right. So he said, if you're not continuously monitoring or continuously evaluating that environment, then it's really futile. His idea was, and I think Leo Garciga, the current CIO, is trying to do this, a continuous ATO cycle where you're continuously evaluating the environment.
You're continuously looking at whether the settings have changed from a configuration standpoint, but then you're also continuously evaluating, continuously monitoring, the data. And I think if we could bring those worlds together and do this continuously, it really will change how things happen. And then the last part. There are three components I look at: discovery of the data and classifying it; second, the continuous risk analysis of that environment.
And then the third: real-time data monitoring, where today we don't have that. We monitor networks in real time, we've got SolarWinds and things like that, but we're not looking at data in any real fashion that's live. And I remember, like it was yesterday, standing on the ops floor with General Cardon at Army Cyber. If you remember, down at INSCOM they had this ops center, an ops wall with all these TV screens.
And he looked at me and he said, I have no way to understand what's happening in my environment when it comes to data. That was back in the 2015, 2016 timeframe. But I know today the same thing is still true.
We're not monitoring data in real time. So those are my three components that I really care about when I start looking at the problem set today.
[Tom Tittermary]
And it's funny, right? We talk about zettabytes or exabytes of data, and we talk about the complications of classifying the data. Back when I was neck deep in this patch, the way I would shortcut it, and it's a little coy and a little cute, but give me a second here, is to say that the words "white" and "house" are either interesting or sensitive depending on which word you put the spoken emphasis on, right?
The White House, that's super meaningful; a white house, that's just some house of a certain color in my neighborhood. Right? So then I get into granular individual keywords relative to documents.
One of the things that I found interesting when I was neck deep in this territory is, you know, data would be tagged effectively, like it would actually be tagged. And we would actually change our opinion on an individual piece of data based upon the individuals that were accessing it over time. Like if I have one individual unclass piece of data, right?
That's not tagged in any way, that's remotely sensitive. And then a group, like an AD group, that's all people with massive clearance, all hit the document at the same time. That would lead us to look back at that document and be like, wait a minute, why are they all coming?
Like, how did this become relevant to the mission context? And do I need to think about the classification of this piece of data again?
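To make the emphasis point concrete, here's a minimal sketch of why a bare keyword misfires where a phrase-aware rule doesn't: the proper-noun phrase "White House" fires, while "a white house" doesn't. The rule name and sample texts are invented for illustration; real classifiers weigh far more context than a regex, but the principle is the same.

```python
import re

# Hypothetical rule set: match the capitalized, adjacent phrase only,
# not the bare words "white" and "house" in any order.
RULES = [
    ("white_house", re.compile(r"\bWhite House\b")),
]

def classify(text: str) -> set[str]:
    """Return the set of rule names that fire on this text."""
    return {name for name, pattern in RULES if pattern.search(text)}

print(classify("Meeting notes from the White House briefing"))  # {'white_house'}
print(classify("I painted a white house in my neighborhood"))   # set()
```

The same idea generalizes: classification rules keyed to phrases, co-occurrence, and document context cut the false positives that single-keyword matching produces.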
[AJ Forsyak]
Yeah. So in that, in that context, I would actually look at it from a different lens. So I would look at it from who has access to that information and why do they have access to it, right?
So today we look at, and this is a Varonis stat, the average user has access to 17 million files.
[Tom Tittermary]
So this is that classic doors-and-windows security conversation, right? A house with no doors and windows is perfectly secure, but totally unusable. What you're talking about is individuals with access to data; they have all the doors and all the windows.
Right. Because I don't want to be that security blocker of the agility to go accomplish the mission or task, right?
[AJ Forsyak]
But with that, like I said, I'm looking at it from a different lens, right? So my point is we are creatures of habit. We're humans, right?
We do the same thing pretty much every day. I wake up around the same time, go for a walk or a run, work out, and log into my machine about 7:30-ish.
But then, you know, I go to the same things, I go to the same websites. So the same thing happens with inside most organizations. We're accessing the same type of data, the same kind of data at the same, usually the same time, right?
So when you look at that, we start building, and this is what Varonis does, we actually start building profiles on individuals. What data do you normally access? And then if you start deviating from, so this scenario you gave earlier, right, where people came in and all of a sudden this group, this AD group, everybody's now accessing a piece of data they haven't accessed before.
Like, why are they doing that? For us, that is a trigger. So I always go back to the Jack Teixeira National Guard incident, which is really sensitive to folks in the Air Guard.
But my point is, this is a perfect scenario for what Varonis does. Here's a system administrator who rightfully has access to everything, because he's the one who grants access, right? So rightfully, he should have that.
But when you look at what he did, he starts accessing classified data. And then he starts printing that data. So along that kill chain, there's so many indicators along the way that you would alert on or do something about.
One, the reason it's not happening is because we don't have that continuous analysis of the environment in most instances. Like, why is Teixeira accessing classified files? He shouldn't be.
So that's indication number one, bang. Number two is he starts printing this data. Like, why is he doing that?
These are all indicators that are out of the norm, that should have been alerted on; it's a beautiful insider threat use case. And it's easy because we are creatures of habit; we've got profiles built.
We each have a profile, whether we know it or not; we work inside what's normal for us. Once you baseline normal, it's easy to see the deviations.
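That "baseline normal, then watch for deviations" idea can be sketched in a few lines. User and repository names here are hypothetical; real user-behavior analytics also weigh time of day, volume, peer groups, and data sensitivity, not just which repositories a user touches.

```python
from collections import defaultdict

class AccessBaseline:
    """Toy per-user profile: which data stores does this user normally touch?"""

    def __init__(self):
        self.profile = defaultdict(set)  # user -> repos seen during baselining

    def observe(self, user: str, repo: str):
        """Record normal behavior during a baselining period."""
        self.profile[user].add(repo)

    def is_deviation(self, user: str, repo: str) -> bool:
        """After baselining, an access to a never-before-seen repo is a signal."""
        return repo not in self.profile[user]

baseline = AccessBaseline()
for _ in range(30):                       # a month of normal HR activity
    baseline.observe("hr_user", "hr_records")

print(baseline.is_deviation("hr_user", "hr_records"))   # False: normal
print(baseline.is_deviation("hr_user", "rnd_designs"))  # True: HR hitting R&D
```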
[Tom Tittermary]
We used to talk about this the same way. I used to work at another place where we had a tool that would give us a graphical representation of user interactions with data. And it was really interesting because, like what you're talking about, and this was a long time ago, so I'm sure what you're doing is way more advanced than what we were doing.
But what you would see is, you know, the typical digital fingerprint of an individual relative to file access, right, is the HR guy's hitting HR stuff. All right, well, if I wake up and the HR is hitting file repositories in R&D, that's not normal. Like, HR people probably don't typically have the technical expertise relative to that.
The one that would get us sideways, and we weren't sure what to do with, is that legal touches everything, right? But usually these individual AD groups stay within bucketed pockets of data that are specific to role. So out of the door, you immediately get this notion of: all right, we could put some limits around AD group access to these individual data types, because they're specific to function.
And then: well, I want to access this file; great, I'll give you one-off permission for that thing. But what you guys are doing, it seems to me, and I talk about signal from noise all the time, when we're talking about whether individuals should have access to data, applications, assets, and services, DAAS, this is a functional component that almost seems necessary to be able to tell the risk posture of an individual based upon their actions in the environment.
[AJ Forsyak]
Yeah, I mean, you start really getting into zero trust and being an advocate, which I am. I came from the Intel community; we didn't trust anybody, so nobody had access to anything.
That was our motto back then. But now zero trust is in favor, and the idea is you provide access to the people that need access to the information.
That's it. And by the way, the term data steward is in the zero trust language. That's not easy to do.
And when I say data stewards: how do you identify your data stewards to begin with, right? The way we recommend it is you start looking at who's accessing the data, who's in that group. And then you nominate somebody from that group and say, okay, this person's in charge of that group.
Everybody in here still needs access? And then through a simple process called entitlement reviews, you send out an email: Tom, you've got 15 people in your section. These 15 people have access to the information.
But, by the way, only three people have accessed it over the last 60 days. So we recommend you remove the other people's access, bang, bang, bang, and these three people keep their access.
Hit submit. And that change is then put into the system. That's an entitlement review.
That's data stewardship. That's mandated in the zero trust language. If you don't have a tool like Varonis that can actually facilitate that process, it's basically impossible for you to know who's accessing your files.
Or how do we even nominate a Tom to be a data steward? Because that's all based on activity in the environment, what group you're assigned to, and all the other aspects that are there. That's pivot table upon pivot table upon pivot table.
Yeah. And the only way to do that is with automation.
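A rough sketch of the entitlement review described above, with hypothetical user names and the 60-day activity window from the conversation: anyone in the group with no recorded access inside the window gets recommended for removal.

```python
from datetime import date, timedelta

def entitlement_review(last_access, today, window_days=60):
    """last_access: {user: date of last access, or None if never accessed}.
    Returns (keep, revoke) lists, sorted by user name."""
    cutoff = today - timedelta(days=window_days)
    keep, revoke = [], []
    for user, last in sorted(last_access.items()):
        # No recorded access at all, or access older than the window -> revoke
        (keep if last is not None and last >= cutoff else revoke).append(user)
    return keep, revoke

today = date(2025, 3, 1)
group = {
    "alice": date(2025, 2, 20),   # active within the window
    "bob": date(2024, 11, 1),     # stale access
    "carol": None,                # never accessed the share
}
keep, revoke = entitlement_review(group, today)
print(keep)    # ['alice']
print(revoke)  # ['bob', 'carol']
```

The steward's job then reduces to confirming the recommendation and hitting submit, which is what makes the process tractable at scale.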
[Tom Tittermary]
It really gets down to the notion of what we used to call swim lanes. In another life, I was managing the intelligence community engineers for a different cybersecurity company. And the way we used to break it down is: how do I notify people in the organization when the guy who's supposed to be focusing on Chinese artillery is poking around in Colombia narco-terror in an open file system?
That is massive signal from noise. But to get to that, in each of those two categories I need a steward who knows the people involved in that process and can do that blocking and tackling. There's a notion of directors and first-line managers, and you really need somebody closer to the ground for each of those scenarios to be able to make that call.
[AJ Forsyak]
And the best, you know, in my view, and also, you know, company I support's view, the best person to grant that access is the person who understands who should have access to it.
[Tom Tittermary]
Yeah.
[AJ Forsyak]
Your typical system administrator has no idea, for, say, your G-3/5 plans shop, who should have access to G-3/5 plans data.
[Tom Tittermary]
Yeah.
[AJ Forsyak]
He has no idea. So everybody in the G3, bam, he just grants access to everybody in the G3. But there are parts of the G3 that shouldn't have access to that data.
And that's where, as we get more mature on this zero trust journey, the data stewardship piece is the really difficult nut to crack. We talked about continuous risk analysis and real-time data monitoring, but data stewardship is hard. I work with COCOMs across the globe, and all of the COCOMs are struggling with this problem.
[Tom Tittermary]
So the one I go back to consistently is my concern about data by 2027. To nail this down perfectly means I'm confident in every user identity, I'm confident about posture, and I have every piece of data tagged. The tagging piece is the one that concerns me the most from a timing perspective, right?
Because the notion I run into is, we're talking about sensitive environments; am I content to just throw AI/ML at this scenario, a mechanism where data enters the front door, I can't see how it came to the answer, and it gives me the answer? Those are the concerns around the tagging piece. And the other concern is, if you listened to the episode we did with Skip Farmer, I got real nerdy talking biology: we need a formalized taxonomy for data classification across DoD.
And I gave the mnemonic: King Philip came over for good spaghetti; kingdom, phylum, class, order, family, genus, species. One universal taxonomy. And I feel like sensitivity classification is the easiest one out of the door, because all these documents are marked that way anyway. But could you talk to tagging a little bit, just in terms of the conversations you're having and where that's moving?
[AJ Forsyak]
Yeah, so tagging is one of the, I don't want to say contentious, because everything around zero trust seems to be contentious, but tagging is a really difficult problem. And there are two schools of thought: either you automate the process or you tag manually. So there's a service right now that's generating over a petabyte of unstructured data every 60 days.
And this service is moving to a manual process to tag their data.
[Tom Tittermary]
So let me get nerdy and get into the math of this for a second, because that's super meaningful, or not, depending on the file size. If the file size is a petabyte, that's easy: I've got one tag. How many files are we talking about in that 60-day window?
Wow. I don't have that one; my gut is millions.
[AJ Forsyak]
Oh, yeah. It's millions and millions of files, generated literally almost every day. So the issue becomes, if you're trying to manually tag that data, it's just not physically possible for humans to do.
It's just not possible.
[Tom Tittermary]
Somebody's going to murder me in the emails, but if you count to a million at one a second, ten hours a day, isn't it something like 28 days? Yeah. Something along those lines.
Right. So if I had a million files and I did one a second, it would take me about 28 days to handle a day's worth. So I'd need about 28 people, and that's if they're doing it at one a second.
And the scope of the problem is just alarming. Correct.
I talk to people who get this immediately and inherently, like it's baked into their DNA. And then I get other people where it just never clicks. It's a massive, massive thing.
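The back-of-the-envelope math is easy to check. Assuming the round numbers from the conversation (a million new files a day, one tag per second, and a generous ten-hour shift), the same number answers both questions: shift-days for one tagger to clear a day's output, and taggers needed just to keep pace.

```python
# Back-of-the-envelope check on the manual-tagging math.
files_per_day = 1_000_000
seconds_per_file = 1
shift_seconds = 10 * 3600                 # ten hours of nonstop tagging

work_seconds = files_per_day * seconds_per_file
shifts_needed = work_seconds / shift_seconds

# ~28 shift-days for one tagger to clear one day's output,
# or equivalently ~28 taggers working in parallel to keep up.
print(round(shifts_needed))  # 28
```

And that assumes one second per file with no errors and no breaks; real manual review is slower, so the real headcount would be far higher.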
[AJ Forsyak]
What's interesting about that one statement right there, I used to say this and I've stopped saying it because people get offended. I would say to them, either they get it or I'm talking to the wrong person.
[Tom Tittermary]
There you go.
[AJ Forsyak]
And they'd be like, what do you mean? I'm like, well, you just don't get it, and I don't think you're ever going to get it.
But you understand the gravity of what we're talking about. So then we start moving to the automation aspect of it. And this is something the company has gotten scrutiny for from folks within the department: your accuracy is only 90%.
Like, oh, okay. Only 90%. I mean, our classifiers are really accurate and they work really well.
In one test we did in a Microsoft 365 environment, it was over 95% accurate on classification, and this was millions and millions of files. And I look at that and say: compare it to humans. There's a great visual out there that I use when I give a presentation. It shows a boxing ring, and in one corner you've got firewalls and endpoint devices and all these controls. And in the other corner, you have Bob. We're all Bob.
Okay. So Bob's going to make the mistake. Bob's going to click the email.
Bob's going to, you know, users, users, right.
[Tom Tittermary]
Wrongly classify the data. So, I think I told you this: my first three years out of college, I was a high school teacher.
I'd be like, this job would be great if it wasn't for the students. This cybersecurity gig would be awesome if it wasn't for the users. They will figure it out.
[Tom Gianelos]
Every network engineer.
[Tom Tittermary]
And by the way, half the time well-meaning, but they will figure out the route around my individual protections.
[AJ Forsyak]
Yeah. So when it comes down to the classification piece, my thoughts on this are: if we don't move to an automated classification process...
[Tom Tittermary]
I don't see an alternative.
[AJ Forsyak]
There is no alternative. You're right. There's no alternative.
And then when you start churning through, if you've got to churn through 15 petabytes of data, you're like, yeah, let's just focus on the new stuff first.
[Tom Tittermary]
Yeah.
[AJ Forsyak]
And then we'll get to the rest, because then it becomes a math problem, right? How much processing power do you want to throw at this?
Which areas do you want to focus on? How many GPUs do you want to throw at it? It really becomes that simple.
But in the end, this is a whole area of its own; we could do a whole other episode about stale data.
So, looking at these environments, I can share a story. I was a military intelligence officer, right? I went through the officer basic course in 1994, and then went back to the advanced course later, around 2000. Recently I was out at Fort Huachuca, looking at my records from when I was in the officer basic course, and I talked to the CIO there. I go, hey dude, you can delete my files.
I'm not coming back.
[Tom Tittermary]
Yeah.
[AJ Forsyak]
Like, why do you have it? If you look at the records management policy that's out there, for stale data, this stuff's over six years old: just put it on tape backup or something. Dating myself there, but tape backup, cold storage, right? But today that data is highly available.
That's expensive storage. It's not sitting in Glacier.
[Tom Tittermary]
No, it's sitting hot. It's real time. Yeah.
[AJ Forsyak]
Right. Anyway, that part just blows my mind.
And so, you know, that is prevalent across every agency, I would say in the federal government.
[Tom Tittermary]
Yeah.
[AJ Forsyak]
So this is not a "buy Varonis to save money on storage" pitch, but first off, do you even know? I would guarantee you most organizations have no idea how much stale data they really have and how much risk is behind it.
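As a first cut at the "do you even know how much stale data you have" question, a sketch like this totals up cold-storage candidates. The six-year threshold follows the records policy mentioned above; real tools look at access events, ownership, and sensitivity, not just modification time, and the path in the usage comment is invented.

```python
import os
import time

STALE_SECONDS = 6 * 365 * 24 * 3600  # roughly six years

def stale_report(root, now=None):
    """Walk a directory tree and total files untouched for six-plus years,
    judged by modification time. Returns (file_count, total_bytes)."""
    now = time.time() if now is None else now
    count = total_bytes = 0
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            st = os.stat(os.path.join(dirpath, name))
            if now - st.st_mtime > STALE_SECONDS:
                count += 1
                total_bytes += st.st_size
    return count, total_bytes

# Hypothetical usage:
# count, size = stale_report("/shares/records")
# print(f"{count} stale files, {size / 1e9:.1f} GB candidates for cold storage")
```

Even this crude inventory makes the risk visible: everything it counts is data that is all exposure and, by policy, should not be sitting in hot storage.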
[Tom Tittermary]
I used to draw this really long extended metaphor, two jobs ago, in presentations: my data garage. Everybody's got a messy garage. Nobody wants to clean it, nobody wants to get in there.
But the second you get in there, you're like, oh God, I've got last year's tax forms just sitting here, right on top of a box. And then the other joke I would throw in, again being cute and coy: oh, here's a giant box full of pictures from college.
This has exactly no value to me, and it's all risk. That's correct. And it's readily available.
[Tom Tittermary]
Right.
It's just sitting there for anybody to come grab it and snatch it up. But again, it's that daunting problem: I don't have the time or the means to spend three weeks sorting out every individual piece of paper in my garage or my attic.
But it's one of those necessary things. There's real risk there.
[AJ Forsyak]
And even with that analogy: think about your garage, but think about organizations today that are moving to the cloud, moving data from on-prem storage solutions into the cloud. While it's on-prem, you can still go through it and classify it.
You can look at it. You can deal with spillages pretty easily, in a pretty reasonable timeframe and at a reasonable cost.
The minute you move that into the cloud, forget it. It's striped across millions and millions of drives. It never goes away.
It's always there. And that to me is one of the biggest concerns. I know the Navy did this when they went to Flank Speed; they literally one day said, all right, move all your stuff into Flank Speed. So you can just imagine how much spillage is in there that shouldn't be.
And if you don't have the ability, going back to the original thing, to discover it and to classify it, then, as I heard it this morning, you're spraying and praying.
[Tom Tittermary]
So, a number of conversations I used to have were around the old NISPOM requirements for data spillage, the three-pass overwrite and the seven-pass overwrite and all these different things. I found myself in a conversation, again two companies ago, with a bunch of developers out in California, and they were saying, we'll give you the three-pass overwrite.
We don't understand why you need the seven-pass overwrite. And I literally broke out the movie Argo with Ben Affleck.
Folks that haven't seen the movie, go watch it, it's a great movie. Folks that have, you'll be right there with me.
So basically, they stormed the embassy, and the staff immediately started incinerating a bunch of documents, right? Hold that spot: the incinerators, that's the seven-pass overwrite.
And then the incinerators broke. What did they do? They hit the shredders.
That's the three-pass overwrite, right? And then later in the movie, and it's meaningful to the story, they get a bunch of little kids to put all the strips back together, and they get meaningful intel back out.
But now you take that thought process and apply it to the cloud: I don't have incinerators or shredders. I have somebody's word, relative to a cloud service, that X, Y, Z was cleaned up. That could be a big concern in some cases.
Right?
[AJ Forsyak]
Yeah, definitely. It's going to be interesting; we could go a lot of ways with that.
[Tom Tittermary]
But the single biggest thing is: all these different cloud services have different security models and controls that are almost all keyed to the sensitivity of the data. So if I haven't tagged my data effectively, how do I know where to put it?
And what guarantees do I have, for each individual piece of data, that I can clean it up if something happens that I'm not okay with, right?
[AJ Forsyak]
No, I agree. It goes back to the automated aspect of it. And if we're not doing that, then we're just way behind the power curve.
Way, way behind.
[Tom Tittermary]
So let me introduce one more. We were chatting about this beforehand, and I want to bring up this topic.
Going back to all those users, "this job would be so easy if it wasn't for users": we're looking at a political environment right now in the federal government where users might not be users in a couple of weeks or a month, depending on a number of the cuts happening throughout the federal government. You've been having a lot of interesting conversations around this, and I think it's a very valid topic.
[AJ Forsyak]
So, obviously I can't share the clients or things like that, but it's in the department. With DOGE, we've got a lot of cuts coming in, and individuals are being told they're not working anymore.
I guess it was last Wednesday, I was looking through some stuff on Twitter, and there was a tweet from individuals that were fired or let go from the Education Department. And they said, hey, go in and start deleting files. Just go in and burn the boats.
[Tom Tittermary]
So this is an individual who's, what's the word I'm looking for? They're mad that they were let go, right? So their callout on X is, hey, everybody else that's being let go, fight the system.
Disgruntled is exactly the word.
[AJ Forsyak]
Sorry.
[Tom Tittermary]
Disgruntled employee. Yeah. Typically I can find the words.
Thank you, Tom. I couldn't find the word that time. So a disgruntled person is saying, hey, I'm going to encourage other employees that are in the same situation as me.
Let's go take on the system, right?
[AJ Forsyak]
And start doing it. So there's your malicious insider, right? Yeah.
Right.
[Tom Tittermary]
So insider, soon to be outsider, right?
[AJ Forsyak]
Soon to be outsider, right. So if you look at that, and then remember my original statement that your average user has access to 17 million files: they start going in and blasting all this information out, or they take that information and start uploading it into their Gmail account, or they start moving it out or emailing it out.
There are instances where individuals are walking in after receiving notices and taking their entire PST files out of the environment. And an individual in this organization said, well, that should not have happened. I said, okay, I got it, but it did. Here's the example of individuals that have done this already.
They've already exfilled it out, and your system didn't block it or challenge it in any way. So it goes back to understanding what people's norms are, having peacetime profiles on them, so you know when they start deviating. Or take it a step further: hey, Tom was just listed, he's losing his job.
Take his name, put him on a list, and start monitoring that guy. And people say, oh, you can't monitor.
Every time you log into a DoD system, you consent to monitoring. Every single time, you have to hit the OK button. And that's what we're talking about.
It's to prevent the exfil of data, which is happening. There's a reason there's an F-35 in China that looks just like ours: data is being exfiltrated every day. Whether it's sensitive or not, information can be pieced together.
So this idea of the malicious insider to outsider, I love that statement. We have got to be prepared for that. Even data that you think is innocuous and no one cares about can, combined with other information, really be the difference.
And going back to my roots in the intelligence world: there's tons of information in the unclass world, but when you bring it together with other data, that's where things start getting really exciting. So folks exfilling data out of the unclassified environment is actually dangerous.
And we've got to have the ability to monitor that, because it's happening. If you're in the Department of Defense or in the federal government in any way, shape, or form, knowing right now that people are getting notices that they're leaving, or are nervous that they might get one, those are the folks you have to be watching.
You have to be watching the entire ecosystem. And if you don't have that real-time data monitoring, and I keep going back to my top three, you're just not able to do it.
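The watchlist idea described here can be sketched in a few lines. This is an illustrative toy, not any vendor's API; the event fields, the thresholds, and the `WATCHLIST` and `EXFIL_ACTIONS` sets are all invented for the example.

```python
from dataclasses import dataclass

# Hypothetical event record: one "data touch" from audit telemetry.
@dataclass
class FileEvent:
    user: str
    action: str      # "read", "delete", "upload", "email"
    file_count: int

# Users who have received separation notices get extra scrutiny.
WATCHLIST = {"tom"}

# Actions that move data out of the environment.
EXFIL_ACTIONS = {"upload", "email"}

def alert_level(event: FileEvent) -> str:
    """Escalate alerts for watchlisted users and bulk or exfil activity."""
    risky = event.action in EXFIL_ACTIONS or event.action == "delete"
    bulk = event.file_count > 100          # illustrative threshold
    if event.user in WATCHLIST and (risky or bulk):
        return "critical"                   # investigate immediately
    if risky and bulk:
        return "high"
    return "info"

print(alert_level(FileEvent("tom", "upload", 5000)))   # critical
print(alert_level(FileEvent("alice", "read", 3)))      # info
```

The point is the escalation path: the same upload that would rate a routine alert from an ordinary user rates an immediate response from someone on the departure list.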
[Tom Tittermary]
Well, we were chatting about this earlier, right? This notion of, well, I'm not worried about it because it's unclass. But this topic would come up all the time: you can take a number of pieces of unclass and suddenly it becomes meaningful and potentially sensitive, right?
And you brought up, all right, we'll do a test with the audience out there. If I say Fred, that's no big deal, right? Velma, half the audience just got it.
If you're the age of the people sitting at this table, you just got it. Then I say Shaggy, then I say Scooby, and all right, it's the cast of Scooby-Doo. These individual pieces of data are isolated, and by the way, so are the users that have access to them, that's typically how it is, but they can meet in the open and become a different thing.
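The point being made here, that individually harmless pieces become sensitive in aggregate, is sometimes called the mosaic effect, and a classifier can account for it with a simple co-occurrence rule. This is a toy sketch; the term set and the threshold are invented for the Scooby-Doo example.

```python
# Individually innocuous terms; together they reveal something sensitive.
CHARACTER_SET = {"fred", "velma", "shaggy", "scooby", "daphne"}

def mosaic_sensitive(text: str, terms: set, threshold: int = 3) -> bool:
    """Flag a document when enough related low-value terms co-occur."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    hits = terms & words
    return len(hits) >= threshold

print(mosaic_sensitive("Fred called.", CHARACTER_SET))                         # False
print(mosaic_sensitive("Fred, Velma, Shaggy and Scooby met.", CHARACTER_SET))  # True
```

Real classification engines use far richer models, but the principle is the same: sensitivity is a property of the combination, not of any single term.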
[AJ Forsyak]
That's correct. So in that same Scooby-Doo scenario, you bring all that information together, and now you know that's Operation Scooby-Doo, right?
And wow, that could be a classified program.
[Tom Tittermary]
I don't think we're allowed to talk about Operation Scooby-Doo.
[AJ Forsyak]
Yeah. I'm just kidding. It could be a classified program, though.
[Tom Tittermary]
Hey, hey, nobody come to my house. No black helicopters.
[AJ Forsyak]
By the way, if I stumble on something by accident, again, like, yeah. But when you bring all that information together, that's how things get spilled onto environments without anyone knowing about it. And if you're not, once again, monitoring that environment and doing actual, accurate classification, you would never know. And that's the real fear.
Because we know that organizations that haven't adopted any zero trust principles have blatant, worldwide, unfettered access to data across their ecosystem. We haven't even talked about the collaboration platforms internal to organizations, Teams, SharePoint, OneDrive, where we're sharing information like it's uncontrolled. And I know for a fact that organizations have lost control of that data inside the department.
And that's frightening on so many different levels, where people have access to data they shouldn't. I have one real simple story. There was a guard unit I was working with, and the commander was convinced that their finance department was locked down. Like, they were good, there's no way people are going to access stuff.
But I already knew, because I saw the results of some work that we did; we had done a risk assessment in that environment. So there was a PFC who was running the presentation. I said, are you logged into the system right now? He goes, yeah. I go, you have your CAC in, right?
And you're logged in, not on some separate system; you're on the NIPRNet. He goes, yeah, yeah, yeah. So I bring up the presentation, and I just happened to have the pathway for a file which had credit card numbers.
I said, go to that. Literally two screens of looking, and he was able to get to that path, went to that file, opened the file, and it had every single credit card number of every single individual in that guard unit, to include the expiration dates and the security codes. First off, why are you gathering that data at all? That's question number one. But question number two, and this commander couldn't believe it:
Right there. Your PFC, who has no reason to have access to any of that data in the finance folder, has it. That's happening everywhere. And when we start looking across this insider threat piece, or the malicious insider to outsider, that's really the threat, because they have access to all this data without anyone knowing it.
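The guard-unit story is, at bottom, an overexposure audit: find sensitive files whose permissions reach beyond the group that should own them. A minimal sketch of that check, with an invented file inventory and group, might look like:

```python
# Hypothetical inventory: path -> (sensitivity label, users with access).
FILES = {
    "finance/cardholders.xlsx": ("sensitive", {"cfo", "clerk", "pfc_jones"}),
    "public/newsletter.docx":   ("public",    {"cfo", "clerk", "pfc_jones"}),
}

# Who *should* be able to see finance data.
FINANCE_GROUP = {"cfo", "clerk"}

def overexposed(files, allowed):
    """Return sensitive files reachable by anyone outside the allowed group."""
    findings = {}
    for path, (label, users) in files.items():
        extra = users - allowed
        if label == "sensitive" and extra:
            findings[path] = sorted(extra)
    return findings

print(overexposed(FILES, FINANCE_GROUP))
# {'finance/cardholders.xlsx': ['pfc_jones']}
```

Run across a real environment, this kind of sweep is what surfaces the PFC with a path into the credit card file; the hard parts in practice are the complete scan and the accurate sensitivity labels, not the set arithmetic.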
[Tom Tittermary]
It's all doom and gloom, and there's no fix for it. So that's the show.
No, I don't want to end there. We've laid out the problem set, in painful detail; apologies, everybody, if it got too deep at certain points. But how do we fix it? At a high level, let's cover what you guys are doing to fix it, then I'll talk a little bit about the way I think we're coming at it, and some of the cool stuff that we're doing together.
[AJ Forsyak]
Yeah, so flat out: Varonis is a data security platform, and we work with our clients to, A, understand their data. We do that data discovery piece; we understand the sensitivity of that data, the permissions around that data, and the activity around that data. And we start building peacetime profiles on individuals: what's normal and what's not normal.
And there are three things I haven't mentioned yet that are super important. First, your scans of your environment have to be complete. Remember, we talked about this beforehand: if you have to keep scanning your entire environment over and over again every day, it's never going to finish. So the first thing is complete scans of your environment.
The second thing is you need context for what that data means: who's accessing it, where are they, are they in the United States, are they part of your unit, are they in Active Directory, what are they doing? And the last part is current: is it up to date, do you have all the information on it?
Those are called the three C's: complete, context, current. Once you have the three C's, and you couple them with sensitivity, permissions, and activity, you can understand what's happening in your environment. And then you get into all of the different use cases, right?
So privacy, records management, knowledge management, security from a spillage standpoint, user activity monitoring, threat detection and response. And when you start looking at kill chains, a ransomware attack has about nine steps in it, right? So there are nine steps along the way where we could tell you where the attacker is in the process.
So it's those kinds of things: CCRIs, and then obviously the data pillar for zero trust. That's the way we see the world.
And this is Varonis, and I believe this too: we assume you're breached. We know there are people on the wire.
We say this all the time to our clients: nobody breaks into a bank to steal a pen. They break into a bank to get money. People are breaking into your environment to get to the data.
So if you're not monitoring your data, or understanding what's happening around your data, you're never going to know. And here's a big change our CEO, Yaki Faitelson, has pointed out: people are no longer breaking into environments, they're logging in. Think about that for a minute. They're logging in.
That means they're an authenticated user in your environment. It's not like before, where someone breaks in with a mask on and runs around stealing money. These are people already in your environment.
So if you don't know what their normal looks like, they're going to fit in and you'll never know they're there, unless you're doing what I mentioned around the data itself. That's it in a nutshell.
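The "peacetime profile" idea, flag a user when today's activity deviates far from their own baseline, can be sketched with basic statistics. This is a deliberately simple model (mean plus three standard deviations over a synthetic history); real products use much richer behavioral models.

```python
import statistics

def is_anomalous(history, today, k=3.0):
    """Flag today's file-touch count if it exceeds mean + k*stdev of history."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1.0   # avoid a zero stdev
    return today > mean + k * stdev

# 30 days of normal activity: roughly 40-60 file touches a day.
baseline = [48, 52, 55, 41, 60, 47, 50, 53, 44, 58] * 3
print(is_anomalous(baseline, 57))     # False: within this user's normal range
print(is_anomalous(baseline, 5000))   # True: mass access, investigate
```

The key design point is that the threshold is per-user: 5,000 file touches might be normal for a backup service account and wildly abnormal for an analyst, which is why the baseline has to be built from each identity's own history.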
[Tom Tittermary]
So let me break it down, and I'm always wary of saying, hey, let me summarize what you do, so if I'm wrong, a hundred percent, let me know, right?
You guys are, one, assisting with the tagging, getting that data classified, and figuring out exactly where the crown jewels are, where the data I can clean up is, et cetera. And beyond that, from a wide-area zero trust perspective, if I take five big steps back, you guys are providing absolutely crucial policy decision point data: signal from noise about what abnormal looks like in my environment around people accessing data.
[AJ Forsyak]
That's part of it. I mean, if you look at that data pillar in zero trust, we basically satisfy pretty much every one of those functions.
[Tom Tittermary]
And if you go back and listen to the David Pearson ServiceNow episode, we talk about this fun thing we built called Master Blaster, and the concept works here. Master Blaster, from Mad Max, is the little person who sits on top of the big guy. The little guy has all the brains, and he tells the big guy who to whack with a bat, right? I think about that as the PDP and the PEP, the policy decision point and the policy enforcement point.
So all that data gets collected. Varonis can hand it over to Zscaler and say, hey, Tom's been acting real weird lately. That's cool.
I can cut his access to everything. I can make it so his device is essentially a brick and he can't get to any of the front doors of these things anymore. I can do it in one place.
I can do it very quickly, right? So there's that interaction between the two.
And then there's the other side of that. If we talk about this malicious insider to outsider, think about all the egress points for those individual pieces of data: transmission to commercial cloud services, thumb drives, printers, all these different things, right? The vast majority of that, from a tool perspective, is solved by DLP, data loss prevention.
So Zscaler is deep in the mix of that piece too, both at the endpoint and from a cloud service perspective: one, I'm not going to allow you access to this thing anymore, and two, for the stuff you already pulled down to your laptop, I'm going to limit where it goes to make sure it doesn't go anywhere that could be detrimental to the department.
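A DLP egress decision of the kind described here is, at its core, a policy lookup keyed on the data's sensitivity tag and the destination channel. The labels, destination names, and rules below are illustrative assumptions, not any product's actual policy model:

```python
# Hypothetical DLP egress policy: destination plus data label -> verdict.
BLOCKED_DESTINATIONS = {"personal_gmail", "usb_drive", "unknown_cloud"}
SANCTIONED = {"dod_sharepoint", "il5_cloud"}

def egress_decision(label: str, destination: str) -> str:
    """Decide whether tagged data may leave via this channel."""
    if destination in BLOCKED_DESTINATIONS:
        return "block"                      # never allowed, any label
    if label == "sensitive" and destination not in SANCTIONED:
        return "block"                      # sensitive data only to sanctioned services
    return "allow"

print(egress_decision("sensitive", "personal_gmail"))  # block
print(egress_decision("sensitive", "il5_cloud"))       # allow
print(egress_decision("public", "partner_site"))       # allow
```

Notice the dependency this exposes: the whole decision hinges on `label` being right, which is why untagged or mistagged data quietly defeats even a well-configured DLP layer.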
[AJ Forsyak]
Yeah. And we're to the left of that. We're monitoring Active Directory, DNS, VPN, and proxy.
All of that telemetry is fed into the platform to give you the visibility, the complete picture, of what's going on around the data. One of our reps, Rebecca Tubman, who's also from the intelligence community, has a phrase: we give you a data pattern of life on every single data touch.
Think about that for a minute. You've got provenance of every single data touch. You can really start painting a picture of what's happening in your environment.
[Tom Tittermary]
Yeah. At the end of the day, we always say it takes a village with zero trust, right? I want to pull in as much data as I can about every individual, every device, every piece of data, and all the interactions that are happening, in order to aggregate risk around individual interactions between a person on a device in a geography and a piece of DAAS: data, applications, assets, and services.
You guys are generating a ton of super meaningful data that we, or others like ServiceNow, can use to aggregate risk.
[AJ Forsyak]
Yeah.
[Tom Tittermary]
And at the end of the day: all right, I've associated risk, I've qualified risk, I've measured risk. How do I mitigate risk?
Now we're over to the PEP, the policy enforcement point. That's where Zscaler comes in. How do I guarantee, in a scenario that exceeds my risk threshold, that the user either will not get access to that thing, or that I will terminate their access to it?
Yeah.
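The PDP-to-PEP flow described here, aggregate risk signals, compare against a threshold, then enforce, can be sketched as a simple scoring function. The signal names, weights, and threshold are invented for illustration:

```python
# Hypothetical PDP: aggregate per-signal risk, then render a PEP-style verdict.
RISK_WEIGHTS = {
    "abnormal_data_access": 40,   # e.g., a behavior alert from the data pillar
    "unmanaged_device": 25,
    "unusual_geography": 20,
    "on_departure_watchlist": 30,
}
RISK_THRESHOLD = 60

def decide(signals: set) -> str:
    """Sum risk from the active signals; cut access when the threshold is exceeded."""
    score = sum(RISK_WEIGHTS[s] for s in signals)
    return "terminate_access" if score > RISK_THRESHOLD else "allow"

print(decide({"unusual_geography"}))                               # allow (score 20)
print(decide({"abnormal_data_access", "on_departure_watchlist"}))  # terminate_access (score 70)
```

The division of labor is the point: the scoring side (the PDP) only produces a verdict, and a separate enforcement layer (the PEP) is what actually bricks the session, which is why any one signal source alone is rarely enough to act on.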
[AJ Forsyak]
Those are all fair.
[Tom Tittermary]
Yeah. So tell me about this. You're doing a podcast too, over at the Cyber Bytes Foundation. Can I hear a little bit more about that?
[AJ Forsyak]
Yeah. So, the Cyber Bytes Foundation.
[Tom Tittermary]
Why have I not been invited yet, AJ? Because we're still- I take it personally.
[AJ Forsyak]
Well, now I have to reciprocate, based on this discussion. Absolutely. And we'll think of some other things to talk about, not zero trust.
Yeah. So, the Cyber Bytes Foundation. It's a 501(c)(3). We're down at the Quantico Cyber Hub, which is right outside the gate of Quantico.
It's a great organization. We host the Cyber Bytes networking events there, usually monthly, where we talk about different topics.
You were in for the zero trust one, which we talked about a little bit. But that organization is about exposing the government to different programs and different technologies that are out there. It's a space where the government can walk in.
We've got CRADA agreements with Marfor Ciscom and some other organizations, and we're able to do different experimentation in there. I know that can be a loaded word, but right now it's more about networking: exposing the government to different spaces.
Within that, we've got a pretty robust podcast platform that we run there. We also do a lot of outreach to the community, which is important. And then, obviously, working with the government.
So they have a safe space to come in and operate. There's a software factory in there, and I know AFCEA holds events in there, where they do their luncheons.
So like I said, it's an air-quote safe space for the government to come in and at least interact with industry and talk through some of their challenges. I know today there's so much reluctance to do that. But I've been doing the shows, the circuits, the conferences, for such an extended period of time, and I really do like what the Army is doing with these technical exchange meetings, the TEMs. I think they're up to 14; 13 was the last one, and 14 is the next one, in Dallas.
[Tom Tittermary]
I was very happy when they did one in Philly, right across from Nick's Roast Pork.
[AJ Forsyak]
Yeah.
[Tom Tittermary]
That was pretty great.
[AJ Forsyak]
I went to that one. That was a good one. The point I'd make is that that's a forum where the government is speaking directly to industry.
I just did a PALT down in Orlando, which is a similar type of engagement. And I know they do tech exchanges with Marfor Ciscom inside the Cyber Bytes Foundation. But the point is, I think the government needs to do more to communicate its requirements to industry, and not wait for a conference. There's an intimacy to these smaller venues where you can actually ask the question: okay, you're saying this in your RFP or your RFI, but what are you really asking?
And you're not breaking the FAR or any regulation by doing that. That level of transparency, I think, is important. So, back to Cyber Bytes: they're trying to provide that space so the government can do things like that with more periodicity, and be more effective.
[Tom Tittermary]
That's always an interesting one, right? Because I'm aware that I'm part of a sales organization, and everybody assumes we're only asking that question because we're trying to lead somewhere. I'm just trying to help.
At the end of the day, there's the number of times I've written what I thought was a carefully worded question to help me understand what the person who wrote the RFI really wants, so I can answer it better. And I've gotten back, in the RFI response: the requirement stands as written. That's really tough.
And it makes me wonder, what was the purpose of the RFI? I thought it was a meaningful question; I guess not. So being able to provide spaces where you can actually have those conversations is super beneficial.
[AJ Forsyak]
And I've been on the government side, where I've written RFIs and sat on selection boards for requests for proposals and things like that on different programs. So I remember when folks gave those responses.
[Tom Tittermary]
Oh, you're the one that did that to me. It wasn't me.
[AJ Forsyak]
But I remember thinking, why aren't we answering the question for them? Somebody took the time to craft that question. At least take 10 minutes or so to respond.
Give them that. And that's why I believe in in-person events. When you're looking someone face to face, it's hard to give them a BS answer. Human nature is that you want to respond honestly and not pull the wool over their eyes.
[Tom Tittermary]
I would say most of the meaningful conversations I have work like this: a question comes in, an answer goes back, and the real meaningful communication happens in the response to the answer, and in the ability to go back and forth one or two more times. The whole RFI and RFP process just doesn't allow that to happen, in a lot of those cases.
[AJ Forsyak]
I agree.
[Tom Gianelos]
In CCLR we call these guilt-free zones, where there's no more impetus, right? You just ask questions and get answers.
Right. And that's...
[AJ Forsyak]
And maybe the government is concerned. So, talking to all the government people listening to this: read some Brené Brown, be vulnerable, and put yourself out there. You may not have a great response, but at least you're responding and you're being honest about it. If you don't know, take the question, do the research, and then respond accordingly. But don't hide behind "the requirement stands as written." That's such BS.
[Tom Tittermary]
Yeah.
So where can people go to find the podcast that you guys do over there?
[AJ Forsyak]
So it's on the Cyber Bytes Foundation site. Just Google that. It's cyberbytesfoundation.org, I believe.
[Tom Tittermary]
Two pieces of homework for everybody listening to this podcast. One: go to the Cyber Bytes Foundation and check out AJ's podcast over there. I'm sure it's fantastic.
I've got to find some time to head over there and do that myself. The other piece of homework for everybody listening: if you have any questions, comments, concerns, thoughts, or feedback ("Tom, you talk too fast" is one I'm positive I'm going to get on a regular basis), send it to zerotrustsgiven@gmail.com.
So again, that's zerotrustsgiven@gmail.com. We would love to have interaction with the audience around feedback on the show.
Also, if you have a question you'd like us to discuss on air with any of the guests we have in the room, send it in; if we use it, I think we're trying to put together grab bags of some fun Zero Trusts Given swag that we can get out to you. Obviously under the government gift limit, but we're trying to do that. From there, I just want to wrap up.
I want to say thank you to my co-host, Tommy G. And thank you so much, AJ. I think this was an awesome conversation.
Thank you for coming.
[AJ Forsyak]
Great being here.
[Tom Tittermary]
Appreciate it. Thank you. Take care.