- Imagine coming to work one day and the entire network of a subsidiary is offline from an apparent act of sabotage. Employees and contractors are back to filling out paper orders and using pencils and notebooks in meetings with customers. Then imagine that when you attempt to recover the network, a former IT employee no longer working at the company is, unbeknownst to anyone, steering you away from discovering how this event occurred. That's the story of a 2018 Nisos investigation in which IT employees of a subsidiary sabotaged the routers and caused almost a million dollars' worth of business interruption to the parent company that had made the acquisition six months earlier. This is the story of how Nisos attributed the insider threats, which culminated in a two-year jail sentence for one of the co-conspirators. This is the fifth episode of "Know Your Adversary". Okay, this format is going to be a little bit different because we don't want to reveal client details. Interviewees, client names, Nisos investigators and the co-conspirators will not be directly identified. However, the client agreed to go on record as long as these key details remained anonymous. So let's set the stage. We have four guests on the episode. Our first guest will be called Anthony, who was an executive for the parent company between 2017 and 2018, when the parent company bought the subsidiary. We will call the parent company Alpha company and the subsidiary Beta company. The remaining three guests were part of the investigation, including Nisos teammates. There were also two main co-conspirators, although only one was the mastermind, and he'll be referenced pretty consistently. This is Anthony, the parent company executive. - [Anthony] So I'll start at the very beginning and kind of our lessons learned and things we did wrong. So one is, IT was not involved with the due diligence process. 
So as CIO, I discovered along with everybody else on Wall Street that we had acquired this other organization. And so when we did the acquisition, between their head of IT and our head of IT, there was a big question around who was gonna run IT for the new combined organization. And the person from the acquired company who was over IT thought he was getting the job. He was adamant he was getting it; he thought he was the best man for the role. They had actually asked me to come back into IT and reassume the CIO position about three months in to do all the integration work. And when he didn't get the seat, he was pretty bitter. And so there was a lot of animosity from that entire company because we didn't know who was gonna be running IT. It was just this limbo scenario for about three months. And so once I was announced, he quit the next day and he got all of that team riled up. - Let's start reflecting. During many acquisitions, the parent company conducts due diligence of its target of acquisition. While this is usually heavily focused on financial auditing to ensure the financial terms of the deal are sound for regulators, depending on the size of the acquisition there's additional due diligence on other administrative considerations such as IT and security integration. This was completely ignored, setting up a three-pronged problem to work through post-acquisition: IT integration, security integration, and general roles-and-responsibilities performance management. From an insider threat perspective, these acquisitions are often a blending of cultures, and management is about empowering people. If people are unclear about their role, disgruntlement can fester, which is exactly what happened here. - [Anthony] They had an IT department of 11. Within a year, all 11 of them were gone from the organization, but he really stirred up a lot of drama. 
And there were two folks who ran the network infrastructure group. They had designed and built it, set up the VPNs, and done all the configurations per device, but all open source. So for three months, we didn't even have the passwords. They wouldn't share them, they wouldn't share the admin access with us. They set up federation on the domains, but it was a one-way federation, so they could see all of our stuff but we couldn't see theirs. And because nobody was over IT, there wasn't that decision maker to say, this is how it's going to be. - Let's talk through this for context. Anytime a company buys another, they need to integrate IT systems, and there's usually a six- to 12-month process depending on the infrastructure. Alpha company, as with many large companies, was primarily a Windows environment, and Beta company was a Linux, open-source, and cloud shop. For listeners who are not technical, these are completely different infrastructures that take a tremendous amount of integration. Usually the integration favors the purchasing company, in this case Alpha company, and the acquisition is required to fall in line with their architecture, policies, procedures, security program, you name it. You can already see where the problems are going to start and fester. Then it became a nightmare. - [Anthony] We came in one day and half of the company was down from a connectivity standpoint, and it happened to be all of the company that we purchased. So it turns out one of the systems admin folks, about 30 days before, had gone out and disabled all of the logs. 'Cause the logs are held for 30 days, he deleted them, disabled them, and then on day 31, he cleared all of the configuration files on the Cradlepoint devices and changed all of the passwords. And then he just ghosted us, and we couldn't find him. We ended up shipping all new Meraki devices out, which was our company standard. And it was a long day or two, but it was painful from a company perspective. 
So it cost us all in all between $500,000 and a million dollars for this incident. - So let's review real quick. Alpha company was publicly owned and bought Beta company, which was privately owned through private equity. No integration was conducted: no technical diligence, no integration diligence, no security program diligence, nothing. Beta company's IT employees were disgruntled because their IT environment was not used in the integration, and two employees sabotaged the routers of which company? - [Anthony] The company that was acquired, the infrastructure that they had built. They knew how their infrastructure was built, they knew that we didn't know it, and so they went in and they basically dismantled what they built and sabotaged their own network. So it was really two people, but only one person got, I'll say, convicted of it. But the one person who actually pressed the buttons and pulled the trigger, he had left the company about a week or two before, just job abandonment. He just didn't show up for weeks at a time, so we said, okay, as a company, this doesn't work. So we terminated him and we relied on the acquired company's off-boarding process. Well, the off-boarding process was his buddy who was in this with him, and so it turns out he was never actually truly off-boarded. But we didn't think about it. We just assumed, okay, he's off-boarded, they have a process at that company we bought so they'll go through that process. And it turns out either the process was broken or access was intentionally not turned off, which is why two weeks after he was terminated, he was able to get into the systems and make these changes. - This is pretty common in large companies unfortunately, especially ones that have a lot of subsidiaries. There's often so much focus on integrating the financial mechanisms for regulators that administrative functions such as IT and security are secondary. Proper off-boarding processes were not integrated, or were even non-existent. 
And the disgruntled IT administrators still had access into Beta company's environment to cause disruption. So what happens on the day of the incident? - [Anthony] Help desk calls. You know, we weren't integrated, so we had no monitoring, we had no alerting. We literally started getting help desk calls of email not working, their VoIP phone system wasn't working, they couldn't get onto the ERP system. So it was help desk calls, and then we started scrambling as a team, and that's when we discovered, okay, it was weird: every branch that's down is from this company that we bought. All we knew was we tried to get onto those routers just to look around and the passwords didn't work. That was kind of our first clue that it could be sabotage, 'cause they worked a few days before. - Having worked so many of these crimes over the years, it's a surreal feeling to be a victim of such an egregious crime. So many things run through your head regardless of which side you were on: security, IT, legal, human resources, or even vendor response. Each department brings a different expertise to solve these problems and get the company back on its feet. Of course, the one question is, who could do this? - [Anthony] One of the issues was there was collusion. And so even though one person pulled the trigger, two people had actually planned this and talked about it. And the person that was colluding with the perpetrator, he actually was on the calls with us, trying to figure out what it was. And I remember one of the statements he made, 'cause I said, "Hey, do you think it could be this individual?" And he said, "You know, he's not that good. And if it was him, then we're underutilizing him, 'cause he's way smarter than we thought." So he was actually kinda leading us off the trail, saying it was definitely not somebody internal. And as he was looking at logs, he said, "Hey, I see some things from China out here, I see some things from Russia, IP addresses." 
So he was really kinda leading us away from it. It took me about two days to really think, okay, I don't believe the person who's giving us this information, who's still working for us. I went back to thinking it was an inside job. - The inside job, can you believe this? The mastermind was on the inside and trying to steer the IT team off the trail of the perpetrators. When an incident like this happens, it's all hands on deck for the next couple of weeks and months. - [Anthony] When we had these calls, we had a war room set up, all hands on deck. Everybody needs to be on the call for 15 hours until we figure it out, and he was very sporadic. So he would join the call from 10 to 11 and then join from three to four, and we're texting him and we're calling him and he's not answering. It was just very, very sporadic, you know. After about a week of going through this incident and just his unresponsiveness, I terminated him, only to find out from somebody else that he had another job already and was actively working there. - Now let's get into the incident response of this. There are a lot of stakeholders at the table, all wanting to get the business back online and potentially go after the bad guys if warranted. In this case, it appeared like it would be, since it was an inside job. Let's start with getting the business back online, which usually comes down to the confidentiality, integrity and availability of data, systems and networks. - [Anthony] The first thing was getting them back online. So we had people tethering phone hotspots just to get access to the internet from their laptops so they could access the ERP and email and things. And phones, we forwarded all of their phones from the VoIP system to cell phones. We had to hook up printers locally to laptops, go into the ERP and reconfigure all of the printers. So really it wasn't, oh, we have an issue, let's ship new devices and wait a day. Let's just say there were 30 branches. 
We were retrofitting those with a team of eight to get them up and going. Then we'd have to go in and spend a chunk of work actually undoing the changes that we did so they could just work regularly across the WAN. Really, a week was understanding all of that, making the changes, keeping branches up and going. We were in the heat of battle and we were just trying to keep things rolling. After that, we got more into the investigative phase and we brought in a security forensics company as well as Nisos, right, looking at the logs and who's doing what. And you know, one of the people at this forensics company that we brought in, he used to work for the FBI in their cyber crimes division. And he said, "Hey, I think this is big enough that the FBI might be interested." - We'll get to the FBI investigation later. In the meantime, Nisos and a forensics company were called in to help with the incident response effort. The forensics company and Nisos were charged with digging through the logs and doing the forensic investigation, as well as attributing the insider threat and working with the FBI. First, some context on why to call in forensics and attribution experts, including Nisos. In short, the parent company didn't have the expertise on Linux and open source and AWS, which again is quite common in large enterprises that run lean. You can only hire skillsets for your resident architecture. - [Anthony] Because it's a different architecture and it's open source and Linux, there was nobody on my internal team who knew it well enough to even do basic forensics. We didn't know how it was configured, and there was nobody at that company who was on our side to help us figure it out as it's broken. And so there was no documentation, no historical knowledge. So just learning how it was even supposed to work, in order to then turn around and figure out how it broke, was just very, very challenging. - So in come Nisos and the forensics company. 
Let's talk to the team of forensics investigators, threat hunters, offensive operators and open-source analysts who got the initial call from Alpha company's legal team to investigate this matter further. Some of this is redundant, but there were really four key points in the investigation that we could point to for true attribution of the insider threat. The first was rolling back who was deleting the logs from AWS, or Amazon Web Services. - [Interviewee 1] Yeah, we got brought in by the client's law firm, and yeah, sure enough. So we came in, the firm I was working with at the time, to find that all of the edge routers for this organization were shut down, thereby rendering a lot of their networks in their facilities useless. So basically they were in a really tough situation. They were starting to bring all of that technology back up, but at the core of it, they needed help with the investigation. So very early on, the client was suspicious of a couple of existing employees. This is a company that had dealt with an acquisition, and like a lot of acquisitions, right, it's not just a merger of technologies from an IT perspective, it's a merger of cultures, and that is really a difficult thing to do. They had to deal with integrating these people into their team, and that is what we will ultimately find was the core of the problem here. - Merger of cultures. I think that is a key point to remember throughout this investigation. - [Interviewee 2] That ultimately led to discontent, really, between the two IT teams. And that's kind of where we came in, after this kind of festering discontent on the acquisition side of the environment. There was no logging. Everything that had been put in place to connect remote offices, you know, through VPNs and different interconnections between offices, all of that was totally wiped out. 
There was really no connectivity out to remote offices. Their AWS environment, which controlled how remote offices checked in and how remote workers were able to VPN in, as well as storage, a lot of that infrastructure had been basically taken apart. And that was kind of where we were at day one. You had one side of the house where all of the offices were basically unable to do work without paper and pen. They couldn't process orders, track shipments, track bills of lading, all of that stuff that they would normally have to do at these remote offices. Also, we come to find out there's an employee who's on his way out the door, helping out with the recovery, but it doesn't seem that the employee's really doing all that much. He had taken basically a month off of work, was not responding to phone calls for the most part during this outage, just kind of responding to texts. And initially it was thought, maybe this person, you know, they're getting ready to start a new job, they're not really interested in helping that much, and that's why this is going on. However, it quickly turned into kind of a suspicious situation, looking back at some of the history that that particular user had on some of the servers and some of the systems. Because it seemed not that they were necessarily helping with the recovery effort, but there were indications that possibly this person was cleaning logs, removing entries from logs, and that was really suspicious kind of from the get-go. - I realize that a lot of this is a rehash from the Alpha company executive's perspective. But for this to come together, the insider who quit, but not really, and who was partially helping with the response effort, is now deleting the logs? That sounds pretty suspicious, and perhaps our first lead on attribution. Let's hear from Anthony to set the stage on how to understand the AWS environment first. - [Anthony] So SSH keys, we had not used as a company. 
So once we got connectivity working, branches up and going, we really had to jump into the AWS environment. When I say my staff, we have two network engineers and we have two systems engineers. So very, very small team. So we didn't have AWS experience, and they were Linux boxes as well. And so they used SSH keys, and there were a lot of them on each one of the boxes, and they weren't in the normal place. And I don't think this was intentional, I think they just didn't have good standards in setting up gear and equipment, and they had zero documentation. So I think every time they'd build a Linux box, depending on who was building it, it was just set up differently, and the keys were in different places and access was granted differently. So with that lack of standards, every box we jumped on, we literally had to, number one, understand it, and number two, remediate it. - [Reporter] Here's another perspective from another Nisos investigator. - [Nisos Investigator 2] Some of the logging that was deleted was the Amazon CloudTrail logging. Thankfully for us, you can tell Amazon to delete it, but in some cases you may have a certain level of CloudTrail logging not turned on, and you can tell Amazon to turn it on retroactively so that you can see whatever they would have collected in the past. Were it not for that, we might not have seen some of the obvious log deletion attempts that we had noticed. The mastermind had that logging turned off, just outright turned off, and the entries for turning it off were deleted. But when you delete it, there's no evidence left of that having been deleted when it's not on, until you make a specific request to Amazon to turn on the logging retroactively, in which case they expose what they were already collecting. And you have to specifically ask for that. We knew how to ask for that, and we were able to get them to turn that back on and give us the resulting logs, and we saw what was there. 
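The retroactive review the investigator describes lines up with CloudTrail's event history, which retains management events for a period even when trail logging has been stopped or deleted, and can be queried afterwards. A minimal sketch of hunting for log tampering in those events might look like this; `StopLogging` and `DeleteTrail` are real CloudTrail event names, but the `flag_tampering` helper is our own illustration, not the actual tooling used in the case:

```python
# Hypothetical sketch: flag CloudTrail management events that suggest
# someone tampered with logging itself.
SUSPECT_EVENTS = {"StopLogging", "DeleteTrail", "UpdateTrail"}

def flag_tampering(events):
    """Return (event name, username) pairs for suspicious entries."""
    return [
        (ev["EventName"], ev.get("Username", "unknown"))
        for ev in events
        if ev.get("EventName") in SUSPECT_EVENTS
    ]

# Against a live account, the events would come from CloudTrail's
# event history, e.g. with boto3:
#   import boto3
#   ct = boto3.client("cloudtrail")
#   events = []
#   for page in ct.get_paginator("lookup_events").paginate():
#       events.extend(page["Events"])
```

Seeing a `StopLogging` event attributed to a specific user, as the investigators eventually did, is what turns "the logs are missing" into attributable evidence.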
- [Reporter] And it starts to get into the technical weeds, so let me clarify and explain strike one on attribution. Remember, Beta company was an open-source, Linux and AWS cloud infrastructure. This environment was not native to Alpha company, so they were not aware of how to enable the logging, but Nisos and our partner did. Even though the mastermind insider thought he was clever enough to disable all logging, an urgent Nisos request to Amazon flipped the switch back on for the AWS CloudTrail logs. So when they flipped the switch back on, we could see the logging, including who was deleting the logs: the former employee who quit. He was partially helping the incident response effort by redirecting the effort away from him. - [Nisos Investigator 2] We saw his user with his key from his location cleaning logs. - [Reporter] So to recap again, upon reactivation of the AWS logging function by the investigative team including Nisos, strike one was the former employee cleaning the AWS CloudTrail logs. He was also redirecting the investigation away from him and his co-conspirator. So what is strike two? - [Interviewee 2] One of the initial suspicions that came out, which actually came from the company, or the customer, was that they had suspected another former employee who had written letters and possibly even made threats to the CEO of the company as well as other employees, and really made it clear that they were disgruntled, they weren't happy with the deal that they got as far as the acquisition went. Come to find out that this other suspected employee, he's best friends with the employee who's helping with the response effort, or not helping so much with the response effort, which really kind of started to put things in place as far as who's doing what and who's an innocent party. Turns out, well, neither one was an innocent party in this. - [Reporter] So that's strike two. 
A separate co-conspirator employee threatens the CEO prior to the outage and is best friends with the employee who quit and who is lazily helping with the investigation. That's a solid lead in my book, so on to strike three. Let's hear from Anthony first, since he was the victim of the crime. - [Anthony] The IT folks at this company also had an outside hosting service provider called Linode. So they had some private boxes, or personal boxes, out there connecting into the systems for years, replicating data. From what I was told, initially it was set up as almost an offsite data storage for customer and vendor and item data, back in maybe '09 or 2010 when the account was established, and it was put on a corporate credit card. And the problem was, it didn't work, or it didn't do what they wanted it to do, but they kept it going. So for years and years and years, there was this off-network data center, connected by VPN into the corporate network, that just sat there collecting all of this data. We actually discovered this whole off-site data center from one of the employees who rolled on another one. So once the FBI got involved, they interviewed people, and this one associate said, "Oh by the way, there is a whole off-site data center that has a bunch of data that's not on the corporate network, and here's the company name." So I reached out to the company, I got a login, I got credentials. And it turns out there were about six folks in IT there that all had servers on there. Some of them were torrent servers, some were illegal video hosting and copyright-infringement hosting that they had these Plex servers on. Some of them were company data. And, you know, I started going through the support tickets just to try to unscramble this egg, to figure out what it was for, who was doing what with it. - [Reporter] This is a critical point in the investigation. 
For context, Linode is a legitimate third-party virtual private server service that many developers use for bulk storage, among other engineering tasks. It can host applications, PDFs, SQL files, Word docs, you name it. It's also used by malicious actors to store dumps after they exfiltrate and steal data through unauthorized access. Anthony, who spoke previously, discovered this Linode service by examining the files and activity of the former employee who was barely helping with the recovery effort. He saw this former employee had registered it and called Linode to get the credentials. Perhaps there were critical leads on this server. - [Anthony] I got credentials, and when I started looking around, I didn't think we had six co-conspirators, 'cause it was built in 2009. I just think it was their playground and they thought, hey, we have some free IT resources, so let's take advantage of them. One of them, I think the co-conspirator, he actually has a business that does some sort of day trading or options trading. And he developed some algorithms, and come to find out later that he was actually charging people, his customers, to use his service, which was all hosted out in Linode. And so it was, how do we cut access to all of this in an environment, once again, that we don't know? - [Reporter] This is where the investigation gets wild, because Alpha company now has control of the insider's personal Linode server, on which he is potentially storing illegal dumps from exfiltration. Remember, this is paid for by the target of acquisition, which is now Alpha company. This can't make the co-conspirators happy, as this is their treasure trove of data. I remember this part of the investigation like it was yesterday; for Nisos it became a game of cyber tug-of-war for a two-hour period. 
In physical security parlance, imagine the cops entering the thieves' house to recover stolen goods and the thieves aren't home, but then they realize the cops are there and try to take back their stolen goods while the cops are mid-seizure. - [Anthony] You know, Nisos got the credentials, and as we were cutting off access, the co-conspirator tried to get access back in. 'Cause we were shutting down boxes and taking them offline, and apparently his customers started calling him saying, I can't get onto the platform, and so he started freaking out. And you know, I had already called the support group and said, "Hey, if this person calls in, they're no longer authorized on the account. Anytime he calls, I want to get an email. Do not let them back in." And it was literally, y'all saw him out there clicking buttons, and then you would block it, and then he's trying to come in this way, and you'd block it, and hey, hey, we got... We were all on a phone call, I remember, for two or three hours one night. And it was literally, I see him over here, I see him over there. And then all of a sudden I said, "I just got a request from support that he called in and asked them to have his password reset." It was just intense. It was unbelievable. - [Reporter] What the Nisos investigators were really doing on the other side of that phone call was keeping the co-conspirators out of the third-party Linode server, as this had all the critical evidence. It was pretty intense for about three hours. - [Interviewee 3] We were disabling their SSH keys, we had to change passwords on them, we had to respond to password reset requests that they were making through the Linode environment. We also, in the process, noticed a Dropbox client that was turned on on the Linode instances and was still in sync. We turned that off. They were repeatedly trying to log in, coming from one of the co-conspirators' new place of employment. 
As a last-ditch effort, he was trying to log into the Linode instances from his new place of employment, I guess fearing that his existing entry access methods were not working, because they were, you know, obfuscated access methods. - [Interviewee 2] During this whole takeover of the Linode account, it was also obvious from the support records, and from feedback the client was actually getting from Linode, that there was quite a bit of panic, at least on the mastermind's side, because they were frantically calling Linode trying to take the account back over, yelling, making threats, according to Linode. So it was abundantly clear that they were in panic mode and knew that this server was kind of the nail in the coffin. - [Reporter] So Nisos and the forensics company are literally keeping the thieves out of their treasure trove of illegal activity while the co-conspirators are panicking on the other side. But what was found in this Linode instance that was related to the shutdown of the routers that brought the company offline? That's the key question. This is starting to look like strike three and the nail in the coffin for the future indictments. After a few hours of cyber tug-of-war with the co-conspirators, Nisos gained access to the Linode server and started analyzing the contents. In this Linode server, well, we found something pretty telling. - [Interviewee 3] They were still trying to get back in through those SSH methods. We started examining the servers, and as it turns out, the hosts themselves had scripts on them that showed someone trying to set up almost the exact deletion command that was used to clear the routers, as well as a backup of a Slack server that had a bunch of conversations on it. And there were indications that a former employee had been logging in while he was no longer employed. We had looked at the Linode host and we had seen a mistype in the history file that was not necessarily a Linux command. 
The firewalls were BSD-based. The command that they would have used would have been a BSD command, and there were some shell scripts that would have been used to remove log files from those BSD devices that had been splayed in the history files of the Linode machine. - [Reporter] What the Nisos investigators are saying is that the exact commands that were used to disrupt the routers were also found in these Linode servers. That's the key piece. This looked to be strike three, and a referral to the FBI was going to be in order. - [Interviewee 2] And there's one other thing to note here also. There were backups of a Slack channel that the co-conspirators had created and also had other employees and former employees as a part of. Since this was a backup of a Slack channel, of course we had all the messages, both private as well as public channel messages between the users. And within those messages, the animosity that the co-conspirators had towards the company was pretty clear. In fact, at one point they were even premeditating this attack, discussing how they would actually carry out an attack, and some of the details they were talking about were ultimately part of how they pulled the attack off. - [Reporter] So to recap the Nisos and forensic partner investigation. First, we have forensic evidence of the co-conspirators cleaning the logs after reactivating the logging with AWS. Second, we have an additional co-conspirator threatening the CEO over the acquisition. He was friends with the co-conspirator cleaning logs. And remember, the co-conspirator cleaning the logs was also redirecting the company off the trail. Third, we have both co-conspirators owning the Linode server. Even though the company was paying for the server, we discovered the shutdown commands for the Cradlepoint routers within that Linode server. And fourth, there was premeditation between the co-conspirators, discovered in the Slack chat logs, that hypothetically discussed how an attack would go down. 
And this was close to how the sabotage of the routers was actually orchestrated. It's time to get the FBI involved. - [FBI Agent] I was able to do a warm transfer, if you will, between the client and the local FBI office. To this day, that's still a scary undertaking for a lot of organizations, right? Contacting law enforcement sometimes, to them, is synonymous with telling a regulator that they messed up or that they had an issue, but it's not. And we tried to soften that blow, explain how it would work. We were also working with the outside counsel, who had also worked with, you know, government prosecutors. So when we did, we made the referral with the relationships we had. The local FBI office was able to set up a meeting and show up, basically get the same background that we just laid out here in this chat, and from there, open an investigation. So at that point, you know, one of the things the FBI is going to do, they're all about attribution in a case like this. There was some forensic evidence they were able to take and do their own confirmation on, if you will. Some of the forensic evidence led to the fact that there was some type of intentional network disruption caused here. So with that, they were able to take their investigation a bit further. That was, let's just say, you know, outside of the computer hardware and the logs that were available to fill in some of the blanks. That ultimately led to the indictment and arrest of one of the co-conspirators in this case, who was actually charged and is in the process of serving the time, and paid fines as well. So, you know, this is a story of, you know, an insider threat turning into an investigation, turning into a law enforcement referral, turning into, you know, arguably justice being served for multiple criminal acts that occurred to the company, but also ultimately to their, you know, their clients, their customers, who may have been impacted by this. 
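Stepping back to the history-file forensics the investigators described, where router-wipe and log-deletion commands were found splayed in the Linode machine's shell history, that kind of triage can be sketched roughly as follows. The search patterns and sample lines are illustrative assumptions, not the actual commands from the case:

```python
import re

# Illustrative patterns for entries one might hunt for in a recovered
# shell history file; these are generic examples only.
SUSPECT_PATTERNS = [
    r"\brm\b.*/var/log",  # deleting log files
    r"history\s+-c",      # clearing the shell history itself
    r"ssh\s+\S+@",        # sessions out to other hosts (e.g. the firewalls)
]

def flag_history(history_text):
    """Return (line number, line) pairs matching any suspect pattern."""
    return [
        (num, line)
        for num, line in enumerate(history_text.splitlines(), start=1)
        if any(re.search(p, line) for p in SUSPECT_PATTERNS)
    ]
```

In the actual investigation, the telling detail was subtler than a pattern match: a mistyped BSD-style command in a Linux host's history, suggesting the operator was driving the BSD firewalls from that box.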
- [Reporter] For a high-level summary of the case after the referral to the FBI, here's Anthony again. - [Anthony] Yeah, so, you know, the first probably three weeks: week one was just getting all of our systems back online and our branches back online. They were running in an impaired state, but it was very impaired. Really, the next three or four weeks was the investigative work and the forensics work around, do we think we're safe now that we had a hole? It must have been a 200-line check sheet of all the things we had to check and figure out, and we put all of our standard antivirus architecture and infrastructure in within six weeks. Really, after the six-week mark is when the FBI got involved, and that's when we started shipping them equipment. And, you know, they wanted three or four key Cradlepoints, so we had the facility ship those back to us. I think it took them right around a year to actually determine who it was and press charges, and then almost another year before the person was convicted. - [Reporter] Working with the FBI isn't always feds in blue coats showing up and hauling computers away in the middle of the day for everyone to see. Over the course of the year, the FBI operated in the background while Alpha company conducted their investigation. This discretion allowed the company to go about its business without knowing exactly what was happening behind the scenes. - [Anthony] People knew the branches went down. It was just a very limited group of people that knew exactly what was going on. And so the company had no idea that somebody had sabotaged us and gotten in. The company just knew we had an IT hiccup, our systems were impaired for a week, and that's all they knew. - [Reporter] Anthony goes on to talk about the conviction. - [Anthony] After he was convicted and before sentencing, he wrote some sort of letter to the sentencing judge, three or four pages from his attorney, about why the sentencing should be light.
He still was not remorseful, even to the point that he said things like, "Those routers are so easy to reset. If anybody on the IT team was competent, they could have figured out the password and reset them back to factory default," which is true. But if all the configuration files are gone, resetting it back to factory default doesn't help. - [Reporter] The investigation that exposed this insider threat provided some valuable insights on gaps in the processes that led to this situation. Anthony reflects on what could have been improved upon. - [Anthony] Now I have a whole sheet of here's what we need to check and involve with the due diligence. I'm a true believer in corporate standards, especially around security. You need one platform, you need one antivirus, you need one standard. And so what's the plan for day one of close? So you get IT involved in due diligence, not just so they can go in and say their security stinks, but so they can say their security stinks and here's our plan to fix it on day one. I would have bought all the hardware ahead of time, and I would have had a plan to roll it all out. We kind of went through this collaborative process for those three months: here's our antivirus, here's yours, which one do we wanna go with? We tried to do best practices, not hurt any feelings, and have everybody feel involved, and that was a mistake. We should've said, our security is good. We actually don't care if yours is better, because ours is good for what we need it for, so we're putting ours in. There are a couple of key areas on an acquisition that, as CIOs, we need to understand. One is security, obviously. What does their security stack look like? The other is around licensing. Are they in compliance? There can be big numbers in the case of an audit that you don't even think about when you buy the company, and it can be a million dollars, which changes the whole valuation of that acquisition.
Another is intellectual property: do they have trademarks, where are the crown jewels, and that dovetails into security. Okay, if these are the crown jewels, how do we know that nobody else can manipulate these numbers? 'Cause once again, if the numbers in the financial system or the billing system are being tampered with, from a security perspective, that can change the valuation. - [Reporter] And finally, he leaves us with some advice for anyone who will be performing mergers and acquisitions in the future. - [Anthony] For anybody who's doing heavy M&A, have a structure, have a process that's repeatable, that you do every single time. Tweak that process as you go along, so that as you find things you missed on acquisition number one, you make sure you don't miss them on acquisition number two. Whatever that structure looks like, just have a structure, have a company standard that you know works for your business and that you can repeat and roll out to other folks. I think that's the right approach. - [Reporter] Thank you to Anthony, Nisos, and partner organizations for joining the podcast. Malicious insiders will always be a genuine threat to the enterprise, and they should be brought to justice when their actions tip the scales on threshold amounts of business loss. In this case, it was almost a million dollars through sheer acts of arrogance. Nevertheless, there are a lot of lessons to be learned, not only about integrating IT and security systems, but most importantly that any acquisition is a merger of different cultures. This merger of cultures is often the most critical aspect to get right to avoid insider threats. Thank you for listening to "Know Your Adversary". Every other week, we will bring you a new cybercrime attribution investigation that is representative of the work of Nisos operators past, present, and future.
If you have any good stories to pitch, please reach out; no two investigations are the same, and it is fascinating how digital clues come together to bring context to crimes that victimize enterprises. For more information, please visit www.nisos.com. Thank you for listening.