Tribe of Hackers Red Team. Marcus J. Carey
…or more), strong log consolidation and alerting, application whitelisting/behavioral analytics software, strong egress filtering (allow web ports out only through an authenticated proxy with filtering in place), and user awareness of social engineering. If the organization implements those things, I’m going to have a bad day as a red teamer.
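As a rough illustration of that last point about egress filtering, here is a minimal, hypothetical Python sketch of an outbound spot check, not anything from the book: it assumes you control an external listener (the LISTENER hostname is a placeholder) and simply reports which TCP ports connect out directly. In a well-filtered network, only web traffic through the authenticated proxy should make it out.

```python
# Hypothetical egress-filtering spot check (illustrative sketch only).
# Assumes an external host you control is listening on the tested ports.
import socket

LISTENER = "egress-test.example.com"   # placeholder: a host you control
PORTS = [22, 25, 53, 445, 3389, 4444, 80, 443]

def can_connect(host, port, timeout=3.0):
    """Return True if a direct outbound TCP connection succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    for port in PORTS:
        verdict = "direct egress allowed" if can_connect(LISTENER, port) else "blocked or filtered"
        print(f"tcp/{port}: {verdict}")
```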
Why do you feel it is critical to stay within the rules of engagement?
Staying within the rules of engagement versus straying outside them is the difference between landing a shell on your target and landing a shell on the personal device of your target’s significant other. One of those is highly illegal, and you might not be able to unsee what you find there.
If you were ever busted on a penetration test or other engagement, how did you handle it?
Busted? What’s that?
What is the biggest ethical quandary you experienced while on an assigned objective?
One time I was tasked with performing a penetration test for a company and made my way to the CIO’s system, where I found some very questionable things. I had a Meterpreter shell on the guy’s system and noticed some KeePass processes running. I thought, “Cool, I’ll wait for him to leave, log in, and then see if he left KeePass unlocked.” Late at night after he left work for the day, I connected to his system using RDP. Sure enough, he had left KeePass open, so I now had access to a ton of creds, including some personal ones of his.
But I also noticed some other windows open on his system. First, he was using RDP to connect to another company’s server outside of the target network, where he appeared to be doing some sort of “system administration.” To make things stranger, he was also using RDP to connect to a personal system, which had well-known mass-spamming tools on the desktop, along with other tools. At that point it became an ethical quandary, so I stopped the engagement. I ended up hearing from the customer later on that the CIO was let go.
How does the red team work together to get the job done?
Collaborative infrastructure across the entire operation is necessary in my opinion. To be successful during the operation, we need to be able to share shells, data, and so on, easily. On the reporting side, it’s the same thing. We don’t want to be working in separate documents. This creates too much work later when we want to merge them. If we can collaborate on the same document platform, it creates a much smoother reporting process.
What is your approach to debriefing and supporting blue teams after an operation is completed?
After an engagement, I like it when the organization can get all the entities involved in a meeting with me. I want the security team there as well as members of the SOC and maybe even other sysadmin-type employees. This way, those who typically don’t see pentest reports now have an awareness of what can happen on the network. In turn, this helps arm them with the knowledge that they need to be diligent in protecting their own systems. I typically walk through the entire operation, from reconnaissance to initial compromise to escalation and finally data compromise.
If you were to switch to the blue team, what would be your first step to better defend against attacks?
Not switching back to the blue team. But if I did, I would first have a long discussion about budget. Knowing the budget helps you figure out how best to divvy it up to get the most out of it. You don’t want to blow your whole budget on the latest blinky-light system that likely requires another full-time employee just to manage. There are so many free and open source options out there for securing a network, but many of those require time and effort as well, so perhaps using your budget to hire another team member might be the best bet. Some things I would try as soon as possible, if they weren’t already in place, would be to deploy Microsoft’s LAPS, strengthen the password policy, and roll out MFA.
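As one way to verify that a control like LAPS has actually landed everywhere, here is a hypothetical sketch using the ldap3 Python library; the domain controller, account, and base DN are placeholders, and it leans on the assumption that a populated ms-Mcs-AdmPwdExpirationTime attribute marks a computer as LAPS-managed.

```python
# Hypothetical LAPS coverage audit via LDAP (illustrative sketch only).
# Connection details below are placeholders, not real infrastructure.
from ldap3 import Server, Connection, NTLM, SUBTREE

SERVER = "dc01.example.com"        # placeholder domain controller
USER = "EXAMPLE\\audit-svc"        # placeholder read-only account
PASSWORD = "changeme"
BASE_DN = "DC=example,DC=com"

def laps_coverage():
    """Split computer objects into LAPS-managed and unmanaged lists."""
    conn = Connection(Server(SERVER), user=USER, password=PASSWORD,
                      authentication=NTLM, auto_bind=True)
    conn.search(BASE_DN, "(objectClass=computer)", search_scope=SUBTREE,
                attributes=["cn", "ms-Mcs-AdmPwdExpirationTime"])
    managed, unmanaged = [], []
    for result in conn.response:
        if result.get("type") != "searchResEntry":
            continue
        attrs = result["attributes"]
        name = str(attrs.get("cn") or result["dn"])
        # A populated expiration time implies LAPS is managing this host.
        (managed if attrs.get("ms-Mcs-AdmPwdExpirationTime") else unmanaged).append(name)
    return managed, unmanaged

if __name__ == "__main__":
    managed, unmanaged = laps_coverage()
    print(f"LAPS-managed computers: {len(managed)}")
    print(f"Computers missing LAPS: {len(unmanaged)}")
    for host in unmanaged:
        print(f"  - {host}")
```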
What is some practical advice on writing a good report?
Take lots of notes while you are testing and essentially write the report as you move along. The worst thing you can do is fill up a notes document with screenshots but forget why you actually took them. Trust me, I know it is really hard to stop what you are doing and go write a couple of sentences, especially when you are faced with a new shell and it is tempting to just start hacking away at it. But if you don’t document as you go, you will regret it later.
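One cheap way to build that habit is to keep a tiny timestamped note logger open during testing. The sketch below is purely hypothetical (the filename and helper name are arbitrary, not any particular tool), just to show the idea of capturing a sentence and a screenshot reference the moment something happens.

```python
# Hypothetical note logger (illustrative sketch only): appends
# timestamped entries to a running notes file so context isn't lost.
import datetime
import pathlib

NOTES_FILE = pathlib.Path("engagement-notes.md")   # arbitrary filename

def log_note(text, screenshot=None):
    """Append a timestamped note and optional screenshot reference."""
    stamp = datetime.datetime.now().isoformat(timespec="seconds")
    entry = f"- {stamp} {text}"
    if screenshot:
        entry += f" (screenshot: {screenshot})"
    with NOTES_FILE.open("a", encoding="utf-8") as fh:
        fh.write(entry + "\n")

if __name__ == "__main__":
    log_note("New shell obtained; noting host and how we got there",
             screenshot="host-shell.png")
```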
How do you ensure your program results are valuable to people who need a full narrative and context?
As much as possible, I try to explain what an attacker who was actually trying to do malicious things could have done. Most real attackers don’t have the same deadlines as red teamers; they aren’t worried about being done within a few weeks and can take all the time they want. So being able to tie a real threat actor’s potential actions to the data you provide creates great value for many organizations, because it shows them how bad things could really be.
An important part of my red team engagements is that I don’t treat domain admin access as the primary goal. In most cases, the data I want doesn’t require domain admin credentials to get to it. I feel the goal of the assessment needs to be something the organization deems sensitive. If I can show that I’ve been able to compromise the CEO’s desktop, or maybe a database containing credit card data or plans to build a battleship, and then describe how these would be useful to an attacker, most organizations seem to find value in that.
How do you recommend security improvements other than pointing out where it’s insufficient?
Oftentimes I’m providing positive findings to customers to let them know where I think their controls are working. But even when a control is stopping me as an attacker, there are cases where it could still be improved. For example, maybe the organization has an exposed Outlook Web Access portal. Maybe I wasn’t able to access it during the assessment, but I still might recommend that they move it to the internal network and protect it behind a VPN.
Additionally, constant testing of your controls is a must. Even after the red team engagement is over, learn and reuse some of the techniques that were used. The tester’s methodology should be outlined in the report and will typically include both successes and failures. While some of those techniques might have failed during your engagement, something could change on your network without your knowledge and suddenly those same techniques succeed. Lastly, having management support behind security improvements is critical, so policy controls that executives need to address should be called out in the report.
What nontechnical skills or attitudes do you look for when recruiting and interviewing red team members?
The hacker mind-set and creativity are the most important nontechnical character traits for a red teamer. Red teamers are frequently faced with challenges they have never seen before. Having the hacker mind-set means they will not stop when they face the unknown, but instead they will question everything and find unique and new ways to face a problem. When facing highly secured environments that utilize defense-in-depth strategies along with quality alerting and response, creativity on the red team is a must.
“Having the hacker mind-set means they will not stop when they face the unknown, but instead they will question everything and find unique and new ways to face a problem.”
What differentiates good red teamers from the pack as far as approaching a problem differently?
Most of the really good red teamers I have met specialize in some area heavily. This enables them to develop a deep understanding of a certain technology or software. Becoming a master of infrastructure setup, coding, device hacking, lock picking, or any other area will help you develop a niche skill that is useful on red team engagements. Having the ability to approach unique problems with a creative mind-set can make the difference between a successful red teamer and one who fails. ■