Be careful not to confuse criminal hackers with security researchers. Researchers not only hack aboveboard and develop the amazing tools that we get to use in our work, but they also (usually) take responsible steps to disclose their findings and publish their code. Unfortunately, there is a war going on against legitimate information security research, and the tools and techniques are often questioned by government agencies. Some people are even forced to remove these tools from their websites.
Malicious user
Malicious user — meaning a rogue employee, contractor, intern, or other user who abuses their trusted privileges — is a common term in security circles and in headlines about information breaches. The issue isn’t necessarily users hacking internal systems but users who abuse the computer access privileges they’ve been given. Such users ferret through critical database systems to glean sensitive information, email confidential client information to the competition or upload it to the cloud to save for later, or delete sensitive files from servers that they probably didn’t need access to in the first place.
Sometimes, an innocent (or ignorant) insider whose intent isn’t malicious still causes security problems by moving, deleting, or corrupting sensitive information. Even an innocent fat finger on the keyboard can have dire consequences in the business world. Think about all the ransomware infections affecting businesses around the world. All it takes is one click by a careless user for your entire network to be affected.
Malicious users are often the worst enemies of IT and information security professionals because they know exactly where to go to get the goods and don’t need to be computer-savvy to compromise sensitive information. These users have the access they need, and management trusts them — often without question.
Recognizing How Malicious Attackers Beget Ethical Hackers
You need protection from hacker shenanigans. Along the lines of what my father taught me about being smarter than the machine you’re working on, you have to become as savvy as the guys who are trying to attack your systems. A true IT or security professional possesses the skills, mindset, and tools of a hacker but is trustworthy. Such professionals perform hacks as security tests against systems, based on how hackers think and work, and they make tireless efforts to protect their organization’s network and information assets.
Ethical hacking (otherwise known as vulnerability and penetration testing) involves the same tools, tricks, and techniques that criminal hackers use, with one major difference: It’s performed with the target’s permission in a professional setting. The intent of this testing is to discover vulnerabilities from a malicious attacker’s viewpoint to better secure systems. Vulnerability and penetration testing is part of an overall information risk management program that allows for ongoing security improvements. This security testing can also ensure that vendors’ claims about the security of their products are legitimate.
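To make “the same tools, tricks, and techniques” a bit more concrete, here is a minimal Python sketch (not taken from this book) of the kind of simple check an authorized tester might script: a basic TCP port probe. The hostname and port list are placeholders, and the assumption throughout is that you have written permission to test the target.

import socket

TARGET = "scanme.example.com"   # placeholder only; substitute a host you are authorized to test
PORTS = [21, 22, 25, 80, 443, 3389]   # a handful of commonly reviewed services

for port in PORTS:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)   # keep each check quick
        try:
            is_open = s.connect_ex((TARGET, port)) == 0   # 0 means the TCP handshake succeeded
        except OSError:
            is_open = False   # treat timeouts and lookup failures as unreachable
        print(f"{TARGET}:{port} {'open' if is_open else 'closed or filtered'}")

Real-world testing relies on purpose-built tools rather than homegrown scripts, but the principle is the same: probe the target the way an attacker would, with the owner’s blessing.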
SECURITY TESTING CERTIFICATIONS
If you perform vulnerability and penetration tests and want to add another certification to your credentials, you may want to consider becoming a Certified Ethical Hacker (C|EH) through the certification program offered by EC-Council. See www.eccouncil.org for more information. Like the Certified Information Systems Security Professional (CISSP), the C|EH certification is a well-known, respected certification in the industry, accredited by the American National Standards Institute (ANSI 17024).
Other options include the SANS Global Information Assurance Certification (GIAC) program, the IACRB Certified Penetration Tester (CPT), and the Offensive Security Certified Professional (OSCP) program, a hands-on security testing certification. I love the approach of these certifications, as all too often, people who perform this type of work don’t have the proper hands-on experience with the tools and techniques to do it well. See www.giac.org, www.iacertification.org, and www.offensive-security.com for more information.
Vulnerability and penetration testing versus auditing
Many people confuse security testing via vulnerability and penetration testing with security auditing, but big differences exist in the objectives. Security auditing involves comparing a company’s security policies (or compliance requirements) with what’s actually taking place. The intent of security auditing is to validate that security controls exist, typically by using a risk-based approach. Auditing often involves reviewing business processes, and in some cases, it isn’t as technical. Some security audits, in fact, can be as basic as security checklists that simply serve to meet a specific compliance requirement.
Not all audits are high-level, but many of the ones I’ve seen — especially those involving compliance with the Payment Card Industry Data Security Standard (PCI DSS) and the Health Insurance Portability and Accountability Act (HIPAA) — are quite simplistic. Often, these audits are performed by people who have no technical security experience — or, worse, work outside IT altogether!
Conversely, security assessments based on ethical hacking focus on vulnerabilities that can be exploited. This testing approach validates that security controls don’t exist or are ineffectual. This formal vulnerability and penetration testing can be both highly technical and nontechnical, and although it involves the use of formal methodology, it tends to be a bit less structured than formal auditing. Where auditing is required (such as for SSAE 18 SOC reports and the ISO 27001 certification) in your organization, you might consider integrating the vulnerability and penetration testing techniques I outline in this book into your IT/security audit program. You might actually be required to do so. Auditing and vulnerability and penetration testing complement one another really well.
Policy considerations
If you choose to make vulnerability and penetration testing an important part of your business’s information risk management program, you need a documented security testing policy. Such a policy outlines who’s doing the testing, the general type of testing that’s performed, and how often the testing takes place. Specific procedures for carrying out your security tests could outline the methodologies I cover in this book. You should also consider documenting security standards along with your policy that name the specific security testing tools used and the specific people performing the testing. You could establish standard testing dates, such as once per quarter for external systems and biannual tests for internal systems — whatever works for your business.
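As a simple illustration (not a requirement from this book), you could even capture that testing cadence in a small script so that due dates are never a matter of memory. The scope names, intervals, and dates below are assumptions, so adjust them to match whatever your own policy specifies.

from datetime import date, timedelta

# Hypothetical cadence from a documented testing policy:
# external systems quarterly, internal systems twice per year.
TEST_INTERVALS = {
    "external": timedelta(days=91),
    "internal": timedelta(days=182),
}

def next_test_due(scope, last_tested):
    """Return the date the next test is due for the given scope."""
    return last_tested + TEST_INTERVALS[scope]

print("External systems due:", next_test_due("external", date(2025, 1, 15)))
print("Internal systems due:", next_test_due("internal", date(2025, 1, 15)))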
Compliance and regulatory concerns
Your own internal policies may dictate how management views security testing, but you also need to consider the state, federal, and international laws and regulations that affect your business. In particular, the Digital Millennium Copyright Act (DMCA) sends chills down the spines of legitimate researchers. See www.eff.org/issues/dmca for everything that the DMCA has to offer.
Many federal laws and regulations in the United States — such as the Health Insurance Portability and Accountability Act (HIPAA) and the associated Health Information Technology for Economic and Clinical Health (HITECH) Act, Gramm-Leach-Bliley Act (GLBA), North American Electric Reliability Corporation (NERC) Critical Infrastructure Protection (CIP) requirements, and the Payment Card Industry Data Security Standard (PCI DSS) — require strong security controls and consistent security assessments. There’s also the Cybersecurity