
and the MIT License, which much of the open-source code you directly or indirectly use was published under, and so on.

      Hacking can produce useful new technological applications. But hacking can also be used to cause harm. The general public seems to have focused on the latter connotation of the word hacker instead of its original meaning.

      I'm an advocate of an organization called Hacking Is Not a Crime, led by my friends Bryan McAninch, Chloé Messdaghi, and Phillip Wylie. Wylie is also the coauthor of the first book I cowrote for Wiley, The Pentester Blueprint. The book you're reading right now is my debut solo work for Wiley. And Wylie isn't related to Charles Wiley, who founded this company back in 1807. But perhaps this illustrates how tight knit the cybersecurity and hacker communities are: we tend to know each other quite well.

      I'm an idea person within the cybersecurity community, so my contribution to Hacking Is Not a Crime's mission to promote the positive use of the word hacker is to use my work in the media, and books like this one, in a mindful and responsible way. In the many years I have been writing about cybersecurity and hacking, I have always referred to the people who use computer technology to do harm as cyberattackers, cybercriminals, or cyber threat actors. This distinction is a vital pillar of both cybersecurity culture and hacker culture.

      Even if you're 100 percent businessperson and 0 percent computer geek, understanding this will help you work with cybersecurity professionals and foster a strong security culture.

      If you work in business, you probably know what corporate culture is. It's how the people in your company behave, how the people in your company feel about it, and the attitudes and styles your company reinforces, whether that's done deliberately or accidentally. Corporate culture can affect employee morale, which can have a measurable effect on your bottom line.

      A strong security culture encourages the people in your company to behave in ways that facilitate your resilience to cyberattacks and help protect your precious data.

      I spoke to J. Wolfgang Goerlich, an advisory CISO at Duo Security, part of Cisco Systems. CISO stands for chief information security officer. CISOs bridge the gap between the suits and the nerds. Goerlich has years of experience securing corporate computer networks. Here's what he told me about security culture:

      Security culture comes from a partnership between security champions and security advocates. A security advocate is a member of the security team who focuses on getting practices into the hands of the workforce. It's more common for us to talk about security champions. A security champion is a member of the business itself, who collaborates with the security team on best practices. A culture of security has advocates working with champions to interpret and implement security controls. In a well-run security practice, controls will be usable and widely adopted, because of the partnership of advocates and champions.

      All security controls are useless if they are ignored. Good security is usable security. Good security is adopted security. The starting point, then, is empathy and kindness for the people we are charged with defending.

      I would stress the word everyone. I'm in a better position than my peers (CISOs of other companies, including those outside the cybersecurity industry) because we are a security company. This means multiple things. It's easier to explain to my business managers, as they natively understand that “we are a security company” means our brand is built on the security of the company. And even people in departments that don't need to understand security management understand that branding is important.

      Security culture means that part of awareness training is decentralized. If someone is targeted by phishing, they can ask a colleague in the same room (now virtual) to look into it instead of going through an IT ticketing system.

      People who are aware of security can smell when they are being deceived by FUD, so communication from the security team needs to be straightforward. (Both Merriam-Webster and Urban Dictionary define FUD as fear, uncertainty, and doubt.) Also, security-aware people can point out poor security control selection or implementation very quickly, effectively taking the place of auditors or specialists.

      Of course, security culture is not a replacement for security controls, but it helps with all kinds of controls, even unpleasant ones.

      As with all the work you must do to keep your company secure, establishing and maintaining a strong security culture isn't a project you set and then forget, as some infomercial spokespeople love to say about their As Seen on TV products. It's a constant, everyday process. It's something you build and maintain over the years, and if you neglect it, it will die. I love cybersecurity expert Bruce Schneier's ideas, so I'll quote him again, as I often do in my writing:

      Security is a process, not a product.

      As I've mentioned, a strong security culture doesn't stop at your IT department. Every single person in your organization, from the bottom of the corporate hierarchy to the top, must be part of it.

      Every single thing your company's employees do with your computers, networks, and buildings can affect your security posture in a positive or negative way.

      A strong security culture begins when everyone understands how they can affect your security and is willing to be accountable for it. Next, you need to promote security awareness. As with everything security-related, security training isn't something you should do only once. People in your organization need frequent security training and reminders about proper security habits.

      One of the most important things you can do is train your workers to resist social engineering attempts. Explain what phishing is and the various ways it can manifest: through phone calls, text messages, emails, web pages, and social media posts. Teach them that cyberattackers may pretend to be a person or company they trust, and encourage them to engage in healthy skepticism. You must support that skepticism by reminding them that they won't be reprimanded for questioning whether your chief executive officer (CEO) or tech support workers are who they say they are when they phone, email, or text them.

      Your email servers could have robust antivirus software that scans all email attachments passing through the system. Nonetheless, no antivirus software is perfect. Malicious email attachments are one of the most common ways that cyberattackers gain unauthorized access to computer systems. So part of your company's regular security training should remind employees to open only email attachments that they expect to receive, from senders they're sure they can trust.
