
      These examples begin to imply some potential solutions for UIL. However, before we begin exploring solutions, we intend to set a foundation of understanding the types of losses that may be initiated through user actions. With this foundation, we can then discuss how to avoid putting users in a position where they might initiate loss, instruct them how to take better actions, and then prevent their actions from resulting in loss. We will also explore how to take the opportunity away from malicious actors, as well as how to detect and mitigate the malicious acts.

      Because there are an infinite number of user actions and inactions that can result in loss, it is helpful to categorize those actions. This allows you to identify which categories of user error and malice to consider in your environment and what specific scenarios to plan to mitigate. This chapter will examine some common categories where UIL occurs. We'll begin by considering processes, culture, physical losses, crime, user error, and inadequate training. Then we'll move on to technology implementation. Future chapters will explore ways of mitigating UIL.

      Although it might seem to have no direct relationship to users, how your organization specifies work processes is one of the biggest causes of UIL. Every decision you make about your work processes determines the extent to which you give users the opportunity to initiate loss.

      Obviously, there are a variety of potential losses that are created by removing a human cashier from the process (such as loss of business from customers who find interacting with a kiosk too complicated), but those are ideally accounted for within the revised process. The point is that the process itself can put the user in the position to create UIL, or it can remove the opportunity for the user to initiate loss.

      A process can be overly complicated and put well-intentioned users in a position where mistakes are inevitable. For example, when you have users perform repetitive tasks rapidly, errors generally follow. Such is the case with social media content reviewers. Facebook, for example, through outside contractors, pays content moderators low wages and has them review up to 1,000 reported posts a day. (See “Underpaid and Overburdened: The Life of a Facebook Monitor,” The Guardian, www.theguardian.com/news/2017/may/25/facebook-moderator-underpaid-overburdened-extreme-content.) This can mean that legitimate content is deleted while harmful content remains. The situation is ripe for UIL, and it also causes significant harm to the content moderators, who experience stress both from the working conditions and from reviewing some of the most troubling content on the Internet.

      A process may also be poorly defined and give users access to more functionality and information than they require to perform their jobs. For example, companies used to attach credit card numbers to an entire sales record, and the credit card numbers were available to anyone in the entire fulfillment process, which included people in warehouses. Payment Card Industry Data Security Standard (PCI DSS) requires that only people who need access to the credit card numbers can actually access the information. Removing access to the information from all but those with a specific requirement to access it reduces the potential for those people to initiate a loss, maliciously or accidentally.
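The least-privilege principle behind the PCI DSS requirement can be sketched in code. This is a minimal illustration, not an actual PCI DSS implementation; the role names and function are hypothetical, and the masking format mirrors what commonly appears on receipts.

```python
# Hypothetical sketch of least privilege for cardholder data: only roles with
# a specific business need see the full card number; everyone else, such as
# warehouse staff in the fulfillment process, sees a masked value.

AUTHORIZED_ROLES = {"payment_processor"}  # assumption: role names are illustrative


def card_number_for(role: str, card_number: str) -> str:
    """Return the full card number only to authorized roles; mask it otherwise."""
    if role in AUTHORIZED_ROLES:
        return card_number
    # Mask all but the last four digits.
    return "*" * (len(card_number) - 4) + card_number[-4:]


print(card_number_for("warehouse_staff", "4111111111111111"))    # masked
print(card_number_for("payment_processor", "4111111111111111"))  # full number
```

By removing the full number from every view that does not require it, the process itself takes away most users' opportunity to initiate a loss, whether maliciously or accidentally.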

      All processes need to be examined to ensure that users are given the minimum ability to create loss. Additionally, all organizations should have a process in place to prevent, detect, and mitigate the loss should a user initiate it.

      Establishing a great process is awesome. However, as stated in the famous Peter Drucker quote, “Culture eats strategy for breakfast.”

      Consider all of the security rules that exist in an organization, and then consider how many are actually followed. In most organizations, some security rules are universally followed while others are universally ignored.

      As consultants, we are frequently issued badges when we arrive at a client's facility. We diligently don the badges, at least until we walk around and determine that we are the only people actually wearing one. While we intend to adhere to security policies, we also have a need to fit in with and relate to the people inside the organization. Widespread failure to wear badges is a symptom of a culture where peer behavior inspires people to ignore security policies.

      Conversely, if many people in an office lock their file cabinets at the end of the day or whenever they leave their desk, most of their colleagues will generally do the same. Culture is essentially peer pressure about how to behave at work. No matter what the defined parameters of official behavior are within the organization, people learn their actual behavior through mirroring the behavior of their peers.

      When the Challenger space shuttle exploded, the explanation given to the public was that O-rings, cold weather, and a variety of other factors were the combined cause. However, internal investigations also revealed that there was a culture that was driven to take potentially excessive risk to stay on schedule. (See “Missed Warnings: The Fatal Flaws Which Doomed Challenger,” Space Safety Magazine, www.spacesafetymagazine.com/space-disasters/challenger-disaster/missed-warnings-fatal-flaws-doomed-challenger/.) Despite many warnings about safety concerns relevant to the Challenger launch, NASA executives chose to downplay the warnings and continued with the launch. Even if the Challenger explosion was due to a mechanical failure, it was clearly a UIL because someone made the conscious decision to ignore warnings and proceed despite the risks.

      While it shouldn't take the crippling of an entire space program to initiate culture fixes, NASA subsequently issued engineers challenge cards that they could place on the table in the middle of a discussion to demand that their concerns be heard.

      In perhaps one of the most iconic cases of culture-based UIL, in 2017, the U.S. Navy destroyer USS Fitzgerald crashed into a large
