You CAN Stop Stupid. Ira Winkler

mistakenly clicked on the wrong button on an interface, particularly when they encounter an unexpected lag in cursor speed.

      Whether it occurs through a legitimate accident or carelessness, you must assume that users will make an error. You need to proactively plan for such errors and have audit procedures, warning systems, redundancies, and so on, to ensure that potential errors are mitigated before a loss can be initiated.

      NOTE One of the most common examples of preventing accidental errors is providing a confirmation message that has to be acknowledged prior to the permanent deletion of an email message.
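      The confirmation pattern in the note above can be sketched in a few lines. This is a minimal illustration, not any particular mail client's implementation; the function name and accepted responses are assumptions made for the example. The key design choice is that the destructive action proceeds only on an explicit affirmative, so any ambiguous or accidental input defaults to the safe outcome.

```python
def is_deletion_confirmed(response: str) -> bool:
    """Treat only an explicit 'y' or 'yes' as confirmation.

    Any other input (including an empty string, a stray keystroke,
    or a mis-click mapped to Enter) cancels the permanent deletion,
    so the default outcome is the safe one.
    """
    return response.strip().lower() in ("y", "yes")


# Example flow: the prompt text makes the consequence explicit
# before asking for acknowledgment.
for answer in ("y", "", "n"):
    if is_deletion_confirmed(answer):
        print(f"{answer!r}: message permanently deleted")
    else:
        print(f"{answer!r}: deletion cancelled")
```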

      One fundamental misconception about awareness training is the belief that a properly trained user will not make mistakes. The reality is that even with the best training, a user will make fewer mistakes, not zero mistakes.

      Many people take for granted that common sense will help prevent a lot of mistakes. That might be an overly optimistic assumption. Either way, there can be no common sense without common knowledge. It is critical to ensure that all users are grounded in common knowledge. Training attempts to establish and strengthen this common knowledge.

      However, training frequently falls short. Some training provides an adequate amount of knowledge but is short on practical experience. Knowledge without application is short lived. A random piece of information will rapidly dissipate from memory and, without reinforcement, will be quickly forgotten. We explore this further when we discuss the concept of the forgetting curve in Chapter 5, “The Problem with Awareness Efforts.”

      Proper training should ensure that users understand what their responsibilities are and how to perform them. Ideally, training also impresses the need for users to be attentive in the performance of their duties. This requires accuracy and completeness in training, as well as motivation.

      Everyone has experience with inadequate training and can relate to the fact that such training results in loss. Fortunately, training can be strengthened to make it more effective. Chapter 15, “Creating Effective Awareness Programs,” addresses the improvement of training.

      As we talk about UIL, it is important to consider contributing factors to those losses. Everyone has experience with difficult-to-use systems that inevitably contribute to loss of some type. Some systems cause typographical errors that cause people to transfer the wrong amount of money. Some navigational systems cause drivers to go to the wrong destination or to drive the wrong way on a one-way street.

      Some user interfaces contribute to users making security-related mistakes. For example, one of the most common security-related losses results from autocomplete in the Recipients field of an email. People frequently choose the wrong recipient from a list of similar names. In one case, after one of our employees left our organization to work for a client and solicited bids from a variety of vendors, we received a proposal from a competitor. The competitor apparently used an old email to send the proposal to our former employee. In another case, we had a client issue an email with a request for proposals to various organizations, including us and our competitors. One competitor mistakenly clicked Reply All, and all potential bidders received a copy of that competitor's proposal.
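      A common mitigation for the autocomplete problem is a check that flags recipients outside the organization before the message is sent. The following is a hedged sketch of that idea; the function name and the set of internal domains are assumptions for illustration, not a feature of any specific mail system. A real client would use such a list to display a warning, giving the sender one deliberate chance to catch a mistaken autocomplete choice.

```python
def external_recipients(recipients, internal_domains):
    """Return the recipients whose domain is outside the organization.

    A mail client could show these addresses in a warning dialog
    before sending, prompting the user to verify each one.
    """
    flagged = []
    for address in recipients:
        # Split on the last '@' so the comparison uses only the domain.
        domain = address.rsplit("@", 1)[-1].lower()
        if domain not in internal_domains:
            flagged.append(address)
    return flagged


# Example: one internal and one external address.
to_verify = external_recipients(
    ["alice@example-corp.com", "bids@rival-firm.com"],
    {"example-corp.com"},
)
print(to_verify)  # only the external address is flagged
```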

      There are many aspects of technological implementation that contribute to UIL. The following sections examine design and maintenance, user enablement, shadow IT, and user interfaces.

      Design and Maintenance

      There are a wide variety of decisions made in the implementation of technology. These design decisions drive the interactions and capabilities provided to the end users. Although it is easy to blame end users when they commit an act that inevitably leads to damage, if the design of the system leads them to commit the harmful action, it is hard to attribute the blame solely to the end user. Such is the case in attempting to blame the Lion Air and Ethiopian Airlines pilots for the crashes of the doomed Boeing 737 MAX airplanes.

      In the implementation of technology, there are many common design issues that essentially automate loss. Programming errors can cause the crash of major computer systems. If this happens to a financial institution, transactions can be blocked for hours. If it happens to an airline's schedule systems, planes can be grounded until the problem is resolved.

      Another category of loss that many professionals fail to consider is the disposal of equipment. Just about all technology seems to have local storage. Before an organization discards computers, it generally knows to remove the storage drives. Many people know that they should wipe everything from their cellphones. However, many organizations and individuals fail to consider that the same diligence should apply to printers, copy machines, and other devices that had access to the organization's network or data.

      If a loss results from the decisions, actions, or inactions of a person, it is a loss that you have to consider in your risk reduction plans, and that includes loss that relates to design and maintenance.

      User Enablement

      While you can expect end users to make mistakes or be malicious, you do not have to enable the mistakes or malice. Unfortunately, some technology teams are doing exactly that. It is a given that users have to be able to perform their required business functions. However, you can design a user's access and function to limit the amount of loss they can initiate.
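      One way to design a user's function to bound potential loss is to cap what each role can do in a single action, so a mistake (or malicious act) can only go so far. The sketch below illustrates the idea with hypothetical per-role transfer limits; the role names and amounts are invented for the example and are not drawn from the book. Note the safe default: a role with no entry in the table is allowed nothing, rather than everything.

```python
# Hypothetical per-role caps on a single funds transfer.
ROLE_TRANSFER_LIMITS = {
    "clerk": 1_000,
    "manager": 25_000,
}


def transfer_allowed(role: str, amount: float) -> bool:
    """Permit a transfer only up to the cap defined for the role.

    Unknown roles default to a limit of 0, so the failure mode of a
    misconfigured account is 'can do nothing' rather than
    'can do anything'.
    """
    return 0 < amount <= ROLE_TRANSFER_LIMITS.get(role, 0)
```

      Even if a clerk mistypes an amount or is compromised, the loss initiated in one action is capped at the clerk's limit rather than the organization's entire balance.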

      As we discussed