
What They’re Doing Today

In the document William L. Simon (Page 108-112)

A decade after they were released, both seem to be settled into more traditional lives. Matt is currently working for a large company in San Jose as a Java application developer. Costa has his own company and sounds quite busy, “setting up digital surveillance systems and distributed audio clients (slimdevices) for businesses.” He’s found work that he’s well suited for; people bored with their jobs would be envious that he is, he says, “enjoying every minute.”

INSIGHT

It seems amazing in today’s world that hackers still find it so easy to saunter into so many corporate Web sites. With all the stories of break-ins, with all the concern about security, with dedicated, professional security people on staff or consulting to companies large and small, it’s shocking that this pair of teenagers was skillful enough to find their way into the computers of a federal court, a major hotel chain, and Boeing Aircraft.

Part of the reason this happens, I believe, is that many hackers follow a path like I did, spending an inordinate amount of time learning about computer systems, operating system software, applications programs, networking, and so on. They are largely self-taught but also partly mentored in an informal but highly effective “share the knowledge” tutoring arrangement. Some barely out of junior high have put in enough time and gained enough knowledge in the field that they qualify for a Bachelor of Science in Hacking degree. If MIT or Caltech awarded such a degree, I know quite a few I would nominate to sit for the graduation exam.

No wonder so many security consultants have a secret past as a black-hat hacker (including more than a couple whose stories appear in these pages).

Compromising security systems requires a particular type of mindset that can thoughtfully analyze how to cause the security to fail. Anybody trying to enter the field strictly on the basis of classroom learning would require a lot of hands-on experience, since he or she would be competing with consultants who started their education in the subject at age 8 or 10.

It may be painful to admit, but the truth is that everyone in the security field has a lot to learn from the hackers, who may reveal weaknesses in the system in ways that are embarrassing to acknowledge and costly to address.

They may break the law in the process, but they perform a valuable service. In fact, many security “professionals” have been hackers in the past.

Some will read this and put it down to Kevin Mitnick, the one-time hacker, simply defending today’s generation of hackers. But the truth is that many hacker attacks serve the valuable purpose of exposing weaknesses in a company’s security. If the hacker has not caused any damage, committed a theft, or launched a denial-of-service attack, has the company suffered from the attack, or benefited by being made to face up to its vulnerabilities?

COUNTERMEASURES

Ensuring proper configuration management is a critical process that should not be ignored. Even if you properly configure all hardware and software at the time of installation and you keep up-to-date on all essential security patches, improperly configuring just a single item can create a crack in the wall. Every organization should have an established procedure for ensuring that IT personnel who install new computer hardware and software, and telecom personnel who install telephone services, are thoroughly trained and regularly reminded, if not tested, on making certain security is ingrained in their thinking and behavior.

At the risk of sounding, here and elsewhere, as if we’re promoting our earlier book: The Art of Deception (Wiley Publishing, Inc., 2002) provides a plan for employee computer-security awareness training. Systems and devices should be security tested prior to being put into production.

I firmly believe that relying only on static passwords should be a practice of the past. A stronger form of security authentication, using some kind of physical device such as a time-based token or a reliable biometric, should be used in conjunction with a strong personal password, changed often, to protect systems that process and store valuable information. Using a stronger form of authentication doesn’t guarantee it can’t be hacked, but at least it raises the bar.
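To make the time-based token idea concrete, here is a minimal sketch of how such a device derives its one-time codes, following the TOTP scheme standardized in RFC 6238 (which builds on the HMAC-based truncation of RFC 4226). The book does not specify an algorithm; this particular scheme and the function name are illustrative.

```python
import base64
import hashlib
import hmac
import struct

def totp(secret_b32: str, at_time: int, step: int = 30, digits: int = 6) -> str:
    """Derive a time-based one-time code (RFC 6238 style)."""
    key = base64.b32decode(secret_b32)
    counter = at_time // step                       # which 30-second window we are in
    msg = struct.pack(">Q", counter)                # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Token and server share the secret; both compute the same code,
# but only while their clocks fall in the same time window.
secret = base64.b32encode(b"12345678901234567890").decode()  # RFC 6238 test secret
print(totp(secret, at_time=59))
```

Because the code changes every 30 seconds, a stolen code is useless moments later, which is exactly why it "raises the bar" over a static password.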

Organizations that continue to use only static passwords need to provide training and frequent reminders or incentives that will encourage safe password practices. Effective password policy requires users to construct secure passwords containing at least one numeral, and a symbol or mixed-case character, and to change them periodically.
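A policy like the one just described is simple enough to enforce in code. The sketch below checks the rules named above (a numeral, plus a symbol or mixed-case letters); the minimum-length threshold and function name are my own illustrative choices, not from the book.

```python
import string

def meets_policy(password: str, min_length: int = 8) -> bool:
    """Check the policy described in the text: at least one numeral,
    and either a symbol or mixed-case letters. The length floor is
    an assumed, illustrative value."""
    if len(password) < min_length:
        return False
    has_digit = any(c.isdigit() for c in password)
    has_symbol = any(c in string.punctuation for c in password)
    has_mixed_case = (any(c.islower() for c in password)
                      and any(c.isupper() for c in password))
    return has_digit and (has_symbol or has_mixed_case)

print(meets_policy("trustno1"))   # digit, but no symbol or mixed case: rejected
print(meets_policy("Trustno1!"))  # digit plus mixed case and a symbol: accepted
```

A check like this is best run at password-change time, so weak choices are rejected before they ever reach the password database.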

A further step requires making certain that employees are not catering to “lazy memory” by writing down the password and posting it on their monitor or hiding it under the keyboard or in a desk drawer — places any experienced data thief knows to look first. Also, good password practice requires never using the same or similar password on more than one system.

THE BOTTOM LINE

Let’s wake up, people. Changing default settings and using strong passwords might stop your business from being victimized.

But this isn’t just user stupidity. Software manufacturers have not made security a higher priority than interoperability and functionality. Sure, they put careful guidelines in the user guides and the installation instructions. There’s an old engineering saying that goes, “When all else fails, read the instructions.” Obviously, you don’t need an engineering degree to follow that bad rule.

It’s about time that manufacturers began getting wise to this perennial problem. How about hardware and software manufacturers starting to recognize that most people don’t read the documentation? How about providing a warning message about activating the security or changing the default security settings that pops up when the user is installing the product? Even better, how about making it so the security is enabled by default? Microsoft has done this recently — but not until late 2004, in the security upgrade to Windows XP Professional and Home editions with their release of “Service Pack 2,” in which the built-in firewall is turned on by default. Why did it take so long?

Microsoft and other operating system manufacturers should have thought about this years ago. A simple change like this throughout the industry might make cyberspace a little safer for all of us.


Chapter 5
