A comparison to the "Year 2000" (Y2K) threat is instructive. With Y2K, the risk applied to every processor-based product containing a clock function, regardless of manufacturer. A single sample of any "simple" device (e.g., an electronic clock) could be tested by setting the clock forward to December 31, 1999, letting it run through midnight, and verifying normal function on January 1, 2000. For large, complex systems (e.g., mainframes), the operating system and application software providers had to scan and/or test their software to ensure correct date handling. Data maintained or used by the system was also at risk, since years were often stored in two-digit format.

This new risk is different. A terrorist cell operative with programming experience could apply for a job with one or more "target" companies. Once hired, the operative would "sleep": learning the company's software tools and processes, possibly recommending the hiring of additional operatives, and awaiting a chance to obtain routine, unrestricted access to the code base. The operative could even report back to the cell on the best way to harm the system, and might wait for specific instructions before acting. Once the code base was modified to include the harmful code, set to execute only on (or after) a certain date, the operative could either resume normal activities (assuming the change would escape detection) or quit the company.

Modern change management systems (e.g., IBM/Rational Software's ClearCase®) can help protect the code, since each change is attributed to a particular user, but only if all code changes are reviewed specifically with this threat in mind. Remember: stealing someone else's user ID and password is easily accomplished given enough time in a friendly environment.
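To make the review target concrete, the date-conditional trigger described above can be sketched in a few lines. This is a hypothetical illustration only; the function name and trigger date are assumptions, not taken from any real incident. The point is how small the suspicious code can be:

```python
from datetime import date

# Hypothetical hard-coded trigger date (an assumption for illustration).
TRIGGER_DATE = date(2030, 1, 1)

def is_armed(today: date) -> bool:
    """Return True once the trigger date has been reached.

    A hostile change buried in a large code base might gate its
    harmful branch on exactly this kind of comparison, so the code
    behaves normally in all testing done before that date.
    """
    return today >= TRIGGER_DATE

# Before the trigger date the check stays quiet; afterwards it fires.
print(is_armed(date(2029, 12, 31)))  # False
print(is_armed(date(2030, 1, 1)))    # True
```

A review process aimed at this threat would therefore flag any comparison of the current date against a hard-coded future constant, since legitimate application code rarely needs one.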