

CyberWar, CyberTerror, CyberCrime

A Guide to the Role of Standards in an Environment of
Change and Danger

Julie E. Mehan

Every possible effort has been made to ensure that the information
contained in this book is accurate at the time of going to press, and
the publishers and the author cannot accept responsibility for any
errors or omissions, however caused. No responsibility for loss or
damage occasioned to any person acting, or refraining from
action, as a result of the material in this publication can be
accepted by the publisher or the author.
Apart from any fair dealing for the purposes of research or private
study, or criticism or review, as permitted under the Copyright,
Designs and Patents Act 1988, this publication may only be
reproduced, stored or transmitted, in any form, or by any means,
with the prior permission in writing of the publisher, or in the case
of reprographic reproduction in accordance with the terms of
licences issued by the Copyright Licensing Agency. Enquiries
concerning reproduction outside those terms should be sent to the
publishers at the under-mentioned address.
IT Governance Ltd
Unit 3, Clive Court
Bartholomew’s Walk
Cambridgeshire Business Park
Ely, Cambs CB7 4EH
© Julie Mehan 2008
The author has asserted the rights of the author under the
Copyright, Designs and Patents Act 1988, to be identified as the
author of this work.
First published in the United Kingdom in 2008
by IT Governance Publishing.
Reprinted with corrections 2010.
ISBN 978-1-905356-48-5


This book was produced in an attempt to understand today’s
security environment and how the application of existing
international standards and best practices can be used to
assist in the protection of information systems and their
associated software. It identifies a body of knowledge
essential to acquire, develop, and sustain a secure
information environment. Each chapter contains a list of
related references, as well as recommendations for additional
reading. Taken as a whole, this book provides a foundation for
developing further education and training curricula and
products, as well as being useful to security practitioners,
system administrators, managers, standards developers,
evaluators, testers, and those wishing to be knowledgeable
about the establishment and sustainment of a secure
information environment.
The wild growth of the Internet is one of the most
remarkable phenomena in human history. It is much more
than just a medium for communication – it is the core of a
global information infrastructure, which is influencing our
culture at the same time as it insinuates itself into our daily
lives. There are predictions that this phenomenon is
changing everything from standards of literacy and monetary
transactions, to the practice of medicine.
Almost every new development has opposing aspects. For
example, the automobile has provided us with new means of
effectively and quickly covering distance and moving goods.
It has also created pollution, caused innumerable deaths
through accident and misuse, and has created a dependency
on limited fossil fuel.

The rapid development of the Internet is also not without its
benefits and costs. The Internet is now an almost universal
trade space for economic transactions, government decisions,
and social interaction. At the same time, it is a largely
unstructured terrain with few legal limitations and rules. The
result has been a digital ‘wild, wild West’ with the Internet
providing a fertile feeding ground for cyberwarriors,
cyberterrorists, and cybercriminals.
Given that we might agree on the need to create some form
of order in this Internet environment, the key question now is
how? The imposition of law and regulation is one solution.
The other may be a greater reliance on standards and best
practices that allow for regularity and fairness in managing
the broad issue of Internet regulation and structure without
adversely affecting the Internet’s open architecture. The idea
is to embrace the new information technology as a powerful
positive agent for change without ignoring the dangers
created by the deterministic nature of change.
I undertook the task of authoring this book to provide
security practitioners, managers, engineers, as well as
educators, trainers, and others with a companion to guide
them through the challenge of using international standards
to address security issues in an environment of cyberwarfare,
cyberterrorism, and cybercrime.
The guiding principle behind this book was to
straightforwardly provide a new context for addressing some
of the broader challenges of security. It is not intended to
provide detailed instructions for implementing technical
security solutions – there are already sufficient works that
provide this information. Rather, it is intended to encourage
security professionals to look at the dangerous environment
in which our information systems exist and to view the world

of international standards and best practices as a resource for
creating a culture of security within their organizations.
Driven by awareness of the rampant worldwide explosion in
the exploitation of information system vulnerabilities and
human frailty and naivety, demand is growing for means to
improve the protection of these systems in both the defence
and commercial sectors. Government and industry need
standardized and consistent processes that effectively and
efficiently acquire, develop, and sustain secure information
systems; means to justify confidence in these processes and
their products; and practitioners who are motivated,
disciplined, and proficient in their execution.
Determining what level of knowledge to presume of the
readers of this book took several twists and turns. Initially, I
took the existence of several significant configuration and
technical security guides as a starting point for establishing
presumption of knowledge. My goal was to provide a new
perspective, essential to ensure the protection of information
systems, detection of exploitable vulnerabilities, and the
rapid restoration of essential capability.
Fortunately, efforts to answer the question: ‘What are the
standards and best practices relevant to achieving consistent
security processes to address an environment of cyberwar,
cyberterror, and cybercrime?’ benefited from a number of
prior efforts and products. These will be found in the
reference and further reading sections in each chapter.
In concert with the guiding principle that the book should
simply provide a new perspective on the use of standards and
best practices to persons already possessing good security
engineering knowledge, I attempted to ensure adequate
coverage of requisite knowledge areas in contributing
disciplines to enable instructors and professionals from

several disciplines, such as software engineering, systems
engineering, project management, etc., to identify and
acquire competencies associated with the identification and
implementation of appropriate standards and best practices.
Finally, I would like to say that I have enjoyed authoring this
book and thank all of the helpful people involved.
Dr Julie E. Mehan, PhD, CISSP



About the Author

Dr Julie Mehan is a Principal Analyst for a strategic
consulting firm in the State of Virginia. She has been a
career Government Service employee, a strategic consultant,
and an entrepreneur – which either demonstrates her
flexibility or inability to hold on to a steady job! Until
November 2007, she was the co-founder of a small woman-owned
company focusing on secure, assured software
modernization and security services. She led business
operations, as well as the information technology governance
and information assurance-related services, including
certification and accreditation, systems security engineering
process improvement, and information assurance strategic
planning and programme management. During previous
years, Dr Mehan delivered information assurance and
security-related privacy services to senior department of
defence, federal government, and commercial clients
working in Italy, Australia, Canada, Belgium, and the United
States. She served on the President’s Partnership for Critical
Infrastructure Security, Task Force on Interdependency and
Vulnerability Assessments. Dr Mehan is Chair for the
development of criteria for the International System Security
Engineering Professional (ISSEP) certification, a voting
board member for development of the International Systems
Security Professional Certification Scheme (ISSPCS), and
chair of the Systems Certification Working Group of the
International Systems Security Engineers Association. She
also serves as an Associate Professor at the University of
Maryland University College, specializing in courses in
Information Technology and Organizational Structure, and
Ethics in Information Technology.
Dr Mehan graduated summa cum laude (with highest
honour) with a PhD from Capella University in Organization
and Management, focusing her research into challenges
facing Chief Security Officers in large government and
commercial organizations and development of a dynamic
model of Chief Security Officer leadership. She holds a
Master of Arts with honours in International Relations and
Law from Boston University and a Bachelor of Science
degree in History and Languages from the University of
New York. Dr Mehan was elected 2003 Woman of
Distinction by the Women of Greater Washington and has
published numerous articles including Framework for
Reasoning About Security – A Comparison of the Concepts
of Immunology and Security; System Dynamics, Criminal
Behavior Theory and Computer-Enabled Crime; The Value
of Information-Based Warfare To Affect Adversary Decision
Cycles; and Information Operations in Kosovo: Mistakes,
Missteps, and Missed Opportunities, released in Cyberwar
4.0. Dr Mehan is also fluent in German and has
conversational skills in French and Italian.
The author can be contacted at



Contents

Chapter 1: What Technology Giveth It Taketh Away
Chapter 2: CyberAttack: It’s a Dangerous World for Information Systems
Chapter 3: The Human Factor: The Underrated Threat
Chapter 4: Transition from an Environment of ‘FUD’ to a Standards-Based Environment
Chapter 5: Establishing a Culture of CyberSecurity
Chapter 6: Increasing Internationalism: Governance, Laws, and Ethics
Chapter 7: Standards: What Are They and Why Should We Care?
Chapter 8: From CyberWar to CyberDefence: Applying Standards in an Environment of Change and Danger
Chapter 9: Conclusion: Where Do We Go From Here?
Appendix 1: Gap Analysis Areas of Interest
Appendix 2: Standards Crosswalk
Index
ITG Resources




Purpose and scope
For readers with knowledge of security engineering, but not
of standards and best practices, this book introduces the
discipline of international standards and best practices and
points to references for further knowledge. It supplies the
background needed to recognize the topics that a reference
might cover and highlights the references most likely to be
of interest.
This book cannot, of course, enumerate the knowledge
needed in all possible fields in which secure information
systems are essential.
The period of human history in which we are living is often
called the information era. An era in which the whole world
has begun to communicate using information technology
(IT); an era during which information has become at least as
valuable as other, more tangible, resources. Modern styles of
life have caused major changes to the world economy. It is no
longer only the size of a company, or the money it possesses,
that makes it powerful – it is information. Information is
power; information is money; information is critical. Without
proper information, any organization is vulnerable to failure –
whether it is a production company, service enterprise,
commercial vendor, or government agency.


For the past two decades, together with the enormous growth
in the amount of information in everyday life, the problem of
data and information security has emerged as a global
concern. Because of the increasing value of information in
our life, it is essential to provide an environment where it can
be processed, stored, and transmitted correctly and securely.
Today, the growing concerns about cyberterrorism,
cyberwarfare, cybercrime, and the erosion of personal
privacy have governments and agencies around the world
crafting legislation and seeking the right standards to
implement in order to improve information security.
The need for a workforce more skilled in the engineering of
a secure information systems environment is clear. The
discovery – and potential exploitation – of vulnerabilities in
information systems by unauthorized, unethical, or criminal
individuals – as well as by the uneducated user – can have a
serious impact upon an owner in terms of increased costs
(recovery and remediation), and a negative impact on the
organization’s reputation.

‘Vulnerability: Weakness in an information system, system security
procedures, internal controls, or implementation that could be exploited
or triggered by a threat source.’ [NIST FIPS 200].

A study of Information Security Breaches conducted in 2006 by
PricewaterhouseCoopers on behalf of the UK Department of Trade and
Industry (DTI) measured the results of security breaches in several ways.
The results indicated that relying only on an analysis of the cash cost can
be misleading; rather, the impact on their reputation can be even more
devastating. Additional information on this study can be obtained at:


Increasingly, these incidents include the theft, destruction, or
compromise of critical confidential data processed by the
system, subjecting individuals to identity theft or causing
organizations to suffer significant losses from fraud. To date,
35 US states have introduced legislation to require certain data
breaches to be made known to the public, particularly when
personal data may have been compromised. Europe is
considering following suit. Such publicity has caused
damage to the reputations of even established firms,
resulting in loss of business, and has also prompted a
number of other states to enact similar data freeze and
notification laws. Current listings of such states can be found
The problem is not only the result of attempted attacks and
insertion of malicious software from both inside and outside
organizations but also other issues as well. Many security
incidents can be traced back to vulnerabilities that were
caused by inadequacies in software requirements, or defects
in software design, coding, or deployment configuration. The
combination of attacks with defects often results in computer
and software security problems that are frequent,
widespread, and often serious.
This book is a necessary preliminary step towards addressing
the challenges related to achieving adequate exposure to the
benefits of using international standards and best practices to
address the challenges of cyberwarfare, cyberterrorism, and

See the IT Week article for more about this.

ChoicePoint’s stock falling 20 percent in the period after an incident
was disclosed shows another potential impact.


cybercrime, as well as the unintended consequences created
by information systems users. These challenges include
addressing the skill shortages within government and
industry and curriculum needs within universities, colleges,
and trade schools.
The ultimate goal for this book is to introduce readers to the
practical use of standards and best practices to address
significant problems, such as those presented by cyberwar,
cyberterror, and cybercrime.
While the content of this document provides broad coverage,
readers interested in gaining an even deeper knowledge in
cyberwarfare, cyberterrorism, cybercrime, and international
standards are encouraged to also read the references
provided throughout this document.
PricewaterhouseCoopers (2006) Information Security
Breaches Survey 2006: A Technical Report. Developed on
behalf of the UK Department of Trade and Industry (DTI).
Bennett, Madeleine (2007) ‘UK Internet Users Want to be
Informed of Data Losses’. Information World Review, 04
May 2007.



‘Technology giveth and technology taketh away, and not always in
equal measure. A new technology sometimes creates more than it
destroys. Sometimes, it destroys more than it creates. But it is
never one-sided.’
Postman (1990)

Despite Postman’s dire prediction, society has profited
immensely from the development, implementation, and
operation of new information technologies. Our lives have
been enriched by the increased prosperity, expanded
opportunity, and greater variety that advances in information
technology provide.
From the printing press to the information age
The information age is a product of information technology.
This is not, however, its distinguishing feature. Despite what
many may believe, technology in some form has always
been a part of humanity, even in the most primitive of
societies. The factor that distinguishes the period of
information revolution following the invention of the
printing press, and the same factor that distinguishes our
technological world today, is that the entire human condition
has experienced radical change and has entered into a period
of recognizable growth dynamics based on information
expansion associated with technological innovations.
As with the printing press, the introduction of the new
Internet-based information technology is much more than

just a technological discovery to which society must adjust.
The explosive growth of the Internet – a worldwide
telecommunications network – and a global information
society have brought about a transformation of our social
systems. As a result, not only the information technology,
but also human beings, social relationships, economic
standards, norms, and ethical values have evolved.
There are visible parallels between events surrounding the
invention and proliferation of the printing press and the
societal changes that are appearing as a consequence of new
information technology. The parallels are so compelling that one
might contend the changes will be as dramatic as the events of
the scientific revolution, the spread of knowledge, and the
Reformation, which all had their roots in the propagation of
information as a result of the creation of the printing press.
Unintended consequences will certainly impact the future of
society as a result of the new information technology. The
cataclysmic societal and cultural changes that occurred
subsequent to the invention of the printing press were
completely unpredictable. In fact, it took more than a century
for these to be recognized.
The printing press
The invention of the printing press totally transformed the
way in which information was created, reproduced, sold, and
consumed. It brought into being new economic institutions
and relationships and altered old ones beyond recognition.
As a result, the printing press represents the only comparable
event in the history of communications to the recent
information technology revolution.


Gutenberg’s first printing press was invented by converting
an old wine press into a printing machine. His first prints
were made in the German city of Mainz in 1450, and by
1490 the printing press had permeated 110 cities in six
different countries and more than eight million books had
been printed; each providing access to information that had
never before been available to the average citizen. By the
end of the century the technology had spread throughout
Europe, setting in motion the first information explosion – a
precursor to today’s information revolution.
It is clear that the printing press radically altered the manner
in which information was collected, stored, retrieved,
criticized, discovered, and promoted – leading eventually
and inevitably to the Reformation, the Renaissance, and the
scientific revolution.
The printed works enabled by the printing press forced the
Reformation, for without crucial access to the printed
editions of religious texts and the emerging variations on the
relevant dogma issues, Martin Luther may not have had
sufficient incentive to develop his revolutionary new
theological concepts. Also, without enhanced access to the
creation of printed texts, Luther would not have been able to
spread his new ideas beyond a few elite.
The Renaissance also owes its spread across Europe to the
printing press. While there had been preceding efforts to
evolve humanistic concepts prior to the so-called ‘Italian
Renaissance’, it was not until the printing press and the
subsequent ability to put those ideas into the hands of the
average citizen that they were able to proliferate and thrive.
Nowhere was the effect of the printing press as evident as in
the scientific revolution. Science relies on the concept of the
accumulation of knowledge. The collection and universal

availability of scientific data relied on the printing press,
whereupon new contributions of knowledge could become
part of a permanent accumulation.
It must be noted that the printing press did not invent the
book; rather, it changed how books contributed to the
preservation and distribution of knowledge. Until the
printing press, books were meticulously hand-copied and,
consequently, distribution was limited to an extremely small
number of the learned and clerics. The printing press allowed
the production of thousands of copies of a single manuscript.
In essence, books were brought from the libraries of the elite
to the homes of the populace.
The printing press also changed how information could be
retrieved. Prior to the printing press, the ability to retrieve
information was largely dependent on the capability of an
individual to recall the location of the information. Indexed
books were essentially unknown. After the printing press,
however, indexing became part of a more orderly, systematic
approach to printed text.
One of the greatest, most immediate and most identifiable
consequences of the invention of the printing press was the
revolution in education and learning. Previously limited to
scholars and clerics, learning through books gradually
expanded to become part of the daily life of children and
adolescents; thus exposing young citizens to a very different
developmental process than that experienced by the youth of
medieval society.
If the printing press first fostered the positive concepts of
modern individuality, it was also a major factor in the
destruction of the medieval constructs of society and
community. The printing press represented an example of
technology that fostered change, creating both good and bad.

The path taken by society after the printing press has led
unalterably to what many term a revolution resulting in the
advent of the ‘new information technology’.

Further Reading:
Dewar, J. A. (2000) The Information Age and the Printing Press:
Looking Backward to See Ahead (P 8014) [Electronic version].
Retrieved June 20, 2002 from
Eisenstein, E. (1979) The Printing Press as an Agent of Change. New
York: Cambridge University Press.

The new information age
As emerging information technologies become increasingly
prevalent, it also becomes clear that society as a whole finds
itself in the midst of an information revolution as
profound and certainly as far-reaching as the one initiated by
Gutenberg and the invention of the printing press. As then, it
is not the technology itself that defines this information
revolution, but rather the unprecedented capability to enable
a degree of one-to-many and many-to-many communication
never before seen.


The futurist Alvin Toffler in his book, The Third Wave, described
technology in terms of three ‘waves’. The first wave was the Agricultural
Revolution, the second wave the Industrial Revolution, and the third
wave is the Information Revolution. He argued that the means by which
countries amass wealth is reflected in the way in which they wage war.
Thus, war in the information age would likely depend largely upon the
use of information technology.


Over 100 years ago, the emergence of the telegraph
continued the evolution in mass communication and
information sharing begun by Gutenberg and his printing
press. While it provided a new means of communication and
caused noticeable changes in the speed of communication, it
nonetheless remained limited by regulation and
technological capability; thus ensuring that it did not expand
beyond a select group of users. Consequently, its effects
were limited. A few generations later, the telephone appeared,
also altering the course of communication. But, like the
telegraph, the telephone also was limited in its expansion
capability and, consequently, its effects were also restricted.
Neither change in communications represented a revolution
in the society into which it was introduced.
From the beginning of records and through the industrial
age, land, human labour, and physical possessions were the
key ingredients of wealth. In this traditional paradigm, the
creation of wealth required the transformation of tangible
raw materials into some form of product. Over time, the
nature of the product has evolved until today we see
information and intellectual property serving as the raw
material for the development of wealth. There is hardly an
organization today that does not rely on information to operate.
Recent decades, however, have witnessed a radical change
equal in force to the printing press in the means by which
information is collected, stored, retrieved, criticized,
discovered, and promoted. The pervasive spread of
technology and the means of instant communication and
information sharing have created a second information
revolution. One of the distinguishing features of today’s
information revolution – just as in the day of Gutenberg – is


the affordability of the new technologies, as well as access
by the masses, rather than by an elite few.
Perceptions of the world and its population are being
changed through the availability of information in the form
of electronic media. Future generations may experience a
new form of information described through electronic
documents rather than the written word only. In fact, the
many-to-many communication medium of networked
technology facilitates the process of maintaining, updating,
and distributing knowledge, resulting in immediately
available and constantly updated information. Just as in the
period following the invention of the printing press and the
wider distribution of books and learning to the homes of the
populace, the increased availability and affordability of
technology that can collect, store, process, and transmit
information positions today’s citizens for similar
phenomenal change.
Not only has the capability to distribute and update
information been enhanced, but also the ability to retrieve
that information has taken another momentous leap.
This profusion of new technologies for collecting,
processing, transmitting, and displaying information – often
collectively called the ‘information revolution’ – is altering
the familiar political, economic, socio-cultural, and military
dimensions in ways that we do not fully comprehend, and at
a rate that people find difficult to accommodate. The
information explosion is affecting the global distribution of
Information technology also has the ability to shape the way
in which individuals interact with information and
knowledge. The new information capabilities enable rapid
access to information on any topic of immediate concern.

Individuals have access in real time to what is occurring
across the globe, resulting in a more informed and aware
populace. One of the groups affected most directly by
information technology and the associated information
revolution is our youth.
Peter Drucker pointed out that as early as the age of four,
children are displaying computer skills, perhaps even
surpassing those of their elders. They are growing up with
computers as their toys, their companions, and their tools.
Today, there is incongruence between the way schools still
teach and the way twenty-first-century children learn. This is
very similar to what occurred in the sixteenth-century
universities, a hundred years after the invention of the
printing press and the availability of books and access to
learning.
Such major changes in environmental or technological
conditions often stimulate new patterns of social
organization that in turn demand new cultural responses; e.g.
the development of new institutional arrangements and
behavioural norms appropriate to the altered conditions. This
process extends beyond learning how to implement the new
technologies to the more encompassing issue of social
reconstruction in the face of new environmental or
technological conditions. In other words, the prevailing
network of social, economic, and political considerations
influences how we respond as the challenge of adaptation is
accepted, as new technology is developed and as new
purposes are applied.

Peter F. Drucker was a writer, management consultant and university
professor. He coined the term ‘knowledge worker’ and is known for his
works describing the knowledge economy.


One further bastion of industrial society, the physical
presence of many commercial providers, has seen erosion as
a consequence of the new information technology.

Further Reading:
Drucker, P. (1994, November) ‘The age of social transformation’.
The Atlantic Monthly, 53-80.

The ‘dark side of high tech’
Information technology spans the globe and there is no doubt
that it has been beneficial for human civilization. And while
some nations have chosen to reject or delay the unrestricted
advance of information technology, for the most part, we
have all profited from its existence. Our lives have become
richer, prosperity has increased, and information technology
has provided a conduit for increased opportunity.
Throughout history, individuals have fallen blindly in love
with new technology while easily discarding the old. The
endless pursuit of new technologies has often been seen as a
panacea for resolving all the complex questions of existence.
Infatuated with the technology itself and not always aware of
its full implications, mankind can easily become a slave to
the technology. For example, cars were invented to provide a
more convenient and rapid means of transportation. But their
invention was followed by a long line of problems –
dependence on oil, rubber refining, and congestion – which
in turn generated a sequence of technical solutions, each
ultimately leading to environmental pollution, increased
traffic management challenges, and a whole host of thornier
problems.
So, for every beneficial advance in the area of information
technology, there is also an accompanying negative. In our
ever-growing dependence on information technology, we are
also exposed to increased risk. The dark side of the new
information technology is based on the ability to exploit
vulnerabilities associated with technology. The effects of this
exploitation not only have the potential to cause enormous
damage to individual victims, but also to negatively impact
confidence in the information technology itself. Information
technology has become essential to the everyday operation
of most organizations and businesses, and disruption of those
services could cripple a company – or even a nation.
Uses of the new technologies illustrate some of the darker
features of behaviour and raise issues that should not be
ignored. Among the most important are the potential loss of
privacy and the lack of adequate laws and practices to
protect individuals and groups from misuse of their personal
information.
New technologies also make it much simpler for those who
are so inclined to produce and consume what many would
consider undesirable kinds of entertainment – child
pornography, for example. Another unintended consequence
is the movement of traditional and new crimes to the world
of information technology, where hiding evidence of
criminal behaviour, or developing new forms of it, becomes
increasingly simple, especially as
society becomes more technologically informed.
So, what are the scenarios that keep those concerned with IT
security awake at night?

Increasing dependence: Increased societal and individual
dependence on computers and communication systems
makes these systems a target for attack. The terror threat
towards computer-intensive systems will grow as these systems
become more and more important to modern societies.

Increasing complexity: The increasing complexity of
networks creates an environment that may lead to
increased catastrophic failures. It is likely that no one
truly comprehends the complexity and interdependencies
of the networks that are being built. The networks will
continue to expand exponentially into a single, advanced
integrated IP network handling the majority of the
world’s communications needs. This converged,
broadband, intelligent network will extend well beyond
voice and data, local and long distance, supporting an
ever-widening array of services, and blurring distinctions
among networking, computing and applications. Driven
by e-business requirements and facilitated by
technological advances such as e-switching and
next-generation satellites, the increasing externalization of
networking will give rise to an environment where
applications, content, and data reside in the network and
are dynamically handled by network service providers in
real time, without user intervention.

Increasing content: An ever-increasing amount of data is
being compiled daily on our individual buying habits,
mobile phone usage, credit card purchases, and more.
Indeed terabits of personal data are being accumulated
and aggregated with little consideration for our privacy.
Content is at the core of business transactions,
publishing, and entertainment. The diversity, volume,
and effect of content will grow such that during the next
10 years, we will experience unprecedented levels of
interactive content, driving valuable revenue streams for
publishers, corporations, and media companies. Content
will be accessible almost anywhere via broadband. The
effects of this will stretch from the corporation into the
home, as rich media content will be stored and managed
in digital asset management systems. High-value content
will have to be delivered securely. In the enterprise, the
ongoing digitization of more and more information,
including document authorization, will usher in
digital process management for more and more business
processes.

Increasing mobility: Mobility represents the next major
business and technical discontinuity facing large
enterprises. While the PC and Internet revolutionized
communication systems, mobility will revolutionize
information flow affecting business users, customers, and
partners. As early as 2005, the Gartner Group anticipated
that by the year 2007 more than 60% of the EU and US
population aged 15 to 50 would carry or wear a wireless
computing and communications device for at least six
hours a day; by 2010 this was expected to rise to more than
75% (Source: Gartner Group, 2005). By 2010, less than 5
percent of global wireless subscribers will be using true
4G technology, but 15 percent will be using components
of a full 3G architecture based on LAN/WAN integration
and IP applications. In its 2008 predictions, inCode
announced 2008 as a year of increasing importance for
wireless, especially for the security implications of an
increasingly mobile wireless user population.


inCode, acquired by VeriSign in 2006, has been publishing its wireless
predictions annually since 2003.



Increased intelligent devices: While general-purpose
computers are interconnected via the Internet, billions of
miniature intelligent devices already inhabit the world,
with their number increasing faster than the human
population. The next ten years will bring new
capabilities: a) many physical objects will be coded and
therefore will become uniquely identifiable (radio-frequency
identification, or RFID); b) intelligent devices
will be embedded in many physical objects, and will be
networked via the (mostly) wireless Internet.

Increased globalization: More and more software and
hardware will be developed in low-cost countries such as
India and China. Commodity computer hardware,
firmware, and commercial off-the-shelf (COTS) software
are now being developed and manufactured in a number
of foreign countries. Some of these have traditionally
been openly hostile to the US, and some of their software
industries may even be subject to direct influence or
pressure from their governments. Frequently, the origin
of a given software application may be difficult or even
impossible to determine (especially in the case of open
source software). And still, many governments have
instituted policies to give preference to the purchase and
use of COTS software over custom-designed products.
Considering this, any hostile nation state or group with
software development capability and an agenda could be
in an ideal position to sabotage software or hardware
developed for export.

Could anyone have foreseen this dramatic turn of events?
Many consider the first individual to clearly address this
growth trend was a man named Moore. In 1965, Intel
Corporation’s co-founder and Chairman Emeritus, Gordon
E. Moore, postulated that the number of transistors per
square inch on integrated circuits doubles every year. This
observation, called Moore’s Law, implies that
computing power increases at a steady and predictable rate.
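To illustrate how quickly Moore's yearly doubling compounds, here is a minimal sketch; the starting count of 1,000 transistors and the ten-year span are hypothetical round numbers chosen for illustration, not figures from the text:

```python
# Sketch of Moore's Law as originally stated in 1965: transistor
# density doubles every year. The starting density and time spans
# below are hypothetical illustrative values.

def transistors(initial: int, years: int, doubling_period_years: float = 1.0) -> int:
    """Project the transistor count after `years`, doubling every
    `doubling_period_years` years."""
    return int(initial * 2 ** (years / doubling_period_years))

# Starting from a hypothetical 1,000 transistors:
print(transistors(1_000, 10))        # yearly doubling over a decade -> 1,024,000
print(transistors(1_000, 10, 2.0))   # the later, revised two-year rate -> 32,000
```

The gap between the two results shows why even a small change in the assumed doubling period alters long-range projections dramatically.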
Postman, N. (1990) Informing Ourselves to Death. From a
speech given at a meeting of the Gesellschaft für
Informatik, Stuttgart, Germany, October 1990.



‘It doesn't matter much, does it? If we lose some critical
infrastructure, we're still screwed.’
Ryan Russell (2001)


In 1981, science fiction author Frederik Pohl published a
novel entitled The Cool War. It describes a future world in
which war has been forbidden after the destruction of the oil
supply in the Middle East. Despite this ban, however, nations
continue to battle at lower levels of conflict. Workers are
intentionally infected with virulent strains of flu, power
supplies are sabotaged leading to regular power failures, and
water supplies are drained. While these actions could not be
considered warfare, they quickly become draining nuisances
with devastating impacts on each nation’s quality of life.
Many security professionals claim that, much like the levels
of conflict described by Pohl, the information age may bring
with it a constant barrage of threats and attacks ranging the
scale from cyberwar to cyberterror and cybercrime.


Comment made by Incident Analyst Ryan Russell in an interview
with NewsFactor Network after a successful 17-day attack against
California’s electric grid in 2001.


2: CyberAttack: It’s a Dangerous World for Information

Recently in the news …
In May 2007, newspapers worldwide reported that Estonia, a small
Baltic country and one of the most ‘wired’ nations in Europe, was
the target of waves of massive cyber attacks. Estonia is heavily
reliant on its information network and the attacks were debilitating.
Government and public websites were suddenly overwhelmed
by tens of thousands of requests, saturating server bandwidth
and resulting in a total loss of capability. The attacks appeared to
pour in from all over the globe, but Estonian officials and computer
security experts alleged that the attackers could be identified by
their internet addresses – many of which were Russian, and some
of which appeared to stem from Russian state institutions.
The allegations held that the attacks were provoked by
Estonia’s relocation of a Soviet Second World War memorial and
could clearly be classed as cyberwarfare.
Chinese information warfare activities worldwide have shown a
marked increase in recent months. According to US officials, the
Chinese press has openly acknowledged the formation of
officially-sanctioned information warfare (IW) units in the militia
and reserve since the year 2000. Personnel assigned to these
units would have significant expertise in information technology
and would be recruited from academies, institutes, and information
technology industries. During a military contingency, IW units
would be engaged to support active Chinese military forces by
conducting focused network intrusions or other forms of ‘cyber’
warfare against an adversary’s military and commercial computer
networks.


These sections highlight real events occurring around the world.


A very short history of war
Wars are fought within the context of their age. The weapons
used to fight the war are often determined by the prevalent
technology. But warfare is not just about the tools of war –
it is equally about the drivers of war: politics, human culture,
and society. In order to understand the nature of
cyberwarfare, it is essential to take a trip back in time and
discuss the emergence of warfare in general. What follows is
not a historically accurate accounting of warfare and
weapons from the Stone Age to the present, but rather, a
short trip to set the stage for the discussion of cyberwarfare.
One can assume that the first conflicts were fought hand-to-hand
with fists, feet, stones, and sticks. Combat was very
personal and very direct – something that did not change for
millennia.
The first revolution in warfare came about as stone weapons
gave way to bronze in about 3500 BC. This period saw the
emergence of new and more deadly weaponry, such as
swords, the penetrating axe, the bow, and the chariot. It
would be wrong, however, to see this new revolution only in
terms of the increased deadliness of the weapons. By itself,
improved weaponry would only have resulted in a limited
increase in the scale of warfare unless accompanied by the
emergence of social structures capable of sustaining large
armies and providing them with the incentive, as well as the
means, to fight on a scale never before encountered. The true
military revolution of the Bronze Age was rooted in the
evolution of complex societal structures.

In fact, there is a common saying in today’s military: ‘Offering the
military a technical solution is like offering an alcoholic a glass of wine.’
These structures included central state institutions and an
administrative apparatus capable of diverting resources
towards community-based goals, inevitably facilitating the
growth of more stable military structures. The result was the
expansion and stabilization of a fully articulated military
structure organized along modern lines. The military caste
evolved into a permanent element of the community and
social structure and was endowed with strong claims to
social legitimacy. It was in Sumer and Egypt that the world
witnessed the emergence of the first true military
forces. And they have been with us ever since.
This change was accompanied by a profound evolution in
the psychological foundation for the public’s social
relationship with the larger community. The centralization of
large numbers of individuals into complex societal structures
became associated with a refocus of their primary
allegiances away from the extended clan or tribe, and toward
a larger social entity known as ‘the state’.
The most profound change occurred as the conduct of war
became a legitimate social function supported by its own
far-reaching institutional infrastructure. In fact, warfare itself
had to become a defining characteristic of the social order if
the larger communities, or states, were to survive the
predatory behaviour of other communities. The emergence
of warfare as a means of furthering a group or state agenda
was accompanied by a wide array of social, political,
economic, psychological, and military changes that made the
existence of war a relatively normal part of the social
framework. Within two thousand years, mankind went from
a condition in which combat was relatively rare, where death
and destruction were suffered at low rates, to one in which
death and destruction occurred on an almost modern scale.
The essential nature of warfare changed little in the
thousands of years that followed the wars fought by the
military forces of Sumer and Egypt. The ages through which
mankind passed, in particular the Iron Age, served to cement
the role of warfare in the psychology of the state and the
individual.
The Hundred Years’ War (1337–1453) marked the initiation
of modern warfare. Increased loyalty to the nation-state was
witnessed as a series of dynastic wars crystallized national
identities. This period also marked the emergence of
permanent, professional armies. The introduction of
gunpowder, together with the later evolution of small arms,
such as the musket, dramatically changed the face of
warfare. For the first time, the military had a portable,
relatively accurate, reliable – and deadly – weapon.
The most horrific conflicts seen by mankind occurred in the
last century and were defined by the use of weaponry such as
guns, cannons, tanks, missiles, and bombs. What
distinguished these conflicts was the symmetrical nature of
the forces pitted against one another – massed power against
massed power. The mindset of conventional warfare
continued to dominate thinking until the dissolution of the
Soviet Union in 1991. The likelihood of another
massive force-on-force battle on the plains between
Europe and Russia became a forgotten possibility.
In his book, On War, Karl von Clausewitz defined warfare as
‘an act of violence intended to compel our opponent to fulfil
our will’. And this definition is as valid for the use of
massive force-on-force as it is for the use of small-scale
incendiary devices used by terrorists.
What has become far more prevalent is an increasing number
of asymmetric conflicts – counter-insurgency operations,
low intensity conflicts, urban combat and peacekeeping
operations – that necessitate a vastly different set of tactics,
equipment, training, and skills than conventional military
engagements of the past. Conflicts today are unlikely to
involve the commitment of massive numbers of troops to
fixed battle zones; rather, they are fought by smaller
military units combating small groups of fanatical terrorists,
or even individuals, sometimes using fairly unsophisticated
tactics and technologies, capable nonetheless of rendering
significant harm.
The experience of the millennia between Sumer and Egypt
and the mid-twentieth century focused on a centralization of
individuals within a larger entity, such as a nation state. In
recent decades, however, vertical social, cultural, ethnic, and
religious integration has given way to horizontal
decentralization and factional polarization away from the
larger state concept.
Prussian General Karl von Clausewitz was a renowned military
strategist during the period around the Napoleonic wars. He wrote On
War largely between the years of 1817 and 1830, but he did not live long
enough to see its completion. It was published posthumously in 1832.
Nevertheless, it is still regarded as one of the seminal works on war and
strategy, largely because it was one of the first treatises on the subject to
integrate political, social, and economic issues into the discussion of war.

With the advent of new information technology, warfare
may no longer be limited to the clash of sword upon sword
characteristic of nation-state warfare. It is also to be expected
that a large percentage of the conflicts facing modern nation
states, such as the US and the UK, will be of a devolving and
asymmetrical nature. Threats will likely come from diverse
and differing sectors. Of great concern is the certainty that
conventional terrorism and low-intensity conflict could be
linked to, or magnified by, computer-infrastructure attacks
that are capable of causing damage to vital commercial,
military, and government information and infrastructure.
Thus, while advanced nations gain tremendous advantages
from highly developed information and battlefield
management systems, they also become increasingly
vulnerable to cyberattacks from unknown adversaries. So,
what has changed in the definition of warfare proposed by
Clausewitz? The definition no longer captures warfare
in a world where it may be conducted
without violence and yet achieve the same aim.
Recent media reports indicate that news agencies have
become increasingly aware of the emergence of this
potentially catastrophic threat of cyberwar. Even respected
media organizations, like the British Broadcasting
Corporation (BBC), report on nations engaging in
cyberwarfare.

The term ‘cyber’ first originated as part of the word ‘cybernetics’,
coined by Norbert Wiener in his eponymous book. It has since spawned a
whole generation of cyber terms, starting with William Gibson’s first use
of ‘cyberspace’ in his novel Neuromancer.

A word of caution: media discussion of
cyberwar, like its discussion of many subjects, should be
digested carefully to separate the truth from sensationalist
reporting, designed more to excite the reading audience than
to accurately portray an emerging and scarcely researched
field. Media reports may thus not be particularly useful in
truly defining cyberwarfare, although they may play a vital
role in creating a public perception of cyber conflicts and
their potential for damage. As a consequence of increased
media attention, there also exists an increased public
awareness of cyberwarfare, though it is based less on studies
and research and more on assumption and speculation.
Cyberwarfare is being examined by numerous governmental
and academic institutions, including the RAND Corporation,
the Center for Strategic and International Studies, and the
Carnegie Mellon University Software Engineering Institute’s
Computer Emergency Response Team (CERT). Others, like the
Information Warfare Site (IWS), provide an
independent perspective on cyberwarfare. Consistent among
the institutional commentary on cyberwarfare is the
observation that it is becoming an increasingly important
security consideration. The potential vulnerabilities of
critical infrastructure components, the ability to engage in
remote and essentially bloodless attacks, the certainty of
other nations having the ability and intent to develop
weapons for use in cyberwarfare, and the possibility of
establishing information control within the conventional
battle space are challenges too critical to disregard.
It is thus not surprising that most of the discussions of
cyberwarfare originate from national or military sources
responding to the perceived vulnerability of the national
critical information infrastructure to a potentially devastating
high tech attack. Reflections on cyberwarfare are fuelled by
two identified trends: one is the increased terrorist threat and
terrorists’ potential use of cyber tools; the other is the formal
articulation of cyberwarfare policies in other nations.


As early as 1999, Chinese military colonels Qiao Liang and
Wang Xiangsui suggested that the increasing adoption of
information technology would bring about a revolution in
strategic thinking in military affairs, and the use of such
technology by China would be ‘highly critical to achieving
victory in future wars’.
What is cyberwar?
The terms information warfare (IW) and cyberwarfare are
often used synonymously and many authors treat them as
interchangeable. This is not the case. IW is much broader
and includes any technique to disrupt or affect an entity’s
information use – cyberattack, electronic warfare,
psychological operations, or intentional deception. By using
the terms interchangeably, IW is inadvertently tied to
cyberwarfare even though it has a much broader mandate.
This failure to create a coherent definition of cyberwarfare
inhibits a more effective study of its unique characteristics.
We are concerned primarily with the cyberwarfare
component of IW.
Comments published in Unrestricted Warfare (Qiao Liang and Wang
Xiangsui, Beijing: PLA Literature and Arts Publishing House, 1999).
This book proposes tactics for developing countries, in particular
China, to employ as compensation for their perceived military
inferiority vis-à-vis the US during a high-tech war. The fact that the
authors are senior officers in the People’s Liberation Army implies
that its release, and likely its content, was officially approved by the
Chinese government.

Although cyberwar is only one tool in the IW tool chest, we
can still capitalize on the accepted classifications used to
describe IW and apply them to the narrower set. As
such, Class I cyberwar is concerned with the protection of
personal information – or personal privacy. While the results
can still be devastating, Class I cyberwar is considered to be
the lowest grade.
Class II cyberwar concerns itself with industrial and
economic espionage, which can be directed against nations,
corporations, universities, or other organizational structures.
This form of cyberwar is definitely on the rise.
Class III cyberwar is officially about global war and
terrorism, which includes cyberterrorism, but which may
also include attacks against other parts of the critical
infrastructure. Whether discussing the intentional trashing of
another entity’s personal computer or network, or a denial of
service, and whether the offending party is a malicious
hacker, a criminal extortionist, a true terrorist (who probably
regards himself as a martyr to whatever cause), or a foreign
government, the end result falls into the same category.
Finally, Class IV cyberwar is the use of all of the techniques
of Classes I – III in combination with military activities in an
effort to obtain a battlefield advantage or a force multiplier.
Cyberwarfare thus encompasses both information-related
conflict at the national, military level, and low
intensity conflict activities intended to inflict limited levels
of damage.
The cyberwar Class I–IV designations are adapted from Dr Dorothy
Denning.

What could cyberwar look like? Most experts anticipate that
cyberwar will be used in conjunction with or in support of
other, more kinetic, events. The primary purpose would be
to enhance the effect of a physical attack and could occur as:

An attack against supporting infrastructures, such as
telecommunications, transportation, or power providers.

An attack against complementary infrastructures, such as
financial institutions.

One of the primary criticisms of the proponents of cyberwar
is that they assume conflict in a high-tech environment.
Arguments rage between technology advocates and the
‘dinosaurs of warfare’ over the fact that many of our current
adversaries are not high-tech nation states or organizations,
but rather low-tech groups or loosely organized individuals.
These debates have validity and there are convincing
arguments on each side. The common level of agreement,
however, is that the enormous growth in information
technology has provided opportunities for attack never
before seen and that doctrine, structure, and processes must
be adapted to encounter them effectively.

Further Reading:
Adams, J. (1998) The Next World War. New York: Simon & Schuster.
Alberts, D.S., Garstka, J. J. and Stein, F. P. (1999) Network Centric
Warfare: Developing and Leveraging Information Superiority.
Washington, D.C.: CCRP.
Campen, A.C. and Dearth, D.H. eds. (October 2000) Cyberwar 3.0:
Human Factors in Information Operations and Future Conflict.
Fairfax, Virginia: AFCEA International Press.
Pohl, F. (1981) The Cool War. New York: Ballantine Books.


‘In a matter of time you will see attacks on the stock market, I
would not be surprised if tomorrow I hear of a big economic
collapse because of somebody attacking the main technical
systems in big companies.’
Sheikh Omar Bakri Muhammad


Recently in the news…
In the autumn of 2007, several Israeli newspapers reported that
Al-Qaeda intended to launch what they called ‘an electronic jihad’
against Western websites starting 1 November 2007. Massive
Distributed Denial of Service (DDoS) attacks were to be launched.
In August 2006, there were news reports of Hezbollah trawling the
Internet for vulnerable sites they could compromise and use as
undetected communications paths and to broadcast terrorist
propaganda.
2006, creators of combat video games have unwittingly been
given a role in a global propaganda campaign conducted by
Islamic militants. The games are used to encourage
impressionable Muslim youths to take up arms against the US.
Technically-savvy militants from organizations, such as Al-Qaeda,
have manipulated video games to portray US troops as bad guys
in gun battles against heavily armed Islamic radicals, portrayed as
heroes. The gaming sites use diverse emotionally-charged
content, including images of successful attacks against real US
soldiers in Iraq and video recordings of US televangelists making
criticisms of Islam. The underlying propaganda message,
officials say, is to portray the US as engaged in an anti-Islam
crusade in order to gain control of Middle Eastern oil, and that
Muslims worldwide should unite to protect Islam from US
aggression.

The BBC News (August 2005) described Sheikh Mohammad as a
‘self-styled radical cleric’ and founder of the London branch of the
radical Hizb Al-Tahrir (the Islamic Liberation Party). He presents
himself as a spokesman of Osama bin Laden’s International Islamic
Front for Jihad against Jews and Crusaders.

Although these attacks did not occur, the possibility of this type of
attack is being taken seriously by potential targets worldwide, such as
the United States and the United Kingdom.

The concept of cyberterror strikes fear into some of the most
seasoned security personnel. Cyberterrorists and their
sponsoring nations or organizations have cyber capabilities
to exploit computer security vulnerabilities.
Significant discussion is occurring over whether terrorist
organizations have been actively planning to use computers
as a means of attack against critical infrastructure elements.
There is also disagreement among some experts about
whether critical infrastructure computers even present a
lucrative target for furthering terrorists’ goals. It is a fact,
however, that terrorist organizations are now using the
Internet to communicate, and openly available intelligence
reports indicate that Al-Qaeda and other terrorist groups may
be using computer technology to plan future terrorist attacks.
At the same time, nuisance attacks against computer systems
and the Internet are increasingly widespread, furthering the
notion that known computer system vulnerabilities persist
despite growing concerns about possible effects of a
successful exploitation on the critical infrastructure.
So, what is the appeal of cyberterrorism? First, terrorists may
be unwilling, or more importantly, unable to engage ‘force
on force’. Like the cyberwarrior, they resort to asymmetric
attacks to achieve their goals. Also, cyberterrorists can be
patient. Their planning can take years, taking advantage of
insider ‘placement’ to conduct reconnaissance rather than
insider attacks with limited short-term effects.
Terrorist groups today make use of the Internet to
communicate, raise funds for their activities, and to gather
intelligence on possible future targets. There is currently no
published evidence that computers and the Internet have
been directly used in, or targeted by, a terrorist attack. But there
is more than sufficient evidence that currently available
malicious attack programs can enable almost anyone with
computer and network access to locate and attack other
vulnerable information system computers. Terrorists also
have access to these same malicious programs, as well as
hacker techniques that may be used to launch a widespread
cyberattack against any nation’s critical infrastructure.
Before proceeding further, let’s define cyberterrorism: First,
start with a definition of terrorism as ‘premeditated,
politically motivated violence perpetrated against non-combatant
targets by sub-national groups or clandestine
agents, usually intended to influence an audience’.
The following definition of cyberterrorism was proposed in
2002 by Ron Dick, head of the US National Infrastructure
Protection Division, which is embedded within the
Department of Homeland Security: ‘Cyberterrorism is a
criminal act perpetrated through computers resulting in
violence, death and/or destruction, and creating terror for the
purpose of coercing a government to change its policies.’


As defined in 22 US Code, Section 2656f(d). This definition has been
used by the US Government for statistical and analytical purposes since
the early 1980s.


If the above two concepts are used in tandem, however, then
the definition of cyberterrorism would be expanded to
incorporate the politically-motivated use of computers by
terrorist groups, sub-nationals, or clandestine agents as
weapons or as targets intended to result in violence,
influence an audience, or affect national policies.
Finally, an attack on information systems might also be
categorized as cyberterrorism if the effects are sufficiently
destructive or disruptive as to generate public fear
comparable to that from a physical act of terrorism.
Many security experts claim that known terrorist
organizations may not see the immediate value in using the
Internet as a means to launch an attack, largely because it
creates less immediate drama and has a lower psychological
impact when compared with the traditional physical attack.
Some contend that until cyberterrorism results in direct
physical damage or loss of life, it will never be considered as
serious as a nuclear, biological, or chemical terrorist attack
by either the potential victims or the perpetrators. The
Internet may continue to be largely utilized by terrorist
organizations as a tool for surveillance, co-ordination, and
communication, rather than a mechanism of cyberterrorism –
that is, until a cyberterror event can be executed to attract as
much media attention as any physical terror event.
According to Dr Dorothy Denning18, executing a
co-ordinated, large-scale attack against critical infrastructure
information systems demands significant resources to
develop technically sophisticated hackers and to conduct
pre-operational information-gathering activities.
The Center for the Study of Terrorism and Irregular Warfare
at the Naval Postgraduate School in Monterey, California,
issued a white paper in 1999 entitled Cyberterror: Prospects
and Implications. Its purpose was to assess the demand side
of terrorism. More specifically, the authors evaluated the
likelihood of terrorist organizations using cyberterrorism as a
method of attack. Their analysis at the time concluded that
the barrier to entry for anything beyond low-level hacks
would be high, and that terrorists generally lack the means
(whether financial or intellectual) and human capital
essential to the execution of an operation of any
consequence. They further argued that cyberterrorism was a
thing of the future.
The authors of the paper defined three levels of cyberterror
capability to quantify the skill sets and resources essential to
execute an attack: simple-unstructured, advanced-structured,
and complex-co-ordinated. Each level indicates an
increasing ability by a cyberterrorist and/or terrorist
organization to execute attacks ranging from those limited to
single machines to attacks resulting in mass disruption; it
also indicates organizational capability to use limited or
highly sophisticated target analysis, tools, and command and
control. It was estimated that it would take a group starting
from nothing at least two years to reach the
advanced-structured level and more than six years to attain
the complex-co-ordinated level.


18 Dr Denning, formerly the Patricia and Patrick Callahan
Family Professor of Computer Science at Georgetown
University and Director of the Georgetown Institute for
Information Assurance, is renowned for her work in
cybersecurity, cybercrime, and cyberterrorism. She has been
frequently called to provide testimony on cybersecurity
issues before the US Congress.
The view in 1999 was that ‘cyberterror is not a threat. At
least, not yet and not for a while.’ This view changed in
2001. According to David E. Kaplan, evidence acquired after
the attacks of September 11, 2001 against the US World
Trade Towers and the Pentagon strongly suggests that the
terrorists of Al-Qaeda made extensive use of the Internet to
plan and co-ordinate their operations.
Terrorist cells, such as Al-Qaeda, reportedly use
Internet-based communication methods, such as instant
messaging, electronic bulletin board systems, and Voice over
IP (VoIP) telephone services, such as Skype, to communicate
internally and with other terrorist cells. Khalid Shaikh
Mohammed19, one of the masterminds of the plot against the
World Trade Center, is reported to have used special Internet
chat software to communicate with at least two of the airline
hijackers.
Indications are that attacks by isolated groups or individuals
sympathetic to terrorist motivations, as well as those with
less specific anti-US and anti-allied sentiments, may be more
likely than direct attacks by terrorist groups or affiliated
nation-states. Many sympathizers perceive the current US
campaign against terrorism to be a crusade targeting
members of the Muslim faith. Perceptions such as these
could incite Muslims, or their sympathizers around the
world, to engage in cyberterrorist activities, and many have
launched sophisticated and sustained cyberattacks. There are
a variety of pro-Islamic hacker groups, such as the UNIX
Security Guards (USG)20 and the Federal Bureau of Hackers
(FBH)21, which provide examples of how terrorist affiliates
could potentially utilize these tactics against targeted
Western nations.
This section is not intended to provide a longitudinal study
of media publications or intelligence documents to determine
whether there are more events supporting cyberterror than
there are those that disprove it. Like the prisoners in Plato’s
cave22, however, there are shadows on the wall that represent
the shape of a future environment in which cyberterror is a
clear and present danger.


19 Purported to be one of Al-Qaeda’s senior operatives,
Mohammed was captured in 2003 and sent to the US
Detention Centre in Guantanamo, Cuba. He is also alleged to
have personally decapitated the journalist, Daniel Pearl, in
2002.

20 The USG is an anti-Israeli hacker group estimated to have
emerged sometime around May 2002. In that year alone, over
2000 attacks were attributed to its activities. For the most
part, these were large-scale website defacements.

21 Formed in July 2002, the FBH membership is largely from
Pakistan. Mi2g estimates that the FBH was responsible for
588 attacks in October 2002 alone.

22 In The Republic, the Greek philosopher Plato describes a
group of prisoners held in a cave with their backs to the light
at the entrance to the cave. They can see objects only as
shadows reflected on the walls of the cave. When they are
eventually released, they are initially confused by the clarity
of the actual objects they view. Over time, the prisoners
become accustomed to seeing objects as they are in reality.
Upon their return to the cave, these prisoners now see the
shadows only as a distortion of truth, but the prisoners who
had never left are disturbed by those who challenge
everything they know to be true.

