Out of the Ordinary: Finding Hidden Threats by Analyzing Unusual Behavior / Edition 1

by John Hollywood
ISBN-10:
0833035207
ISBN-13:
9780833035202
Pub. Date:
01/05/2005
Publisher:
RAND Corporation
Paperback

$27.50


Overview

Presents a unique approach to selecting and assembling disparate pieces of information to produce a general understanding of a threat.

Product Details

ISBN-13: 9780833035202
Publisher: RAND Corporation
Publication date: 01/05/2005
Edition description: New Edition
Pages: 192
Product dimensions: 5.88(w) x 9.00(h) x 0.53(d)

Read an Excerpt

Out of the Ordinary

Finding Hidden Threats by Analyzing Unusual Behavior
By JOHN HOLLYWOOD, DIANE SNYDER, KENNETH McKAY, and JOHN BOON

RAND CORPORATION

Copyright © 2004 RAND Corporation
All rights reserved.




Chapter One

Introduction

"I think anything out of the ordinary routine of life well worth reporting."

Sherlock Holmes, in Sir Arthur Conan Doyle's The Hound of the Baskervilles

Prologue: Something Bad Happened on November 9th

(A hypothetical but unfortunately realistic case study)

In conducting a post-mortem of the sad events of November 9th, it is important to consider the events and timelines leading up to the incident. By mid-November, the media were clamoring for details on who knew what, what was known when, how the "obvious" signals could have been missed, and how the "dots" could have failed to be "connected" ... again. By the middle of December, investigative reporters and official government investigators had disclosed that the following observations had existed in various government databases (federal and local) since the middle of October:

February 4

Two dozen tuna boats are ordered in Seattle for export to Singapore under Panamanian ownership.

June 13

A city permit is issued for an Arab student forum to be held in Hong Kong in mid-November.

August 9

Two dozen new tuna boats eventually arrive and register in Sydney's harbor.

September 8

A Panamanian-registered company makes an application for eighteen berths for tuna boats in Singapore.

October 2

At a reception at the British Embassy in Singapore, an Australian diplomat reports hiring and work oddities in Sydney harbor involving new tuna boats being repainted.

October 4

In Singapore, a new firm registered in Panama is reported as trying to pressure local officials to approve special berthing privileges on very short notice without the proper paperwork.

October 6-7

Over a hundred Arab students from ten countries book travel for November 10 to Hong Kong through Singapore.

October 10

A wharf in Philadelphia is leased to a Panamanian firm.

October 11

A routine inspection at a wharf in Philadelphia identifies what appear to be tuna boats off-loading heavy crates.

October 12

Two Arab students are detained for having false driver licenses in Singapore.

Abandoned luggage is found at the Singapore airport and a report is filed.

As the investigation continued, a few points became clear. The first was that although some of the above data points (the "dots") were clearly suspicious in retrospect, it would have been virtually impossible to pick them out of the huge volume of noise inherent in modern intelligence processing and analysis, even with standard filtering techniques. Similarly, although the connections between the dots were also obvious in retrospect, the intelligence community and homeland security agencies simply were not designed to support the discovery of such links or to perform the follow-on analysis needed to determine what the connected dots might mean. New strategies were clearly needed....

(Appendix A presents the complete case study of the "November 9th affair.")

The Problem of Connecting the Dots in Intelligence

Too small. Too few. Too sparse. Too irregular. Too contextual. These characteristics of data about the "bad guys" are today's challenges. Predicting how adversaries will act is easy to do in hindsight but hard to do in advance. If their behavior is regular, or if the challenge is bounded, analyses that identify systematic behavior can be and have been successful. However, with the current and growing asymmetric threats, new tools are needed to exploit characteristics that are too small, too few, too sparse, too irregular, and too contextual.

Traditional approaches have assumed larger, more observable, less agile, and less creative adversaries. The new adversaries are far less tangible and more elusive. The challenge is compounded by a growing data glut, increasing noise in the environment and decreasing time available to perform analysis. To complicate matters, we cannot assume that the adversary will attack the same way twice. Projects such as the Novel Intelligence from Massive Data (NIMD) program propose innovative ways to deal with some of these challenges and have significant potential to help find entirely new and meaningful relationships in large-scale data sources. However, a key aspect not addressed by the projects of which the authors are aware is how analysts initially identify points of interest that do not meet narrowly defined criteria-in other words, the dots. The closest analogy to this key part of the process is that of astute problem solvers who, like the fictional Sherlock Holmes, track certain characteristics to recognize out-of-the-ordinary situations that can yield clues about events and activities. Something was supposed to be there but was not. Something was there but it wasn't supposed to be. The activities are unusual-our suspects are acting differently. These out-of-the-ordinary observations yield insights into what may happen in the future.

Another key aspect not commonly addressed is how to connect the dots-to identify the context of the out-of-the-ordinary data and to generate and test hypotheses related to what the connected dots might mean. In the past, when the amount of available intelligence information was comparatively limited, analysts could keep track of a complete picture of a situation. For example, R. V. Jones (1978) explicitly notes how having one analyst accessing the complete information stream and seeing the big picture was critical for many World War II intelligence successes. However, in World War II, comparatively all-seeing analysts were possible since data gathering was largely manual and limited by scarce resources. The challenge today is much greater, given both the volumes of intelligence information available and the numerous technical, organizational, and policy barriers to synthesizing information from multiple sources.

The intelligence community (IC) today draws on a disparate, heterogeneous assortment of collection and analysis systems, many of which were designed without any intention that their inputs and outputs would ever be used in an integrated, cooperative fashion. Since the mid-1980s, the IC has focused on developing numerous analysis support systems, knowing that it will need to draw on data in every imaginable form. However, we are not even to the point of having all necessary data in electronic form. Historically, both technical and nontechnical barriers-such as organizational policies, cultures, and security-have limited the usefulness of analytic support tools. Nonetheless, recent progress in integrating collection and automated analysis systems and in organizational collaboration through task forces, interagency centers, and ad-hoc working groups has increased the prospect for dramatic improvements in data analysis.

To date, most analytical support tools have leveraged what the tools' designers thought the technology could provide, coupled with their perceptions of analysts' needs. Sadly, some systems were designed and delivered without close consultation with the end-user. Another consistent problem is that collection and analytical systems have been designed and applied using conventional mindsets and approaches. Research in how analysts do their work has repeatedly shown that analysts become prisoners of their own experience, biases, and cognitive limitations (Heuer, 1999). Many analysts designed their strategy by looking for patterns related to "fighting the last war," and the IC went on building software systems to accommodate analysts doing just that. Other systems were designed to lighten the load on the analyst, to shovel away 90 percent of the low-grade rock so the remaining 10 percent had the highest likelihood of containing the rich ore that the analyst could profitably mine-but the "ore" was defined as information consistent with established patterns. Similarly, those who collected data were led to look specifically for the data analysts believed would fill the missing piece of an established or predicted pattern. Thinking "outside the box" is not a natural behavior for intelligence analysts-or for the human brain. Nonetheless, as Jones and others note, certain analysts have been very successful at doing just that.

In this monograph, we describe a concept for an analysis tool that is based on how the most-effective human analysts think "outside the box" to detect threats-a tool that models how those experts watch for and track the out-of-the-ordinary situations that yield critical insights into an intelligence problem. The analysts' experience and cognitive skills, combined with their intuition, allow them to generate expectations about what they are watching. However, the current human threat detection process suffers from an immense data load, disparate information flows, and time pressures. The proposed tool will complement existing projects, such as NIMD, that augment the human analytic process. Using contextual models created by expert analysts (including machine "analysts"), which describe both "normal" and "significantly atypical" expectations for what is watched and tracked, the tool can detect and track unusual and out-of-the-ordinary situations as they develop.

We propose a multitiered analysis and filtering system to assist analysts: It would monitor what is watched over time, how it is watched, and the results of the watching. What might start out as unusual and mildly out of the ordinary may change in perspective as other out-of-the-ordinary observations are clustered and analyzed for interdependencies of such factors as time, geography, and finances. The results can focus, guide, and concentrate specific and detailed information searches and analyses that use other analytical tools available or under development.
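The clustering step described above can be sketched in miniature. In this hypothetical illustration (the `Observation` fields, the 30-day window, and the escalation threshold are our own assumptions, not part of the monograph's design), flagged observations that share a common thread and fall within a common time window are escalated for deeper analysis:

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Observation:
    """A single out-of-the-ordinary observation flagged by a lower tier."""
    day: int     # day of year the observation was made
    region: str  # geographic tag
    actor: str   # the common thread, e.g., the registering firm
    note: str    # free-text description of what was observed

def escalate(observations, window_days=30, min_size=3):
    """Return actors with at least min_size flagged observations
    inside any window_days-long span: candidates for closer study."""
    by_actor = defaultdict(list)
    for obs in observations:
        by_actor[obs.actor].append(obs.day)
    flagged = []
    for actor, days in by_actor.items():
        days.sort()
        for start in days:
            # count this actor's observations in the window opening at 'start'
            if sum(1 for d in days if 0 <= d - start <= window_days) >= min_size:
                flagged.append(actor)
                break
    return flagged
```

A single odd berthing request escalates nothing; three observations tied to the same firm inside a month would. Real interdependency analysis would of course weigh geography and finances as well as time.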

When the proposed detector is coupled with tools for processing structures and correlating data and activities, an integrated preemptive analysis system results. The Atypical Signal Analysis and Processing (ASAP) system addresses the asymmetric threat from all information fronts-what is out there, what is developing and gaining momentum, and what other players are involved. We believe that ASAP would be an important tool for warning the United States of developing and impending asymmetric threats.

Cognitive Processes for Connecting the Dots

McKay has carried out an extended research agenda over the past 15 years on problem solvers in dynamic situations. This research has yielded insights into how humans proactively identify potential risks and their likely consequences; its results are the inspiration for the ASAP system.

McKay shows that proactive problem solvers monitor populations and key data streams, pick up the extraordinary signals that could indicate a potential risk, and then initiate additional information analyses as needed to illuminate the risk. Note that "could indicate a potential risk" is an important distinction; the problem solver does not analyze all instances of atypical behavior but only those observations that can quickly be declared "potentially relevant" to a particular risk. Heuristics are then used to reduce or avoid the anticipated problem. The study subjects watched both people and processes and used intuitive models of the watched to pick out behaviors and characteristics that were odd, unusual, or threatening. Their mental models were based on expected behaviors-actions and activities. Behaviors were watched over time and changes were tracked. Sudden changes, a series of changes, frequent changes, a high magnitude of change, and changes that fit into potentially threatening contexts all warranted raised eyebrows. If the situation was sufficiently different from what it had been in the past, it was examined more closely. If the situation was assessed to be potentially important, the immediate or short-term past was backswept to detect initially ignored signals that might be relevant to the situation. The analysts were also aware of clustering; if they made an increasing number of odd or interesting observations, their level of alertness and analysis rose significantly. The analysts would also look to see what was related to the unusual events, what the correlation was, and whether events were converging. Expert problem solvers, who have the job of foreseeing future difficulties and discounting them, go through this process continually-often without conscious effort. To them, it is second nature to recognize the dots and connect them. The initial trigger is usually a change in the status quo.

Studied in isolation, a single or minor change might not be noteworthy, but when placed in context of what has happened in the past and what else might be happening simultaneously, the change suddenly becomes important. Experts have been observed exhibiting this type of behavior in a routine and subconscious fashion. For example, consider an observation from McKay's six-month study of one individual. The planner in a large factory being studied had an idea of normal email traffic between two of the factory's organizations that he was watching. Over two weeks, the amount of email traffic slowly increased. When it had increased to a level beyond what was considered normal, the planner noted that the status quo had changed and that certain events might happen in the future. He anticipated that a meeting would take place on a specific date and involve certain individuals, notably including factory managers. As a result, he specifically planned critical manufacturing events to take place before and after the anticipated meeting-when the managers would be available for supervision and support. Figure 1.1 summarizes this example.

The planner was right in his prediction of the meeting. Further, during the research study, the planner detected over 75 percent of the major perturbations to the factory and made appropriate corrections 60 to 80 percent of the time-an impressive score.
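The planner's heuristic amounts to a rolling-baseline check: learn what "normal" traffic looks like, then flag values that drift well beyond it. A minimal sketch follows (the window length, tolerance factor, and `BaselineMonitor` name are hypothetical choices of ours, not the study's):

```python
from collections import deque

class BaselineMonitor:
    """Track a stream (e.g., daily email counts between two organizations)
    against a rolling baseline and flag out-of-the-ordinary values."""

    def __init__(self, window=14, tolerance=2.0):
        self.history = deque(maxlen=window)  # recent 'normal' values
        self.tolerance = tolerance           # allowed multiple of the baseline mean

    def update(self, value):
        """Record a new value; return True if it is out of the ordinary."""
        if len(self.history) == self.history.maxlen:
            baseline = sum(self.history) / len(self.history)
            unusual = value > self.tolerance * baseline
        else:
            unusual = False  # still learning what 'normal' looks like
        self.history.append(value)
        return unusual
```

The flag itself decides nothing; as in the study, it only tells the problem solver that the status quo has changed and that a closer look is warranted.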

As another example, intelligence analysts have been observed to have an expectation about how certain materiel assets will be configured and deployed. A movement of the assets to a different region-out of the normal area of operation-could indicate that something unusual is going on. The movements of German radar groups were monitored in this way during World War II when intelligence was being sought for the test site of the V-2 rocket (described in Jones, 1978). The email traffic and materiel examples are the types of early warning indicators that lead to proactive intervention. These examples are not particularly unusual and have been observed in a number of cognitive studies of problem solvers. They have also been commented upon by such experts as R. V. Jones and Allen Dulles in their descriptions of the cat-and-mouse activities in scientific and operational intelligence during World War II (Jones, 1978; Dulles, 1963).

The key is to watch, to have expectations about what is being watched, to identify out-of-the-ordinary happenings, and to be able to correlate them with other interesting observations. Those findings are then used to guide further analyses or actions. For example, consider unusual cash transactions combined with unusual travel patterns of members of an extremist party during a period just prior to the anniversary of a suicide bombing. They might not mean anything, but they are worth a second look.

It is important to note that the problem-solving processes described above are much less linear than they appear at first glance. A problem solver will go through multiple iterations of finding out-of-the-ordinary phenomena, attempting to link them with other information, and generating and testing hypotheses about meaning, with each step based on what the problem solver has learned to date. We call the ability to perform multiple steps in a nonlinear order, with the next determined by what has been learned to date along with other environmental factors, context-dependent analysis.

(Continues...)



Excerpted from Out of the Ordinary by JOHN HOLLYWOOD DIANE SNYDER KENNETH McKAY JOHN BOON Copyright © 2004 by RAND Corporation. Excerpted by permission.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.
