Kit for Analysts

Fundamentals of Intelligence Analysis

Templates of Selected Analytic Tools

The Intelligence Question Refinery

New Development:                [Write in summary of new development]
Raw Question:                   [Write in refinement]
Key Issue:                      [Continue to write in refinements]
National Interests:
Specific Customers’ Concerns:
Target’s Perspective:
Refined Question:

One of the first things analysts have to do when tackling a new project is to make sure they are asking the right questions, and one way to do this is to use the Intelligence Question Refinery. This is a quick and systematic way for analysts to make sure that they have taken into account the perspectives of all of the stakeholders. Analysts who do not think through the core questions they are asking, from a variety of points of view, risk producing work that is irrelevant or incomplete.

• Begin by writing down the new development (question from a decision maker, new intelligence report, item in the news, etc.) that has prompted the need for analysis.

• Quickly draw on your instincts to formulate your first impression (the “raw question”) of what the intelligence question should be. Then, after a quick pause, examine the raw question to make sure that you are addressing the key issue. Rewrite the raw question, if necessary.

  • Next, consider the question from a variety of points of view to capture different aspects (national interests, etc.). Again, rewrite the question, if necessary, to capture these. Your issue may have a different list of stakeholders than this generic list.
  • The result will be a more “refined” intelligence question. (A short worksheet sketch follows this list.)
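
For analysts who keep their notes electronically, the refinery worksheet can be captured as a simple structure. The sketch below is purely illustrative; the field names mirror the template above, and every entry shown is a hypothetical placeholder, not a worked case.

```python
# A minimal sketch of the refinery worksheet as a data structure.
# All entries are hypothetical placeholders.

refinery = {
    "new_development": "Press report of unusual troop movements near a border",
    "raw_question": "Is country X about to invade?",
    "key_issue": "Is the movement preparation for attack, an exercise, or a bluff?",
    "perspectives": {
        "national_interests": "What would conflict mean for regional stability?",
        "customer_concerns": "Does the policymaker need warning time or options?",
        "target_perspective": "What does country X hope to signal or conceal?",
    },
    "refined_question": (
        "What are the indications that country X's troop movements are "
        "preparation for offensive action rather than an exercise?"
    ),
}

# Print the worksheet to review each refinement step in order.
for field in ("new_development", "raw_question", "key_issue"):
    print(f"{field}: {refinery[field]}")
for lens, note in refinery["perspectives"].items():
    print(f"  {lens}: {note}")
print(f"refined_question: {refinery['refined_question']}")
```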

Intelligence Collection Plan


Source                                             Periodicity | Purpose | Utility | Section | Comments

Open Sources (OSINT)                               [Fill in blocks as research progresses]
  [Fill in specific source]
Human Intelligence (HUMINT)
  [Fill in specific source]
Imagery Intelligence (IMINT)
  [Fill in specific source]
Signals Intelligence (SIGINT)
  [Fill in specific source]
Measurement and Signatures Intelligence (MASINT)
  [Fill in specific source]

The idea behind the Intelligence Collection Plan is to get an overall picture of what you have, what you need, and the value of your sources. In the far left column, under the various categories of sources, list the specific sources you already have; leave room to add others later as your research continues. If you do not have sources from one or more of the main categories, think about it some more, or consult experts in that field, as there may be some creative collection opportunity that you have not considered.

Once you have a solid list of sources, rate them according to a list of criteria such as the following (you may want to construct a different list, depending on your problem):

  • Periodicity: how often or how many times you need to consult the source (perhaps only once for a book, several times for a periodical, and daily or weekly for websites until your research is complete; set up a schedule).
  • Purpose: what you consulted the source for (background, differing points of view, statistical data, graphics and illustrations, etc.).
  • Utility: how valuable (accurate, relevant, etc.) the source was in meeting your objectives.
  • Section: in which section of your paper or report the source will be used.
  • Comments: any other pertinent information from a collection point of view (contact information for those interviewed, in case you have follow-up questions, etc.).

From time to time, look at the overall plan and assess your progress.  Do you have several items in each category; that is to say, have you consulted all possible types of sources?  Do you have sources for every section of your report?  Is there any follow-up action that you have forgotten to take?  Empty boxes in the matrix will indicate where more work on collection is needed.
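
For analysts who track the plan electronically, the matrix translates naturally into a small script that flags the empty boxes. The sketch below is illustrative only, assuming a simple dictionary layout; the categories follow the template, while the sources and ratings are hypothetical.

```python
# A minimal sketch of the collection plan as a matrix, with a gap check.
# The sources and ratings below are hypothetical placeholders.

CRITERIA = ["periodicity", "purpose", "utility", "section", "comments"]

plan = {
    "OSINT": {"Trade press": {"periodicity": "weekly", "purpose": "background",
                              "utility": "high", "section": "1", "comments": ""}},
    "HUMINT": {},   # empty category: look for creative collection options
    "IMINT": {"Commercial imagery": {"periodicity": "once", "purpose": "graphics",
                                     "utility": "", "section": "", "comments": ""}},
    "SIGINT": {},
    "MASINT": {},
}

def collection_gaps(plan):
    """Return categories with no sources, and source criteria left unrated."""
    empty_categories = [cat for cat, sources in plan.items() if not sources]
    unrated = [(cat, src, crit)
               for cat, sources in plan.items()
               for src, ratings in sources.items()
               for crit in CRITERIA if not ratings.get(crit)]
    return empty_categories, unrated

empty, unrated = collection_gaps(plan)
print("Categories with no sources:", empty)
print("Unfilled boxes:", unrated)
```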

A Quadrant for Generating Scenarios

An effective way to generate four distinct, creative, but still plausible options is to use a Quadrant for Generating Scenarios, which prompts thinking about how the interaction among the main forces could shape events. This technique exhausts the logical possibilities of the interaction between two factors, generates options that analysts might not have thought of, and still remains grounded in the reality of the key driving forces.

  • First, identify the two main drivers (forces that will shape the future). Discussion of which two drivers, among the many possible, are the main ones can be a useful exercise in itself. Drivers are typically political, economic, military, or social developments (political instability, new technologies, demographic change, etc.).

  • Put one driver on the vertical axis, and the other on the horizontal. This creates four “worlds,” in which the high and low impact of the drivers interact in different combinations. Which of the worlds is closest to the current situation?

  • Flesh out a description of what each of the four worlds would look like. Consider what the challenges and impact would be for each.

  • Lay out some indicators of how observers would know that the situation was moving toward one of the scenarios. Also think about wildcards: significant unexpected events that might dramatically alter the framework of analysis. (See the sketch after this list.)
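
The mechanics of the quadrant are simple enough to sketch in a few lines of code. The following is a minimal illustration with hypothetical driver names; the point is that two drivers, each at a high and a low level, exhaust the four combinations.

```python
# A minimal sketch of the scenario quadrant: two drivers at high/low
# levels yield four "worlds". The driver names are hypothetical.
from itertools import product

driver_a = "political stability"
driver_b = "economic growth"

worlds = {}
for level_a, level_b in product(("high", "low"), repeat=2):
    name = f"{level_a} {driver_a} / {level_b} {driver_b}"
    worlds[name] = {
        "description": "",   # flesh out what this world would look like
        "indicators": [],    # observable signs of movement toward this world
        "wildcards": [],     # events that could upend the framework
    }

for name in worlds:
    print(name)
```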

Analysis of Competing Hypotheses

                        Hypothesis 1   Hypothesis 2   Hypothesis 3   Hypothesis 4

Evidence 1              [Fill in blocks with “I” or “C”]
Evidence 2
Evidence 3
Evidence 4
Evidence 5
Evidence 6

Total number of “Is”    [Add up the “Is”]

An Analysis of Competing Hypotheses is a technique for determining which of several hypotheses is the most likely, given the available evidence.

The steps in performing an Analysis of Competing Hypotheses are:

  • Identify the possible hypotheses to be considered; have at least three or four, as this will force you to go beyond the typical three-part conventional wisdom, such as backward/status quo/forward or good/indifferent/bad. Place the hypotheses along the horizontal axis of a matrix.
  • Gather the available evidence, and list those items along the vertical axis. Analysts may want to refine the evidence by assigning a value or weight to each piece, such as a point on a scale from one to five.
  • Look at each piece of evidence and decide its diagnosticity in relation to each hypothesis. Fill in the box in the matrix with an “I” if the evidence is inconsistent with that hypothesis, a “C” if it is consistent, or a “?” if it is ambiguous. An “N” for neutral can also be used. But “?s” and “Ns” should be used sparingly (really think about it, and don’t use these as the easy way out). If you have assigned weights to the individual items of evidence, you may want to refine these judgments by also assigning them a number (such as on a scale of 1-5) or giving the stronger evidence a darker color (again, on a shaded scale).
  • Refine the matrix. Looking down the columns, are any hypotheses implausible? Looking across the rows, does any of the evidence have no diagnostic value because it is consistent, or inconsistent, with every hypothesis? Give implausible hypotheses or nondiagnostic evidence a lower priority in consideration, but be careful about discarding these entirely, as new evidence may change your assessment.
  • Draw tentative conclusions about the relative likelihood of each hypothesis; which has the fewest “Is”? Try to disprove hypotheses rather than prove them.
  • Consider how sensitive your conclusion is to a few critical items of evidence. What would happen if crucial evidence turned out to be wrong? Is there evidence that you would expect to see for any hypothesis but that is not there?
  • Report conclusions. Discuss the relative likelihood of all of the hypotheses, not just the most likely one. Keep in mind that new evidence could come in that increases the likelihood of some of the hypotheses.
  • Identify indicators for future observation that may signal that events are taking a different course than expected. (A short sketch of the matrix and scoring follows this list.)
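
The bookkeeping at the heart of the method, filling cells with “I” or “C” and totaling the “Is” in each column, can be sketched compactly. The example below is a minimal illustration with hypothetical hypotheses and evidence; a weighted version would sum the assigned weights of the “I” cells instead of counting them.

```python
# A minimal sketch of the ACH matrix and scoring. The hypotheses and
# evidence entries are hypothetical placeholders.

hypotheses = ["H1", "H2", "H3", "H4"]

# matrix[evidence][hypothesis] -> "C" (consistent), "I" (inconsistent), "?"
matrix = {
    "Evidence 1": {"H1": "C", "H2": "I", "H3": "C", "H4": "?"},
    "Evidence 2": {"H1": "C", "H2": "I", "H3": "I", "H4": "C"},
    "Evidence 3": {"H1": "I", "H2": "C", "H3": "I", "H4": "C"},
}

def inconsistency_counts(matrix, hypotheses):
    """Count the 'I's in each column; the hypothesis with the fewest
    inconsistencies is tentatively the most likely."""
    return {h: sum(1 for row in matrix.values() if row.get(h) == "I")
            for h in hypotheses}

counts = inconsistency_counts(matrix, hypotheses)
for h, n in sorted(counts.items(), key=lambda kv: kv[1]):
    print(f"{h}: {n} inconsistencies")
```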

SWOT Analysis

                         Strengths                 Weaknesses

                         1.                        1.
                         2.                        2.
                         3.                        3.

Opportunities
1.
2.                       [Formulate a strategy]    [Formulate a strategy]
3.

Threats
1.
2.                       [Formulate a strategy]    [Formulate a strategy]
3.

An assessment of Strengths, Weaknesses, Opportunities, and Threats (SWOT) is one of the best-known strategic planning tools taught in business schools. A SWOT Analysis lays out these four forces with which an organization will have to deal, considers the more significant interactions, and then prompts the formulation of an appropriate strategy for dealing with each of the four possible outcomes.

  • Strengths are what an organization does well, and weaknesses are what an organization lacks or does poorly. These are internal factors within an organization, and management has a degree of control over them.

  • Opportunities, which create demand for an organization’s products or services, and threats, which are sources of potential harm, are factors in the external environment, and management has less control over them.
  • A SWOT analysis is based on a matrix that shows the interactions between these four elements. List the three most important Strengths, Weaknesses, Opportunities, and Threats, in priority order (note that the discussion of the top three in each category has value in itself).
  • The intersection between Strengths and Weaknesses, on the one hand, and Opportunities and Threats, on the other, produces four possible situations in which the organization might find itself. As part of their planning, management should generate a strategy for each of these four situations (see the sketch after this list).
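
The four intersections can be sketched as a small lookup keyed by the internal and external factors, as below. All of the listed factors and strategies are hypothetical placeholders; the structure simply enforces that each of the four pairings gets a strategy.

```python
# A minimal sketch of the SWOT strategy matrix: each internal/external
# pairing prompts a strategy. All entries are hypothetical.

swot = {
    "strengths":     ["Skilled workforce", "Strong brand", "Low debt"],
    "weaknesses":    ["Aging product line", "Thin margins", "Small R&D budget"],
    "opportunities": ["Growing export market", "New technology", "Rival's exit"],
    "threats":       ["Price war", "New regulation", "Supply disruption"],
}

# One strategy per intersection of internal (S/W) and external (O/T) factors.
strategies = {
    ("strengths", "opportunities"):  "Use strengths to exploit opportunities",
    ("strengths", "threats"):        "Use strengths to blunt threats",
    ("weaknesses", "opportunities"): "Fix weaknesses to pursue opportunities",
    ("weaknesses", "threats"):       "Defend where weaknesses meet threats",
}

for (internal, external), strategy in strategies.items():
    print(f"{internal} x {external}: {strategy}")
```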

Decision, or Hypothesis, Tree

A decision tree lays out a series of sequential and mutually exclusive choices that constitute all possible options. This helps analysts understand a problem by revealing cause-and-effect relationships. It also indicates the choices that are available and where the decision maker is in the process of trying to resolve a situation.

  • Start with the situation. What choices are available to the decision makers (start or not start a war, make or break a treaty, fire or retain an official, etc.)?

  • If either of those choices is made, what options or choices then become possible or necessary?

  • The choices can be extended to several levels, for greater detail and implications, or a further extension in time.

The end of each vertical chain is a possible outcome or hypothesis. The decision tree can be used to create a range of possible outcomes, including options that may not have been considered.

Naturally, not all problems can be broken down into such neat categories.

Once the hypothesis tree is laid out, the analyst should array the existing evidence against the various options.  Where is there the most evidence?  Does this mean that this option is the most likely?  Are gaps in reporting now more obvious? Is there any valid evidence that does not fit easily into any of the options, and what does that mean?
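
A decision tree maps naturally onto a nested structure in which each leaf terminates a chain of choices. The sketch below is a minimal illustration with hypothetical choices; enumerating root-to-leaf paths yields the candidate hypotheses, and a tally of evidence per path makes gaps in reporting visible.

```python
# A minimal sketch of a decision tree as nested dicts; each root-to-leaf
# path is a possible outcome (hypothesis). The choices are hypothetical.

tree = {
    "Go to war": {
        "Full mobilization": {},
        "Limited strike": {},
    },
    "Do not go to war": {
        "Negotiate treaty": {},
        "Maintain status quo": {},
    },
}

def outcomes(node, path=()):
    """Enumerate every root-to-leaf path; each is a candidate hypothesis."""
    if not node:                      # leaf: a complete chain of choices
        yield " -> ".join(path)
        return
    for choice, subtree in node.items():
        yield from outcomes(subtree, path + (choice,))

hypotheses = list(outcomes(tree))

# Array the existing evidence against the options; heavy and empty tallies
# both deserve a second look (the counts here are hypothetical).
evidence_counts = {h: 0 for h in hypotheses}
evidence_counts["Go to war -> Limited strike"] = 3  # hypothetical tally

for h, n in evidence_counts.items():
    print(f"{h}: {n} items of evidence")
```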

_______________________________________

Developing a Taxonomy of Intelligence Analysis Variables

Rob Johnston


Editor’s Note:  By distilling a list of the variables that affect analytic reasoning, the author aims to move the tradecraft of intelligence analysis closer to a science.  A carefully prepared taxonomy can become a structure for heightening awareness of analytic biases, sorting available data, identifying information gaps, and stimulating new approaches to the understanding of unfolding events, ultimately increasing the sophistication of analytic judgments.  The article is intended to stimulate debate leading to refinements of the proposed variables and the application of such a framework to analytic thinking among intelligence professionals.

* * *

Science is organized knowledge.

Herbert Spencer [1]

Aristotle may be the father of scientific classification, but it was Carolus Linnaeus who introduced the first formal taxonomy—kingdom, class, order, genera, and species—in his Systema Naturae in 1735.  By codifying the naming conventions in biology, Linnaeus’s work provided a reference point for future discoveries.  Moreover, the development of a hierarchical grouping of related organisms contributed significantly to Darwin’s creation of an evolutionary theory.  The Systema Naturae taxonomy was not a fixed product, but rather a living document.  Linnaeus himself revised it through 10 editions, and later biologists have continued to modify it. [2]

As discoveries and research methods in other domains grew, taxonomies were created to help organize those disciplines and assist researchers in identifying variables that required additional study.  The development of specific taxonomies—from highly structured systems like the Periodic Table of chemical elements to less structured approaches like Bloom’s Taxonomy [3] —is a key step in organizing knowledge and furthering the growth of individual disciplines.  A taxonomy differentiates domains by bounding the problem space, codifying naming conventions, identifying areas of interest, helping to set research priorities, and often leading to new theories.  Taxonomies are signposts, indicating what is known and what has yet to be discovered.

This paper proposes a taxonomy for the field of intelligence.  Over 100 individuals gave their time and assistance in this work.  The resulting organized listing of variables will help practitioners strengthen their understanding of the analytic process and point them in directions that need additional attention. [4]

Intelligence Analysis

We could have talked about the science of intelligence, but . . . the science of intelligence is yet to be invented.

Charles Allen [5]

Understanding intelligence analysis is not a trivial matter.  The literature in the field is episodic and reflects specialized areas of concern.  Intelligence literature makes an important distinction between solving a problem in the public domain and solving a problem in a private or secret domain.  This distinction seems key in differentiating between general analysis and intelligence analysis.

Ronald Garst articulates two arguments that are used to support this distinction:  Intelligence analysis is more time-sensitive than analysis in other domains, he suggests, and it deals with information that intentionally may be deceptive. [6] The notion that intelligence is uniquely time sensitive is questionable.  Intelligence is not the only domain where time constraints can force decisions to be made before data are complete.  Whether one is in an operating room or a cockpit, time is always a key variable.  Intelligence is a life and death profession, but so are medicine and mass transportation.  In each instance, failure can mean casualties.

Garst’s point about intentional deception is more germane.  Seldom do analysts in fields other than intelligence deal with intentional deception.  Michael Warner makes a good case for secrecy being the primary variable that distinguishes intelligence from other activities. [7] He argues that the behavior of the subject of intelligence changes if the subject is aware of being observed or analyzed.  This is true not only for intentional deception; the argument is also supported by a long history of psychological research, beginning with an experimental program at the Hawthorne Plant between 1927 and 1930.  The result of that research was the theory of the “Hawthorne Effect,” which, broadly interpreted, states that when subjects are aware of being observed, their behavior changes. [8]

Intentional deception can occur outside intelligence—in connection with certain law enforcement functions, for example—but most of the professional literature treats this as the exception, rather than the rule.  In the case of intelligence analysis, deception is the rule.  In intelligence analysis, the validity of the data is always in doubt.  Moreover, intelligence analysts are specifically trained to factor in deception as part of the analytic process, to look for anomalies and outliers instead of focusing on the central tendencies of distribution.

The taxonomy being developed here requires a definition of intelligence analysis that is specific to the field.  Sherman Kent, a pioneer in the intelligence discipline, wrote that intelligence was a “special category of knowledge.” [9] He outlined the basic descriptive element, the current reporting element, and the speculative estimates element as the key components of intelligence analysis.  His work laid the foundation for understanding the activities inherent in intelligence analysis by demonstrating that the analytic process itself was subject to being analyzed.  Kent took the first step toward developing a higher order, or meta-analytic, approach to analysis by reducing the process to smaller functional components for individual study.

Following suit, other authors focused attention on the process or methodological elements of intelligence analysis.  In Intelligence Research Methodology, Jerome Clauser and Sandra Weir followed Kent’s three functional areas and went on to describe basic research foundations and the inductive and deductive models for performing intelligence analysis. [10] Garst’s Handbook of Intelligence Analysis contains less background in basic research methods than Clauser and Weir, but it is more focused on the intelligence cycle. [11]

Bruce Berkowitz and Allan Goodman highlight the process of strategic intelligence and define intelligence analysis as:  “[T]he process of evaluating and transforming raw data into descriptions, explanations, and conclusions for intelligence consumers.” [12] Lisa Krizan, too, focuses on process.  She writes that, “At the very least, analysis should fully describe the phenomenon under study, accounting for as many relevant variables as possible.  At the next higher level of analysis, a thorough explanation of the phenomenon is obtained, through interpreting the significance and effects of its elements on the whole.” [13] In addition, several authors have written about individual analytic approaches, including the LAMP method, Warnings of Revolution, Bayes’ Theorem, Decision Trees, and FACTIONS and Policon. [14]

Explicit in the above definitions is the view that analysis is both a process and a collection of specific techniques.  Analysis is seen as an action that incorporates a variety of tools to solve a problem.  Different analytic methods have something to offer different analytic problems.  Although the referenced works focus on methods and techniques, they do not suggest that analysis is limited to tools and techniques.

Implicit in the above definitions is the idea that analysis is a product of cognition.  Some authors directly link analysis with cognition.  Robert Mathams defines analysis as:  “[T]he breaking down of a large problem into a number of smaller problems and performing mental operations on the data in order to arrive at a conclusion or generalization.” [15] Another scholar writes:  “Since the facts do not speak for themselves but need to be interpreted, it is inevitable that the individual human propensities of an intelligence officer will enter into the process of evaluation.” [16] Yet others describe analysis as a process whereby:  “[I]nformation is compared and collated with other data, and conclusions that also incorporate the memory and judgment of the intelligence analyst are derived from it.” [17]

Several authors make the case that analysis is not just a product of cognition but is itself a cognitive process.  J. R. Thompson and colleagues write that “[I]ntelligence analysis is an internal, concept-driven activity rather than an external data-driven activity.” [18] In his Psychology of Intelligence Analysis, Richards Heuer observes:  “Intelligence analysis is fundamentally a mental process, but understanding this process is hindered by the lack of conscious awareness of the workings of our own minds.” [19] Ephraim Kam comments:  “The process of intelligence analysis and assessment is a very personal one.  There is no agreed-upon analytical schema, and the analyst must primarily use his belief system to make assumptions and interpret information.  His assumptions are usually implicit rather than explicit and may not be apparent even to him.” [20]

These definitions reflect the other end of the spectrum from those concerned with tools and techniques.  They suggest that the analytic process is a construction of the human mind and is significantly different from individual to individual or group to group.  Certainly Kam’s view is the most radical departure, but even he does not suggest that one forgo tools; rather, the process of choosing the tool is governed by cognition as well.

Recognizing that the scope of intelligence analysis is so broad that it includes not only methods but also the cognitive process is a significant step.  Viewing analysis as a cognitive process opens the door to a complex array of variables.  Not only does one have to be concerned about individual analytic tools, but one also has to factor in the psychology of the individual analyst.  In the broadest sense, this means not merely understanding the individual psyche but also understanding the variables that interact with that psyche.  In other words, intelligence analysis is the socio-cognitive process by which a collection of methods is used to reduce a complex issue into a set of simpler issues within a secret domain.

Developing the Taxonomy

The first step of science is to know one thing from another.  This knowledge consists in their specific distinctions; but in order that it may be fixed and permanent, distinct names must be given to different things, and those names must be recorded and remembered.

Carolus Linnaeus,
18th century biologist

My research was designed to isolate variables that affect the analytic process.  The resulting taxonomy is meant to bound the problem space and stimulate dialogue leading to refinements.  Although a hierarchic list is artificial and rigid, it is a first step in clarifying areas for future research.  The actual variables are considerably more fluid and interconnected than such a structure suggests.  They might eventually be better represented by a link or web diagram, once the individual elements are refined through challenges in the literature.

To create this intelligence analysis taxonomy, I took Alexander Ervin’s applied anthropological approach, which uses multiple data collection methods to triangulate results. [21] I also drew on Robert White’s mental workload model, David Meister’s behavioral model, and the cognitive process model by Gary Klein and his colleagues. [22] Each model focuses on a different aspect of human performance for the development of a taxonomy:  White’s model examines the actual task and task requirements; Meister’s looks at the behavior of individuals performing a task; and Klein’s uses verbal protocols to identify the cognitive processes of individuals performing a task.

Surveying the literature.  My research began with a review of the literature for background information and the identification of variables.  I found 2,432 case studies, journal articles, technical reports, transcripts of public speeches, and books related to the topic.  This was then narrowed to 374 pertinent texts on which a taxonomy of intelligence analysis could be built.  These texts were analyzed to identify individual variables and categories of variables that affect intelligence analysis. [23] The intelligence literature, produced by academics and practitioners, tends to be episodic or case-based.  This is not unique to the field of intelligence.  A number of disciplines—medicine, business, and law, for example—are also case-based in nature.  A number of the texts were general or theoretical and indicate a trend toward specialization within the field.  Again, this is not an uncommon phenomenon.

A “Q-sort” method was used to analyze the texts. [24] As I read each text, I recorded the variables that each author identified.  These variables were then sorted by similarity into groups.  Four broad categories of analytic variables emerged from the Q-sort process. [25]

Refining the prototype.  Next, I used the preliminary taxonomy derived from my reading of the literature to structure interviews with 51 substantive experts and 39 intelligence novices.  In tandem, I conducted two focus group sessions, with five individuals in each group.  The interviews and focus group discussions resulted in adding some variables to each category, moving variables between categories, and removing some in each category that appeared redundant.

Testing in a controlled setting.  Finally, to compare the taxonomy with specific analytic behaviors, I watched participants in a controlled intelligence analysis training environment.  Trainees were given information on specific cases and directed to use various methods to analyze the situations and generate final products.  During the training exercises, the verbal and physical behavior of individuals and groups was observed and compared with the taxonomic model.  I participated in a number of the exercises myself to gain a better perspective.  This process corroborated most of the recommendations that had been made by the experts and novices and also yielded additional variables for two of the categories. [26]

The resulting taxonomy is purely descriptive.  It is not intended to demonstrate the variance or weight of each variable or category.  That is, the listing is not sufficient to predict the effect of any one variable on human performance.  The intention of the enumeration is to provide a framework for aggregating existing data and to create a foundation for future experimentation.  Once the variables are identified and previous findings have been aggregated, it is reasonable to consider experimental methods that would isolate and control individual variables and, in time, indicate sources of error and potential remediation.

Systemic Variables

The column of Systemic Variables on the chart incorporates items that affect both an intelligence organization and the analytic environment.  Organizational Variables include the structure of the intelligence organization, the managerial/reporting chain, workflow diagrams, leadership, management, management practices, the working culture, history and traditions, social practices within the organization, work taboos, and organizational demographics.  They also encompass internal politics, the hierarchical reporting structure, and material and human resources.  The fields of industrial and organizational psychology, sociology, and management studies in business have brought attention to the importance of organizational behavior and the effect it has on individual work habits and practices.  Within the field of intelligence, the works of Allison, Berkowitz and Goodman, Elkins, Ford, Godson, and Richelson, among others, examine in general the organizational aspects of intelligence. [27]

The Systemic Variables category also focuses on environmental variables—that is, external influences on the organization, such as consumer needs and requirements, time constraints, the consumer’s model for using the information, the consumer’s organization, political constraints, and security issues.  The works of Betts, Hulnick, Hunt, Kam, and Laqueur address the environmental and consumer issues that affect intelligence analysis. [28]

Case studies that touch on different Systemic Variables include:  Allison, on the Cuban missile crisis; Betts, on surprise attacks; Kirkpatrick, on World War II tactical intelligence operations; Shiels, on government failures; Wirtz, on the Tet offensive in Vietnam; and Wohlstetter, on Pearl Harbor. [29]

Systematic Variables

The Systematic Variables are those that affect the process of analysis itself.  They include the user’s specific requirements, how the information was acquired, the information’s reliability and validity, how the information is stored, the prescribed methods for analyzing and processing the information, specific strategies for making decisions about the information, and the methods used to report the information to consumers.

A number of authors have written about the analytic tools and techniques used in intelligence:  Clauser and Weir, on intelligence research methods; Jones, on analytic techniques; and Heuer, on alternative competing hypotheses, to name a few.  Little work has been done comparing structured techniques to intuition.  Folker’s work is one of the exceptions—it compares the effectiveness of a modified form of alternative competing hypotheses with intuition in a controlled experimental design. [30] His study is unique in the field and demonstrates that experimental methods are possible.  Krotow’s research, on the other hand, looks at differing forms of cognitive feedback during the analytic process and makes recommendations to enhance intelligence decisionmaking. [31]

Idiosyncratic Variables

Variables in the third column on the chart are those that impact on an individual and his or her analytic performance.  They affect the individual’s weltanschauung.  Although the meaning of the German term is difficult to capture in English, Sigmund Freud comes close:  “[A]n intellectual construction which gives a unified solution of all the problems of our existence in virtue of a comprehensive hypothesis, a construction, therefore, in which no question is left open and in which everything in which we are interested finds a place.” [32] Weltanschauung has been translated as “world view,” “mindset,” and “mental model,” but the best approximation in English, in my view, is the often-overused word “paradigm.”  Paradigm stands for the sum of life’s experiences and acculturation that identifies an individual as a member of a group.  In the proposed taxonomy, Idiosyncratic Variables include one’s familial, cultural, ethnic, religious, linguistic, and political affiliations.  They also encompass psychological factors like biases, personality profiles, cognitive styles and processing, cognitive loads, [33] expertise, approach to problem-solving, decisionmaking style, and reaction to stress.  Finally, there are domain variables like education, training, and the readiness to apply knowledge, skills, and abilities to the task at hand.

The relevant psychological literature is robust.  Amos Tversky and Daniel Kahneman began to examine psychological biases in the early 1970s. [34] Their work has found its way into the intelligence literature through authors like Alexander Butterfield, Jack Davis, James Goldgeier, and Richards Heuer. [35] Decisionmaking and problem-solving have been studied since the early 1920s, and these data are reflected in Heuer’s work as well. [36] Personality profiling, too, is well understood and has had an impact on recent intelligence practices and theory. [37]

Other well-researched areas, however, have yet to be studied in the context of intelligence.  Issues of acculturation and affiliation, educational factors, and training strategies, for example, may yet yield interesting results and insights into the field of intelligence.

Communicative Variables

The fourth category contains variables that affect interaction within and between groups.  Because communication is the vital link within the system—among processes and individuals—this group of variables could be included logically in each of the other three categories.  Its broad relevance, however, makes it seem reasonable to isolate it as a distinct area of variability.  The Communicative Variables include formal and informal communications within an organization (from products to e-mails); among organizations; and between individuals and the social networks that they create.  In his essay on estimative probability, Sherman Kent highlights this area by describing the difficulty that producers of intelligence have in communicating the likelihood of an event to consumers of intelligence. [38] Case studies by Wohlstetter and others, which have addressed organizational issues, also touch on communication and social networks and the impact that communication has on the analytic process. [39] This is an area that could benefit from additional study.
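
To make the shape of the proposed taxonomy concrete, it can be rendered as a simple nested listing, as in the minimal sketch below. The sample variables are drawn from the category descriptions above; the hierarchy is only a convenience, since, as noted earlier, the variables are more fluid and interconnected than a fixed list suggests.

```python
# A minimal sketch of the proposed taxonomy as a nested structure, using
# a sample of the variables named in each category above. Illustrative only.

taxonomy = {
    "systemic": {
        "organizational": ["structure", "leadership", "working culture"],
        "environmental": ["consumer requirements", "time constraints",
                          "political constraints", "security issues"],
    },
    "systematic": ["acquisition", "reliability and validity", "storage",
                   "analytic methods", "decision strategies", "reporting"],
    "idiosyncratic": ["acculturation and affiliation", "biases",
                      "cognitive style", "expertise", "reaction to stress"],
    "communicative": ["intra-organizational", "inter-organizational",
                      "interpersonal and social networks"],
}

for category, contents in taxonomy.items():
    print(category, "->", contents)
```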

Conclusion

There is rarely any doubt that the unconscious reasons for practicing a custom or sharing a belief are remote from the reasons given to justify them.

Claude Levi-Strauss [40]

Intelligence analysis is art and tradecraft.  There are specific tools and techniques to help perform the tasks, but, in the end, it is left to individuals to use their best judgment in making decisions.  This is not to say that science is not a part of intelligence analysis.  Science is born of organized knowledge, and organizing knowledge requires effort and time.  The work on this taxonomy is intended to help that process by sparking discussion, identifying areas where research exists and ought to be incorporated into the organizational knowledge of intelligence, and identifying areas where not enough research has been performed.

The field of medicine has a number of parallels.  Like intelligence, the practice of medicine is an art and a tradecraft.  Practitioners are trusted to use their best judgment in problem-solving by drawing on their expertise.  What is important to remember is that there are numerous basic sciences driving medical practice.  Biology, chemistry, physics, and all of the subspecialties blend together to create the medical sciences, the foundation on which the practice of modern medicine rests.  The practice of medicine has been revolutionized by the sciences that underpin its workings.

Intelligence analysis has not experienced that revolution.  The basic sciences that underpin intelligence are not physical sciences.  It is difficult to measure what is meant by “progress” in the human sciences.  The human sciences are considerably more multivariate than the physical sciences, and it is much more difficult to control those variables.

There are numerous domains from which intelligence may borrow.  Organizational behavior is better understood today than ever before.  Problem-solving and decision-making have been researched since the 1920s.  Structural anthropology addresses many of the acculturation and identity issues that affect individual behavior.  Cognitive scientists are building models that can be tested in experimental conditions and used for developing new tools and techniques.  Sociology and social theory have much to offer in studying social networks and communication.

The organization of knowledge in medicine is medical science.  It took thousands of years to turn folk medicine into research-based allopathic medicine.  The cases that are used to build expertise, the tools and techniques that support diagnosis and treatment, and the criteria on which judgments are made all have integrated the art of medicine with the science of medicine.

The organization of knowledge in intelligence is not a small task, but it is one that is required for the health of the profession.  The taxonomy proposed here could serve as a springboard for a number of innovative projects:  development of a research matrix that identifies what is known and how that information may be of use in intelligence analysis; setting a research agenda in areas of intelligence that have been insufficiently studied; application of research from other domains to develop additional training and education programs for analysts; creation of a database of lessons learned and best practices to build a foundation for an electronic performance support system; integration of those findings into new analytic tools and techniques; and development of a networked architecture for collaborative problem-solving and forecasting, to name a few.  It is my hope that this taxonomy will help intelligence practitioners take steps in some of these new directions.


Footnotes

[1] Herbert Spencer wrote The Study of Sociology in 1874, which set the stage for sociology to emerge as a discipline.

[2] Ernst Haeckel introduced phylum to include related classes and family to include related genera in 1866.   The Linnaeus taxonomy is currently being revised to accommodate genomic mapping data.

[3] See Benjamin S. Bloom, ed., Taxonomy of Educational Objectives:  The Classification of Educational Goals:  Handbook I, Cognitive Domain (New York, NY:  Longmans, 1956).  Bloom’s taxonomy is a classification of levels of intellectual behavior in learning, including knowledge, comprehension, application, analysis, synthesis, and evaluation.

[4] I would like to thank the organizations that facilitated this study, specifically the Central Intelligence Agency’s Center for the Study of Intelligence; the Defense Intelligence Agency; the Institute for Defense Analyses; the National Military Intelligence Association; Evidence Based Research, Incorporated; Analytic Services, Inc. (ANSER); and the individual analysts at other organizations who gave of their time.  Administrators, faculty members, and students at the CIA’s Kent School, the Joint Military Intelligence College, the Naval Postgraduate School, Columbia University, Georgetown University, and Yale University also supported this project.

[5] Comment made by the Associate Director of Central Intelligence for Collection at a public seminar on intelligence at Harvard University, spring 2000.  See <http://pirp.harvard.edu/pdf-blurb.asp?id+518>.

[6] Ronald Garst, “Fundamentals of Intelligence Analysis,” Ronald Garst, ed., A Handbook of Intelligence Analysis, 2nd ed. (Washington, DC:  Defense Intelligence College, 1989).

[7] Michael Warner, “Wanted:  A Definition of ‘Intelligence,’” Studies in Intelligence, Vol. 46, No. 3, 2002 (Washington, DC:  Center for the Study of Intelligence, 2002).

[8] Fritz J. Roethlisberger and William J. Dickson, Management and the Worker:  An Account of a Research Program Conducted by the Western Electric Company, Hawthorne Works, Chicago (Boston, MA:  Harvard University Press, 1939); Elton Mayo, The Human Problems of an Industrial Civilization (New York, NY:  MacMillan, 1933).

[9] Sherman Kent, Strategic Intelligence for American World Policy (Princeton, NJ:  Princeton University Press, 1966).

[10] Jerome K. Clauser and Sandra M. Weir, Intelligence Research Methodology:  An Introduction to Techniques and Procedures for Conducting Research in Defense Intelligence (Washington, DC:  US Defense Intelligence School, 1976).

[11] See also:  Morgan Jones, The Thinker’s Toolkit:  14 Powerful Techniques for Problem Solving (New York, NY: Times Business, 1998).  Jones’s book is a popular version of both Garst’s and Clauser and Weir’s work in that it describes a collection of fourteen analytic methods and techniques for problem-solving; however, the methods are not necessarily specific to intelligence.

[12] Bruce D. Berkowitz and Allan E. Goodman, Strategic Intelligence for American National Security (Princeton, NJ:  Princeton University Press, 1989), p. 85.

[13] Lisa Krizan, Intelligence Essentials for Everyone (Washington, DC:  Joint Military Intelligence College, 1999), p. 29.

[14] See:  Jonathan Lockwood and K. Lockwood, “The Lockwood Analytical Method for Prediction (LAMP),” Defense Intelligence Journal, 3(2), 1994, pp. 47-74; R. Hopkins, Warnings of Revolution:  A Case Study of El Salvador (Washington, DC:  Center for the Study of Intelligence, 1980) TR 80-100012; Jack Zlotnick, “Bayes’ Theorem for Intelligence Analysis,” (1972), in H. Bradford Westerfield, ed., Inside CIA’s Private World:  Declassified Articles from the Agency’s Internal Journal (New Haven, CT:  Yale University Press, 1995); John Pierce, “Some Mathematical Methods for Intelligence Analysis,” Studies in Intelligence, No. 21, Summer, 1977, pp. 1-19 (declassified); Edwin Sapp, “Decision Trees,” Studies in Intelligence, No. 18, Winter, 1974, pp. 45-57 (declassified); Stanley Feder, “FACTIONS and Policon:  New Ways to Analyze Politics,” (1987), in Westerfield, ed., Inside CIA’s Private World.

[15] Robert Mathams, “The Intelligence Analyst’s Notebook,” in D. Dearth and R. Goodden, eds., Strategic Intelligence: Theory and Application, 2nd ed., (Washington, DC:  Joint Military Intelligence College, 1995), p. 88.

[16] Avi Shlaim, “Failures in National Intelligence Estimates:  The Case of the Yom Kippur War,” World Politics, 28(3), 1976, pp. 348-380.

[17] John Quirk, David Phillips, Ray Cline, and Walter Pforzheimer, The Central Intelligence Agency:  A Photographic History (Guilford, CT:  Foreign Intelligence Press, 1986).

[18] J. R. Thompson, R. Hopf-Weichel, and R. Geiselman, The Cognitive Bases of Intelligence Analysis (Alexandria, VA:  Army Research Institute, Research Report 1362, 1984), AD-A146, pp. 132, 7.

[19] Richards J. Heuer, Jr., Psychology of Intelligence Analysis (Washington, DC:  Center for the Study of Intelligence, 1999), p. 1.

[20] Ephraim Kam, Surprise Attack:  The Victim’s Perspective (Cambridge, MA:  Harvard University Press, 1988), p. 120.

[21] Alexander Ervin, Applied Anthropology:  Tools and Perspectives for Contemporary Practice (Boston, MA:  Allyn and Bacon, 2000).

[22] Robert White, Task Analysis Methods: Review and Development of Techniques for Analyzing Mental Workload in Multiple Task Situations (St. Louis, MO:  McDonnell Douglas Corporation, 1971), MDC J5291; David Meister, Behavioral Analysis and Measurement Methods (New York, NY:  Wiley, 1985); G. Klein, G. R. Calderwood, and A. Clinton-Cirocco, Rapid Decision Making on the Fire Ground (Yellow Springs, OH:  Klein Associates, 1985), KA-TR-84-41-7 (prepared under contract MDA903-85-G-0099 for the US Army Research Institute, Alexandria, VA).

[23] A copy of the bibliography and search criteria is available from the author.

[24] See William Stephenson, The Study of Behavior:  Q-Technique and its Methodology (Chicago, IL:  University of Chicago Press, 1953).

[25] I would like to credit Dr. Forrest Frank of the Institute for Defense Analyses for his suggestions regarding the naming convention for these categories of variables as they appear on the accompanying chart.

[26] Throughout the project, my data collection method consisted of written field notes.  Anthropologists traditionally include specific detail from participant input or direct observation.  Usually this is in the form of precise descriptions of the actual behavior of participants and transcripts of their verbal interactions.  It is also standard practice in field work to capture these data, and the data from the interviews and focus groups, on audio- or videotape.  These practices were not followed in this particular case for two reasons:  First, the nature of my work was to derive categories of variables and individual variables in order to create a taxonomy, rather than to document actual practices and procedures, and using the prototype taxonomy to structure the interactions was sufficient for this purpose; second, the nature of intelligence work, the environment in which it occurs, and the professional practitioners require that certain data be restricted.

[27] Graham T. Allison, Essence of Decision:  Explaining the Cuban Missile Crisis (Boston, MA:  Little, Brown and Company, 1971); Bruce D. Berkowitz and Allan E. Goodman, Best Truth:  Intelligence in the Information Age (New Haven, CT:  Yale University Press, 2000); Dan Elkins, An Intelligence Resource Manager’s Guide (Washington, DC:  Joint Military Intelligence College, 1997); Harold Ford, Estimative Intelligence:  The Purposes and Problems of National Intelligence Estimating (Lanham, MD:  University Press of America, 1993); Roy Godson, ed., Comparing Foreign Intelligence:  The U.S., the USSR, the U.K. and the Third World (Washington, DC:  Pergamon-Brassey, 1988); Jeffrey Richelson, The U.S. Intelligence Community, 4th ed. (Boulder, CO:  Westview Press, 1999).

[28] Richard K. Betts, “Policy-makers and Intelligence Analysts:  Love, Hate or Indifference,” Intelligence and National Security, 3(1), January 1988, pp. 184-189; Arthur S. Hulnick, “The Intelligence Producer-Policy Consumer Linkage:  A Theoretical Approach,” Intelligence and National Security, 1(2), May 1986, pp. 212-233; David Hunt, Complexity and Planning in the 21st Century:  Intelligence Requirements to Unlock the Mystery (Newport, RI:  Naval War College, 2000); Kam, Surprise Attack; Walter A. Laqueur, The Uses and Limits of Intelligence (New Brunswick, NJ:  Transaction Publishers, 1993).

[29] Allison, Essence of Decision; Richard K. Betts, Surprise Attack (Washington, DC:  The Brookings Institution, 1982); Lyman B. Kirkpatrick, Jr., Captains Without Eyes:  Intelligence Failures in World War II (London:  MacMillan Company, 1969); Frederick L. Shiels, Preventable Disasters:  Why Governments Fail (Savage, MD:  Rowman and Littlefield Publishers, 1991); James J. Wirtz, The Tet Offensive:  Intelligence Failure in War (Ithaca, NY:  Cornell University Press, 1991); Roberta Wohlstetter, Pearl Harbor:  Warning and Decision (Palo Alto, CA:  Stanford University Press, 1962).

[30] MSgt. Robert D. Folker, Intelligence Analysis in Theater Joint Intelligence Centers:  An Experiment in Applying Structured Methods (Washington, DC:  Joint Military Intelligence College, 2000), Occasional Paper #7.  Folker’s study has some methodological flaws.  Specifically, it does not describe one of the independent variables (intuitive method), leaving the dependent variable (test scores) in doubt.

[31] Geraldine Krotow, The Impact of Cognitive Feedback on the Performance of Intelligence Analysts (Monterey, CA:  Naval Postgraduate School, 1992), AD-A252, p. 176.

[32] Sigmund Freud, “A Philosophy of Life:  Lecture 35,” New Introductory Lectures on Psycho-analysis (London:  Hogarth Press, 1933).

[33] “Cognitive loads” are the amount/number of cognitive tasks weighed against available cognitive processing power.

[34] Amos Tversky and Daniel Kahneman, “The Belief in the ‘Law of Small Numbers,’” Psychological Bulletin, 76, 1971, pp. 105-110; Tversky and Kahneman, “Judgment Under Uncertainty:  Heuristics and Biases,” Science, 185, 1974, pp. 1124-1131.

[35] Alexander Butterfield, The Accuracy of Intelligence Assessment:  Bias, Perception, and Judgment in Analysis and Decision (Newport, RI:  Naval War College, 1993), AD-A266, p. 925; Jack Davis, “Combating Mindset,” Studies in Intelligence, Vol. 35, Winter, 1991, pp. 13-18; James M. Goldgeier, “Psychology and Security,” Security Studies, 6(4), 1997, pp. 137-166; Heuer, Psychology of Intelligence Analysis.

[36] Frank H. Knight, Risk, Uncertainty and Profit (Boston, MA:  Houghton Mifflin, 1921).

[37] Caroline Ziemke, Philippe Loustaunau, and Amy Alrich, Strategic Personality and the Effectiveness of Nuclear Deterrence (Washington, DC:  Institute for Defense Analyses, 2000), D-2537.

[38] Sherman Kent, “Words of Estimative Probability,” Sherman Kent and the Board of National Estimates:  Collected Essays (Washington, DC:  Center for the Study of Intelligence, 1994).

[39] Wohlstetter, Pearl Harbor.

[40] Claude Levi-Strauss wrote Structural Anthropology in 1958, setting the stage for structuralism to emerge as an analytic interpretive method.


Rob Johnston is a postdoctoral research fellow at the CIA Center for the Study of Intelligence and a member of the research staff at the Institute for Defense Analyses.

_____________________________________________________________________________________________

OTHER SOURCES:

http://sites.google.com/site/seoanalysisnow/Home

http://advat.blogspot.com/

http://sourcesandmethods.blogspot.com/
