Locally Nuanced Actionable Intelligence:

Operational Qualitative Analysis for a Volatile World

John Hoven and Joel Lawton

forthcoming, American Intelligence Journal

this version: July 2015
Conventional-force companies learned much over the past 12 years as they executed missions historically reserved for Special Forces. War is fundamentally a human endeavor, and understanding the people involved is critically important. (Cone and Mohundro 2014: 5)
Context is critical… Aggregated and centralized quantitative methods … lack context and fail to account for qualitative inputs. Consequently, such reports often produce inaccurate or misleading findings. (Connable 2012: xviii)
US forces have many opportunities to interact with the local population in the normal course of their duties in operations. This source perhaps is the most under-utilized HUMINT collection resource. (Army FM 2-22.3 2006: ¶5-22)
Abstract

Qualitative analysis has two extraordinary capabilities: first, finding answers to questions we are too clueless to ask; and second, causal inference – hypothesis testing and assessment – within a single unique context (sample size of one). These capabilities are broadly useful, and they are critically important in village-level civil-military operations. Company commanders need to learn quickly, "What are the problems and possibilities here and now, in this specific village? What happens if we do A, B, and C?" – and that is an ill-defined, one-of-a-kind problem.

The core strategy of qualitative analysis is fast iteration between information-gathering and analysis, rapid-fire experimentation that generates rapid learning. That is also the core strategy of an iterative product development approach called Agile Management/Lean Startup: make small changes in product features that address specific, poorly understood problems and possibilities; solicit customer feedback; iterate rapidly; pivot sharply as needed to explore more promising opportunities. That is the approach we are taking to adapt qualitative research methods to an operational tempo and purpose. The U.S. Army's 83rd Civil Affairs Battalion is our "first user" partner in innovation.

In the paper's opening vignette, the need for locally nuanced actionable intelligence is illustrated by an incident in Afghanistan where a single day's interviews revealed the presence of an insurgent element that had escaped the notice of carefully collected database evidence.


Disclaimer: The opinions expressed in this paper are those of the authors, and do not represent those of the U.S. Army, the Department of Defense, or the U.S. Government.
Acknowledgements: Valuable feedback on portions of this paper was received from presentations at an intelligence analysis workshop (“Understanding and Improving Intelligence Analysis: Learning from other Disciplines”) held at Ole Miss in July 2013, a Five Eyes Analytic Training Workshop in November 2014, and the 2014 and 2015 Annual Conferences of the International Association for Intelligence Education (IAFIE); and from hundreds of conversations with military and intelligence professionals at all levels, at conferences of IAFIE, AFIO (Association of Former Intelligence Officers), NMIA (National Military Intelligence Association), NDIA (National Defense Industrial Association), INSA (Intelligence and National Security Alliance), ISS-ISA (Intelligence Studies Section of the International Studies Association), FAOA (Foreign Area Officer Association), and the Civil Affairs Association.

About the Authors


John Hoven (www.linkedin.com/in/johnhoven) is an innovation broker between those who do qualitative analysis and those who need its capabilities for operations and assessment. He recently completed a 40-year stint analyzing complex, dynamic relationships in merger investigations, as a qualitative microeconomist in the U.S. Justice Department's Antitrust Division. Dr. Hoven earned a Ph.D. in economics from the University of Wisconsin at Madison, an M.S. in physics from the University of California at Berkeley, and a B.A. in mathematics and physics from the University of Montana at Missoula. He is also an accomplished bassoonist, performing regularly in the D.C. area at lunchtime concerts of the Friday Morning Music Club.
Joel Lawton (www.linkedin.com/in/joellawton0125) is a former member of the Human Terrain System (HTS), U.S. Army Training and Doctrine Command (TRADOC). His work with HTS included assignments in the U.S. and two tours in Afghanistan, where he conducted socio-cultural research management, collection, and support, as well as open-source intelligence analysis and qualitative data collection and analysis. Joel served in the USMC, deploying to southern Helmand Province in 2009 in support of combat operations. He is an advocate of qualitative analysis and its use in Military Intelligence collection efforts. He currently works as an Intelligence Analyst for the TRADOC G-2 at Fort Eustis, VA, and resides in Newport News, VA. The vignette in this paper was presented at a November 2014 Five Eyes Analytic Training Workshop.

Part One


Joel Lawton
Operational Qualitative Analysis: A Vignette
I deployed twice to Afghanistan as a Human Terrain Analyst, conducting socio-cultural research in support of Intelligence Preparation of the Operational Environment (IPOE) requirements with the Human Terrain System (HTS), U.S. Army Training and Doctrine Command (TRADOC) G-2. Qualitative interviewing was our primary means of collection and analysis.

In 2011, I worked with a German Provincial Reconstruction Team (PRT) in Kunduz Province. Their working hypothesis was that fewer significant activities (SIGACTs) meant fewer insurgents in an area. In areas assessed as safe, they could start pouring money into development – bridges, roads, schools, etc.


They had a very structured way of assessing this, through a questionnaire called TCAPF (Tactical Conflict Assessment and Planning Framework), developed by USAID. TCAPF asked a simple set of questions: What would you do to improve your village? Give me three things. How would you prioritize them? Who would you go to, to get these things done?

You enter the answers in a database, and the PRT can use it to prioritize resources.


I was one of the enablers identified by the PRT to use such questionnaires to assist in development strategy planning. I used it essentially as a starting point for probing, open-ended questions. After asking "Who would you go to?", I would ask "Why would you go to this guy? Who's the best person in this scenario?" So I'd get the answers for the PRT database, and I'd get my qualitative response.
I went to one village in the northwestern part of Kunduz Province. It was assessed as safe, due to the general absence of SIGACTs. During my very first interview of the day, I noticed something was not right. I had conducted hundreds of interviews at this point. Afghans are a very narrative society. Most of them can't read and write, so they like to talk to you. But that day, something was off. My first interviewee gave very short, quick answers, didn't want to answer my follow-up questions, and appeared uncomfortable talking to me. Very odd.
There was an 8- or 10-year-old boy next to me. He said something to my interpreter and ran off. That was strange, so I asked my interpreter, "What did the boy say?" He said, “We are not allowed to talk to you today; the men with beards are here today.”
I said, "Aha, this now makes sense – there's probably Taliban in this village."
I knew some things about Afghans in that region. The economy was agrarian, subsistence farming; the people were very malnourished – and, being malnourished, smaller and bent over. From working in the fields, they tended to have bulging knees. They wore sandals and were largely unclean.
I noticed there were 10 to 15 men who did not fit the stereotype. They were almost German-like Afghans: big, six feet tall, clean, with no calluses on their hands from working in the fields, and they wore boots rather than sandals (everyone else in the village wore sandals). I said, "I know who these guys are."
In order to get any value out of this, I knew I had to prove that these individuals were not from this village.
I did not know the names of the village elders, what sub-tribes were present, what crops they grew, or when the last harvest had been. But I knew that somebody from that village would know all those answers, just like that. I decided to ask the suspected insurgents these sorts of questions.
I talked to about five of them. Each of the five gave me a completely different set of answers.
I wrote up my notes and observations and briefed the PRT’s commander and G2. The G2 immediately passed the information to a U.S. Army Special Operations Forces (SOF) element operating in proximity. That information led to a complete cordon and search of that entire village the very next day, and then of an adjacent village.
Qualitative analysis can be actionable. If I had just used the TCAPF questions to fill in a database, I would never have discovered any of that information. By using TCAPF as a starting point and going on to follow-on, probing questions, you get contextually rich, locally nuanced information that can reveal things you never even thought to ask in the first place.
The value of this technique is reflected in an Army Field Manual (FM) titled “Soldier Surveillance and Reconnaissance” (FM 2-91.6), which states:

"Interaction with the local populace enables Soldiers to obtain information of immediate value through conversation… Every day, in all operational environments, Soldiers talk and interact with the local populace and observe more relevant information than technical sensors can collect." (Army FM 2-91.6 2007: 1-16, 1-48)


Operational qualitative analysis can provide a means to conduct rapid, tactically oriented assessments. Commanders at the battalion echelon and below can quickly impact their operational area of responsibility (AOR) through this simple and revealing approach.
Part Two

John Hoven


I. Introduction
Agile intelligence is a daunting challenge in a volatile, locally nuanced world. Computers can analyze massive amounts of data, but key factors are often unmeasurable. Those key factors are also, like the swirl of a tornado, remarkably specific to each time and place. Discovering key issues may be the most urgent priority, and that may require answers to questions we are too clueless to ask.
On the other hand, ordinary social conversations routinely reveal undiscovered issues. Follow-up questions yield answers to questions we hadn't thought to ask. This is familiar territory for the human mind. Qualitative interviewing builds on this, seeking “more depth but on a narrower range of issues than people do in normal conversations.” (Rubin and Rubin 2012: 6) If the project is a collaborative effort – a team of analysts, interviewers, and other data collectors – the team members constantly discuss what they've learned, what they need to learn next, and where to get it.
Qualitative analysis is explicitly designed for fast learning in poorly understood situations with “confusing, contradictory … rich accounts of experience and interaction." (Richards 2009: 4; cf. Ackerman et al. 2007: xvii) Moreover, the focus in qualitative analysis is on one specific context (or several, for comparison studies). That context-specific focus is especially critical for civil-military operations, which need to understand quickly how and why the various actors and actions interact in a specific, rapidly evolving context. (Kolton (2013) and Carlson (2011) are especially insightful examples.)
Operational qualitative analysis is qualitative analysis adapted to an operational tempo and purpose – e.g., for civil-military operations. This is a multi-step challenge. The first hurdle is inherent in any radical innovation: none of my professional peers are considering it, so it is not worth considering. That hurdle was overcome through hundreds of brief conversations with military and intelligence professionals at conferences and meetings in the DC area, constant revision of the concept and the search, and the eventual discovery of a "first user" innovation partner, the U.S. Army's 83rd Civil Affairs Battalion. Since then, progress has been dramatic. That, too, is a common phenomenon in radical innovation: progress is slow until all the pieces come together, and then innovation takes off. The next steps are still a steady climb up a mountain of unknown unknowns, but we know how to do that. (See Section IV.)
In short, the role of operational qualitative analysis is to facilitate rapid and accurate decisionmaking in one-of-a-kind, poorly understood contexts for civil-military operations. The core strategy is fast iteration between information-gathering, analysis, and action.
Section II asks "When is operational qualitative analysis the right tool for the job?" and highlights four key considerations: (a) concurrent collection and analysis, (b) unknown unknowns, (c) richly nuanced data from a single context, and (d) causal inference and assessment within a particular context. Section III is a brief tutorial in two basic skills: qualitative interviewing and making sense of the data. Section IV describes our project to adapt qualitative analysis to an operational tempo and purpose, in partnership with the U.S. Army's 83rd Civil Affairs Battalion. Section V concludes.
II. When is operational qualitative analysis the right tool for the job?
A. Concurrent collection and analysis


[Figure: Collection and Analysis in qualitative analysis versus U.S. intelligence analysis]
In the U.S. intelligence system, collection and analysis are typically separate endeavors, like the Pony Express delivering news from one to the other. They work together, but they keep their distance unless there is a pressing need to work more closely together.1


I have argued for many years that collectors and analysts should work more closely together, but at the CIA all the efforts to make that happen have failed. (Hulnick 2008: 632)
... the heretofore separate endeavors of collection and analysis… It’s certainly appropriate at the agency levels to keep them separate... But at the level of ODNI [Office of the Director of National Intelligence] I believe they should be integrated. (Clapper 2010)
The doctrinally "correct" process for customer-collector interface via Ad-Hoc Requirements (AHRs), HUMINT Collection Requirements (HCRs) and evaluations is too slow and cumbersome. (Gallagher 2011: 7)
… information evaluation and analysis are highly interdependent… it would be interesting to determine in future interviews whether or not a feedback loop exists between analysts and collectors. (Derbentseva 2010: 19)
In contrast, qualitative analysts constantly go back and forth between analysis and data collection. Even the term "qualitative analysis" is normally understood to mean qualitative data-collection-and-analysis. The two are practically inseparable. Analysts read documents and interview transcripts looking for search terms, to focus the search for additional documents and interviewees. After each interview, investigators discuss what they've learned and say, "Now we need to interview these people and ask these questions." Hypotheses are discovered, tested, revised, and discarded. As they evolve, they redirect the search for relevant information.
In some kinds of social research you are encouraged to collect all your data before you start any kind of analysis. Qualitative research is different from this because there is no separation of data collection and data analysis. (Gibbs 2007: 3)
A striking feature of research to build theory from case studies is the frequent overlap of data analysis with data collection… The central idea is that researchers constantly compare theory and data – iterating toward a theory which closely fits the data… Case study theory building is a bottom up approach … Such theories are likely to be testable, novel, and empirically valid, but they … are essentially theories about specific phenomena. (Eisenhardt 1989: 538, 541, 547)
Connable (2012) argues that when locally nuanced intelligence is critical (as in counterinsurgency), local commanders should direct both collection and analysis:
Local commanders are best positioned to direct the collection of information over time for several reasons: (1) they understand the immediate cost and risk of that collection; (2) they and their staffs can analyze that information in context; and (3) they can adjust collection and reporting to meet current local conditions and context. (Connable 2012: 229)
COIN [counterinsurgency] information is best analyzed at the level at which it is collected. The COIN environment is complex and marked by sometimes-extreme variations in physical and human terrain, varying degrees of progress from area to area (and often from village to village), and varying levels of counterinsurgent presence and collection capability. (Connable 2012: xx)
Army Field Manuals illustrate how tantalizingly close the U.S. Army has come to integrating locally nuanced collection and analysis:
US forces have many opportunities to interact with the local population in the normal course of their duties in operations. This source perhaps is the most under-utilized HUMINT collection resource. (Army FM 2-22.3 2006: ¶5-22)
In that spirit, the Army's "Soldier Surveillance and Reconnaissance" Field Manual says, "Interaction with the local populace enables Soldiers to obtain information of immediate value through conversation… Every day, in all operational environments, Soldiers talk and interact with the local populace and observe more relevant information than technical sensors can collect." (Army FM 2-91.6 2007: 1-16, 1-48) It also advises, "Well-crafted open questions … serve as an invitation to talk – They require an answer other than 'yes' or 'no'." (Army FM 2-91.6 2007: 3-9)
And then … it instructs Soldiers to ask only basic fact-finding questions.2 No conversational questions. No follow-up questions. No opportunity to discover answers to questions we are too clueless to ask:
EXAMPLE QUESTIONS (Army FM 2-91.6 2007: 3-11, 3-35)

  • What is your name (verify this with identification papers and check the Detain/Of Interest/Protect Lists)?

  • What is your home address (former residence if a dislocated civilian)?

  • What is your occupation?

  • Where were you going (get specifics)?

  • Why are you going there (get specifics)?

  • What route did you travel to arrive here?

  • What obstacles (or hardships) did you encounter on your way here?

  • What unusual activity did you notice on your way here?

  • What route will you take to get to your final destination?

  • Who do you (personally) know who actively opposes the US (or multinational forces)? Follow this up with "who else?" If they know of anyone, ask what anti-US (multinational force) activities they know of, where they happened, and similar type questions.

  • Why do you believe we (US or multinational forces) are here?

  • What do you think of our (US or multinational force) presence here?

DO NOT––


  • Take notes in front of the person after asking the question…

DO—

  • Ask only basic questions as described in this section.

B. Unknown unknowns




Qualitative analysis starts with

  • unknown unknowns

  • no hypotheses to test3

U.S. intelligence analysis starts with

  • known unknowns

  • a full set of alternative competing hypotheses

U.S. intelligence directs collection efforts at known unknowns:


PIR [Priority Intelligence Requirements] should … [i]dentify a specific fact, event, activity (or absence thereof) which can be collected….PIR are further broken down into specific information requirements (SIR) and specific orders and requests (SOR) in order to tell an intelligence asset exactly what to find, when and where to find it, why it is important, and how to report it. … Decisions based on unanticipated threats or opportunities could never be reduced to PIR, SIR, and SOR quickly enough to assist the commander. (Spinuzzi 2007: 19)
Moreover, the focus is on well-understood problems for which a full set of plausible hypotheses can be articulated, and used as the basis for collection requirements:
Analysis of Competing Hypotheses [ACH] … requires analysts to start with a full set of plausible hypotheses… ACH is particularly effective when there is a robust flow of data to absorb and evaluate. For example, it is well-suited for addressing questions about technical issues in the chemical, biological, radiological, and nuclear arena. (Heuer & Pherson 2011: 32, 160)4
By contrast, qualitative analysis expects the unexpected. The problem is ill-defined (or a poorly understood aspect of an otherwise well-understood problem) and the goal is to make it a well-defined problem:
Quantitative methods assume that researchers already know both the key problems and the answer categories; these types of questions … often missed turning points, subtleties, and cross pressures…

In exploratory studies … follow-up questions may dominate the discussion … to explore unanticipated paths suggested by the interviewees … These questions are at the heart of responsive interviewing, because they allow you to achieve the depth of understanding that is the hallmark of this approach to research. (Rubin and Rubin 2012: 9, 122, 150)


Any given finding usually has exceptions. The temptation is to smooth them over, ignore them, or explain them away. But the outlier is your friend… Surprises have more juice than outliers. (Miles, Huberman, and Saldaña 2013: 301, 303, emphasis in original)
Qualitative analysis does not usually articulate hypotheses at the start of a project, when so little is known. When our understanding is so frail, one interview is enough to find that we're looking at this all wrong – as this paper's opening vignette so strikingly demonstrates. Relevant concepts and hypotheses are discovered, tested, and revised repeatedly as evidence accumulates about actors, actions, relationships, etc.
Finally and most importantly, theory-building research is begun as close as possible to the ideal of … no hypotheses to test…

The central idea is that researchers constantly compare theory and data – iterating toward a theory which closely fits the data… One step in shaping hypotheses is the sharpening of constructs. This is a two-part process involving (1) refining the definition of the construct and (2) building evidence which measures the construct in each case. (Eisenhardt 1989: 536, 541)


Qualitative research refrains from … formulating hypotheses in the beginning in order to test them. Rather, concepts (or hypotheses, if they are used) are developed and refined in the process of research. (Flick 2007: xi)
Ill-defined problems are often seen as atypical and unusual. They are not. They are the commonplace problems that routinely emerge from specific contexts of everyday life:
Ill-structured problems are typically situated in and emergent from a specific context. In situated problems, one or more aspects of the problem situation are not well specified, the problem descriptions are not clear or well defined, or the information needed to solve them is not contained in the problem statement (Chi & Glaser, 1985). Ill-structured problems are the kinds of problems that are encountered in everyday practice… (Jonassen 1997: 68)
C. Richly nuanced data from a single context


Data for qualitative analysis

  • Interviews and other richly nuanced data from a single context (sample size = 1)


Qualitative analysis can

  • explain and predict what happens, here and now, if we do A, B, C ...

Data for U.S. intelligence analysis

  • Indicators from various contexts



U.S. intelligence can

  • view trends, and compare the value of indicators in different contexts
