Ethnographic Research in the U.S. Intelligence Community: Opportunities and Challenges

This article considers lessons learned from conducting research inside the intelligence community. Drawing on a year of ethnographic fieldwork and interviews at the National Counterterrorism Center, I show that "boundary personnel," people who navigate between the worlds of academia and national security, provide value added in the form of tacit knowledge that outside researchers would not be able to deliver. At the same time, these people face delays, challenges to freedom of information, and ethical considerations that are unique to their positions. Despite setbacks, social scientists must continue their engagement with national security organizations to further our understanding of how these powerful institutions operate.

negotiating the tension between academic freedom and national security interests. For instance, Evans and Valdivia (2012) discuss the U.S. government's intervention in scientists' publication of the methods used to replicate the H5N1 bird flu virus over bioterrorism concerns; Russell and Webster (2005) discuss a similar controversy over the genetic sequencing of the 1918 flu virus. Others show that the development and deployment of powerful weapons over the last century centralized science and technology as security concerns for the U.S. government (e.g., Vogel, Balmer, Evans, Kroener, Matsumoto, and Rappert 2017; Bussolini 2011). Nuclear weapons in particular have been a consistent focal point, as has the social organization of their development at Los Alamos National Laboratory and Lawrence Livermore National Laboratory (e.g., Gusterson 1996; Rhodes 1986; Teller 1993; Bussolini 2011; Masco 2006).
The emphasis in this literature is understandably on the natural sciences, math, and engineering, so for the most part it overlooks how social scientists negotiate similar tensions when working on issues related to national security, intelligence, and the military. In addition, research exploring the relationship between science and the government focuses almost exclusively on scientists who do research for the U.S. government, rather than those who do research on it. Finally, the contested material in this research is information, not personnel: scientists and government officials, often seeing themselves as occupying separate spheres, negotiate a blurry space in which both entities lay claim to some scientific finding, method, or product. The social positions of the people themselves are not contested in the literature; the spheres seem to blur when committees are formed to negotiate acceptable information use (Evans and Valdivia 2012), but there is no discussion of the ways in which individual researchers may navigate both spaces. This article looks at the challenges and opportunities presented when social scientists themselves blur the boundary between government work and academia. How does their negotiation of this interstitial space affect the research process? What kinds of insights can a security clearance yield for an academic researcher, and - perhaps more importantly - what are its limitations? How might these accesses shape social scientists, and what are the longer-term implications for scholarship as a result of their experiences inside the intelligence community? Finally, what can research inside the U.S. intelligence community teach us about its people, analytic practices, and organizational culture that social scientists would not be able to learn from the outside?
1 Methods: Ethnography is the descriptive documentation of a living culture. It usually combines a number of approaches, such as participant-observation, interviews, and artifact analysis, to present a detailed picture of the culture of interest. It is generally qualitative and designed to present the view of the study subjects as understood from the researcher's own perspective. Ethnographic methods originally emerged in anthropology but have since been adopted by other social sciences, including sociology. The classical view is that ethnography requires the researcher to spend significant time (years, ideally) living among the people they are studying, but sociologists have had success in using ethnography to study cultures in workplaces, schools, or other organizational locations without actually living with the people they are studying. My research fits into this latter application of ethnographic methods. I worked as a counterterrorism analyst for the Central Intelligence Agency (CIA) between 2007 and 2011. During my time there, I sought and received permission to conduct an ethnography of the National Counterterrorism Center (NCTC), the organization created after 9/11 to address the 9/11 Commission Report's finding that intelligence agencies were not sharing information the way they could or should. NCTC was created to house officers from all around the IC under one roof, in the hope that working alongside representatives from other agencies would encourage analysts to share information. During the course of my research, I wanted to explore whether and how that information sharing occurs. As a sociologist, I was interested in learning about the small, day-to-day interactions that comprise the life of the analyst, as well as the larger organizational dynamics that accomplish intelligence work. I conducted 20 in-depth interviews between August and December of 2010. I knew most of the participants personally, but a few people I did not know got in touch with me after hearing about my study and asked to participate. I aimed to get a relatively diverse sample with regard to the analysts' home agencies, to get as many perspectives as possible. Official demographic information was not available, but my perception is that the population of analysts in the wider IC is not particularly diverse on measures of race, gender, and age, so there is some, but not much, variation on these measures in my interview sample. The average age of the interview participants was 31.8 years, with a range of 24 to 47 and a median of 31.5. All but three of the participants identified as white; 13 identified as female and 7 as male. The average amount of time served at NCTC was 2.48 years, with a range of four months to six years and a median of two years. The average amount of time served in the federal government was 7.7 years, with a range of 2 to 15 years and a median of 7.5 years. Thirteen of the participants had at least one graduate-level degree, usually a master's degree. Thirteen of the participants claimed CIA as their home organization; seven of the longer-serving participants claimed more than one home organization over the course of their careers.

Tacit Knowledge

In thinking of the ways in which people may blur the space between academia and security organizations, it is useful to deploy the concept of boundary objects, "those objects that both inhabit several communities of practice and satisfy the informational requirements of each" (Bowker and Star 1999, 297). What we might call "boundary personnel" instead of "boundary objects" have similar features. I argue that social scientists wishing to study how the intelligence community (IC) works cannot get a comprehensive picture of what the organizations are like without spending time inside them. My own entry into the IC illustrates how steep that inside learning curve is:

The learning curve was unbelievably steep, not only in the methods of doing the job but also the ways in which people exchanged mere pleasantries. I frequently found myself understanding the words, but not the meaning, of what people said to me, which created a profound sense of culture shock and made the adjustment incredibly difficult. This shock was exacerbated by the secrecy in which the job is engulfed; there is no real way to prepare oneself for the first days and weeks of this job. It was as though a curtain were lifted, and all of the people, places, and things behind it came hurtling at me all at once with the force of water from a fire hose. (Nolan 2013, 3)

I was explicitly told in my first weeks inside the IC that I should not expect to get anything done (in the form of publishing classified papers) for at least the first few months, maybe even a year. It can take up to six months to be granted access to some classified systems, and the tacit knowledge required not only to do quality intelligence analysis but also to navigate routine daily situations simply must be acquired the hard way. Many of my colleagues pointed to the organization's reliance on tacit knowledge as one of their greatest sources of overwhelm and frustration.
They reported, and my own experience confirmed, that tacit knowledge ranged from relatively small matters, such as figuring out the correct form to complete, to larger matters, such as knowledge of one's own job description or the goals of the larger organization. Part of this confusion at NCTC stems from the fact that, as a fledgling organization, NCTC's predecessor simply adopted many of the CIA's bureaucratic procedures. This is confusing for non-CIA personnel at NCTC, but it is also confusing for CIA employees, because it is often unclear which organization is supposed to take responsibility for an action. Some analysts told me that they felt the management sometimes used this lack of clarity between agencies as an excuse not to fund training opportunities or travel expenses.
A few excerpts illustrate this idea:

CIA analyst: If there is [frustration at work], it's usually figuring out how you're supposed to do something, what the proper procedure is. It's hard to figure out if it's not written down anywhere. Like travel. The simplest thing like buying a plane ticket is so not intuitive, and you often don't even know to ask. (Nolan 2013, 34)

CIA analyst relatively new to the IC: This place is just so weird, because the people here seem to assume things that are not at all intuitive, and then they get mad when you haven't come to those conclusions yourself. Like, when I first started here, they were doing some construction and there were fewer parking spaces, so they instituted this valet parking system. But it wasn't like a valet in the real world where you pull up and hand the attendant the key or whatever. You were supposed to just know that you had to leave your car key on the left rear tire, in case the valet had to move your car. Even on your first day without having met anyone you were somehow supposed to know this. Well, I'd been working there for months and I didn't know it. Why would you assume something like that? How does that make intuitive sense? So I took my keys with me, and it turns out I was blocking in one of the higher-ups who needed to get out to go to a briefing, and they sent this system-wide flash alert to everyone's computer screen about my car, and it was just so embarrassing, but like, how was I supposed to know? And how was I supposed to know that I didn't even know how to park my car? Stuff like that makes you start second-guessing everything you took for granted before. (Nolan 2013, 34)

These examples barely scratch the surface of the many ways in which intelligence work relies on tacit knowledge.
Creating written products - the primary task for intelligence analysts - is frequently described as an "art," which connotes the importance of subjective judgment and creativity (e.g., Hasler 2010; Crumpton 2013). Even more often, intelligence work is characterized as "tradecraft," a "catchall for the often-idiosyncratic methods and techniques required to perform [intelligence] analysis," defined as "practiced skill in a trade or art" that "purposefully implies a mysterious process learned only by the initiated and acquired only through the elaborate rituals of professional indoctrination" (Johnston 2005, 17-18). Echoing MacKenzie and Spinardi's (1995) finding that judgment is a collective phenomenon rather than an individual one, analytic papers in the IC are considered "community" products; the paper does not bear the name of the individual author, but rather the seal of the institution for which the author works. The CIA at least recognizes that tacit knowledge is embodied; some CIA-sponsored documents explicitly talk about the loss of institutional memory due to attrition (e.g., Johnston 2005), and during my time there, people in positions of power talked about the need to preserve this kind of embodied information by strengthening mentoring programs and generating "lessons learned"-type documents.
All of this speaks to the benefits - indeed, the necessity - of having an intelligence practitioner with the proper academic training conduct research on the IC. Academics with no personal experience on the "inside" may find their lack of access to be an insurmountable challenge; the practitioner's embodied sense of the profession is thus the greatest opportunity afforded by their access.

Socialization and the Security Clearance
"Insider" status presents challenges in addition to its opportunities.
Getting a security clearance to work at the CIA is a long, arduous process, and holding one changes the holder; consider Masco's (2010, 441) discussion of secrecy's "distorting effects" and Daniel Ellsberg's (2002) reflections, in his autobiography, on the psychological effects of access to classified information. That access has static and dynamic elements; once you know something, you know it, but the maintenance of that information requires obfuscation, occasional lying, and many kinds of keeping track, all of which is invisible labor that takes a toll.
Moreover, the onboarding processes for intelligence practitioners vary by agency. My research shows that there is a persistent status hierarchy among the intelligence agencies and that the CIA is at the top of that hierarchy, even after the post-9/11 restructuring of the U.S. Intelligence Community. It is therefore impossible to present oneself as both an insider and as a neutral researcher. As a researcher, I had the bias-mitigating advantage of not having physically worked at CIA Headquarters for very long before I started working at the National Counterterrorism Center, but I was still a CIA person, and was viewed and treated as such during my time in the IC. I therefore tried as much as possible to foreground the experiences and stories of others, and included my own only when they were also corroborated by my colleagues.
Still, insiders conducting any kind of fieldwork must guard against what is sometimes called "going native" (Hammersley and Atkinson 1995, 110) in social science research and "clientism" in the intelligence community (Lowenthal 2014, 163): the tendency to become so immersed in the target population that the researcher defends or apologizes for that population instead of analyzing it more objectively. All ethnographers must also make choices about what to document and what to leave out. Because writing up field observations is already a process that shapes rather than reflects the population of interest (Hammersley and Atkinson 1995), the added security concerns surrounding identity protection constitute a further challenge to the research process. I certainly made these kinds of choices, both consciously and - I am sure - unconsciously. There were many incredibly rich details and interactions that I could not document because of the highly sensitive circumstances in which they occurred, and there were times when I probably could have written about certain situations but chose not to in order to protect my colleagues. I felt that these choices were necessary for me to uphold both my oath of office and the ethics of field work, but I am cognizant that these choices also necessarily affect both the research process and the final product.

Ethical Concerns
Among the social sciences, anthropology has the longest and most contentious history of engagement with the military and intelligence communities. None of the dilemmas I encountered rises to the level other social scientists have faced in their work on and for the IC and the military, but it is important that each ethical dilemma be taken seriously and navigated successfully in order for the work to maintain its integrity.

Longer-Term Issues for Scholarship
There can be no doubt that the primary challenge presented to boundary personnel stems from the non-disclosure agreement employees with security clearances are required to sign. This contract requires the signatory's surrender of some First Amendment rights. From the moment of signing, whether the employee works in national security for a day or for thirty years, they must submit all writing related to their job to a board that will review it to determine whether it contains classified information. At the CIA, that body is the Publications Review Board (PRB); if other entities, such as the Office of the Director of National Intelligence (ODNI), have a stake in the material, they also weigh in. This requirement extends beyond traditional publication outlets such as books or articles to include blog posts, opinion pieces, tweets, Facebook posts, resumes, speeches, and more, and the material must be cleared before it can be shared with editors, colleagues, advisers, friends, or anyone without a clearance for that material (Central Intelligence Agency 2016). The need to protect against the disclosure of classified information is of course undeniable, and signatories enter into this contractual obligation freely.
This requirement begins to present challenges to scholarship when classification standards become unclear. Information is classified when it is determined that its disclosure would cause some degree of damage to national security; the difference among Confidential, Secret, and Top Secret information is that its disclosure would respectively cause "damage," "serious damage," and "exceptionally grave damage" to national security (Executive Order 13526, 2009). Very little guidance is available to help determine the differences among these phrases, which leads to overclassification by practitioners (e.g., Ellington 2011) and uncertainty for would-be writers and publishers. Materials available in the public domain are sometimes redacted anyway (Masco 2010), and approaches to releasing classified or sensitive information seem to vary by agency (Masco 2010).
I argue that information review also varies within agencies. For instance, when I revised a chapter of my dissertation, I included a few sentences that the PRB had previously approved, but they were returned to me this time redacted in the following way: [CIA applicants] must endure days of physical and psychological testing, including a harrowing polygraph examination, during which the examiner may redacted redacted redacted redacted to the applicant. My polygraph lasted redacted hours, and while spending most of that time strapped to a tight blood pressure cuff, I redacted redacted redacted redacted redacted redacted redacted redacted redacted redacted.
I requested clarification of the new redactions, and included the original approved manuscript from January 2015 to show where the text had previously been cleared. I was told that the redactions were based on "current classification guidance," and that the Board upheld all redactions, with the exception of the word "eight," which I was now allowed to print: [CIA applicants] must endure days of physical and psychological testing, including a harrowing polygraph examination, during which the examiner may redacted redacted redacted redacted to the applicant. My polygraph lasted eight hours, and while spending most of that time strapped to a tight blood pressure cuff, I redacted redacted redacted redacted redacted redacted redacted redacted redacted redacted.
In other words, as of January 2015, I was allowed to say that my polygraph lasted eight hours (and that version of the manuscript remains on the Internet for anyone to see); on November 14, 2017, I was no longer allowed to say the word "eight"; and two days later, on November 16, it was allowed again. It is difficult to believe that these changes were based more on objective "current classification guidelines" - and that those guidelines happened to have changed in a two-day period in a way that specifically affected my use of the word "eight" - than on the subjective individual variation introduced by whoever answers the request for review. Does it really matter whether people know my polygraph lasted "hours" or "eight hours"? No, but the principle is the point: As we know from Durkheim, the organizational approach to managing secrets is often more important than the content of those secrets ([1912] 1995). Moreover, the workings of the PRB itself are secretive: Despite my efforts to find out, I still do not know how many people sit on the PRB nor whether its decisions are reached by majority vote or some other procedure. Just as the practice of "science" is not static, but rather a dynamic, iterative process (Gieryn 1983), so too is the production and reproduction of "classification," echoing Vogel et al.'s (2017) point that knowledge more generally is socially constructed at every step along the way.
Other scholars have grappled with similar issues. Aftergood (1999) writes about what he calls "genuine national security secrecy" - in other words, legitimate secrecy - as opposed to "political secrecy" (secrecy maintained for political advantage) and "bureaucratic secrecy" (20) (the Weberian tendency of bureaucracies to control perception of the organization by restricting information), both of which are illegitimate but nonetheless pervasive. Gusterson (1999, 58) addresses a related problem - the "death of the author" - among nuclear scientists working at Livermore Laboratory, but the lack of ownership over their government work is an issue with which all practitioners must contend if they intend to negotiate the transition back to academia successfully.
Restrictions on the writing of former employees mean that boundary personnel do not have the same freedom to respond to critics of their writing or to comment on current events in a timely manner. These days, it is not uncommon for academics to have Twitter accounts or personal websites that they use to increase their visibility in their field, but the requirement to send all writing to the PRB introduces delays that diminish the effectiveness of these platforms when used for career-enhancement purposes.
Longer-term implications for scholarship also depend upon one's status in the IC. If the researcher leaves the IC to pursue academia, they are still beholden to the regulations of the review board but may be unable to gain further access to the population. Thus, if an editor or reviewer raises an issue that would be best addressed with further inquiry-follow-up interviews or more observations-the researcher is likely unable to fulfill this request. If the researcher has remained inside the IC to allow for continued access, they are subject to additional standards of review that may make getting manuscripts out even more difficult than it already is. Although a former employee's writings can only be rejected if the manuscript contains classified information, current employees face additional standards, such as whether the research interferes with the employee's job duties or with U.S. national security interests.
Staying on the inside also means the researcher must continue to deal with the labyrinthine bureaucracy that comprises the U.S. government. On top of the standard delays and inefficiencies, the researcher may be met with resistance from unknown bureaucrats who are not even involved in the research and spend much of their time seemingly spinning their wheels.
Thus, there are pros and cons to staying or leaving, but either decision will inevitably result in delayed or stalled projects. On the plus side, having worked in the IC can lend a degree of credibility and value added that cannot be achieved with an outsider perspective, which may offset some of the costs of conducting research. There is an argument to be made that the benefits outweigh the costs, but to the extent possible, academics should plan and prepare for these substantial costs when developing their research agendas.

Conclusion
I have focused primarily on the challenges and opportunities presented to individual scholars -what I've termed "boundary personnel" -as they navigate dual identities as intelligence practitioners and scholars. But it is crucial to remember that this negotiation is taking place in the context of macro-level approaches to notions of secrecy and openness that are constantly shifting in response to and/or in anticipation of geopolitical realities. In the 21st century -and especially post-9/11 -the government has publicized efforts to declassify more material (McDermott 2011), and the official rhetoric has shifted towards the value of increased information sharing and collaboration among the intelligence agencies, from a "need to know" to a "need to share" posture. 3 At the same time, researchers have seen a contradiction emerge: They argue that this same period has resulted in more classification and restricted access to information from the outside (Ellington 2011;Masco 2010) in what they term the "securocrats' revenge" (Aftergood and Blanton 1999, 457) and the "iron curtain of secrecy" (Ericson 2005). These constantly shifting dynamics mean that there can never be a standard approach to this work; scholars must contend with the sense that the sand is always shifting beneath their feet. Rindzeviciute (2015) has suggested that massive government disclosures can go too far the other way and cause unanticipated harm, such that a "sweet spot" between full secrecy and full transparency may be desirable, if it even makes sense to think in these terms. But it is not in the government's best interest to limit inquiries too much either, because transparency and accountability are essential elements of a democracy.
Similarly, it is unwise for boundary personnel to back away entirely and become discouraged by these many challenges. Social scientists -especially those who study organizations and bureaucracies more generally -have an intellectual obligation to engage with the intelligence community and other security institutions, so that we may better understand how these complex seats of power operate. The value added of having practitioners do this work is that their tacit knowledge provides nuance and complexity to a body of scholarship that would otherwise suffer from their absence.