Myths vs. Realities: Bias and Groupthink in Private Intel/Geopolitical Risk
This is the third part in the Myths vs. Realities series. If you're new to the site, you may want to start with the Glossary. To catch up on previous posts in this series, scroll down; they're all on this page.
In the previous two parts, I examined some similarities and differences between the US Intelligence Community and the way that private intelligence and geopolitical risk firms function. Bias and groupthink are prevalent in both; I don't claim that the IC is any better at combating either problem effectively. However, despite having far wider flexibility in hiring practices, private intel firms suffer from bias and groupthink even more acutely. The following are five types of bias and groupthink that I've observed during my five years in this field:
Selection-driven Groupthink
For the IC, the security clearance process can substantially limit the pool of applicants that it can consider, necessarily eliminating citizens of other nations, US citizens with extensive time overseas, naturalized US citizens with certain backgrounds, and so on. This is a security issue, and a reality that the IC cannot change. In the private sector, however, there is significantly greater flexibility in the types of people who can be hired. Some of the very backgrounds that may prevent an individual from obtaining a security clearance in the IC could in fact provide invaluable expertise at a private firm: close familiarity with other cultures, native language expertise, and wide-ranging business connections or personal contacts. And yet, most private intel firms are full of analysts from the same few top-tier universities, with the same think tank internships, generic study abroad experiences, and basket of intermediate language skills. These people had many of the same professors, read the same books, and wrote the same types of theses. They all got together at the university bar to bemoan becoming jaded cynics who are never surprised by anything. Now they read the same websites and books, and fantasize about the same PhD programs, while secretly hoping they'll one day be Secretary of State. I don't mean to suggest that there's anything wrong with top-tier university graduates who become analysts, but when a company tends to hire from the same three schools, it shouldn't be surprised that over time, groupthink becomes a major issue.
It's no wonder, then, that these very similar people think the same way, arrive at the same conclusions, and confirm each other's analytical assessments (on the rare occasions when they bother to discuss them). Diversity among analysts is always good, but it's not a magical solution either. Even a diverse group can produce uninspired analysis if projects are written by a single analyst who never engages with others to debate hypotheses, explore possibilities, or consider alternative perspectives. And in fact, most projects are the result of an analyst working solo, thereby exacerbating this groupthink issue and leading to the next two types of bias, which are inextricably linked. One leads to the other, and both stem from the lack of a division of labor between collection and analysis.
Collection Bias
As I mentioned previously, the IC divides collectors and analysts into two separate groups, and the skills needed to excel in each job vary greatly. In most private intel firms, the same person typically collects and analyzes the information, interviews subject matter experts, fact-checks their own research, and writes the assessment. That's a lot for one person, and it reminds me of Charleston Tucker's improbable character on State of Affairs. She runs covert ops, shoots people, breaks people out of custody, and, oh by the way, puts together the Presidential Daily Brief and actually briefs the President. When someone is a one-person private intel firm, they can't possibly execute every single function well.
The quality of the analysis is then determined not just by the analyst's analytical prowess, but also by the quality of their research skills. Usually, an analyst confirms a piece of information in two other sources at most, rarely checking whether the two sources are truly independent, and deems the information correct. For some things, two sources are enough; for others, two sources are inadequate and could significantly skew the analytical conclusion. Given short deadlines, the analyst will typically look for just enough information to make their assessment, and won't do the extra work to confirm it, challenge their own preconceived notions, or consider whether a certain piece of information is driven by an agenda and is therefore inaccurate or useless for making decisions.
Confirmation Bias
When the information is weak, the intelligence derived from it is weak. When we simply try to confirm our preconceived notions, that's not good analysis. Many of us look at the world through a set of lenses that we developed during our college and graduate school years. Some of us are realists, others liberals or constructivists. Those paradigms help us interpret the world around us, because we prioritize certain connections, dynamics, and conflicts over others. These lenses lead us to certain suspicions or theories about what explains international developments and how to interpret new information. Our specialties also affect our analyses. With regard to the recent drop in oil prices, some saw the Saudis using OPEC as an offensive weapon to hurt Russia, Iran (and Syria), while others suspected that the Saudis were taking a defensive posture, protecting an already eroded market share as US shale production increases. Both can be true, and both may well be. But if an analyst's first thought is that Saudi Arabia is gaining power in the region, they may only search for information to confirm that suspicion, and ignore the market share question, which may in fact be the greater of the two motivations, or of greater interest to a client. If the analyst is also the collector, they're casting a much narrower net for information. If the functions were split, a collector would cast a wide net and gather as much information as possible to pass on to the analyst, who would then have to reconcile the disparate facts and consider many competing explanations for what may be happening.
I should note here that clients are part of the problem with this particular type of bias. Many already have a preconceived notion that they're trying to confirm, because they've already made plans to take a certain action and want to reassure themselves that they're right to do so.
One of my biggest frustrations is that geopolitical risk analysis is not truly a part of clients' decision-making process. It's frequently an afterthought. If it were part of the organic decision-making process, geopolitical risk analysis would be used before a decision is made, not to rubber-stamp existing plans. To put this in simple terms: if you want to buy a laptop, you first read a variety of online reviews from respected tech sites, then go to the store to check out the top contenders, and then comparison shop online to find the best price or buy the best laptop at the store. The way geopolitical risk analysis is used now is akin to first ordering the laptop you think is best online, then reading just enough online reviews, of only that particular laptop, to convince yourself you made the right choice. Seems a bit backwards, right?
Caution-driven Bias
This is one type of bias that's very difficult to counteract, because in this industry there's a legal consideration at play. Private intel firms never want to put a client in harm's way or expose them to risky situations, so analysis will always err on the side of caution. This frequently makes identifying and mitigating risks a little easier. But then again, it doesn't take much expertise to warn someone not to go wandering around at night. It also certainly gets in the way of identifying and exploiting opportunities, all of which involve some degree of risk. It's easy and logical to tell someone to stay in their well-guarded, five-star hotel. But if they need to go out to a building site, or visit a well-trafficked area, it takes a greater degree of security expertise to determine what is and isn't safe, and what specific precautions may be necessary. Analyst X, fresh out of undergrad, likely doesn't have that expertise, and so will just give a generic set of common-sense recommendations. Is that worth the money being charged for them? And if the people writing the recommendations aren't trained to make good ones, why should those recommendations be trusted?
Template-driven Bias
Finally, we come to template-driven bias. Most products created by private intel and geopolitical risk firms are based on templates that follow a standardized outline. This creates consistency across products and ensures that analysts include a comprehensive overview of the subject they're analyzing. Mostly, though, templates allow products to scale: swap out a couple of details, and the product can be resold. The motivation here is profit, not intellectual rigor. In theory, this is great. Every product in a series will look the same and provide the same information, allowing for comparison and better decision-making. In reality, templates beat original thought out of analysts, forcing them all to adhere to the same dry, BLUF-driven short paragraphs, where only the names are different. If one's eyes glaze over filling one out, you can only imagine what a thrill it must be to read a dozen of them. But the real problem with templates is that they limit analyst initiative. They don't inspire further inquiry, and if something isn't asked for on the template, even if it may be relevant, an analyst is unlikely to take the time to hunt it down. Fill-in-the-blank analysis doesn't benefit anyone, least of all the analyst. Standardized processes and products can be beneficial; standardized thinking never is.
The bias and groupthink prevalent in private intel and geopolitical risk firms present a big problem that is not likely to be solved in the near term if these companies proceed apace. It's not a theoretical problem. It's a real problem that's having real consequences for the direction the industry is taking. Check back next Tuesday for a discussion of methodologies, and my ideas for how they can be improved to boost the value that private intel/geopolitical risk firms can provide.