
THE ROLE OF OPEN SOURCE INTELLIGENCE

Interview with Axel Dyvre


Axel Dyvre: Director in charge of Open Source Intelligence Solutions at CEIS. Previously he was a Senior Partner in a software publishing firm, where he spent 10 years defining, managing and overseeing Open Source Intelligence projects in public and private organisations, in both national and international contexts. He has worked in complex environments combining tools, methodologies and expertise. Prior to that, Axel Dyvre served for seven years as an officer in reconnaissance units of the French Army. He holds a business degree from the École Supérieure de Commerce de Paris (France).

What's the relevance of OSINT in crisis or conflict prevention?

In today's information society, OSINT (open source intelligence) is set to play a pivotal role for those working in crisis response, whether they are working in the field, in crisis prevention, in conflict evaluation or in event monitoring. All too often, information in the public domain is neglected by decision makers or opinion leaders in favour of information from traditional intelligence sources such as human intelligence (HUMINT) and signals intelligence (SIGINT).

Could you explain the difference between traditional and open source intelligence?

Intelligence agencies are primarily focused on obtaining information that is not available to ordinary citizens, for example through networks of agents, by monitoring electronic communications, or by examining satellite photography. Certainly, the vast majority of these agencies' funding is dedicated to these forms of intelligence. On the other hand, the amount of publicly available information is growing daily, with huge numbers of formal, and an even greater number of informal, contributors posting information on the web. This information is often just as useful for taking mission-critical decisions. The fact that it is not classified does not mean that it is easy to find or easy to interpret. The challenge for intelligence or risk analysts is to identify all the actionable information and analyse it.

This is paradoxical. Intelligence services, and particularly the CIA, have carried out many studies in recent years on the relationship between intelligence and OSINT. The figures varied, but all of the studies shared similar conclusions: between 35% and 95% of the information used by US Government intelligence, after processing, came from open sources. That said, the amounts allocated to OSINT by US agencies did not exceed 1% of their budgets. The economic argument in favour of OSINT therefore cannot be refuted: 1% of expenditure contributes to at least 35% of the results (1)! It is not surprising, then, that intelligence services all over the world are becoming increasingly interested in OSINT. As part of the post-9/11 reform of the intelligence community, the CIA announced in November 2005 the creation of a special structure to deal with open sources, the Open Source Center. In December 2005, John Negroponte, the US Director of National Intelligence, appointed Eliot A. Jardines as Assistant Deputy Director of National Intelligence for Open Source. His mission is to set up a policy framework for the use of OSINT within the American intelligence community.

How do you explain the uncommon performance of OSINT?

Well, it's very simple. As Stephen Mercado suggested in an important article published in the CIA's Studies in Intelligence journal, "There are far more bloggers, journalists, pundits, television reporters, and think-tankers in the world than there are case officers" (2). All these actors can cover the ground in much more detail than any security agency. With nearly one billion Internet users, tens of thousands of news sites and over 43 million blogs, that is a potential mass of several hundred thousand items of information published every day. Some of them, filtered with the right tools and the right competence, could turn out to be key elements for taking decisions in conflict prevention or crisis response, whatever the crisis. It is also necessary to overcome the confusion that exists between the interest and reliability of an item of information and its degree of secrecy. For example, an extremely secret source can provide erroneous information, while an analysis in a newspaper or on a website may prove correct! Of course, it is human nature to confuse the messenger and the message, but when it comes to intelligence or mission-critical information, the value of a source and the value of an item of information have to be evaluated in parallel.
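To make that idea of parallel evaluation concrete, here is a minimal sketch in Python. The field names, rating scales and thresholds are invented for illustration; this is the general principle of rating source and item separately, not any particular agency's or vendor's scheme.

```python
# Illustrative only: scales, fields and thresholds are invented for this sketch.
from dataclasses import dataclass

# Source reliability and item credibility sit on separate axes, so a weak
# source can still carry a corroborated item, and vice versa.
SOURCE_SCALE = {"A": 5, "B": 4, "C": 3, "D": 2, "E": 1, "F": 0}   # A = proven track record, F = unknown
CREDIBILITY_SCALE = {1: 5, 2: 4, 3: 3, 4: 2, 5: 1, 6: 0}          # 1 = confirmed elsewhere, 6 = cannot be judged

@dataclass
class Report:
    summary: str
    source_reliability: str   # rated from the source's history
    info_credibility: int     # rated from corroboration of this specific item

def triage(reports, min_credibility=3):
    """Keep items whose content is corroborated, even from obscure sources,
    and flag uncorroborated items that come only from 'trusted' sources."""
    keep, review = [], []
    for r in reports:
        if CREDIBILITY_SCALE[r.info_credibility] >= min_credibility:
            keep.append(r)
        elif SOURCE_SCALE[r.source_reliability] >= 4:
            review.append(r)   # strong source, unconfirmed content: verify before use
    return keep, review

if __name__ == "__main__":
    sample = [
        Report("Road to the airport reported blocked", "E", 2),  # obscure blog, corroborated
        Report("Ceasefire said to be imminent", "B", 5),         # established outlet, uncorroborated
    ]
    confirmed, to_verify = triage(sample)
    print(len(confirmed), "corroborated items,", len(to_verify), "to verify")
```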



How could OSINT be used in conflict prevention or in the course of a crisis response operation?

A very important advantage of OSINT is that the information is public (mind you, "public" means that it is not classified; it does not mean that it is easy to find). In the context of European and multilateral operations involving governmental players (including military and civilian intelligence), humanitarian organisations, and international and private institutions, the use of OSINT is particularly useful. Being unclassified at the origin, it can be released without problem to those in the field and at headquarters. Every one of the crisis responders has a legitimate "need to know" for that information, and by passing it on nobody breaks any confidentiality rule. Think of information concerning physical security threats: imagine being able to gather all the (public) threat reports coming from all the actors on the ground in Kabul, and then sharing that information in real time.

So this means the end of secrecy then?

Not really. As I said before, raw OSINT information is naturally unclassified. But in the open source intelligence process, further interpretation and analysis of that raw information can lead to sensitive conclusions, which may deserve a classification depending on the impact their disclosure could have. With OSINT it is the analysis that creates the value, not only the source.

But if I understand you correctly, everyone could become an OSINT analyst.

There is an element of truth in what you say, but only a small one. There are many misperceptions concerning OSINT. Being good at using Google is not enough. If you also have a subscription to a database service, that is better, but it is still not enough. OSINT is, first of all, a process that takes a lot of time, and if a non-specialist with outdated resources tries to use OSINT it takes even longer. Identifying useful open sources (the art of sourcing), qualifying them, and categorising and structuring the signals they carry is a skill that can be learnt, but it cannot be improvised. To excel in OSINT you need to adopt a robust, scientific methodology. You also need to avoid working with just whatever means happen to be at hand: you need the right human resources and the right technology.
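Purely as an illustration of what "sourcing, qualifying and categorising" can look like in practice, the sketch below keeps a qualified source catalogue as structured records that can be filtered by topic and flagged for periodic re-qualification. The fields, review period and example entries are invented, not taken from any specific OSINT tool.

```python
# Hypothetical source catalogue; fields, example entries and review period are invented.
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Source:
    name: str
    kind: str               # e.g. "news site", "blog", "database", "NGO feed"
    language: str
    topics: list            # subjects the source has been qualified for
    last_qualified: date    # when an analyst last reviewed the source

def by_topic(catalogue, topic, language=None):
    """Pull the subset of qualified sources relevant to one question."""
    hits = [s for s in catalogue if topic in s.topics]
    return [s for s in hits if language is None or s.language == language]

def due_for_review(catalogue, as_of, max_age_days=180):
    """Sourcing is not a one-off step: flag entries whose qualification is stale."""
    cutoff = as_of - timedelta(days=max_age_days)
    return [s for s in catalogue if s.last_qualified < cutoff]

catalogue = [
    Source("Dawn", "news site", "en", ["Pakistan", "reconstruction"], date(2005, 10, 1)),
    Source("ReliefWeb", "database", "en", ["humanitarian", "reconstruction"], date(2005, 1, 15)),
]
print([s.name for s in by_topic(catalogue, "reconstruction")])
print([s.name for s in due_for_review(catalogue, as_of=date(2005, 11, 19))])
```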

What is the most important skill for an OSINT analyst?

The capacity to detect the nature, legitimate or illegitimate, of the influence strategies carried out by the various sources. New information and communication technologies have changed the parameters of our information environment significantly. The first change is the speed at which information (or disinformation, or propaganda) spreads. The second major change concerns the new modes of access. We are now, in theory, in a situation where we can be permanently connected (or permanently able to communicate). All the information published on the planet is, theoretically, within everyone's reach. So you have to stay tuned in all the time, or you risk cutting yourself off from the epicentres of information. Finally, in the global business of crisis response, analysts have to master the Internet. The importance of the Internet as a vehicle for spreading information is undeniable, both because of its audience and because journalists use the Internet on a large scale as a source of information. So the traditional media, the very media on which listeners and readers rely for selected and validated information, basically use the same information and validation channels as their audience. This closes a loop and dramatically increases the scope for influence, because transmitters and receivers are tending to become more and more intertwined.

I presume OSINT analysts have to be good at finding things, first and foremost.

Absolutely. The volumes of information involved are enormous. You have the web; you also have printed and audiovisual reporting, which is distributed in a different way; then you have a plethora of databases accessible via the Internet that distribute and archive specialised information from all over the world. For example, LexisNexis provides access to approximately 30 000 different sources that cannot be found on the open Internet. Then, increasingly, you have web logs, or blogs, which are online publications mostly produced by individuals. It is clear that the credibility and quality of an article published in Le Monde, Le Figaro or the Washington Post has nothing in common with a message posted on a blog. But we should not lose sight of the fact that the Lewinsky affair was stirred up by a blog, and that the obscurity of the source did not prevent the scandal from assuming global proportions!

What would be the added value of OSINT for crisis responders, and particularly for those in charge of reconstruction and rehabilitation?

The context does not change. You need to have tools that facilitate the process of filtering valuable content from the noise. Then you need to be able to track exactly how events evolve; then review them critically to distinguish truth from manipulation, relevant content from the irrelevant, mission-critical information from ordinary news coverage, and details of no consequence from details that may anticipate very significant consequences.
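As a purely illustrative sketch of that first filtering step (the keyword lists, fields and sample records below are invented, not taken from Pericles or any other product), a first pass might keep only the documents that touch a tracked topic and tag each hit with its date, country of origin and a rough tone, so that later analysis works on a structured subset rather than on the raw flood:

```python
# Illustrative first-pass topic filter; keywords, fields and sample data are invented.
TOPIC_KEYWORDS = {"damage assessment", "reconstruction cost", "donor conference"}
NEGATIVE_TONE = {"differ", "dispute", "contest", "shortfall"}

def matches_topic(text, keywords=TOPIC_KEYWORDS):
    """Very crude relevance test: any tracked phrase appears in the text."""
    lowered = text.lower()
    return any(k in lowered for k in keywords)

def rough_tone(text, negative=NEGATIVE_TONE):
    """Placeholder for the much richer tone categorisation a real tool would perform."""
    lowered = text.lower()
    return "critical" if any(w in lowered for w in negative) else "neutral"

def filter_stream(documents):
    """Keep only on-topic items, tagged with date, origin and tone."""
    hits = []
    for doc in documents:
        if matches_topic(doc["text"]):
            hits.append({
                "date": doc["date"],
                "country": doc["country"],
                "tone": rough_tone(doc["text"]),
                "title": doc["title"],
            })
    return hits

sample = [
    {"date": "2005-11-09", "country": "PK", "title": "Government, donors differ on damage",
     "text": "The damage assessment team and the government differ on reconstruction cost."},
    {"date": "2005-11-09", "country": "FR", "title": "Weekend sports roundup",
     "text": "Football results from the weekend."},
]
print(filter_stream(sample))
```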

Let me explain how it works. Following the earthquake in Pakistan on 8 October 2005, a Donors' Conference was held in Islamabad on 19 November 2005. This event gave rise to a real influence strategy by the Pakistani Government to get its views across to the donor community. If you track the subject "Kashmir earthquake reconstruction" using dedicated software, for example the Pericles (3) monitoring and analysis system, you are able to tap massive flows of information in all languages and be alerted to any anomaly in the behaviour of the information. An anomaly can be defined as an unusual number of articles covering the same topic, or an unusual correlation between a person and a debate. With this type of software, the user defines the type of anomaly he or she wants to find. In the case of the earthquake in Pakistan, the interesting element for someone responsible for reconstruction is the attitude of the players involved (political leaders, international and national organisations, governments and NGOs) with regard to the specific issue of damage assessment. Evaluating the cost of reconstruction was the key issue of the Donors' Conference. Over 5 000 public news sites worldwide were active on that topic on that occasion. In the months leading up to the Donors' Conference, nearly 1 300 documents from media, governments, international institutions and NGOs discussed the upcoming event. Of all these documents, about 150 were specifically related to damage assessment.

The specific ability of the Pericles system to locate the geographical origin of this information and categorise it by tone enables swift detection of any type of anomaly. For example, in the days leading up to the conference, the subject of damage assessment suddenly took a very political turn. The Pakistani Government had engaged in a genuine (and understandable) operation to influence donors to agree to its terms. This was happening while the emotion aroused by the earthquake was subsiding worldwide. Pericles had not only detected this decline in global attention but had also spotted that, from 11 November, the issue of reconstruction, and particularly that of damage assessment, was attracting more and more coverage in Pakistan. On 9 November, Pakistan's English-language newspaper, Dawn, published an article entitled "Government, Donors differ on Damage", indicating that the Damage Assessment Team, consisting of the World Bank, the Asian Development Bank and the United Nations Development Programme (UNDP), valued the cost of reconstruction at US$2.78 billion. But the same newspaper stated a few lines later that the Pakistani Government contested that conclusion, putting the cost at three times that figure. This was taken up with varying degrees of precision by the rest of the Pakistani press, but the information was hardly covered anywhere else in the world!
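As a minimal sketch of the kind of volume anomaly described above (an unusual number of articles on the same topic), and not of how Pericles itself works, one can compare each day's article count for a tracked topic against its recent baseline and flag days that deviate sharply. The daily counts below are invented for illustration.

```python
# Illustrative volume-spike detector; the daily counts are invented, not real monitoring data.
from statistics import mean, stdev

def volume_anomalies(daily_counts, window=7, threshold=3.0):
    """Flag days whose article count sits far above the rolling baseline.

    daily_counts: list of (day_label, count) pairs in chronological order.
    A day is flagged when it exceeds the mean of the previous `window` days
    by more than `threshold` standard deviations.
    """
    flagged = []
    for i in range(window, len(daily_counts)):
        history = [c for _, c in daily_counts[i - window:i]]
        baseline, spread = mean(history), stdev(history)
        day, count = daily_counts[i]
        if spread > 0 and (count - baseline) / spread > threshold:
            flagged.append((day, count))
    return flagged

# Hypothetical daily counts of "damage assessment" coverage in one country's press.
counts = [("Nov 01", 4), ("Nov 02", 5), ("Nov 03", 3), ("Nov 04", 6),
          ("Nov 05", 4), ("Nov 06", 5), ("Nov 07", 4), ("Nov 08", 5),
          ("Nov 09", 6), ("Nov 10", 5), ("Nov 11", 19)]   # sudden surge
print(volume_anomalies(counts))
```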

Was that a deliberate choice or the result of a lack of attention to the Pakistani context?

The influence campaign by the Pakistani Government continued on 11 November, when the Reuters agency published a news story saying that the Damage Assessment Team had evaluated the damage at US$5.2 billion. According to the press agency, this figure came from an adviser to the Pakistani Government! The story was then relayed all over the world. The figure was gradually confirmed by the donors, and even became the basis of the conference of 19 November. That day the Pakistani Prime Minister, Shaukat Aziz, declared: "It's a very successful day for Pakistan. The results are better than our expectations; these are the fruits of our policies and recognition of Pakistan's role in the region." On 28 November, an article entitled "Pakistan seeks opportunity in face of tragedy" in the Dallas Morning News, picked up by a large number of American newspapers, pointed to the success of the Pakistani Government's influence campaign by insisting on the positive effects that the reconstruction aid should have on the country's economy. That article highlights how Pakistan was able to exploit its strategic position in the war against terror to win the US Government and the other donors over to its view.

1. On this subject, see Markowitz, Joseph (Summer 1997), "The Open Source Role", Horizons 1 and 2, OSINT.
2. Mercado, Stephen C. (2005), "Reexamining the Distinction Between Open Information and Secrets", Studies in Intelligence, Vol. 49, No. 2, www.cia.gov/csi/studies/Vol49no2/index.html.
3. Pericles is a Global Intelligence Platform created and developed by the French software publisher Datops. It comprises three main parts, functioning on the intelligence cycle model and adapted to an intelligence organisation: an agent server with automated capture and tagging; linguistic and semantic indexation; and automated classification with graphical visualisation and analytical tools. Pericles is able to monitor millions of websites, forums, blogs and databases, and to gather and process thousands of documents per hour.
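Note 3 describes a three-stage pipeline (capture and tagging, linguistic and semantic indexation, automated classification). Purely as an illustration of that general shape, and not of the actual Datops implementation, a toy pipeline might chain the stages like this (all names and sample data are invented):

```python
# Toy three-stage pipeline illustrating the capture -> index -> classify shape
# sketched in note 3; none of this reflects the real Pericles implementation.

def capture(feeds):
    """Stage 1: an 'agent server' pulls raw documents and tags their origin."""
    for feed in feeds:
        for text in feed["items"]:
            yield {"source": feed["name"], "text": text}

def index(doc):
    """Stage 2: crude stand-in for linguistic and semantic indexation."""
    doc["terms"] = sorted(set(doc["text"].lower().split()))
    return doc

def classify(doc, categories):
    """Stage 3: assign the document to every category whose terms it mentions."""
    doc["categories"] = [c for c, words in categories.items()
                         if any(w in doc["terms"] for w in words)]
    return doc

categories = {"damage assessment": {"damage", "assessment", "cost"},
              "donor politics": {"donors", "conference", "pledge"}}
feeds = [{"name": "example-wire", "items": ["Donors differ on damage assessment cost"]}]

for doc in capture(feeds):
    print(classify(index(doc), categories))
```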

