The Intellipedia experiment, or rather, shared secrets
Gianluigi CESTA
From Washington, on the evening of 31st October last, an explosive news item from the Reuters agency threw the international Intelligence community into confusion: John Dimitri Negroponte, then Director of National Intelligence (known in the media as the intelligence "Czar"), had just announced to the world the existence of "Intellipedia", a new link-up and information-sharing system among the sixteen United States Intelligence bodies which supervise the area of information and security: an innovation destined to leave its mark. It emerged after the Twin Towers tragedy that the American Government had been in possession of a substantial quantity of information about the preparation of the attack, but it was so fragmented across the files of the various agencies that no-one had an overall picture of the situation. Something similar happened over the preparations for the invasion of Iraq: an efficient information-exchange system would, in all probability, have avoided the blunder over Saddam Hussein's supposed weapons of mass destruction. Therefore, at the beginning of 2006, the necessity of providing a valid instrument for the management and coordination of information led the CIA to prepare a specific tool, inspired by the free encyclopaedia Wikipedia, simple but extraordinarily effective: "Intellipedia". This new system marks the beginning of a new phase: the passage from the formation of Intelligence to the communication of Intelligence.

The official news is only a few days old: the CIA, the best-known American Intelligence agency, has developed its own on-line encyclopaedia. The aim is to ensure that all the agencies concerned with intelligence can consult and share information in the most rapid way. The experiment is called Intellipedia and, as the name itself implies, is a "secret" version of its celebrated big sister, Wikipedia, the famous on-line encyclopaedia launched in 2001. Obviously, access is restricted to users from the intelligence agencies, people cleared for classified information; access is therefore not possible for civilians outside intelligence. The idea was born within the Office of the Director of National Intelligence, headed by the "Czar", John Negroponte, when the experts posed themselves the problem of how to develop information exchange over a private network. Last April, as a result of collaboration between the American agencies, Intellipedia came into being, but its existence has been revealed only in these last few days; the news was given by Negroponte himself at a press conference held at Bolling Air Force Base in Washington. The necessity of a system of information exchange arose after the attack of 11th September 2001, when it became known that both the CIA and the FBI had been in possession of sensitive information regarding the imminent attack but, owing to inadequate sharing and communication of that information, it had been impossible to foresee it. The same applies to the preparation of the attack on Iraq: had information been handled adequately, the blunder concerning Saddam Hussein's supposed weapons of mass destruction would have been avoided. Negroponte himself states that certain errors of judgement could also have been avoided if the information possessed by the various intelligence agencies had been better handled and distributed.
Such a system of information-sharing could revolutionize intelligence standards to the point that, to produce a National Intelligence Estimate (NIE), the document which sets out how a particular situation stands, it could be sufficient simply to consult this immense data bank and print a report. Obviously, a system which envisages placing such a large quantity of classified information on a network, even a private one, raises perplexity and doubt about the possibility of information leakage. But the originators accept the risk, maintaining that the benefits far outweigh the negative aspects. In any event, at the moment, access is strictly reserved and limited.

How it works from a technical point of view

The database of the encyclopaedia is located on servers which are not physically connected to the "normal" internet, but to a private and protected network called JWICS. The Joint Worldwide Intelligence Communications System (JWICS) is a system of interconnected computers used by the United States Defence Department and the Department of State to transfer classified information (including material at TOP SECRET and SCI level) in data packets routed via the TCP/IP protocol within a protected environment. The system also furnishes other services, such as e-mail and hypertext links between documents. To draw a parallel, it could be defined, together with the SIPRNet network, as the secret internet of the Defence Department. The JWICS, which has replaced the old Defence Data Network circuits DSNET2 and DSNET3, is based on ARPANET technology. It supplies users of the DOD Intelligence Information System (DODIIS) at SCI level with a high-speed multimedia network, using high-level communication to allow the easy exchange of data, text, pictures and graphics. The system uses the Joint Deployable Intelligence Support System (JDISS) as the primary means of interface with the operator. Like the All Source Analysis System, the JWICS is an evolved system. In the future, Intellipedia will also be extended to the SIPRNet network, larger than JWICS and managed by the Defence Information Systems Agency: a network accessible to a greater number of users, which will allow the necessary critical mass to be reached (discussed further below). The DIA (Defence Intelligence Agency) has established that all Special Security Offices (SSOs) install the JWICS.

The "secret" encyclopaedia is built with wiki software (from the Hawaiian word for "quick"). The strong point of this software is that it is much easier to update than a normal web page: no knowledge of HTML is required, nor any authoring software such as Microsoft FrontPage or Adobe products. With those tools a product is created and, once finalized, loaded onto the server; with wiki software it is literally sufficient to press a key on the web page to make the desired modifications, and pasting a URL into the text is enough to create a hyperlink. Intellipedia allows analysts to create a subject and then add, within a "collaboration space", their own knowledge on each document. Analysts working on a certain case can see whether someone else is working on the same case and whether (hopefully) they hold other or different information, or they can simply add the information in their possession.
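To make this editing principle concrete, here is a minimal sketch in Python: pages live in one central store, an "edit" is simply new text replacing the old, and every previous version is retained. The names used (WikiStore, edit, read, history) are illustrative assumptions, not Intellipedia's actual software.

# Minimal sketch of the wiki principle: one central store, edit-in-place,
# full version history. Illustrative only; not Intellipedia's real code.
class WikiStore:
    def __init__(self):
        self.pages = {}  # title -> list of versions, oldest first

    def edit(self, title, new_text):
        # Create the page if absent, otherwise append a new version.
        self.pages.setdefault(title, []).append(new_text)

    def read(self, title):
        # Return the current (latest) version of a page, if any.
        versions = self.pages.get(title)
        return versions[-1] if versions else None

    def history(self, title):
        # Return every stored version, oldest first.
        return list(self.pages.get(title, []))

store = WikiStore()
store.edit("Sample topic", "First analyst's notes.")
store.edit("Sample topic", "Notes corrected and integrated by a colleague.")
print(store.read("Sample topic"))           # the latest text is what readers see
print(len(store.history("Sample topic")))   # both versions are kept

Note that nothing here requires HTML or authoring tools on the user's side: saving new text is the whole editing operation, which is precisely the property the article attributes to wiki software.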
According to Mark Roseman, founder of CourseForum Technologies of Guelph, Ontario (which supplies commercial wiki software), this software works very well in situations where people are looking for a way to work together in a manner that satisfies everyone. Certainly, on-line collaboration is not a new invention, but what characterizes this software is that the user needs neither special programs nor any kind of specific training. In 2002, the research firm IDC of Framingham, Massachusetts, published a study which found that e-mail was the most widely used collaboration tool: easy to use, but with the problem that all the data stays inside each user's individual account. Like e-mail, wiki software needs no specific programs, only a web browser (Internet Explorer, Opera, Netscape, Firefox, etc.), and the files are stored in a central place from which anyone can retrieve them at any moment. A wiki page looks like any other, except that in one corner there is a button labelled "edit": clicking it opens the page in textual format, where all desired modifications can be made; the page is then saved and the work goes live on the web. During a conference held in April 2006, Dr. Calvin Andrus, chief technology officer of the CIA's Center for Mission Innovation, stated that the CIA had started to use wiki software internally and that this use had added some 12,000 pages to the "top secret" network.

The time dedicated by the analysts to filling these pages is what creates Intellipedia, an important store of information which is then shared with the other agencies which, in turn, update its contents. This software does not change the nature of collaboration, but it gives it different dynamics. At first sight, managers see wiki software as chaotic: an extra workload. The normal tendency is to perfect contents before publication; with time, however, one becomes accustomed to the idea that it is more efficient to publish a piece of information first and then refine it several times. In practice, the text is published first and modified afterwards, no longer the contrary: a user publishes the text and many hands then intervene to rectify and integrate it. But, unlike what happens on Wikipedia, where updates can be sent anonymously, the modifications sent by Intellipedia users are tagged with their personal identification, so that sources can be checked and possible errors traced (a minimal sketch of this attribution mechanism is given at the end of this passage). Dr. Andrus puts it this way: "by using this approach, we will come to a critical mass that will permit a permanent and total change in the way of carrying out intelligence".

A world change

In fact, for many, this new system of managing information heralds a real innovation. Dr. Andrus is one of those who believe in this genetic mutation of intelligence, which will be able to draw on a quantity of information that can be introduced and analyzed from any part of the world. It is an approach that contrasts with the current practice of the CIA, which habitually absorbs an enormous quantity of information, writes drafts, meticulously eliminates errors and only then produces the report. Another example is given by the case of the light aeroplane which crashed into a Manhattan skyscraper: within a few hours the news appeared on the network and was already receiving numerous contributions.
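As anticipated above, the attribution mechanism can be sketched in a few lines: every revision carries the identity of its author and a timestamp, so the chain of modifications can be audited. The field and function names are assumptions made for illustration only.

# Sketch of attributed editing: unlike anonymous Wikipedia updates, every
# revision here is tagged with its author, so sources can be checked and
# errors traced back. Illustrative names only.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Revision:
    author: str          # identified user, never anonymous
    text: str
    timestamp: datetime

def add_revision(history, author, text):
    # Append an attributed revision to a page's history.
    history.append(Revision(author, text, datetime.now(timezone.utc)))

def audit_trail(history):
    # Who changed the page, and when.
    return [(r.author, r.timestamp.isoformat()) for r in history]

page = []
add_revision(page, "analyst_a", "Initial draft of the entry.")
add_revision(page, "analyst_b", "Entry integrated with field data.")
for author, when in audit_trail(page):
    print(author, when)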
This new frontier of intelligence, even if on the surface it appears of little significance, since it entails neither great legislative reforms nor the institution of new agencies and organizations, could in fact bring about a true revolution in this sector. For a long time now, particularly since the 11th September attack, many analysts have underlined the necessity of passing from an intelligence of information to an intelligence of communication: structures which hold a real dialogue with each other and no longer limit themselves, at best, to informing "colleagues" of other agencies and countries. What makes the difference between "informing" and "communicating" is the feedback which forms the response relation: in mere information the response is uncontrolled, while in communication it is essential to the communication itself. The limit shown over these years, by both American and British Intelligence, has been to discover all at once that an enormous accumulation of information is useless unless it is analyzed and elaborated. Far better, then, to have a third of the information but to share it, so that a communicative relation between the "producers" of intelligence can be achieved. This communicative relation produces the feedback which brings out interconnected relations between the various pieces of information, relations which go beyond the information itself and give a quid pluris to the simple "notions" which are stored. This is the moment which marks the passage from an intelligence of information to an intelligence of communication: the change which became necessary after the 11th September attack, and in whose direction Intellipedia is moving decisively.

This new system of information-sharing permits the filing of all classified information at a central point reachable by intranet, thereby making it available to the single agencies. An extremely ample database is thus constituted, composed of contributions from every office and analyst. The single user can then consult all the information on a particular subject in the database, as well as contents which are not strictly "informative", such as notes on meetings and information of internal interest. Each element introduced will subsequently be reworked by anyone who knows more, or knows differently, with respect to the preceding user. In practice, the work of thousands of individuals contributes to determining the general performance of the entire system, a performance superior to that of its parts which, above all, directs and defines itself with no need of hierarchical control from above. The posted information is modified with the passage of time, and from this work of repeated interventions the most important information should easily emerge, as should the "stale" items. For this to happen, the feedback is essential: it assigns an estimated value to each single piece of information, based on those who have consulted and evaluated it (one possible way of computing such a value is sketched at the end of this passage). But in practice, how does one proceed? The wiki software (already discussed) and the blogs are the instruments with which the single user intervenes, interfacing with this enormous data store formed, as mentioned previously, from the archives of the single agencies. The users, from the most expert analysts to the more junior ones, use these instruments for different final objectives.
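As promised above, here is one possible way, and only one among many, in which such feedback could assign an estimated value to a piece of information: each consultation or evaluation adds to the score, while age discounts it, so unrefreshed "stale" items sink. The exponential decay and the half-life figure are assumptions of this sketch, not documented features of Intellipedia.

# Toy model of feedback-based ranking: recent evaluations keep an item
# visible, old ones fade. The decay model is an assumption of this sketch.
import math

def estimated_value(feedback_events, half_life_days=30.0):
    # Each event is (age_in_days, score); older scores count for less.
    value = 0.0
    for age_days, score in feedback_events:
        value += score * math.exp(-math.log(2) * age_days / half_life_days)
    return value

fresh = [(1, 1.0), (2, 1.0), (3, 1.0)]   # consulted and rated this week
stale = [(90, 1.0), (120, 1.0)]          # old, unrefreshed feedback
print(round(estimated_value(fresh), 2))  # about 2.87: the item emerges
print(round(estimated_value(stale), 2))  # about 0.19: the item fades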
The blogs (from "web log") permit the free treatment of a subject: the single user decides how and what to deal with, according to his preferences and knowledge. Subsequently, other users who consult this (and other) blogs intervene, creating the feedback which allows the identification of the information or subjects of greatest interest, and an understanding of how knowledge on a particular subject has been oriented and received within the operative community. The wiki software, by contrast, used to develop this "encyclopaedia for authorized persons", allows a page to be edited and then re-edited by subsequent users for updating, rectification and specification. The wiki software and the blogs are, therefore, the two instruments with which the operator extracts useful information from the database, separating it from the less useful or useless material which threatens to generate confusion and hide the material of interest. The blogs are more flexible and personal; the wiki software is more corporate and institutional. As Eric Haseltine, Associate Director of National Intelligence for Science and Technology, states: "We are using wikis, we are using blogs, we are using chat, we are using instant messaging".

The feedback permits the response relation which is at the base of this system: it guarantees communication between the various intelligence operators, and not the mere forwarding of information without concern for whether the recipient reads it or not. But such a system, in order to function adequately, must have a high number of users, blogs, edited pages and feedback. Starting from Robert Metcalfe's law, which establishes that the value of a communication system grows approximately with the square of the number of connections in the system itself, one can derive the corollary that the value of the knowledge shared on a web space grows approximately with the square of the number of links created on the web space itself (the relation is written out at the end of this section). There is, however, a threshold beyond which the intelligence system itself is transformed: just as a synaptic network does not generate intelligence until the number of synapses is high enough to permit the creation of sufficient connections for that purpose, so the CIA scientists who have studied the project, and Dr. Andrus in particular, maintain that once a critical level of users is passed there will be a substantial innovation in the manner in which intelligence is carried out. It will not be a structural change, which in itself is slow, complex and ill-suited to the necessities of these times, but a change in the very nature of carrying out intelligence. This new operative methodology can lead the intelligence apparatus to adapt its reactions in extremely brief time periods on the basis of external inputs. Thanks to such "shared knowledge", new stimuli will enter a "mechanism" which permits speedy and collaborative elaboration on the part of the entire community: a kind of integrated virtual space of collaboration.

The theoretical foundation

The system is, in fact, based on the Theory of Complexity. This theory can explain how the software technology under discussion, used on an ample scale, i.e. by a large number of users, could lead to the creation of a new manner of intelligence: an intelligence that renews itself and automatically adapts to surrounding changes.
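In symbols, the Metcalfe relation recalled above and its corollary can be written compactly; here n is the number of users (nodes) of the system, L the number of links created on the shared space, and the critical-mass threshold n* is a label introduced purely for illustration, not a figure given by the project:

\[
V_{\text{network}}(n) \;\propto\; \binom{n}{2} \;=\; \frac{n(n-1)}{2} \;\approx\; \frac{n^{2}}{2},
\qquad
V_{\text{knowledge}}(L) \;\propto\; L^{2},
\]

with the qualitative change in the way of carrying out intelligence expected, on Dr. Andrus's argument, only once n exceeds the critical mass n*.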
In fact, as this theory teaches, a complex phenomenon as a whole is superior to the sum of the single phenomena of which it is composed, and its development can take directions which cannot be foreseen from the behaviour of its single components. Let us take the market as a well-known example. Many users, consumers and producers operate within the market to satisfy their own individual interests. The law of supply and demand follows as a "natural" consequence to regulate the sector: there are no rules from "above" which impose a given offer or demand (beyond the regulation of the market for its protection against fraud, cartels, etc.). The users are all on the same level, without a hierarchy, but through their continual interaction they generate a collective behaviour which is above and beyond the single user: like the "invisible hand" (as the economist Adam Smith defined it in "The Wealth of Nations") that intervenes in "dictating" rules established not by a superior authority but from "below", by the self-organized community. These are superior regulating forces which work unseen: a large demand for goods in one part of the globe, Y, determines a larger production of the same goods in a distant place, Z, sometimes very far away, without the knowledge of the single operators. In simpler terms, the "system" which is created has a logic of its own, independent of its single components, and for this reason it is considered a phenomenon as a whole.

The Theory of Complexity therefore rests on certain elements:
- the self-organization of individuals and the emergence of collective behaviour: the relations created between single individuals permit their decisional orientation within the complex system;
- the feedback that every individual receives concerning changes in the information that comes back to him, and which contributes to the decisional process of the single individual;
- the adaptation of the system to the behaviour of the single individuals who, in turn, are influenced by new entries into the system and by the decisions of the other individuals belonging to it;
- the non-linearity of the system, whereby small changes in the initial input can correspond to much vaster and unforeseeable changes in the system.

Therefore, on the basis of the Theory of Complexity, the adaptation of the system requires, beyond the feedback received by the Intelligence Community and the sharing of information (the basic assumption), that officials be free to act and react independently, within the virtual community, to the inputs which come from outside, and that they be, for this reason, experts in tradecraft. They must be granted greater participation in the strategies to be adopted, with the objective of enabling them to adapt their "contribution" to the virtual community in the shortest possible time; they must not, therefore, be subjected to "torrents" of strategies decided from above.

Conclusion

This initiative of the United States Intelligence might appear a banal invention: one of the many instruments which, from time to time, the scientists supporting Intelligence devise for the agencies for which they work. Nevertheless, in this case, the development of the project deserves to be followed with close attention. It could, in fact, represent the keystone of the genetic change of intelligence: of the way of carrying out intelligence.
In a century already dubbed "the century of insecurity", a change in this sector seems indispensable: recent history demonstrates the necessity. Although the idea of sharing all information may make "old school" experts turn up their noses, owing to the risk of information leaks, it is an obligatory step if modern needs are to be faced; and the hypothetical risk is inferior to the benefits to be had once the system is operating at full steam. This new American system could be a pilot project to be taken up in Europe: first within the single nations, hopefully creating a system which makes information available to the various police and intelligence corps of the various States; eventually, a single European project could be contemplated which, by integrating a database to which all the intelligence agencies of the Old Continent would contribute, would produce a tremendous growth in the capacity to protect the single Countries; finally, one could hope for a world system. Perhaps the project is as ambitious as it is Utopian. Nevertheless, such a prospect could represent the new intelligence: the intelligence of the 21st Century.

BIBLIOGRAPHY

- Andrew C. - Dilks D., "The Missing Dimension: Governments and Intelligence Communities in the Twentieth Century", University of Illinois Press, Illinois 1984.
- Andrus Calvin, "The Wiki and the Blog: Toward a Complex Adaptive Intelligence Community", Studies in Intelligence, Volume 49, number 3, September 2005.
- Andronico Alberto, "La decostruzione come metodo: Riflessi di Derrida nella teoria del diritto", Giuffrè, Milano 2002.
- Various Authors, "L'Intelligence del XXI secolo", Quaderni del Centro Gino Germani, Roma 2001.
- Bauman Zygmunt, "La modernità liquida", Laterza, Bari 2000.
- Benedetti Amedeo, "L'osservazione per l'Intelligence e l'indagine", Erga Edizioni, Genova 2003.
- Benjafield John G. (1992), "Cognition", Prentice Hall, Inc.; Italian translation "Psicologia dei Processi Cognitivi", Il Mulino, Bologna 1995.
- Caligiuri Mario, Introduction to Steele R. D., "Intelligence", Rubbettino, Catanzaro 2002.
- Ceci Alessandro, "Imitation of Life", in Rapporto EURISPES 2003, Eurispes, Roma 2003.
- Ceci Alessandro, "Intell-Action", lecture notes of the Intelligence traineeship at the University of L'Aquila.
- Ceci Alessandro, "Intelligence e Democrazia", Rubbettino, Soveria Mannelli 2006.
- Chiesi Antonio, "L'analisi dei reticoli", Franco Angeli, Milano 1999.
- Lévy Pierre, "L'intelligenza collettiva" (Collective Intelligence), Feltrinelli, Milano 1996.
- Martinotti Guido, "Informazione e sapere", Anabasi, Milano 1992.
- Metcalfe Robert, "There Oughta Be a Law", The New York Times, section D, page 7, column 1, late edition, Monday 15 July 1996.
- Nicolis G. - Prigogine I., "La complessità: Esplorazioni nei nuovi campi della scienza", Einaudi, Torino 1991.
- Smith Adam, "The Wealth of Nations", UTET, Torino 1975.
- Steele R. D., "Intelligence", Rubbettino, Catanzaro 2002.
- Watzlawick P., Beavin J. H., Jackson D. D., "Pragmatics of Human Communication" (Italian translation "Pragmatica della Comunicazione Umana", Astrolabio, Roma 1971).