Inês Salpico, Soramäki Networks, firstname.lastname@example.org
Kimmo Soramäki, Soramäki Networks, email@example.com
Network science has so far been regarded mostly as a tool for statistical analysis. Increasingly, however, it is also understood as a methodological and conceptual approach to data processing, analysis and interpretation. The social networking phenomenon has had interesting and dramatic implications for network science, chiefly because it made network representations widespread and better understood by general audiences. It also pushed the field's boundaries while opening the door to clear connections between the different disciplines that study and relate to it. At the same time, a new understanding of data has emerged: data is now seen as a mutating, ever-updating and interconnected entity. As a consequence of this change of paradigm, institutions are increasingly pushed not only to make data available but also to render it in a way that is both clear and "browsable".

In parallel, the latest financial crisis made it clear that financial institutions must be regarded as nodes in a complex system in which interdependency and mutual influence are essential to its analysis. It illustrated the role of financial linkages as a channel for the propagation of shocks, and it forged the concept that institutions may be "too interconnected to fail", in addition to the traditional notion of being "too big to fail". Analyzing and understanding financial and economic phenomena and systemic risk can be made more intuitive by representing sets of financial data not in plain tables but in appropriate visual layouts that show the links between the system's agents. A visual representation that allows exploring, analysing and communicating vast amounts of data can help regulators (and the general public) form a better understanding of the financial system and its interlinkages.
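As a minimal sketch of the idea of treating institutions as nodes and financial linkages as edges, the toy example below builds a small, entirely hypothetical interbank exposure network and uses PageRank as one simple proxy for interconnectedness (the bank names, exposure values, and the choice of PageRank are illustrative assumptions, not data or methods from the paper or from FNA):

```python
import networkx as nx

# Hypothetical interbank exposures: (lender, borrower, exposure in millions).
# All names and figures are invented for illustration only.
exposures = [
    ("BankA", "BankB", 120.0),
    ("BankA", "BankC", 45.0),
    ("BankB", "BankC", 80.0),
    ("BankC", "BankD", 30.0),
    ("BankD", "BankA", 60.0),
    ("BankE", "BankC", 15.0),
]

# Directed graph: an edge A -> B means A is exposed to B.
G = nx.DiGraph()
G.add_weighted_edges_from(exposures)

# Weighted PageRank as a crude interconnectedness score: an institution
# scores highly when heavily-exposed-to institutions are exposed to it.
rank = nx.pagerank(G, weight="weight")
most_central = max(rank, key=rank.get)
print(sorted(rank.items(), key=lambda kv: -kv[1]))
print("Most interconnected:", most_central)
```

Such a score is only one of many network measures; the point is that the ranking emerges from the pattern of linkages, which a plain table of bilateral exposures would not reveal at a glance.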
We argue that, given the rare and specific nature of financial crises, analyzing financial risks cannot be done using an "artificial intelligence" paradigm, in which algorithms identify repetitive events, but rather calls for "intelligence amplification", allowing continuous monitoring of the state of the system. This monitoring is likely to be most effective using a visual approach, i.e. by making outstanding, unpredictable events visible and perceptible. When data is particularly granular and complex, making sense of it relies mostly on the ability to look at it in a perceptible, cross-referenced manner, with a view of its evolution over a given time span.

The paper will have both a theoretical and a practical perspective, resulting from the background research and hands-on work undertaken while developing the data and network analysis software FNA (www.financialnetworkanalysis.com) and implementing its use in different financial institutions. The core of the software, its analytical engine, is open source and is built using different open source libraries (namely for visualization), incorporating a number of analysis algorithms that result from academic research.
Financial networks present specificities, different from those of social and technological networks, that require new tools and models to make sense of them. This paper (along with the FNA software) aims at bridging the gap between research into complex networks and the operational supervisory and regulatory analysis of financial data. Furthermore, a critical view is taken of whether and how the introduction of this new paradigm can affect policy, decision-making processes and institutional discourses.

In the aftermath of the crisis, new regulatory authorities and powers have been set up. These include the European System of Financial Supervisors and the Office of Financial Research in the USA, amongst others. The creation of these new agents, with new and extended monitoring mandates, means that more granular data will be collected and will ultimately need to be made sense of. One of the current development priorities is to allow the application, namely its online version, to receive live feeds from different data sources that can then be browsed and cross-analyzed. Our work has confronted us with the scarcity of relevant data that is made public; we would like at least to tap into the available sources while drawing attention to a paradox: in the age of Big Data, the most meaningful data sets, particularly on the financial and economic systems, are still kept within institutions.
Proposed length of paper: 6 pages + images.