Reading List, Week of Jan. 10, 2010
Mon, 10 Jan 2011 | Michael Bommarito | http://etf-central.com/2011/01/10/reading-list-week-of-jan-10-2010/

Instead of posting papers separately, I've decided to transition to a weekly reading list format. I'll update this post over the course of the week, but here's the initial list:

Paper: D. Hu, J. L. Zhao, Z. Hua, M. C. S. Wong. Network-Based Modeling and Analysis of Systemic Risks in Banking Systems.
Tue, 16 Nov 2010 | Michael Bommarito | http://etf-central.com/2010/11/16/paper-d-hu-j-l-zhao-z-hua-m-c-s-wong-network-based-modeling-and-analysis-of-systemic-risks-in-banking-systems/

Tonight, I'll be clearing out a backlog of papers built up over the last few days. Here's the first (not including mine): Network-Based Modeling and Analysis of Systemic Risks in Banking Systems. The paper is interesting for two reasons. First, the authors model a banking system with an explicit network of bank-to-bank relationships and correlated bank balance sheets. Second, they simulate their model to compare how well different bank risk measures predict contagion. I'm not sure their LASER algorithm is actually any different from the Kleinberg HITS method applied to an edge-weighted network, but it does outperform the standard accounting measures in their simulations. Abstract and download below; a short sketch of weighted HITS follows the citation:

Abstract: Preventing financial crisis has become a concern of average citizens all over the world and the aspiration of academics from disciplines outside finance. In many ways, better management of financial risks can be achieved by more effective use of information in financial institutions. In this paper, we develop a network-based framework for modeling and analyzing systemic risks in banking systems by viewing the interactive relationships among banks as a financial network. Our research method integrates business intelligence (BI) and simulation techniques, leading to three main research contributions in this paper. First, by observing techniques such as the HITS algorithm used to estimate the relative importance of web pages, we discover a network-based analytical principle called the Correlative Rank-In-Network Principle (CRINP), which can guide an analytical process for estimating the relative importance of nodes in many types of networks beyond web pages. Second, based on the CRINP principle, we develop a novel risk estimation algorithm, Link-Aware Systemic Estimation of Risks (LASER), for understanding relative financial risks in a banking network and reducing systemic risk. To validate the LASER approach, we evaluate its merits by comparing it with conventional approaches such as the Capital Asset Ratio and the Loan to Asset Ratio, and by simulating the effect of capital injection guided by the LASER algorithm. The simulation results show that LASER significantly outperforms the two conventional approaches in both predicting and preventing possible contagious bank failures. Third, we develop a novel method for effectively modeling one major source of bank systemic risk, correlated financial asset portfolios, as banking network links. Another innovative aspect of our research is that the simulated systemic risk scenarios are based on real-world data from Call Reports in the U.S. In those scenarios, we observe that the U.S. banking system can sustain mild simulated economic shocks until the magnitude of the shock reaches a threshold. We suggest our framework can provide researchers with new methods and insights in developing theories about bank systemic risk. The BI algorithm, LASER, offers financial regulators and other stakeholders a set of effective tools for identifying systemic risk in the banking system and supporting decision making in systemic risk mitigation.

D. Hu, J. L. Zhao, Z. Hua, M. C. S. Wong. Network-Based Modeling and Analysis of Systemic Risks in Banking Systems. http://ssrn.com/abstract=1702467.
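To make the comparison above concrete, here is a minimal sketch of Kleinberg's HITS iteration generalized to an edge-weighted network. This is the baseline I have in mind, not the authors' LASER algorithm, and the three-bank exposure matrix is entirely hypothetical. In an interbank setting, hub and authority scores would correspond roughly to a bank's importance as a lender and as a borrower, respectively.

import numpy as np

def weighted_hits(W, n_iter=100, tol=1e-8):
    # W[i, j] is the weight of the edge from node i to node j
    # (here, a hypothetical interbank exposure).
    n = W.shape[0]
    hubs = np.ones(n) / np.sqrt(n)
    auths = np.ones(n) / np.sqrt(n)
    for _ in range(n_iter):
        new_auths = W.T @ hubs   # authority: weighted sum of incoming hub scores
        new_auths /= np.linalg.norm(new_auths)
        new_hubs = W @ new_auths # hub: weighted sum of outgoing authority scores
        new_hubs /= np.linalg.norm(new_hubs)
        converged = (np.allclose(new_hubs, hubs, atol=tol)
                     and np.allclose(new_auths, auths, atol=tol))
        hubs, auths = new_hubs, new_auths
        if converged:
            break
    return hubs, auths

# Hypothetical three-bank exposure matrix: rows lend to columns.
W = np.array([[0.0, 5.0, 1.0],
              [2.0, 0.0, 3.0],
              [0.5, 4.0, 0.0]])
hubs, auths = weighted_hits(W)
print("hub scores:      ", hubs.round(3))
print("authority scores:", auths.round(3))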

Paper Abstract: D. M. Katz, M. J. Bommarito II. Measuring the Complexity of Law: The United States Code.
Tue, 16 Nov 2010 | Michael Bommarito | http://etf-central.com/2010/11/15/paper-abstract-d-m-katz-m-j-bommarito-ii-measuring-the-complexity-of-law-the-united-states-code/

I've been offline for a few days wrapping up some contracts and academic work, but I wanted to highlight an exciting paper that Dan and I have been working on: Measuring the Complexity of Law: The United States Code. This law review article is a thorough description of our method for measuring legal complexity and is the counterpart to A Mathematical Approach to the Study of the United States Code, recently published in Physica A. Given the recent chatter about possible tax reform and simplification, Tax Code complexity may be a popular topic in the near future. The paper isn't public yet, but you can read the abstract below; a brief weighted-rank sketch follows the citation.

Abstract: The complexity of law is an issue relevant to all who study legal systems. In addressing this issue, scholars have taken approaches ranging from descriptive accounts to theoretical models. Preeminent examples of this literature include Long & Swingen (1987), McCaffery (1990), Schuck (1992), White (1992), Kaplow (1995), Epstein (1997), Kades (1997), Wright (2000), Holz (2007) and Bourcier & Mazzega (2007). Despite the significant contributions made by these and many other authors, a review of the literature demonstrates an overall lack of empirical scholarship.

In this paper, we address this empirical gap by focusing on the United States Code (“Code”). Though only a small portion of existing law, the Code is an important and representative document, familiar to both legal scholars and average citizens. In published form, the Code contains hundreds of thousands of provisions and tens of millions of words; it is clearly complex. Measuring this complexity, however, is not a trivial task. To do so, we borrow concepts and tools from a range of disciplines, including computer science, linguistics, physics, and psychology.

Our goals are two-fold. First, we design a conceptual framework capable of measuring the complexity of legal systems. Our conceptual framework is anchored to a model of the Code as the object of a knowledge acquisition protocol. Knowledge acquisition, a field at the intersection of psychology and computer science, studies the protocols individuals use to acquire, store, and analyze information. We develop a protocol for the Code and find that its structure, language, and interdependence primarily determine its complexity.

Second, having developed this conceptual framework, we empirically measure the structure, language, and interdependence of the Code’s forty-nine active Titles. We combine these measurements to calculate a composite measure that scores the relative complexity of these Titles. This composite measure simultaneously takes into account contributions made by the structure, language, and interdependence of each Title through the use of weighted ranks. Weighted ranks are commonly used to pool or score objects with multidimensional or nonlinear attributes. Furthermore, our weighted rank framework is flexible, intuitive, and entirely transparent, allowing other researchers to quickly replicate or extend our work. Using this framework, we provide simple examples of empirically supported claims about the relative complexity of Titles.

In sum, this paper posits the first conceptually rigorous and empirical framework for addressing the complexity of law. We identify structure, language, and interdependence as the conceptual contributors to complexity. We then measure these contributions across all forty-nine active Titles in the Code and obtain relative complexity rankings. Our analysis suggests several additional domains of application, such as contracts, treaties, administrative regulations, municipal codes, and state law.

D. M. Katz, M. J. Bommarito II. Measuring the Complexity of Law: The United States Code.
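To make the weighted-rank pooling described in the abstract concrete, here is a minimal sketch that ranks a handful of Titles on the three conceptual dimensions (structure, language, interdependence) and combines those ranks with per-dimension weights. The Title scores and the equal weights below are hypothetical; the paper's actual measurements and weighting scheme are not reproduced here.

# Hypothetical weighted-rank composite: rank each Title on every
# dimension, then combine the ranks with per-dimension weights.
# All scores and weights are made up for illustration only.

raw_scores = {
    # Title: (structure, language, interdependence); higher = more complex
    "Title 26": (9.1, 8.7, 9.5),
    "Title 11": (6.4, 7.2, 5.9),
    "Title 17": (3.2, 4.8, 4.1),
}
weights = (1 / 3, 1 / 3, 1 / 3)  # equal weights, purely an assumption

def composite_ranks(scores, weights):
    titles = list(scores)
    composite = {t: 0.0 for t in titles}
    for d, w in enumerate(weights):
        # Rank 1 = most complex on this dimension.
        ordered = sorted(titles, key=lambda t: scores[t][d], reverse=True)
        for rank, title in enumerate(ordered, start=1):
            composite[title] += w * rank
    # Lower weighted-rank sum = relatively more complex overall.
    return sorted(composite.items(), key=lambda kv: kv[1])

for title, score in composite_ranks(raw_scores, weights):
    print(f"{title}: weighted rank {score:.2f}")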
