Hansen's Research Reviews: Modeling to Make the Most of Data | Opportunity at the Intersection of Computation and Economics

At the end of this post is an important proposal from 計量經濟圈 (Econometrics Circle).

Hansen, one of the 2013 Nobel laureates in economics, is a leading expert in dynamic economics and financial asset pricing, and his research attention now centers on "uncertainty." These two research reviews happen to bear on econometrics, a field developing rapidly at the moment, so 計量經濟圈, as the corresponding subscription account, is obliged to share them.


Essay One: Modeling to Make the Most of Data
Economic modeling is often driven by discovery or development of new data sets. For many substantive applications, the data does not simply "speak for itself," making it important to build structural models to interpret the evidence in meaningful ways.

For example, the availability of data from national income and product accounts was an early influence on formal macroeconomic modeling. As other evidence on the economic behavior of individuals and firms became available, builders of dynamic economic models incorporated microeconomic foundations in part to bring to bear a broader array of evidence on macroeconomic policy challenges. Similarly, econometricians built and applied new methods for panel data analysis to understand better the empirical implications of microeconomic data with statistical rigor.
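As a concrete illustration of the panel methods mentioned above (a standard textbook specification, not a formula from Hansen's essay), the workhorse linear fixed-effects model controls for unobserved, time-invariant heterogeneity across individuals or firms:

```latex
% Workhorse linear fixed-effects panel model (illustrative):
% y_{it} is the outcome for unit i in period t, x_{it} a vector of
% observed covariates, \alpha_i an unobserved unit-specific effect,
% and \varepsilon_{it} an idiosyncratic error.
y_{it} = x_{it}'\beta + \alpha_i + \varepsilon_{it},
\qquad i = 1,\dots,N, \quad t = 1,\dots,T.
```

Within transformations or first differences remove the unit effect \alpha_i, which is part of what makes microeconomic panel evidence informative about \beta with statistical rigor.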

As large cross-sections of financial market returns became easily accessible to researchers, asset pricing theorists built models that featured economically interpretable risk-return tradeoffs to give an economic interpretation to the observable patterns in financial market data. In all of these cases, data availability provoked new modeling efforts, and these efforts were crucial in bringing new evidence to bear on important policy-relevant economic questions.
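The risk-return tradeoffs referred to here are usually expressed through a stochastic discount factor. As a hedged illustration (a standard result in asset pricing, not a formula quoted from the essay), every traded gross return must satisfy a common pricing restriction:

```latex
% Canonical asset-pricing restriction: a stochastic discount factor
% m_{t+1} prices every traded gross return R_{j,t+1}.
E\left[\, m_{t+1} R_{j,t+1} \,\right] = 1, \qquad j = 1,\dots,J,
% which, with a risk-free return R^f, implies the risk-return tradeoff
E\left[R_{j,t+1}\right] - R^f = -\,R^f \,\operatorname{Cov}\!\left(m_{t+1},\, R_{j,t+1}\right).
```

Expected excess returns are thus compensation for covariation with the discount factor, the economically interpretable tradeoff that large cross-sections of returns can be used to measure.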

Rapid growth in computing power and the expansion of electronic marketplaces and information sharing to all corners of life have vastly expanded the data available to individuals, enterprises, and policy makers. Important computational advances in data science have opened the door to analyses of massive new data sets, potentially offering new insights into a variety of questions important in economic analysis. The richness of this new data provides flexible ways to make predictions about individual and market behavior, for example the assessment of credit risk when taking out loans and the implications for markets of consumer goods, including housing.
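A minimal sketch of the kind of flexible prediction described above. Everything here, including the simulated borrower data and the feature names, is an illustrative assumption rather than anything from the essay; the point is only that an off-the-shelf classifier can score credit risk from borrower characteristics:

```python
# Illustrative only: simulated borrower data and a flexible classifier
# for default risk. Names and numbers are assumptions, not from the essay.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
# Hypothetical features: income, debt-to-income ratio, credit-history length
X = rng.normal(size=(n, 3))
# Default risk rises with debt load and falls with income and history
logit = 0.8 * X[:, 1] - 0.6 * X[:, 0] - 0.4 * X[:, 2]
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier().fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```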

These are topics explored at previous institute events, such as the Macro Financial Modeling 2016 Conference held on January 27–29, 2016. One example is the work of Matthew Gentzkow and Jesse Shapiro and a conference on the use of text as data. Another example is the construction and use of new measures of policy uncertainty developed by Scott R. Baker, Nicholas Bloom, and Steve Davis.

Just as statisticians have sought to provide rigor to the inferential methods used in data analysis, econometricians now face new challenges in enriching these modeling efforts beyond straightforward data description and prediction. While the data may be incredibly rich along some dimensions, many policy-relevant questions require the ability to transport this richness into other hypothetical settings, as is often required when we wish to know likely responses to new policies or changes in the underlying economic environment. This more subtle but substantively important form of prediction requires both economic and statistical modeling to fully exploit the richness of the data and the power of the computational methods.

The door is wide open for important new advances in economic modeling well suited to truly learn from the new data. By organizing conferences like the September 23–24, 2016 event, the Becker Friedman Institute is nurturing a crucial next step: determining how best to integrate formal economic analysis to address key policy questions. We seek to foster communication among a variety of scholars from computer science, statistics, and economics in addressing new research challenges. This conference, organized by Stéphane Bonhomme, John Lafferty, and Thibaut Lamadon, will encourage the synergistic research efforts of computational statistics and econometrics.

— Lars Peter Hansen, Becker Friedman Institute Director


Essay Two: Opportunity at the Intersection of Computation and Economics
Following the tradition of interdisciplinary collaboration will yield exciting research
In the 1940s and 1950s, the Cowles Commission, then at the University of Chicago, brought economic scholars together with eminent statisticians and applied mathematicians who pioneered exciting new lines of research in mathematically oriented economic theory and econometrics. Their intellectual probes, nurtured by cross-disciplinary interactions, had a profound impact on economic research over the following decades.

Today, as our computational power continues to expand, it opens the door to new and exciting approaches to research. Following in the tradition of the Cowles Commission, in the next few months the Becker Friedman Institute will be exploring how computation can nurture new approaches to economic research by bringing together computational experts and economists to engage in productive exchanges of ideas along two different fronts.

One area we are exploring is how computing power enhances the development of economic theory. For example, economists often use rationality hypotheses when building models. It is understood that this approach is at best an approximation of individuals' behavior and decision-making. This has led many researchers to explore alternative notions of bounded rationality in complex economic environments in which the approximation of full rationality is harder to defend. Among other things, models with bounded rationality impose limitations on the computational effort required for full optimization.

Meanwhile, advances in information technology have led to the emergence of new markets with new forms of exchange. Computational advances offer approaches that can approximate behavioral interactions in these new types of markets. Our 2015–16 Research Fellows, Ben Brooks and Mohammad Akbarpour, have organized a conference in August on the Frontiers of Economic Theory and Computer Science that will bring together economists and computer scientists to explore promising new research directions in this exciting area of endeavor.

On a related front, data science has brought together computational and statistical expertise to study so-called "machine learning" approaches to the analysis of large-scale data accumulating from everyday transactions in every area of our lives. The institute is probing the question of how to use such approaches in conjunction with economic models that allow us to study important policy questions. Comparing alternative policy options often requires that we engage in the analysis of counterfactuals. This requires that we extrapolate what we have learned from rich data to realms where data are more sparse, using what we call structural economic models. In this vein, the analysis of new and rich data will lead to new economic models designed to address important policy questions. A conference I am organizing with my colleagues Stéphane Bonhomme, John Lafferty, and Thibaut Lamadon will bring together econometricians and statisticians to probe new opportunities for advancement in this exciting synergistic area of research.
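A minimal sketch of the distinction drawn above, with all numbers and names as illustrative assumptions: a structural model estimated on rich observed data supplies parameters that can be transported to a counterfactual policy setting outside the observed support, which a purely predictive fit cannot license on its own:

```python
# Illustrative only: a toy structural demand model used for a
# counterfactual policy question. All parameters are assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 10000
price = rng.uniform(1.0, 2.0, size=n)                 # observed price range
quantity = 10.0 - 3.0 * price + rng.normal(scale=0.5, size=n)

# Recover the structural intercept and slope by least squares
# (assuming exogenous price variation in this toy setting).
slope, intercept = np.polyfit(price, quantity, deg=1)

# Counterfactual: a hypothetical per-unit tax pushes price to 3.5,
# well outside the historically observed support of [1, 2].
new_price = 3.5
print(f"counterfactual demand at price {new_price}: "
      f"{intercept + slope * new_price:.2f}")
```

The extrapolation to `new_price` is justified by the structural demand equation, not by the historical data alone; that is the sense in which structural models carry rich data into sparser, policy-relevant realms.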

While two conferences alone cannot hope to match the impact of almost two decades of influential work that emerged from the Cowles Commission, they will help to encourage some exciting directions for synergistic research and innovation at the intersection of computation, statistics, and economic analysis.

—Lars Peter Hansen

Note: from http://larspeterhansen.org/

An important proposal:

Yesterday we shared the video series Thirty Lectures on Time Series Analysis, and we would like to explain why members were given two options: (1) a fee of 1 yuan, or (2) an exchange of materials. Our preference was really for the second option: letting members share the econometrics materials they hold, so that in the future everyone can access them at a very low cost of circulation.

Unexpectedly, however, most members chose the first option. This keeps the circle from achieving the resource sharing we hoped for, though we believe private resources do need some exchange of value before they are made available. Members who have econometrics resources to offer, or who need them, can reach us through the contact information under "Questions and Exchange" in the 計量經濟圈 homepage menu, so that we can design a mechanism satisfactory to both sides and promote the flow of resources.
