
Long Memory in the Volatility of High-Frequency Financial Data

Posted on: 2010-11-20
Degree: Master
Type: Thesis
Country: China
Candidate: J Bai
Full Text: PDF
GTID: 2199360275983916
Subject: Operational Research and Cybernetics
Abstract/Summary:
The study of high-frequency financial data is a new research field in financial econometrics, and over the last two decades volatility modeling of high-frequency data has become a popular topic in the academic community. In 1982, Engle proposed the well-known autoregressive conditional heteroskedasticity model, ARCH for short; since then, scholars in finance and economics have published thousands of articles on conditional heteroskedasticity and volatility. In particular, over the last ten years Andersen and other researchers proposed estimating volatility from high-frequency intraday data, which yields more accurate volatility estimates; this estimator is called realized volatility. Following their approach, many scholars have studied the characteristics of volatility in greater depth and developed better ways to forecast it. Among these contributions, the ARFIMA-RV model is an important model for volatility forecasting. Motivated by the intraday "U"-shaped pattern of high-frequency data, Zhang Shiying and co-authors introduced weighted realized volatility and pointed out its long memory, confirming that long memory is an important feature of financial data. Weighted realized volatility measures volatility without model-specific assumptions, but research on how to model weighted realized volatility itself is scarce.

After reviewing a large body of literature, this thesis studies the long-memory volatility of high-frequency financial data. First, through an empirical study we compare the basic statistical characteristics of the returns of the Chinese Shanghai Composite Index under different volatility measures, and verify that realized volatility and its extensions give more accurate estimates of volatility. Second, we use two methods to test the normality of the different volatility measures: we verify that the logarithm of realized volatility and of its extensions is approximately normally distributed, and we find that the logarithm of weighted realized volatility is significantly closer to normal. Third, based on the characteristics of weighted realized volatility, we use the ARFIMA-ARIMA model, an extension of the ARFIMA model, to capture its key properties such as long memory and to forecast future volatility. Finally, using the Chinese Shanghai Composite Index, we evaluate the model proposed in this thesis, which we call ARFIMA-ARIMA-WRV, and verify the long memory of weighted realized volatility once more. We then present an important application in risk management: a calculation of value at risk based on weighted realized volatility.
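As a minimal Python sketch of the estimators mentioned above, the snippet below computes realized volatility (here, the realized variance, i.e. the sum of squared intraday returns) and a weighted variant. The function names and the weighting scheme shown, down-weighting intraday intervals that on average carry a larger share of daily variance and normalizing the weights to sum to the number of intervals, are illustrative assumptions; the weights in Zhang Shiying's weighted realized volatility may be defined differently.

```python
import numpy as np

def realized_volatility(intraday_returns):
    """Realized variance for one day: the sum of squared intraday returns."""
    r = np.asarray(intraday_returns, dtype=float)
    return np.sum(r ** 2)

def weighted_realized_volatility(return_matrix):
    """Weighted realized variance for each day in `return_matrix` (days x M intervals).

    Intervals that on average carry a larger share of the daily variance (the
    intraday "U" shape) receive a smaller weight; weights are normalized to sum
    to M.  This is one plausible weighting scheme, not necessarily the exact
    definition used in the thesis.
    """
    R = np.asarray(return_matrix, dtype=float)
    n_days, m = R.shape
    avg_interval_var = (R ** 2).mean(axis=0)      # average squared return per interval
    inv = 1.0 / avg_interval_var
    weights = m * inv / inv.sum()                 # normalize so the weights sum to M
    return (R ** 2 * weights).sum(axis=1)         # one WRV value per day

# Example with simulated 5-minute returns (48 intervals per day, 250 days):
rng = np.random.default_rng(0)
returns = rng.normal(scale=0.001, size=(250, 48))
rv = np.array([realized_volatility(day) for day in returns])
wrv = weighted_realized_volatility(returns)
```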
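The forecasting and value-at-risk steps can be sketched along similar lines, continuing from the `wrv` series above. The snippet below is not the thesis's ARFIMA-ARIMA-WRV model; it only illustrates an ARFIMA-type treatment of log weighted realized volatility: the long-memory parameter d is estimated by the Geweke-Porter-Hudak log-periodogram regression, the series is fractionally differenced, an ARMA(1,1) is fitted to the short-memory component, and a one-day 99% VaR is derived from the volatility forecast under an assumed zero conditional mean and normal returns. The choice of GPH, the ARMA(1,1) order, and the normal quantile are all assumptions made for illustration.

```python
import numpy as np
from scipy.stats import norm
from statsmodels.tsa.arima.model import ARIMA

def gph_estimate(x, power=0.5):
    """Geweke-Porter-Hudak log-periodogram estimate of the long-memory parameter d."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    m = int(n ** power)                                   # number of low frequencies used
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    fft = np.fft.fft(x - x.mean())
    periodogram = (np.abs(fft[1:m + 1]) ** 2) / (2 * np.pi * n)
    regressor = np.log(4 * np.sin(freqs / 2) ** 2)
    slope = np.polyfit(regressor, np.log(periodogram), 1)[0]
    return -slope

def frac_diff_weights(d, n):
    """Coefficients of the fractional-differencing filter (1 - L)^d up to lag n-1."""
    w = np.zeros(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def frac_diff(x, d):
    """Apply (1 - L)^d to a series, truncating the filter at the sample start."""
    x = np.asarray(x, dtype=float)
    w = frac_diff_weights(d, len(x))
    return np.array([np.dot(w[:t + 1], x[t::-1]) for t in range(len(x))])

# `wrv` is the daily weighted realized variance series from the previous sketch.
log_wrv = np.log(wrv)
d_hat = gph_estimate(log_wrv)
y = frac_diff(log_wrv, d_hat)

arma = ARIMA(y, order=(1, 0, 1)).fit()       # ARMA(1,1) on the fractionally differenced series
y_next = arma.forecast(steps=1)[0]

# Invert the fractional difference for a one-step forecast of log WRV:
w = frac_diff_weights(d_hat, len(log_wrv) + 1)
x_next = y_next - np.dot(w[1:], log_wrv[::-1])
sigma_next = np.sqrt(np.exp(x_next))         # forecast daily volatility

# One-day 99% VaR per unit of position, assuming zero mean and conditional normality:
var_99 = norm.ppf(0.99) * sigma_next
```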
Keywords/Search Tags: high-frequency financial data, long memory, realized volatility and its extensions, ARFIMA-ARIMA-WRV model, value at risk