READING 22: LINEAR REGRESSION WITH MULTIPLE REGRESSORS

Omitted Variable Bias
        Omitted variable bias is present when both of the following conditions are met (the simulation sketch after this list illustrates the resulting bias):
  • (1) the omitted variable is correlated with at least one of the independent variables included in the model, and
  • (2) the omitted variable is a determinant of the dependent variable.
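        As a hedged illustration (all data, variable names, and coefficient values below are synthetic, not from the reading), the sketch fits the same regression twice: once including a regressor x2 that satisfies both conditions, and once omitting it. The estimated slope on x1 drifts away from its true value in the misspecified fit.
```python
# Minimal simulation sketch of omitted variable bias (all data and values are synthetic).
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
x1 = rng.normal(size=n)
x2 = 0.8 * x1 + rng.normal(scale=0.6, size=n)         # condition (1): x2 is correlated with x1
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)    # condition (2): x2 helps determine y

# Correctly specified model: regress y on an intercept, x1, and x2.
X_full = np.column_stack([np.ones(n), x1, x2])
b_full = np.linalg.lstsq(X_full, y, rcond=None)[0]

# Misspecified model: x2 is omitted.
X_short = np.column_stack([np.ones(n), x1])
b_short = np.linalg.lstsq(X_short, y, rcond=None)[0]

print("slope on x1 with x2 included:", round(b_full[1], 2))   # close to the true value 2.0
print("slope on x1 with x2 omitted: ", round(b_short[1], 2))  # biased upward, toward 2.0 + 3.0*0.8
```
        Breaking either condition, by making x2 independent of x1 or by removing it from the equation for y, eliminates the bias, which is exactly the two-part test above.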
The General Multiple Linear Regression Model
        The multiple regression equation specifies a dependent variable as a linear function of two or more independent variables:
Yi = B0 + B1X1i + B2X2i + … + BkXki + εi
        The intercept term is the value of the dependent variable when all of the independent variables are equal to zero. Each slope coefficient is the estimated change in the dependent variable for a one-unit change in that independent variable, holding the other independent variables constant.
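        As a rough sketch of what estimating this equation looks like in practice (synthetic data; the variable names and true coefficient values are invented for the example), an ordinary least squares fit recovers the intercept and the slope coefficients:
```python
# Sketch: estimating Yi = B0 + B1*X1i + B2*X2i + ei by ordinary least squares (synthetic data).
import numpy as np

rng = np.random.default_rng(1)
n = 500
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 0.5 + 1.5 * x1 - 2.0 * x2 + rng.normal(size=n)   # true B0 = 0.5, B1 = 1.5, B2 = -2.0

# Design matrix with a leading column of ones so the first coefficient is the intercept.
X = np.column_stack([np.ones(n), x1, x2])
b0, b1, b2 = np.linalg.lstsq(X, y, rcond=None)[0]

print(f"intercept = {b0:.2f}, slope on x1 = {b1:.2f}, slope on x2 = {b2:.2f}")
# b1 estimates the change in y for a one-unit change in x1 with x2 held constant,
# and b2 does the same for x2; both land near their true values.
```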
The Slope Coefficient in Multiple Regression
        In a multivariate regression, each slope coefficient is interpreted as a partial slope coefficient in that it measures the effect on the dependent variable from a change in the associated independent variable holding other things constant.
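        One way to see the "partial" in partial slope coefficient is the Frisch–Waugh–Lovell result, which goes slightly beyond this reading: the multiple-regression slope on x1 equals the simple-regression slope obtained after removing x2's influence from both the dependent variable and x1. A sketch on synthetic data:
```python
# Sketch of the Frisch–Waugh–Lovell "partialling out" interpretation (synthetic data).
import numpy as np

rng = np.random.default_rng(2)
n = 1_000
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)                    # deliberately correlated regressors
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

# Slope on x1 from the full multiple regression.
X = np.column_stack([np.ones(n), x1, x2])
b_multi = np.linalg.lstsq(X, y, rcond=None)[0]

def residualize(v, on):
    """Return the residuals of v after regressing it on `on` (with an intercept)."""
    Z = np.column_stack([np.ones(len(v)), on])
    coef = np.linalg.lstsq(Z, v, rcond=None)[0]
    return v - Z @ coef

# Remove x2's influence from both y and x1, then run a simple (no-intercept) regression.
y_res = residualize(y, x2)
x1_res = residualize(x1, x2)
b_partial = (x1_res @ y_res) / (x1_res @ x1_res)

print(f"multiple-regression slope on x1: {b_multi[1]:.3f}")
print(f"partialled-out slope on x1:      {b_partial:.3f}")   # the two coincide
```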
Homoskedasticity and Heteroskedasticity in Multiple Regression
        In multiple regression, homoskedasticity and heteroskedasticity are straightforward extensions of their definitions in the previous reading. Homoskedasticity means the variance of the error term is constant across all observations, Var(εi | Xi) = σ², for i = 1 to n. Heteroskedasticity means the dispersion of the error terms varies over the sample; it may take the form of conditional heteroskedasticity, in which the variance of the error term is a function of the independent variables. The sketch below contrasts the two cases.
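        A small synthetic contrast (the particular variance function used for the heteroskedastic errors is arbitrary, chosen only to make the effect visible):
```python
# Synthetic contrast between constant and x-dependent error variance (illustrative values).
import numpy as np

rng = np.random.default_rng(3)
n = 5_000
x = rng.uniform(0.0, 10.0, size=n)

e_homo = rng.normal(scale=1.0, size=n)           # Var(e | x) = 1 for every x
e_hetero = rng.normal(scale=0.3 * x, size=n)     # Var(e | x) = (0.3x)^2, a function of x

for name, e in [("homoskedastic", e_homo), ("heteroskedastic", e_hetero)]:
    low, high = e[x < 5.0], e[x >= 5.0]
    print(f"{name:>16}: std(e | x<5) = {low.std():.2f}, std(e | x>=5) = {high.std():.2f}")
# The homoskedastic errors show similar dispersion in both halves of the sample,
# while the heteroskedastic errors are far more dispersed where x is large.
```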
Measures of Fit in Multiple Regression
        Multiple regression estimates the intercept and slope coefficients such that the sum of the squared error terms is minimized. The estimators of these coefficients are known as ordinary least squares (OLS) estimators. The OLS estimators are typically found with statistical software.
        The standard error of the regression (SER) is the standard deviation of the observed values of the dependent variable about the regression line, that is, the standard deviation of the residuals. With n observations and k independent variables:
SER = √[ SSE / (n − k − 1) ],  where SSE = Σ(ε̂i)² is the sum of squared residuals.
        The coefficient of determination, R², is the percentage of the variation in the dependent variable that is explained by the set of independent variables.
  • R² never decreases when an independent variable is added, even one with little explanatory power, so it can overstate the quality of a model with many regressors.
  • The adjusted R² corrects for the number of independent variables: adjusted R² = 1 − [(n − 1) / (n − k − 1)](1 − R²). A worked sketch of all three fit measures follows below.
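        As a worked sketch (synthetic data; n = 200 observations and k = 2 regressors are arbitrary choices), the three fit measures can be computed directly from the residuals of an OLS fit:
```python
# Worked sketch: SER, R^2, and adjusted R^2 from an OLS fit on synthetic data.
import numpy as np

rng = np.random.default_rng(4)
n, k = 200, 2                                   # 200 observations, 2 independent variables
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(scale=2.0, size=n)

X = np.column_stack([np.ones(n), x1, x2])
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta_hat

sse = np.sum(resid**2)                          # sum of squared residuals
sst = np.sum((y - y.mean())**2)                 # total variation in y
ser = np.sqrt(sse / (n - k - 1))                # standard error of the regression
r2 = 1.0 - sse / sst                            # coefficient of determination
adj_r2 = 1.0 - (1.0 - r2) * (n - 1) / (n - k - 1)   # penalized for the number of regressors

print(f"SER = {ser:.3f}, R^2 = {r2:.3f}, adjusted R^2 = {adj_r2:.3f}")
```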


Assumptions of the Multiple Linear Regression Model
        Assumptions of multiple regression mostly pertain to the error term, εi.
  • A linear relationship exists between the dependent and independent variables.
  • The independent variables are not random, and there is no exact linear relation between any two or more independent variables.
  • The expected value of the error term is zero.
  • The variance of the error terms is constant.
  • The error for one observation is not correlated with that of another observation.
  • The error term is normally distributed.
Multicollinearity
        Perfect multicollinearity exists when one of the independent variables is an exact linear combination of the other independent variables. Imperfect multicollinearity arises when two or more independent variables are highly, but less than perfectly, correlated. A common diagnostic is sketched below.
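        One common, if informal, diagnostic for imperfect multicollinearity is the variance inflation factor, VIFj = 1 / (1 − Rj²), where Rj² comes from regressing the j-th independent variable on all of the others. This goes slightly beyond the definitions above; the sketch uses synthetic data and computes the VIFs by hand:
```python
# Sketch: variance inflation factors computed by hand on synthetic regressors.
import numpy as np

rng = np.random.default_rng(5)
n = 1_000
x1 = rng.normal(size=n)
x2 = 0.95 * x1 + rng.normal(scale=0.2, size=n)   # highly, but not perfectly, correlated with x1
x3 = rng.normal(size=n)                          # roughly unrelated to the others
X = np.column_stack([x1, x2, x3])

def vif(X, j):
    """VIF of column j: regress X[:, j] on the remaining columns (plus an intercept)."""
    target = X[:, j]
    others = np.delete(X, j, axis=1)
    Z = np.column_stack([np.ones(len(target)), others])
    coef = np.linalg.lstsq(Z, target, rcond=None)[0]
    resid = target - Z @ coef
    r2_j = 1.0 - (resid @ resid) / np.sum((target - target.mean())**2)
    return 1.0 / (1.0 - r2_j)

for j in range(X.shape[1]):
    print(f"VIF of x{j + 1}: {vif(X, j):.1f}")    # large for x1 and x2, near 1 for x3
```
        Large VIFs for x1 and x2 flag their near-linear relationship, while x3's VIF stays near 1; the threshold at which a VIF is "too large" is a judgment call (10 is a common rule of thumb).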
