Python - SciPy kernel estimation example: density greater than 1

Question


I'm currently working through this SciPy example on Kernel Estimation. In particular, the one labelled "Univariate estimation". As opposed to creating random data, I am using asset returns. My 2nd estimate, though (and even the simple normal pdf I create to compare to), shows a density that peaks at 20, which makes no sense... My code is as follows:

import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

x1 = np.array(data['actual'].values)[1:]
xs1 = np.linspace(x1.min()-1,x1.max()+1,len(x1))
std1 = x1.std()
mean1 = x1.mean()

x2 = np.array(data['log_moves'].values)[1:]
xs2 = np.linspace(x2.min()-.01,x2.max()+.01,len(x2))
#xs2 = np.linspace(x2.min()-1,x2.max()+2,len(x2))
std2 = x2.std()
mean2 = x2.mean()

kde1 = stats.gaussian_kde(x1)  # actuals
kde2 = stats.gaussian_kde(x1, bw_method='silverman')

kde3 = stats.gaussian_kde(x2)  # log returns
kde4 = stats.gaussian_kde(x2, bw_method='silverman')

fig = plt.figure(figsize=(10,8))
ax1 = fig.add_subplot(211)
ax1.plot(x1, np.zeros(x1.shape), 'b+', ms=12)  # rug plot
ax1.plot(xs1, kde1(xs1), 'k-', label="Scott's Rule")
ax1.plot(xs1, kde2(xs1), 'b-', label="Silverman's Rule")
ax1.plot(xs1, stats.norm.pdf(xs1,mean1,std1), 'r--', label="Normal PDF")

ax1.set_xlabel('x')
ax1.set_ylabel('Density')
ax1.set_title("Absolute (top) and Returns (bottom) distributions")
ax1.legend(loc=1)

ax2 = fig.add_subplot(212)
ax2.plot(x2, np.zeros(x2.shape), 'b+', ms=12)  # rug plot
ax2.plot(xs2, kde3(xs2), 'k-', label="Scott's Rule")
ax2.plot(xs2, kde4(xs2), 'b-', label="Silverman's Rule")
ax2.plot(xs2, stats.norm.pdf(xs2,mean2,std2), 'r--', label="Normal PDF")

ax2.set_xlabel('x')
ax2.set_ylabel('Density')

plt.show()

My results:


And for reference, the first and 2nd moments of the data going in:

print(std1)
print(mean1)
print(std2)
print(mean2)
4.66416718334
0.0561365678347
0.0219996729055
0.00027330546845


Further, if I change the 2nd chart to produce a lognormal PDF, I get a flat line (which, if the Y-axis were correctly scaled like the top one, I'm sure would show the distribution I'd expect).

Answer


The result of a kernel density estimate is a probability density. While probability can't be larger than 1, a density can.
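For intuition, a quick sanity check (not part of the original answer): the peak of a normal pdf is 1/(σ·sqrt(2π)), so with the question's std2 ≈ 0.022 a peak near 18 is exactly what the formula predicts:

```python
import math

# std2 as printed in the question
sigma2 = 0.0219996729055

# Peak height of a normal pdf with standard deviation sigma2.
peak = 1.0 / (sigma2 * math.sqrt(2.0 * math.pi))
# peak is roughly 18.1, consistent with the density peaking near 20
```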


Given a probability density curve, you can find the probability within a range (x_1, x_2) by integrating the probability density in that range. Judging by eye, the integral under both your curves is approximately 1, so the output appears to be correct.
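To see both points at once, here is a minimal sketch using synthetic returns (normally distributed, with a scale similar to the question's log-move series rather than its actual asset data): the density at the peak is far above 1, yet the density integrates to 1 over the whole real line.

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for the question's log returns:
# mean and scale chosen to resemble mean2 and std2.
rng = np.random.default_rng(0)
returns = rng.normal(loc=0.0003, scale=0.022, size=1000)

kde = stats.gaussian_kde(returns)

# The density at the sample mean is much larger than 1,
# which is expected for narrowly concentrated data.
peak_density = kde(returns.mean())[0]

# But the total probability (the integral of the density) is still 1.
total_prob = kde.integrate_box_1d(-np.inf, np.inf)
```

`integrate_box_1d` is the convenient way to turn a `gaussian_kde` density into a probability over a range; evaluating it over, say, `(-0.02, 0.02)` gives the probability that a return falls in that interval.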

