This article describes how to load data in Python and fit multiple Gaussians to it. It may serve as a useful reference for anyone facing the same problem.

Problem description

I've been looking for a way to fit multiple Gaussians to my data. Most of the examples I've found so far use a normal distribution to generate random numbers, but I'm interested in looking at a plot of my data and checking whether there are 1-3 peaks.

I can do this for one peak, but I don't know how to do it for more.

For example, I have this data: http://www.filedropper.com/data_11

I have tried using lmfit, and of course scipy, but with no nice results.

Thanks for your help!

Recommended answer

Simply build parameterized model functions from the sum of single Gaussians. Choose good values for your initial guess (this is the really critical step) and then let scipy.optimize tweak those numbers a bit.

Here's how you might do it:

import numpy as np
import matplotlib.pyplot as plt
from scipy import optimize

data = np.genfromtxt('data.txt')

def gaussian(x, height, center, width, offset):
    """A single Gaussian peak on top of a constant offset."""
    return height * np.exp(-(x - center)**2 / (2 * width**2)) + offset

def three_gaussians(x, h1, c1, w1, h2, c2, w2, h3, c3, w3, offset):
    """Sum of three Gaussians sharing one constant offset."""
    return (gaussian(x, h1, c1, w1, offset=0) +
            gaussian(x, h2, c2, w2, offset=0) +
            gaussian(x, h3, c3, w3, offset=0) + offset)

def two_gaussians(x, h1, c1, w1, h2, c2, w2, offset):
    # Reuse three_gaussians with the third peak's height set to zero
    # (a width of 1 simply avoids division by zero).
    return three_gaussians(x, h1, c1, w1, h2, c2, w2, 0, 0, 1, offset)

# Error functions handed to leastsq: squared difference between model and data.
errfunc3 = lambda p, x, y: (three_gaussians(x, *p) - y)**2
errfunc2 = lambda p, x, y: (two_gaussians(x, *p) - y)**2

# Initial guesses: [height, center, width] per peak, plus a common offset.
# I guess there are 3 peaks: 2 are clear, but between them there seems to be
# another one, based on the change in slope smoothness there.
guess3 = [0.49, 0.55, 0.01, 0.6, 0.61, 0.01, 1, 0.64, 0.01, 0]
# The same guess with the peak I'm not too sure about removed.
guess2 = [0.49, 0.55, 0.01, 1, 0.64, 0.01, 0]

# optim3 / optim2 hold the fitted parameters for each model.
optim3, success = optimize.leastsq(errfunc3, guess3[:], args=(data[:, 0], data[:, 1]))
optim2, success = optimize.leastsq(errfunc2, guess2[:], args=(data[:, 0], data[:, 1]))

plt.plot(data[:, 0], data[:, 1], lw=5, c='g', label='measurement')
plt.plot(data[:, 0], three_gaussians(data[:, 0], *optim3),
         lw=3, c='b', label='fit of 3 Gaussians')
plt.plot(data[:, 0], two_gaussians(data[:, 0], *optim2),
         lw=1, c='r', ls='--', label='fit of 2 Gaussians')
plt.legend(loc='best')
plt.savefig('result.png')

As you can see, there is almost no visual difference between these two fits. So you can't know for sure whether there were 3 Gaussians present in the source or only 2. However, if you had to make a guess, then check for the smallest residual:

# Summed absolute residuals (errfunc* returns squared residuals, hence the sqrt).
err3 = np.sqrt(errfunc3(optim3, data[:, 0], data[:, 1])).sum()
err2 = np.sqrt(errfunc2(optim2, data[:, 0], data[:, 1])).sum()
print('Residual error when fitting 3 Gaussians: {}\n'
      'Residual error when fitting 2 Gaussians: {}'.format(err3, err2))
# Residual error when fitting 3 Gaussians: 3.52000910965
# Residual error when fitting 2 Gaussians: 3.82054499044

In this case, 3 Gaussians gives the better result, but I also made my initial guess fairly accurate.
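
Since accurate starting values matter so much here, one way to avoid guessing them by eye is to detect candidate peaks programmatically. Below is a minimal sketch (not part of the original answer) using scipy.signal.find_peaks; the prominence threshold and the starting width are assumptions you would tune for your own data:

import numpy as np
from scipy.signal import find_peaks

data = np.genfromtxt('data.txt')
x, y = data[:, 0], data[:, 1]

# Candidate peaks; 'prominence' filters out small noise bumps (assumed value).
peaks, _ = find_peaks(y, prominence=0.05)

guess = []
for i in peaks:
    guess += [y[i] - y.min(),  # height above the baseline
              x[i],            # center
              0.01]            # rough starting width (assumed)
guess.append(y.min())          # shared constant offset
print(len(peaks), 'candidate peaks, initial guess:', guess)

The resulting list has the same [height, center, width, ..., offset] layout as guess2/guess3 above, so it can be passed straight to the corresponding errfunc via optimize.leastsq.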

That concludes this article on loading data and doing multi-Gaussian fitting in Python. Hopefully the recommended answer above is helpful.
