Question
I hope someone can help me. I implemented logistic regression from scratch (so no libraries, except NumPy in Python).
I used the MNIST dataset as input and, since I am doing binary classification, decided to test on only two digits: 1 and 2. My code can be found here.
The notebook should run on any system that has the necessary libraries installed.
Somehow my cost function is not converging. I am getting an error because my A (the sigmoid output) becomes equal to 1 when z gets very large.
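For context, this failure mode is easy to reproduce: in float64, sigmoid saturates to exactly 1.0 for large z, and the cross-entropy term log(1 - A) then blows up. A minimal sketch (the function names are illustrative, not taken from the notebook; clipping is one common workaround):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cross_entropy(A, y, eps=1e-12):
    # Clip the activations so log() never sees exactly 0 or 1.
    A = np.clip(A, eps, 1.0 - eps)
    return -np.mean(y * np.log(A) + (1.0 - y) * np.log(1.0 - A))

z = np.array([50.0, 100.0])   # large z: sigmoid rounds to exactly 1.0 in float64
A = sigmoid(z)
print(A)                      # [1. 1.]
print(cross_entropy(A, np.array([0.0, 1.0])))  # finite, thanks to clipping
```

Without the `np.clip` call, `np.log(1.0 - A)` would evaluate `log(0)` and produce `-inf`, which is the kind of error described above.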
I tried everything but I can't see my mistake. Can anyone take a look and let me know if I missed something obvious? The point here is not getting high accuracy; it is getting the model to converge to something ;)
Thanks in advance, Umberto
Answer
I found the error. The problem was that I used 1 and 2 as class labels (the labels you find in MNIST), but in binary classification you compare those values against 0 and 1, so the model could not converge, since sigmoid() (see my code) can only output values from 0 to 1 (it is a probability).
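The fix described here amounts to remapping the labels before training. A minimal sketch, assuming `y_raw` holds the raw MNIST labels for the two selected digits (the variable names are mine, not from the notebook):

```python
import numpy as np

# Raw MNIST labels for the two chosen digits.
y_raw = np.array([1, 2, 2, 1, 2])

# Remap {1, 2} -> {0, 1} so the targets lie in the range of sigmoid().
y = (y_raw == 2).astype(np.float64)
print(y)  # [0. 1. 1. 0. 1.]
```

With targets in {0, 1}, the cross-entropy loss is well defined and gradient descent can actually drive the sigmoid output toward the correct class.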
Using 0 and 1 instead of 1 and 2 solved the problem beautifully. Now my model converges to 98% accuracy :-)
Thanks everyone for the help!
Regards, Umberto