This article describes how to import a manually downloaded MNIST dataset into Keras. The question and its recommended answer are reproduced below and should be a useful reference for anyone facing the same problem.

Problem description

I have been experimenting with a Keras example, which needs to import MNIST data

from keras.datasets import mnist
import numpy as np
(x_train, _), (x_test, _) = mnist.load_data()

It generates error messages such as Exception: URL fetch failure on https://s3.amazonaws.com/img-datasets/mnist.pkl.gz: None -- [Errno 110] Connection timed out

It should be related to the network environment I am using. Is there any function or code that can let me directly import the MNIST data set that has been manually downloaded?

I tried the following approach

import sys
import pickle
import gzip
f = gzip.open('/data/mnist.pkl.gz', 'rb')
if sys.version_info < (3,):
    data = pickle.load(f)
else:
    data = pickle.load(f, encoding='bytes')
f.close()
import numpy as np
(x_train, _), (x_test, _) = data

Then I get the following error message

Traceback (most recent call last):
  File "test.py", line 45, in <module>
    (x_train, _), (x_test, _) = data
ValueError: too many values to unpack (expected 2)
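
The ValueError points at the layout of the local file rather than at Keras itself: the mnist.pkl.gz widely mirrored from deeplearning.net is pickled as three (data, labels) pairs (training, validation, test), so it cannot be unpacked into just two pairs. Below is a minimal sketch for loading that three-way split, assuming /data/mnist.pkl.gz is indeed such a file:

import sys
import gzip
import pickle

# Sketch for a three-way split file (training/validation/test), e.g. the
# deeplearning.net distribution of mnist.pkl.gz. The path and layout are
# assumptions about the local file, not something Keras requires.
with gzip.open('/data/mnist.pkl.gz', 'rb') as f:
    if sys.version_info < (3,):
        train_set, valid_set, test_set = pickle.load(f)
    else:
        # 'latin1' is commonly needed for pickles written by Python 2
        train_set, valid_set, test_set = pickle.load(f, encoding='latin1')

x_train, y_train = train_set
x_test, y_test = test_set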

Recommended answer

Well, the keras.datasets.mnist file is really short. You can manually simulate the same action, that is:

  1. Download the dataset from https://s3.amazonaws.com/img-datasets/mnist.pkl.gz
  2. Load it with code along the lines of the snippet below.

import sys
import gzip

if sys.version_info < (3,):
    import cPickle
else:
    import pickle as cPickle

f = gzip.open('mnist.pkl.gz', 'rb')  # the file downloaded in step 1
if sys.version_info < (3,):
    data = cPickle.load(f)
else:
    data = cPickle.load(f, encoding='bytes')
f.close()
(x_train, _), (x_test, _) = data
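
Depending on the Keras version, it may also be enough to drop the downloaded file into the directory where Keras caches datasets (typically ~/.keras/datasets), so that keras.datasets.mnist.load_data() reuses the local copy instead of downloading. The cache path and expected filename (mnist.pkl.gz for older releases, mnist.npz for newer ones) are assumptions to verify against your installation; a sketch:

import os
import shutil

# Assumption: Keras looks for cached datasets under ~/.keras/datasets and skips
# the download when the expected file is already present. The filename depends
# on the Keras version (older releases: mnist.pkl.gz, newer ones: mnist.npz).
cache_dir = os.path.expanduser('~/.keras/datasets')
if not os.path.isdir(cache_dir):
    os.makedirs(cache_dir)
shutil.copy('/data/mnist.pkl.gz', os.path.join(cache_dir, 'mnist.pkl.gz'))

from keras.datasets import mnist
(x_train, _), (x_test, _) = mnist.load_data()  # should now use the local copy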

That concludes this look at how to import a manually downloaded MNIST dataset; hopefully the recommended answer helps you solve the problem.
