You can not select more than 25 topics Topics must start with a chinese character,a letter or number, can include dashes ('-') and can be up to 35 characters long.

PyTorch_quick_intro.ipynb 64 kB

6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
6 years ago
123456789101112131415161718192021222324252627282930313233343536373839404142434445464748495051525354555657585960616263646566676869707172737475767778798081828384858687888990919293949596979899100101102103104105106107108109110111112113114115116117118119120121122123124125126127128129130131132133134135136137138139140141142143144145146147148149150151152153154155156157158159160161162163164165166167168169170171172173174175176177178179180181182183184185186187188189190191192193194195196197198199200201202203204205206207208209210211212213214215216217218219220221222223224225226227228229230231232233234235236237238239240241242243244245246247248249250251252253254255256257258259260261262263264265266267268269270271272273274275276277278279280281282283284285286287288289290291292293294295296297298299300301302303304305306307308309310311312313314315316317318319320321322323324325326327328329330331332333334335336337338339340341342343344345346347348349350351352353354355356357358359360361362363364365366367368369370371372373374375376377378379380381382383384385386387388389390391392393394395396397398399400401402403404405406407408409410411412413414415416417418419420421422423424425426427428429430431432433434435436437438439440441442443444445446447448449450451452453454455456457458459460461462463464465466467468469470471472473474475476477478479480481482483484485486487488489490491492493494495496497498499500501502503504505506507508509510511512513514515516517518519520521522523524525526527528529530531532533534535536537538539540541542543544545546547548549550551552553554555556557558559560561562563564565566567568569570571572573574575576577578579580581582583584585586587588589590591592593594595596597598599600601602603604605606607608609610611612613614615616617618619620621622623624625626627628629630631632633634635636637638639640641642643644645646647648649650651652653654655656657658659660661662663664665666667668669670671672673674675676677678679680681682683684685686687688689690691692693694695696697698699700701702703704705706707708709710711712713714715716717718719720721722723724725726727728729730731732733734735736737738739740741742743744745746747748749750751752753754755756757758759760761762763764765766767768769770771772773774775776777778779780781782783784785786787788789790791792793794795796797798799800801802803804805806807808809810811812813814815816817818819820821822823824825826827828829830831832833834835836837838839840841842843844845846847848849850851852853854855856857858859860861862863864865866867868869870871872873874875876877878879880881882883884885886887888889890891892893894895896897898899900901902903904905906907908909910911912913914915916917918919920921922923924925926927928929930931932933934935936937938939940941942943944945946947948949950951952953954955956957958959960961962963964965966967968969970971972973974975976977978979980981982983984985986987988989990991992993994995996997998999100010011002100310041005100610071008100910101011101210131014101510161017101810191020102110221023102410251026102710281029103010311032103310341035103610371038103910401041104210431044104510461047104810491050105110521053105410551056105710581059106010611062106310641065106610671068106910701071107210731074107510761077107810791080108110821083108410851086108710881089109010911092109310941095109610971098109911001101110211031104110511061107110811091110111111121113111411151116111711181119112011211122112311241125112611271128112911301131113211331134113511361137113811391140114111421143114411451146114711481149115011511152115311541155115611571158115911601161116211631164116511661167116811691170117111721173117411751176117711781179118011811182118311841185118611871188118911901191119211931194119511961197119811991200120112021203120412051206120712081209121012111212121312141215121612171218121912201221122212231224122512261227122812291230123112321233123412351236123712381239124012411242124312441245124612471248124912501251125212531254125512561257125812591260126112621263126412651266126712681269127012711272127312741275127612771278127912801281128212831284128512861287128812891290129112921293129412951296129712981299130013011302130313041305130613071308130913101311131213131314131513161317131813191320132113221323132413251326132713281329133013311332133313341335133613371338133913401341134213431344134513461347134813491350135113521353135413551356135713581359136013611362136313641365136613671368136913701371137213731374137513761377137813791380138113821383138413851386138713881389139013911392139313941395139613971398139914001401140214031404140514061407140814091410141114121413141414151416141714181419142014211422142314241425142614271428142914301431143214331434143514361437143814391440144114421443144414451446144714481449145014511452145314541455145614571458145914601461146214631464146514661467146814691470147114721473147414751476147714781479148014811482148314841485148614871488148914901491149214931494
  1. {
  2. "cells": [
  3. {
  4. "cell_type": "markdown",
  5. "metadata": {},
  6. "source": [
  7. "# PyTorch快速入门\n",
  8. "\n",
  9. "PyTorch的简洁设计使得它入门很简单,在深入介绍PyTorch之前,本节将先介绍一些PyTorch的基础知识,使得读者能够对PyTorch有一个大致的了解,并能够用PyTorch搭建一个简单的神经网络。部分内容读者可能暂时不太理解,可先不予以深究,后续的课程将会对此进行深入讲解。\n",
  10. "\n",
  11. "本节内容参考了PyTorch官方教程[^1]并做了相应的增删修改,使得内容更贴合新版本的PyTorch接口,同时也更适合新手快速入门。另外本书需要读者先掌握基础的Numpy使用,其他相关知识推荐读者参考CS231n的教程[^2]。\n",
  12. "\n",
  13. "[^1]: http://pytorch.org/tutorials/beginner/deep_learning_60min_blitz.html\n",
  14. "[^2]: http://cs231n.github.io/python-numpy-tutorial/"
  15. ]
  16. },
  17. {
  18. "cell_type": "markdown",
  19. "metadata": {},
  20. "source": [
  21. "### Tensor\n",
  22. "\n",
  23. "Tensor是PyTorch中重要的数据结构,可认为是一个高维数组。它可以是一个数(标量)、一维数组(向量)、二维数组(矩阵)以及更高维的数组。Tensor和Numpy的ndarrays类似,但Tensor可以使用GPU进行加速。Tensor的使用和Numpy及Matlab的接口十分相似,下面通过几个例子来看看Tensor的基本使用。"
  24. ]
  25. },
  26. {
  27. "cell_type": "code",
  28. "execution_count": 1,
  29. "metadata": {},
  30. "outputs": [],
  31. "source": [
  32. "from __future__ import print_function\n",
  33. "import torch as t"
  34. ]
  35. },
  36. {
  37. "cell_type": "code",
  38. "execution_count": 2,
  39. "metadata": {},
  40. "outputs": [
  41. {
  42. "data": {
  43. "text/plain": [
  44. "tensor([[1.2563e-37, 0.0000e+00, 5.7453e-44],\n",
  45. " [0.0000e+00, nan, 4.5814e-41],\n",
  46. " [1.3733e-14, 6.4076e+07, 2.0706e-19],\n",
  47. " [7.3909e+22, 2.4176e-12, 1.1625e+33],\n",
  48. " [8.9605e-01, 1.1632e+33, 5.6003e-02]])"
  49. ]
  50. },
  51. "execution_count": 2,
  52. "metadata": {},
  53. "output_type": "execute_result"
  54. }
  55. ],
  56. "source": [
  57. "# 构建 5x3 矩阵,只是分配了空间,未初始化\n",
  58. "x = t.Tensor(5, 3) \n",
  59. "x"
  60. ]
  61. },
  62. {
  63. "cell_type": "code",
  64. "execution_count": 3,
  65. "metadata": {},
  66. "outputs": [
  67. {
  68. "data": {
  69. "text/plain": [
  70. "tensor([[0.7149, 0.6065, 0.8056],\n",
  71. " [0.2450, 0.1942, 0.5305],\n",
  72. " [0.6735, 0.7798, 0.6060],\n",
  73. " [0.1072, 0.8325, 0.8617],\n",
  74. " [0.5117, 0.2246, 0.4984]])"
  75. ]
  76. },
  77. "execution_count": 3,
  78. "metadata": {},
  79. "output_type": "execute_result"
  80. }
  81. ],
  82. "source": [
  83. "# 使用[0,1]均匀分布随机初始化二维数组\n",
  84. "x = t.rand(5, 3) \n",
  85. "x"
  86. ]
  87. },
  88. {
  89. "cell_type": "code",
  90. "execution_count": 4,
  91. "metadata": {},
  92. "outputs": [
  93. {
  94. "name": "stdout",
  95. "output_type": "stream",
  96. "text": [
  97. "torch.Size([5, 3])\n"
  98. ]
  99. },
  100. {
  101. "data": {
  102. "text/plain": [
  103. "(3, 3)"
  104. ]
  105. },
  106. "execution_count": 4,
  107. "metadata": {},
  108. "output_type": "execute_result"
  109. }
  110. ],
  111. "source": [
  112. "print(x.size()) # 查看x的形状\n",
  113. "x.size()[1], x.size(1) # 查看列的个数, 两种写法等价"
  114. ]
  115. },
  116. {
  117. "cell_type": "markdown",
  118. "metadata": {},
  119. "source": [
  120. "`torch.Size` 是tuple对象的子类,因此它支持tuple的所有操作,如x.size()[0]"
  121. ]
  122. },
  123. {
  124. "cell_type": "code",
  125. "execution_count": 5,
  126. "metadata": {},
  127. "outputs": [
  128. {
  129. "data": {
  130. "text/plain": [
  131. "tensor([[1.6605, 1.1155, 1.2724],\n",
  132. " [0.6727, 0.6428, 1.0969],\n",
  133. " [1.4898, 1.7437, 1.3258],\n",
  134. " [0.8030, 1.5725, 1.4709],\n",
  135. " [0.6847, 0.4828, 0.6183]])"
  136. ]
  137. },
  138. "execution_count": 5,
  139. "metadata": {},
  140. "output_type": "execute_result"
  141. }
  142. ],
  143. "source": [
  144. "y = t.rand(5, 3)\n",
  145. "# 加法的第一种写法\n",
  146. "x + y"
  147. ]
  148. },
  149. {
  150. "cell_type": "code",
  151. "execution_count": 6,
  152. "metadata": {},
  153. "outputs": [
  154. {
  155. "data": {
  156. "text/plain": [
  157. "\n",
  158. " 0.4063 0.7378 1.2411\n",
  159. " 0.0687 0.7725 0.0634\n",
  160. " 1.1016 1.4291 0.7324\n",
  161. " 0.7604 1.2880 0.4597\n",
  162. " 0.6020 1.0124 1.0185\n",
  163. "[torch.FloatTensor of size 5x3]"
  164. ]
  165. },
  166. "execution_count": 6,
  167. "metadata": {},
  168. "output_type": "execute_result"
  169. }
  170. ],
  171. "source": [
  172. "# 加法的第二种写法\n",
  173. "t.add(x, y)"
  174. ]
  175. },
  176. {
  177. "cell_type": "code",
  178. "execution_count": 6,
  179. "metadata": {},
  180. "outputs": [
  181. {
  182. "data": {
  183. "text/plain": [
  184. "tensor([[1.7112, 1.2969, 0.3289],\n",
  185. " [0.7841, 1.0128, 0.7596],\n",
  186. " [1.1364, 1.1541, 0.8970],\n",
  187. " [0.8831, 0.7063, 0.3158],\n",
  188. " [1.5160, 1.3610, 0.8437]])"
  189. ]
  190. },
  191. "execution_count": 6,
  192. "metadata": {},
  193. "output_type": "execute_result"
  194. }
  195. ],
  196. "source": [
  197. "# 加法的第三种写法:指定加法结果的输出目标为result\n",
  198. "result = t.Tensor(5, 3) # 预先分配空间\n",
  199. "t.add(x, y, out=result) # 输入到result\n",
  200. "result"
  201. ]
  202. },
  203. {
  204. "cell_type": "code",
  205. "execution_count": 7,
  206. "metadata": {},
  207. "outputs": [
  208. {
  209. "name": "stdout",
  210. "output_type": "stream",
  211. "text": [
  212. "最初y\n",
  213. "tensor([[0.9778, 0.9240, 0.0337],\n",
  214. " [0.7461, 0.8548, 0.5141],\n",
  215. " [0.5364, 0.9908, 0.1078],\n",
  216. " [0.6880, 0.1675, 0.0010],\n",
  217. " [0.9120, 0.5539, 0.2896]])\n",
  218. "第一种加法,y的结果\n",
  219. "tensor([[0.9778, 0.9240, 0.0337],\n",
  220. " [0.7461, 0.8548, 0.5141],\n",
  221. " [0.5364, 0.9908, 0.1078],\n",
  222. " [0.6880, 0.1675, 0.0010],\n",
  223. " [0.9120, 0.5539, 0.2896]])\n",
  224. "第二种加法,y的结果\n",
  225. "tensor([[1.7112, 1.2969, 0.3289],\n",
  226. " [0.7841, 1.0128, 0.7596],\n",
  227. " [1.1364, 1.1541, 0.8970],\n",
  228. " [0.8831, 0.7063, 0.3158],\n",
  229. " [1.5160, 1.3610, 0.8437]])\n"
  230. ]
  231. }
  232. ],
  233. "source": [
  234. "print('最初y')\n",
  235. "print(y)\n",
  236. "\n",
  237. "print('第一种加法,y的结果')\n",
  238. "y.add(x) # 普通加法,不改变y的内容\n",
  239. "print(y)\n",
  240. "\n",
  241. "print('第二种加法,y的结果')\n",
  242. "y.add_(x) # inplace 加法,y变了\n",
  243. "print(y)"
  244. ]
  245. },
  246. {
  247. "cell_type": "markdown",
  248. "metadata": {},
  249. "source": [
  250. "注意,函数名后面带下划线**`_`** 的函数会修改Tensor本身。例如,`x.add_(y)`和`x.t_()`会改变 `x`,但`x.add(y)`和`x.t()`返回一个新的Tensor, 而`x`不变。"
  251. ]
  252. },
  253. {
  254. "cell_type": "code",
  255. "execution_count": 9,
  256. "metadata": {},
  257. "outputs": [
  258. {
  259. "data": {
  260. "text/plain": [
  261. "\n",
  262. " 0.2522\n",
  263. " 0.7138\n",
  264. " 0.6019\n",
  265. " 0.3675\n",
  266. " 0.5104\n",
  267. "[torch.FloatTensor of size 5]"
  268. ]
  269. },
  270. "execution_count": 9,
  271. "metadata": {},
  272. "output_type": "execute_result"
  273. }
  274. ],
  275. "source": [
  276. "# Tensor的选取操作与Numpy类似\n",
  277. "x[:, 1]"
  278. ]
  279. },
  280. {
  281. "cell_type": "markdown",
  282. "metadata": {},
  283. "source": [
  284. "Tensor还支持很多操作,包括数学运算、线性代数、选择、切片等等,其接口设计与Numpy极为相似。更详细的使用方法,会在第三章系统讲解。\n",
  285. "\n",
  286. "Tensor和Numpy的数组之间的互操作非常容易且快速。对于Tensor不支持的操作,可以先转为Numpy数组处理,之后再转回Tensor。"
  287. ]
  288. },
  289. {
  290. "cell_type": "code",
  291. "execution_count": 8,
  292. "metadata": {},
  293. "outputs": [
  294. {
  295. "data": {
  296. "text/plain": [
  297. "tensor([1., 1., 1., 1., 1.])"
  298. ]
  299. },
  300. "execution_count": 8,
  301. "metadata": {},
  302. "output_type": "execute_result"
  303. }
  304. ],
  305. "source": [
  306. "a = t.ones(5) # 新建一个全1的Tensor\n",
  307. "a"
  308. ]
  309. },
  310. {
  311. "cell_type": "code",
  312. "execution_count": 9,
  313. "metadata": {},
  314. "outputs": [
  315. {
  316. "data": {
  317. "text/plain": [
  318. "array([1., 1., 1., 1., 1.], dtype=float32)"
  319. ]
  320. },
  321. "execution_count": 9,
  322. "metadata": {},
  323. "output_type": "execute_result"
  324. }
  325. ],
  326. "source": [
  327. "b = a.numpy() # Tensor -> Numpy\n",
  328. "b"
  329. ]
  330. },
  331. {
  332. "cell_type": "code",
  333. "execution_count": 10,
  334. "metadata": {},
  335. "outputs": [
  336. {
  337. "name": "stdout",
  338. "output_type": "stream",
  339. "text": [
  340. "[1. 1. 1. 1. 1.]\n",
  341. "tensor([1., 1., 1., 1., 1.], dtype=torch.float64)\n"
  342. ]
  343. }
  344. ],
  345. "source": [
  346. "import numpy as np\n",
  347. "a = np.ones(5)\n",
  348. "b = t.from_numpy(a) # Numpy->Tensor\n",
  349. "print(a)\n",
  350. "print(b) "
  351. ]
  352. },
  353. {
  354. "cell_type": "markdown",
  355. "metadata": {},
  356. "source": [
  357. "Tensor和numpy对象共享内存,所以他们之间的转换很快,而且几乎不会消耗什么资源。但这也意味着,如果其中一个变了,另外一个也会随之改变。"
  358. ]
  359. },
  360. {
  361. "cell_type": "code",
  362. "execution_count": 13,
  363. "metadata": {},
  364. "outputs": [
  365. {
  366. "name": "stdout",
  367. "output_type": "stream",
  368. "text": [
  369. "[2. 2. 2. 2. 2.]\n",
  370. "\n",
  371. " 2\n",
  372. " 2\n",
  373. " 2\n",
  374. " 2\n",
  375. " 2\n",
  376. "[torch.DoubleTensor of size 5]\n",
  377. "\n"
  378. ]
  379. }
  380. ],
  381. "source": [
  382. "b.add_(1) # 以`_`结尾的函数会修改自身\n",
  383. "print(a)\n",
  384. "print(b) # Tensor和Numpy共享内存"
  385. ]
  386. },
  387. {
  388. "cell_type": "markdown",
  389. "metadata": {},
  390. "source": [
  391. "Tensor可通过`.cuda` 方法转为GPU的Tensor,从而享受GPU带来的加速运算。"
  392. ]
  393. },
  394. {
  395. "cell_type": "code",
  396. "execution_count": 6,
  397. "metadata": {},
  398. "outputs": [
  399. {
  400. "name": "stdout",
  401. "output_type": "stream",
  402. "text": [
  403. "tensor([[1.6605, 1.1155, 1.2724],\n",
  404. " [0.6727, 0.6428, 1.0969],\n",
  405. " [1.4898, 1.7437, 1.3258],\n",
  406. " [0.8030, 1.5725, 1.4709],\n",
  407. " [0.6847, 0.4828, 0.6183]], device='cuda:0')\n"
  408. ]
  409. }
  410. ],
  411. "source": [
  412. "# 在不支持CUDA的机器下,下一步不会运行\n",
  413. "if t.cuda.is_available():\n",
  414. " x = x.cuda()\n",
  415. " y = y.cuda()\n",
  416. " x + y\n",
  417. "print(x+y)"
  418. ]
  419. },
  420. {
  421. "cell_type": "markdown",
  422. "metadata": {},
  423. "source": [
  424. "此处可能发现GPU运算的速度并未提升太多,这是因为x和y太小且运算也较为简单,而且将数据从内存转移到显存还需要花费额外的开销。GPU的优势需在大规模数据和复杂运算下才能体现出来。\n",
  425. "\n",
  426. "### Autograd: 自动微分\n",
  427. "\n",
  428. "深度学习的算法本质上是通过反向传播求导数,而PyTorch的**`Autograd`**模块则实现了此功能。在Tensor上的所有操作,Autograd都能为它们自动提供微分,避免了手动计算导数的复杂过程。\n",
  429. " \n",
  430. "`autograd.Variable`是Autograd中的核心类,它简单封装了Tensor,并支持几乎所有Tensor有的操作。Tensor在被封装为Variable之后,可以调用它的`.backward`实现反向传播,自动计算所有梯度。Variable的数据结构如图2-6所示。\n",
  431. "\n",
  432. "\n",
  433. "![图2-6:Variable的数据结构](imgs/autograd_Variable.svg)\n",
  434. "\n",
  435. "\n",
  436. "Variable主要包含三个属性。\n",
  437. "- `data`:保存Variable所包含的Tensor\n",
  438. "- `grad`:保存`data`对应的梯度,`grad`也是个Variable,而不是Tensor,它和`data`的形状一样。\n",
  439. "- `grad_fn`:指向一个`Function`对象,这个`Function`用来反向传播计算输入的梯度,具体细节会在下一章讲解。"
  440. ]
  441. },
  442. {
  443. "cell_type": "code",
  444. "execution_count": 8,
  445. "metadata": {},
  446. "outputs": [],
  447. "source": [
  448. "from torch.autograd import Variable"
  449. ]
  450. },
  451. {
  452. "cell_type": "code",
  453. "execution_count": 9,
  454. "metadata": {
  455. "scrolled": true
  456. },
  457. "outputs": [
  458. {
  459. "data": {
  460. "text/plain": [
  461. "tensor([[1., 1.],\n",
  462. " [1., 1.]], requires_grad=True)"
  463. ]
  464. },
  465. "execution_count": 9,
  466. "metadata": {},
  467. "output_type": "execute_result"
  468. }
  469. ],
  470. "source": [
  471. "# 使用Tensor新建一个Variable\n",
  472. "x = Variable(t.ones(2, 2), requires_grad = True)\n",
  473. "x"
  474. ]
  475. },
  476. {
  477. "cell_type": "code",
  478. "execution_count": 10,
  479. "metadata": {
  480. "scrolled": true
  481. },
  482. "outputs": [
  483. {
  484. "data": {
  485. "text/plain": [
  486. "tensor(4., grad_fn=<SumBackward0>)"
  487. ]
  488. },
  489. "execution_count": 10,
  490. "metadata": {},
  491. "output_type": "execute_result"
  492. }
  493. ],
  494. "source": [
  495. "y = x.sum()\n",
  496. "y"
  497. ]
  498. },
  499. {
  500. "cell_type": "code",
  501. "execution_count": 11,
  502. "metadata": {},
  503. "outputs": [
  504. {
  505. "data": {
  506. "text/plain": [
  507. "<SumBackward0 at 0x7fb610129c88>"
  508. ]
  509. },
  510. "execution_count": 11,
  511. "metadata": {},
  512. "output_type": "execute_result"
  513. }
  514. ],
  515. "source": [
  516. "y.grad_fn"
  517. ]
  518. },
  519. {
  520. "cell_type": "code",
  521. "execution_count": 12,
  522. "metadata": {},
  523. "outputs": [],
  524. "source": [
  525. "y.backward() # 反向传播,计算梯度"
  526. ]
  527. },
  528. {
  529. "cell_type": "code",
  530. "execution_count": 13,
  531. "metadata": {},
  532. "outputs": [
  533. {
  534. "data": {
  535. "text/plain": [
  536. "tensor([[1., 1.],\n",
  537. " [1., 1.]])"
  538. ]
  539. },
  540. "execution_count": 13,
  541. "metadata": {},
  542. "output_type": "execute_result"
  543. }
  544. ],
  545. "source": [
  546. "# y = x.sum() = (x[0][0] + x[0][1] + x[1][0] + x[1][1])\n",
  547. "# 每个值的梯度都为1\n",
  548. "x.grad "
  549. ]
  550. },
  551. {
  552. "cell_type": "markdown",
  553. "metadata": {},
  554. "source": [
  555. "注意:`grad`在反向传播过程中是累加的(accumulated),**这意味着每一次运行反向传播,梯度都会累加之前的梯度,所以反向传播之前需把梯度清零。**"
  556. ]
  557. },
  558. {
  559. "cell_type": "code",
  560. "execution_count": 14,
  561. "metadata": {},
  562. "outputs": [
  563. {
  564. "data": {
  565. "text/plain": [
  566. "tensor([[2., 2.],\n",
  567. " [2., 2.]])"
  568. ]
  569. },
  570. "execution_count": 14,
  571. "metadata": {},
  572. "output_type": "execute_result"
  573. }
  574. ],
  575. "source": [
  576. "y.backward()\n",
  577. "x.grad"
  578. ]
  579. },
  580. {
  581. "cell_type": "code",
  582. "execution_count": 15,
  583. "metadata": {
  584. "scrolled": true
  585. },
  586. "outputs": [
  587. {
  588. "data": {
  589. "text/plain": [
  590. "tensor([[3., 3.],\n",
  591. " [3., 3.]])"
  592. ]
  593. },
  594. "execution_count": 15,
  595. "metadata": {},
  596. "output_type": "execute_result"
  597. }
  598. ],
  599. "source": [
  600. "y.backward()\n",
  601. "x.grad"
  602. ]
  603. },
  604. {
  605. "cell_type": "code",
  606. "execution_count": 16,
  607. "metadata": {},
  608. "outputs": [
  609. {
  610. "data": {
  611. "text/plain": [
  612. "tensor([[0., 0.],\n",
  613. " [0., 0.]])"
  614. ]
  615. },
  616. "execution_count": 16,
  617. "metadata": {},
  618. "output_type": "execute_result"
  619. }
  620. ],
  621. "source": [
  622. "# 以下划线结束的函数是inplace操作,就像add_\n",
  623. "x.grad.data.zero_()"
  624. ]
  625. },
  626. {
  627. "cell_type": "code",
  628. "execution_count": 26,
  629. "metadata": {},
  630. "outputs": [
  631. {
  632. "data": {
  633. "text/plain": [
  634. "tensor([[1., 1.],\n",
  635. " [1., 1.]])"
  636. ]
  637. },
  638. "execution_count": 26,
  639. "metadata": {},
  640. "output_type": "execute_result"
  641. }
  642. ],
  643. "source": [
  644. "y.backward()\n",
  645. "x.grad"
  646. ]
  647. },
  648. {
  649. "cell_type": "markdown",
  650. "metadata": {},
  651. "source": [
  652. "Variable和Tensor具有近乎一致的接口,在实际使用中可以无缝切换。"
  653. ]
  654. },
  655. {
  656. "cell_type": "code",
  657. "execution_count": 17,
  658. "metadata": {},
  659. "outputs": [
  660. {
  661. "name": "stdout",
  662. "output_type": "stream",
  663. "text": [
  664. "tensor([[0.5403, 0.5403, 0.5403, 0.5403, 0.5403],\n",
  665. " [0.5403, 0.5403, 0.5403, 0.5403, 0.5403],\n",
  666. " [0.5403, 0.5403, 0.5403, 0.5403, 0.5403],\n",
  667. " [0.5403, 0.5403, 0.5403, 0.5403, 0.5403]])\n"
  668. ]
  669. },
  670. {
  671. "data": {
  672. "text/plain": [
  673. "tensor([[0.5403, 0.5403, 0.5403, 0.5403, 0.5403],\n",
  674. " [0.5403, 0.5403, 0.5403, 0.5403, 0.5403],\n",
  675. " [0.5403, 0.5403, 0.5403, 0.5403, 0.5403],\n",
  676. " [0.5403, 0.5403, 0.5403, 0.5403, 0.5403]])"
  677. ]
  678. },
  679. "execution_count": 17,
  680. "metadata": {},
  681. "output_type": "execute_result"
  682. }
  683. ],
  684. "source": [
  685. "x = Variable(t.ones(4,5))\n",
  686. "y = t.cos(x)\n",
  687. "x_tensor_cos = t.cos(x.data)\n",
  688. "print(y)\n",
  689. "x_tensor_cos"
  690. ]
  691. },
  692. {
  693. "cell_type": "markdown",
  694. "metadata": {},
  695. "source": [
  696. "### 神经网络\n",
  697. "\n",
  698. "Autograd实现了反向传播功能,但是直接用来写深度学习的代码在很多情况下还是稍显复杂,torch.nn是专门为神经网络设计的模块化接口。nn构建于 Autograd之上,可用来定义和运行神经网络。nn.Module是nn中最重要的类,可把它看成是一个网络的封装,包含网络各层定义以及forward方法,调用forward(input)方法,可返回前向传播的结果。下面就以最早的卷积神经网络:LeNet为例,来看看如何用`nn.Module`实现。LeNet的网络结构如图2-7所示。\n",
  699. "\n",
  700. "![图2-7:LeNet网络结构](imgs/nn_lenet.png)\n",
  701. "\n",
  702. "这是一个基础的前向传播(feed-forward)网络: 接收输入,经过层层传递运算,得到输出。\n",
  703. "\n",
  704. "#### 定义网络\n",
  705. "\n",
  706. "定义网络时,需要继承`nn.Module`,并实现它的forward方法,把网络中具有可学习参数的层放在构造函数`__init__`中。如果某一层(如ReLU)不具有可学习的参数,则既可以放在构造函数中,也可以不放,但建议不放在其中,而在forward中使用`nn.functional`代替。"
  707. ]
  708. },
  709. {
  710. "cell_type": "code",
  711. "execution_count": 27,
  712. "metadata": {},
  713. "outputs": [
  714. {
  715. "name": "stdout",
  716. "output_type": "stream",
  717. "text": [
  718. "Net(\n",
  719. " (conv1): Conv2d(1, 6, kernel_size=(5, 5), stride=(1, 1))\n",
  720. " (conv2): Conv2d(6, 16, kernel_size=(5, 5), stride=(1, 1))\n",
  721. " (fc1): Linear(in_features=400, out_features=120, bias=True)\n",
  722. " (fc2): Linear(in_features=120, out_features=84, bias=True)\n",
  723. " (fc3): Linear(in_features=84, out_features=10, bias=True)\n",
  724. ")\n"
  725. ]
  726. }
  727. ],
  728. "source": [
  729. "import torch.nn as nn\n",
  730. "import torch.nn.functional as F\n",
  731. "\n",
  732. "class Net(nn.Module):\n",
  733. " def __init__(self):\n",
  734. " # nn.Module子类的函数必须在构造函数中执行父类的构造函数\n",
  735. " # 下式等价于nn.Module.__init__(self)\n",
  736. " super(Net, self).__init__()\n",
  737. " \n",
  738. " # 卷积层 '1'表示输入图片为单通道, '6'表示输出通道数,'5'表示卷积核为5*5\n",
  739. " self.conv1 = nn.Conv2d(1, 6, 5) \n",
  740. " # 卷积层\n",
  741. " self.conv2 = nn.Conv2d(6, 16, 5) \n",
  742. " # 仿射层/全连接层,y = Wx + b\n",
  743. " self.fc1 = nn.Linear(16*5*5, 120) \n",
  744. " self.fc2 = nn.Linear(120, 84)\n",
  745. " self.fc3 = nn.Linear(84, 10)\n",
  746. "\n",
  747. " def forward(self, x): \n",
  748. " # 卷积 -> 激活 -> 池化 \n",
  749. " x = F.max_pool2d(F.relu(self.conv1(x)), (2, 2))\n",
  750. " x = F.max_pool2d(F.relu(self.conv2(x)), 2) \n",
  751. " # reshape,‘-1’表示自适应\n",
  752. " x = x.view(x.size()[0], -1) \n",
  753. " x = F.relu(self.fc1(x))\n",
  754. " x = F.relu(self.fc2(x))\n",
  755. " x = self.fc3(x) \n",
  756. " return x\n",
  757. "\n",
  758. "net = Net()\n",
  759. "print(net)"
  760. ]
  761. },
  762. {
  763. "cell_type": "markdown",
  764. "metadata": {},
  765. "source": [
  766. "只要在nn.Module的子类中定义了forward函数,backward函数就会自动被实现(利用`Autograd`)。在`forward` 函数中可使用任何Variable支持的函数,还可以使用if、for循环、print、log等Python语法,写法和标准的Python写法一致。\n",
  767. "\n",
  768. "网络的可学习参数通过`net.parameters()`返回,`net.named_parameters`可同时返回可学习的参数及名称。"
  769. ]
  770. },
  771. {
  772. "cell_type": "code",
  773. "execution_count": 28,
  774. "metadata": {},
  775. "outputs": [
  776. {
  777. "name": "stdout",
  778. "output_type": "stream",
  779. "text": [
  780. "10\n"
  781. ]
  782. }
  783. ],
  784. "source": [
  785. "params = list(net.parameters())\n",
  786. "print(len(params))"
  787. ]
  788. },
  789. {
  790. "cell_type": "code",
  791. "execution_count": 29,
  792. "metadata": {},
  793. "outputs": [
  794. {
  795. "name": "stdout",
  796. "output_type": "stream",
  797. "text": [
  798. "conv1.weight : torch.Size([6, 1, 5, 5])\n",
  799. "conv1.bias : torch.Size([6])\n",
  800. "conv2.weight : torch.Size([16, 6, 5, 5])\n",
  801. "conv2.bias : torch.Size([16])\n",
  802. "fc1.weight : torch.Size([120, 400])\n",
  803. "fc1.bias : torch.Size([120])\n",
  804. "fc2.weight : torch.Size([84, 120])\n",
  805. "fc2.bias : torch.Size([84])\n",
  806. "fc3.weight : torch.Size([10, 84])\n",
  807. "fc3.bias : torch.Size([10])\n"
  808. ]
  809. }
  810. ],
  811. "source": [
  812. "for name,parameters in net.named_parameters():\n",
  813. " print(name,':',parameters.size())"
  814. ]
  815. },
  816. {
  817. "cell_type": "markdown",
  818. "metadata": {},
  819. "source": [
  820. "forward函数的输入和输出都是Variable,只有Variable才具有自动求导功能,而Tensor是没有的,所以在输入时,需把Tensor封装成Variable。"
  821. ]
  822. },
  823. {
  824. "cell_type": "code",
  825. "execution_count": 30,
  826. "metadata": {
  827. "scrolled": true
  828. },
  829. "outputs": [
  830. {
  831. "data": {
  832. "text/plain": [
  833. "torch.Size([1, 10])"
  834. ]
  835. },
  836. "execution_count": 30,
  837. "metadata": {},
  838. "output_type": "execute_result"
  839. }
  840. ],
  841. "source": [
  842. "input = Variable(t.randn(1, 1, 32, 32))\n",
  843. "out = net(input)\n",
  844. "out.size()"
  845. ]
  846. },
  847. {
  848. "cell_type": "code",
  849. "execution_count": 31,
  850. "metadata": {},
  851. "outputs": [],
  852. "source": [
  853. "net.zero_grad() # 所有参数的梯度清零\n",
  854. "out.backward(Variable(t.ones(1,10))) # 反向传播"
  855. ]
  856. },
  857. {
  858. "cell_type": "markdown",
  859. "metadata": {},
  860. "source": [
  861. "需要注意的是,torch.nn只支持mini-batches,不支持一次只输入一个样本,即一次必须是一个batch。但如果只想输入一个样本,则用 `input.unsqueeze(0)`将batch_size设为1。例如 `nn.Conv2d` 输入必须是4维的,形如$nSamples \\times nChannels \\times Height \\times Width$。可将nSample设为1,即$1 \\times nChannels \\times Height \\times Width$。"
  862. ]
  863. },
  864. {
  865. "cell_type": "markdown",
  866. "metadata": {},
  867. "source": [
  868. "#### 损失函数\n",
  869. "\n",
  870. "nn实现了神经网络中大多数的损失函数,例如nn.MSELoss用来计算均方误差,nn.CrossEntropyLoss用来计算交叉熵损失。"
  871. ]
  872. },
  873. {
  874. "cell_type": "code",
  875. "execution_count": 39,
  876. "metadata": {
  877. "scrolled": true
  878. },
  879. "outputs": [
  880. {
  881. "data": {
  882. "text/plain": [
  883. "tensor(28.3834, grad_fn=<MseLossBackward>)"
  884. ]
  885. },
  886. "execution_count": 39,
  887. "metadata": {},
  888. "output_type": "execute_result"
  889. }
  890. ],
  891. "source": [
  892. "\n",
  893. "output = net(input)\n",
  894. "target = Variable(t.arange(0,10).float().unsqueeze(0)) \n",
  895. "criterion = nn.MSELoss()\n",
  896. "loss = criterion(output, target)\n",
  897. "loss"
  898. ]
  899. },
  900. {
  901. "cell_type": "markdown",
  902. "metadata": {},
  903. "source": [
  904. "如果对loss进行反向传播溯源(使用`gradfn`属性),可看到它的计算图如下:\n",
  905. "\n",
  906. "```\n",
  907. "input -> conv2d -> relu -> maxpool2d -> conv2d -> relu -> maxpool2d \n",
  908. " -> view -> linear -> relu -> linear -> relu -> linear \n",
  909. " -> MSELoss\n",
  910. " -> loss\n",
  911. "```\n",
  912. "\n",
  913. "当调用`loss.backward()`时,该图会动态生成并自动微分,也即会自动计算图中参数(Parameter)的导数。"
  914. ]
  915. },
  916. {
  917. "cell_type": "code",
  918. "execution_count": 32,
  919. "metadata": {},
  920. "outputs": [
  921. {
  922. "name": "stdout",
  923. "output_type": "stream",
  924. "text": [
  925. "反向传播之前 conv1.bias的梯度\n",
  926. "Variable containing:\n",
  927. " 0\n",
  928. " 0\n",
  929. " 0\n",
  930. " 0\n",
  931. " 0\n",
  932. " 0\n",
  933. "[torch.FloatTensor of size 6]\n",
  934. "\n",
  935. "反向传播之后 conv1.bias的梯度\n",
  936. "Variable containing:\n",
  937. "1.00000e-02 *\n",
  938. " -4.2109\n",
  939. " -2.7638\n",
  940. " -5.8431\n",
  941. " 1.3761\n",
  942. " -2.4141\n",
  943. " -1.2015\n",
  944. "[torch.FloatTensor of size 6]\n",
  945. "\n"
  946. ]
  947. }
  948. ],
  949. "source": [
  950. "# 运行.backward,观察调用之前和调用之后的grad\n",
  951. "net.zero_grad() # 把net中所有可学习参数的梯度清零\n",
  952. "print('反向传播之前 conv1.bias的梯度')\n",
  953. "print(net.conv1.bias.grad)\n",
  954. "loss.backward()\n",
  955. "print('反向传播之后 conv1.bias的梯度')\n",
  956. "print(net.conv1.bias.grad)"
  957. ]
  958. },
  959. {
  960. "cell_type": "markdown",
  961. "metadata": {},
  962. "source": [
  963. "#### 优化器"
  964. ]
  965. },
  966. {
  967. "cell_type": "markdown",
  968. "metadata": {},
  969. "source": [
  970. "在反向传播计算完所有参数的梯度后,还需要使用优化方法来更新网络的权重和参数,例如随机梯度下降法(SGD)的更新策略如下:\n",
  971. "```\n",
  972. "weight = weight - learning_rate * gradient\n",
  973. "```\n",
  974. "\n",
  975. "手动实现如下:\n",
  976. "\n",
  977. "```python\n",
  978. "learning_rate = 0.01\n",
  979. "for f in net.parameters():\n",
  980. " f.data.sub_(f.grad.data * learning_rate)# inplace 减法\n",
  981. "```\n",
  982. "\n",
  983. "`torch.optim`中实现了深度学习中绝大多数的优化方法,例如RMSProp、Adam、SGD等,更便于使用,因此大多数时候并不需要手动写上述代码。"
  984. ]
  985. },
  986. {
  987. "cell_type": "code",
  988. "execution_count": 33,
  989. "metadata": {},
  990. "outputs": [],
  991. "source": [
  992. "import torch.optim as optim\n",
  993. "#新建一个优化器,指定要调整的参数和学习率\n",
  994. "optimizer = optim.SGD(net.parameters(), lr = 0.01)\n",
  995. "\n",
  996. "# 在训练过程中\n",
  997. "# 先梯度清零(与net.zero_grad()效果一样)\n",
  998. "optimizer.zero_grad() \n",
  999. "\n",
  1000. "# 计算损失\n",
  1001. "output = net(input)\n",
  1002. "loss = criterion(output, target)\n",
  1003. "\n",
  1004. "#反向传播\n",
  1005. "loss.backward()\n",
  1006. "\n",
  1007. "#更新参数\n",
  1008. "optimizer.step()"
  1009. ]
  1010. },
  1011. {
  1012. "cell_type": "markdown",
  1013. "metadata": {},
  1014. "source": [
  1015. "\n",
  1016. "\n",
  1017. "#### 数据加载与预处理\n",
  1018. "\n",
  1019. "在深度学习中数据加载及预处理是非常复杂繁琐的,但PyTorch提供了一些可极大简化和加快数据处理流程的工具。同时,对于常用的数据集,PyTorch也提供了封装好的接口供用户快速调用,这些数据集主要保存在torchvison中。\n",
  1020. "\n",
  1021. "`torchvision`实现了常用的图像数据加载功能,例如Imagenet、CIFAR10、MNIST等,以及常用的数据转换操作,这极大地方便了数据加载,并且代码具有可重用性。\n",
  1022. "\n",
  1023. "\n",
  1024. "### 小试牛刀:CIFAR-10分类\n",
  1025. "\n",
  1026. "下面我们来尝试实现对CIFAR-10数据集的分类,步骤如下: \n",
  1027. "\n",
  1028. "1. 使用torchvision加载并预处理CIFAR-10数据集\n",
  1029. "2. 定义网络\n",
  1030. "3. 定义损失函数和优化器\n",
  1031. "4. 训练网络并更新网络参数\n",
  1032. "5. 测试网络\n",
  1033. "\n",
  1034. "#### CIFAR-10数据加载及预处理\n",
  1035. "\n",
  1036. "CIFAR-10[^3]是一个常用的彩色图片数据集,它有10个类别: 'airplane', 'automobile', 'bird', 'cat', 'deer', 'dog', 'frog', 'horse', 'ship', 'truck'。每张图片都是$3\\times32\\times32$,也即3-通道彩色图片,分辨率为$32\\times32$。\n",
  1037. "\n",
  1038. "[^3]: http://www.cs.toronto.edu/~kriz/cifar.html"
  1039. ]
  1040. },
  1041. {
  1042. "cell_type": "code",
  1043. "execution_count": 3,
  1044. "metadata": {},
  1045. "outputs": [],
  1046. "source": [
  1047. "import torch as t\n",
  1048. "import torchvision as tv\n",
  1049. "import torchvision.transforms as transforms\n",
  1050. "from torchvision.transforms import ToPILImage\n",
  1051. "show = ToPILImage() # 可以把Tensor转成Image,方便可视化"
  1052. ]
  1053. },
  1054. {
  1055. "cell_type": "code",
  1056. "execution_count": 4,
  1057. "metadata": {},
  1058. "outputs": [
  1059. {
  1060. "name": "stdout",
  1061. "output_type": "stream",
  1062. "text": [
  1063. "Files already downloaded and verified\n",
  1064. "Files already downloaded and verified\n"
  1065. ]
  1066. }
  1067. ],
  1068. "source": [
  1069. "# 第一次运行程序torchvision会自动下载CIFAR-10数据集,\n",
  1070. "# 大约100M,需花费一定的时间,\n",
  1071. "# 如果已经下载有CIFAR-10,可通过root参数指定\n",
  1072. "\n",
  1073. "# 定义对数据的预处理\n",
  1074. "transform = transforms.Compose([\n",
  1075. " transforms.ToTensor(), # 转为Tensor\n",
  1076. " transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)), # 归一化\n",
  1077. " ])\n",
  1078. "\n",
  1079. "# 训练集\n",
  1080. "trainset = tv.datasets.CIFAR10(\n",
  1081. " root='../data/', \n",
  1082. " train=True, \n",
  1083. " download=True,\n",
  1084. " transform=transform)\n",
  1085. "\n",
  1086. "trainloader = t.utils.data.DataLoader(\n",
  1087. " trainset, \n",
  1088. " batch_size=4,\n",
  1089. " shuffle=True, \n",
  1090. " num_workers=2)\n",
  1091. "\n",
  1092. "# 测试集\n",
  1093. "testset = tv.datasets.CIFAR10(\n",
  1094. " '../data/',\n",
  1095. " train=False, \n",
  1096. " download=True, \n",
  1097. " transform=transform)\n",
  1098. "\n",
  1099. "testloader = t.utils.data.DataLoader(\n",
  1100. " testset,\n",
  1101. " batch_size=4, \n",
  1102. " shuffle=False,\n",
  1103. " num_workers=2)\n",
  1104. "\n",
  1105. "classes = ('plane', 'car', 'bird', 'cat',\n",
  1106. " 'deer', 'dog', 'frog', 'horse', 'ship', 'truck')"
  1107. ]
  1108. },
  1109. {
  1110. "cell_type": "markdown",
  1111. "metadata": {},
  1112. "source": [
  1113. "Dataset对象是一个数据集,可以按下标访问,返回形如(data, label)的数据。"
  1114. ]
  1115. },
  1116. {
  1117. "cell_type": "code",
  1118. "execution_count": 5,
  1119. "metadata": {},
  1120. "outputs": [
  1121. {
  1122. "name": "stdout",
  1123. "output_type": "stream",
  1124. "text": [
  1125. "ship\n"
  1126. ]
  1127. },
  1128. {
  1129. "data": {
  1130. "image/png": "iVBORw0KGgoAAAANSUhEUgAAAGQAAABkCAIAAAD/gAIDAAALVElEQVR4nO1cW3MVxxGe2d1zk46EhEASlpCEBaEo43K5UqmUKz8jpCo/MQ/Jj0j5JQllLGIw2NxsK4iLERJH17PXPEz316OZxdLoeb4XWruzvbPDfNOX6Tn64cuRUkopVde1OomqEbmsaqcZhIKbFTVJVVV5jRsWRGdRlaRc8d2Gbmtu3/ADTdM4Ql4m0tXavYs+NI1mVepjX9pUckUXlXMX7RMVcWbEwQpAppkCEACttMgsJiw1uOK1EYHvJWhtvQWqUhY0s0FrJqbGYy5V00S6Bwjf5RpdSZKU/vaorRrpldau2oRfFGdWAOJgBSBLZMLy5OS/Ey1DCakRXqAZBDZJunEayRVrkguNEpe3iSwO3DkxWCCv9R0ed1J0hvsO9uFtYLTSNg3d5QhsjTMrAHGwApBZfHJnvj2zwYtaYTLz5GzQ5qQiS2w8U6tsHmm3WeL1wmK9e+VEJ+C7ijlkN5VvZQluJM5HqTbvF0Y6zqwAxMEKQBysAGRwWH0XO7GMqCxVWNr4AawFvivQ1O6CaLvdWpNLXXNEnYgn7boC0r0Ga6ulSvrAjgVPg6pkj58vQUOt8S2W68APIhiIHvx5EAcrAJk18wkWc+ygF3EsLnLuSey9qwNudC0+hJU5Ep/Ddf0tEvqeDFwHiztYHBpXgzg0csv3VERV43EzevDnQRysAEg+S8JgvldbbEgSmBKka+mWzHyhFWJyNIJOC4hs4VJLPotttJDPzZbZy0cj7V2zqFJXuZh7lmp7zqAPSMYpV4g4HXGwAiCBdIt32tjWENlk7d/FE87f2mOHnSxqGtCQA1rtPqi8KxBOmjD0wPOQvTy45fm2JA1ASY3eeHyMOB1xsAKQiY/n08q68BteX4vQss9DQm3lfxEb4lErGhVb62mSXLXVP5ek1h0v2e0pP/HtkuJ2I9w4swIQBysAWcPTrvIqAM6I1CcdiMIzuYBHmGR4MEFWl1mQ8pNlUzhv0QolCOzxCotV3fD/OvIwKLNgVbVmd9oLRWtRLouD1q6vHGdWAOJgBUBI0VKMcDYknt8olUaNn8axd114ejeunRKC1O5mkrWDeyI6dL7DN82+oYQ1TKz8rV8A4Wd7Ik5HHKwAZFq5+/26JegTaEmQIg/jjnjLhEey0dr8hJXBbgjebO3XukkebM3aqkBkLQkc3PXKJrxMqd3hunYXkxQ+s/tVER9HHKwAZLLm+5nEVrRUyzGJavdOW+7UripgS4dSPNGIJCZtKaZI47CG1OonzC7KpPx9B+isGrwOHRUaViBpTZ5qmqb8FRFnRhysAMTBCoCUSYK24H9b4liWNvHFUZXfkrjlhYP5n1mrQ8aBcCUmPOFuUbMclf68q4QOpFYVJ+JorINwgLAdlXiOQtW6THsZ7RhInwdxsAJgbbJaRbzm38r2tgE/Nhan2fWMMZNB0IP9D9C0vf3OCEXB2StW1ZuYcl47nBxSr/hQTpL15TO482VJroZfoCDuiFecUdvBAF/WnBqLZ3fOgzhYAcjatm1cwYb2JrNVe8R/s+1Dk4QLfp/98BCq7t69a4TxeGyEPCc+Fg1Zyi++/NIIn9++bQTQcHK2B1U4QqekNApW3k1tV6UbFdi5A1hPmGZvszXiDIiDFYAsscp86N/W3BPguay1xmRmTR5/Gy6xXbh0ERdXlz+hFzEdtt+/N0JeEw0zVvr4+wdGuH79Bt868Qb+CPSKbTrTFoF3guJcvlLZpcbMOkmXt5Q2RpyGOFgBaNndCd/fYQ1ylo6Jyf8X+TGZuV5X3njzxroRpqbIBf3mm3tG6A5njXBwdER9YtZfnL3g99M6g4cKRWTZvEIoT0r8PLhStXcsPs6sAMTBCoCYlMqLqqQw1vb6pJCBnTdVOQ+CAjjR8fbtKyN8d/9b6Dw+PjbC5i+/GCHNiKTXrpOw9XLLCF999SfuFPWqKqQeIvUOi9f8OR22ffiZCvldB8mQW7UOqPzDOHBqO86sAMTBCkBWeT+XIlV6lt2QX3GQ/U9qX1aF00YOjLEvOneZrJvqiDVMFQV3U3Nz1GyOXNa8yo2w9YpoOL+wyMq5JMi22rUwivopd9wtnFq5YeOJPSfv5EyTRGsYjjhYAcgQOllzklDVYiPQLFMwgkix8hlLsaL0f3BhetoIPzx5YoT5K8vQeXBwYISpGaLh/v6+EV5vEfuevPjJCH/7+z+M8Jc7fzVCryuZUuvnlOhKXoBE2hFg2cUVtew+fNESzWKtwzkQBysAcbACkB0XpXNJ9kUsM4/cccXubJmT/52mXW5BQ//zTz8b4e3bX42wf3hohPxEJRScD96w6Q2MsLh01QhXr103wmBIy193YpJ7YvWZ/Ymyoe6N+St6aYe/y1udJeQQVVhwk9oNSOLMCkAcrABk9+7/10jwtuEldKzcU6/DfnNN/vrkgPzvJCEaNglduXdvwwgbG/eNsLu3Z4SF1TXoXF4mN+Lp06dGmGNXfmVlxQjrN24aYW2Nkl9vft02wrgQHoJZ45w2ipBTyziQxg6TtfdLRCtKey1q4SZpcC9EfBxxsAKQvf+wa6TBgCxRxkmlzLKGmoPJNSbIzDTlgvsDqkJ49uJ/dGuGMr/r69eMsDMi13x6fhE6//Xv/xhhc3PTCCWnqO7c+bMRZmcptH786LER3rwmGua2OWQTdshmt9MhIwinPpX9Hg6k4dNbNMTeKtYlv4Y64nTEwQpABpNSHNAEnp2l3FOv30W7hUt0scPcHI12jbC3T/Gw4jNqv7tJlmtpiUi3u0c03DnMofOPf/i9Eb74/DNqtks6+/zqmRnyRY8OaJvnYH/EfWeiWdVRiIgrzohhdwe0bbyAv2yj4W9UL0WcjjhYAcgSnszb22Rl9njCPzvaQbseVwpcmiVepFLaQCPe53I9mNGq5NxQ2bJBsrJ8hVRxVT4MMRzjfEz28ZPFy0bY3KRUV29yILqYUKMRkTTPmYZcnIsMV8qVvzCCRdFCQ+tcbsxnhSMOVgCyhmfdxUs0z1EOW42lWLbhY9mDASVzUQePCp5KUZuDQ7KPBVfyjXMOPGsxYTnzGDSE3cmYKSknWLocga6vXnUeV0qV7HlWnDhqeM8JDNOpe1K8kjNDkjgqeenAmlDHFM05EAcrABkog1mHdAccQqWULjkvyns5OVfN9jPKzHSEO8je8OOY+aX1Yww1NjvlPdyM+ctv2d+jDmRMzP60dC/nOG5+boaUF2TT9yoUPXT4HbKBRVcSoXQxphdVXAQMWxlnVgDiYAUgO2YaznEyBDwBv5RSyyuU1ex1aTI/evS9EV5uvTHCYEhbCUh4dlLyG3WXnUxl5yS50LxyDWuGA6mcGtIDEsbwNot9UcQBYMo1VDOTE0Y4PqRDL3VO2VosF3ND3h9ZmIcq1Dq8eU0PVtXgRHcjzoI4WAGIgxWAbOEy0fWIyzQS9iFu3/4M7VaWKTO1NyLmT0xQNvnwmIz00xfPjfDkx2eknVUhRzbJJ+GU5a9P8PrS4aieM2MSig/6tHCguPKoOIYq/KbTaIeC//l5itKHvJIOp+gtV68sGGHpCn17t2M5NLwX++7dB/5k+sA4swIQBysAGfI+MMljrtPf2JDK4offkYBULJJWq2trRrh165YRUGb14AEduHn+nBi6s7MLnb0eu/68EwNh0KFb3Q7Fz91u12lTWbWNSUqdQeHFCgf8K4urRri6St7PBU6E9bFzbKnCNm2vR+m50ZAS7nFmBSAOVgAyJGum+QDN+JBouPVqE+0O93aNAIp1mBf//PprI3Q9WoE7S0tLRsjzH6ETaazhkExkxldqjl1hm0bcAcTkCJ6VUkfHtIZ8yiVKO2wWYaw7XVI+9SkRM0mQ/hYavt+mF/X7ZD3n5siUx5kVgDhYAfg/pQ4eZ65sAxcAAAAASUVORK5CYII=\n",
  1131. "text/plain": [
  1132. "<PIL.Image.Image image mode=RGB size=100x100 at 0x7F1EC53B6588>"
  1133. ]
  1134. },
  1135. "execution_count": 5,
  1136. "metadata": {},
  1137. "output_type": "execute_result"
  1138. }
  1139. ],
  1140. "source": [
  1141. "(data, label) = trainset[100]\n",
  1142. "print(classes[label])\n",
  1143. "\n",
  1144. "# (data + 1) / 2是为了还原被归一化的数据\n",
  1145. "show((data + 1) / 2).resize((100, 100))"
  1146. ]
  1147. },
  1148. {
  1149. "cell_type": "markdown",
  1150. "metadata": {},
  1151. "source": [
  1152. "Dataloader是一个可迭代的对象,它将dataset返回的每一条数据拼接成一个batch,并提供多线程加速优化和数据打乱等操作。当程序对dataset的所有数据遍历完一遍之后,相应的对Dataloader也完成了一次迭代。"
  1153. ]
  1154. },
  1155. {
  1156. "cell_type": "code",
  1157. "execution_count": 6,
  1158. "metadata": {},
  1159. "outputs": [
  1160. {
  1161. "name": "stdout",
  1162. "output_type": "stream",
  1163. "text": [
  1164. " cat deer horse plane\n"
  1165. ]
  1166. },
  1167. {
  1168. "data": {
  1169. "image/png": "iVBORw0KGgoAAAANSUhEUgAAAZAAAABkCAIAAAAnqfEgAAA09UlEQVR4nO19SZAk53Xel0tl1l7V1dvs0wBmAAx2AiAEimCQIilrs6wIy3JYctjhi0+KUIRPvvrkcPhqRVg32WFH2AeFFZYoOyTKpCRCJkWCBEgAxDL7TE/3dE8vtW+5+fC/72VNVTWla9P/u/Trqsw///wzK/N9b/keYMWKFStWrFixYsWKFStWrFixYsWKFStWrFixYsWKlf+/xFn86Lf/2T+W75zMKNFkDGDc78i/45FR0mQq+6Ty13VdjitKPI2NksHnMV0eKpvdOQgLRikWQ6P4BdnF90UJAtmmVCoZpcCvCp4MW3AdAFkmw2ZpwoOlPL5sWQzkQOVi0SiVUpHnLisTp7II//zf/AfMyMVXNo1SXz1rlO7BUHaJZBcvkANVm3KgUi2QyfgyfhRnAApcyEd7B0bZuHDeKGkk3/WPxrJv7HK2Mlp/0DOK68n61Os1AFEki58kMScuc5tOp3Nn6vuebOLJJ54vB/ID+aRYCgCMB3KmDu+feCrDfvD2D/G4/M6//FdG+fJrbxmlfSyzvX94DOCv3v66+ff2w4dGefPNrxjlaedYxu+Jcu7zv26Ug0jO/c7tD41yeHDDKN3jXQD1alXWpCpX6v1P7hilc/xAvsoio3zx818wyo3rd41y4dIlo2zfet8ov/qLbwHYurhl/r11Z1+mtLpqlJsc/79+7S+M8upXvmyU15590ijR9n2j/Nvf//eYkd/+F79jFL3VPU8uhxvImTr8KsuyOSXgnex4sk2SJLODeK4oSSw/h6AgP4d4MpEDpforczm+bDOZyP1jfkxONuHEZQM3rMmBQln2zJNpT/UmTKc8Ndk5iuSWTibR7OnEsVyX3/29f4fHxYUVK1asnBLxl3wWy9ssLPKJjgEAL+ubfx1XHpkFvpML+oB39QEvg43G8liNJzRweJwgDDBjb4Uh7Z1yyA3EXnBdtQJ8blyYG81xOL6TAfByO27+lRUEYkYVaa8FtCmcTGbrOnw1LTNCAaRTOdz+jthEDt/5NFmQuLJN4vBt78uhx7EYKYVKCKBcF4OxnkRcBNkygyhRJGfUO5R9hzR11WZUu2k4HIGvWQApzUzP05dnxtk6/ETPja9NDuupFea4AMo0b8OCLGC708cJctyVrz65c8soMV+5e8dHAFpnG/J5IMe9+f4PjHKxJed+jldzjdbZ+lf/vlE2X31OTqSzY5S/+J9/AKDIpdgdyXJ1Y4EIo1he7Jc3zxjllRc+I1O6LVbed97+c1kMyBX5zrsfA3ju+c/KuTuPjPLBt75plMBvGWV15aJR9g9l38++9kWjTNY/kXX5fcxKgSupii6+S5zh+PP3fERLJDeTaUl5jgcgpYnk8YeZ8cZOE7GS9GrGEUdz5VYo0GD3qGSpAwCp/ELVcnf5401dudNcXw5d8NXUAkeT2QZ8gMTeFECaqkF3oiFlLSwrVqycGrEPLCtWrJwaWQIJ6011bItSqzoAJiX67cZi29H6QyEoGyXjE3A8IRih0z1OxnMHCn0fQKFApBaKoUhXMqpVOt2J5tT0dR0ZP8l96iKu5wFwGTFIErEzpxOZ9ngkSMqt1eXQBKGuR2OYs/LS5c90n+7GCaFHRsjmevKVz+jBeBRxMrKxH8qy+BUPgEuvdq0l4DGhP3hIS9ohCi5VK0bpHQsqqVYqevbmT7/fBxDRyC8QVqiikDCHHlSCgLibG8eED+PhCECrIeumoY8pgf+iXL99xyiNspzaG6+/bpRPv3ULwL37AhVrZTmLF56/YpTi4Z5RDnZ3jbLaEVd3CXLE5oogyj//+h8ZZWf7AYDLq/L5rR+/Z5RWTQDmL/7SV43y5OUto9y9e90od7YFsg0GgvTf+rmfN8pzz70MoN2RmycZDoxybiKfxEdyOYpDXjLePA8PZNqF6QDLRH0deofnCiGg3oger1QKdWWAG4sYL0rG+E+qPhqNhjEM4/De0/s2pmNef3cakJlOYgCOw9splN9qyhlkKc89476hukqIRhnIcghgUSgASDgnxcWLYi0sK1asnBqxDywrVqycGlkCCTvdtlEGPTERy2EAoMCYWsEXIKAJQUFBsE9EY9IlUitWxCB0mdyhIa1yuQSgWhM4WQg0nMc8KaZHKb5ziPw0bqUBRM9T1FMAgFRtVDmLlHPTUJpLWDEYCFyNmPRUDJmZlaOtx+TlF58wykFHhr13QxBBjZh2TMw8ZKCqALGB11uSt1IsVwAEFYYCaQyPOKXpVAaJafa7BVmfUlmWztXV4O4m4BLHmoelIVReIIIHDc2oOD9ByYCZ6HBCyKlJT4tS1qw6Xt82b7B+rwPgledeMP+eXZEo29OXn5adCbI0D2sqgyG+I+lXLhfh7gcSW5xM2gDe/sF75t8Wces/+YVfNcoLr71mlGuvvmKUvY6M75bkjP7HH/yhTOaZa0b5/OfeAuARAP7NLUn72rwuKPJi7YJR1mJZsG4iMc3f+0+/a5RnN+tYKry4izlWWaIXiBdR8+MYPNWL6Co2dBzMYDoN52kcX38vE8L5IJCNM41up3Rc8CfpSf4gcx75w9cDezz3mEeMuGKe5pEl+tvk3o4DIE71Ll0emoe1sKxYsXKKxD6wrFixcmpkCSTUGhePj7MszgBMIwkVqcHm+VpownBSLNtMUwFBil+qRWYbEm2Vy2UAtTqT+mm+akGAT6tVUxlT2pmaCanpjppKWqvXMRNJUYs6Yb6ixs6iCZNa+ZXm14UEMmFYwjJ57bWrRrmz3ZUtCfeePHfOKPcZG7qx25ZTpjF/9axsY1Lw2jyv9khAhKafJoTDKVGcmvdTAtgC1zbT+ItjJi9nMWH5xXSqRRXgtEVKCsA19sSvKiWBXWaRc5RNyOA4J4Z11taaRtnbk8TOo0MBehurGwDOr0tpy4U1Ua5eFUjoeM8aZTiVZfnff/onRgnee9soFy8JNj/bkEW4cPYqgKeeF4B2sSwobIPKB++8Z5TWplTtXHh2yyi/9Gu/bJRvf//HRrlzS4p1PnttD0CLAeWtp+S4hSPJNT18IOfVLAu2vX5fynp2+nIn7N1hCPxx0QxPF/orYBhXf27O/HXRH4i6YnLJzC5yv2nQOckYIs9kJhEjiQoJC76WyLCujuE842+JCeXiTBMGOKlMA5e8b1mRA24TEkhquHASjwGkC3B4UayFZcWKlVMjSyysp69KFsx0SL/vcALkZo4aL+omd7W0kqk0SWNFNuZLQA0rL3cEZgCCAr3yno7G8TWDA+qApBeZVpLPhCn16ZpH/JiedbXBtAp6PJHzcjL17osFUVY39kJ20pyUCzLa5qoct/WqrJvHhKyMGS4x1ydgOkyDFa2mlGHKtQ19pm5FNDNpu3hMfklY5JSFjDPQwp3wkrlIAUz53o743vNSrRti+hg96C6vS5V5XuD7s8BainIpBICMeTq+XtMTLazzGy1OUk7t1vU7MoVRAqDsyOdFKh6jOgnN2wlfxccDyWOaHom91uu1jaLBnEmcAXj9rc+bf+OurMlfv3fTKJvnxfjqj2Tdxn1ZqJXqulG+8BUp/dn96AM50MN7AGqbcjqdqWCIqCEJX/sPxIzqemLFNBixiVzZ6+jhPpZJHjtayMNSh7pDsJJnI/KUk3T+JpfymnxLOVCs151GjAZdtHSmzPvK02xHpTlABoCAJI8IpTS6tKbNDzTHivZaHhAghuNt4yYuZlHRQiBIxVpYVqxYOTViH1hWrFg5NbIEEl48R4Kno7ZRBt4AAOjXU8tNszOU8ibjeG6gSf2CfRIsFNNkGYBuR/ypmtNRqQouU5+uFotosYvrBJyDAkwxI0fjCYBBn9wSPK+AhSYBvcsFzUnJa9mVBouwa9GdaY7Ls2hWiETo9e/t0xPJ9XninACNpkcfN1FVr9OfPS6JMBCPGAcgmovpL89YKKPrr1k8RZ7j+uoKgDt7h+bfPmuDWEKPDVa0ZDnTgxJFcJGhaJHu0sDHDLWZOk0znOglnXZlDm984UtG0WS07bu7AA6PJAfq/v1t2aAo3vGXPvumTJuXu5pwShW5SWLClg8/IRFCWAZQKEswZ0jYcv+RHKgliA3PvS4reQUyWi0UV0ahQIINrn8hiwFUSrI4RyQF+6Pvfs8o62QBubb1lFE+/8JLRvnu9z82SufBMZaJGg6aH6cgUf3xM84N3pz5AKw/Y2TMoEV1v+TYkIM5iTpGZJsolhs4JvFJyF/xJGJBUpICcDOlOZGJRwpF9TGhhHTMuko53+FYf9d0KTguAF/DSpmFhFasWDn9Yh9YVqxYOTWyBBJutsQqrjOLZ1gdAQiVpY9W64Qg5ZjgUaMGoNmvgYAiI0qaQjUZTQAMmBg0HIjF2Okz4csXJSxLtKhcEaVGIgeHkE0Vkz526axQDIdFmYCvBQ2F+eBjzuhAbDshPoqTEyAhUSSYgtTtMhoyFmXcE0N6tSX462yzKQciJOz2RwAO9yWXxyNcbTE6eNBlNGogkZppVyJlIZc0cpQxQuz8V178WQDXrslR/uzP/8ooTcLtr35JCIvffV9yhXZ5EZXRQa94HMv43V4XyyK/IxJgLEp1Iti8vyO1LI/274lyuA9gvSHpVy1Glm/cFwrju4/+l1HWGGlqEk0cfyQAcJ+h2CtPXjbKOC0AuPup5E997itfMsozz79olHu3BHse9+WUD1mak2Yy/kZVbpLzrwmsazRCAB3SkHz5F/+BUb7+7W8bpcKg9sULQuC32VozyirvWxKgoD2XD6d8ivzAXQjwAcpznW9l/ii7tSZkmWRGdb8U+FtWMmWP/paE0buUSFAT/RLNBeMVTxOjsMCLBA8e102nnSmDZjYf+5shj5RNTA7gdKKELjYPy4oVK6df7APLihUrp0aWQMLGStMoTSomUKI1LlofEzE5U/vcDAiCtM1GwoyyiEUnEakLsmkEIIsFpAxisai7EzFwh33BPqORlL8orNvckCDOuTVJydtsSWhppekBWFlhNUlRwamS/ykdIPgVo5BEi2pa/62QUDnX2QcHGZNaPe66UiVToJZK5PTVCYAC3xxPnBWWcZcTOFORfQ/asm43yCE3JnauNORkJ7TM723fA/Brv/Qr5t/L6wxTrgjsOmoLCDo8FKa6ep1xQ8bF+gMCPYIFzw8xA6Vz3osT0msBDNvCpH58IJUrF87IZFbX1wEM23Jj/Pjjj4xy8drLRgm5Stu7km/5Aq/y+gtStTM+FG6/ClM6G/VNAPfvyS4txhM3tqSY5skLW0aJSSTwta8J+V/GOpJ2h0x7vPf+8v0fALjy7PPm31/49d80ys+Qj/DqmQ2ZCQQA3iSjQ28oi1Cpyt2Cx0nwc6597UDlqAdDy2Lm+fg11qxMeCVl1EvS2S2VfV+pH/w8XEhXA+vqYtJYxmQiVEY9zykASBhhVEioPyUFgooN9YhT1oeVeEs3ecsdHR4AmE5l2FTZBRfEWlhWrFg5NWIfWFasWDk1sgQSBsRHga9+/hTAhFTuWn0U0Fht1gSghUwTHXPjEWHFhJGFlEXbncEIwEFfzL9+KltGzHabJGTL5r6ahloo+3OKdspaWasACAKdpMu5Efcpk3Ruxyoduxask8RaK6weF4e4KSyIkR/4smWzLhVkBbLgN0ge7zCaNuqLsr7SANBsyuQ3G7KSJVrUXeZ8VmmW7+9ISHFCtgYFmCBQ/fCj6wCe2RKQ9dqLAmQ0IfYP//hrMj7DtAGp5fpMiewTxWsqabNRx0x3Jq1cUwKMRVk7tyXbMEHx6JEwtT/zwmcArL8mKHhIlNLcbBrl6lXBfUe3BVupYwEs+mutCuLbPCMVgoNHfQCIBNPppdRwVUFf05z//o7EDT/4kbAA3r4tHU/diYyzUgkxUzR35ZrEHL/0pS8Z5RxzcUMSW97/xjeMsncol2xtQ6Z94yHTXAHMdNZKcyYMzlEDfwTmmqY7Q/KnsTm9tzOzhflvRAZ6OJojrSx9DAWy2DN25iczJZ1k4DqYCTVqxF+5GWLmCSRsWJdxtJjg8Ec/lG67AaPMzVoNQIMI0bMEflasWPkpkCUWVveR1FIUlbPYzTDrDsyrrlnfz66f2vx9HMrzNSQtwTGrUvYGsteN4zGAg2N59mfkBigVWQwB2bdFTiUlzyoVlZ2ZBlQow5arRQAVdoEN6EHkqycnCdLnOK2l3EGoTlD3BKanUN3zXJZqhb587YBCn3VA0uRqIAZUzFfF8xdXATw6Eg9xoFSztOw89nbd3tnj3MQmrbXElJuQv0EpKiJkAO7tyS4/87q4sTsHbaPceyi+9pjXZcAUJ59xhipjBaOxuIh39/cAnKMLX2uDFnmWVe48POQpy+Sikfj7G80VAFvPihn11pelmU2Bfc+P9o+Mcv26dNaZHstoEdOIGrRZBgOxkh492AOQ0L1drMuaH+2Ie/5od4erIZ/84Pv/1yiffCo2qVaYFJj9FDoJgJ0dyRF793t/Y5Q3f+ErcoJchAJtijL79PTpud9Y28Qy0Uw3ZTLIHetsvaNtcd0FA0T/z+iGN/VzZabdqVc+oZmpju2Ye2esRtKKnIIrN2FF+c3jDEDGQfo03LS5vNKaJ3ThJ+yjo7/QJxn9UBe9AXPa2iezpTlWrFj5KRD7wLJixcqpkSWQ8OE9Jraw4U29XsEMd4JaoB7tTFf9mjn9mJL8ySCHXUlF2TkQn+4kdgFkNLmVXnk0lEFYz4BaILZinYqWzrgZU0Los0/HAYBxJIOMdNp0EIbMVamw14vOX7vmKEeadricE/XcT2kMZxBlQst2TKu7Q9q51qrglwsMUxS9NoBGRY4yUpIJ+kRbdZntE5fFOd0niPtgW/DRtM9UmqkACdNsdUKs21yT4/aYujVgrxS3JPhrQndph/wZDqvntao+jscAuj0SbHgnOkdV3v1E/OUXL8ocilPBR+P+IQA/kMt8/oKcYIEgaLIr94w/0YagbL1z2DZKEpP3cVUmY9agS6dvfyizPbwpxTo3PpRqpEe7UiT0YPcux+d9lTunZTKDKAZQZMnU994RkobLL0lA41VGNvbv3OTZs6sokf5owKDB4zLT1xZU9KfEHxeXJVuAhOrYVn+N2SLn2mTExuX8yxXxcPdJ0nA8ZksnJmAWMrkbpwPJoZtMepipxNLSH522/pzdPIyQdwQyf6ukyUTOYx4BmHLxNb9sUayFZcWKlVMj9oFlxYqVUyNL8M5wKHZgmpCeAT6APgtl1GwOGUasVNkRh00r4bIhKLP7t9lT5O49ApnYx0w0pErGtXpJAEhYEPNyheOvr0v5Ra0lympd7NiiJwca9DuYyb5xaSS7uR0tn0TEbhrjUv4JNdE15WROYkKqjKFMJVHrs+i8Tf6G47aE5FbKZzltOaNPbj4AkKRy3GaDsa0SQ7S0qMuMh5YZsskmpOXjqa0QxSfRBMDgSC7lkPHKPRbipAPBZR6nXaoKa0KayCC72+IcWGnIpVlZqQPod2XYFjOPPP/EN19Kn0BQk5jmxaKUB7368lUAXk4OIUihvy1RvP2PpHVNlTU0Zd4bet88uCf46+HtT2WbcgXA5nPS1ijal7M4vCmjjXYl5lgjkirk/HOcNjPyXIa9xnEEoEs+jyIjvy5z36rsLXS71+EocoGe3LpklEYoi4wf4zFR3OTMZzYpJMwWOtvqXZryk1SDjCkw08HX4z3pE4WlU1nSfibeiaQoJBNeWcD7MdPTpt0dzsoDWfMB+FzAIG94Qy5PbX7MWyPlJ0oCkUQKhF3M9HBwbWmOFStWfgrEPrCsWLFyamQJJFxdbxqlxMR5x/EBDIdiXo6HgobG2nNpLAZnpaaEYWVurNx4rDmg+Voo+gB6x5IcyARJBJkAkEpLBvGVfqws5muT4bZGWY5Y8hjgy0IAYcCET1r7mimX94MipcSI+G48Fmim/b6Uo25Oxiw9B+uTQmZR+oyd1aoy7SQRADhiPbpWSoxHCYAS8UWZLVFdBpjiiIEYZuudYRDncy+1eEayF0OL2Hv4CMBKUzbY2xM83u4IWllbkaRQXhYMGMYtsW9VifzlKRNSjx4dANjYEFyjeDnDyTZ8UU7tY1IX/PyvCb3BM09fBrDNrNRGQ67pp9els1bnjqRxrvDmCIkmzgSykmNfoM0x+24V2n0Aq4cyyZU+Q43rcl+V+3KCh8RuVfLxt+klGKrfgBjQdKXLpszaXZdbUdNre0cyWkz81Scv4OsvkwWwJIwOf/zNP55ZJBQYdFYSdk3K1XCbo9wM2gEMWqyz0McUKQCf9wxIrK5ga8LSpQ6rnZIVuVsmKUGoNh9jr4EbN+4B+PimZM8+T9LEl648weMwYJ0uJKbm1WyEfgzBGzBYrzNRmY+aRbEWlhUrVk6NLGtVTxdusajPeAeAS3unHDIzpU/vMsmV2m15WyZa6co2lnsP5f0zINVvIZwAQMYu6rR3jpkMNY3kQTukk2/EYT2+1jbpfW+xbNjHFDPe04h+9DEf2wMahuOpvkW1YnPB05ktf6Z7bDZZJhfYZExCZLLrliraKV4+GfJt36CX+vlnrwHotcUC8hjQ8D368nmJGnWx9Wor8tXZqUwy9cV26LFItRIWATSaq/xc3qJHtLBSfRXz3a6l1HFXNl5pyiIXWREVTX3M0IepBEGAE+RzbwkXc78v5/jaF6UEp9hcBbAK8cEXQ5nShz+WCuSDB2KU1WM5rybtWY0VXGjKsly5JMbL6HAA4AKr4je0UQz9watk3O522kZpVeUED5iMdsyrGUdaWRIAWNuUJf3q3/uyUc6fER7k40dSX6U5ekX1x9PMLCz5wQHIfe1qAemt6LustqHTWkv01dQKNKGJ47lZBlbSYLbih9dOMUSFRWvdqfwutCYPzNV6+1ti6t69swPgeCA/80PS0o0yiSo4tIG8sCljVGTFFPRo/yo1JzuDIwD7exIeGVoLy4oVKz8FYh9YVqxYOTWyLA+LVE36nfH/RYRsKZ1/PtmHayTeTRJ5Ao7HtMO72leDdiyrakajPoACwVFIh3qpqA5vgVRFuqKLbIFTKimbAksB+PA1BmfO0KTVQvTweXQVV7WhjnJaaWcgxZQn1I0XmRDEyvac+6FAtDgZCpCZkHOqEgqAvXRRmvrEgzaAY3p2o5FsqcsC1lJoN1kFAjFd+IMRIxsjjYf0AJw7K2lfIZ3KUawxEIYgCoovmHlERDMecfyuzGpzvQWgypImbWIUs7frovzWP/0tDivw5PKWnLvpxXKmIJf71qfSbfSDG8IVdbQtFTMXuRq1TWE7SOl9L5NOY60mN2FcrAGo1mXYR7vCbPXJR5KoNSTFmMPLvcbOPTu8ZC4LxTKWXpkb7CyJmL/yxZ81isKx7i69/hz2+avXjNLYENh4i3Vvc6K9ncplpra5giJz9iulqUoUPyoReTz3leFWiXir5B1VSUhy3GOOHityEJA3hX6JoChIOSzLWZ895wC4XJGUukuXBAm2zoj3vV6Se0OxZ0TqBeVGPz4SEpFety0nkk0xQ0euNFuLYi0sK1asnBqxDywrVqycGlkCCaeMLEwZp5hGJu5Gur5qg4pYjIk++BjbirnxOqOE/cJ1o7Sj20YZPRoCiMasHqDpWGQNxyYN6SAv1pHxNRDj5HSxGukDZiChdnZR/jNtFAqCIKa8zNQ+6Pje8md6ljebFCVgxUycchhGmlihgc2LckYrqxKA++DuJwAc2s8xSSYU4RbyfBzwE05b6yEYCa2xjmRSLgKoMkx5/rxgww8/YR1GprhSJjemHc7OpFhZkdk6KYno+gMA00gOd+GCgLufUJrzMlOQVBSemHKUkP8ePJJY832SC2rRfodT6uXRWy7CiJlTd6R8pLXaALC+Lud+f1foie9QmTAoHDJPLSSbhcfqkFIsaDfq0wHi+QCeufK0+XeFfL5dcgpOWD80Q6cnylki2fF0OdjJi2wSjQmyIQ3ZLHymBCqDiLLl5UuaQ0IHM6zEE6L7eCJh4hFJLHoD+SR02LSGkLDWkBtg66nnjLK5UgbQWJHlqnEBU57X8EhKvro9iZl2hqLE7PLravkOc7WMOyK/1W0jVStWrPwUiH1gWbFi5dTIEkhYaYiZp3mPpnC835MgQo+FLKknNl6ZXWEKRY2yMWanXWccBpIcEkVXygDqbH8CQsIeCyaG7LhTqgj2PJ9IZ5SnrjA/TUNmkQbIRpjpdqMxNeUwU0M6TeYTR7W5iEOSvDhdbp0mLM1xq5oTyCjeWJQaeeWnicIKmczDPSluMB1KHH6eKjbUYn0m1irPtaN5gwqQtXyHgaRi4AGokGu/wRTQ7QeCm7S0KCI2TPz5dESfk9GoTbFUAqB0jT0y+fmFE5n8ZhgQFbyzVsMMS/Q9ILlgn6nFLinrHvEE7/PemxKDTAl2AtKwnxuPAFRYYHTILMQ9Vswocz+YRhtqDQoD0zXSjYwnMmytXAPwmZdfNf+WQwHX/Uxqy3bJGZ8yAqvhVPVLrDKePicjlrsNSJWn6+b6RIKFcO6rvL8pu0npNsVyDUC5JouwvnnOKDFTcDUMff5YLuLNmwKZXaLR1TXZ68ozguvrwRTAwf4+j0soSq/HjRtCjuizWssrsH+VEmDwvsp46Q0dvkYJs5wrdF6shWXFipVTI/aBZcWKlVMjSyBha0MiGgGLCk0VXrE2nyLps5pfa80URGh4osoenGsrEvtrkMPg3nEXQJDI59WKZPoVmER3/UOpYAqKrBPUNmJklY6mDKuR8KBWbwDItCElZ5Jno/Er7QPqERFUtS0SIaEyEc5JwvaWU6KVKGbCbSZnVCTl3gpP+fBQAmG9EakRyg4AXztWMripHGlFXW1meCYsLnOYSlrOiD0JPQyTX5GZvT96/0dUPpRzpzVeIe1crMHBFiOJiYw2YcrfpQvnMVMC1meAyTmhGRoe61u1PAdXIfeEfIQZ+dR9ZiEeD9pG2SbSr69K5eDxkSCy8ZHEFgemQnBbQPeAV7nHINR4QmKQVPBRyHju+aclCHh4Q+jetWOu63gAfN4PDotMA8IxveV6LLUrsl5vRNxaZnb0nGgIPmAisubZehnD6MwTBmtgx5pByls744VIHR/AyrrQ5DdfFEz37o/lN9UhOYdDbgY3oxuEwxUIkDukbPw/3/oagDNrgjQvXdkySo3Ek00y62uPu4x8kBPS9UWMoyqT3zQ2bhxmTRdOvJ2shWXFipVTI0ssLJ/v7YSpRqaopappIMxj8vMHIX2W9PkFZPhVuqXzZ8VwOzqSF8XBcQqgTkaBc2eY3lWUfQ535SXplyQOUG+JzzII+P7RPClNLIoed5PzP+2fqvOPeYLq7VObAvxqkZmAW8poQ+YBuXQ8O+zfE9CKKfNNtXOk1Q8yzsVKFYBHXoqiGGfodpmrQusppYUy1nwWR17gzUbTKJ22mBuGQ0K7zH7ne9Lipc+Xf4vkR/lFpIWlRnG/Kzk1SowlPv085Y1lK9GJiTN5t1ptV6udXRwHM+TAA1rN2sF8dUXskfvbsj7PvC5cWr/8G79plDY5p771Z39qlFsffQDgJkejqQqfpxzFskrKdfUiG95ceU6U778vdULK1mAS1pT6LclLdkS2traMcuO2UDCnGmeg8RWcQNeQ0WmtjWM0OVDtWbAlsNaN+SzbKigxVvaY4pE3/PZtNoiNZLn2yRP9YJt9betijqW1FgeRr/a2JYmy23kE4OJ5CZR98y++ZZRPb0rZ0xfefNMoRdqMMR8Bak9hoZOxk4wxE7ZSo3VRrIVlxYqVUyP2gWXFipVTI0sMVM9TjEAn5WgCYMrKA/Vea7HLiMzCpYrY8DWCFC2mqZEwoMgsj+7+DoCDHSmmP96TXKFzZwQklpgRVmeNSIMl+IhkMt0DMXojlm8Y5OEzzaRE7rQiqy6mdO4qZV2SKsCUSarBDxAkPi5JTN47JjRVGxUOIrUUiYI4El1o2VM2JmXtxAXgaZEQL0hQ0katrJhhMYq6+VdYGtWoNY3S6YkNX2utAUg47IiTvHhJMmsmpIXQNKkmmZd9+n11ETZWZf3rlSqATlcgVZX5cScB57+LKNTtD7v8hPk4BBFrLG351X/0D43y87/8K0bRhLKXX37FKDc++QSzLUs5Pr3Y+O//+b8Y5cMPhYv5lVdfM8rVa8KvsPonAjAPD/Y5TISZihZVUl2ldQFKD/aFjWAGDosSx8tvp8XPtW1PzIuo4Qv9ATo5piISZIjJFEtd/5HwXnz3h+8Z5a03XjRK90gSvrRipn0keVjOnixLWuEV6chX9+7fw0xIbUhyizLb8e4+pFNC64c0NMSomrY+Ukq/glfHDHDWB8uiWAvLihUrp0bsA8uKFSunRpZY8iOChYjG9mgwwkzqk/Ii+AuJLRMqI9qKdW3BUhe0ePZJ6dd4aecQQLEkSHBtVSKATbI1BDtijddr3Jdoca1JnjCSPYxGZPIreAAyzS5h+xm1ltXgnEw1SqgpQnJGWqwznSy34UfsBpQXEaQCOUOaxwViq/5EJqnceJoL5qMEoFRhIlUkUTxXg1u8Rg4JwmtVpgIx5DRhbKnclDVcPX8ZwHc/EESg+V+tVtMoY22xqaEZzlZLf0psi6vT3tnZBeCQgHxtQ0BQTpf4E2Q+SCiirV/G2tqzTyTCRjjPPifpUS8xn6jKQJIi5Zeek6+effp5zOB9h3ApYbLVN77+TaPcuCWRsvNnBSmvMgx99oyc2kcfCz7yvQxApyOQp0dc7CXKPSAHWueyzESfZRtd9jnRBYwXeoimjEdnZAGJCEJTHjHWPD5l8osdAH0WMF08L11tblwXTsTDQ0F5CZFmSAK/2lh+s1caTxnl+IEsnan9ijkBpVpZWyMzJXlB1AWkzVZd8o142sxV6UbSDLMI92SxFpYVK1ZOjdgHlhUrVk6NLIGEHZawZzRoo8kUQMwsu5wXQbEhYYVyJIz67NDVF4xz1BGrcpv9vuLxBMCjtuC+aCgINF4V3DcmFfpGS2zUFiNxTaYURgPS5hG7mRiKBo+0KaOam1pi7mtDR9ZbpAwXDtkKbMDGWXOiTVgdFjBpeYrLlDmXxrYW3Kh5XGLIcnN9HUDsCmTrH7MZGgOvASRps8E6D4+xlTEJ2h02Xitkss3b338PwDvvCyTM6foIGVoM/IWsdtIFGjIcXCYkVDR98OgRgFJVPi+0BbtVSPK9KEvq7vPYWQZygQB5y9sJJ6mVRltbV41yYUPoOlzG/pgbq61CUQoKAAKukuZksogLHjNjq+wm22LtlM+uoitapMVZu14AYEpc2dkX3gufrUO7rL/RWHOFhTjxWIZtd9hB93HRYhRtmKYAKl8uLSnLZBuFhFONr2k4chwDOLorBUYTpRJkSdPmatMoh20paXqwL8rHJJ4/c5G/Tf7233j5eQAhS6b2HgkuBoPyRToWfA1gav9XXjJHyRd5juZJk+Sc7id0UrAWlhUrVk6RLM3DIj/RQGyTbqcLoMvXqVYnqPu0yNIT/UTrJ3Obhf4/ZVNdqxUB7IzknbPP7JXjjmSInFkXD2id7+/QE7tj/768OpQmOPdnuz6ASCegLx0+8kO1sGhYaWdKbc9ZY4qT75ewTHxlj/LnncmZmqJa7pvKS7hZkjM5uymnVi07AHZph47YrEVZooZkcRrQc18iD3WfVt4KU6iOyDL8V995B8DukQyrPFytGluudnlxM1l/bRSuSu5bpfVaa9TwWJoM3cDREkNKziMn/9VaFu1INAUQs+TV4QaafhWSuezqk+L9BQ3nw35bRmNNeN5D1HVnJ6l+7knu2JYlrdVYbcYq6AKXv0Ur0uNL3bBmu3QMj5nyFrIWvUvryWGPmUZNzLQJk8J6h4+wTPRHl3umlfWM47MwKk/IAk2VYu6t5ldxCiDJ2EmXPGLrLfmkHsq+jYZMcr8jk/zwUKzI47/8vlFefEYM29VGFUDKwu8tpvVlnLY+UALaXA5o6ualOWo0yl/jfU9zE9JSJFuxYuX0i31gWbFi5dTIEkioTLUHB4IBb16/iRnvrKIhNek6tHi1EalyAAyHYl5qr0e1YyvlCoC1NTIE0XNfrYqyUWGy0lTGf3R/wG3FqiwRNVSqgrbKUh5EWKGkXdwyoK9dG4X26GKPSDlUoc+10SximWifGIWEE0Kbiie7aCJbysILNyaJkiuwq9PbA7B/LG7OAfPXnKlAEoeYutsXDMIoAiL6leNMwM4hYXtvNAYwJJSbEt17ZL2aMnIyg+ZkDk9ckUS5IcGUo61cXB/AiOGR7W1xPNcbNZwg47Yg/THpp5UJw/TR7LYlznBM2miHkK1MX3gpkAnc+OgdGY0FJdrRNucUdlzMZDYleqZ6FlNByg2WbSWE2yMeeqXG24bNkEajAYBeX27FB7usaNlnjYsr98zWNQFKBV/u7bsPhcygTz7iOUkXilE0KSkn5FJqNq3RySnGFC3K/27qADhHF3vKlL1Of/z4Hqhq32LGWPq8vg+74nS/PGKRXHMdgM/DuEqRTEY5ZSjxmTaYOepZZ47Y3wb9LCS0YsXKT4PYB5YVK1ZOjSyBhNFUSw3kk+PjNoCAbGGanlNmUxYNBUYsKHFon9eaYpSC3RPHTK7pDaYAWi0BRyXmGdVrsm8rZJ19qlPS9jDgHAQJhgShyWQMwGfYokDb1+W+MZFOxjNFXrXDE2FCWfEExrUkD3URt0ZkUygxdsaErNSfhy39RJDyoNcDMCJw9hh/ySBmuWaNOZybl+fasBCKCWsZj2h6uBbZSKZODmuPAHDKuKT2wKmtSLJbnxlzfRLjalzYxLA0qKr9XzV7aFGuv/vXRkmI0Vwm5kzGQwCTMct6xuQLZFyswlFr5HQcdgWE6rLHE+bQcVYGekyYyKZdOZWTo8rksIwJWZ5OiTHZBo/dZLhwHE8AdFg25LHl7bvvvCuzbcit3jy7ZZSDA8lsuvERmamnyyFhlvMu8COFhGRrcB0t0qI7gmeUsfxIT9ZxUzxGC671N3LKEUPMHj0YdbJH9pXogkBVHQiOHwLwGK/UnsT5/JWaXHOpPA2zEsny55bp/CVKyF/oQn2SirWwrFixcmrEPrCsWLFyamRZI1WS5Gkk7pXPvIgZ3oJUe3Qo7UGslQGa0iZ/tSC7wCBdic1Ws8IYQFhg8C5gx1OPliFHc8lrrqGHcok5nzQ8p0RVJssxb2vKKBKI8rRqR0kaHKLdSs5Ip71elkPCkK19qkwOTBMx3ccMF3oMQq1tbHC2emoCSyvlMmZab2ptReqzoEFT8rjIgXb91MVmbGuDDW+ubm0CeLB/aP6tV5syPKk4XJfx0IbsEoTkHSTBYeqx/IWXpl6vAoi4bpW67FIqLV8lAMODB5wlV1vbZGYJgIA34dkVWfzVCkt/tJssM28TBvg8oonRQCnrGJ/yPACOEkhwECX1X+Et4RG2qIchY4ZtsymTWV+XTMuHhx0AD/eF5ODlt94wCsuicMAI4MGekEDssLVqiaHGp68xA/Ybb2NGlP1dZ+uoK8NR3KTrNs9qsEiEkbkOgIxwMqf+YDuoIe80dTVcZLsslwdSnKd0KVkSAcjyDrvzWdk6t9SZd8Vozq3iVs2szpDh5L5Ks2ItLCtWrJwasQ8sK1asnBpZYskrqnMJCTfPbQJot7UwTdBQoy5xpZ1dyZ3rkaRBbdAG8zkd8hNoALHZdAD4OafdfNGZlu9rhDEhmhgw/22qQTRoDMIBEBJSpUQTXkHbZNHi5ZELxD4BdC+GlpYtEYA6EwtLFe2syVxZwi7NxDt3ecsovbZgtM6xzL/klQAEFZl8maukJBNqJXtcU8IL1IjmylWpTHSZqfjsM78BYO9IcjKnrN8cMvA3JB1jSHykSLlA3D3oS6rnmFHCMAwBHLPozPVlSrW6DPIH/+3reFzikRwx0ppBhuRMBNbl7bRSlUEukU4v1jCrImaGoRXOK0O/r4Wb6RRAwg0KmspIIOMxAzMM5AL5gRw64d2ysiokfBvrAufbgymAARewUSPx5LUrcoIsvVSqxatba5y2fFUqaNL1Y6LxMjUhNHlSs0O1u5cy+TkLRXnOYrTx8f983lBBoDFH+WqzJqe8slqbO2KFHhjfhCM5pZh5yDPJ5HpK88fWuWmZpPbZmyd1xDzmVbEWlhUrVk6NLDEf1ljPPWHByt7DfQADZu7oO63Dogpt7F5mLlVeWEBFa2jU+26eqpowpalPE3aUSflujPle1eSaESdTpAvf44FMmb7yE+hxS2W+Tlma4+hbekR7iqU5+Ymc4AgMQy4drb8Se757ZO91QvlEE5fUQoy0z1ChBODSplTDVMqy7/GhpPBoX1uPZ+TrW4YvKG04OuYbb+3sWQDPPCsvfyWDznKfLp2jmhSTc0vQn61FQBSz+1iJN5jQ5NHU+tf4j3O7KGmvu5A9NJA4iRxFAyk/88bnjPLUM6/I6azJSg5GbaNo59aU5FDawF1KWHg/jHjzuMycGkznm+Devrctn0zkvuonTOvjbyQMygAmzH2bcnGqVZlAhaTAai/UAjFVcnPphH4wGdSe4mwfpw8G/eiYdXDr1fzbyIXVhe8xx62gRGI0sYplTbTkiXC2ahQbA01ZrZW2QSW/yksm4cxt4/EHPo2NUTzPArIo1sKyYsXKqRH7wLJixcqpkSWQsH0otKeDriC+9tExAJcZFgVmLc0Yq4RU2s9D+9DQTk66ZHQgPDEpSGO6qDUfX/cNyQsYEmSVWFleq4ubWV25WrUTGuhEj3pKi3fI7iNePjXln+A60Fg96Iw57PI2J0WlfCChcFBjp9hQgEB7KGd08/odOUetzCBO8ZwAQMBkNGWbW2mIT1eRoDLVVZmmNGoLf4NHE1orS/b39gCUalJkr+BaO+Jo5lfAkiYdRJdUs5OUwNe4S0tlWa6EHYm0M82i+FxkZdbWxCK3EABIIjncgC4ILZCaktn3xg2BbN/+tjj1xwyPKBmA8mf4foCZOo8h4zPquZ+SpUNb43zjm38p585hP70v4ZHDHrPqPB9AkYTLMUG9x8vhaOegHNaBc5NFSE7IV8tyAEiEzltR0eLfBQAu8ByoZ0bxsnxRoJt8xAZRY/4wA3ZFcjkZ7/FQQH4MDqtoTieg9Mf5qS0oM/N+7LyyhUQzFWthWbFi5dSIfWBZsWLl1MiyrjmM/WlOfau5AqBUJkkb8cWI6Tl37kkNvVZoF0NWP3BjBXoTbcgaRZjJeMoWaL2SvBfm/INVTdARu5j4LBxv1JoAiuU6R2P0cDrP6DYiLMpYxqHYJ+XKuCc80stsHKLkFvu7Eter1gUtDsdyRsdMaAqZC6YVRQdHHQAeU94aZBlXuBoSgCsRQ0QMpTxqEcN25Yog5cy0eGEsLVsICU0nMoMeoXqRjAsaYJ2yoH8SycaGBNElQkjIxZhmy4NfmCl/yQj04pwOJAWQMWvpnXd/aJT9fQnV1Vj6UyEvyOYq+ydtSCxbqSLVpWBSzLTPwJAAcMAbr9tuG2XrkuRJXb78ihyxKbfNjbtSVfOjDz+SM8kyAHXGmissySoSEiakIdGUQM/TXCdNblwOdtT7MVHSRy0byuGYIqb5Op5sIQAnx104mtIBelx2TVjrHsuyh3SahLxtQrojzC3nsnguvxMWIGF+RCpLsKGWH7kuZhwLP6FGx1pYVqxYOTViH1hWrFg5NbIEEmpfLLXLgrAAYGdPCJ6Vpj1j4qKnJRS0Y7URaUJsmBN6KVkaEgBBUcsj2FiJikYhnbz3kTI6EOjR4lQO7/ZRB0DQI0sfT2dJBDMPMLH6nLAoL1w4ISKjjTw1xTQmy3iv05YjssFRmXA14V5TZnjGaQRgwFBpleUpYGA0dQUsjIZisWv5SImRSo+NvxI2JTNWd85LRxSmDPQjksdrC6+UxnuR5UFaGVNg9Uzq+AASrb/3tKxKcybnpXskoUxtKxvzSvV7PQAxP3/9jTdl8rzu2p+1we5kF84J5i2WeO6MdmU5i2SEmb6/PpnvC7y4ffKyq5RYRKUg6oWrTxjl2ScvzW7p85RzzkJ2ZkvIrK/gtEhuPJeWgeadzom2jE0XCPBmwDYh4dz/M6KxcpMY7CykceqvO81RpLZllVsi1kgiKQPTPOZuRluMYM5PSSuK3Lwp2QJa5FeOm83ukthGqlasWLFixYoVK1asWLFixYoVK1asWLFixYoVK1asWLFixYqVUyf/D3PcGe48X+nJAAAAAElFTkSuQmCC\n",
  1170. "text/plain": [
  1171. "<PIL.Image.Image image mode=RGB size=400x100 at 0x7F1EC53EAB38>"
  1172. ]
  1173. },
  1174. "execution_count": 6,
  1175. "metadata": {},
  1176. "output_type": "execute_result"
  1177. }
  1178. ],
  1179. "source": [
  1180. "dataiter = iter(trainloader)\n",
  1181. "images, labels = dataiter.next() # 返回4张图片及标签\n",
  1182. "print(' '.join('%11s'%classes[labels[j]] for j in range(4)))\n",
  1183. "show(tv.utils.make_grid((images+1)/2)).resize((400,100))"
  1184. ]
  1185. },
  1186. {
  1187. "cell_type": "markdown",
  1188. "metadata": {},
  1189. "source": [
  1190. "#### 定义网络\n",
  1191. "\n",
  1192. "拷贝上面的LeNet网络,修改self.conv1第一个参数为3通道,因CIFAR-10是3通道彩图。"
  1193. ]
  1194. },
  1195. {
  1196. "cell_type": "code",
  1197. "execution_count": 7,
  1198. "metadata": {},
  1199. "outputs": [
  1200. {
  1201. "name": "stdout",
  1202. "output_type": "stream",
  1203. "text": [
  1204. "Net(\n",
  1205. " (conv1): Conv2d(3, 6, kernel_size=(5, 5), stride=(1, 1))\n",
  1206. " (conv2): Conv2d(6, 16, kernel_size=(5, 5), stride=(1, 1))\n",
  1207. " (fc1): Linear(in_features=400, out_features=120, bias=True)\n",
  1208. " (fc2): Linear(in_features=120, out_features=84, bias=True)\n",
  1209. " (fc3): Linear(in_features=84, out_features=10, bias=True)\n",
  1210. ")\n"
  1211. ]
  1212. }
  1213. ],
  1214. "source": [
  1215. "import torch.nn as nn\n",
  1216. "import torch.nn.functional as F\n",
  1217. "\n",
  1218. "class Net(nn.Module):\n",
  1219. " def __init__(self):\n",
  1220. " super(Net, self).__init__()\n",
  1221. " self.conv1 = nn.Conv2d(3, 6, 5) \n",
  1222. " self.conv2 = nn.Conv2d(6, 16, 5) \n",
  1223. " self.fc1 = nn.Linear(16*5*5, 120) \n",
  1224. " self.fc2 = nn.Linear(120, 84)\n",
  1225. " self.fc3 = nn.Linear(84, 10)\n",
  1226. "\n",
  1227. " def forward(self, x): \n",
  1228. " x = F.max_pool2d(F.relu(self.conv1(x)), (2, 2)) \n",
  1229. " x = F.max_pool2d(F.relu(self.conv2(x)), 2) \n",
  1230. " x = x.view(x.size()[0], -1) \n",
  1231. " x = F.relu(self.fc1(x))\n",
  1232. " x = F.relu(self.fc2(x))\n",
  1233. " x = self.fc3(x) \n",
  1234. " return x\n",
  1235. "\n",
  1236. "\n",
  1237. "net = Net()\n",
  1238. "print(net)"
  1239. ]
  1240. },
  1241. {
  1242. "cell_type": "markdown",
  1243. "metadata": {},
  1244. "source": [
  1245. "#### 定义损失函数和优化器(loss和optimizer)"
  1246. ]
  1247. },
  1248. {
  1249. "cell_type": "code",
  1250. "execution_count": 8,
  1251. "metadata": {},
  1252. "outputs": [],
  1253. "source": [
  1254. "from torch import optim\n",
  1255. "criterion = nn.CrossEntropyLoss() # 交叉熵损失函数\n",
  1256. "optimizer = optim.SGD(net.parameters(), lr=0.001, momentum=0.9)"
  1257. ]
  1258. },
  1259. {
  1260. "cell_type": "markdown",
  1261. "metadata": {},
  1262. "source": [
  1263. "### 训练网络\n",
  1264. "\n",
  1265. "所有网络的训练流程都是类似的,不断地执行如下流程:\n",
  1266. "\n",
  1267. "- 输入数据\n",
  1268. "- 前向传播+反向传播\n",
  1269. "- 更新参数\n"
  1270. ]
  1271. },
  1272. {
  1273. "cell_type": "code",
  1274. "execution_count": 10,
  1275. "metadata": {},
  1276. "outputs": [
  1277. {
  1278. "name": "stderr",
  1279. "output_type": "stream",
  1280. "text": [
  1281. "/home/bushuhui/.virtualenv/dl/lib/python3.5/site-packages/ipykernel_launcher.py:25: UserWarning: invalid index of a 0-dim tensor. This will be an error in PyTorch 0.5. Use tensor.item() to convert a 0-dim tensor to a Python number\n"
  1282. ]
  1283. },
  1284. {
  1285. "name": "stdout",
  1286. "output_type": "stream",
  1287. "text": [
  1288. "[1, 2000] loss: 2.210\n",
  1289. "[1, 4000] loss: 1.958\n",
  1290. "[1, 6000] loss: 1.723\n",
  1291. "[1, 8000] loss: 1.590\n",
  1292. "[1, 10000] loss: 1.532\n",
  1293. "[1, 12000] loss: 1.467\n",
  1294. "[2, 2000] loss: 1.408\n",
  1295. "[2, 4000] loss: 1.374\n",
  1296. "[2, 6000] loss: 1.345\n",
  1297. "[2, 8000] loss: 1.331\n",
  1298. "[2, 10000] loss: 1.338\n",
  1299. "[2, 12000] loss: 1.286\n",
  1300. "Finished Training\n"
  1301. ]
  1302. }
  1303. ],
  1304. "source": [
  1305. "from torch.autograd import Variable\n",
  1306. "\n",
  1307. "t.set_num_threads(8)\n",
  1308. "for epoch in range(2): \n",
  1309. " \n",
  1310. " running_loss = 0.0\n",
  1311. " for i, data in enumerate(trainloader, 0):\n",
  1312. " \n",
  1313. " # 输入数据\n",
  1314. " inputs, labels = data\n",
  1315. " inputs, labels = Variable(inputs), Variable(labels)\n",
  1316. " \n",
  1317. " # 梯度清零\n",
  1318. " optimizer.zero_grad()\n",
  1319. " \n",
  1320. " # forward + backward \n",
  1321. " outputs = net(inputs)\n",
  1322. " loss = criterion(outputs, labels)\n",
  1323. " loss.backward() \n",
  1324. " \n",
  1325. " # 更新参数 \n",
  1326. " optimizer.step()\n",
  1327. " \n",
  1328. " # 打印log信息\n",
  1329. " running_loss += loss.data[0]\n",
  1330. " if i % 2000 == 1999: # 每2000个batch打印一下训练状态\n",
  1331. " print('[%d, %5d] loss: %.3f' \\\n",
  1332. " % (epoch+1, i+1, running_loss / 2000))\n",
  1333. " running_loss = 0.0\n",
  1334. "print('Finished Training')"
  1335. ]
  1336. },
  1337. {
  1338. "cell_type": "markdown",
  1339. "metadata": {},
  1340. "source": [
  1341. "此处仅训练了2个epoch(遍历完一遍数据集称为一个epoch),来看看网络有没有效果。将测试图片输入到网络中,计算它的label,然后与实际的label进行比较。"
  1342. ]
  1343. },
  1344. {
  1345. "cell_type": "code",
  1346. "execution_count": null,
  1347. "metadata": {
  1348. "lines_to_next_cell": 2
  1349. },
  1350. "outputs": [],
  1351. "source": [
  1352. "dataiter = iter(testloader)\n",
  1353. "images, labels = dataiter.next() # 一个batch返回4张图片\n",
  1354. "print('实际的label: ', ' '.join(\\\n",
  1355. " '%08s'%classes[labels[j]] for j in range(4)))\n",
  1356. "show(tv.utils.make_grid(images / 2 - 0.5)).resize((400,100))"
  1357. ]
  1358. },
  1359. {
  1360. "cell_type": "markdown",
  1361. "metadata": {},
  1362. "source": [
  1363. "接着计算网络预测的label:"
  1364. ]
  1365. },
  1366. {
  1367. "cell_type": "code",
  1368. "execution_count": 12,
  1369. "metadata": {},
  1370. "outputs": [
  1371. {
  1372. "name": "stdout",
  1373. "output_type": "stream",
  1374. "text": [
  1375. "预测结果: cat ship ship ship\n"
  1376. ]
  1377. }
  1378. ],
  1379. "source": [
  1380. "# 计算图片在每个类别上的分数\n",
  1381. "outputs = net(Variable(images))\n",
  1382. "# 得分最高的那个类\n",
  1383. "_, predicted = t.max(outputs.data, 1)\n",
  1384. "\n",
  1385. "print('预测结果: ', ' '.join('%5s'\\\n",
  1386. " % classes[predicted[j]] for j in range(4)))"
  1387. ]
  1388. },
  1389. {
  1390. "cell_type": "markdown",
  1391. "metadata": {},
  1392. "source": [
  1393. "已经可以看出效果,准确率50%,但这只是一部分的图片,再来看看在整个测试集上的效果。"
  1394. ]
  1395. },
  1396. {
  1397. "cell_type": "code",
  1398. "execution_count": 13,
  1399. "metadata": {},
  1400. "outputs": [
  1401. {
  1402. "name": "stdout",
  1403. "output_type": "stream",
  1404. "text": [
  1405. "10000张测试集中的准确率为: 54 %\n"
  1406. ]
  1407. }
  1408. ],
  1409. "source": [
  1410. "correct = 0 # 预测正确的图片数\n",
  1411. "total = 0 # 总共的图片数\n",
  1412. "for data in testloader:\n",
  1413. " images, labels = data\n",
  1414. " outputs = net(Variable(images))\n",
  1415. " _, predicted = t.max(outputs.data, 1)\n",
  1416. " total += labels.size(0)\n",
  1417. " correct += (predicted == labels).sum()\n",
  1418. "\n",
  1419. "print('10000张测试集中的准确率为: %d %%' % (100 * correct / total))"
  1420. ]
  1421. },
  1422. {
  1423. "cell_type": "markdown",
  1424. "metadata": {},
  1425. "source": [
  1426. "训练的准确率远比随机猜测(准确率10%)好,证明网络确实学到了东西。"
  1427. ]
  1428. },
  1429. {
  1430. "cell_type": "markdown",
  1431. "metadata": {},
  1432. "source": [
  1433. "#### 在GPU训练\n",
  1434. "就像之前把Tensor从CPU转到GPU一样,模型也可以类似地从CPU转到GPU。"
  1435. ]
  1436. },
  1437. {
  1438. "cell_type": "code",
  1439. "execution_count": 44,
  1440. "metadata": {},
  1441. "outputs": [],
  1442. "source": [
  1443. "if t.cuda.is_available():\n",
  1444. " net.cuda()\n",
  1445. " images = images.cuda()\n",
  1446. " labels = labels.cuda()\n",
  1447. " output = net(Variable(images))\n",
  1448. " loss= criterion(output,Variable(labels))"
  1449. ]
  1450. },
  1451. {
  1452. "cell_type": "markdown",
  1453. "metadata": {},
  1454. "source": [
  1455. "如果发现在GPU上并没有比CPU提速很多,实际上是因为网络比较小,GPU没有完全发挥自己的真正实力。"
  1456. ]
  1457. },
  1458. {
  1459. "cell_type": "markdown",
  1460. "metadata": {},
  1461. "source": [
  1462. "对PyTorch的基础介绍至此结束。总结一下,本节主要包含以下内容。\n",
  1463. "\n",
  1464. "1. Tensor: 类似Numpy数组的数据结构,与Numpy接口类似,可方便地互相转换。\n",
  1465. "2. autograd/Variable: Variable封装了Tensor,并提供自动求导功能。\n",
  1466. "3. nn: 专门为神经网络设计的接口,提供了很多有用的功能(神经网络层,损失函数,优化器等)。\n",
  1467. "4. 神经网络训练: 以CIFAR-10分类为例演示了神经网络的训练流程,包括数据加载、网络搭建、训练及测试。\n",
  1468. "\n",
  1469. "通过本节的学习,相信读者可以体会出PyTorch具有接口简单、使用灵活等特点。从下一章开始,本书将深入系统地讲解PyTorch的各部分知识。"
  1470. ]
  1471. }
  1472. ],
  1473. "metadata": {
  1474. "kernelspec": {
  1475. "display_name": "Python 3",
  1476. "language": "python",
  1477. "name": "python3"
  1478. },
  1479. "language_info": {
  1480. "codemirror_mode": {
  1481. "name": "ipython",
  1482. "version": 3
  1483. },
  1484. "file_extension": ".py",
  1485. "mimetype": "text/x-python",
  1486. "name": "python",
  1487. "nbconvert_exporter": "python",
  1488. "pygments_lexer": "ipython3",
  1489. "version": "3.5.2"
  1490. }
  1491. },
  1492. "nbformat": 4,
  1493. "nbformat_minor": 2
  1494. }

机器学习越来越多应用到飞行器、机器人等领域,其目的是利用计算机实现类似人类的智能,从而实现装备的智能化与无人化。本课程旨在引导学生掌握机器学习的基本知识、典型方法与技术,通过具体的应用案例激发学生对该学科的兴趣,鼓励学生能够从人工智能的角度来分析、解决飞行器、机器人所面临的问题和挑战。本课程主要内容包括Python编程基础,机器学习模型,无监督学习、监督学习、深度学习基础知识与实现,并学习如何利用机器学习解决实际问题,从而全面提升自我的《综合能力》。