
PyTorch_quick_intro.ipynb

{
"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# A Quick Introduction to PyTorch\n",
"\n",
"PyTorch's clean design makes it easy to pick up. Before going deeper, this section introduces some PyTorch fundamentals, so that readers get a general feel for the framework and can use it to build a simple neural network. Some of the material may not be fully clear yet; don't dwell on it for now, as later chapters cover it in depth.\n",
"\n",
"This section is adapted from the official PyTorch tutorial[^1], with additions, deletions and edits that bring it in line with the newer PyTorch API and make it friendlier for beginners. The book also assumes basic familiarity with Numpy; for other background material, the CS231n tutorial[^2] is recommended.\n",
"\n",
"[^1]: http://pytorch.org/tutorials/beginner/deep_learning_60min_blitz.html\n",
"[^2]: http://cs231n.github.io/python-numpy-tutorial/"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 1. Tensor\n",
"\n",
"The Tensor is one of PyTorch's central data structures. It can be thought of as a high-dimensional array: a scalar (0-d), a vector (1-d), a matrix (2-d), or an array of higher rank. Tensors are similar to Numpy's ndarrays, but they can additionally be accelerated on the GPU. The Tensor interface closely mirrors that of Numpy and Matlab; the following examples show basic usage."
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {},
"outputs": [],
"source": [
"from __future__ import print_function\n",
"import torch as t"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"tensor([[3.7158e-37, 0.0000e+00, 5.7453e-44],\n",
"        [0.0000e+00, nan, 4.5745e-41],\n",
"        [1.3733e-14, 6.4076e+07, 2.0706e-19],\n",
"        [7.3909e+22, 2.4176e-12, 1.1625e+33],\n",
"        [8.9605e-01, 1.1632e+33, 5.6003e-02]])"
]
},
"execution_count": 2,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Build a 5x3 matrix; this only allocates space without initializing it\n",
"x = t.Tensor(5, 3)\n",
"x"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"tensor([[0.4157, 0.7456, 0.9620],\n",
"        [0.3965, 0.8182, 0.7723],\n",
"        [0.3705, 0.9292, 0.0063],\n",
"        [0.4054, 0.9137, 0.9611],\n",
"        [0.8307, 0.0900, 0.6887]])"
]
},
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Initialize a 2-d array randomly from the uniform distribution on [0, 1]\n",
"x = t.rand(5, 3)\n",
"x"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"torch.Size([5, 3])\n"
]
},
{
"data": {
"text/plain": [
"(3, 3)"
]
},
"execution_count": 4,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"print(x.size())  # check the shape of x\n",
"x.size()[1], x.size(1)  # number of columns; the two forms are equivalent"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"`torch.Size` is a subclass of tuple, so it supports all tuple operations, such as x.size()[0]; a short sketch follows."
]
},
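{
"cell_type": "markdown",
"metadata": {},
"source": [
"The next cell is not from the original tutorial; it is a minimal sketch of the point above: because `torch.Size` subclasses tuple, unpacking, indexing and comparison all behave exactly as they do for a plain tuple."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Illustrative sketch: torch.Size behaves like a tuple\n",
"sz = x.size()\n",
"rows, cols = sz      # tuple unpacking\n",
"print(rows, cols)    # 5 3\n",
"print(sz[0])         # indexing -> 5\n",
"print(sz == (5, 3))  # comparison with a plain tuple -> True"
]
},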
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"tensor([[0.5021, 1.2500, 1.4749],\n",
"        [0.6019, 0.9378, 1.7240],\n",
"        [1.2752, 1.3837, 0.6832],\n",
"        [1.2053, 1.4374, 1.5160],\n",
"        [0.9404, 0.8743, 0.8164]])"
]
},
"execution_count": 5,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"y = t.rand(5, 3)\n",
"# addition, first form\n",
"x + y"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"tensor([[0.5021, 1.2500, 1.4749],\n",
"        [0.6019, 0.9378, 1.7240],\n",
"        [1.2752, 1.3837, 0.6832],\n",
"        [1.2053, 1.4374, 1.5160],\n",
"        [0.9404, 0.8743, 0.8164]])"
]
},
"execution_count": 6,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# addition, second form\n",
"t.add(x, y)"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"tensor([[1.7112, 1.2969, 0.3289],\n",
"        [0.7841, 1.0128, 0.7596],\n",
"        [1.1364, 1.1541, 0.8970],\n",
"        [0.8831, 0.7063, 0.3158],\n",
"        [1.5160, 1.3610, 0.8437]])"
]
},
"execution_count": 6,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# addition, third form: write the result into a pre-allocated tensor\n",
"result = t.Tensor(5, 3)  # allocate space in advance\n",
"t.add(x, y, out=result)  # write the sum into result\n",
"result"
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"y at the start:\n",
"tensor([[0.0864, 0.5044, 0.5128],\n",
"        [0.2054, 0.1196, 0.9517],\n",
"        [0.9047, 0.4545, 0.6769],\n",
"        [0.7999, 0.5236, 0.5549],\n",
"        [0.1097, 0.7843, 0.1277]])\n",
"y after the first (ordinary) addition:\n",
"tensor([[0.0864, 0.5044, 0.5128],\n",
"        [0.2054, 0.1196, 0.9517],\n",
"        [0.9047, 0.4545, 0.6769],\n",
"        [0.7999, 0.5236, 0.5549],\n",
"        [0.1097, 0.7843, 0.1277]])\n",
"y after the second (in-place) addition:\n",
"tensor([[0.5021, 1.2500, 1.4749],\n",
"        [0.6019, 0.9378, 1.7240],\n",
"        [1.2752, 1.3837, 0.6832],\n",
"        [1.2053, 1.4374, 1.5160],\n",
"        [0.9404, 0.8743, 0.8164]])\n"
]
}
],
"source": [
"print('y at the start:')\n",
"print(y)\n",
"\n",
"print('y after the first (ordinary) addition:')\n",
"y.add(x)  # ordinary addition; y is unchanged\n",
"print(y)\n",
"\n",
"print('y after the second (in-place) addition:')\n",
"y.add_(x)  # in-place addition; y is modified\n",
"print(y)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Note: functions whose names end with an underscore **`_`** modify the Tensor in place. For example, `x.add_(y)` and `x.t_()` change `x`, whereas `x.add(y)` and `x.t()` return a new Tensor and leave `x` unchanged."
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"tensor([0.7456, 0.8182, 0.9292, 0.9137, 0.0900])"
]
},
"execution_count": 8,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Tensor indexing and selection work like Numpy's\n",
"x[:, 1]"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Tensors support many more operations, including mathematical functions, linear algebra, selection and slicing, with an interface very close to Numpy's. Chapter 3 covers them systematically.\n",
"\n",
"Converting between Tensors and Numpy arrays is easy and fast. For an operation a Tensor does not support, convert to a Numpy array first, process it there, and convert back."
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"tensor([1., 1., 1., 1., 1.])"
]
},
"execution_count": 9,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"a = t.ones(5)  # create a new Tensor filled with ones\n",
"a"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"array([1., 1., 1., 1., 1.], dtype=float32)"
]
},
"execution_count": 10,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"b = a.numpy()  # Tensor -> Numpy\n",
"b"
]
},
{
"cell_type": "code",
"execution_count": 11,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[1. 1. 1. 1. 1.]\n",
"tensor([1., 1., 1., 1., 1.], dtype=torch.float64)\n"
]
}
],
"source": [
"import numpy as np\n",
"a = np.ones(5)\n",
"b = t.from_numpy(a)  # Numpy -> Tensor\n",
"print(a)\n",
"print(b)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Tensors and Numpy arrays share their underlying memory, so converting between them is fast and costs almost nothing. It also means that when one of them changes, the other changes with it."
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[2. 2. 2. 2. 2.]\n",
"\n",
" 2\n",
" 2\n",
" 2\n",
" 2\n",
" 2\n",
"[torch.DoubleTensor of size 5]\n",
"\n"
]
}
],
"source": [
"b.add_(1)  # functions ending in `_` modify the object itself\n",
"print(a)\n",
"print(b)  # the Tensor and the Numpy array share memory"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"A Tensor can be moved to the GPU with the `.cuda` method to take advantage of GPU acceleration."
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"tensor([[0.9177, 1.9956, 2.4369],\n",
"        [0.9984, 1.7561, 2.4963],\n",
"        [1.6457, 2.3129, 0.6895],\n",
"        [1.6107, 2.3511, 2.4770],\n",
"        [1.7711, 0.9643, 1.5050]], device='cuda:0')\n"
]
}
],
"source": [
"# On a machine without CUDA support, the branch below does not run\n",
"if t.cuda.is_available():\n",
"    x = x.cuda()\n",
"    y = y.cuda()\n",
"    x + y\n",
"print(x + y)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"You may notice that the GPU does not speed this up by much. That is because x and y are tiny and the operation is trivial, while moving data from host memory to GPU memory carries its own overhead. The GPU's advantage only shows up for large data and heavy computation, as the timing sketch below suggests.\n"
]
},
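{
"cell_type": "markdown",
"metadata": {},
"source": [
"The cell below is an illustrative sketch added at this point (it is not part of the original notebook): it times a large matrix product on the CPU and, if a CUDA device is available, on the GPU. `t.cuda.synchronize()` is called because CUDA kernels run asynchronously, so timings are only meaningful once the device has finished."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import time\n",
"\n",
"a = t.rand(3000, 3000)\n",
"start = time.time()\n",
"a.mm(a)  # a large matrix product on the CPU\n",
"print('cpu :', time.time() - start)\n",
"\n",
"if t.cuda.is_available():\n",
"    a = a.cuda()\n",
"    t.cuda.synchronize()  # wait for the host-to-device copy to finish\n",
"    start = time.time()\n",
"    a.mm(a)\n",
"    t.cuda.synchronize()  # wait for the kernel to finish before reading the clock\n",
"    print('cuda:', time.time() - start)"
]
},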
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 2. Autograd: automatic differentiation\n",
"\n",
"Deep-learning algorithms are, at their core, gradient computation by backpropagation, and PyTorch's **`Autograd`** module implements exactly that. Autograd provides automatic differentiation for every operation on Tensors, sparing you the error-prone process of deriving gradients by hand.\n",
" \n",
"`autograd.Variable` is the core class of Autograd. It is a thin wrapper around a Tensor that supports almost every operation a Tensor does. Once a Tensor is wrapped in a Variable, calling its `.backward` runs backpropagation and computes all gradients automatically. The structure of a Variable is shown in Figure 2-6.\n",
"\n",
"\n",
"![Figure 2-6: structure of a Variable](imgs/autograd_Variable.svg)\n",
"\n",
"\n",
"A Variable has three main attributes.\n",
"- `data`: the Tensor wrapped by the Variable\n",
"- `grad`: the gradient of `data`; `grad` is itself a Variable rather than a Tensor, and has the same shape as `data`.\n",
"- `grad_fn`: points to a `Function` object used during backpropagation to compute the gradients of the inputs; the details are covered in the next chapter."
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {},
"outputs": [],
"source": [
"from torch.autograd import Variable"
]
},
{
"cell_type": "code",
"execution_count": 14,
"metadata": {
"scrolled": true
},
"outputs": [
{
"data": {
"text/plain": [
"tensor([[1., 1.],\n",
"        [1., 1.]], requires_grad=True)"
]
},
"execution_count": 14,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# Create a Variable from a Tensor\n",
"x = Variable(t.ones(2, 2), requires_grad=True)\n",
"x"
]
},
{
"cell_type": "code",
"execution_count": 15,
"metadata": {
"scrolled": true
},
"outputs": [
{
"data": {
"text/plain": [
"tensor(4., grad_fn=<SumBackward0>)"
]
},
"execution_count": 15,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"y = x.sum()\n",
"y"
]
},
{
"cell_type": "code",
"execution_count": 16,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"<SumBackward0 at 0x7f85680bd710>"
]
},
"execution_count": 16,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"y.grad_fn"
]
},
{
"cell_type": "code",
"execution_count": 17,
"metadata": {},
"outputs": [],
"source": [
"y.backward()  # backpropagate and compute the gradients"
]
},
{
"cell_type": "code",
"execution_count": 18,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"tensor([[1., 1.],\n",
"        [1., 1.]])"
]
},
"execution_count": 18,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# y = x.sum() = (x[0][0] + x[0][1] + x[1][0] + x[1][1]),\n",
"# so the gradient of every entry is 1\n",
"x.grad"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Note: `grad` is accumulated during backpropagation, **which means that every backward pass adds to the gradients already stored, so the gradients must be zeroed before each backward pass.**"
]
},
{
"cell_type": "code",
"execution_count": 19,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"tensor([[2., 2.],\n",
"        [2., 2.]])"
]
},
"execution_count": 19,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"y.backward()\n",
"x.grad"
]
},
{
"cell_type": "code",
"execution_count": 20,
"metadata": {
"scrolled": true
},
"outputs": [
{
"data": {
"text/plain": [
"tensor([[3., 3.],\n",
"        [3., 3.]])"
]
},
"execution_count": 20,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"y.backward()\n",
"x.grad"
]
},
{
"cell_type": "code",
"execution_count": 21,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"tensor([[0., 0.],\n",
"        [0., 0.]])"
]
},
"execution_count": 21,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# functions ending with an underscore operate in place, like add_\n",
"x.grad.data.zero_()"
]
},
{
"cell_type": "code",
"execution_count": 22,
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"tensor([[1., 1.],\n",
"        [1., 1.]])"
]
},
"execution_count": 22,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"y.backward()\n",
"x.grad"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Variable and Tensor expose nearly identical interfaces, so in practice they can be used interchangeably."
]
},
{
"cell_type": "code",
"execution_count": 24,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"tensor([[0.5403, 0.5403, 0.5403, 0.5403, 0.5403],\n",
"        [0.5403, 0.5403, 0.5403, 0.5403, 0.5403],\n",
"        [0.5403, 0.5403, 0.5403, 0.5403, 0.5403],\n",
"        [0.5403, 0.5403, 0.5403, 0.5403, 0.5403]])\n"
]
},
{
"data": {
"text/plain": [
"tensor([[0.5403, 0.5403, 0.5403, 0.5403, 0.5403],\n",
"        [0.5403, 0.5403, 0.5403, 0.5403, 0.5403],\n",
"        [0.5403, 0.5403, 0.5403, 0.5403, 0.5403],\n",
"        [0.5403, 0.5403, 0.5403, 0.5403, 0.5403]])"
]
},
"execution_count": 24,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"x = Variable(t.ones(4, 5))\n",
"y = t.cos(x)\n",
"x_tensor_cos = t.cos(x.data)\n",
"print(y)\n",
"x_tensor_cos"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 3. Neural networks\n",
"\n",
"Autograd implements backpropagation, but writing deep-learning code directly on top of it is still fairly cumbersome. torch.nn is a modular interface designed specifically for neural networks. Built on Autograd, nn can be used to define and run networks. nn.Module is the most important class in nn; think of it as a wrapper around a network that holds the definitions of the layers together with a forward method, where calling forward(input) returns the result of the forward pass. As an example, let's implement LeNet, one of the earliest convolutional neural networks, with `nn.Module`. Its architecture is shown in Figure 2-7.\n",
"\n",
"![Figure 2-7: the LeNet architecture](imgs/nn_lenet.png)\n",
"\n",
"This is a basic feed-forward network: it takes an input, passes it through layer after layer, and produces an output.\n",
"\n",
"### 3.1 Defining the network\n",
"\n",
"To define a network, subclass `nn.Module` and implement its forward method, placing the layers that have learnable parameters in the constructor `__init__`. A layer without learnable parameters (such as ReLU) may be placed in the constructor or left out; the recommendation is to leave it out and call the `nn.functional` equivalent in forward instead."
]
},
{
"cell_type": "code",
"execution_count": 25,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Net(\n",
"  (conv1): Conv2d(1, 6, kernel_size=(5, 5), stride=(1, 1))\n",
"  (conv2): Conv2d(6, 16, kernel_size=(5, 5), stride=(1, 1))\n",
"  (fc1): Linear(in_features=400, out_features=120, bias=True)\n",
"  (fc2): Linear(in_features=120, out_features=84, bias=True)\n",
"  (fc3): Linear(in_features=84, out_features=10, bias=True)\n",
")\n"
]
}
],
"source": [
"import torch.nn as nn\n",
"import torch.nn.functional as F\n",
"\n",
"class Net(nn.Module):\n",
"    def __init__(self):\n",
"        # a subclass of nn.Module must call the parent constructor\n",
"        # in its own constructor; the line below is equivalent to\n",
"        # nn.Module.__init__(self)\n",
"        super(Net, self).__init__()\n",
"        \n",
"        # conv layer: '1' input channel (single-channel images), '6' output channels, 5x5 kernels\n",
"        self.conv1 = nn.Conv2d(1, 6, 5)\n",
"        # conv layer\n",
"        self.conv2 = nn.Conv2d(6, 16, 5)\n",
"        # affine/fully connected layers, y = Wx + b\n",
"        self.fc1 = nn.Linear(16*5*5, 120)\n",
"        self.fc2 = nn.Linear(120, 84)\n",
"        self.fc3 = nn.Linear(84, 10)\n",
"\n",
"    def forward(self, x):\n",
"        # convolution -> activation -> pooling\n",
"        x = F.max_pool2d(F.relu(self.conv1(x)), (2, 2))\n",
"        x = F.max_pool2d(F.relu(self.conv2(x)), 2)\n",
"        # reshape; '-1' means infer the size\n",
"        x = x.view(x.size()[0], -1)\n",
"        x = F.relu(self.fc1(x))\n",
"        x = F.relu(self.fc2(x))\n",
"        x = self.fc3(x)\n",
"        return x\n",
"\n",
"net = Net()\n",
"print(net)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"As long as you define the forward function in an nn.Module subclass, the backward function is implemented for you automatically (via `Autograd`). Inside `forward` you may use any function that Variables support, as well as plain Python constructs such as if statements, for loops, print and logging; it reads just like standard Python.\n",
"\n",
"The learnable parameters of the network are returned by `net.parameters()`; `net.named_parameters` returns the parameters together with their names."
]
},
{
"cell_type": "code",
"execution_count": 26,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"10\n"
]
}
],
"source": [
"params = list(net.parameters())\n",
"print(len(params))"
]
},
{
"cell_type": "code",
"execution_count": 27,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"conv1.weight : torch.Size([6, 1, 5, 5])\n",
"conv1.bias : torch.Size([6])\n",
"conv2.weight : torch.Size([16, 6, 5, 5])\n",
"conv2.bias : torch.Size([16])\n",
"fc1.weight : torch.Size([120, 400])\n",
"fc1.bias : torch.Size([120])\n",
"fc2.weight : torch.Size([84, 120])\n",
"fc2.bias : torch.Size([84])\n",
"fc3.weight : torch.Size([10, 84])\n",
"fc3.bias : torch.Size([10])\n"
]
}
],
"source": [
"for name, parameters in net.named_parameters():\n",
"    print(name, ':', parameters.size())"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Both the input and the output of the forward function are Variables. Only Variables support automatic differentiation (Tensors do not), so the input Tensor must be wrapped in a Variable before being fed in."
]
},
{
"cell_type": "code",
"execution_count": 28,
"metadata": {
"scrolled": true
},
"outputs": [
{
"data": {
"text/plain": [
"torch.Size([1, 10])"
]
},
"execution_count": 28,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"input = Variable(t.randn(1, 1, 32, 32))\n",
"out = net(input)\n",
"out.size()"
]
},
{
"cell_type": "code",
"execution_count": 29,
"metadata": {},
"outputs": [],
"source": [
"net.zero_grad()  # zero the gradients of all parameters\n",
"out.backward(Variable(t.ones(1, 10)))  # backpropagate"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Note that torch.nn only supports mini-batches; it does not accept a single sample at a time, i.e. every input must be a batch. To feed a single sample anyway, use `input.unsqueeze(0)` to turn it into a batch of size 1. For example, `nn.Conv2d` requires a 4-d input of shape $nSamples \\times nChannels \\times Height \\times Width$; setting nSamples to 1 gives $1 \\times nChannels \\times Height \\times Width$, as sketched below."
]
},
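{
"cell_type": "markdown",
"metadata": {},
"source": [
"A small sketch of the `unsqueeze(0)` trick (added here for illustration; it is not part of the original text): a single $1 \\times 32 \\times 32$ sample is turned into a batch of one before being fed to `net`."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sketch: feed a single sample by adding a batch dimension of size 1\n",
"single = t.randn(1, 32, 32)            # one sample: nChannels x Height x Width\n",
"batch = Variable(single.unsqueeze(0))  # shape becomes 1 x 1 x 32 x 32\n",
"print(batch.size())                    # torch.Size([1, 1, 32, 32])\n",
"print(net(batch).size())               # torch.Size([1, 10])"
]
},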
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 3.2 Loss functions\n",
"\n",
"nn implements most of the loss functions used in neural networks, e.g. nn.MSELoss for the mean squared error and nn.CrossEntropyLoss for the cross-entropy loss."
]
},
{
"cell_type": "code",
"execution_count": 30,
"metadata": {
"scrolled": true
},
"outputs": [
{
"data": {
"text/plain": [
"tensor(28.6268, grad_fn=<MseLossBackward>)"
]
},
"execution_count": 30,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"output = net(input)\n",
"target = Variable(t.arange(0, 10).float().unsqueeze(0))\n",
"criterion = nn.MSELoss()\n",
"loss = criterion(output, target)\n",
"loss"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"If you trace loss back through the graph (via its `grad_fn` attribute), you can see its computation graph:\n",
"\n",
"```\n",
"input -> conv2d -> relu -> maxpool2d -> conv2d -> relu -> maxpool2d\n",
"      -> view -> linear -> relu -> linear -> relu -> linear\n",
"      -> MSELoss\n",
"      -> loss\n",
"```\n",
"\n",
"When `loss.backward()` is called, this graph is built dynamically and differentiated automatically, i.e. the gradients of the parameters (Parameter) in the graph are computed. A short inspection sketch follows."
]
},
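{
"cell_type": "markdown",
"metadata": {},
"source": [
"As an illustration (not part of the original notebook), the graph above can be inspected by hand: every `grad_fn` node exposes a `next_functions` attribute pointing at the nodes that produced its inputs. The exact class names printed (e.g. `MseLossBackward`, `AddmmBackward`) depend on the PyTorch version."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sketch: walk a few steps up the autograd graph\n",
"fn = loss.grad_fn               # the MSELoss node\n",
"print(fn)\n",
"fn = fn.next_functions[0][0]    # its input: the last linear layer\n",
"print(fn)\n",
"print(fn.next_functions[0][0])  # one step further back"
]
},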
{
"cell_type": "code",
"execution_count": 31,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"conv1.bias gradient before backward:\n",
"tensor([0., 0., 0., 0., 0., 0.])\n",
"conv1.bias gradient after backward:\n",
"tensor([-0.0368, 0.0240, 0.0169, 0.0118, -0.0122, -0.0259])\n"
]
}
],
"source": [
"# Run .backward and compare grad before and after the call\n",
"net.zero_grad()  # zero the gradients of every learnable parameter in net\n",
"print('conv1.bias gradient before backward:')\n",
"print(net.conv1.bias.grad)\n",
"loss.backward()\n",
"print('conv1.bias gradient after backward:')\n",
"print(net.conv1.bias.grad)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 3.3 Optimizers"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Once backpropagation has computed the gradients of all parameters, an optimization method is still needed to update the network's weights. For example, the update rule of stochastic gradient descent (SGD) is:\n",
"```\n",
"weight = weight - learning_rate * gradient\n",
"```\n",
"\n",
"Implemented by hand, this looks like:\n",
"\n",
"```python\n",
"learning_rate = 0.01\n",
"for f in net.parameters():\n",
"    f.data.sub_(f.grad.data * learning_rate)  # in-place subtraction\n",
"```\n",
"\n",
"`torch.optim` implements the vast majority of the optimization methods used in deep learning, such as RMSProp, Adam and SGD, in a more convenient form, so the loop above rarely needs to be written by hand."
]
},
{
"cell_type": "code",
"execution_count": 32,
"metadata": {},
"outputs": [],
"source": [
"import torch.optim as optim\n",
"# create an optimizer, specifying the parameters to tune and the learning rate\n",
"optimizer = optim.SGD(net.parameters(), lr=0.01)\n",
"\n",
"# inside the training loop:\n",
"# first zero the gradients (same effect as net.zero_grad())\n",
"optimizer.zero_grad()\n",
"\n",
"# compute the loss\n",
"output = net(input)\n",
"loss = criterion(output, target)\n",
"\n",
"# backpropagate\n",
"loss.backward()\n",
"\n",
"# update the parameters\n",
"optimizer.step()"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"\n",
"\n",
"### 3.4 Data loading and preprocessing\n",
"\n",
"Loading and preprocessing data is a tedious, fiddly part of deep learning, but PyTorch provides tools that greatly simplify and speed up the pipeline. For commonly used datasets, PyTorch also ships ready-made interfaces, most of them in torchvision.\n",
"\n",
"`torchvision` implements loaders for common image datasets such as Imagenet, CIFAR-10 and MNIST, along with common data transforms, which makes data loading convenient and the code reusable.\n"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## 4. Putting it to work: CIFAR-10 classification\n",
"\n",
"Let's try to classify the CIFAR-10 dataset. The steps are: \n",
"\n",
"1. Load and preprocess the CIFAR-10 dataset with torchvision\n",
"2. Define the network\n",
"3. Define the loss function and the optimizer\n",
"4. Train the network and update its parameters\n",
"5. Test the network\n",
"\n",
"### 4.1 Loading and preprocessing CIFAR-10\n",
"\n",
"CIFAR-10[^3] is a widely used dataset of color images with 10 classes: 'airplane', 'automobile', 'bird', 'cat', 'deer', 'dog', 'frog', 'horse', 'ship', 'truck'. Each image is $3\\times32\\times32$, i.e. a 3-channel color image with a resolution of $32\\times32$.\n",
"\n",
"[^3]: http://www.cs.toronto.edu/~kriz/cifar.html"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
"import torch as t\n",
"import torchvision as tv\n",
"import torchvision.transforms as transforms\n",
"from torchvision.transforms import ToPILImage\n",
"show = ToPILImage()  # converts a Tensor into an Image, handy for visualization"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Files already downloaded and verified\n",
"Files already downloaded and verified\n"
]
}
],
"source": [
"# On the first run, torchvision downloads the CIFAR-10 dataset automatically\n",
"# (about 100MB), which takes a while.\n",
"# If CIFAR-10 has already been downloaded, point the root argument at it.\n",
"\n",
"# define the data preprocessing\n",
"transform = transforms.Compose([\n",
"    transforms.ToTensor(),  # convert to Tensor\n",
"    transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5)),  # normalize\n",
"])\n",
"\n",
"# training set\n",
"trainset = tv.datasets.CIFAR10(\n",
"    root='../data/',\n",
"    train=True,\n",
"    download=True,\n",
"    transform=transform)\n",
"\n",
"trainloader = t.utils.data.DataLoader(\n",
"    trainset,\n",
"    batch_size=4,\n",
"    shuffle=True,\n",
"    num_workers=2)\n",
"\n",
"# test set\n",
"testset = tv.datasets.CIFAR10(\n",
"    '../data/',\n",
"    train=False,\n",
"    download=True,\n",
"    transform=transform)\n",
"\n",
"testloader = t.utils.data.DataLoader(\n",
"    testset,\n",
"    batch_size=4,\n",
"    shuffle=False,\n",
"    num_workers=2)\n",
"\n",
"classes = ('plane', 'car', 'bird', 'cat',\n",
"           'deer', 'dog', 'frog', 'horse', 'ship', 'truck')"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"A Dataset object is a dataset that can be indexed; indexing returns an item of the form (data, label)."
]
},
{
"cell_type": "code",
"execution_count": 5,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"ship\n"
]
},
{
"data": {
"image/png": "<base64-encoded PNG omitted: 100x100 preview of the sample image>",
"text/plain": [
"<PIL.Image.Image image mode=RGB size=100x100 at 0x7F1EC53B6588>"
]
},
"execution_count": 5,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"(data, label) = trainset[100]\n",
"print(classes[label])\n",
"\n",
"# (data + 1) / 2 undoes the normalization\n",
"show((data + 1) / 2).resize((100, 100))"
]
},
  1132. {
  1133. "cell_type": "markdown",
  1134. "metadata": {},
  1135. "source": [
  1136. "Dataloader是一个可迭代的对象,它将dataset返回的每一条数据拼接成一个batch,并提供多线程加速优化和数据打乱等操作。当程序对dataset的所有数据遍历完一遍之后,相应的对Dataloader也完成了一次迭代。"
  1137. ]
  1138. },
  1139. {
  1140. "cell_type": "code",
  1141. "execution_count": 6,
  1142. "metadata": {},
  1143. "outputs": [
  1144. {
  1145. "name": "stdout",
  1146. "output_type": "stream",
  1147. "text": [
  1148. " cat deer horse plane\n"
  1149. ]
  1150. },
  1151. {
  1152. "data": {
  1153. "image/png": "iVBORw0KGgoAAAANSUhEUgAAAZAAAABkCAIAAAAnqfEgAAA09UlEQVR4nO19SZAk53Xel0tl1l7V1dvs0wBmAAx2AiAEimCQIilrs6wIy3JYctjhi0+KUIRPvvrkcPhqRVg32WFH2AeFFZYoOyTKpCRCJkWCBEgAxDL7TE/3dE8vtW+5+fC/72VNVTWla9P/u/Trqsw///wzK/N9b/keYMWKFStWrFixYsWKFStWrFixYsWKFStWrFixYsWKlf+/xFn86Lf/2T+W75zMKNFkDGDc78i/45FR0mQq+6Ty13VdjitKPI2NksHnMV0eKpvdOQgLRikWQ6P4BdnF90UJAtmmVCoZpcCvCp4MW3AdAFkmw2ZpwoOlPL5sWQzkQOVi0SiVUpHnLisTp7II//zf/AfMyMVXNo1SXz1rlO7BUHaJZBcvkANVm3KgUi2QyfgyfhRnAApcyEd7B0bZuHDeKGkk3/WPxrJv7HK2Mlp/0DOK68n61Os1AFEki58kMScuc5tOp3Nn6vuebOLJJ54vB/ID+aRYCgCMB3KmDu+feCrDfvD2D/G4/M6//FdG+fJrbxmlfSyzvX94DOCv3v66+ff2w4dGefPNrxjlaedYxu+Jcu7zv26Ug0jO/c7tD41yeHDDKN3jXQD1alXWpCpX6v1P7hilc/xAvsoio3zx818wyo3rd41y4dIlo2zfet8ov/qLbwHYurhl/r11Z1+mtLpqlJsc/79+7S+M8upXvmyU15590ijR9n2j/Nvf//eYkd/+F79jFL3VPU8uhxvImTr8KsuyOSXgnex4sk2SJLODeK4oSSw/h6AgP4d4MpEDpforczm+bDOZyP1jfkxONuHEZQM3rMmBQln2zJNpT/UmTKc8Ndk5iuSWTibR7OnEsVyX3/29f4fHxYUVK1asnBLxl3wWy9ssLPKJjgEAL+ubfx1XHpkFvpML+oB39QEvg43G8liNJzRweJwgDDBjb4Uh7Z1yyA3EXnBdtQJ8blyYG81xOL6TAfByO27+lRUEYkYVaa8FtCmcTGbrOnw1LTNCAaRTOdz+jthEDt/5NFmQuLJN4vBt78uhx7EYKYVKCKBcF4OxnkRcBNkygyhRJGfUO5R9hzR11WZUu2k4HIGvWQApzUzP05dnxtk6/ETPja9NDuupFea4AMo0b8OCLGC708cJctyVrz65c8soMV+5e8dHAFpnG/J5IMe9+f4PjHKxJed+jldzjdbZ+lf/vlE2X31OTqSzY5S/+J9/AKDIpdgdyXJ1Y4EIo1he7Jc3zxjllRc+I1O6LVbed97+c1kMyBX5zrsfA3ju+c/KuTuPjPLBt75plMBvGWV15aJR9g9l38++9kWjTNY/kXX5fcxKgSupii6+S5zh+PP3fERLJDeTaUl5jgcgpYnk8YeZ8cZOE7GS9GrGEUdz5VYo0GD3qGSpAwCp/ELVcnf5401dudNcXw5d8NXUAkeT2QZ8gMTeFECaqkF3oiFlLSwrVqycGrEPLCtWrJwaWQIJ6011bItSqzoAJiX67cZi29H6QyEoGyXjE3A8IRih0z1OxnMHCn0fQKFApBaKoUhXMqpVOt2J5tT0dR0ZP8l96iKu5wFwGTFIErEzpxOZ9ngkSMqt1eXQBKGuR2OYs/LS5c90n+7GCaFHRsjmevKVz+jBeBRxMrKxH8qy+BUPgEuvdq0l4DGhP3hIS9ohCi5VK0bpHQsqqVYqevbmT7/fBxDRyC8QVqiikDCHHlSCgLibG8eED+PhCECrIeumoY8pgf+iXL99xyiNspzaG6+/bpRPv3ULwL37AhVrZTmLF56/YpTi4Z5RDnZ3jbLaEVd3CXLE5oogyj//+h8ZZWf7AYDLq/L5rR+/Z5RWTQDmL/7SV43y5OUto9y9e90od7YFsg0GgvTf+rmfN8pzz70MoN2RmycZDoxybiKfxEdyOYpDXjLePA8PZNqF6QDLRH0deofnCiGg3oger1QKdWWAG4sYL0rG+E+qPhqNhjEM4/De0/s2pmNef3cakJlOYgCOw9splN9qyhlkKc89476hukqIRhnIcghgUSgASDgnxcWLYi0sK1asnBqxDywrVqycGlkCCTvdtlEGPTERy2EAoMCYWsEXIKAJQUFBsE9EY9IlUitWxCB0mdyhIa1yuQSgWhM4WQg0nMc8KaZHKb5ziPw0bqUBRM9T1FMAgFRtVDmLlHPTUJpLWDEYCFyNmPRUDJmZlaOtx+TlF58wykFHhr13QxBBjZh2TMw8ZKCqALGB11uSt1IsVwAEFYYCaQyPOKXpVAaJafa7BVmfUlmWztXV4O4m4BLHmoelIVReIIIHDc2oOD9ByYCZ6HBCyKlJT4tS1qw6Xt82b7B+rwPgledeMP+eXZEo29OXn5adCbI0D2sqgyG+I+lXLhfh7gcSW5xM2gDe/sF75t8Wces/+YVfNcoLr71mlGuvvmKUvY6M75bkjP7HH/yhTOaZa0b5/OfeAuARAP7NLUn72rwuKPJi7YJR1mJZsG4iMc3f+0+/a5RnN+tYKry4izlWWaIXiBdR8+MYPNWL6Co2dBzMYDoN52kcX38vE8L5IJCNM41up3Rc8CfpSf4gcx75w9cDezz3mEeMuGKe5pEl+tvk3o4DIE71Ll0emoe1sKxYsXKKxD6wrFixcmpkCSTUGhePj7MszgBMIwkVqcHm+VpownBSLNtMUwFBil+qRWYbEm2Vy2UAtTqT+mm+akGAT6tVUxlT2pmaCanpjppKWqvXMRNJUYs6Yb6ixs6iCZNa+ZXm14UEMmFYwjJ57bWrRrmz3ZUtCfeePHfOKPcZG7qx25ZTpjF/9axsY1Lw2jyv9khAhKafJoTDKVGcmvdTAtgC1zbT+ItjJi9nMWH5xXSqRRXgtEVKCsA19sSvKiWBXWaRc5RNyOA4J4Z11taaRtnbk8TOo0MBehurGwDOr0tpy4U1Ua5eFUjoeM8aZTiVZfnff/onRgnee9soFy8JNj/bkEW4cPYqgKeeF4B2sSwobIPKB++8Z5TWplTtXHh2yyi/9Gu/bJRvf//HRrlzS4p1PnttD0CLAeWtp+S4hSPJNT18IOfVLAu2vX5fynp2+nIn7N1hCPxx0QxPF/orYBhXf27O/HXRH4i6YnLJzC5yv2nQOckYIs9kJhEjiQoJC76WyLCujuE842+JCeXiTBMGOKlMA5e8b1mRA24TEkhquHASjwGkC3B4UayFZcWKlVMjSyysp69KFsx0SL/vcALkZo4aL+omd7W0kqk0SWNFNuZLQA0rL3cEZgCCAr3yno7G8TWDA+qApBeZVpLPhCn16ZpH/JiedbXBtAp6PJHzcjL17osFUVY39kJ20pyUCzLa5qoct/WqrJvHhKyMGS4x1ydgOkyDFa2mlGHKtQ19pm5FNDNpu3hMfklY5JSFjDPQwp3wkrlIAUz53o743vNSrRti+hg96C6vS5V5XuD7s8BainIpBICMeTq+XtMTLazzGy1OUk7t1vU7MoVRAqDsyOdFKh6jOgnN2wlfxccDyWOaHom91uu1jaLBn
EmcAXj9rc+bf+OurMlfv3fTKJvnxfjqj2Tdxn1ZqJXqulG+8BUp/dn96AM50MN7AGqbcjqdqWCIqCEJX/sPxIzqemLFNBixiVzZ6+jhPpZJHjtayMNSh7pDsJJnI/KUk3T+JpfymnxLOVCs151GjAZdtHSmzPvK02xHpTlABoCAJI8IpTS6tKbNDzTHivZaHhAghuNt4yYuZlHRQiBIxVpYVqxYOTViH1hWrFg5NbIEEl48R4Kno7ZRBt4AAOjXU8tNszOU8ibjeG6gSf2CfRIsFNNkGYBuR/ypmtNRqQouU5+uFotosYvrBJyDAkwxI0fjCYBBn9wSPK+AhSYBvcsFzUnJa9mVBouwa9GdaY7Ls2hWiETo9e/t0xPJ9XninACNpkcfN1FVr9OfPS6JMBCPGAcgmovpL89YKKPrr1k8RZ7j+uoKgDt7h+bfPmuDWEKPDVa0ZDnTgxJFcJGhaJHu0sDHDLWZOk0znOglnXZlDm984UtG0WS07bu7AA6PJAfq/v1t2aAo3vGXPvumTJuXu5pwShW5SWLClg8/IRFCWAZQKEswZ0jYcv+RHKgliA3PvS4reQUyWi0UV0ahQIINrn8hiwFUSrI4RyQF+6Pvfs8o62QBubb1lFE+/8JLRvnu9z82SufBMZaJGg6aH6cgUf3xM84N3pz5AKw/Y2TMoEV1v+TYkIM5iTpGZJsolhs4JvFJyF/xJGJBUpICcDOlOZGJRwpF9TGhhHTMuko53+FYf9d0KTguAF/DSpmFhFasWDn9Yh9YVqxYOTWyBBJutsQqrjOLZ1gdAQiVpY9W64Qg5ZjgUaMGoNmvgYAiI0qaQjUZTQAMmBg0HIjF2Okz4csXJSxLtKhcEaVGIgeHkE0Vkz526axQDIdFmYCvBQ2F+eBjzuhAbDshPoqTEyAhUSSYgtTtMhoyFmXcE0N6tSX462yzKQciJOz2RwAO9yWXxyNcbTE6eNBlNGogkZppVyJlIZc0cpQxQuz8V178WQDXrslR/uzP/8ooTcLtr35JCIvffV9yhXZ5EZXRQa94HMv43V4XyyK/IxJgLEp1Iti8vyO1LI/274lyuA9gvSHpVy1Glm/cFwrju4/+l1HWGGlqEk0cfyQAcJ+h2CtPXjbKOC0AuPup5E997itfMsozz79olHu3BHse9+WUD1mak2Yy/kZVbpLzrwmsazRCAB3SkHz5F/+BUb7+7W8bpcKg9sULQuC32VozyirvWxKgoD2XD6d8ivzAXQjwAcpznW9l/ii7tSZkmWRGdb8U+FtWMmWP/paE0buUSFAT/RLNBeMVTxOjsMCLBA8e102nnSmDZjYf+5shj5RNTA7gdKKELjYPy4oVK6df7APLihUrp0aWQMLGStMoTSomUKI1LlofEzE5U/vcDAiCtM1GwoyyiEUnEakLsmkEIIsFpAxisai7EzFwh33BPqORlL8orNvckCDOuTVJydtsSWhppekBWFlhNUlRwamS/ykdIPgVo5BEi2pa/62QUDnX2QcHGZNaPe66UiVToJZK5PTVCYAC3xxPnBWWcZcTOFORfQ/asm43yCE3JnauNORkJ7TM723fA/Brv/Qr5t/L6wxTrgjsOmoLCDo8FKa6ep1xQ8bF+gMCPYIFzw8xA6Vz3osT0msBDNvCpH58IJUrF87IZFbX1wEM23Jj/Pjjj4xy8drLRgm5Stu7km/5Aq/y+gtStTM+FG6/ClM6G/VNAPfvyS4txhM3tqSY5skLW0aJSSTwta8J+V/GOpJ2h0x7vPf+8v0fALjy7PPm31/49d80ys+Qj/DqmQ2ZCQQA3iSjQ28oi1Cpyt2Cx0nwc6597UDlqAdDy2Lm+fg11qxMeCVl1EvS2S2VfV+pH/w8XEhXA+vqYtJYxmQiVEY9zykASBhhVEioPyUFgooN9YhT1oeVeEs3ecsdHR4AmE5l2FTZBRfEWlhWrFg5NWIfWFasWDk1sgQSBsRHga9+/hTAhFTuWn0U0Fht1gSghUwTHXPjEWHFhJGFlEXbncEIwEFfzL9+KltGzHabJGTL5r6ahloo+3OKdspaWasACAKdpMu5Efcpk3Ruxyoduxask8RaK6weF4e4KSyIkR/4smWzLhVkBbLgN0ge7zCaNuqLsr7SANBsyuQ3G7KSJVrUXeZ8VmmW7+9ISHFCtgYFmCBQ/fCj6wCe2RKQ9dqLAmQ0IfYP//hrMj7DtAGp5fpMiewTxWsqabNRx0x3Jq1cUwKMRVk7tyXbMEHx6JEwtT/zwmcArL8mKHhIlNLcbBrl6lXBfUe3BVupYwEs+mutCuLbPCMVgoNHfQCIBNPppdRwVUFf05z//o7EDT/4kbAA3r4tHU/diYyzUgkxUzR35ZrEHL/0pS8Z5RxzcUMSW97/xjeMsncol2xtQ6Z94yHTXAHMdNZKcyYMzlEDfwTmmqY7Q/KnsTm9tzOzhflvRAZ6OJojrSx9DAWy2DN25iczJZ1k4DqYCTVqxF+5GWLmCSRsWJdxtJjg8Ec/lG67AaPMzVoNQIMI0bMEflasWPkpkCUWVveR1FIUlbPYzTDrDsyrrlnfz66f2vx9HMrzNSQtwTGrUvYGsteN4zGAg2N59mfkBigVWQwB2bdFTiUlzyoVlZ2ZBlQow5arRQAVdoEN6EHkqycnCdLnOK2l3EGoTlD3BKanUN3zXJZqhb587YBCn3VA0uRqIAZUzFfF8xdXATw6Eg9xoFSztOw89nbd3tnj3MQmrbXElJuQv0EpKiJkAO7tyS4/87q4sTsHbaPceyi+9pjXZcAUJ59xhipjBaOxuIh39/cAnKMLX2uDFnmWVe48POQpy+Sikfj7G80VAFvPihn11pelmU2Bfc+P9o+Mcv26dNaZHstoEdOIGrRZBgOxkh492AOQ0L1drMuaH+2Ie/5od4erIZ/84Pv/1yiffCo2qVaYFJj9FDoJgJ0dyRF793t/Y5Q3f+ErcoJchAJtijL79PTpud9Y28Qy0Uw3ZTLIHetsvaNtcd0FA0T/z+iGN/VzZabdqVc+oZmpju2Ye2esRtKKnIIrN2FF+c3jDEDGQfo03LS5vNKaJ3ThJ+yjo7/QJxn9UBe9AXPa2iezpTlWrFj5KRD7wLJixcqpkSWQ8OE9Jraw4U29XsEMd4JaoB7tTFf9mjn9mJL8ySCHXUlF2TkQn+4kdgFkNLmVXnk0lEFYz4BaILZinYqWzrgZU0Los0/HAYBxJIOMdNp0EIbMVamw14vOX7vmKEeadricE/XcT2kMZxBlQst2TKu7Q9q51qrglwsMUxS9NoBGRY4yUpIJ+kRbdZntE5fFOd0niPtgW/DRtM9UmqkACdNsdUKs21yT4/aYujVgrxS3JPhrQndph/wZDqvntao+jscAuj0SbHgnOkdV3v1E/OUXL8ocilPBR+P+IQA/kMt8/oKcYIEgaLIr94w/0YagbL1z2DZKEpP3cVUmY9agS6dvfyizPbwpxTo3PpRqpEe7UiT0YPcux+d9lTunZTKDKAZQZMnU994RkobLL0lA41VGNvbv3OTZs6sokf5owKDB4zLT1xZU9KfEHxeX
JVuAhOrYVn+N2SLn2mTExuX8yxXxcPdJ0nA8ZksnJmAWMrkbpwPJoZtMepipxNLSH522/pzdPIyQdwQyf6ukyUTOYx4BmHLxNb9sUayFZcWKlVMj9oFlxYqVUyNL8M5wKHZgmpCeAT6APgtl1GwOGUasVNkRh00r4bIhKLP7t9lT5O49ApnYx0w0pErGtXpJAEhYEPNyheOvr0v5Ra0lympd7NiiJwca9DuYyb5xaSS7uR0tn0TEbhrjUv4JNdE15WROYkKqjKFMJVHrs+i8Tf6G47aE5FbKZzltOaNPbj4AkKRy3GaDsa0SQ7S0qMuMh5YZsskmpOXjqa0QxSfRBMDgSC7lkPHKPRbipAPBZR6nXaoKa0KayCC72+IcWGnIpVlZqQPod2XYFjOPPP/EN19Kn0BQk5jmxaKUB7368lUAXk4OIUihvy1RvP2PpHVNlTU0Zd4bet88uCf46+HtT2WbcgXA5nPS1ijal7M4vCmjjXYl5lgjkirk/HOcNjPyXIa9xnEEoEs+jyIjvy5z36rsLXS71+EocoGe3LpklEYoi4wf4zFR3OTMZzYpJMwWOtvqXZryk1SDjCkw08HX4z3pE4WlU1nSfibeiaQoJBNeWcD7MdPTpt0dzsoDWfMB+FzAIG94Qy5PbX7MWyPlJ0oCkUQKhF3M9HBwbWmOFStWfgrEPrCsWLFyamQJJFxdbxqlxMR5x/EBDIdiXo6HgobG2nNpLAZnpaaEYWVurNx4rDmg+Voo+gB6x5IcyARJBJkAkEpLBvGVfqws5muT4bZGWY5Y8hjgy0IAYcCET1r7mimX94MipcSI+G48Fmim/b6Uo25Oxiw9B+uTQmZR+oyd1aoy7SQRADhiPbpWSoxHCYAS8UWZLVFdBpjiiIEYZuudYRDncy+1eEayF0OL2Hv4CMBKUzbY2xM83u4IWllbkaRQXhYMGMYtsW9VifzlKRNSjx4dANjYEFyjeDnDyTZ8UU7tY1IX/PyvCb3BM09fBrDNrNRGQ67pp9els1bnjqRxrvDmCIkmzgSykmNfoM0x+24V2n0Aq4cyyZU+Q43rcl+V+3KCh8RuVfLxt+klGKrfgBjQdKXLpszaXZdbUdNre0cyWkz81Scv4OsvkwWwJIwOf/zNP55ZJBQYdFYSdk3K1XCbo9wM2gEMWqyz0McUKQCf9wxIrK5ga8LSpQ6rnZIVuVsmKUGoNh9jr4EbN+4B+PimZM8+T9LEl648weMwYJ0uJKbm1WyEfgzBGzBYrzNRmY+aRbEWlhUrVk6NLGtVTxdusajPeAeAS3unHDIzpU/vMsmV2m15WyZa6co2lnsP5f0zINVvIZwAQMYu6rR3jpkMNY3kQTukk2/EYT2+1jbpfW+xbNjHFDPe04h+9DEf2wMahuOpvkW1YnPB05ktf6Z7bDZZJhfYZExCZLLrliraKV4+GfJt36CX+vlnrwHotcUC8hjQ8D368nmJGnWx9Wor8tXZqUwy9cV26LFItRIWATSaq/xc3qJHtLBSfRXz3a6l1HFXNl5pyiIXWREVTX3M0IepBEGAE+RzbwkXc78v5/jaF6UEp9hcBbAK8cEXQ5nShz+WCuSDB2KU1WM5rybtWY0VXGjKsly5JMbL6HAA4AKr4je0UQz9watk3O522kZpVeUED5iMdsyrGUdaWRIAWNuUJf3q3/uyUc6fER7k40dSX6U5ekX1x9PMLCz5wQHIfe1qAemt6LustqHTWkv01dQKNKGJ47lZBlbSYLbih9dOMUSFRWvdqfwutCYPzNV6+1ti6t69swPgeCA/80PS0o0yiSo4tIG8sCljVGTFFPRo/yo1JzuDIwD7exIeGVoLy4oVKz8FYh9YVqxYOTWyLA+LVE36nfH/RYRsKZ1/PtmHayTeTRJ5Ao7HtMO72leDdiyrakajPoACwVFIh3qpqA5vgVRFuqKLbIFTKimbAksB+PA1BmfO0KTVQvTweXQVV7WhjnJaaWcgxZQn1I0XmRDEyvac+6FAtDgZCpCZkHOqEgqAvXRRmvrEgzaAY3p2o5FsqcsC1lJoN1kFAjFd+IMRIxsjjYf0AJw7K2lfIZ3KUawxEIYgCoovmHlERDMecfyuzGpzvQWgypImbWIUs7frovzWP/0tDivw5PKWnLvpxXKmIJf71qfSbfSDG8IVdbQtFTMXuRq1TWE7SOl9L5NOY60mN2FcrAGo1mXYR7vCbPXJR5KoNSTFmMPLvcbOPTu8ZC4LxTKWXpkb7CyJmL/yxZ81isKx7i69/hz2+avXjNLYENh4i3Vvc6K9ncplpra5giJz9iulqUoUPyoReTz3leFWiXir5B1VSUhy3GOOHityEJA3hX6JoChIOSzLWZ895wC4XJGUukuXBAm2zoj3vV6Se0OxZ0TqBeVGPz4SEpFety0nkk0xQ0euNFuLYi0sK1asnBqxDywrVqycGlkCCaeMLEwZp5hGJu5Gur5qg4pYjIk++BjbirnxOqOE/cJ1o7Sj20YZPRoCiMasHqDpWGQNxyYN6SAv1pHxNRDj5HSxGukDZiChdnZR/jNtFAqCIKa8zNQ+6Pje8md6ljebFCVgxUycchhGmlihgc2LckYrqxKA++DuJwAc2s8xSSYU4RbyfBzwE05b6yEYCa2xjmRSLgKoMkx5/rxgww8/YR1GprhSJjemHc7OpFhZkdk6KYno+gMA00gOd+GCgLufUJrzMlOQVBSemHKUkP8ePJJY832SC2rRfodT6uXRWy7CiJlTd6R8pLXaALC+Lud+f1foie9QmTAoHDJPLSSbhcfqkFIsaDfq0wHi+QCeufK0+XeFfL5dcgpOWD80Q6cnylki2fF0OdjJi2wSjQmyIQ3ZLHymBCqDiLLl5UuaQ0IHM6zEE6L7eCJh4hFJLHoD+SR02LSGkLDWkBtg66nnjLK5UgbQWJHlqnEBU57X8EhKvro9iZl2hqLE7PLravkOc7WMOyK/1W0jVStWrPwUiH1gWbFi5dTIEkhYaYiZp3mPpnC835MgQo+FLKknNl6ZXWEKRY2yMWanXWccBpIcEkVXygDqbH8CQsIeCyaG7LhTqgj2PJ9IZ5SnrjA/TUNmkQbIRpjpdqMxNeUwU0M6TeYTR7W5iEOSvDhdbp0mLM1xq5oTyCjeWJQaeeWnicIKmczDPSluMB1KHH6eKjbUYn0m1irPtaN5gwqQtXyHgaRi4AGokGu/wRTQ7QeCm7S0KCI2TPz5dESfk9GoTbFUAqB0jT0y+fmFE5n8ZhgQFbyzVsMMS/Q9ILlgn6nFLinrHvEE7/PemxKDTAl2AtKwnxuPAFRYYHTILMQ9Vswocz+YRhtqDQoD0zXSjYwnMmytXAPwmZdfNf+WQwHX/Uxqy3bJGZ8yAqvhVPVLrDKePicjlrsNSJWn6+b6RIKFcO6rvL8pu0npNsVyDUC5JouwvnnOKDFTcDUMff5YLuLNmwKZXaLR1TXZ68ozguvrwRTAwf4+j0soSq/HjRtCjuizWssrsH+VEmDwvsp46Q0dvkYJs5wrdF6shWXFipVTI/aBZcWKlVMjSyBha0MiGgGLCk0
VXrE2nyLps5pfa80URGh4osoenGsrEvtrkMPg3nEXQJDI59WKZPoVmER3/UOpYAqKrBPUNmJklY6mDKuR8KBWbwDItCElZ5Jno/Er7QPqERFUtS0SIaEyEc5JwvaWU6KVKGbCbSZnVCTl3gpP+fBQAmG9EakRyg4AXztWMripHGlFXW1meCYsLnOYSlrOiD0JPQyTX5GZvT96/0dUPpRzpzVeIe1crMHBFiOJiYw2YcrfpQvnMVMC1meAyTmhGRoe61u1PAdXIfeEfIQZ+dR9ZiEeD9pG2SbSr69K5eDxkSCy8ZHEFgemQnBbQPeAV7nHINR4QmKQVPBRyHju+aclCHh4Q+jetWOu63gAfN4PDotMA8IxveV6LLUrsl5vRNxaZnb0nGgIPmAisubZehnD6MwTBmtgx5pByls744VIHR/AyrrQ5DdfFEz37o/lN9UhOYdDbgY3oxuEwxUIkDukbPw/3/oagDNrgjQvXdkySo3Ek00y62uPu4x8kBPS9UWMoyqT3zQ2bhxmTRdOvJ2shWXFipVTI0ssLJ/v7YSpRqaopappIMxj8vMHIX2W9PkFZPhVuqXzZ8VwOzqSF8XBcQqgTkaBc2eY3lWUfQ535SXplyQOUG+JzzII+P7RPClNLIoed5PzP+2fqvOPeYLq7VObAvxqkZmAW8poQ+YBuXQ8O+zfE9CKKfNNtXOk1Q8yzsVKFYBHXoqiGGfodpmrQusppYUy1nwWR17gzUbTKJ22mBuGQ0K7zH7ne9Lipc+Xf4vkR/lFpIWlRnG/Kzk1SowlPv085Y1lK9GJiTN5t1ptV6udXRwHM+TAA1rN2sF8dUXskfvbsj7PvC5cWr/8G79plDY5p771Z39qlFsffQDgJkejqQqfpxzFskrKdfUiG95ceU6U778vdULK1mAS1pT6LclLdkS2traMcuO2UDCnGmeg8RWcQNeQ0WmtjWM0OVDtWbAlsNaN+SzbKigxVvaY4pE3/PZtNoiNZLn2yRP9YJt9betijqW1FgeRr/a2JYmy23kE4OJ5CZR98y++ZZRPb0rZ0xfefNMoRdqMMR8Bak9hoZOxk4wxE7ZSo3VRrIVlxYqVUyP2gWXFipVTI0sMVM9TjEAn5WgCYMrKA/Vea7HLiMzCpYrY8DWCFC2mqZEwoMgsj+7+DoCDHSmmP96TXKFzZwQklpgRVmeNSIMl+IhkMt0DMXojlm8Y5OEzzaRE7rQiqy6mdO4qZV2SKsCUSarBDxAkPi5JTN47JjRVGxUOIrUUiYI4El1o2VM2JmXtxAXgaZEQL0hQ0katrJhhMYq6+VdYGtWoNY3S6YkNX2utAUg47IiTvHhJMmsmpIXQNKkmmZd9+n11ETZWZf3rlSqATlcgVZX5cScB57+LKNTtD7v8hPk4BBFrLG351X/0D43y87/8K0bRhLKXX37FKDc++QSzLUs5Pr3Y+O//+b8Y5cMPhYv5lVdfM8rVa8KvsPonAjAPD/Y5TISZihZVUl2ldQFKD/aFjWAGDosSx8tvp8XPtW1PzIuo4Qv9ATo5piISZIjJFEtd/5HwXnz3h+8Z5a03XjRK90gSvrRipn0keVjOnixLWuEV6chX9+7fw0xIbUhyizLb8e4+pFNC64c0NMSomrY+Ukq/glfHDHDWB8uiWAvLihUrp0bsA8uKFSunRpZY8iOChYjG9mgwwkzqk/Ii+AuJLRMqI9qKdW3BUhe0ePZJ6dd4aecQQLEkSHBtVSKATbI1BDtijddr3Jdoca1JnjCSPYxGZPIreAAyzS5h+xm1ltXgnEw1SqgpQnJGWqwznSy34UfsBpQXEaQCOUOaxwViq/5EJqnceJoL5qMEoFRhIlUkUTxXg1u8Rg4JwmtVpgIx5DRhbKnclDVcPX8ZwHc/EESg+V+tVtMoY22xqaEZzlZLf0psi6vT3tnZBeCQgHxtQ0BQTpf4E2Q+SCiirV/G2tqzTyTCRjjPPifpUS8xn6jKQJIi5Zeek6+effp5zOB9h3ApYbLVN77+TaPcuCWRsvNnBSmvMgx99oyc2kcfCz7yvQxApyOQp0dc7CXKPSAHWueyzESfZRtd9jnRBYwXeoimjEdnZAGJCEJTHjHWPD5l8osdAH0WMF08L11tblwXTsTDQ0F5CZFmSAK/2lh+s1caTxnl+IEsnan9ijkBpVpZWyMzJXlB1AWkzVZd8o142sxV6UbSDLMI92SxFpYVK1ZOjdgHlhUrVk6NLIGEHZawZzRoo8kUQMwsu5wXQbEhYYVyJIz67NDVF4xz1BGrcpv9vuLxBMCjtuC+aCgINF4V3DcmFfpGS2zUFiNxTaYURgPS5hG7mRiKBo+0KaOam1pi7mtDR9ZbpAwXDtkKbMDGWXOiTVgdFjBpeYrLlDmXxrYW3Kh5XGLIcnN9HUDsCmTrH7MZGgOvASRps8E6D4+xlTEJ2h02Xitkss3b338PwDvvCyTM6foIGVoM/IWsdtIFGjIcXCYkVDR98OgRgFJVPi+0BbtVSPK9KEvq7vPYWQZygQB5y9sJJ6mVRltbV41yYUPoOlzG/pgbq61CUQoKAAKukuZksogLHjNjq+wm22LtlM+uoitapMVZu14AYEpc2dkX3gufrUO7rL/RWHOFhTjxWIZtd9hB93HRYhRtmKYAKl8uLSnLZBuFhFONr2k4chwDOLorBUYTpRJkSdPmatMoh20paXqwL8rHJJ4/c5G/Tf7233j5eQAhS6b2HgkuBoPyRToWfA1gav9XXjJHyRd5juZJk+Sc7id0UrAWlhUrVk6RLM3DIj/RQGyTbqcLoMvXqVYnqPu0yNIT/UTrJ3Obhf4/ZVNdqxUB7IzknbPP7JXjjmSInFkXD2id7+/QE7tj/768OpQmOPdnuz6ASCegLx0+8kO1sGhYaWdKbc9ZY4qT75ewTHxlj/LnncmZmqJa7pvKS7hZkjM5uymnVi07AHZph47YrEVZooZkcRrQc18iD3WfVt4KU6iOyDL8V995B8DukQyrPFytGluudnlxM1l/bRSuSu5bpfVaa9TwWJoM3cDREkNKziMn/9VaFu1INAUQs+TV4QaafhWSuezqk+L9BQ3nw35bRmNNeN5D1HVnJ6l+7knu2JYlrdVYbcYq6AKXv0Ur0uNL3bBmu3QMj5nyFrIWvUvryWGPmUZNzLQJk8J6h4+wTPRHl3umlfWM47MwKk/IAk2VYu6t5ldxCiDJ2EmXPGLrLfmkHsq+jYZMcr8jk/zwUKzI47/8vlFefEYM29VGFUDKwu8tpvVlnLY+UALaXA5o6ualOWo0yl/jfU9zE9JSJFuxYuX0i31gWbFi5dTIEkioTLUHB4IBb16/iRnvrKIhNek6tHi1EalyAAyHYl5qr0e1YyvlCoC1NTIE0XNfrYqyUWGy0lTGf3R/wG3FqiwRNVSqgrbKUh5EWKGkXdwyoK9dG4X26GKPSDlUoc+10SximWifGIWEE0Kbiie7aCJbysILNyaJkiuwq9PbA7B/LG7OAfPXnKlAEoeYutsXDMIoAiL6leNMwM4hYXtvNAYwJJSbEt17ZL2aMnIyg+ZkDk
(base64-encoded PNG data for the 400x100 preview image omitted)\n",
  1154. "text/plain": [
  1155. "<PIL.Image.Image image mode=RGB size=400x100 at 0x7F1EC53EAB38>"
  1156. ]
  1157. },
  1158. "execution_count": 6,
  1159. "metadata": {},
  1160. "output_type": "execute_result"
  1161. }
  1162. ],
  1163. "source": [
  1164. "dataiter = iter(trainloader)\n",
  1165. "images, labels = dataiter.next() # 返回4张图片及标签\n",
  1166. "print(' '.join('%11s'%classes[labels[j]] for j in range(4)))\n",
  1167. "show(tv.utils.make_grid((images+1)/2)).resize((400,100))"
  1168. ]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 4.2 Define the network\n",
"\n",
"Copy the LeNet network defined above and change the first argument of `self.conv1` to 3 input channels, because CIFAR-10 images are 3-channel color images."
]
},
{
"cell_type": "code",
"execution_count": 7,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Net(\n",
"  (conv1): Conv2d(3, 6, kernel_size=(5, 5), stride=(1, 1))\n",
"  (conv2): Conv2d(6, 16, kernel_size=(5, 5), stride=(1, 1))\n",
"  (fc1): Linear(in_features=400, out_features=120, bias=True)\n",
"  (fc2): Linear(in_features=120, out_features=84, bias=True)\n",
"  (fc3): Linear(in_features=84, out_features=10, bias=True)\n",
")\n"
]
}
],
"source": [
"import torch.nn as nn\n",
"import torch.nn.functional as F\n",
"\n",
"class Net(nn.Module):\n",
"    def __init__(self):\n",
"        super(Net, self).__init__()\n",
"        self.conv1 = nn.Conv2d(3, 6, 5) # 3 input channels for CIFAR-10 color images\n",
"        self.conv2 = nn.Conv2d(6, 16, 5)\n",
"        self.fc1 = nn.Linear(16*5*5, 120)\n",
"        self.fc2 = nn.Linear(120, 84)\n",
"        self.fc3 = nn.Linear(84, 10)\n",
"\n",
"    def forward(self, x):\n",
"        x = F.max_pool2d(F.relu(self.conv1(x)), (2, 2))\n",
"        x = F.max_pool2d(F.relu(self.conv2(x)), 2)\n",
"        x = x.view(x.size()[0], -1) # flatten to (batch_size, 16*5*5)\n",
"        x = F.relu(self.fc1(x))\n",
"        x = F.relu(self.fc2(x))\n",
"        x = self.fc3(x)\n",
"        return x\n",
"\n",
"\n",
"net = Net()\n",
"print(net)"
]
},
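{
"cell_type": "markdown",
"metadata": {},
"source": [
"Why is `fc1` sized `16*5*5`? A 32×32 input shrinks to 28×28 after the first 5×5 convolution, to 14×14 after pooling, to 10×10 after the second convolution, and to 5×5 after the second pooling, with 16 channels. The short sketch below (an addition to the original text; the dummy input shape is the only assumption) traces the feature-map size through the conv/pool stack to confirm this:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# a minimal sketch: trace the feature-map size through the conv/pool stack\n",
"from torch.autograd import Variable\n",
"\n",
"x = Variable(t.randn(1, 3, 32, 32)) # one dummy CIFAR-10-sized image (assumed shape)\n",
"x = F.max_pool2d(F.relu(net.conv1(x)), 2)\n",
"print(x.size()) # expected: (1, 6, 14, 14)\n",
"x = F.max_pool2d(F.relu(net.conv2(x)), 2)\n",
"print(x.size()) # expected: (1, 16, 5, 5), i.e. 16*5*5 = 400 features after flattening"
]
},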
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 4.3 Define the loss function and the optimizer"
]
},
{
"cell_type": "code",
"execution_count": 8,
"metadata": {},
"outputs": [],
"source": [
"from torch import optim\n",
"criterion = nn.CrossEntropyLoss() # cross-entropy loss\n",
"optimizer = optim.SGD(net.parameters(), lr=0.001, momentum=0.9)"
]
},
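{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick sanity check (an addition, not part of the original text), we can run one fake batch through the untrained network and compute its loss. `nn.CrossEntropyLoss` expects raw class scores of shape (N, 10) and integer labels of shape (N,); the tensor shapes below are assumptions based on the CIFAR-10 setup."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# a minimal sketch (assumed shapes): check that net and criterion fit together\n",
"from torch.autograd import Variable\n",
"\n",
"dummy_images = Variable(t.randn(4, 3, 32, 32)) # a fake batch of 4 RGB 32x32 images\n",
"dummy_labels = Variable(t.LongTensor([0, 1, 2, 3])) # fake integer class labels\n",
"dummy_scores = net(dummy_images) # raw scores, shape (4, 10)\n",
"print(criterion(dummy_scores, dummy_labels)) # roughly ln(10) = 2.3 for an untrained net"
]
},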
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 4.4 Train the network\n",
"\n",
"The training procedure is essentially the same for every network; it repeats the following steps:\n",
"\n",
"- feed in the input data\n",
"- run the forward pass and the backward pass\n",
"- update the parameters\n"
]
},
{
"cell_type": "code",
"execution_count": 10,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"[1,  2000] loss: 2.210\n",
"[1,  4000] loss: 1.958\n",
"[1,  6000] loss: 1.723\n",
"[1,  8000] loss: 1.590\n",
"[1, 10000] loss: 1.532\n",
"[1, 12000] loss: 1.467\n",
"[2,  2000] loss: 1.408\n",
"[2,  4000] loss: 1.374\n",
"[2,  6000] loss: 1.345\n",
"[2,  8000] loss: 1.331\n",
"[2, 10000] loss: 1.338\n",
"[2, 12000] loss: 1.286\n",
"Finished Training\n"
]
}
],
"source": [
"from torch.autograd import Variable\n",
"\n",
"t.set_num_threads(8)\n",
"for epoch in range(2):\n",
"\n",
"    running_loss = 0.0\n",
"    for i, data in enumerate(trainloader, 0):\n",
"\n",
"        # load the input data\n",
"        inputs, labels = data\n",
"        inputs, labels = Variable(inputs), Variable(labels)\n",
"\n",
"        # zero the parameter gradients\n",
"        optimizer.zero_grad()\n",
"\n",
"        # forward + backward\n",
"        outputs = net(inputs)\n",
"        loss = criterion(outputs, labels)\n",
"        loss.backward()\n",
"\n",
"        # update the parameters\n",
"        optimizer.step()\n",
"\n",
"        # log the training status\n",
"        running_loss += loss.item() # loss.data[0] is deprecated\n",
"        if i % 2000 == 1999: # print the status every 2000 batches\n",
"            print('[%d, %5d] loss: %.3f' \\\n",
"                  % (epoch+1, i+1, running_loss / 2000))\n",
"            running_loss = 0.0\n",
"print('Finished Training')"
]
},
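{
"cell_type": "markdown",
"metadata": {},
"source": [
"A step the original text does not cover but which is often wanted after training: saving the learned parameters so the network can be restored later. A minimal sketch using `state_dict` (the file name is an arbitrary choice):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# optional sketch: persist and restore the trained parameters\n",
"t.save(net.state_dict(), 'cifar10_lenet.pth') # the file name is arbitrary\n",
"net2 = Net() # a fresh network with random weights\n",
"net2.load_state_dict(t.load('cifar10_lenet.pth')) # net2 now behaves like net"
]
},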
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Here the network has been trained for only 2 epochs (one full pass over the dataset is called an epoch). Let's see whether it has learned anything: feed test images into the network, compute the predicted labels, and compare them with the ground-truth labels."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"lines_to_next_cell": 2
},
"outputs": [],
"source": [
"dataiter = iter(testloader)\n",
"images, labels = next(dataiter) # one batch returns 4 images\n",
"print('ground-truth labels: ', ' '.join(\\\n",
"        '%8s' % classes[labels[j]] for j in range(4)))\n",
"show(tv.utils.make_grid(images / 2 + 0.5)).resize((400,100))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Next, compute the labels the network predicts:"
]
},
{
"cell_type": "code",
"execution_count": 12,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"predicted:    cat  ship  ship  ship\n"
]
}
],
"source": [
"# compute each image's score for every class\n",
"outputs = net(Variable(images))\n",
"# take the class with the highest score\n",
"_, predicted = t.max(outputs.data, 1)\n",
"\n",
"print('predicted: ', ' '.join('%5s'\\\n",
"      % classes[predicted[j]] for j in range(4)))"
]
},
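{
"cell_type": "markdown",
"metadata": {},
"source": [
"The raw outputs above are unnormalized scores. If probabilities are more convenient, `F.softmax` turns each row of scores into a distribution over the 10 classes (a small addition to the original text; the `dim` argument is assumed to be available in this PyTorch version):"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# sketch: convert the raw scores into per-class probabilities\n",
"probs = F.softmax(outputs, dim=1) # each row sums to 1\n",
"print(probs.data[0]) # probabilities of the first image for each of the 10 classes"
]
},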
{
"cell_type": "markdown",
"metadata": {},
"source": [
"We can already see some effect: the accuracy here is 50%. But these are just a few images; let's look at the performance on the entire test set."
]
},
{
"cell_type": "code",
"execution_count": 13,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"accuracy on the 10000 test images: 54 %\n"
]
}
],
"source": [
"correct = 0 # number of correctly classified images\n",
"total = 0 # total number of images\n",
"for data in testloader:\n",
"    images, labels = data\n",
"    outputs = net(Variable(images))\n",
"    _, predicted = t.max(outputs.data, 1)\n",
"    total += labels.size(0)\n",
"    correct += (predicted == labels).sum()\n",
"\n",
"print('accuracy on the 10000 test images: %d %%' % (100 * correct / total))"
]
},
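{
"cell_type": "markdown",
"metadata": {},
"source": [
"The overall figure can hide large differences between classes. The sketch below (an addition; it reuses `testloader` and `classes` from the data-loading step) breaks the accuracy down per class:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# sketch: per-class accuracy on the test set\n",
"class_correct = [0] * 10 # correctly classified images per class\n",
"class_total = [0] * 10 # total images per class\n",
"for data in testloader:\n",
"    images, labels = data\n",
"    outputs = net(Variable(images))\n",
"    _, predicted = t.max(outputs.data, 1)\n",
"    for j in range(labels.size(0)):\n",
"        label = int(labels[j])\n",
"        class_correct[label] += int(predicted[j] == label)\n",
"        class_total[label] += 1\n",
"\n",
"for i in range(10):\n",
"    print('accuracy of %5s: %2d %%' % (classes[i], 100 * class_correct[i] / class_total[i]))"
]
},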
{
"cell_type": "markdown",
"metadata": {},
"source": [
"The trained accuracy is far better than random guessing (which would be 10%), so the network really has learned something."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### 4.5 Training on the GPU\n",
"Just as a Tensor can be moved from the CPU to the GPU, a model can be moved from the CPU to the GPU in the same way."
]
},
{
"cell_type": "code",
"execution_count": 44,
"metadata": {},
"outputs": [],
"source": [
"if t.cuda.is_available():\n",
"    net.cuda() # move the model parameters to the GPU\n",
"    images = images.cuda()\n",
"    labels = labels.cuda()\n",
"    output = net(Variable(images))\n",
"    loss = criterion(output, Variable(labels))"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"If the GPU does not seem much faster than the CPU here, that is because the network is small; the GPU cannot show its real strength on such a tiny workload."
]
},
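{
"cell_type": "markdown",
"metadata": {},
"source": [
"To see this for yourself, here is a rough timing sketch (an addition; it uses Python's `time` module, and the loop count of 100 is arbitrary) comparing forward passes on the CPU and on the GPU. Since CUDA calls are asynchronous, `t.cuda.synchronize()` makes sure the GPU has actually finished before the clock is read:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import time\n",
"\n",
"if t.cuda.is_available():\n",
"    x = t.randn(4, 3, 32, 32)\n",
"\n",
"    net.cpu() # time 100 forward passes on the CPU\n",
"    start = time.time()\n",
"    for _ in range(100):\n",
"        net(Variable(x))\n",
"    print('CPU: %.4f s' % (time.time() - start))\n",
"\n",
"    net.cuda() # time 100 forward passes on the GPU\n",
"    x = x.cuda()\n",
"    start = time.time()\n",
"    for _ in range(100):\n",
"        net(Variable(x))\n",
"    t.cuda.synchronize() # wait for the GPU to finish before reading the clock\n",
"    print('GPU: %.4f s' % (time.time() - start))"
]
},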
{
"cell_type": "markdown",
"metadata": {},
"source": [
"This concludes the basic introduction to PyTorch. To summarize, this section covered the following topics.\n",
"\n",
"1. Tensor: a data structure similar to a NumPy array, with a NumPy-like interface and convenient conversion between the two.\n",
"2. autograd/Variable: Variable wraps a Tensor and provides automatic differentiation.\n",
"3. nn: an interface designed specifically for neural networks, providing many useful building blocks (network layers, loss functions, optimizers, and so on).\n",
"4. Training a neural network: using CIFAR-10 classification as an example, we walked through the full training workflow, including data loading, network construction, training, and testing.\n",
"\n",
"After working through this section, readers should have a feel for PyTorch's simple interface and flexible usage. Starting from the next chapter, this book covers each part of PyTorch in depth and in a systematic way."
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.2"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
