  1. {
  2. "cells": [
  3. {
  4. "cell_type": "markdown",
  5. "metadata": {},
  6. "source": [
  7. "## 3.2 autograd\n",
  8. "\n",
  9. "用Tensor训练网络很方便,但从上一小节最后的线性回归例子来看,反向传播过程需要手动实现。这对于像线性回归等较为简单的模型来说,还可以应付,但实际使用中经常出现非常复杂的网络结构,此时如果手动实现反向传播,不仅费时费力,而且容易出错,难以检查。torch.autograd就是为方便用户使用,而专门开发的一套自动求导引擎,它能够根据输入和前向传播过程自动构建计算图,并执行反向传播。\n",
  10. "\n",
  11. "计算图(Computation Graph)是现代深度学习框架如PyTorch和TensorFlow等的核心,其为高效自动求导算法——反向传播(Back Propogation)提供了理论支持,了解计算图在实际写程序过程中会有极大的帮助。本节将涉及一些基础的计算图知识,但并不要求读者事先对此有深入的了解。关于计算图的基础知识推荐阅读Christopher Olah的文章[^1]。\n",
  12. "\n",
  13. "[^1]: http://colah.github.io/posts/2015-08-Backprop/\n",
  14. "\n",
  15. "\n",
  16. "### 3.2.1 Variable\n",
  17. "PyTorch在autograd模块中实现了计算图的相关功能,autograd中的核心数据结构是Variable。Variable封装了tensor,并记录对tensor的操作记录用来构建计算图。Variable的数据结构如图3-2所示,主要包含三个属性:\n",
  18. "\n",
  19. "- `data`:保存variable所包含的tensor\n",
  20. "- `grad`:保存`data`对应的梯度,`grad`也是variable,而不是tensor,它与`data`形状一致。 \n",
  21. "- `grad_fn`: 指向一个`Function`,记录tensor的操作历史,即它是什么操作的输出,用来构建计算图。如果某一个变量是由用户创建,则它为叶子节点,对应的grad_fn等于None。\n",
  22. "\n",
  23. "\n",
  24. "![图3-2:Variable数据结构](imgs/autograd_Variable.png)\n",
  25. "\n",
  26. "Variable的构造函数需要传入tensor,同时有两个可选参数:\n",
  27. "- `requires_grad (bool)`:是否需要对该variable进行求导\n",
  28. "- `volatile (bool)`:意为”挥发“,设置为True,则构建在该variable之上的图都不会求导,专为推理阶段设计\n",
  29. "\n",
  30. "Variable提供了大部分tensor支持的函数,但其不支持部分`inplace`函数,因这些函数会修改tensor自身,而在反向传播中,variable需要缓存原来的tensor来计算反向传播梯度。如果想要计算各个Variable的梯度,只需调用根节点variable的`backward`方法,autograd会自动沿着计算图反向传播,计算每一个叶子节点的梯度。\n",
  31. "\n",
  32. "`variable.backward(grad_variables=None, retain_graph=None, create_graph=None)`主要有如下参数:\n",
  33. "\n",
  34. "- grad_variables:形状与variable一致,对于`y.backward()`,grad_variables相当于链式法则${dz \\over dx}={dz \\over dy} \\times {dy \\over dx}$中的$\\textbf {dz} \\over \\textbf {dy}$。grad_variables也可以是tensor或序列。\n",
  35. "- retain_graph:反向传播需要缓存一些中间结果,反向传播之后,这些缓存就被清空,可通过指定这个参数不清空缓存,用来多次反向传播。\n",
  36. "- create_graph:对反向传播过程再次构建计算图,可通过`backward of backward`实现求高阶导数。\n",
  37. "\n",
  38. "上述描述可能比较抽象,如果没有看懂,不用着急,会在本节后半部分详细介绍,下面先看几个例子。"
  39. ]
  40. },
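{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a quick illustration of the three attributes above, here is a minimal sketch (the variable names are hypothetical; it assumes the legacy 0.x-style `Variable` API used throughout this chapter):\n",
"\n",
"```python\n",
"import torch as t\n",
"from torch.autograd import Variable as V\n",
"\n",
"v = V(t.ones(2, 2), requires_grad=True)  # created by the user => leaf node\n",
"out = (v * 3).sum()                      # build a tiny graph and reduce to a scalar\n",
"out.backward()                           # back-propagate from the root\n",
"print(v.data)     # the wrapped tensor\n",
"print(v.grad)     # gradient of out w.r.t. v, same shape as v.data (all 3s)\n",
"print(v.grad_fn)  # None, because v is a leaf created by the user\n",
"```"
]
},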
  41. {
  42. "cell_type": "code",
  43. "execution_count": 1,
  44. "metadata": {},
  45. "outputs": [],
  46. "source": [
  47. "from __future__ import print_function\n",
  48. "import torch as t\n",
  49. "from torch.autograd import Variable as V"
  50. ]
  51. },
  52. {
  53. "cell_type": "code",
  54. "execution_count": 2,
  55. "metadata": {},
  56. "outputs": [
  57. {
  58. "data": {
  59. "text/plain": [
  60. "Variable containing:\n",
  61. " 1 1 1 1\n",
  62. " 1 1 1 1\n",
  63. " 1 1 1 1\n",
  64. "[torch.FloatTensor of size 3x4]"
  65. ]
  66. },
  67. "execution_count": 2,
  68. "metadata": {},
  69. "output_type": "execute_result"
  70. }
  71. ],
  72. "source": [
  73. "# 从tensor中创建variable,指定需要求导\n",
  74. "a = V(t.ones(3,4), requires_grad = True) \n",
  75. "a"
  76. ]
  77. },
  78. {
  79. "cell_type": "code",
  80. "execution_count": 3,
  81. "metadata": {},
  82. "outputs": [
  83. {
  84. "data": {
  85. "text/plain": [
  86. "Variable containing:\n",
  87. " 0 0 0 0\n",
  88. " 0 0 0 0\n",
  89. " 0 0 0 0\n",
  90. "[torch.FloatTensor of size 3x4]"
  91. ]
  92. },
  93. "execution_count": 3,
  94. "metadata": {},
  95. "output_type": "execute_result"
  96. }
  97. ],
  98. "source": [
  99. "b = V(t.zeros(3,4))\n",
  100. "b"
  101. ]
  102. },
  103. {
  104. "cell_type": "code",
  105. "execution_count": 4,
  106. "metadata": {},
  107. "outputs": [
  108. {
  109. "data": {
  110. "text/plain": [
  111. "Variable containing:\n",
  112. " 1 1 1 1\n",
  113. " 1 1 1 1\n",
  114. " 1 1 1 1\n",
  115. "[torch.FloatTensor of size 3x4]"
  116. ]
  117. },
  118. "execution_count": 4,
  119. "metadata": {},
  120. "output_type": "execute_result"
  121. }
  122. ],
  123. "source": [
  124. "# 函数的使用与tensor一致\n",
  125. "# 也可写成c = a + b\n",
  126. "c = a.add(b)\n",
  127. "c"
  128. ]
  129. },
  130. {
  131. "cell_type": "code",
  132. "execution_count": 5,
  133. "metadata": {},
  134. "outputs": [],
  135. "source": [
  136. "d = c.sum()\n",
  137. "d.backward() # 反向传播"
  138. ]
  139. },
  140. {
  141. "cell_type": "code",
  142. "execution_count": 6,
  143. "metadata": {
  144. "scrolled": true
  145. },
  146. "outputs": [
  147. {
  148. "data": {
  149. "text/plain": [
  150. "(12.0, Variable containing:\n",
  151. " 12\n",
  152. " [torch.FloatTensor of size 1])"
  153. ]
  154. },
  155. "execution_count": 6,
  156. "metadata": {},
  157. "output_type": "execute_result"
  158. }
  159. ],
  160. "source": [
  161. "# 注意二者的区别\n",
  162. "# 前者在取data后变为tensor,而后从tensor计算sum得到float\n",
  163. "# 后者计算sum后仍然是Variable\n",
  164. "c.data.sum(), c.sum()"
  165. ]
  166. },
  167. {
  168. "cell_type": "code",
  169. "execution_count": 7,
  170. "metadata": {
  171. "scrolled": false
  172. },
  173. "outputs": [
  174. {
  175. "data": {
  176. "text/plain": [
  177. "Variable containing:\n",
  178. " 1 1 1 1\n",
  179. " 1 1 1 1\n",
  180. " 1 1 1 1\n",
  181. "[torch.FloatTensor of size 3x4]"
  182. ]
  183. },
  184. "execution_count": 7,
  185. "metadata": {},
  186. "output_type": "execute_result"
  187. }
  188. ],
  189. "source": [
  190. "a.grad"
  191. ]
  192. },
  193. {
  194. "cell_type": "code",
  195. "execution_count": 8,
  196. "metadata": {
  197. "scrolled": false
  198. },
  199. "outputs": [
  200. {
  201. "data": {
  202. "text/plain": [
  203. "(True, False, True)"
  204. ]
  205. },
  206. "execution_count": 8,
  207. "metadata": {},
  208. "output_type": "execute_result"
  209. }
  210. ],
  211. "source": [
  212. "# 此处虽然没有指定c需要求导,但c依赖于a,而a需要求导,\n",
  213. "# 因此c的requires_grad属性会自动设为True\n",
  214. "a.requires_grad, b.requires_grad, c.requires_grad"
  215. ]
  216. },
  217. {
  218. "cell_type": "code",
  219. "execution_count": 9,
  220. "metadata": {},
  221. "outputs": [
  222. {
  223. "data": {
  224. "text/plain": [
  225. "(True, True, False)"
  226. ]
  227. },
  228. "execution_count": 9,
  229. "metadata": {},
  230. "output_type": "execute_result"
  231. }
  232. ],
  233. "source": [
  234. "# 由用户创建的variable属于叶子节点,对应的grad_fn是None\n",
  235. "a.is_leaf, b.is_leaf, c.is_leaf"
  236. ]
  237. },
  238. {
  239. "cell_type": "code",
  240. "execution_count": 10,
  241. "metadata": {},
  242. "outputs": [
  243. {
  244. "data": {
  245. "text/plain": [
  246. "True"
  247. ]
  248. },
  249. "execution_count": 10,
  250. "metadata": {},
  251. "output_type": "execute_result"
  252. }
  253. ],
  254. "source": [
  255. "# c.grad是None, 因c不是叶子节点,它的梯度是用来计算a的梯度\n",
  256. "# 所以虽然c.requires_grad = True,但其梯度计算完之后即被释放\n",
  257. "c.grad is None"
  258. ]
  259. },
  260. {
  261. "cell_type": "markdown",
  262. "metadata": {},
  263. "source": [
  264. "计算下面这个函数的导函数:\n",
  265. "$$\n",
  266. "y = x^2\\bullet e^x\n",
  267. "$$\n",
  268. "它的导函数是:\n",
  269. "$$\n",
  270. "{dy \\over dx} = 2x\\bullet e^x + x^2 \\bullet e^x\n",
  271. "$$\n",
  272. "来看看autograd的计算结果与手动求导计算结果的误差。"
  273. ]
  274. },
  275. {
  276. "cell_type": "code",
  277. "execution_count": 11,
  278. "metadata": {},
  279. "outputs": [],
  280. "source": [
  281. "def f(x):\n",
  282. " '''计算y'''\n",
  283. " y = x**2 * t.exp(x)\n",
  284. " return y\n",
  285. "\n",
  286. "def gradf(x):\n",
  287. " '''手动求导函数'''\n",
  288. " dx = 2*x*t.exp(x) + x**2*t.exp(x)\n",
  289. " return dx"
  290. ]
  291. },
  292. {
  293. "cell_type": "code",
  294. "execution_count": 12,
  295. "metadata": {},
  296. "outputs": [
  297. {
  298. "data": {
  299. "text/plain": [
  300. "Variable containing:\n",
  301. " 7.8454 0.4475 5.5884 0.1406\n",
  302. " 0.4044 0.5008 0.4989 13.3268\n",
  303. " 0.3547 0.0623 1.0497 4.2674\n",
  304. "[torch.FloatTensor of size 3x4]"
  305. ]
  306. },
  307. "execution_count": 12,
  308. "metadata": {},
  309. "output_type": "execute_result"
  310. }
  311. ],
  312. "source": [
  313. "x = V(t.randn(3,4), requires_grad = True)\n",
  314. "y = f(x)\n",
  315. "y"
  316. ]
  317. },
  318. {
  319. "cell_type": "code",
  320. "execution_count": 13,
  321. "metadata": {},
  322. "outputs": [
  323. {
  324. "data": {
  325. "text/plain": [
  326. "Variable containing:\n",
  327. " 19.0962 2.1796 14.4631 1.0203\n",
  328. " -0.3276 0.1172 -0.1745 29.7573\n",
  329. " 1.8619 -0.3699 3.9812 11.6386\n",
  330. "[torch.FloatTensor of size 3x4]"
  331. ]
  332. },
  333. "execution_count": 13,
  334. "metadata": {},
  335. "output_type": "execute_result"
  336. }
  337. ],
  338. "source": [
  339. "y.backward(t.ones(y.size())) # grad_variables形状与y一致\n",
  340. "x.grad"
  341. ]
  342. },
  343. {
  344. "cell_type": "code",
  345. "execution_count": 14,
  346. "metadata": {},
  347. "outputs": [
  348. {
  349. "data": {
  350. "text/plain": [
  351. "Variable containing:\n",
  352. " 19.0962 2.1796 14.4631 1.0203\n",
  353. " -0.3276 0.1172 -0.1745 29.7573\n",
  354. " 1.8619 -0.3699 3.9812 11.6386\n",
  355. "[torch.FloatTensor of size 3x4]"
  356. ]
  357. },
  358. "execution_count": 14,
  359. "metadata": {},
  360. "output_type": "execute_result"
  361. }
  362. ],
  363. "source": [
  364. "# autograd的计算结果与利用公式手动计算的结果一致\n",
  365. "gradf(x) "
  366. ]
  367. },
  368. {
  369. "cell_type": "markdown",
  370. "metadata": {},
  371. "source": [
  372. "### 3.2.2 计算图\n",
  373. "\n",
  374. "PyTorch中`autograd`的底层采用了计算图,计算图是一种特殊的有向无环图(DAG),用于记录算子与变量之间的关系。一般用矩形表示算子,椭圆形表示变量。如表达式$ \\textbf {z = wx + b}$可分解为$\\textbf{y = wx}$和$\\textbf{z = y + b}$,其计算图如图3-3所示,图中`MUL`,`ADD`都是算子,$\\textbf{w}$,$\\textbf{x}$,$\\textbf{b}$即变量。\n",
  375. "\n",
  376. "![图3-3:computation graph](imgs/com_graph.svg)"
  377. ]
  378. },
  379. {
  380. "cell_type": "markdown",
  381. "metadata": {},
  382. "source": [
  383. "如上有向无环图中,$\\textbf{X}$和$\\textbf{b}$是叶子节点(leaf node),这些节点通常由用户自己创建,不依赖于其他变量。$\\textbf{z}$称为根节点,是计算图的最终目标。利用链式法则很容易求得各个叶子节点的梯度。\n",
  384. "$${\\partial z \\over \\partial b} = 1,\\space {\\partial z \\over \\partial y} = 1\\\\\n",
  385. "{\\partial y \\over \\partial w }= x,{\\partial y \\over \\partial x}= w\\\\\n",
  386. "{\\partial z \\over \\partial x}= {\\partial z \\over \\partial y} {\\partial y \\over \\partial x}=1 * w\\\\\n",
  387. "{\\partial z \\over \\partial w}= {\\partial z \\over \\partial y} {\\partial y \\over \\partial w}=1 * x\\\\\n",
  388. "$$\n",
  389. "而有了计算图,上述链式求导即可利用计算图的反向传播自动完成,其过程如图3-4所示。\n",
  390. "\n",
  391. "![图3-4:计算图的反向传播](imgs/com_graph_backward.svg)\n",
  392. "\n",
  393. "\n",
  394. "在PyTorch实现中,autograd会随着用户的操作,记录生成当前variable的所有操作,并由此建立一个有向无环图。用户每进行一个操作,相应的计算图就会发生改变。更底层的实现中,图中记录了操作`Function`,每一个变量在图中的位置可通过其`grad_fn`属性在图中的位置推测得到。在反向传播过程中,autograd沿着这个图从当前变量(根节点$\\textbf{z}$)溯源,可以利用链式求导法则计算所有叶子节点的梯度。每一个前向传播操作的函数都有与之对应的反向传播函数用来计算输入的各个variable的梯度,这些函数的函数名通常以`Backward`结尾。下面结合代码学习autograd的实现细节。"
  395. ]
  396. },
  397. {
  398. "cell_type": "code",
  399. "execution_count": 15,
  400. "metadata": {},
  401. "outputs": [],
  402. "source": [
  403. "x = V(t.ones(1))\n",
  404. "b = V(t.rand(1), requires_grad = True)\n",
  405. "w = V(t.rand(1), requires_grad = True)\n",
  406. "y = w * x # 等价于y=w.mul(x)\n",
  407. "z = y + b # 等价于z=y.add(b)"
  408. ]
  409. },
  410. {
  411. "cell_type": "code",
  412. "execution_count": 16,
  413. "metadata": {
  414. "scrolled": true
  415. },
  416. "outputs": [
  417. {
  418. "data": {
  419. "text/plain": [
  420. "(False, True, True)"
  421. ]
  422. },
  423. "execution_count": 16,
  424. "metadata": {},
  425. "output_type": "execute_result"
  426. }
  427. ],
  428. "source": [
  429. "x.requires_grad, b.requires_grad, w.requires_grad"
  430. ]
  431. },
  432. {
  433. "cell_type": "code",
  434. "execution_count": 17,
  435. "metadata": {},
  436. "outputs": [
  437. {
  438. "data": {
  439. "text/plain": [
  440. "True"
  441. ]
  442. },
  443. "execution_count": 17,
  444. "metadata": {},
  445. "output_type": "execute_result"
  446. }
  447. ],
  448. "source": [
  449. "# 虽然未指定y.requires_grad为True,但由于y依赖于需要求导的w\n",
  450. "# 故而y.requires_grad为True\n",
  451. "y.requires_grad"
  452. ]
  453. },
  454. {
  455. "cell_type": "code",
  456. "execution_count": 18,
  457. "metadata": {},
  458. "outputs": [
  459. {
  460. "data": {
  461. "text/plain": [
  462. "(True, True, True)"
  463. ]
  464. },
  465. "execution_count": 18,
  466. "metadata": {},
  467. "output_type": "execute_result"
  468. }
  469. ],
  470. "source": [
  471. "x.is_leaf, w.is_leaf, b.is_leaf"
  472. ]
  473. },
  474. {
  475. "cell_type": "code",
  476. "execution_count": 19,
  477. "metadata": {},
  478. "outputs": [
  479. {
  480. "data": {
  481. "text/plain": [
  482. "(False, False)"
  483. ]
  484. },
  485. "execution_count": 19,
  486. "metadata": {},
  487. "output_type": "execute_result"
  488. }
  489. ],
  490. "source": [
  491. "y.is_leaf, z.is_leaf"
  492. ]
  493. },
  494. {
  495. "cell_type": "code",
  496. "execution_count": 20,
  497. "metadata": {},
  498. "outputs": [
  499. {
  500. "data": {
  501. "text/plain": [
  502. "<AddBackward1 at 0x7f7ea53d8c88>"
  503. ]
  504. },
  505. "execution_count": 20,
  506. "metadata": {},
  507. "output_type": "execute_result"
  508. }
  509. ],
  510. "source": [
  511. "# grad_fn可以查看这个variable的反向传播函数,\n",
  512. "# z是add函数的输出,所以它的反向传播函数是AddBackward\n",
  513. "z.grad_fn "
  514. ]
  515. },
  516. {
  517. "cell_type": "code",
  518. "execution_count": 21,
  519. "metadata": {
  520. "scrolled": true
  521. },
  522. "outputs": [
  523. {
  524. "data": {
  525. "text/plain": [
  526. "((<MulBackward1 at 0x7f7ea53d8cf8>, 0),\n",
  527. " (<AccumulateGrad at 0x7f7ea53d8e80>, 0))"
  528. ]
  529. },
  530. "execution_count": 21,
  531. "metadata": {},
  532. "output_type": "execute_result"
  533. }
  534. ],
  535. "source": [
  536. "# next_functions保存grad_fn的输入,是一个tuple,tuple的元素也是Function\n",
  537. "# 第一个是y,它是乘法(mul)的输出,所以对应的反向传播函数y.grad_fn是MulBackward\n",
  538. "# 第二个是b,它是叶子节点,由用户创建,grad_fn为None,但是有\n",
  539. "z.grad_fn.next_functions "
  540. ]
  541. },
  542. {
  543. "cell_type": "code",
  544. "execution_count": 22,
  545. "metadata": {},
  546. "outputs": [
  547. {
  548. "data": {
  549. "text/plain": [
  550. "True"
  551. ]
  552. },
  553. "execution_count": 22,
  554. "metadata": {},
  555. "output_type": "execute_result"
  556. }
  557. ],
  558. "source": [
  559. "# variable的grad_fn对应着和图中的function相对应\n",
  560. "z.grad_fn.next_functions[0][0] == y.grad_fn"
  561. ]
  562. },
  563. {
  564. "cell_type": "code",
  565. "execution_count": 23,
  566. "metadata": {
  567. "scrolled": true
  568. },
  569. "outputs": [
  570. {
  571. "data": {
  572. "text/plain": [
  573. "((<AccumulateGrad at 0x7f7ea53d8a58>, 0), (None, 0))"
  574. ]
  575. },
  576. "execution_count": 23,
  577. "metadata": {},
  578. "output_type": "execute_result"
  579. }
  580. ],
  581. "source": [
  582. "# 第一个是w,叶子节点,需要求导,梯度是累加的\n",
  583. "# 第二个是x,叶子节点,不需要求导,所以为None\n",
  584. "y.grad_fn.next_functions"
  585. ]
  586. },
  587. {
  588. "cell_type": "code",
  589. "execution_count": 24,
  590. "metadata": {},
  591. "outputs": [
  592. {
  593. "data": {
  594. "text/plain": [
  595. "(None, None)"
  596. ]
  597. },
  598. "execution_count": 24,
  599. "metadata": {},
  600. "output_type": "execute_result"
  601. }
  602. ],
  603. "source": [
  604. "# 叶子节点的grad_fn是None\n",
  605. "w.grad_fn,x.grad_fn"
  606. ]
  607. },
  608. {
  609. "cell_type": "markdown",
  610. "metadata": {},
  611. "source": [
  612. "计算w的梯度的时候,需要用到x的数值(${\\partial y\\over \\partial w} = x $),这些数值在前向过程中会保存成buffer,在计算完梯度之后会自动清空。为了能够多次反向传播需要指定`retain_graph`来保留这些buffer。"
  613. ]
  614. },
  615. {
  616. "cell_type": "code",
  617. "execution_count": 25,
  618. "metadata": {
  619. "scrolled": true
  620. },
  621. "outputs": [
  622. {
  623. "data": {
  624. "text/plain": [
  625. "Variable containing:\n",
  626. " 1\n",
  627. "[torch.FloatTensor of size 1]"
  628. ]
  629. },
  630. "execution_count": 25,
  631. "metadata": {},
  632. "output_type": "execute_result"
  633. }
  634. ],
  635. "source": [
  636. "# 使用retain_graph来保存buffer\n",
  637. "z.backward(retain_graph=True)\n",
  638. "w.grad"
  639. ]
  640. },
  641. {
  642. "cell_type": "code",
  643. "execution_count": 26,
  644. "metadata": {},
  645. "outputs": [
  646. {
  647. "data": {
  648. "text/plain": [
  649. "Variable containing:\n",
  650. " 2\n",
  651. "[torch.FloatTensor of size 1]"
  652. ]
  653. },
  654. "execution_count": 26,
  655. "metadata": {},
  656. "output_type": "execute_result"
  657. }
  658. ],
  659. "source": [
  660. "# 多次反向传播,梯度累加,这也就是w中AccumulateGrad标识的含义\n",
  661. "z.backward()\n",
  662. "w.grad"
  663. ]
  664. },
  665. {
  666. "cell_type": "markdown",
  667. "metadata": {},
  668. "source": [
  669. "PyTorch使用的是动态图,它的计算图在每次前向传播时都是从头开始构建,所以它能够使用Python控制语句(如for、if等)根据需求创建计算图。这点在自然语言处理领域中很有用,它意味着你不需要事先构建所有可能用到的图的路径,图在运行时才构建。"
  670. ]
  671. },
  672. {
  673. "cell_type": "code",
  674. "execution_count": 27,
  675. "metadata": {},
  676. "outputs": [
  677. {
  678. "data": {
  679. "text/plain": [
  680. "Variable containing:\n",
  681. " 1\n",
  682. "[torch.FloatTensor of size 1]"
  683. ]
  684. },
  685. "execution_count": 27,
  686. "metadata": {},
  687. "output_type": "execute_result"
  688. }
  689. ],
  690. "source": [
  691. "def abs(x):\n",
  692. " if x.data[0]>0: return x\n",
  693. " else: return -x\n",
  694. "x = V(t.ones(1),requires_grad=True)\n",
  695. "y = abs(x)\n",
  696. "y.backward()\n",
  697. "x.grad"
  698. ]
  699. },
  700. {
  701. "cell_type": "code",
  702. "execution_count": 28,
  703. "metadata": {},
  704. "outputs": [
  705. {
  706. "name": "stdout",
  707. "output_type": "stream",
  708. "text": [
  709. "Variable containing:\n",
  710. "-1\n",
  711. "[torch.FloatTensor of size 1]\n",
  712. "\n"
  713. ]
  714. }
  715. ],
  716. "source": [
  717. "x = V(-1*t.ones(1),requires_grad=True)\n",
  718. "y = abs(x)\n",
  719. "y.backward()\n",
  720. "print(x.grad)"
  721. ]
  722. },
  723. {
  724. "cell_type": "code",
  725. "execution_count": 29,
  726. "metadata": {},
  727. "outputs": [
  728. {
  729. "data": {
  730. "text/plain": [
  731. "Variable containing:\n",
  732. " 0\n",
  733. " 0\n",
  734. " 0\n",
  735. " 6\n",
  736. " 3\n",
  737. " 2\n",
  738. "[torch.FloatTensor of size 6]"
  739. ]
  740. },
  741. "execution_count": 29,
  742. "metadata": {},
  743. "output_type": "execute_result"
  744. }
  745. ],
  746. "source": [
  747. "def f(x):\n",
  748. " result = 1\n",
  749. " for ii in x:\n",
  750. " if ii.data[0]>0: result=ii*result\n",
  751. " return result\n",
  752. "x = V(t.arange(-2,4),requires_grad=True)\n",
  753. "y = f(x) # y = x[3]*x[4]*x[5]\n",
  754. "y.backward()\n",
  755. "x.grad"
  756. ]
  757. },
  758. {
  759. "cell_type": "markdown",
  760. "metadata": {},
  761. "source": [
  762. "变量的`requires_grad`属性默认为False,如果某一个节点requires_grad被设置为True,那么所有依赖它的节点`requires_grad`都是True。这其实很好理解,对于$ \\textbf{x}\\to \\textbf{y} \\to \\textbf{z}$,x.requires_grad = True,当需要计算$\\partial z \\over \\partial x$时,根据链式法则,$\\frac{\\partial z}{\\partial x} = \\frac{\\partial z}{\\partial y} \\frac{\\partial y}{\\partial x}$,自然也需要求$ \\frac{\\partial z}{\\partial y}$,所以y.requires_grad会被自动标为True. \n",
  763. "\n",
  764. "`volatile=True`是另外一个很重要的标识,它能够将所有依赖于它的节点全部都设为`volatile=True`,其优先级比`requires_grad=True`高。`volatile=True`的节点不会求导,即使`requires_grad=True`,也无法进行反向传播。对于不需要反向传播的情景(如inference,即测试推理时),该参数可实现一定程度的速度提升,并节省约一半显存,因其不需要分配空间计算梯度。"
  765. ]
  766. },
  767. {
  768. "cell_type": "code",
  769. "execution_count": 30,
  770. "metadata": {},
  771. "outputs": [
  772. {
  773. "data": {
  774. "text/plain": [
  775. "(False, True, True)"
  776. ]
  777. },
  778. "execution_count": 30,
  779. "metadata": {},
  780. "output_type": "execute_result"
  781. }
  782. ],
  783. "source": [
  784. "x = V(t.ones(1))\n",
  785. "w = V(t.rand(1), requires_grad=True)\n",
  786. "y = x * w\n",
  787. "# y依赖于w,而w.requires_grad = True\n",
  788. "x.requires_grad, w.requires_grad, y.requires_grad"
  789. ]
  790. },
  791. {
  792. "cell_type": "code",
  793. "execution_count": 31,
  794. "metadata": {},
  795. "outputs": [
  796. {
  797. "data": {
  798. "text/plain": [
  799. "(False, True, False)"
  800. ]
  801. },
  802. "execution_count": 31,
  803. "metadata": {},
  804. "output_type": "execute_result"
  805. }
  806. ],
  807. "source": [
  808. "x = V(t.ones(1), volatile=True)\n",
  809. "w = V(t.rand(1), requires_grad = True)\n",
  810. "y = x * w\n",
  811. "# y依赖于w和x,但x.volatile = True,w.requires_grad = True\n",
  812. "x.requires_grad, w.requires_grad, y.requires_grad"
  813. ]
  814. },
  815. {
  816. "cell_type": "code",
  817. "execution_count": 32,
  818. "metadata": {},
  819. "outputs": [
  820. {
  821. "data": {
  822. "text/plain": [
  823. "(True, False, True)"
  824. ]
  825. },
  826. "execution_count": 32,
  827. "metadata": {},
  828. "output_type": "execute_result"
  829. }
  830. ],
  831. "source": [
  832. "x.volatile, w.volatile, y.volatile"
  833. ]
  834. },
  835. {
  836. "cell_type": "markdown",
  837. "metadata": {},
  838. "source": [
  839. "在反向传播过程中非叶子节点的导数计算完之后即被清空。若想查看这些变量的梯度,有两种方法:\n",
  840. "- 使用autograd.grad函数\n",
  841. "- 使用hook\n",
  842. "\n",
  843. "`autograd.grad`和`hook`方法都是很强大的工具,更详细的用法参考官方api文档,这里举例说明基础的使用。推荐使用`hook`方法,但是在实际使用中应尽量避免修改grad的值。"
  844. ]
  845. },
  846. {
  847. "cell_type": "code",
  848. "execution_count": 33,
  849. "metadata": {},
  850. "outputs": [
  851. {
  852. "data": {
  853. "text/plain": [
  854. "(True, True, True)"
  855. ]
  856. },
  857. "execution_count": 33,
  858. "metadata": {},
  859. "output_type": "execute_result"
  860. }
  861. ],
  862. "source": [
  863. "x = V(t.ones(3), requires_grad=True)\n",
  864. "w = V(t.rand(3), requires_grad=True)\n",
  865. "y = x * w\n",
  866. "# y依赖于w,而w.requires_grad = True\n",
  867. "z = y.sum()\n",
  868. "x.requires_grad, w.requires_grad, y.requires_grad"
  869. ]
  870. },
  871. {
  872. "cell_type": "code",
  873. "execution_count": 34,
  874. "metadata": {},
  875. "outputs": [
  876. {
  877. "data": {
  878. "text/plain": [
  879. "(Variable containing:\n",
  880. " 0.3776\n",
  881. " 0.1184\n",
  882. " 0.8554\n",
  883. " [torch.FloatTensor of size 3], Variable containing:\n",
  884. " 1\n",
  885. " 1\n",
  886. " 1\n",
  887. " [torch.FloatTensor of size 3], None)"
  888. ]
  889. },
  890. "execution_count": 34,
  891. "metadata": {},
  892. "output_type": "execute_result"
  893. }
  894. ],
  895. "source": [
  896. "# 非叶子节点grad计算完之后自动清空,y.grad是None\n",
  897. "z.backward()\n",
  898. "(x.grad, w.grad, y.grad)"
  899. ]
  900. },
  901. {
  902. "cell_type": "code",
  903. "execution_count": 35,
  904. "metadata": {},
  905. "outputs": [
  906. {
  907. "data": {
  908. "text/plain": [
  909. "(Variable containing:\n",
  910. " 1\n",
  911. " 1\n",
  912. " 1\n",
  913. " [torch.FloatTensor of size 3],)"
  914. ]
  915. },
  916. "execution_count": 35,
  917. "metadata": {},
  918. "output_type": "execute_result"
  919. }
  920. ],
  921. "source": [
  922. "# 第一种方法:使用grad获取中间变量的梯度\n",
  923. "x = V(t.ones(3), requires_grad=True)\n",
  924. "w = V(t.rand(3), requires_grad=True)\n",
  925. "y = x * w\n",
  926. "z = y.sum()\n",
  927. "# z对y的梯度,隐式调用backward()\n",
  928. "t.autograd.grad(z, y)"
  929. ]
  930. },
  931. {
  932. "cell_type": "code",
  933. "execution_count": 36,
  934. "metadata": {},
  935. "outputs": [
  936. {
  937. "name": "stdout",
  938. "output_type": "stream",
  939. "text": [
  940. "y的梯度: \n",
  941. " Variable containing:\n",
  942. " 1\n",
  943. " 1\n",
  944. " 1\n",
  945. "[torch.FloatTensor of size 3]\n",
  946. "\n"
  947. ]
  948. }
  949. ],
  950. "source": [
  951. "# 第二种方法:使用hook\n",
  952. "# hook是一个函数,输入是梯度,不应该有返回值\n",
  953. "def variable_hook(grad):\n",
  954. " print('y的梯度: \\r\\n',grad)\n",
  955. "\n",
  956. "x = V(t.ones(3), requires_grad=True)\n",
  957. "w = V(t.rand(3), requires_grad=True)\n",
  958. "y = x * w\n",
  959. "# 注册hook\n",
  960. "hook_handle = y.register_hook(variable_hook)\n",
  961. "z = y.sum()\n",
  962. "z.backward()\n",
  963. "\n",
  964. "# 除非你每次都要用hook,否则用完之后记得移除hook\n",
  965. "hook_handle.remove()"
  966. ]
  967. },
  968. {
  969. "cell_type": "markdown",
  970. "metadata": {},
  971. "source": [
  972. "最后再来看看variable中grad属性和backward函数`grad_variables`参数的含义,这里直接下结论:\n",
  973. "\n",
  974. "- variable $\\textbf{x}$的梯度是目标函数${f(x)} $对$\\textbf{x}$的梯度,$\\frac{df(x)}{dx} = (\\frac {df(x)}{dx_0},\\frac {df(x)}{dx_1},...,\\frac {df(x)}{dx_N})$,形状和$\\textbf{x}$一致。\n",
  975. "- 对于y.backward(grad_variables)中的grad_variables相当于链式求导法则中的$\\frac{\\partial z}{\\partial x} = \\frac{\\partial z}{\\partial y} \\frac{\\partial y}{\\partial x}$中的$\\frac{\\partial z}{\\partial y}$。z是目标函数,一般是一个标量,故而$\\frac{\\partial z}{\\partial y}$的形状与variable $\\textbf{y}$的形状一致。`z.backward()`在一定程度上等价于y.backward(grad_y)。`z.backward()`省略了grad_variables参数,是因为$z$是一个标量,而$\\frac{\\partial z}{\\partial z} = 1$"
  976. ]
  977. },
  978. {
  979. "cell_type": "code",
  980. "execution_count": 37,
  981. "metadata": {
  982. "scrolled": true
  983. },
  984. "outputs": [
  985. {
  986. "data": {
  987. "text/plain": [
  988. "Variable containing:\n",
  989. " 2\n",
  990. " 4\n",
  991. " 6\n",
  992. "[torch.FloatTensor of size 3]"
  993. ]
  994. },
  995. "execution_count": 37,
  996. "metadata": {},
  997. "output_type": "execute_result"
  998. }
  999. ],
  1000. "source": [
  1001. "x = V(t.arange(0,3), requires_grad=True)\n",
  1002. "y = x**2 + x*2\n",
  1003. "z = y.sum()\n",
  1004. "z.backward() # 从z开始反向传播\n",
  1005. "x.grad"
  1006. ]
  1007. },
  1008. {
  1009. "cell_type": "code",
  1010. "execution_count": 38,
  1011. "metadata": {
  1012. "scrolled": true
  1013. },
  1014. "outputs": [
  1015. {
  1016. "data": {
  1017. "text/plain": [
  1018. "Variable containing:\n",
  1019. " 2\n",
  1020. " 4\n",
  1021. " 6\n",
  1022. "[torch.FloatTensor of size 3]"
  1023. ]
  1024. },
  1025. "execution_count": 38,
  1026. "metadata": {},
  1027. "output_type": "execute_result"
  1028. }
  1029. ],
  1030. "source": [
  1031. "x = V(t.arange(0,3), requires_grad=True)\n",
  1032. "y = x**2 + x*2\n",
  1033. "z = y.sum()\n",
  1034. "y_grad_variables = V(t.Tensor([1,1,1])) # dz/dy\n",
  1035. "y.backward(y_grad_variables) #从y开始反向传播\n",
  1036. "x.grad"
  1037. ]
  1038. },
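{
"cell_type": "markdown",
"metadata": {},
"source": [
"As a further (hypothetical) sketch of grad_variables: passing something other than all ones weights each component of the upstream gradient, and the weights show up directly in x.grad.\n",
"\n",
"```python\n",
"x = V(t.arange(0, 3), requires_grad=True)\n",
"y = x**2 + x*2                          # dy_i/dx_i = 2*x_i + 2 = [2, 4, 6]\n",
"y.backward(V(t.Tensor([0.1, 1, 10])))   # weight each component of dz/dy\n",
"x.grad                                  # [0.2, 4, 60]: each gradient entry is scaled by its weight\n",
"```"
]
},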
  1039. {
  1040. "cell_type": "markdown",
  1041. "metadata": {},
  1042. "source": [
  1043. "另外值得注意的是,只有对variable的操作才能使用autograd,如果对variable的data直接进行操作,将无法使用反向传播。除了对参数初始化,一般我们不会修改variable.data的值。"
  1044. ]
  1045. },
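{
"cell_type": "markdown",
"metadata": {},
"source": [
"A minimal sketch of the pitfall (hypothetical names, same legacy Variable API as above): the result of an operation on `x.data` is a plain tensor that carries no graph information, so gradients cannot flow through it.\n",
"\n",
"```python\n",
"x = V(t.ones(3), requires_grad=True)\n",
"\n",
"# operating on the Variable keeps the graph intact\n",
"y = (x * 2).sum()\n",
"y.backward()\n",
"print(x.grad)    # all twos, as expected\n",
"\n",
"# operating on x.data works on a plain tensor instead:\n",
"# the result has no grad_fn and cannot be back-propagated through\n",
"z = x.data * 2\n",
"print(type(z))   # torch.FloatTensor, not a Variable\n",
"```"
]
},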
  1046. {
  1047. "cell_type": "markdown",
  1048. "metadata": {},
  1049. "source": [
  1050. "在PyTorch中计算图的特点可总结如下:\n",
  1051. "\n",
  1052. "- autograd根据用户对variable的操作构建其计算图。对变量的操作抽象为`Function`。\n",
  1053. "- 对于那些不是任何函数(Function)的输出,由用户创建的节点称为叶子节点,叶子节点的`grad_fn`为None。叶子节点中需要求导的variable,具有`AccumulateGrad`标识,因其梯度是累加的。\n",
  1054. "- variable默认是不需要求导的,即`requires_grad`属性默认为False,如果某一个节点requires_grad被设置为True,那么所有依赖它的节点`requires_grad`都为True。\n",
  1055. "- variable的`volatile`属性默认为False,如果某一个variable的`volatile`属性被设为True,那么所有依赖它的节点`volatile`属性都为True。volatile属性为True的节点不会求导,volatile的优先级比`requires_grad`高。\n",
  1056. "- 多次反向传播时,梯度是累加的。反向传播的中间缓存会被清空,为进行多次反向传播需指定`retain_graph`=True来保存这些缓存。\n",
  1057. "- 非叶子节点的梯度计算完之后即被清空,可以使用`autograd.grad`或`hook`技术获取非叶子节点的值。\n",
  1058. "- variable的grad与data形状一致,应避免直接修改variable.data,因为对data的直接操作无法利用autograd进行反向传播\n",
  1059. "- 反向传播函数`backward`的参数`grad_variables`可以看成链式求导的中间结果,如果是标量,可以省略,默认为1\n",
  1060. "- PyTorch采用动态图设计,可以很方便地查看中间层的输出,动态的设计计算图结构。"
  1061. ]
  1062. },
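{
"cell_type": "markdown",
"metadata": {},
"source": [
"Because gradients accumulate across backward passes, training loops usually clear them before each new pass. A minimal sketch (hypothetical names, not part of the original example):\n",
"\n",
"```python\n",
"x = V(t.ones(3), requires_grad=True)\n",
"for _ in range(2):\n",
"    if x.grad is not None:\n",
"        x.grad.data.zero_()  # clear the accumulated gradient in place\n",
"    y = (x * 2).sum()\n",
"    y.backward()\n",
"    print(x.grad)            # stays at [2, 2, 2] instead of growing each iteration\n",
"```"
]
},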
  1063. {
  1064. "cell_type": "markdown",
  1065. "metadata": {},
  1066. "source": [
  1067. "### 3.2.3 扩展autograd\n",
  1068. "\n",
  1069. "\n",
  1070. "目前绝大多数函数都可以使用`autograd`实现反向求导,但如果需要自己写一个复杂的函数,不支持自动反向求导怎么办? 写一个`Function`,实现它的前向传播和反向传播代码,`Function`对应于计算图中的矩形, 它接收参数,计算并返回结果。下面给出一个例子。\n",
  1071. "\n",
  1072. "```python\n",
  1073. "\n",
  1074. "class Mul(Function):\n",
  1075. " \n",
  1076. " @staticmethod\n",
  1077. " def forward(ctx, w, x, b, x_requires_grad = True):\n",
  1078. " ctx.x_requires_grad = x_requires_grad\n",
  1079. " ctx.save_for_backward(w,x)\n",
  1080. " output = w * x + b\n",
  1081. " return output\n",
  1082. " \n",
  1083. " @staticmethod\n",
  1084. " def backward(ctx, grad_output):\n",
  1085. " w,x = ctx.saved_variables\n",
  1086. " grad_w = grad_output * x\n",
  1087. " if ctx.x_requires_grad:\n",
  1088. " grad_x = grad_output * w\n",
  1089. " else:\n",
  1090. " grad_x = None\n",
  1091. " grad_b = grad_output * 1\n",
  1092. " return grad_w, grad_x, grad_b, None\n",
  1093. "```\n",
  1094. "\n",
  1095. "分析如下:\n",
  1096. "\n",
  1097. "- 自定义的Function需要继承autograd.Function,没有构造函数`__init__`,forward和backward函数都是静态方法\n",
  1098. "- forward函数的输入和输出都是Tensor,backward函数的输入和输出都是Variable\n",
  1099. "- backward函数的输出和forward函数的输入一一对应,backward函数的输入和forward函数的输出一一对应\n",
  1100. "- backward函数的grad_output参数即t.autograd.backward中的`grad_variables`\n",
  1101. "- 如果某一个输入不需要求导,直接返回None,如forward中的输入参数x_requires_grad显然无法对它求导,直接返回None即可\n",
  1102. "- 反向传播可能需要利用前向传播的某些中间结果,需要进行保存,否则前向传播结束后这些对象即被释放\n",
  1103. "\n",
  1104. "Function的使用利用Function.apply(variable)"
  1105. ]
  1106. },
  1107. {
  1108. "cell_type": "code",
  1109. "execution_count": 39,
  1110. "metadata": {},
  1111. "outputs": [],
  1112. "source": [
  1113. "from torch.autograd import Function\n",
  1114. "class MultiplyAdd(Function):\n",
  1115. " \n",
  1116. " @staticmethod\n",
  1117. " def forward(ctx, w, x, b): \n",
  1118. " print('type in forward',type(x))\n",
  1119. " ctx.save_for_backward(w,x)\n",
  1120. " output = w * x + b\n",
  1121. " return output\n",
  1122. " \n",
  1123. " @staticmethod\n",
  1124. " def backward(ctx, grad_output): \n",
  1125. " w,x = ctx.saved_variables\n",
  1126. " print('type in backward',type(x))\n",
  1127. " grad_w = grad_output * x\n",
  1128. " grad_x = grad_output * w\n",
  1129. " grad_b = grad_output * 1\n",
  1130. " return grad_w, grad_x, grad_b "
  1131. ]
  1132. },
  1133. {
  1134. "cell_type": "code",
  1135. "execution_count": 40,
  1136. "metadata": {
  1137. "scrolled": true
  1138. },
  1139. "outputs": [
  1140. {
  1141. "name": "stdout",
  1142. "output_type": "stream",
  1143. "text": [
  1144. "开始前向传播\n",
  1145. "type in backwardtype in forward <class 'torch.autograd.variable.Variable'><class 'torch.FloatTensor'>\n",
  1146. "\n",
  1147. "开始反向传播\n"
  1148. ]
  1149. },
  1150. {
  1151. "data": {
  1152. "text/plain": [
  1153. "(None, Variable containing:\n",
  1154. " 1\n",
  1155. " [torch.FloatTensor of size 1], Variable containing:\n",
  1156. " 1\n",
  1157. " [torch.FloatTensor of size 1])"
  1158. ]
  1159. },
  1160. "execution_count": 40,
  1161. "metadata": {},
  1162. "output_type": "execute_result"
  1163. }
  1164. ],
  1165. "source": [
  1166. "x = V(t.ones(1))\n",
  1167. "w = V(t.rand(1), requires_grad = True)\n",
  1168. "b = V(t.rand(1), requires_grad = True)\n",
  1169. "print('开始前向传播')\n",
  1170. "z=MultiplyAdd.apply(w, x, b)\n",
  1171. "print('开始反向传播')\n",
  1172. "z.backward()\n",
  1173. "\n",
  1174. "# x不需要求导,中间过程还是会计算它的导数,但随后被清空\n",
  1175. "x.grad, w.grad, b.grad"
  1176. ]
  1177. },
  1178. {
  1179. "cell_type": "code",
  1180. "execution_count": 41,
  1181. "metadata": {},
  1182. "outputs": [
  1183. {
  1184. "name": "stdout",
  1185. "output_type": "stream",
  1186. "text": [
  1187. "开始前向传播\n",
  1188. "type in forward <class 'torch.FloatTensor'>\n",
  1189. "开始反向传播\n",
  1190. "type in backward <class 'torch.autograd.variable.Variable'>\n"
  1191. ]
  1192. },
  1193. {
  1194. "data": {
  1195. "text/plain": [
  1196. "(Variable containing:\n",
  1197. " 1\n",
  1198. " [torch.FloatTensor of size 1], Variable containing:\n",
  1199. " 0.9633\n",
  1200. " [torch.FloatTensor of size 1], Variable containing:\n",
  1201. " 1\n",
  1202. " [torch.FloatTensor of size 1])"
  1203. ]
  1204. },
  1205. "execution_count": 41,
  1206. "metadata": {},
  1207. "output_type": "execute_result"
  1208. }
  1209. ],
  1210. "source": [
  1211. "x = V(t.ones(1))\n",
  1212. "w = V(t.rand(1), requires_grad = True)\n",
  1213. "b = V(t.rand(1), requires_grad = True)\n",
  1214. "print('开始前向传播')\n",
  1215. "z=MultiplyAdd.apply(w,x,b)\n",
  1216. "print('开始反向传播')\n",
  1217. "\n",
  1218. "# 调用MultiplyAdd.backward\n",
  1219. "# 输出grad_w, grad_x, grad_b\n",
  1220. "z.grad_fn.apply(V(t.ones(1)))"
  1221. ]
  1222. },
  1223. {
  1224. "cell_type": "markdown",
  1225. "metadata": {},
  1226. "source": [
  1227. "之所以forward函数的输入是tensor,而backward函数的输入是variable,是为了实现高阶求导。backward函数的输入输出虽然是variable,但在实际使用时autograd.Function会将输入variable提取为tensor,并将计算结果的tensor封装成variable返回。在backward函数中,之所以也要对variable进行操作,是为了能够计算梯度的梯度(backward of backward)。下面举例说明,有关torch.autograd.grad的更详细使用请参照文档。"
  1228. ]
  1229. },
  1230. {
  1231. "cell_type": "code",
  1232. "execution_count": 42,
  1233. "metadata": {},
  1234. "outputs": [
  1235. {
  1236. "data": {
  1237. "text/plain": [
  1238. "(Variable containing:\n",
  1239. " 10\n",
  1240. " [torch.FloatTensor of size 1],)"
  1241. ]
  1242. },
  1243. "execution_count": 42,
  1244. "metadata": {},
  1245. "output_type": "execute_result"
  1246. }
  1247. ],
  1248. "source": [
  1249. "x = V(t.Tensor([5]), requires_grad=True)\n",
  1250. "y = x ** 2\n",
  1251. "grad_x = t.autograd.grad(y, x, create_graph=True)\n",
  1252. "grad_x # dy/dx = 2 * x"
  1253. ]
  1254. },
  1255. {
  1256. "cell_type": "code",
  1257. "execution_count": 43,
  1258. "metadata": {},
  1259. "outputs": [
  1260. {
  1261. "data": {
  1262. "text/plain": [
  1263. "(Variable containing:\n",
  1264. " 2\n",
  1265. " [torch.FloatTensor of size 1],)"
  1266. ]
  1267. },
  1268. "execution_count": 43,
  1269. "metadata": {},
  1270. "output_type": "execute_result"
  1271. }
  1272. ],
  1273. "source": [
  1274. "grad_grad_x = t.autograd.grad(grad_x[0],x)\n",
  1275. "grad_grad_x # 二阶导数 d(2x)/dx = 2"
  1276. ]
  1277. },
  1278. {
  1279. "cell_type": "markdown",
  1280. "metadata": {},
  1281. "source": [
  1282. "这种设计虽然能让`autograd`具有高阶求导功能,但其也限制了Tensor的使用,因autograd中反向传播的函数只能利用当前已经有的Variable操作。这个设计是在`0.2`版本新加入的,为了更好的灵活性,也为了兼容旧版本的代码,PyTorch还提供了另外一种扩展autograd的方法。PyTorch提供了一个装饰器`@once_differentiable`,能够在backward函数中自动将输入的variable提取成tensor,把计算结果的tensor自动封装成variable。有了这个特性我们就能够很方便的使用numpy/scipy中的函数,操作不再局限于variable所支持的操作。但是这种做法正如名字中所暗示的那样只能求导一次,它打断了反向传播图,不再支持高阶求导。\n",
  1283. "\n",
  1284. "\n",
  1285. "上面所描述的都是新式Function,还有个legacy Function,可以带有`__init__`方法,`forward`和`backwad`函数也不需要声明为`@staticmethod`,但随着版本更迭,此类Function将越来越少遇到,在此不做更多介绍。\n",
  1286. "\n",
  1287. "此外在实现了自己的Function之后,还可以使用`gradcheck`函数来检测实现是否正确。`gradcheck`通过数值逼近来计算梯度,可能具有一定的误差,通过控制`eps`的大小可以控制容忍的误差。\n",
  1288. "关于这部份的内容可以参考github上开发者们的讨论[^3]。\n",
  1289. "\n",
  1290. "[^3]: https://github.com/pytorch/pytorch/pull/1016"
  1291. ]
  1292. },
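{
"cell_type": "markdown",
"metadata": {},
"source": [
"A hedged sketch of `@once_differentiable` (not from the original text; it assumes the decorator exposed as `torch.autograd.function.once_differentiable`, and the exact saving/unpacking behaviour may differ between versions):\n",
"\n",
"```python\n",
"import numpy as np\n",
"from torch.autograd import Function\n",
"from torch.autograd.function import once_differentiable\n",
"\n",
"class NumpySigmoid(Function):\n",
"\n",
"    @staticmethod\n",
"    def forward(ctx, x):\n",
"        # forward already works on tensors, so numpy can be used freely\n",
"        out = t.from_numpy(1.0 / (1.0 + np.exp(-x.numpy()))).float()\n",
"        ctx.save_for_backward(out)\n",
"        return out\n",
"\n",
"    @staticmethod\n",
"    @once_differentiable\n",
"    def backward(ctx, grad_output):\n",
"        # with once_differentiable, grad_output arrives as a plain tensor,\n",
"        # so numpy-style code is fine here, but the result can no longer\n",
"        # be differentiated again (no backward of backward)\n",
"        output, = ctx.saved_tensors\n",
"        return grad_output * output * (1 - output)\n",
"```\n",
"\n",
"It is used the same way as any other Function, e.g. `NumpySigmoid.apply(V(t.randn(3), requires_grad=True))`."
]
},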
  1293. {
  1294. "cell_type": "markdown",
  1295. "metadata": {},
  1296. "source": [
  1297. "下面举例说明如何利用Function实现sigmoid Function。"
  1298. ]
  1299. },
  1300. {
  1301. "cell_type": "code",
  1302. "execution_count": 44,
  1303. "metadata": {},
  1304. "outputs": [],
  1305. "source": [
  1306. "class Sigmoid(Function):\n",
  1307. " \n",
  1308. " @staticmethod\n",
  1309. " def forward(ctx, x): \n",
  1310. " output = 1 / (1 + t.exp(-x))\n",
  1311. " ctx.save_for_backward(output)\n",
  1312. " return output\n",
  1313. " \n",
  1314. " @staticmethod\n",
  1315. " def backward(ctx, grad_output): \n",
  1316. " output, = ctx.saved_variables\n",
  1317. " grad_x = output * (1 - output) * grad_output\n",
  1318. " return grad_x "
  1319. ]
  1320. },
  1321. {
  1322. "cell_type": "code",
  1323. "execution_count": 45,
  1324. "metadata": {},
  1325. "outputs": [
  1326. {
  1327. "data": {
  1328. "text/plain": [
  1329. "True"
  1330. ]
  1331. },
  1332. "execution_count": 45,
  1333. "metadata": {},
  1334. "output_type": "execute_result"
  1335. }
  1336. ],
  1337. "source": [
  1338. "# 采用数值逼近方式检验计算梯度的公式对不对\n",
  1339. "test_input = V(t.randn(3,4), requires_grad=True)\n",
  1340. "t.autograd.gradcheck(Sigmoid.apply, (test_input,), eps=1e-3)"
  1341. ]
  1342. },
  1343. {
  1344. "cell_type": "code",
  1345. "execution_count": 46,
  1346. "metadata": {},
  1347. "outputs": [
  1348. {
  1349. "name": "stdout",
  1350. "output_type": "stream",
  1351. "text": [
  1352. "232 µs ± 68.6 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)\n",
  1353. "191 µs ± 6.1 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)\n",
  1354. "215 µs ± 23.1 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)\n"
  1355. ]
  1356. }
  1357. ],
  1358. "source": [
  1359. "def f_sigmoid(x):\n",
  1360. " y = Sigmoid.apply(x)\n",
  1361. " y.backward(t.ones(x.size()))\n",
  1362. " \n",
  1363. "def f_naive(x):\n",
  1364. " y = 1/(1 + t.exp(-x))\n",
  1365. " y.backward(t.ones(x.size()))\n",
  1366. " \n",
  1367. "def f_th(x):\n",
  1368. " y = t.sigmoid(x)\n",
  1369. " y.backward(t.ones(x.size()))\n",
  1370. " \n",
  1371. "x=V(t.randn(100, 100), requires_grad=True)\n",
  1372. "%timeit -n 100 f_sigmoid(x)\n",
  1373. "%timeit -n 100 f_naive(x)\n",
  1374. "%timeit -n 100 f_th(x)"
  1375. ]
  1376. },
  1377. {
  1378. "cell_type": "markdown",
  1379. "metadata": {},
  1380. "source": [
  1381. "显然`f_sigmoid`要比单纯利用`autograd`加减和乘方操作实现的函数快不少,因为f_sigmoid的backward优化了反向传播的过程。另外可以看出系统实现的buildin接口(t.sigmoid)更快。"
  1382. ]
  1383. },
  1384. {
  1385. "cell_type": "markdown",
  1386. "metadata": {},
  1387. "source": [
  1388. "### 3.2.4 小试牛刀: 用Variable实现线性回归\n",
  1389. "在上一节中讲解了利用tensor实现线性回归,在这一小节中,将讲解如何利用autograd/Variable实现线性回归,以此感受autograd的便捷之处。"
  1390. ]
  1391. },
  1392. {
  1393. "cell_type": "code",
  1394. "execution_count": 47,
  1395. "metadata": {},
  1396. "outputs": [],
  1397. "source": [
  1398. "import torch as t\n",
  1399. "from torch.autograd import Variable as V\n",
  1400. "%matplotlib inline\n",
  1401. "from matplotlib import pyplot as plt\n",
  1402. "from IPython import display"
  1403. ]
  1404. },
  1405. {
  1406. "cell_type": "code",
  1407. "execution_count": 48,
  1408. "metadata": {},
  1409. "outputs": [],
  1410. "source": [
  1411. "# 设置随机数种子,为了在不同人电脑上运行时下面的输出一致\n",
  1412. "t.manual_seed(1000) \n",
  1413. "\n",
  1414. "def get_fake_data(batch_size=8):\n",
  1415. " ''' 产生随机数据:y = x*2 + 3,加上了一些噪声'''\n",
  1416. " x = t.rand(batch_size,1) * 20\n",
  1417. " y = x * 2 + (1 + t.randn(batch_size, 1))*3\n",
  1418. " return x, y"
  1419. ]
  1420. },
  1421. {
  1422. "cell_type": "code",
  1423. "execution_count": 49,
  1424. "metadata": {},
  1425. "outputs": [
  1426. {
  1427. "data": {
  1428. "text/plain": [
  1429. "<matplotlib.collections.PathCollection at 0x7f7e9ac91860>"
  1430. ]
  1431. },
  1432. "execution_count": 49,
  1433. "metadata": {},
  1434. "output_type": "execute_result"
  1435. },
  1436. {
  1437. "data": {
  1438. "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXQAAAD8CAYAAABn919SAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4yLCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcvhp/UCwAAD11JREFUeJzt3V+MXGd9xvHvU8eU5U+1gWxQvEAN\nKHKpSLHpKkobKaJA64AQMVFRSVtktbShEqhQkEVML4CLKkHmj6peRAokTS5oVArGQS3FWCFtWqmk\n3eAQO3XdFMqfrN14KSzQsqKO+fVix2Bv1t6Z9c7OzLvfj7SamXfP6DxaK0/mvOedc1JVSJJG308N\nOoAkaXVY6JLUCAtdkhphoUtSIyx0SWqEhS5JjbDQJakRFrokNcJCl6RGXLSWO7vkkktq8+bNa7lL\nSRp5Dz744LeqamK57da00Ddv3sz09PRa7lKSRl6Sr3eznVMuktQIC12SGmGhS1Ijli30JE9N8s9J\nvpzkkSTv74y/IMkDSR5N8pdJntL/uJKkc+nmE/oPgVdU1UuBrcC1Sa4CPgB8pKouB74DvLl/MSVJ\ny1l2lUst3AHjfzovN3Z+CngF8Jud8buA9wG3rn5ESRpN+w7OsGf/UY7NzbNpfIxd27ewY9tk3/bX\n1Rx6kg1JHgJOAAeArwBzVfVEZ5PHgP6llKQRs+/gDLv3HmJmbp4CZubm2b33EPsOzvRtn10VelWd\nqqqtwHOBK4EXL7XZUu9NcmOS6STTs7OzK08qSSNkz/6jzJ88ddbY/MlT7Nl/tG/77GmVS1XNAX8H\nXAWMJzk9ZfNc4Ng53nNbVU1V1dTExLJfdJKkJhybm+9pfDV0s8plIsl45/kY8CrgCHAf8OudzXYC\n9/QrpCSNmk3jYz2Nr4ZuPqFfBtyX5GHgX4ADVfXXwLuBdyb5D+DZwO19SylJI2bX9i2Mbdxw1tjY\nxg3s2r6lb/vsZpXLw8C2Jca/ysJ8uiRpkdOrWdZylcuaXpxLktaTHdsm+1rgi/nVf0lqhIUuSY2w\n0CWpERa6JDXCQpekRljoktQIC12SGmGhS1IjLHRJaoSFLkmNsNAlqREWuiQ1wkKXpEZY6JLUCAtd\nkhphoUtSIyx0SWqEhS5JjbDQJakRFrokNcJCl6RGWOiS1AgLXZIaYaFLUiMsdElqhIUuSY2w0CWp\nERa6JDXCQpekRljoktQIC12SGmGhS1IjLHRJaoSFLkmNsNAlqRHLFnqS5yW5L8mRJI8keXtn/H1J\nZpI81Pl5Tf/jSpLO5aIutnkCeFdVfSnJM4EHkxzo/O4jVfXB/sWTJHVr2UKvquPA8c7z7yc5Akz2\nO5gkqTc9zaEn2QxsAx7oDL0tycNJ7khy8SpnkyT1oOtCT/IM4FPAO6rqe8CtwIuArSx8gv/QOd53\nY5LpJNOzs7OrEFmStJSuCj3JRhbK/ONVtRegqh6vqlNV9SPgo8CVS723qm6rqqmqmpqYmFit3JKk\nRbpZ5RLgduBIVX34jPHLztjs9cDh1Y8nSepWN6tcrgbeBBxK8lBn7D3ADUm2AgV8DXhLXxJKkrrS\nzSqXfwSyxK8+u/pxJEkr5TdFJakRFrokNcJCl6RGdHNSVGrSvoMz7Nl/lGNz82waH2PX9i3s2OaX\noDW6LHStS/sOzrB77yHmT54CYGZunt17DwFY6hpZTrloXdqz/+iPy/y0+ZOn2LP/6IASSRfOQte6\ndGxuvqdxaRRY6FqXNo2P9TQujQILXevSru1bGNu44ayxsY0b2LV9y4ASSRfOk6Jal06f+HSVi1pi\noWvd2rFt0gJXU5xykaRGWOiS1AgLXZIaYaFLUiMsdElqhKtcJKlHw3phNwtdknowzBd2c8pFknow\nzBd2s9AlqQfDfGE3C12SejDMF3az0CWpB8N8YTdPikpSD4b5wm4WuiT1aFgv7OaUiyQ1wkKXpEZY\n6JLUCAtdkhphoUtSIyx0SWqEhS5JjbDQJakRFrokNcJCl6RGWOiS1AgLXZIasWyhJ3lekvuSHEny\nSJK3d8afleRAkkc7jxf3P64k6Vy6+YT+BPCuqnoxcBXw1iQ/D9wE3FtVlwP3dl5rBO07OMPVt3yB\nF9z0N1x9yxfYd3Bm0JEkrcCyhV5Vx6vqS53n3weOAJPAdcBdnc3uAnb0K6T65/QNb2fm5il+csNb\nS10aPT3NoSfZDGwDHgCeU1XHYaH0gUtXO5z6b5hveCupN10XepJnAJ8C3lFV3+vhfTcmmU4yPTs7\nu5KM6qNhvuGtpN50VehJNrJQ5h+vqr2d4ceTXNb5/WXAiaXeW1W3VdVUVU1NTEysRmatomG+4a2k\n3nSzyiXA7cCRqvrwGb/6DLCz83wncM/qx1O/DfMNbyX1ppt7il4NvAk4lOShzth7gFuATyR5M/AN\n4A39iah+GuYb3krqTapqzXY2NTVV09PTa7Y/SWpBkgeramq57fymqCQ1wkKXpEZY6JLUCAtdkhph\noUtSI7pZtqhVsu/gjMsDJfWNhb5GTl8E6/R1U05fBAuw1CWtCgt9jZzvIlgW+uB41KSWWOhrxItg\nDR+PmtQaT4quES+CNXy8dLBaY6GvES+CNXw8alJrLPQ1smPbJDdffwWT42MEmBwf4+brr/DQfoA8\nalJrnENfQzu2TVrgQ2TX9i1nzaGDR00abRa61i0vHazWWOha1zxqUkucQ5ekRljoktQIC12SGmGh\nS1IjLHRJaoSFLkmNsNAlqREWuiQ1wkKXpEZY6JLUCAtdkhphoUtSIyx0SWqEhS5JjbDQJakRFrok\nNWIkbnCx7+CMd5WRpGUMfaHvOzhz1n0fZ+bm2b33EIClLklnGPoplz37j551E1+A+ZOn2LP/6IAS\nSdJwGvpCPzY339O4JK1XQ1/om8bHehqXpPVq2UJPckeSE0kOnzH2viQzSR7q/LymXwF3bd/C2MYN\nZ42NbdzAru1b+rVLSRpJ3XxCvxO4donxj1TV1s7PZ1c31k/s2DbJzddfweT4GAEmx8e4+forPCEq\nSYssu8qlqu5Psrn/Uc5tx7ZJC1ySlnEhc+hvS/JwZ0rm4lVLJElakZUW+q3Ai4CtwHHgQ+faMMmN\nSaaTTM/Ozq5wd5Kk5ayo0Kvq8ao6VVU/Aj4KXHmebW+rqqmqmpqYmFhpTknSMlZU6EkuO+Pl64HD\n59pWkrQ2lj0pmuRu4OXAJUkeA94LvDzJVqCArwFv6WNGSVIXulnlcsMSw7f3IYsk6QIM/TdFJUnd\nsdAlqREWuiQ1wkKXpEZY6JLUCAtdkhphoUtSIyx0SWqEhS5JjbDQJakRFrokNcJCl6RGWOiS1AgL\nXZIaYaFLUiMsdElqhIUuSY2w0CWpERa6JDXCQpekRljoktQIC12SGmGhS1IjLHRJaoSFLkmNsNAl\nqREWuiQ1wkKXpEZY6JLUCAt
dkhphoUtSIyx0SWqEhS5JjbDQJakRFrokNcJCl6RGLFvoSe5IciLJ\n4TPGnpXkQJJHO48X9zemJGk53XxCvxO4dtHYTcC9VXU5cG/ntSRpgJYt9Kq6H/j2ouHrgLs6z+8C\ndqxyLklSj1Y6h/6cqjoO0Hm8dPUiSZJWou8nRZPcmGQ6yfTs7Gy/dydJ69ZKC/3xJJcBdB5PnGvD\nqrqtqqaqampiYmKFu5MkLWelhf4ZYGfn+U7gntWJI0laqW6WLd4N/BOwJcljSd4M3AL8apJHgV/t\nvJYkDdBFy21QVTec41evXOUskqQL4DdFJakRFrokNcJCl6RGWOiS1AgLXZIaYaFLUiMsdElqhIUu\nSY2w0CWpERa6JDXCQpekRix7LZdRs+/gDHv2H+XY3DybxsfYtX0LO7ZNDjqWJPVdU4W+7+AMu/ce\nYv7kKQBm5ubZvfcQgKUuqXlNTbns2X/0x2V+2vzJU+zZf3RAiSRp7TRV6Mfm5nsal6SWNFXom8bH\nehqXpJY0Vei7tm9hbOOGs8bGNm5g1/YtA0okSWunqZOip098uspF0nrUVKHDQqlb4JLWo6amXCRp\nPbPQJakRFrokNcJCl6RGWOiS1IhU1drtLJkFvr7MZpcA31qDOBfCjKtnFHKacXWMQkYYzpw/W1UT\ny220poXejSTTVTU16BznY8bVMwo5zbg6RiEjjE7OpTjlIkmNsNAlqRHDWOi3DTpAF8y4ekYhpxlX\nxyhkhNHJ+SRDN4cuSVqZYfyELklagaEq9CRfS3IoyUNJpgedZylJxpN8Msm/JTmS5JcGnelMSbZ0\n/n6nf76X5B2DzrVYkj9K8kiSw0nuTvLUQWdaLMnbO/keGaa/YZI7kpxIcviMsWclOZDk0c7jxUOY\n8Q2dv+WPkgx8Fck5Mu7p/Lf9cJJPJxkfZMZeDVWhd/xKVW0d4mVDfwp8rqp+DngpcGTAec5SVUc7\nf7+twC8CPwA+PeBYZ0kyCfwhMFVVLwE2AG8cbKqzJXkJ8PvAlSz8O782yeWDTfVjdwLXLhq7Cbi3\nqi4H7u28HqQ7eXLGw8D1wP1rnmZpd/LkjAeAl1TVLwD/Duxe61AXYhgLfWgl+RngGuB2gKr6v6qa\nG2yq83ol8JWqWu7LXINwETCW5CLgacCxAedZ7MXAF6vqB1X1BPD3wOsHnAmAqrof+Pai4euAuzrP\n7wJ2rGmoRZbKWFVHqmpobvB7joyf7/x7A3wReO6aB7sAw1boBXw+yYNJbhx0mCW8EJgF/jzJwSQf\nS/L0QYc6jzcCdw86xGJVNQN8EPgGcBz4blV9frCpnuQwcE2SZyd5GvAa4HkDznQ+z6mq4wCdx0sH\nnKcFvwv87aBD9GLYCv3qqnoZ8GrgrUmuGXSgRS4CXgbcWlXbgP9l8Ie2S0ryFOB1wF8NOstinfnd\n64AXAJuApyf57cGmOltVHQE+wMIh+OeALwNPnPdNakaSP2bh3/vjg87Si6Eq9Ko61nk8wcK875WD\nTfQkjwGPVdUDndefZKHgh9GrgS9V1eODDrKEVwH/WVWzVXUS2Av88oAzPUlV3V5VL6uqa1g4NH90\n0JnO4/EklwF0Hk8MOM/ISrITeC3wWzVi67qHptCTPD3JM08/B36NhcPeoVFV/wV8M8npu06/EvjX\nAUY6nxsYwumWjm8AVyV5WpKw8HccqpPLAEku7Tw+n4WTecP69wT4DLCz83wncM8As4ysJNcC7wZe\nV1U/GHSeXg3NF4uSvJCfrMa4CPiLqvqTAUZaUpKtwMeApwBfBX6nqr4z2FRn68z5fhN4YVV9d9B5\nlpLk/cBvsHBYexD4var64WBTnS3JPwDPBk4C76yqewccCYAkdwMvZ+GqgI8D7wX2AZ8Ans/C/zDf\nUFWLT5wOOuO3gT8DJoA54KGq2j5kGXcDPw38d2ezL1bVHwwk4AoMTaFLki7M0Ey5SJIujIUuSY2w\n0CWpERa6JDXCQpekRljoktQIC12SGmGhS1Ij/h/CJYJPfXoR0gAAAABJRU5ErkJggg==\n",
  1439. "text/plain": [
  1440. "<Figure size 432x288 with 1 Axes>"
  1441. ]
  1442. },
  1443. "metadata": {},
  1444. "output_type": "display_data"
  1445. }
  1446. ],
  1447. "source": [
  1448. "# 来看看产生x-y分布是什么样的\n",
  1449. "x, y = get_fake_data()\n",
  1450. "plt.scatter(x.squeeze().numpy(), y.squeeze().numpy())"
  1451. ]
  1452. },
  1453. {
  1454. "cell_type": "code",
  1455. "execution_count": 50,
  1456. "metadata": {
  1457. "scrolled": false
  1458. },
  1459. "outputs": [
  1460. {
  1461. "data": {
  1462. "image/png": "iVBORw0KGgoAAAANSUhEUgAAAXwAAAD8CAYAAAB0IB+mAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAADl0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uIDIuMi4yLCBo\ndHRwOi8vbWF0cGxvdGxpYi5vcmcvhp/UCwAAIABJREFUeJzt3Xl4VOX5xvHvkz1AICA7EsMaQJag\nEQXUWrWCW6EuqP1VsWqpbW0LCArWVtwqrVbp4qXFaqVWawBRBFFqBUWtG0gWIOyCLGEnhCVAlvf3\nRwYLIZMMyayZ+3NduZg5c+bM09PjnZP3vPMcc84hIiINX0yoCxARkeBQ4IuIRAkFvohIlFDgi4hE\nCQW+iEiUUOCLiEQJBb6ISJRQ4IuIRAkFvohIlIgL5oe1bNnSpaenB/MjRSTK5G/ZV+PrqY3iaZ+a\nTKxZkCqqvyVLluxyzrWq73aCGvjp6eksXrw4mB8pIlFm8OQFbCkqOWm5GfzlprO4sm+7EFRVP2a2\n0R/b8XlIx8xizWypmc31PO9kZp+Z2RozyzazBH8UJCJSH+OHZJAcH3vCshiDSVefGZFh70+nMob/\nS6DguOe/A55yznUD9gK3+7MwEZG6GN6/AzcN6EiMZ8SmWXI8f7i+HyMHpYe0rnDg05COmZ0OXAk8\nCow1MwMuBr7vWWUaMAl4JgA1ioj4ZP/hUh6as4IZSzbTu0NTptyQSdfWKaEuK2z4OoY/BbgHOLbn\nTgOKnHNlnuebgQ5+rk1ExGdfbNjDmOwcthaVcNe3u/KLS7qREKeJiMerNfDN7Cpgh3NuiZlddGxx\nNatW21jfzEYBowDS0tLqWKaISPWOllXw1H9W8+wH6+jYvBEz7hzI2We0CHVZYcmXM/zBwHfN7Aog\nCWhK5Rl/qpnFec7yTwe2Vvdm59xUYCpAVlaW7rYiIn6zevt+Rr+aw4rCYm48pyP3X9WLJolBnXwY\nUWr9e8c5N9E5d7pzLh24EVjgnPs/YCFwnWe1kcDsgFUpInKcigrH8x99xVV//ojtxYd57pYsJl/b\nV2Ffi/rsnXuBV83sEWAp8Lx/ShIR8a5wXwnjZuTy8drdXNKjNZOv7UurlMRQlxURTinwnXPvA+97\nHq8HBvi/JBGR6r2Zu5X7X8+nrMLx2DV9uPGcjlgEfWM21PT3j4iEvX2HSvnNm8uYnbOV/mmpPDUi\nk/SWjUNdVsRR4ItIWPt47S7Gzchlx/4jjP1Od356URfiYjXdsi4U+CISlg6XlvP4/FU8/9FXdG7V\nmFk/GUS/jqmhLiuiKfBFJOws37qPMdk5rN5+gFsGnsHEy3uSnBBb+xulRgp8EQkb5RWOqYvW8+S7\nq0htlMCLPzyHizJah7qsBkOBLyJhYdOeQ9w9PZfPN+zh8t5tefR7fWjRWE14/UmBLyIh5Zxj5pLN\nPDhnBQB/uL4f15zVQdMtA0CBLyIhs+fgUe6blc87y7cxIL0FfxjRj44tGoW6rAZLgS8iIbFw1Q7u\nmZlH0aGjTLi8Bz+6oDOxMTqrDyQFvogE1aGjZTw2byUvfbqR7m2aMO2HA+jVvmmoy4oKCnwRCZqc\nTUWMzc5h/a6D3HF+J8YNySApXtMtg0WBLyIBV1ZewdML1/GnBWtok5LIK3ecy6CuLUNdVtRR4ItI\nQH216yBjsnPI2VTE8Mz2PDisN82S40NdVlRS4ItIQDjneOXzr3lkbgHxscafb+rP1f3ah7qsqKbA\nFxG/27H/MBNey2fByh2c37Ulj1/fl3bNkkNdVtRT4IuIX81fvo2Js/I5eKSMB67uxciB6cRoumVY\nUOCLiF8cOFLGQ3OWM33xZs5s35QpN2TSrU1KqMuS4yjwRaTeFm/Yw5jpOWzZW8JPL+rC6Eu7kxCn\nnvXhptbAN7MkYBGQ6Fl/pnPuATN7EfgWsM+z6q3OuZxAFSoi4edoWQV/fG81z7y/jg7Nk8n+8UDO\nSW8R6rLEC1/O8I8AFzvnDphZPPCRmb3teW28c25m4MoTkXC1dsd+RmfnsGxLMdeffTq/uboXKUma\nbhnOag1855wDDniexnt+XCCLEpHwVVHhmPbJBia/vZLGiXH89eazGXJm21CXJT7waZDNzGLNLAfY\nAbzrnPvM89KjZpZnZk+ZWaKX944ys8Vmtnjnzp1+KltEQmHbvsOM/PvnPDhnBYO7tuSd0Rco7COI\nVZ7A+7iyWSrwOvBzYDewDUgApgLrnHMP1fT+rKwst3jx4rpXKyIhMyd3K/e/sYyjZRXcf1VPvj8g\nrd49699YuoXH569ia1EJ7VOTGT8kg+H9O/ip4obDzJY457Lqu51TmqXjnCsys/eBoc65JzyLj5jZ\n34Fx9S1GRMLPvpJSHpi9jDdytpLZMZWnbsikU8vG9d7uG0u3MHFWPiWl5QBsKSph4qx8AIV+gNQ6\npGNmrTxn9phZMnApsNLM2nmWGTAcWBbIQkUk+P67bheXT1nEnLxCxlzanZl3DvRL2AM8Pn/VN2F/\nTElpOY/PX+WX7cvJfDnDbwdMM7NYKn9BTHfOzTWzBWbWCjAgB7gzgHWKSBAdLi3nD/9exd8++or0\n0xrz2k8Gkdkx1a+fsbWo5JSWS/35MksnD+hfzfKLA1KRiIRUQWExY7JzWLltPz84L437ruhJowT/\nf0ezfWoyW6oJ9/ap6rkTKPoqnIgAUF7hmLpoHcP+8jG7Dx7l7z88h0eG9wlI2AOMH5JBcpWbnyTH\nxzJ+SEZAPk/UWkFEgM17D3H39Fw++2oPQ85sw2PX9KVF44SAfuaxC7OapRM8CnyRKOacY9aXW5j0\n5nIc8MT1/bj2rA71nm7pq+H9Oyjgg0iBLxKl9h48yq/eyGde/jbOSW/OkyMy6diiUajLkgBS4ItE\noQ9W72T8jFz2HjrKvUN7MOrCzsSqZ32Dp8AXiSIlR8uZ/HYB0z7ZSLfWTXjh1nPo3aFZqMuSIFHg\ni0SJvM1FjMnOYd3Og9w2uBP3DM0gqcosGWnYFPgiDVxZeQXPvL+OP763hpZNEnn5jnMZ3LVlqMuS\nEFDgizRgG3cfZEx2Dl9+XcR3+7Xn4WG9adZIPeujlQJfpAFyzvHqF5t4eO4K4mKMP96YybBMTX+M\ndgp8kQZm5/4jTJyVx38KdjCoy2k8cX0/tSsQQIEv0qD8Z8V27n0tj/1Hyvj1Vb344aB0YjTdUjwU\n+CINwMEjZTw8dwWvfrGJnu2a8soNmWS0TQl1WRJmFPgiEW7Jxr2Myc5h095D3PmtLoz5TjcS4zTd\nUk6mwBeJUKXlFfzpvTU8vXAt7Zolkz1qIAM6tQh1WRLGFPgiEWjtjgOMyc4hf8s+rjv7dB64uhcp\nSZpuKTVT4ItEEOcc//hkI7+dV0CjhFie/cFZDO3dLtRlSYSoNfDNLAlYBCR61p/pnHvAzDoBrwIt\ngC+Bm51zRwNZrEg02158mPE
...[base64 PNG data truncated: plot of the current fitted line against freshly sampled data points]...",
  1463. "text/plain": [
  1464. "<Figure size 432x288 with 1 Axes>"
  1465. ]
  1466. },
  1467. "metadata": {},
  1468. "output_type": "display_data"
  1469. },
  1470. {
  1471. "name": "stdout",
  1472. "output_type": "stream",
  1473. "text": [
  1474. "2.0188677310943604 2.8898627758026123\n"
  1475. ]
  1476. }
  1477. ],
  1478. "source": [
  1479. "# 随机初始化参数\n",
  1480. "w = V(t.rand(1,1), requires_grad=True)\n",
  1481. "b = V(t.zeros(1,1), requires_grad=True)\n",
  1482. "\n",
  1483. "lr =0.001 # 学习率\n",
  1484. "\n",
  1485. "for ii in range(8000):\n",
  1486. " x, y = get_fake_data()\n",
  1487. " x, y = V(x), V(y)\n",
  1488. " \n",
  1489. " # forward:计算loss\n",
  1490. " y_pred = x.mm(w) + b.expand_as(y)\n",
  1491. " loss = 0.5 * (y_pred - y) ** 2\n",
  1492. " loss = loss.sum()\n",
  1493. " \n",
  1494. " # backward:手动计算梯度\n",
  1495. " loss.backward()\n",
  1496. " \n",
  1497. " # 更新参数\n",
  1498. " w.data.sub_(lr * w.grad.data)\n",
  1499. " b.data.sub_(lr * b.grad.data)\n",
  1500. " \n",
  1501. " # 梯度清零\n",
  1502. " w.grad.data.zero_()\n",
  1503. " b.grad.data.zero_()\n",
  1504. " \n",
  1505. " if ii%1000 ==0:\n",
  1506. " # 画图\n",
  1507. " display.clear_output(wait=True)\n",
  1508. " x = t.arange(0, 20).view(-1, 1)\n",
  1509. " y = x.mm(w.data) + b.data.expand_as(x)\n",
  1510. " plt.plot(x.numpy(), y.numpy()) # predicted\n",
  1511. " \n",
  1512. " x2, y2 = get_fake_data(batch_size=20) \n",
  1513. " plt.scatter(x2.numpy(), y2.numpy()) # true data\n",
  1514. " \n",
  1515. " plt.xlim(0,20)\n",
  1516. " plt.ylim(0,41) \n",
  1517. " plt.show()\n",
  1518. " plt.pause(0.5)\n",
  1519. " \n",
  1520. "print(w.data.squeeze()[0], b.data.squeeze()[0])"
  1521. ]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
  1527. "用autograd实现的线性回归最大的不同点就在于autograd不需要计算反向传播,可以自动计算微分。这点不单是在深度学习,在许多机器学习的问题中都很有用。另外需要注意的是在每次反向传播之前要记得先把梯度清零。\n",
  1528. "\n",
  1529. "本章主要介绍了PyTorch中两个基础底层的数据结构:Tensor和autograd中的Variable。Tensor是一个类似Numpy数组的高效多维数值运算数据结构,有着和Numpy相类似的接口,并提供简单易用的GPU加速。Variable是autograd封装了Tensor并提供自动求导技术的,具有和Tensor几乎一样的接口。`autograd`是PyTorch的自动微分引擎,采用动态计算图技术,能够快速高效的计算导数。"
]
},
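{
"cell_type": "markdown",
"metadata": {},
"source": [
"A minimal added sketch (not part of the original text) of why gradients must be zeroed between backward passes. It reuses the `V` (Variable) and `t` (torch) aliases already imported earlier in this notebook: repeated calls to `backward()` add to `.grad` instead of replacing it, and only after `grad.data.zero_()` does a fresh call give the expected per-step gradient."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"# Sketch: gradient accumulation vs. zeroing (illustrative example, not from the original text)\n",
"x = V(t.ones(1, 1), requires_grad=True)\n",
"\n",
"y = (2 * x).sum()\n",
"y.backward()\n",
"print(x.grad.data)   # 2.0: d(2x)/dx\n",
"\n",
"y = (2 * x).sum()\n",
"y.backward()\n",
"print(x.grad.data)   # 4.0: the new gradient was added to the old one\n",
"\n",
"x.grad.data.zero_()  # clear the accumulated gradient\n",
"y = (2 * x).sum()\n",
"y.backward()\n",
"print(x.grad.data)   # 2.0 again, as expected"
]
}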
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.6.8"
}
},
"nbformat": 4,
"nbformat_minor": 2
}

Machine learning is increasingly being applied to fields such as aircraft and robotics, with the goal of using computers to achieve human-like intelligence and thus make such equipment intelligent and unmanned. This course aims to give students a grasp of the basic concepts, typical methods, and techniques of machine learning, to spark interest in the field through concrete application cases, and to encourage students to analyze and solve the problems and challenges faced by aircraft and robots from the perspective of artificial intelligence. The main content covers Python programming fundamentals, machine learning models, the basics and implementation of unsupervised learning, supervised learning, and deep learning, and how to use machine learning to solve practical problems, thereby improving students' overall competence.