<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: machine-gurning</title>
    <description>The latest articles on DEV Community by machine-gurning (@yakimoff).</description>
    <link>https://dev.to/yakimoff</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1014091%2F7ca9bc9f-b516-4b97-b835-bb8c9be739d1.jpg</url>
      <title>DEV Community: machine-gurning</title>
      <link>https://dev.to/yakimoff</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/yakimoff"/>
    <language>en</language>
    <item>
      <title>Async, Await, Future, Completer, in Dart - Succinct guide with examples</title>
      <dc:creator>machine-gurning</dc:creator>
      <pubDate>Mon, 30 Jan 2023 14:41:41 +0000</pubDate>
      <link>https://dev.to/yakimoff/async-await-future-completer-in-dart-succinct-guide-with-examples-2el0</link>
      <guid>https://dev.to/yakimoff/async-await-future-completer-in-dart-succinct-guide-with-examples-2el0</guid>
      <description>&lt;h1&gt;
  
  
  Async
&lt;/h1&gt;

&lt;h3&gt;
  
  
  Definition and use
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;keyword when defining a function that makes that function &lt;em&gt;asynchronous&lt;/em&gt;. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;e.g.:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight dart"&gt;&lt;code&gt;&lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;myFunc&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="kd"&gt;async&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="p"&gt;...&lt;/span&gt; &lt;span class="p"&gt;};&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;When an &lt;code&gt;async&lt;/code&gt; function is called, its lines execute in a &lt;em&gt;synchronous&lt;/em&gt; fashion until the keyword &lt;code&gt;await&lt;/code&gt; (explained below) is encountered, at which point the function "waits" until that line "resolves" (described below). &lt;/li&gt;
&lt;li&gt;In the meantime, the function is treated as "paused" (a layman's description) and the rest of the program keeps running -- control goes back to the caller of the function, which runs its next line.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Rationale
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;If you want a function to be able to "pause" and "wait" for certain lines within it to resolve, because their results aren't immediately available, the function must be &lt;code&gt;async&lt;/code&gt;.
&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  Await
&lt;/h1&gt;

&lt;h3&gt;
  
  
  Definition and use
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Within &lt;code&gt;async&lt;/code&gt; functions, the keyword &lt;code&gt;await&lt;/code&gt; can "pause" the interpreter on that line until it is "resolved". &lt;/li&gt;
&lt;li&gt;In practice, the things you &lt;code&gt;await&lt;/code&gt; are expressions that return a &lt;code&gt;Future&lt;/code&gt; object (explained below)
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight dart"&gt;&lt;code&gt;&lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;parentFunc&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="kd"&gt;async&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"parent func start"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="c1"&gt;// At the next line, the interpreter will "stop" and "wait"&lt;/span&gt;
  &lt;span class="c1"&gt;// until the thing that is being awaited (Future.delayed, &lt;/span&gt;
  &lt;span class="c1"&gt;// explained below) is "resolved". &lt;/span&gt;
  &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;Future&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;delayed&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Duration&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nl"&gt;seconds:&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"parent func wait is finished"&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt; 
  &lt;span class="n"&gt;firstChildFunc&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
  &lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;'parent func continuing'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="n"&gt;secondChildFunc&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
  &lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"parent func end"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Rationale
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Don't want to block other things from happening while one thing is being resolved, but do want to respect the order of events within a given function.&lt;/li&gt;
&lt;li&gt;Therefore, if something is taking time, "pause" there, let the code after the call site keep running, and once that thing resolves, continue through the function.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Examples
&lt;/h3&gt;

&lt;p&gt;Here is an exercise in trying to guess the order of print statements. There are more such exercises below. Write down which order you think it will print before scrolling further.&lt;/p&gt;

&lt;p&gt;We have &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;a &lt;code&gt;main&lt;/code&gt; func which is called when you run the program&lt;/li&gt;
&lt;li&gt;an async &lt;code&gt;parentFunc&lt;/code&gt; which is referenced in the main func&lt;/li&gt;
&lt;li&gt;the &lt;code&gt;parentFunc&lt;/code&gt; has two child functions called within it. Note that neither of those calls is &lt;code&gt;await&lt;/code&gt; -ed.&lt;/li&gt;
&lt;li&gt;each child function is also async, and contains some await clauses
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight dart"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="s"&gt;'dart:async'&lt;/span&gt;&lt;span class="o"&gt;;&lt;/span&gt;

&lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;main&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="n"&gt;parentFunc&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
  &lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"main func complete"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;parentFunc&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="kd"&gt;async&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"parent func start"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;Future&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;delayed&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Duration&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nl"&gt;seconds:&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"parent func wait is finished"&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
  &lt;span class="n"&gt;firstChildFunc&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
  &lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;'parent func continuing'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="n"&gt;secondChildFunc&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
  &lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"parent func end"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;firstChildFunc&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="kd"&gt;async&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"1st child func start"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;Future&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;delayed&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Duration&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nl"&gt;seconds:&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"1st child first wait is finished"&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
  &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;Future&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;delayed&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Duration&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nl"&gt;seconds:&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"1st child second wait is finished"&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
  &lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"1st child func end"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;secondChildFunc&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="kd"&gt;async&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"2nd child func start"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;Future&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;delayed&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Duration&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nl"&gt;seconds:&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"2nd child first wait is finished"&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
  &lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"2nd child func end"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;.&lt;br&gt;
.&lt;br&gt;
.&lt;br&gt;
.&lt;br&gt;
.&lt;br&gt;
.&lt;br&gt;
Answer:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight dart"&gt;&lt;code&gt;
&lt;span class="c1"&gt;// parent func start&lt;/span&gt;
&lt;span class="c1"&gt;// main func complete&lt;/span&gt;
&lt;span class="c1"&gt;// parent func wait is finished&lt;/span&gt;
&lt;span class="c1"&gt;// 1st child func start&lt;/span&gt;
&lt;span class="c1"&gt;// parent func continuing&lt;/span&gt;
&lt;span class="c1"&gt;// 2nd func start&lt;/span&gt;
&lt;span class="c1"&gt;// parent func end&lt;/span&gt;
&lt;span class="c1"&gt;// 1st child first wait is finished&lt;/span&gt;
&lt;span class="c1"&gt;// 2nd child first wait is finished&lt;/span&gt;
&lt;span class="c1"&gt;// 2nd func end&lt;/span&gt;
&lt;span class="c1"&gt;// 1st child second wait is finished&lt;/span&gt;
&lt;span class="c1"&gt;// 1st func end&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Explanation:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;the main function calls the parent function&lt;/li&gt;
&lt;li&gt;the parent function has a synchronous print statement, which, as expected, prints right away&lt;/li&gt;
&lt;li&gt;then we encounter an &lt;code&gt;await&lt;/code&gt;, which suspends &lt;code&gt;parentFunc()&lt;/code&gt; and hands control back to the &lt;code&gt;main&lt;/code&gt; function, hence the &lt;code&gt;main func complete&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;then we are solely blocked by the &lt;code&gt;parentFunc&lt;/code&gt; 's &lt;code&gt;await&lt;/code&gt; -- there is nothing remaining to execute in the &lt;code&gt;main&lt;/code&gt; func&lt;/li&gt;
&lt;li&gt;we enter the &lt;code&gt;firstChildFunc&lt;/code&gt; and print a statement immediately, then hit an &lt;code&gt;await&lt;/code&gt;. So, as above, the &lt;code&gt;parentFunc&lt;/code&gt; keeps running, hence:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight dart"&gt;&lt;code&gt;&lt;span class="c1"&gt;// 1st child func start&lt;/span&gt;
&lt;span class="c1"&gt;// parent func continuing&lt;/span&gt;
&lt;span class="c1"&gt;// 2nd func start&lt;/span&gt;
&lt;span class="c1"&gt;// parent func end&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;and so on.. &lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  Future
&lt;/h1&gt;

&lt;h3&gt;
  
  
  Definition and use
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;A future is an object that represents the result of an asynchronous operation: a value that will become available at a later time.&lt;/li&gt;
&lt;li&gt;A future has two states: uncompleted and completed. When a future completes, there are two possibilities:

&lt;ul&gt;
&lt;li&gt;Completed with a value&lt;/li&gt;
&lt;li&gt;Failed with an error&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;/ul&gt;
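&lt;p&gt;The two completion outcomes can be forced directly with &lt;code&gt;Future.value&lt;/code&gt; and &lt;code&gt;Future.error&lt;/code&gt;. A minimal sketch (the strings here are made up for illustration):&lt;br&gt;
&lt;/p&gt;

```dart
import 'dart:async';

void main() async {
  // Completed with a value: the await yields that value
  var ok = await Future.value("it worked");
  print(ok); // it worked

  // Failed with an error: surfaces as a throw at the await site
  try {
    await Future.error("something broke");
  } catch (e) {
    print("caught: $e"); // caught: something broke
  }
}
```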

&lt;h3&gt;
  
  
  Rationale
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Give a function the type &lt;code&gt;Future&amp;lt;T&amp;gt;&lt;/code&gt; (where &lt;code&gt;T&lt;/code&gt; is the type of what is resolved) so that it can be &lt;code&gt;await&lt;/code&gt;-ed. &lt;/li&gt;
&lt;li&gt;Otherwise you would have to write all the detail of the slow operation inline in the &lt;code&gt;async&lt;/code&gt; function that does the &lt;code&gt;await&lt;/code&gt; -ing&lt;/li&gt;
&lt;li&gt;Nothing complicated -- it is simply the object type that can be &lt;code&gt;await&lt;/code&gt; -ed&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Examples
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Below, the &lt;code&gt;main&lt;/code&gt; function is &lt;code&gt;async&lt;/code&gt;, therefore we can write &lt;code&gt;await&lt;/code&gt; &lt;/li&gt;
&lt;li&gt;The function that is being &lt;code&gt;await&lt;/code&gt; -ed must be of type &lt;code&gt;Future&amp;lt;String&amp;gt;&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;That function returns a &lt;code&gt;Future&amp;lt;String&amp;gt;&lt;/code&gt; object &lt;/li&gt;
&lt;li&gt;Execute it. Nothing for two seconds, then the statement &lt;code&gt;Finally!&lt;/code&gt; is printed to the console
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight dart"&gt;&lt;code&gt;&lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;main&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="kd"&gt;async&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kt"&gt;String&lt;/span&gt; &lt;span class="n"&gt;bigAsk&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;slowlyGetTheString&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
  &lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;bigAsk&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="n"&gt;Future&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="kt"&gt;String&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;slowlyGetTheString&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;Future&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;delayed&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Duration&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nl"&gt;seconds:&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt; &lt;span class="s"&gt;"Finally!"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h1&gt;
  
  
  Completer
&lt;/h1&gt;

&lt;h3&gt;
  
  
  Definition and use
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;An object which creates, and controls the completion of, a &lt;code&gt;Future&lt;/code&gt; that can be returned from a function.&lt;/li&gt;
&lt;li&gt;Use when you want to define for yourself exactly what will be returned as the future.&lt;/li&gt;
&lt;li&gt;Initialise as a &lt;code&gt;Completer&amp;lt;T&amp;gt;()&lt;/code&gt; (where &lt;code&gt;T&lt;/code&gt; is the type that the future will ultimately resolve into)&lt;/li&gt;
&lt;li&gt;Fill with &lt;code&gt;completer.complete( ... )&lt;/code&gt; (where &lt;code&gt;...&lt;/code&gt; is of type &lt;code&gt;T&lt;/code&gt;)&lt;/li&gt;
&lt;li&gt;Return as &lt;code&gt;completer.future&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;
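&lt;p&gt;One detail the list above omits: a completer can also finish its future with an error, via &lt;code&gt;completer.completeError( ... )&lt;/code&gt;. A minimal sketch (the function name, flag, and strings are invented for illustration):&lt;br&gt;
&lt;/p&gt;

```dart
import 'dart:async';

// Hypothetical function: completes with a value or an error
// depending on the flag.
Future slowAndRisky({required bool fail}) {
  var c = Completer();
  if (fail) {
    // Finishes the future with an error instead of a value
    c.completeError("the API call failed");
  } else {
    c.complete("all good");
  }
  return c.future;
}

void main() async {
  print(await slowAndRisky(fail: false)); // all good
  try {
    await slowAndRisky(fail: true);
  } catch (e) {
    print("caught: $e"); // caught: the API call failed
  }
}
```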

&lt;h3&gt;
  
  
  Rationale
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Have precise control over the type and contents of what is returned from a &lt;code&gt;Future&lt;/code&gt; function.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Examples
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Say I don't want to return exactly that which was retrieved from an API or a database, but instead I want to augment it slightly before returning it. &lt;/li&gt;
&lt;li&gt;The completer lets me pack whatever I like into the &lt;code&gt;Future&lt;/code&gt; object and then return that instead.&lt;/li&gt;
&lt;li&gt;Below, I call a function which then (pretends to) query an API (&lt;code&gt;firstPartOfPhrase&lt;/code&gt;), then uses that string in a larger string, and returns &lt;em&gt;that&lt;/em&gt; as the future object, not the original (pretend) query:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight dart"&gt;&lt;code&gt;&lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;main&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="kd"&gt;async&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kt"&gt;String&lt;/span&gt; &lt;span class="n"&gt;bigAsk&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;slowlyGetTheString&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
  &lt;span class="n"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;bigAsk&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="c1"&gt;// Note, this is async only because I am using await, &lt;/span&gt;
&lt;span class="c1"&gt;// not to do with completer&lt;/span&gt;
&lt;span class="n"&gt;Future&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="kt"&gt;String&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;slowlyGetTheString&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="kd"&gt;async&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;

  &lt;span class="c1"&gt;// Initialise my completer&lt;/span&gt;
  &lt;span class="n"&gt;Completer&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="kt"&gt;String&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;c&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;Completer&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="kt"&gt;String&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;();&lt;/span&gt;

  &lt;span class="c1"&gt;// Do something that takes time &lt;/span&gt;
  &lt;span class="kt"&gt;String&lt;/span&gt; &lt;span class="n"&gt;firstPartOfPhrase&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;Future&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;delayed&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Duration&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nl"&gt;seconds:&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt; &lt;span class="s"&gt;"Finally!"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

  &lt;span class="c1"&gt;// Assign a string to the completer. Not necessarily the same &lt;/span&gt;
  &lt;span class="c1"&gt;// string as was received from the line previously &lt;/span&gt;
  &lt;span class="n"&gt;c&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;complete&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;&lt;span class="si"&gt;$firstPartOfPhrase&lt;/span&gt;&lt;span class="s"&gt; it is finished!"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

  &lt;span class="c1"&gt;// Now that the completer has that information, return completer.future &lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;c&lt;/span&gt;&lt;span class="o"&gt;.&lt;/span&gt;&lt;span class="na"&gt;future&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



</description>
      <category>dart</category>
      <category>beginners</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>Fastai Chapter 4 - The important parts, part 1: Tensors</title>
      <dc:creator>machine-gurning</dc:creator>
      <pubDate>Wed, 25 Jan 2023 13:53:13 +0000</pubDate>
      <link>https://dev.to/yakimoff/fastai-chapter-4-the-important-parts-3gop</link>
      <guid>https://dev.to/yakimoff/fastai-chapter-4-the-important-parts-3gop</guid>
      <description>&lt;p&gt;Chapter four, "MNIST_BASICS", or "Lesson 4" of the online course, is the most important and apparently most difficult chapter of the fastai course. &lt;/p&gt;

&lt;p&gt;In this series of posts I will cover, in-depth, the most important concepts, and provide a few more examples of each to make them stick. I will include various practice questions throughout; the answers are available at the end of the post. I strongly recommend attempting the questions before looking at the answers.&lt;/p&gt;

&lt;p&gt;The book is available online &lt;a href="https://github.com/fastai/fastbook" rel="noopener noreferrer"&gt;here&lt;/a&gt;&lt;br&gt;
The course is accessible &lt;a href="https://www.youtube.com/playlist?list=PLfYUBJiXbdtSvpQjSnJJ_PmDQB_VyT5iU" rel="noopener noreferrer"&gt;here&lt;/a&gt; &lt;/p&gt;
&lt;h1&gt;
  
  
  Full sequence contents:
&lt;/h1&gt;

&lt;ol&gt;
&lt;li&gt;&lt;strong&gt;Tensors and tensor operations&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;Building and training a simple regression model&lt;/li&gt;
&lt;li&gt;Substituting pytorch/fastai components&lt;/li&gt;
&lt;li&gt;Building and training a nonlinear model&lt;/li&gt;
&lt;li&gt;Answers to practice questions&lt;/li&gt;
&lt;/ol&gt;
&lt;h1&gt;
  
  
  1. Tensors and tensor operations
&lt;/h1&gt;
&lt;h2&gt;
  
  
  Overview
&lt;/h2&gt;

&lt;p&gt;Crucial to all deep learning is the concept of a tensor. Succinctly, a tensor is a generalisation of concepts that we should already be familiar with: vectors and matrices. &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A zero-dimensional (or "Rank 0") tensor is equivalent to a single number (a scalar)&lt;/li&gt;
&lt;li&gt;A one-dimensional (or "Rank 1") tensor is equivalent to a vector -- a list of numbers&lt;/li&gt;
&lt;li&gt;A two-dimensional (or "Rank 2") tensor is equivalent to a 2D matrix of numbers&lt;/li&gt;
&lt;li&gt;A three-dimensional (or "Rank 3") tensor is equivalent to a "cube" of numbers, or stacked matrices&lt;/li&gt;
&lt;li&gt;and so on..&lt;/li&gt;
&lt;/ul&gt;
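&lt;p&gt;A way to internalise rank: it is the number of indices needed to reach a single element, i.e. the nesting depth of the equivalent Python list, which is what &lt;code&gt;tensor.ndim&lt;/code&gt; reports. A plain-Python sketch (a hypothetical &lt;code&gt;rank&lt;/code&gt; helper, no torch required):&lt;br&gt;
&lt;/p&gt;

```python
def rank(x):
    """Nesting depth of a list-of-lists -- mirrors what tensor.ndim reports."""
    if not isinstance(x, list):
        return 0  # a bare number is rank 0 (a scalar)
    return 1 + rank(x[0])

print(rank(5))                         # 0 -- scalar
print(rank([1, 2, 3]))                 # 1 -- vector
print(rank([[1, 2], [3, 4]]))          # 2 -- matrix
print(rank([[[1], [2]], [[3], [4]]]))  # 3 -- "cube"
```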

&lt;p&gt;Nothing else to it, for now&lt;/p&gt;
&lt;h2&gt;
  
  
  Rationale
&lt;/h2&gt;

&lt;p&gt;As you will see (or have seen), neural networks fundamentally comprise large matrices of numbers that have been expertly picked by the computer. Tensors are simply the way in which we store, query, and perform operations on those numbers. The data we feed neural networks must also be represented as grids of numbers. Learning about tensors and tensor operations is crucial for manipulating the data that is fed into the network, as well as for investigating what the network itself is doing.&lt;/p&gt;

&lt;p&gt;Why use pytorch tensors over numpy arrays and good olde python lists? I quote the author:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;The vast majority of methods and operators supported by NumPy on these structures are also supported by PyTorch, but PyTorch tensors have additional capabilities. One major capability is that these structures can live on the GPU, in which case their computation will be optimized for the GPU and can run much faster (given lots of values to work on). In addition, PyTorch can automatically calculate derivatives of these operations, including combinations of operations. As you'll see, it would be impossible to do deep learning in practice without this capability.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h2&gt;
  
  
  Operations
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://www.youtube.com/watch?v=x9JiIFvlUwk" rel="noopener noreferrer"&gt;Here&lt;/a&gt; is a useful video on basic tensor operations that covers more than you need to know&lt;/p&gt;
&lt;h3&gt;
  
  
  Initialising a tensor
&lt;/h3&gt;

&lt;p&gt;First ensure pytorch is installed and imported, then make your first tensor:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;
&lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;tensor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Jeremy has aliased &lt;code&gt;torch.tensor&lt;/code&gt; to just &lt;code&gt;tensor&lt;/code&gt;, and you can do the same with this line:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;tensor&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;tensor&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can turn python lists and numpy arrays into tensors&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="nf"&gt;tensor&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;

&lt;span class="nf"&gt;tensor&lt;/span&gt;&lt;span class="p"&gt;([[&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;],[&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;]])&lt;/span&gt;

&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;numpy&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;np&lt;/span&gt;

&lt;span class="nf"&gt;tensor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;np&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;array&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;]))&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Other ways of creating a tensor, not all of which you need to commit to memory, are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;torch.rand(2,3)&lt;/code&gt;: Creates a 2x3 rank 2 tensor of uniform random numbers in [0, 1)&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;torch.randn(2,3)&lt;/code&gt;: Creates a 2x3 rank 2 tensor of random numbers drawn from the standard normal distribution (so both positive and negative)&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;torch.empty(4,4)&lt;/code&gt;: a 4x4 rank 2 tensor of uninitialised values (whatever happened to be in that memory)&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;torch.zeros(2,3,4)&lt;/code&gt;: rank three ("cube") tensor of zeros&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;torch.ones(...)&lt;/code&gt;: self-explanatory&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;torch.eye(3,3)&lt;/code&gt;: Creates an identity matrix of dimensions 3x3 (get it? "eye-dentity")&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;torch.arange(4)&lt;/code&gt;:  Creates a rank 1 tensor of the numbers 0 to 3 (the end point is excluded, as with Python's &lt;code&gt;range&lt;/code&gt;)&lt;/li&gt;
&lt;/ul&gt;
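&lt;p&gt;A point worth emphasising about &lt;code&gt;arange&lt;/code&gt;: it follows the same half-open convention as Python's built-in &lt;code&gt;range&lt;/code&gt; -- the start (default 0) is included and the stop value is excluded. Illustrated with plain &lt;code&gt;range&lt;/code&gt; (no torch required):&lt;br&gt;
&lt;/p&gt;

```python
# Same half-open convention as torch.arange:
# the start (default 0) is included, the stop value is excluded.
print(list(range(4)))     # [0, 1, 2, 3] -- stop value 4 is excluded
print(list(range(2, 6)))  # [2, 3, 4, 5]
```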

&lt;h3&gt;
  
  
  Parameters
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;dtype&lt;/code&gt;: You can specify data types implicitly or explicitly in the tensor definition. &lt;/li&gt;
&lt;li&gt;
&lt;code&gt;device&lt;/code&gt;: specify the device that the tensor is saved to. Try &lt;code&gt;"cuda"&lt;/code&gt; - if it is available, your tensor will be saved to the graphics card.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;requires_grad&lt;/code&gt;: to be explained below.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="nf"&gt;tensor&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="mf"&gt;1.&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt; &lt;span class="c1"&gt;# implicitly
&lt;/span&gt;
&lt;span class="nf"&gt;tensor&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="n"&gt;dtype&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;float32&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;device&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;cuda&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; 
&lt;span class="c1"&gt;# Personally, I get an error
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Transforming tensor type
&lt;/h3&gt;

&lt;p&gt;You can call tensor methods to change their type:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;tensor&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;bool&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;   &lt;span class="c1"&gt;# bool : tensor([false, true, true, true])
&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;float&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;  &lt;span class="c1"&gt;# float32 : tensor([0.0, 1.0, 2.0, 3.0]) 
&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;short&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;  &lt;span class="c1"&gt;# int16
&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;long&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;   &lt;span class="c1"&gt;# int64
&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;half&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;   &lt;span class="c1"&gt;# float16
&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;double&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="c1"&gt;# float64
&lt;/span&gt;

&lt;span class="c1"&gt;# Typically does what you would expect:
&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;tensor&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="bp"&gt;False&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;float&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="c1"&gt;# tensor([0.0, 1.0])
&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Transforming tensor shape
&lt;/h3&gt;

&lt;p&gt;We will often need to re-shape tensors to make them fit what we're trying to jam them into. In this chapter, we are introduced to these reshaping methods:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;torch.stack()&lt;/code&gt;: Takes a sequence (list, tuple, whatever) of tensors, all of the same size, and concatenates them "along a new dimension". Imagine having a "stack" of 2D tensors, all the same dimensions, and then combining them to form a "cube" of numbers. The output tensor is of one rank higher than the inputs.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;x1&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;tensor&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;span class="n"&gt;x2&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;tensor&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;span class="n"&gt;x3&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;tensor&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;6&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;

&lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stack&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="n"&gt;x1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="n"&gt;x2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="n"&gt;x3&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;There is an optional parameter for dimension, &lt;code&gt;dim&lt;/code&gt;, which effectively asks you in which direction you want the stacking to take place. Consider the scenario of stacking two rank 2 tensors: you can stack them in one of three ways (&lt;code&gt;dim=0&lt;/code&gt;, &lt;code&gt;1&lt;/code&gt; or &lt;code&gt;2&lt;/code&gt;):&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2pnfxl2u9vtbxp6pwf95.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2pnfxl2u9vtbxp6pwf95.png" alt="Image description" width="696" height="502"&gt;&lt;/a&gt;&lt;/p&gt;
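&lt;p&gt;A minimal sketch of the &lt;code&gt;dim&lt;/code&gt; parameter, using rank 2 inputs so each result is a rank 3 "cube":&lt;/p&gt;

```python
import torch

a = torch.zeros(2, 3)
b = torch.ones(2, 3)

s0 = torch.stack([a, b], dim=0)  # shape [2, 2, 3]: new dim in front
s1 = torch.stack([a, b], dim=1)  # shape [2, 2, 3]: new dim in the middle
s2 = torch.stack([a, b], dim=2)  # shape [2, 3, 2]: new dim at the end
```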

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;torch.cat()&lt;/code&gt;: Unlike &lt;code&gt;stack&lt;/code&gt;, &lt;code&gt;cat&lt;/code&gt; concatenates the tensors along the specified dimension. The output tensor has the same rank as the inputs.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;x1&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;zeros&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
&lt;span class="n"&gt;x2&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ones&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;

&lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;cat&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="n"&gt;x1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;x2&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;dim&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="c1"&gt;# try dim = 1 and 2
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;For your information, the following transformations are all considered "views" on an unchanged tensor. When you apply one of the following methods, no new memory is allocated, and the original tensor is simply referenced in a new way. Read more here: &lt;a href="https://pytorch.org/docs/stable/tensor_view.html" rel="noopener noreferrer"&gt;https://pytorch.org/docs/stable/tensor_view.html&lt;/a&gt;&lt;/p&gt;
&lt;/blockquote&gt;
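&lt;p&gt;A minimal sketch to convince yourself that a view shares memory with its source tensor (no copy is made, so mutating one is visible through the other):&lt;/p&gt;

```python
import torch

x = torch.zeros(4)
y = x.view(2, 2)  # a view: no new memory allocated

x[0] = 9.         # mutate the original tensor...
print(y[0, 0])    # ...and the view sees the change: tensor(9.)
```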

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;tensor.view()&lt;/code&gt;: Returns a new tensor with the same data but a different shape. Passing &lt;code&gt;-1&lt;/code&gt; for one of the dimensions lets pytorch infer it from the others. The dimensions entered must be valid, in that the tensor's data must fit the dimensions exactly.&lt;/li&gt;
&lt;li&gt;In the chapter, we use &lt;code&gt;view&lt;/code&gt; to transform the shape of our 28x28 rank 2 tensor into a rank 1 tensor of length 784, to then be fed into the neural network. The network would not accept a rank 2 tensor, so we flatten the data out.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;randn&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;size&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="c1"&gt;# torch.Size([4, 4])
&lt;/span&gt;
&lt;span class="n"&gt;y&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;view&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;16&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;size&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="c1"&gt;# torch.Size([16])
&lt;/span&gt;
&lt;span class="n"&gt;z&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;view&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;8&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;  &lt;span class="c1"&gt;# the size -1 is inferred from other dimensions
&lt;/span&gt;&lt;span class="n"&gt;z&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;size&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="c1"&gt;# torch.Size([2, 8])
&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;tensor.squeeze()&lt;/code&gt;: Removes dimensions of size one. Helpful for eliminating "excess" dimensions; it changes a tensor of shape [n, 1] into shape [n]. The optional parameter &lt;code&gt;dim&lt;/code&gt; will only squeeze the tensor in that dimension, ignoring other ways it could be squeezed.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;x1&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;zeros&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;

&lt;span class="n"&gt;x1&lt;/span&gt; 
&lt;span class="c1"&gt;# tensor([[[0., 0., 0.]],   # Note that it is 2D data, trapped
#         [[0., 0., 0.]]])  # in a 3D tensor
&lt;/span&gt;
&lt;span class="n"&gt;x1&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;squeeze&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="c1"&gt;# tensor([[0., 0., 0.],   
#         [0., 0., 0.]])
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
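&lt;p&gt;A minimal sketch of the optional &lt;code&gt;dim&lt;/code&gt; parameter; squeezing a dimension that is not of size one simply does nothing:&lt;/p&gt;

```python
import torch

x = torch.zeros(2, 1, 3, 1)

print(x.squeeze().shape)       # torch.Size([2, 3]): all size-1 dims removed
print(x.squeeze(dim=1).shape)  # torch.Size([2, 3, 1]): only dim 1 removed
print(x.squeeze(dim=0).shape)  # torch.Size([2, 1, 3, 1]): dim 0 isn't size 1, no-op
```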



&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;tensor.unsqueeze()&lt;/code&gt;: Does the opposite of squeeze: adds an extra dimension of size one to the data. Useful for when you need extra dimensions to make matrix multiplication work. The required &lt;code&gt;dim&lt;/code&gt; argument decides where the new dimension is inserted.&lt;/li&gt;
&lt;li&gt;In the chapter, we &lt;code&gt;unsqueeze&lt;/code&gt; a tensor that represents the image category (&lt;code&gt;train_y&lt;/code&gt;) to make sure it has the same dimensionality as &lt;code&gt;train_x&lt;/code&gt;. However, if you don't unsqueeze, the rest of the notebook runs without issue, so I am not entirely sure why he did this, other than good practice...
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;x1&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;torch&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;zeros&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;

&lt;span class="n"&gt;x1&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;unsqueeze&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Tensor operations
&lt;/h3&gt;

&lt;p&gt;Finally we get to mathematical operations involving tensors. &lt;/p&gt;

&lt;p&gt;Standard operations act the same way as they do in linear algebra, where adding, subtracting, multiplying and dividing by a scalar is performed on every element of the tensor:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;tensor&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;

&lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;   &lt;span class="c1"&gt;# tensor([2,4,6])
&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;
&lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;
&lt;span class="c1"&gt;# etc
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Crucially, matrix multiplication of tensors works using the standard Python &lt;code&gt;@&lt;/code&gt; operator.&lt;/p&gt;

&lt;p&gt;This will be instrumental to building our neural network. You need not know more than the fact that the following calculations work, and what matrix multiplication is on a conceptual level (3blue1brown has an insightful series of videos on linear algebra).&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;tensor&lt;/span&gt;&lt;span class="p"&gt;([[&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;],[&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;]])&lt;/span&gt; 
&lt;span class="n"&gt;y&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;tensor&lt;/span&gt;&lt;span class="p"&gt;([[&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;],[&lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;]])&lt;/span&gt;

&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="nd"&gt;@y&lt;/span&gt;
&lt;span class="c1"&gt;# tensor([[ 9, 12],
#         [ 9, 12]])
&lt;/span&gt;
&lt;span class="n"&gt;y&lt;/span&gt;&lt;span class="nd"&gt;@x&lt;/span&gt;
&lt;span class="c1"&gt;# tensor([[ 7, 14],
#         [ 7, 14]])
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Other operations you will come across are self-explanatory but worth listing: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;mean()&lt;/code&gt;: Returns a rank 0 tensor holding the mean value. Passing a tuple of dimensions as a parameter constrains the mean to just those dimensions. &lt;/li&gt;
&lt;li&gt;In the chapter, we use this to compare a large stack of 1010 training samples, all contained within a rank 3 tensor of shape 1010x28x28, with an "idealised" image in a rank 2 tensor of shape 28x28. We wish to get back a rank 1 tensor with one mean value per training sample, so we define the following function. &lt;/li&gt;
&lt;li&gt;
&lt;code&gt;(-1, -2)&lt;/code&gt; specifies that we want to take the mean of the second last and last dimension only, but keep the other dimensions (i.e. the 1010) intact.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;mnist_distance&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;a&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="n"&gt;b&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt; 
  &lt;span class="nf"&gt;return &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;a&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="n"&gt;b&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;abs&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;mean&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; 

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
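&lt;p&gt;To see what the &lt;code&gt;(-1, -2)&lt;/code&gt; tuple does, here is a minimal sketch on a smaller, made-up stack (5 "images" of 4x4 standing in for 1010 of 28x28):&lt;/p&gt;

```python
import torch

# A stand-in for the 1010x28x28 stack: 5 "images" of 4x4
stack = torch.ones(5, 4, 4)

grand_mean = stack.mean()          # rank 0: one overall mean
per_image = stack.mean((-1, -2))   # rank 1, shape [5]: one mean per "image"
```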



&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;abs()&lt;/code&gt;: Self-explanatory&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;sqrt()&lt;/code&gt;: Self-explanatory &lt;/li&gt;
&lt;li&gt;
&lt;code&gt;item()&lt;/code&gt;: returns the tensor's value as a plain Python number. Only works on tensors with exactly one element, such as a rank 0 tensor.&lt;/li&gt;
&lt;/ul&gt;
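&lt;p&gt;A quick sketch of the one-element restriction on &lt;code&gt;item()&lt;/code&gt;:&lt;/p&gt;

```python
import torch

t = torch.tensor([2.5])
value = t.item()  # works: the tensor has exactly one element

try:
    torch.tensor([1., 2.]).item()
except (RuntimeError, ValueError):  # exact exception type varies by version
    multi_element_raises = True     # more than one element: item() raises
```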

&lt;h3&gt;
  
  
  Tensor broadcasting
&lt;/h3&gt;

&lt;p&gt;One characteristic of the pytorch tensor which is used frequently during the chapter is that of &lt;em&gt;broadcasting&lt;/em&gt;. For many tensor operations, if two tensors are combined in some way, and one tensor has fewer dimensions than the other, pytorch will try to sensibly extend ("broadly cast") the lower dimension or smaller tensor so that it may play nicely with the larger tensor. &lt;/p&gt;

&lt;p&gt;A simple example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="nf"&gt;tensor&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="nf"&gt;tensor&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;span class="c1"&gt;# tensor([4,8])
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;An instance of this in the chapter is when we compare a single rank 2 tensor of shape 28x28 with a rank 3 tensor of shape 1010x28x28. We subtract one from the other, and no error occurs because pytorch interprets what we want and artificially 'extends' the first tensor to play nice with the second:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# Function described above
&lt;/span&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;mnist_distance&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;a&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="n"&gt;b&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt; 
  &lt;span class="nf"&gt;return &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;a&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="n"&gt;b&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;abs&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;mean&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt; 

&lt;span class="n"&gt;valid_3_dist&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;mnist_distance&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;valid_3_tens&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;mean3&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="c1"&gt;# 
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Familiarity with tensors to the depth of this article is sufficient for having a good time reading this chapter.&lt;/p&gt;

&lt;p&gt;Further information on tensors can be found in this in-depth video by PyTorch: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.youtube.com/watch?v=r7QDUPb2dCM" rel="noopener noreferrer"&gt;https://www.youtube.com/watch?v=r7QDUPb2dCM&lt;/a&gt;&lt;/p&gt;

</description>
      <category>cloud</category>
      <category>devops</category>
      <category>scalability</category>
      <category>productivity</category>
    </item>
  </channel>
</rss>
