<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Kostas</title>
    <description>The latest articles on DEV Community by Kostas (@vrinek_94).</description>
    <link>https://dev.to/vrinek_94</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F111087%2F6f4b6d6a-e7df-4c17-a143-dd2aad9b06fe.jpeg</url>
      <title>DEV Community: Kostas</title>
      <link>https://dev.to/vrinek_94</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/vrinek_94"/>
    <language>en</language>
    <item>
      <title>A short guide to setup a Jupyter notebook for Ruby (on a mac, with a fish shell)</title>
      <dc:creator>Kostas</dc:creator>
      <pubDate>Fri, 01 Mar 2019 20:42:22 +0000</pubDate>
      <link>https://dev.to/vrinek_94/a-short-guide-to-setup-a-jupyter-notebook-with-ruby-on-a-mac-with-a-fish-shell-5g52</link>
      <guid>https://dev.to/vrinek_94/a-short-guide-to-setup-a-jupyter-notebook-with-ruby-on-a-mac-with-a-fish-shell-5g52</guid>
      <description>&lt;h3&gt;
  
  
  0. Have Homebrew installed
&lt;/h3&gt;

&lt;p&gt;See &lt;a href="https://brew.sh"&gt;https://brew.sh&lt;/a&gt; for instructions.&lt;/p&gt;

&lt;h3&gt;
  
  
  0. Have Ruby installed in some form
&lt;/h3&gt;

&lt;p&gt;I suggest &lt;a href="https://github.com/rbenv/rbenv"&gt;rbenv&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Download Anaconda from &lt;a href="https://www.anaconda.com/downloads"&gt;https://www.anaconda.com/downloads&lt;/a&gt;
&lt;/h3&gt;

&lt;h3&gt;
  
  
  2. Install ☝️ (VS Code is optional; feel free to skip it)
&lt;/h3&gt;

&lt;h3&gt;
  
  
  3. &lt;code&gt;set -g fish_user_paths /anaconda3/bin/ $fish_user_paths&lt;/code&gt;
&lt;/h3&gt;

&lt;h3&gt;
  
  
  4. Test with &lt;code&gt;jupyter notebook&lt;/code&gt;
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;it should open a browser window to the working directory&lt;/li&gt;
&lt;li&gt;the next steps will add a "Ruby" option to the "New &amp;gt; Notebook" menu (top right)&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  5. Install IRuby
&lt;/h3&gt;

&lt;p&gt;Instructions adapted from &lt;a href="https://github.com/sciruby/iruby#homebrew"&gt;https://github.com/sciruby/iruby#homebrew&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight shell"&gt;&lt;code&gt;brew &lt;span class="nb"&gt;install &lt;/span&gt;automake gmp libtool wget
brew &lt;span class="nb"&gt;install &lt;/span&gt;zeromq &lt;span class="nt"&gt;--HEAD&lt;/span&gt;
brew &lt;span class="nb"&gt;install &lt;/span&gt;czmq &lt;span class="nt"&gt;--HEAD&lt;/span&gt;

&lt;span class="nb"&gt;set&lt;/span&gt; &lt;span class="nt"&gt;-x&lt;/span&gt; LIBZMQ_PATH &lt;span class="o"&gt;(&lt;/span&gt;brew &lt;span class="nt"&gt;--prefix&lt;/span&gt; zeromq&lt;span class="o"&gt;)&lt;/span&gt;/lib
&lt;span class="nb"&gt;set&lt;/span&gt; &lt;span class="nt"&gt;-x&lt;/span&gt; LIBCZMQ_PATH &lt;span class="o"&gt;(&lt;/span&gt;brew &lt;span class="nt"&gt;--prefix&lt;/span&gt; czmq&lt;span class="o"&gt;)&lt;/span&gt;/lib

gem &lt;span class="nb"&gt;install &lt;/span&gt;cztop
gem &lt;span class="nb"&gt;install &lt;/span&gt;iruby &lt;span class="nt"&gt;--pre&lt;/span&gt;
iruby register &lt;span class="nt"&gt;--force&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;h3&gt;
  
  
  6. Restart your Jupyter notebook (&lt;code&gt;Ctrl-C&lt;/code&gt; in the terminal where it is running, then start it up again)
&lt;/h3&gt;

&lt;h1&gt;
  
  
  🎉
&lt;/h1&gt;

</description>
      <category>jupyter</category>
      <category>mac</category>
      <category>fish</category>
      <category>ruby</category>
    </item>
    <item>
      <title>Predicting MOBA match results</title>
      <dc:creator>Kostas</dc:creator>
      <pubDate>Sat, 16 Feb 2019 05:05:26 +0000</pubDate>
      <link>https://dev.to/vrinek_94/predicting-moba-match-results-575a</link>
      <guid>https://dev.to/vrinek_94/predicting-moba-match-results-575a</guid>
      <description>&lt;p&gt;After finishing my &lt;a href="https://www.coursera.org/specializations/deep-learning" rel="noopener noreferrer"&gt;Coursera specialization on deep learning&lt;/a&gt;, I wanted to pick up a machine learning project to sink my teeth into: something simple enough to formulate and something I could relate to.&lt;/p&gt;

&lt;p&gt;So I decided on predicting match results for my favorite MOBA, &lt;a href="https://en.wikipedia.org/wiki/Vainglory_(video_game)" rel="noopener noreferrer"&gt;Vainglory&lt;/a&gt;.&lt;/p&gt;

&lt;h1&gt;
  
  
  Table of Contents
&lt;/h1&gt;

&lt;ol&gt;
&lt;li&gt;The problem&lt;/li&gt;
&lt;li&gt;A little more context&lt;/li&gt;
&lt;li&gt;The dataset&lt;/li&gt;
&lt;li&gt;Training the Model&lt;/li&gt;
&lt;li&gt;Initial results&lt;/li&gt;
&lt;li&gt;Cleaning data&lt;/li&gt;
&lt;li&gt;Later results with clean data&lt;/li&gt;
&lt;li&gt;Difference (or lack thereof) between training with and without talents&lt;/li&gt;
&lt;li&gt;What did we learn?&lt;/li&gt;
&lt;li&gt;Next steps&lt;/li&gt;
&lt;/ol&gt;

&lt;h1&gt;
  
  
  &lt;a id="the-problem"&gt;&lt;/a&gt;The problem
&lt;/h1&gt;

&lt;h2&gt;
  
  
  Predicting talent effectiveness
&lt;/h2&gt;

&lt;p&gt;I decided to focus on one mode in particular, "Aral", a casual mode in which heroes are picked at random for the players, who then choose from a selection of talents. The goal is to destroy the other team's base while keeping your own standing.&lt;/p&gt;

&lt;p&gt;To put it simply, I want to answer the following question:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Which talent gives me the best odds at winning a specific match?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;With "specific match" defined as the roster of already selected heroes from both teams.&lt;/p&gt;

&lt;h1&gt;
  
  
  &lt;a id="a-little-more-context"&gt;&lt;/a&gt;A little more context
&lt;/h1&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/hLTeeAIM4lw"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;h2&gt;
  
  
  What is a MOBA
&lt;/h2&gt;

&lt;p&gt;A MOBA, &lt;a href="https://en.wikipedia.org/wiki/Multiplayer_online_battle_arena" rel="noopener noreferrer"&gt;Multiplayer Online Battle Arena&lt;/a&gt;, is a video game genre popularized (maybe even originated) by DOTA, &lt;a href="https://en.wikipedia.org/wiki/Defense_of_the_Ancients" rel="noopener noreferrer"&gt;Defense of the Ancients&lt;/a&gt;, a mod of Warcraft 3.&lt;/p&gt;

&lt;p&gt;The mod was so popular that it spawned quite a large number of follow-up games, most notably &lt;a href="https://en.wikipedia.org/wiki/Dota_2" rel="noopener noreferrer"&gt;DOTA 2&lt;/a&gt; and &lt;a href="https://en.wikipedia.org/wiki/League_of_Legends" rel="noopener noreferrer"&gt;League of Legends&lt;/a&gt;. On the mobile side, Vainglory and &lt;a href="https://en.wikipedia.org/wiki/Arena_of_Valor" rel="noopener noreferrer"&gt;Arena of Valor&lt;/a&gt; have been pretty successful and personally, lacking a good PC setup, I flocked to these mobile ones.&lt;/p&gt;

&lt;p&gt;The game genre plays a little bit like this:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;there are two teams on a map&lt;/li&gt;
&lt;li&gt;each team has a base and some defenses&lt;/li&gt;
&lt;li&gt;the first team that destroys the other team's base wins&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  What is a hero
&lt;/h2&gt;

&lt;p&gt;Before a match begins, each player has to choose &lt;a href="https://www.vainglorygame.com/heroes/" rel="noopener noreferrer"&gt;their hero&lt;/a&gt;. Each hero has specific traits and abilities that differentiate them from the rest. They differ by movement speed, attack speed, damage output, defenses, special abilities, passive traits and more.&lt;/p&gt;

&lt;p&gt;For example, the hero I play the most on Vainglory is &lt;a href="https://www.vainglorygame.com/heroes/lyra/" rel="noopener noreferrer"&gt;Lyra&lt;/a&gt;. She is a mage with healing and protective magic. She has a ranged basic attack that slows targets and not much in the way of personal defense mechanisms.&lt;/p&gt;

&lt;p&gt;Her usual role is Captain, which basically means she excels at helping other heroes be their best and keeping them alive.&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/5URpZoYn9a8"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;h2&gt;
  
  
  What is an ability
&lt;/h2&gt;

&lt;p&gt;Each hero in Vainglory has 1 perk and 3 abilities.&lt;/p&gt;

&lt;p&gt;The perk is often tied to their basic attack. For example Lyra's perk, &lt;strong&gt;Principle Arcanum&lt;/strong&gt;, adds a second hit to her basic attack, making it more powerful and slowing down her target.&lt;/p&gt;

&lt;p&gt;The abilities are usually activated by the player to produce an effect, and they have a cooldown and sometimes a cost (mana, stamina, rage, etc.).&lt;/p&gt;

&lt;p&gt;For example Lyra has these 3 abilities:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Imperial Sigil&lt;/strong&gt;: she scribes a sigil on the ground that heals allies and damages enemies&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Bright Bulwark&lt;/strong&gt;: she puts up a bubble which prevents enemies from using movement-based abilities&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Arcane Passage&lt;/strong&gt;: she creates a teleportation tunnel between two points&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The basic attack, the perk and the abilities are the main ways that a hero affects their environment and the course of the match.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is a talent
&lt;/h2&gt;

&lt;p&gt;In select modes of Vainglory, namely Aral and Blitz, the players may select one of 3 talents for their heroes. These talents are unique to each hero and affect how their hero plays.&lt;/p&gt;

&lt;p&gt;For example here are Lyra's talents:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Twin Missile&lt;/strong&gt;: trades the slowdown effect of her basic attack for more damage&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Mobile Bulwark&lt;/strong&gt;: makes her bubble follow her instead of remaining on the spot where she cast it&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Gythian Ward&lt;/strong&gt;: grants a small barrier to allies within its perimeter and cleanses all their debuffs&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These talents are usually a trade-off between an advantage and a disadvantage. They can also be upgraded to improve their effectiveness.&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/lM8MF6WlAG8"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;h1&gt;
  
  
  &lt;a id="the-dataset"&gt;&lt;/a&gt;The dataset
&lt;/h1&gt;

&lt;p&gt;My data consists of already-played matches along with their results.&lt;/p&gt;

&lt;h2&gt;
  
  
  Three-hot vector for team composition
&lt;/h2&gt;

&lt;p&gt;To model the hero composition of each team, I ended up using a &lt;em&gt;three-hot vector&lt;/em&gt;. (I have no idea if this is a real thing)&lt;/p&gt;

&lt;h3&gt;
  
  
  What is three-hot?
&lt;/h3&gt;

&lt;p&gt;A &lt;strong&gt;one-hot vector&lt;/strong&gt; is a vector (with vector being essentially an ordered list of numbers) where all values are 0 and only &lt;strong&gt;one&lt;/strong&gt; of them is 1. It looks a bit like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;vector&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The idea behind a three-hot vector is that it's like a one-hot vector but with three positions set to 1 instead of only one. This makes it possible to represent a set of 3 unique values. It looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;vector&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Another way to explain this is with type theory. In this case, the one-hot vector can represent an enum and the three-hot vector can represent a unique set of exactly three enums.&lt;/p&gt;

&lt;h3&gt;
  
  
  Why three-hot?
&lt;/h3&gt;

&lt;p&gt;In our case, a team's roster can only contain one of each hero. It's impossible, for example, to have a team that consists of two Lyras and one Idris.&lt;/p&gt;

&lt;p&gt;Additionally, the heroes are not ordered in the team.&lt;/p&gt;

&lt;p&gt;This allows us to express one team's roster with a single three-hot vector whose size is the number of available heroes (45 at the time of writing).&lt;/p&gt;
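
&lt;p&gt;As a minimal sketch (the hero indices here are made up; the real index-to-hero mapping depends on the dataset), the roster encoding looks like this:&lt;/p&gt;

```python
# Encode a team roster as a three-hot vector over the 45 available heroes.
# Hero indices are hypothetical placeholders, not the real dataset mapping.
NUM_HEROES = 45

def encode_roster(hero_indices):
    """Return a vector of zeros with a 1 at each of the 3 unique heroes' indices."""
    assert len(set(hero_indices)) == 3, "a roster holds exactly 3 unique heroes"
    vector = [0] * NUM_HEROES
    for index in hero_indices:
        vector[index] = 1
    return vector

team = encode_roster([4, 17, 42])
assert len(team) == NUM_HEROES and sum(team) == 3
```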

&lt;h2&gt;
  
  
  Talents one-hot vector
&lt;/h2&gt;

&lt;p&gt;When deciding how to structure the hero talents in the dataset, I had a few options.&lt;/p&gt;

&lt;p&gt;Since each hero has 3 talents (rare, epic, legendary), I could have chosen a &lt;strong&gt;softmax 3&lt;/strong&gt; activation for the output node. I decided against this because at the time it felt too complicated for a model to learn all the associations.&lt;/p&gt;

&lt;p&gt;Another approach would be to model the output again as softmax but to include all talents of all heroes (135 in total). This also felt suboptimal because there will always be 132 talents that are unavailable to the player.&lt;/p&gt;

&lt;p&gt;Instead, I went with a trick I learned in the Coursera course: I put the selected talent in the input and asked the model to predict the likely outcome given that talent. This way I have to make 3 predictions each time to answer my question of "which talent?".&lt;/p&gt;

&lt;p&gt;So, I ended up representing the talents as a one-hot vector of 45x3+1=136 positions (45 heroes x 3 talents each + 1 "no talent selected"). Note that it is possible for the player to not have unlocked any talents of their chosen hero. This is the case where the "no talent selected" position would be flipped to ON.&lt;/p&gt;

&lt;p&gt;Sidenote: on second thought, the "no talent selected" position on the talents vector could probably be eliminated. I'll have to perform an experiment or two to validate this.&lt;/p&gt;
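
&lt;p&gt;A sketch of this 136-position layout (&lt;code&gt;encode_talent&lt;/code&gt; is a hypothetical helper, not code from my project):&lt;/p&gt;

```python
# One-hot talent vector: 45 heroes x 3 talents each, plus 1 "no talent selected".
NUM_HEROES = 45
TALENTS_PER_HERO = 3
TALENT_VECTOR_SIZE = NUM_HEROES * TALENTS_PER_HERO + 1  # 136

def encode_talent(hero_index=None, talent_index=None):
    """Flip exactly one position: the chosen talent, or the trailing 'no talent' slot."""
    vector = [0] * TALENT_VECTOR_SIZE
    if hero_index is None:
        vector[-1] = 1  # player has no talent unlocked for their hero
    else:
        vector[hero_index * TALENTS_PER_HERO + talent_index] = 1
    return vector

assert len(encode_talent()) == 136
assert encode_talent()[-1] == 1
```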

&lt;h2&gt;
  
  
  Label experiments (tanh, sigmoid, softmax)
&lt;/h2&gt;

&lt;p&gt;For the labels I had no idea how to model them, so I ran a few experiments.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;


&lt;colgroup&gt;
&lt;col&gt;

&lt;col&gt;
&lt;/colgroup&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Value range&lt;/th&gt;
&lt;th&gt;Output layer activation function&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;

&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;(-1, 1)&lt;/td&gt;
&lt;td&gt;tanh&lt;/td&gt;
&lt;/tr&gt;


&lt;tr&gt;
&lt;td&gt;(0, 1)&lt;/td&gt;
&lt;td&gt;sigmoid&lt;/td&gt;
&lt;/tr&gt;


&lt;tr&gt;
&lt;td&gt;2x(0, 1)&lt;/td&gt;
&lt;td&gt;softmax&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Of the above, the &lt;code&gt;softmax&lt;/code&gt; approach produced the best results (as measured by accuracy on the validation set), so I stuck with it. The difference was small, and I can only guess that it was partly down to the way TensorFlow (the Keras backend I'm using) is optimized.&lt;/p&gt;
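
&lt;p&gt;To illustrate the "2x(0, 1)" row: softmax produces two outputs, one per team, that always sum to 1, so they can be read as a win/loss probability distribution. The logit values below are made up:&lt;/p&gt;

```python
# Softmax over two logits: each output lands in (0, 1) and the pair sums to 1.
import math

def softmax(logits):
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

win_prob, loss_prob = softmax([1.2, -0.3])  # hypothetical model outputs
assert round(win_prob + loss_prob, 9) == 1.0  # a proper distribution
assert win_prob > loss_prob  # higher logit means higher probability
```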

&lt;h2&gt;
  
  
  Putting them all together
&lt;/h2&gt;

&lt;p&gt;To generate the input from one match, I took the match and for each hero I defined:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;my team&lt;/strong&gt; as a three-hot vector&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;other team&lt;/strong&gt; as a three-hot vector&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;my talent&lt;/strong&gt; as a one-hot vector&lt;/li&gt;
&lt;li&gt;  the &lt;strong&gt;verdict&lt;/strong&gt; from the perspective of the hero as a number from 1 (win) to -1 (loss)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;and stacked the vectors to produce a 226-element vector (45 + 45 + 136).&lt;/p&gt;
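
&lt;p&gt;The stacking itself is just concatenation; a sketch with all-zero placeholders (real entries come from the match data):&lt;/p&gt;

```python
# Concatenate the three per-hero input vectors into one 226-element example.
ours = [0] * 45         # three-hot: my team's roster
theirs = [0] * 45       # three-hot: the other team's roster
my_talents = [0] * 136  # one-hot: my selected talent (or "no talent")

x = ours + theirs + my_talents
assert len(x) == 226  # 45 + 45 + 136
```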

&lt;p&gt;For example here is a pre-stacked data entry:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"matchID"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"ab137e32-e381-11e8-a4e9-02b7582ce766"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"x"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"ours"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span 
class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span 
class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"theirs"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span 
class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span 
class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"myTalents"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span 
class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span 
class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span 
class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span 
class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span 
class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"y"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This entry represents a match between "our" team (Inara, Petal and Varya) and "their" team (Kinetic, Ringo and Skaarf). The selected talent is Petal's "Bounce" and the verdict was a win for our (Petal's) team.&lt;/p&gt;

&lt;p&gt;Because we compile each data entry from the perspective of one hero/player, we end up with 6 entries per match (2 teams x 3 heroes).&lt;/p&gt;
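This per-hero expansion is easy to sketch in code. The field names below (`matchID`, `heroes`, `winner`) are illustrative only, not the actual schema of my dataset:

```python
def expand_match(match):
    """Expand one match into 6 data entries, one per hero/player perspective."""
    entries = []
    for side, enemy in [("left", "right"), ("right", "left")]:
        for hero in match[side]["heroes"]:
            entries.append({
                "matchID": match["matchID"],
                "hero": hero,  # the perspective hero
                "allies": [h for h in match[side]["heroes"] if h != hero],
                "enemies": match[enemy]["heroes"],
                "y": 1 if match["winner"] == side else 0,  # win/loss for this hero's team
            })
    return entries

match = {
    "matchID": "abc-123",
    "left": {"heroes": ["Inara", "Petal", "Varya"]},
    "right": {"heroes": ["Kinetic", "Ringo", "Skaarf"]},
    "winner": "left",
}
entries = expand_match(match)
print(len(entries))  # 2 teams x 3 heroes = 6 entries
```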

&lt;h2&gt;
  
  
  API for fetching matches
&lt;/h2&gt;

&lt;p&gt;To compile my dataset I made use of the &lt;a href="https://developer.vainglorygame.com" rel="noopener noreferrer"&gt;Vainglory API&lt;/a&gt;. It has &lt;a href="https://vainglory-gamelocker-documentation.readthedocs.io/en/master/matches/matches.html" rel="noopener noreferrer"&gt;a handy endpoint to fetch latest matches&lt;/a&gt; and then &lt;a href="https://vainglory-gamelocker-documentation.readthedocs.io/en/master/telemetry/telemetry.html" rel="noopener noreferrer"&gt;another to fetch the telemetry of a match&lt;/a&gt; (which includes the talents selected for each hero). With a &lt;a href="https://gitlab.com/gademo/vainglory-stats/tree/master/fish_functions" rel="noopener noreferrer"&gt;suite of little scripts&lt;/a&gt;, running on a remote VM, I managed to get about 30k data entries per day.&lt;/p&gt;
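As a rough sketch of what each fetch involves — the host, path and header names below mirror my reading of the API docs and should be double-checked against them before use:

```python
API_HOST = "https://api.dc01.gamelockerapp.com"  # host as I recall it from the docs; verify

def matches_request(region, api_key, limit=50):
    """Build the URL, headers and params for the "latest matches" endpoint.

    The Vainglory API follows JSON:API conventions, hence the
    `page[limit]` pagination parameter.
    """
    url = f"{API_HOST}/shards/{region}/matches"
    headers = {
        "Authorization": api_key,
        "X-TITLE-ID": "semc-vainglory",
        "Accept": "application/vnd.api+json",
    }
    params = {"page[limit]": limit, "sort": "-createdAt"}  # newest matches first
    return url, headers, params

url, headers, params = matches_request("eu", "MY-API-KEY")
# actual call would be e.g.: requests.get(url, headers=headers, params=params)
```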

&lt;h2&gt;
  
  
  Why a Scaleway VM
&lt;/h2&gt;

&lt;p&gt;Initially, I was fetching the data directly to my laptop. This quickly got old: every time I put the laptop to sleep, I created a gap in the fetching schedule.&lt;/p&gt;

&lt;p&gt;To keep the data fetching going, I got &lt;a href="https://www.scaleway.com/pricing/#anchor_starter" rel="noopener noreferrer"&gt;a cheap VM from Scaleway&lt;/a&gt;. At $2 per month it's about as cheap as hosting gets, and it lets me fetch data and compile my dataset 24/7.&lt;/p&gt;

&lt;h2&gt;
  
  
  Split to train/dev/test
&lt;/h2&gt;

&lt;p&gt;To begin training, I split my dataset into 3 parts:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt; A 10k test set, to be used after all training has been performed to evaluate the model.&lt;/li&gt;
&lt;li&gt; A 10k dev set (sometimes also called a validation set), to be used on every training epoch to evaluate training progress.&lt;/li&gt;
&lt;li&gt; The rest as a training set, used to train the model.&lt;/li&gt;
&lt;/ol&gt;
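Assuming the dataset is already a flat list of entries, this split is just slicing; a minimal sketch:

```python
def split_dataset(entries, test_size=10_000, dev_size=10_000):
    """Split entries into (training, dev, test) partitions.

    The test and dev sets are carved off the end; everything else trains.
    """
    test_set = entries[-test_size:]
    dev_set = entries[-(test_size + dev_size):-test_size]
    training_set = entries[:-(test_size + dev_size)]
    return training_set, dev_set, test_set

entries = list(range(100_000))  # stand-in for real data entries
training_set, dev_set, test_set = split_dataset(entries)
print(len(training_set), len(dev_set), len(test_set))  # 80000 10000 10000
```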

&lt;h1&gt;
  
  
  &lt;a id="training-the-model"&gt;&lt;/a&gt;Training the Model
&lt;/h1&gt;

&lt;p&gt;All of my code can be found at &lt;a href="https://gitlab.com/gademo/vainglory-stats" rel="noopener noreferrer"&gt;https://gitlab.com/gademo/vainglory-stats&lt;/a&gt;. The code makes a few assumptions on where things reside. Feel free to fork and edit and run your own experiments.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why Keras
&lt;/h2&gt;

&lt;p&gt;I opted for &lt;a href="https://keras.io" rel="noopener noreferrer"&gt;Keras&lt;/a&gt; for this one because I wanted to focus on the process of training and evolving a model rather than on the details of optimizing it. Keras lets me define and train a model in very few lines of code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="c1"&gt;# a few imports...
&lt;/span&gt;
&lt;span class="n"&gt;X&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Input&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;shape&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;n_x&lt;/span&gt;&lt;span class="p"&gt;,))&lt;/span&gt;
&lt;span class="n"&gt;Y&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Dense&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1024&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;activation&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;relu&lt;/span&gt;&lt;span class="p"&gt;)(&lt;/span&gt;&lt;span class="n"&gt;X&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;Y&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Dropout&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;0.1&lt;/span&gt;&lt;span class="p"&gt;)(&lt;/span&gt;&lt;span class="n"&gt;Y&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;Y&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Dense&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;256&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;activation&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;relu&lt;/span&gt;&lt;span class="p"&gt;)(&lt;/span&gt;&lt;span class="n"&gt;Y&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;Y&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Dropout&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;0.1&lt;/span&gt;&lt;span class="p"&gt;)(&lt;/span&gt;&lt;span class="n"&gt;Y&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;Y&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Dense&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;n_y&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;activation&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;softmax&lt;/span&gt;&lt;span class="p"&gt;)(&lt;/span&gt;&lt;span class="n"&gt;Y&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;model&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Model&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;inputs&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;X&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;outputs&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;Y&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;compile&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Adam&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;fit&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;X_train&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;Y_train&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="c1"&gt;# the training set
&lt;/span&gt;    &lt;span class="n"&gt;epochs&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;num_of_epochs&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;validation_data&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;X_dev&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;Y_dev&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="c1"&gt;# the dev set
&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;With the above snippet, Keras trains the model and prints useful metrics like accuracy and loss on every epoch.&lt;/p&gt;

&lt;p&gt;The entirety of the training script can be found &lt;a href="https://gitlab.com/gademo/vainglory-stats/blob/master/floyd-scripts/train.py" rel="noopener noreferrer"&gt;on Gitlab&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why FloydHub
&lt;/h2&gt;

&lt;p&gt;As with compiling the dataset, I wanted to decouple training the model from my laptop being on. &lt;a href="https://www.floydhub.com" rel="noopener noreferrer"&gt;FloydHub&lt;/a&gt; is a pretty simple way to achieve this. After setting up an account and installing their CLI, training the model is remarkably simple:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;floyd run &lt;span class="nt"&gt;--data&lt;/span&gt; vrinek/datasets/casual-aral:casual_aral &lt;span class="s2"&gt;"python train.py"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That &lt;code&gt;--data&lt;/code&gt; argument comes from having previously uploaded the compiled dataset to FloydHub with:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;floyd data init vrinek/casual-aral &lt;span class="c"&gt;# one time setup&lt;/span&gt;
floyd data upload                  &lt;span class="c"&gt;# every time the dataset updates (about daily)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It may be worth mentioning here that FloydHub comes with a few hours of free CPU and GPU usage, and more can be purchased.&lt;/p&gt;

&lt;h1&gt;
  
  
  &lt;a id="initial-results"&gt;&lt;/a&gt;Initial results
&lt;/h1&gt;

&lt;h2&gt;
  
  
  Results at 99% accuracy
&lt;/h2&gt;

&lt;p&gt;So: I have my dataset; I shuffle it, split it into training/dev/test sets and let my training script train the model.&lt;/p&gt;

&lt;p&gt;After some experimentation I was able to hit 99% accuracy (as measured on the test set) with a neural network of 2 hidden layers: 1024 ReLU units and 256 ReLU units. Each hidden layer was followed by a 10% dropout layer.&lt;/p&gt;

&lt;p&gt;The whole model looked like this when written in Keras:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;X&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Input&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;shape&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;n_x&lt;/span&gt;&lt;span class="p"&gt;,))&lt;/span&gt;
&lt;span class="n"&gt;Y&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Dense&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1024&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;activation&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;relu&lt;/span&gt;&lt;span class="p"&gt;)(&lt;/span&gt;&lt;span class="n"&gt;X&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;Y&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Dropout&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;0.1&lt;/span&gt;&lt;span class="p"&gt;)(&lt;/span&gt;&lt;span class="n"&gt;Y&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;Y&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Dense&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;256&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;activation&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;relu&lt;/span&gt;&lt;span class="p"&gt;)(&lt;/span&gt;&lt;span class="n"&gt;Y&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;Y&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Dropout&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;0.1&lt;/span&gt;&lt;span class="p"&gt;)(&lt;/span&gt;&lt;span class="n"&gt;Y&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;Y&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Dense&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;n_y&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;activation&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;softmax&lt;/span&gt;&lt;span class="p"&gt;)(&lt;/span&gt;&lt;span class="n"&gt;Y&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Triumph giving way to skepticism
&lt;/h2&gt;

&lt;p&gt;Results of 99% accuracy are a bit odd for this kind of problem, and even more odd for a first-time researcher.&lt;/p&gt;

&lt;p&gt;Taking a closer look at the results, I noticed that for a fixed roster, varying the talent only affected the predicted result by ~5%. This was not what I was expecting.&lt;/p&gt;

&lt;p&gt;To verify this, I re-trained the model, this time without the talents. Taking the talents portion out of the input data did not change the results: accuracy stayed at 99%.&lt;/p&gt;
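Dropping the talents amounts to slicing their columns out of the input matrix. A sketch, assuming (as in the heatmap section further down) that features 0-89 are heroes and 90-225 are talents:

```python
import numpy as np

N_HERO_FEATURES = 90  # heroes occupy features 0-89; talents occupy 90-225

def drop_talents(X):
    """Remove the talent one-hot columns, keeping only the hero features."""
    return X[:, :N_HERO_FEATURES]

X = np.zeros((4, 226))  # 4 toy data entries, 226 features each
X[:, 120] = 1           # some talent selected (a talent-range column)
X_no_talents = drop_talents(X)
print(X_no_talents.shape)  # (4, 90)
```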

&lt;p&gt;Skepticism intensifying... 🤔&lt;/p&gt;

&lt;h2&gt;
  
  
  Matches present in both train and dev/test sets
&lt;/h2&gt;

&lt;p&gt;Digging a little deeper, I realized that shuffling the dataset was a mistake. Because each match result is represented in 6 different data entries, shuffling the dataset ended up spreading the data of each match across the training, dev and test sets.&lt;/p&gt;

&lt;p&gt;In other words, I was validating my model on the same data that I was training it on.&lt;/p&gt;

&lt;h1&gt;
  
  
  &lt;a id="cleaning-data"&gt;&lt;/a&gt;Cleaning data
&lt;/h1&gt;

&lt;p&gt;My obvious next step was to clean up this mess.&lt;/p&gt;

&lt;h2&gt;
  
  
  Assertions
&lt;/h2&gt;

&lt;p&gt;My first priority was to introduce some assertions so that this could not happen again (I'm used to TDD, so this made perfect sense to me).&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;training_set_ids&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;set&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="n"&gt;match&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;matchID&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;match&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;training_set&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;span class="n"&gt;dev_set_ids&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;set&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="n"&gt;match&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;matchID&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;match&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;dev_set&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;span class="n"&gt;test_set_ids&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;set&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="n"&gt;match&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;matchID&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;match&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;test_set&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;

&lt;span class="c1"&gt;# Assert that the three sets do not overlap
&lt;/span&gt;&lt;span class="nf"&gt;assert&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;len&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;training_set_ids&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt; &lt;span class="n"&gt;dev_set_ids&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nf"&gt;assert&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;len&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;training_set_ids&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt; &lt;span class="n"&gt;test_set_ids&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="nf"&gt;assert&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;len&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;dev_set_ids&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt; &lt;span class="n"&gt;test_set_ids&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Omit shuffling
&lt;/h2&gt;

&lt;p&gt;Obviously, with these lines in place, my training script failed, which confirmed my earlier observations. The simplest fix was to stop shuffling my data (and to tweak the size of the sets to a multiple of 6).&lt;/p&gt;

&lt;h2&gt;
  
  
  Treating data as time-series
&lt;/h2&gt;

&lt;p&gt;I wasn't particularly happy to omit shuffling my data. Trying to debate it &lt;a href="https://en.wikipedia.org/wiki/Rubber_duck_debugging" rel="noopener noreferrer"&gt;with my rubber duck companion&lt;/a&gt;, I recalled that when a model is trained on a time series (e.g. meteorological or financial data), the data is not shuffled. In those cases, it's important for the model to be able to predict the &lt;em&gt;future&lt;/em&gt; given the &lt;em&gt;past&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;In this case, though, the argument was pretty weak. Each match is largely independent of the previous ones. It does depend on the previous matches of the players involved, but our model is not built to accommodate that knowledge.&lt;/p&gt;

&lt;p&gt;Nevertheless, this debate was enough to let me rest for a while and lower the priority of safely shuffling my data. Not shuffling would suffice for now.&lt;/p&gt;
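For the record, a safe shuffle is straightforward: shuffle at the match level, so that all 6 entries of a match land in the same set. A sketch of the idea:

```python
import random
from collections import defaultdict

def shuffle_by_match(entries, seed=42):
    """Shuffle entries at the match level so a match's 6 entries stay together."""
    by_match = defaultdict(list)
    for entry in entries:
        by_match[entry["matchID"]].append(entry)  # group entries by match
    match_ids = list(by_match)
    random.Random(seed).shuffle(match_ids)        # shuffle whole matches, not entries
    return [entry for mid in match_ids for entry in by_match[mid]]

# 10 toy matches with 6 entries each
entries = [{"matchID": f"m{i // 6}", "y": i % 2} for i in range(60)]
shuffled = shuffle_by_match(entries)
# every consecutive block of 6 entries still belongs to a single match
assert all(len({e["matchID"] for e in shuffled[i:i + 6]}) == 1
           for i in range(0, 60, 6))
```

Splitting the shuffled list at multiples of 6 then keeps each match entirely inside one of the train/dev/test sets.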

&lt;h1&gt;
  
  
  &lt;a id="later-results-with-clean-data"&gt;&lt;/a&gt;Later results with clean data
&lt;/h1&gt;

&lt;h2&gt;
  
  
  Compared to baseline accuracy of 50%
&lt;/h2&gt;

&lt;p&gt;Re-training the model (this time on 500k data entries, ~83k matches) gave pretty bad results:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  Dev set accuracy hovered around &lt;strong&gt;56%&lt;/strong&gt; (a "pick a random number" baseline would succeed 50% of the time).&lt;/li&gt;
&lt;li&gt;  Dev set loss grew steadily while training set loss shrank.&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  &lt;a id="difference-between-training-with-and-without-talents"&gt;&lt;/a&gt;Difference (or lack thereof) between training with and without talents
&lt;/h1&gt;

&lt;p&gt;I also compared results between training the model with and without talents:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;  &lt;a href="https://www.floydhub.com/vrinek/projects/vainglory-stats/31" rel="noopener noreferrer"&gt;https://www.floydhub.com/vrinek/projects/vainglory-stats/31&lt;/a&gt; is the model trained with talents&lt;/li&gt;
&lt;li&gt;  &lt;a href="https://www.floydhub.com/vrinek/projects/vainglory-stats/30" rel="noopener noreferrer"&gt;https://www.floydhub.com/vrinek/projects/vainglory-stats/30&lt;/a&gt; is the same model, trained without talents&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The difference in accuracy is marginal on the dev set. Even on the training set there does not seem to be any benefit when adding the talents to the dataset (80% vs 79.3%).&lt;/p&gt;

&lt;h2&gt;
  
  
  Contribution of talents in first layer
&lt;/h2&gt;

&lt;p&gt;In order to verify the contribution level of the talents, I plotted the weights of the first layer as a heatmap:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fi.imgur.com%2FopRFwZm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fi.imgur.com%2FopRFwZm.png" alt="heatmap graph"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With this visualization, it was pretty easy to spot the difference: 0-89 on the Y axis represents the hero feature weights and 90-225 the talent ones. On average, the hero-specific features are fitted to more extreme weights than the talent ones.&lt;/p&gt;

&lt;p&gt;Some random sampling of the results validated this: picking one data entry and varying the selected talent usually affected the output value by up to 5%.&lt;/p&gt;
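The same observation can be quantified by comparing mean weight magnitudes per feature group. Here `W` stands in for the first Dense layer's kernel (in Keras, `model.layers[1].get_weights()[0]`); the numbers are synthetic, chosen just to make the sketch self-contained:

```python
import numpy as np

rng = np.random.default_rng(0)
# stand-in kernel: hero rows fitted to larger weights than talent rows
W = np.vstack([rng.normal(0, 1.0, (90, 1024)),    # hero features (rows 0-89)
               rng.normal(0, 0.1, (136, 1024))])  # talent features (rows 90-225)

hero_magnitude = np.abs(W[:90]).mean()    # mean |weight| over hero rows
talent_magnitude = np.abs(W[90:]).mean()  # mean |weight| over talent rows
print(hero_magnitude > talent_magnitude)  # heroes carry more extreme weights
```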

&lt;h1&gt;
  
  
  &lt;a id="what-did-we-learn"&gt;&lt;/a&gt;What did we learn?
&lt;/h1&gt;

&lt;p&gt;In the end, my motivation for working on this problem had more to do with gaining hands-on, top-to-bottom experience with a machine learning problem and less with actually solving it.&lt;/p&gt;

&lt;p&gt;So, what did I learn?&lt;/p&gt;

&lt;h2&gt;
  
  
  About the problem
&lt;/h2&gt;

&lt;p&gt;The model did not learn much. This could mean a few things:&lt;/p&gt;

&lt;h3&gt;
  
  
  The model's architecture cannot fit the data
&lt;/h3&gt;

&lt;p&gt;This is a possibility. Given, though, that the same architecture managed to learn all of its training matches by heart, I doubt a more complex model would be of much benefit.&lt;/p&gt;

&lt;h3&gt;
  
  
  Not enough data
&lt;/h3&gt;

&lt;p&gt;Quite possible. Right now, with 600k data entries, we stand at 100k matches. That may be too small a dataset for this problem.&lt;/p&gt;

&lt;p&gt;The obvious solution is to gather more data. Since I have a little VM working tirelessly on exactly that, this should be a matter of time.&lt;/p&gt;

&lt;h3&gt;
  
  
  A different mix of features would improve results
&lt;/h3&gt;

&lt;p&gt;As an active Vainglory player, I have a little "expert" knowledge on the subject matter. A match's result is influenced by the team composition (the heroes on each team), but that's hardly the only influencing factor. Others include:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Player experience&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;One of the most important factors in a match's outcome is the experience of the players involved. Vainglory is very much a skill-based game.&lt;/p&gt;

&lt;p&gt;The Vainglory API exposes data on the players' skill level but it was not included in the model in an effort to keep it simple.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Player fitness&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;It can be argued that a player performs best at certain times or under certain conditions. These may include the time of day (being tired when playing at 2 AM), network connection speed (lag hurts both the performance and the mental stability of the player), and even notifications popping up while playing (especially if the player accidentally taps on one).&lt;/p&gt;

&lt;p&gt;The Vainglory API does not have (and cannot easily have) data on this, so our model will not be able to account for it directly. We might, however, be able to infer it indirectly.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Talent levels&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;As I mentioned earlier, talents have levels. It's possible that a talent at level 1 and one at level 10 influence the result in different ways.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Player AFK&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Sometimes a player goes AFK. Sometimes a bot takes over (if the game recognises this behaviour); other times the hero is left idling on the map. In either case, the team's chances are expected to worsen.&lt;/p&gt;

&lt;p&gt;The Vainglory API has info on this and it could be incorporated into our data pretty easily.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Player - hero compatibility/experience&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In other words, "how good is this player when they play this particular hero?" Until recently, Vainglory did not provide data on this. It could be inferred by examining a player's historic matches with said hero, but that's a separate problem in its own right.&lt;/p&gt;

&lt;p&gt;Recently though, Vainglory started recording a sort of "hero XP" bar that fills up after a match finishes. This could be a good enough proxy for hero/player compatibility.&lt;/p&gt;
&lt;/li&gt;
&lt;/ol&gt;
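&lt;p&gt;To make this concrete, here is one hypothetical way two of the factors above (the skill tier exposed by the API and the AFK flag) could be appended to the existing per-player one-hot encoding. The function name and scaling are illustrative, not the actual pipeline:&lt;/p&gt;

```python
import numpy as np

NUM_HEROES, NUM_TALENTS = 90, 136  # block sizes used in this post

def encode_player(hero_id, talent_id, skill_tier, went_afk):
    """Hypothetical extension of the per-player feature vector: the
    existing hero/talent one-hots plus two of the factors listed above
    (skill tier and the AFK flag)."""
    hero = np.zeros(NUM_HEROES)
    hero[hero_id] = 1.0
    talent = np.zeros(NUM_TALENTS)
    talent[talent_id] = 1.0
    extras = np.array([skill_tier / 10.0,           # skill tier, normalized
                       1.0 if went_afk else 0.0])   # AFK flag from the API
    return np.concatenate([hero, talent, extras])

v = encode_player(hero_id=12, talent_id=40, skill_tier=7, went_afk=False)
print(v.shape)  # (228,)
```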

&lt;h2&gt;
  
  
  About data hygiene
&lt;/h2&gt;

&lt;p&gt;The only way a model has to learn is through data. If the training set is not sufficiently separate from the dev and test ones, then the model will not generalize.&lt;/p&gt;

&lt;p&gt;A lack of generalization was exactly the problem I had, but it manifested differently than I expected: instead of a big gap between training and dev set error, I saw perfect scores where I most probably should not have been seeing any.&lt;/p&gt;
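&lt;p&gt;Concretely, the fix is to split on match IDs rather than on individual rows, so that rows from the same match can never straddle the train/dev/test boundaries. A minimal sketch:&lt;/p&gt;

```python
import numpy as np

def split_by_match(match_ids, dev_frac=0.1, test_frac=0.1, seed=0):
    """Assign whole matches to train/dev/test so that the six rows of
    a match can never straddle a split boundary (the hygiene bug I hit)."""
    rng = np.random.default_rng(seed)
    unique = np.unique(match_ids)
    rng.shuffle(unique)
    n_test = int(len(unique) * test_frac)
    n_dev = int(len(unique) * dev_frac)
    test_set = set(unique[:n_test])
    dev_set = set(unique[n_test:n_test + n_dev])
    labels = np.array(["test" if m in test_set
                       else "dev" if m in dev_set
                       else "train" for m in match_ids])
    return (np.where(labels == "train")[0],
            np.where(labels == "dev")[0],
            np.where(labels == "test")[0])

# Six rows per match (one per player), as in the dataset described here.
match_ids = np.repeat(np.arange(100), 6)
train, dev, test = split_by_match(match_ids)

# Sanity check: no match id appears in more than one set.
assert set(match_ids[train]).isdisjoint(match_ids[dev])
assert set(match_ids[dev]).isdisjoint(match_ids[test])
```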

&lt;h1&gt;
  
  
  &lt;a id="next-steps"&gt;&lt;/a&gt;Next steps
&lt;/h1&gt;

&lt;h2&gt;
  
  
  More data
&lt;/h2&gt;

&lt;p&gt;As I mentioned before, I'll keep gathering more data. Once I hit 1.2m records (200k matches), I'll give this another shot. I don't expect the dev set accuracy to improve much, but I do expect the training set accuracy to fall closer to the dev set's, since the model will be forced to generalize more with all that data.&lt;/p&gt;

&lt;p&gt;I'll also try expanding my dataset on the other dimension: adding more features. This will take some thinking on which features to focus on and how to model them.&lt;/p&gt;

&lt;h2&gt;
  
  
  Application
&lt;/h2&gt;

&lt;p&gt;The end goal of this experiment is to build an application that serves this model as a web service. I have identified a couple of useful technologies for getting this done (e.g. &lt;a href="https://js.tensorflow.org" rel="noopener noreferrer"&gt;tensorflow-js&lt;/a&gt;) and I'm planning to give them a try while waiting for the dataset to gather itself.&lt;/p&gt;
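&lt;p&gt;If it does end up on tensorflow-js, the usual route is converting the saved Keras model with the &lt;code&gt;tensorflowjs_converter&lt;/code&gt; CLI. The file names below are placeholders:&lt;/p&gt;

```shell
pip install tensorflowjs

# Convert a saved Keras model (model.h5 is a placeholder name) into
# the web-friendly format that tf.loadLayersModel() can fetch.
tensorflowjs_converter --input_format keras model.h5 web_model/
```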

</description>
      <category>machinelearning</category>
      <category>keras</category>
      <category>python</category>
    </item>
  </channel>
</rss>
