<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: andreanasuto</title>
    <description>The latest articles on DEV Community by andreanasuto (@andreanasuto).</description>
    <link>https://dev.to/andreanasuto</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F165491%2Fab884069-8eca-401c-bf53-cf7d1dc89ad3.png</url>
      <title>DEV Community: andreanasuto</title>
      <link>https://dev.to/andreanasuto</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/andreanasuto"/>
    <language>en</language>
    <item>
      <title>Tell Me Your Secret</title>
      <dc:creator>andreanasuto</dc:creator>
      <pubDate>Thu, 23 May 2019 23:44:18 +0000</pubDate>
      <link>https://dev.to/andreanasuto/tell-me-your-secret-3g9k</link>
      <guid>https://dev.to/andreanasuto/tell-me-your-secret-3g9k</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;I'll share a secret and you'll tell me yours&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;'Tell me your secret' is a ROR (Ruby on Rails) app inspired by 'The future of secrets', an art exhibition by Sarah Newman, Jessica Yurkofsky and Rachel Kalmar from &lt;a href="https://metalabharvard.github.io/"&gt;metaLab (at) Harvard&lt;/a&gt;, part of the Berkman Klein Center for Internet &amp;amp; Society.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://tell-me-your-secrets.herokuapp.com/secrets/new"&gt;Live version&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A simple form asks the user to anonymously share a secret. The app automatically stores the IP address of the user submitting the secret.&lt;br&gt;
Once the secret is written down, another secret is shared back, together with the geo-location of that secret.&lt;/p&gt;
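&lt;p&gt;As a rough sketch of that exchange in plain Ruby (the class and method names here are my own illustration, not the actual Rails code):&lt;/p&gt;

```ruby
# Hypothetical sketch of the exchange logic: store the new secret with
# the submitter's IP, then hand back a secret written by someone else.
class SecretStore
  def initialize
    @secrets = []  # each entry: { body: ..., ip: ... }
  end

  def exchange(body, ip)
    # Pick a random secret that did not come from this IP.
    reply = @secrets.reject { |s| s[:ip] == ip }.sample
    @secrets.push({ body: body, ip: ip })
    reply
  end
end
```

&lt;p&gt;In the real app the secret would be persisted to the database and the reply enriched with the geo-location looked up from the stored IP.&lt;/p&gt;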

&lt;h2&gt;The idea&lt;/h2&gt;

&lt;h3&gt;The background&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ZE4cAvpZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://upload.wikimedia.org/wikipedia/commons/4/4e/ELIZA_conversation.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ZE4cAvpZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://upload.wikimedia.org/wikipedia/commons/4/4e/ELIZA_conversation.jpg" alt="ELIZA by prof. Weizenbaum"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The idea behind the project is inspired by the 1966 experiment &lt;a href="https://en.wikipedia.org/wiki/ELIZA"&gt;ELIZA&lt;/a&gt; by the MIT professor &lt;a href="https://en.wikipedia.org/wiki/Joseph_Weizenbaum"&gt;Joseph Weizenbaum&lt;/a&gt;, one of the fathers of artificial intelligence. The program exposed an interesting paradox in the relationship between machine and human.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;While ELIZA was capable of engaging in discourse, ELIZA could not converse with true understanding. However, many early users were convinced of ELIZA’s intelligence and understanding, despite Weizenbaum’s insistence to the contrary.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Users of ELIZA shared deep secrets with it and told the computer extremely personal information. Weizenbaum soon discovered that speaking with a machine felt liberating rather than claustrophobic.&lt;/p&gt;

&lt;h3&gt;Mechanical intimacy&lt;/h3&gt;

&lt;p&gt;This pattern has become more and more relevant with the spread of the internet and social media platforms. Data centers are now among the largest repositories of personal secrets in the world, mostly because we &lt;em&gt;give&lt;/em&gt; them this information, just like in the ELIZA experiment.&lt;br&gt;
'Tell me your secret' is a project where the act of sharing a secret is a human-to-human bilateral interaction through a machine, instead of a unilateral and solitary experience.&lt;br&gt;
At the same time, it raises awareness of how much private information we share online and how long that information is stored in data centers.&lt;/p&gt;

</description>
      <category>rails</category>
      <category>ror</category>
      <category>geolocation</category>
      <category>socialnetwork</category>
    </item>
    <item>
      <title>A portfolio tracker for cryptocurrencies</title>
      <dc:creator>andreanasuto</dc:creator>
      <pubDate>Fri, 17 May 2019 17:32:55 +0000</pubDate>
      <link>https://dev.to/andreanasuto/a-portfolio-tracker-for-cryptocurrencies-3j2d</link>
      <guid>https://dev.to/andreanasuto/a-portfolio-tracker-for-cryptocurrencies-3j2d</guid>
      <description>&lt;p&gt;I first heard about Bitcoin in 2011 during a discussion with a bunch of friends studying finance like me.&lt;br&gt;
For the following years, I've never touched or thought about buying bitcoins or other coins until I came across another discussion, similar set up (a bunch of friends sitting at one table) but different locations, the US instead of Italy.&lt;br&gt;
This time I did my research. &lt;/p&gt;

&lt;p&gt;Problems:&lt;br&gt;
I was surprised by the lack of information, the 'greyness' of the market, and the absence of old-school financial-statistical tools and analyses such as Markowitz portfolio optimization. After all, I was a finance student once.&lt;br&gt;
I started to buy some coins with a tiny amount of money, mostly as trading experiments. I ended up creating another old-school tool: an Excel spreadsheet with my different positions and trades. I bought some coins and sold others over time, but I still had another issue.&lt;br&gt;
The volatility of cryptocurrencies is well known. In a day a coin can lose 20% of its value, only to gain it back a day later: it's a rollercoaster. Keeping the current value of multiple coins (i.e. a portfolio) up to date can be a nightmare, since you have to fetch each coin's price by hand every day, multiply it by the quantity you hold, and finally sum it all up.&lt;/p&gt;
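&lt;p&gt;The spreadsheet math the app automates boils down to a few lines of Ruby (the prices and quantities below are made-up illustration values):&lt;/p&gt;

```ruby
# Current USD price per coin and quantity held, per coin symbol.
prices   = { "BTC" => 7000.0, "ETH" => 250.0 }
holdings = { "BTC" => 0.5,    "ETH" => 4.0 }

# Multiply each coin's price by the quantity held, then sum it all up.
total = holdings.sum { |coin, qty| prices[coin] * qty }
```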

&lt;p&gt;User research:&lt;br&gt;
Before jumping into the project, I drew on my skills and experience as a human-centered researcher to study real user cases and extract some hints. I reached out to friends investing in alt-coins and asked about their routines with these investments to create a study sample.&lt;br&gt;
I was impressed by how many times per day they check how the market is going. Secondly, some of them keep spreadsheets similar to mine, others more hierarchical ones: first the total value of the investments, then separate sheets with each coin position and trading logs. Others simply compute the gain or loss on the fly.&lt;/p&gt;

&lt;p&gt;Moreover, like the majority of investors, I am interested first in the total value of the portfolio and its gain, and only secondly in a detailed analysis of price trends for each coin. Even though I am a fan of data, I was looking for something incredibly simple.&lt;br&gt;
In other words: a tool that tells me, clearly and first of all, how much my investment is worth. Nothing more. Clarity and simplicity before a Pandora's box of charts opens up.&lt;/p&gt;

&lt;p&gt;Solutions:&lt;br&gt;
First of all, I needed an automated way to get data, mainly coin prices. Coinmarketcap.com is probably the best source, thanks to its comprehensive listing of alt-coins and their respective information.&lt;br&gt;
Using Nokogiri, a beautiful Ruby gem, I was able to extract the price of each coin. But it wasn't enough.&lt;br&gt;
New coins come out quickly: in 2017 alone there were 235 new ICOs (source: coinschedule). I also needed to constantly pick up these new coin names along with their prices. After creating a simple login/signup page, I used the findings from the user research to build the main page where all the fundamental actions take place: on top, the current total value of the portfolio; below it, the portfolio's composition and an immediate call to action to add a coin and its quantity. This way, you can add or remove coins without navigating to another page. The information shown for each coin (which could include its price, value in bitcoin and so forth) is reduced to two fields: price and quantity.&lt;br&gt;
However, you can still visit a single coin page and perform a similar action while seeing more information.&lt;/p&gt;
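&lt;p&gt;Behind those main-page actions sits a model along these lines (a plain-Ruby sketch; the class and method names are hypothetical, not the app's actual code):&lt;/p&gt;

```ruby
# Hypothetical portfolio model: coins can be added to or removed from
# the portfolio in place, mirroring the main-page call to action.
class Portfolio
  def initialize
    @positions = {}  # coin symbol mapped to quantity held
  end

  # Adding the same coin again tops up the existing position.
  def add_coin(name, quantity)
    @positions[name] = @positions.fetch(name, 0) + quantity
  end

  def remove_coin(name)
    @positions.delete(name)
  end

  def coins
    @positions.keys
  end
end
```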

</description>
      <category>crypto</category>
      <category>ruby</category>
      <category>sinatra</category>
      <category>fintech</category>
    </item>
    <item>
      <title>Building a Ruby gem with CLI that scrapes data from an online page</title>
      <dc:creator>andreanasuto</dc:creator>
      <pubDate>Mon, 13 May 2019 16:44:44 +0000</pubDate>
      <link>https://dev.to/andreanasuto/building-a-ruby-gem-with-cli-that-scrapes-data-from-an-online-page-2137</link>
      <guid>https://dev.to/andreanasuto/building-a-ruby-gem-with-cli-that-scrapes-data-from-an-online-page-2137</guid>
      <description>&lt;p&gt;I've been completing my software development bootcamp in full-stack web development at Flatiron School (now part of the WeWork galaxy).&lt;/p&gt;

&lt;p&gt;As the first portfolio project, I was required to develop a command-line application.&lt;/p&gt;

&lt;p&gt;The task for the CLI project is to build a Ruby gem that provides a Command Line Interface (CLI) to an external data source, backed by an object-oriented Ruby application.&lt;br&gt;
I could scrape information from any webpage I liked: a weather forecasting page, a news page, anything sport-related. Based on that information I created my own program. I picked the Transfermarkt website (&lt;a href="https://www.transfermarkt.com/"&gt;https://www.transfermarkt.com/&lt;/a&gt;), scraped up-to-date information on my favorite soccer league (or football league, depending on which side of the ocean you live on!) and played around with that information. I created a program that provides the user with detailed information about a specific team in the league I selected: Serie A, the top Italian soccer league.&lt;br&gt;
I used the fantastic Nokogiri gem in Ruby to scrape data from the website.&lt;/p&gt;
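&lt;p&gt;The object model can be sketched roughly like this (class and method names are my own illustration, not the gem's actual API): each scraped row becomes a Team object the CLI can look up by league position.&lt;/p&gt;

```ruby
# Hypothetical Team model: instances register themselves in a class-level
# collection so the CLI can find a team by its rank in the standings.
class Team
  attr_reader :rank, :name

  @@all = []

  def initialize(rank, name)
    @rank = rank
    @name = name
    @@all.push(self)
  end

  def self.all
    @@all
  end

  def self.find_by_rank(rank)
    @@all.find { |team| team.rank == rank }
  end
end
```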

&lt;p&gt;Transfermarkt is a great German website for soccer. It has a lot of data, including several detailed financial values. I've used it in the past to make some fun computations and analyses on European soccer.&lt;/p&gt;

&lt;p&gt;My program shows first and foremost the current standings of the league:&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Welcome to Serie A CLI
1. Juventus
2. SSC Napoli
3. Inter
4. AC Milan
5. Atalanta
6. AS Roma
7. Torino
8. Lazio
9. Sampdoria
10. Cagliari Calcio
11. Fiorentina
12. Sassuolo
13. SPAL
14. Parma
15. Genoa
16. Bologna
17. Udinese Calcio
18. FC Empoli
19. Frosinone
20. Chievo Verona
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;After that, it is possible to select a team based on its ranking and see information such as:&lt;/p&gt;

&lt;p&gt;(I've picked my favorite team as an example: Inter Milan)&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;"Inter is currently valued $635.44mn"
Total team market value in millions of $

"Inter players average age is 28,8"
The average age of the team

"They play at Stadio San Siro - Giuseppe Meazza"
This line shows the stadium where they play

&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



</description>
      <category>cli</category>
      <category>ruby</category>
      <category>nokogiri</category>
      <category>objectorientedruby</category>
    </item>
  </channel>
</rss>
