Parsing the Interwebs

glennanj1 ・ 1 min read

My first CLI project built from scratch used Nokogiri. The Bastards Book of Ruby was a great resource; I highly recommend referring to its website if you're looking to build a web-scraper CLI.

Grabbing CSS selectors for the first time was not an easy task. The internet is deep, and so are its selectors. You might be asking yourself: how do you grab selectors? With Nokogiri, a Ruby gem, you fetch and parse the page at a URL into nodes that you can then iterate over and use to present data to the end user. If you inspect the page in your browser and grab a CSS selector from the HTML you parsed, you can pull out very specific data.

I found that not every website could be scraped, and that was my biggest setback. As a beginner in software development, sometimes it's better to take a step back and realize that it's not your fault you're not getting the results you expect. The key here is to keep trying and not give up. You will produce results with perseverance and patience.
