<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: meikay</title>
    <description>The latest articles on DEV Community by meikay (@meikay).</description>
    <link>https://dev.to/meikay</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F128135%2Fc9dc1e83-0fdb-4bea-a178-09023bfabf38.jpg</url>
      <title>DEV Community: meikay</title>
      <link>https://dev.to/meikay</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/meikay"/>
    <language>en</language>
    <item>
      <title>How To Scale your Startup with Web Scraping</title>
      <dc:creator>meikay</dc:creator>
      <pubDate>Tue, 29 Oct 2019 21:42:09 +0000</pubDate>
      <link>https://dev.to/meikay/how-to-scale-your-startup-with-web-scraping-2fb</link>
      <guid>https://dev.to/meikay/how-to-scale-your-startup-with-web-scraping-2fb</guid>
      <description>&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fthepracticaldev.s3.amazonaws.com%2Fi%2Fc05peabvw9fotfik95op.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fthepracticaldev.s3.amazonaws.com%2Fi%2Fc05peabvw9fotfik95op.jpg" alt="startup"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Web scraping is a phrase often heard throughout the tech industry. But what is it, and how would you apply it to scale your startup or business? We'll cover three ways to put the data web scraping produces to work.&lt;/p&gt;

&lt;h3&gt;
  
  
  Why is Data Important to Startups and Businesses?
&lt;/h3&gt;

&lt;p&gt;Data gives you concrete numbers, letting companies compare themselves to the competition and stay ahead! When used correctly, data reduces wasted money and time by improving processes and pinpointing where a company should focus its energy. It becomes especially important when marketing to consumers, providing insight into their preferences, behaviors, and habits. As you collect more data, you can make the changes needed to improve your conversion rates.&lt;/p&gt;

&lt;h3&gt;
  
  
  What is Web Scraping?
&lt;/h3&gt;

&lt;p&gt;Web scraping, also known as data scraping or data harvesting and often carried out by web crawlers, is a way to extract data from websites in a usable, structured format. It can be done in many different ways, such as manual data gathering, custom scripts, or web scraping tools like &lt;a href="https://www.scraperapi.com/?fp_ref=meisuleen60" rel="noopener noreferrer"&gt;ScraperAPI&lt;/a&gt;, which makes web scraping very simple by finding proxies, setting up headless browsers, and handling CAPTCHAs for you. &lt;a href="https://www.scraperapi.com/?fp_ref=meisuleen60" rel="noopener noreferrer"&gt;ScraperAPI&lt;/a&gt; is great because you can get started with 1,000 free API calls and then switch over to their advanced plan. To get access to much more and 10% off, use the promo code SCRAPE217831. Take a look at their &lt;a href="https://www.scraperapi.com/documentation" rel="noopener noreferrer"&gt;documentation&lt;/a&gt; to start scraping! Data scientists and companies often choose web scraping tools because they are fast, cost effective, and highly accurate.&lt;/p&gt;
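&lt;p&gt;As a rough sketch of how a scraping API like this is typically called (the API key and target URL below are placeholders, and the request shape follows ScraperAPI's documented query-parameter style; check their docs for the exact details):&lt;/p&gt;

```ruby
require "uri"
require "net/http"

# Placeholder values; substitute your own API key and target page.
api_key    = "YOUR_API_KEY"
target_url = "https://example.com/products"

# ScraperAPI-style call: the target URL is passed as a query parameter,
# and the service fetches it through its own proxies on your behalf.
request_url = "http://api.scraperapi.com/?" +
              URI.encode_www_form(api_key: api_key, url: target_url)

# An HTTP client could then fetch the proxied page, e.g.:
# html = Net::HTTP.get(URI(request_url))
puts request_url
```

&lt;p&gt;The returned HTML can then be parsed with any HTML library to extract the data you need.&lt;/p&gt;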

&lt;h3&gt;
  
  
  How Would You Use the Data From Web Scraping?
&lt;/h3&gt;

&lt;p&gt;&lt;em&gt;1. Market Research &amp;amp; Analysis of Collected Data&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;You can use web scraping to compare market prices, keeping your product competitive and winning over customers from your competitors. Startups and businesses should also watch competitor promotions to see how they address their target audience. This lets you avoid repeating the mistakes of unsuccessful startups and predict new trends in your market.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;2. Track Your Online Reputation&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Check what people think once they’ve tried your product. Data collected from reviews and recommendations on specific websites or social media shows you what people think, so you can optimize your product to meet their needs. Gathering information on your industry, competitors, partners, and customers, paired with thought-out business strategies and promotion campaigns, is the best way to scale your startup or business.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;3. Target Your Audience and Market Your Product&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Using web scraping, you can research your target audience, analyzing their preferences and behaviors so you can appeal to them and convince them to buy your product. You can also scrape data on advertising, learning from the mistakes and accomplishments of well-established businesses so you can focus on growth rather than just staying afloat.&lt;/p&gt;

&lt;p&gt;In conclusion, web scraping is very useful for staying ahead of your competition, and with the emergence of web scraping tools such as &lt;a href="https://www.scraperapi.com/?fp_ref=meisuleen60" rel="noopener noreferrer"&gt;ScraperAPI&lt;/a&gt;, it is easier and faster than ever before. Keep in mind that analyzing data, tracking your online reputation, and targeted marketing is a reliable path to startup and business growth.&lt;/p&gt;

</description>
      <category>startup</category>
      <category>webscraping</category>
      <category>growthhacker</category>
      <category>webcrawler</category>
    </item>
    <item>
      <title>Sinatra CRUD App</title>
      <dc:creator>meikay</dc:creator>
      <pubDate>Thu, 24 Oct 2019 02:05:35 +0000</pubDate>
      <link>https://dev.to/meikay/sinatra-crud-app-1c11</link>
      <guid>https://dev.to/meikay/sinatra-crud-app-1c11</guid>
      <description>&lt;h3&gt;
  
  
  How to Build a CRUD App Using Sinatra and MVC
&lt;/h3&gt;

&lt;p&gt;My Homework Log app lets users sign up, log in, and log out, and log homework assignments as well as edit and delete them.&lt;/p&gt;

&lt;h3&gt;
  
  
  MVC Structure
&lt;/h3&gt;

&lt;p&gt;Before diving into the project, I thought about what my MVC would look like. What type of data do I need in my models? How do I set up my routes and connect everything in my controllers? And what would all this look like in my views? Instead of building out the MVC structure from scratch, I used a gem named Corneal, created by Brian Emory, which scaffolds the MVC structure for you. Its simple installation makes setup a breeze!&lt;/p&gt;

&lt;p&gt;I built a User model and a Homework_assignment model for my project.&lt;/p&gt;

&lt;h3&gt;
  
  
  Active Record Associations
&lt;/h3&gt;

&lt;p&gt;For this project, we were required to use at least one has_many relationship on a User model and one belongs_to relationship on another model. Each user has_many homework assignments and each homework assignment belongs_to a user.&lt;/p&gt;
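&lt;p&gt;As a plain-Ruby sketch of the behavior these two macros provide (the real models inherit from ActiveRecord::Base; this just mimics the shape of the association using the post's model names):&lt;/p&gt;

```ruby
# has_many gives a user a collection of assignments; belongs_to gives each
# assignment a reference back to its owner. This is only an illustration of
# that two-way shape, not what the ActiveRecord macros actually generate.
class User
  attr_reader :username, :homework_assignments

  def initialize(username)
    @username = username
    @homework_assignments = []
  end
end

class HomeworkAssignment
  attr_reader :title, :user

  def initialize(title, user)
    @title = title
    @user = user                          # the belongs_to side
    user.homework_assignments.push(self)  # the has_many side
  end
end
```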

&lt;h3&gt;
  
  
  Password Encryption
&lt;/h3&gt;

&lt;p&gt;Password protection was required to keep user accounts from being compromised. Rather than storing passwords in plain text, bcrypt uses a hashing algorithm to salt and scramble them. To implement this, I installed the bcrypt gem and gave my users table a password_digest column. I then added has_secure_password to my User model, which exposes all the functionality of the bcrypt gem.&lt;/p&gt;
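&lt;p&gt;bcrypt itself ships as a gem, but the underlying idea can be sketched with Ruby's standard library alone. This is only an illustration of the salt-and-hash pattern using PBKDF2 from OpenSSL, not what bcrypt or has_secure_password actually run:&lt;/p&gt;

```ruby
require "openssl"
require "securerandom"

# A random salt is hashed together with the password through a deliberately
# slow key-derivation function, and only the salt and digest are stored.
def hash_password(password, salt = SecureRandom.hex(16))
  digest = OpenSSL::KDF.pbkdf2_hmac(password, salt: salt, iterations: 20_000,
                                    length: 32, hash: "SHA256")
  [salt, digest.unpack1("H*")]
end

salt, stored = hash_password("s3cret")

# Authentication re-hashes the attempt with the stored salt and compares:
_, attempt = hash_password("s3cret", salt)
puts attempt == stored  # prints "true" for the matching password
```

&lt;p&gt;has_secure_password wraps this whole flow behind an authenticate method, so your controllers never touch the digest directly.&lt;/p&gt;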

&lt;h3&gt;
  
  
  Database Migrations
&lt;/h3&gt;

&lt;p&gt;We’ve been using SQLite for our projects so far, which is lightweight and simple to use. In db/migrate I created a users table and a homework_assignments table, adding the columns needed to make the app usable. I then ran rake db:migrate, which updates the schema file, a Ruby representation of your database’s current structure.&lt;/p&gt;

&lt;h3&gt;
  
  
  Render your View
&lt;/h3&gt;

&lt;p&gt;At this point, we should test the app by adding &lt;code&gt;get '/' do erb :index end&lt;/code&gt; to application_controller.rb. If you’re using Corneal, you should see a welcome page stating that everything has been set up successfully.&lt;/p&gt;

&lt;h3&gt;
  
  
  Controllers
&lt;/h3&gt;

&lt;p&gt;I built three controllers: the application_controller, a homework_assignments_controller, and a users_controller. Together they contain the routes that let the user sign up, sign in and out, and edit their assignments.&lt;/p&gt;

&lt;h3&gt;
  
  
  Login/Signup
&lt;/h3&gt;

&lt;p&gt;We use the action and method attributes in the HTML form to connect it to the POST /signup route in the controllers.&lt;/p&gt;

&lt;h3&gt;
  
  
  Profile Page
&lt;/h3&gt;

&lt;p&gt;It’s time to be redirected to the user’s profile page. This happens by setting up a dynamic route based on the current user, then iterating over the user object in the view to print out details such as user.username.&lt;/p&gt;

&lt;h3&gt;
  
  
  Edit
&lt;/h3&gt;

&lt;p&gt;To support PATCH and DELETE method types, you need to add Rack::MethodOverride to the config.ru file. Second, add a hidden input named &lt;code&gt;_method&lt;/code&gt; with a value of &lt;code&gt;patch&lt;/code&gt; (or &lt;code&gt;delete&lt;/code&gt;) inside the form; Rack::MethodOverride reads it and rewrites the POST into the intended method.&lt;/p&gt;

&lt;h3&gt;
  
  
  Ensure User can only edit and delete their own content
&lt;/h3&gt;

&lt;p&gt;At this point, I look up the saved user ID in the database that matches the entered username and display only the homework assignments belonging to that user.&lt;/p&gt;

&lt;h3&gt;
  
  
  Validate User’s data input
&lt;/h3&gt;

&lt;p&gt;In order for the user to be redirected after signing up, their input needs to be validated so that bad data does not get saved into the database, which would cause problems down the road. I decided to use Active Record validations for this: in models/user.rb I added&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;validates :username, presence: true  &lt;br&gt;
validates :username, uniqueness: true&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

</description>
      <category>ruby</category>
      <category>webdev</category>
      <category>crud</category>
      <category>sinatra</category>
    </item>
    <item>
      <title>My Experience at Codeland 2019</title>
      <dc:creator>meikay</dc:creator>
      <pubDate>Wed, 24 Jul 2019 00:23:46 +0000</pubDate>
      <link>https://dev.to/meikay/my-experience-at-codeland-2019-3270</link>
      <guid>https://dev.to/meikay/my-experience-at-codeland-2019-3270</guid>
      <description>&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Wnnv4sO7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/3zgrX8b.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Wnnv4sO7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/3zgrX8b.jpg" alt="coding"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This year I decided to attend the Codeland conference, which did not disappoint! I have been to other conferences, and this was by far the most fun and inclusive one. The talks were filled with laughter. We learned how to apply programming to everyday problems, and how to take it further by creating apps that can save lives. We also received a lot of goodies from the conference, such as shirts, mugs, stickers, and a whole bunch of other things to show off. &lt;/p&gt;

&lt;p&gt;The conference was great for newcomers and veterans alike. I felt comfortable telling others that I am a student, without feeling like I had to be an expert. They offered workshops for all levels, which made everything less intimidating but still challenging. My workshop of choice was 'Building with accessibility in mind' by Luisa Morales. It was eye opening because it showed me that I could be doing more to make my website and projects more accessible.&lt;/p&gt;

&lt;p&gt;I was so happy to see other Flatiron students who usually attend remote study groups with me! It was great to bond with them and talk about the next steps of our coding journey. I also had the chance to see Saron Yitbarek who is the CEO and founder of CodeNewbie and also put together this amazing conference. She is someone who I look up to for inspiration and motivation on a daily basis. I was a little starstruck.&lt;/p&gt;

&lt;h3&gt;
  
  
  Key Takeaways
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Keep Building things that you are passionate about&lt;/li&gt;
&lt;li&gt;We have special abilities that others don't, so use them for something bigger 
than yourself&lt;/li&gt;
&lt;li&gt;Accessibility should be the standard when building apps&lt;/li&gt;
&lt;li&gt;Don't be afraid to approach people (they're awesome!)&lt;/li&gt;
&lt;li&gt;Keep the user in mind when building&lt;/li&gt;
&lt;li&gt;Would definitely attend again!&lt;/li&gt;
&lt;li&gt;Make the world a better place with programming :)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you saw me at the conference and have any questions feel free to message me or follow me on twitter @meisuleen!&lt;/p&gt;

</description>
      <category>techtalks</category>
      <category>community</category>
      <category>career</category>
      <category>motivation</category>
    </item>
    <item>
      <title>Ruby Web Scraper with Nokogiri</title>
      <dc:creator>meikay</dc:creator>
      <pubDate>Fri, 05 Apr 2019 16:08:41 +0000</pubDate>
      <link>https://dev.to/meikay/cli-data-gem-walk-through-web-scraper-1bg3</link>
      <guid>https://dev.to/meikay/cli-data-gem-walk-through-web-scraper-1bg3</guid>
      <description>&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0xczkjp0x7zryvxp6mhp.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0xczkjp0x7zryvxp6mhp.jpg" alt="coding" width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I was interested in remote full stack positions, so I decided: why not build something around that for my portfolio project? I built a command line application that uses object-oriented programming to scrape remoteok.io for its top 100 full stack developer jobs. The application can do a second-level scrape to retrieve each job description, and it can be installed as a gem in your local environment. My project uses the Nokogiri gem, an open source library for parsing HTML and XML.&lt;/p&gt;

&lt;p&gt;I will be going through how I built my CLI Data Gem project in the following steps:&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1 Scaffold the Gem with Bundler
&lt;/h3&gt;

&lt;p&gt;In your terminal, generate the gem skeleton by typing:&lt;br&gt;
&lt;br&gt;
 &lt;code&gt;bundle gem remote_jobs&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 2 Add Necessary Files
&lt;/h3&gt;

&lt;p&gt;I had to map out the models I would be using throughout my program. I named my folder within lib remote_jobs, and within that folder I added 3 more files containing my classes.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9aqg3idjr9tncc0o4qri.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9aqg3idjr9tncc0o4qri.PNG" alt="files" width="282" height="361"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 3 Setup File Environment
&lt;/h3&gt;

&lt;p&gt;Then, I required the files in remote_jobs.rb to set up my project’s environment. At this point, I had to figure out what objects to build and what their attributes were going to be.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi5c9wswg2m57vg7n93k1.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fi5c9wswg2m57vg7n93k1.PNG" alt="environment" width="706" height="488"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 4 Add Dependencies
&lt;/h3&gt;

&lt;p&gt;In my remote_jobs.gemspec file, I added the necessary dependencies such as Nokogiri, colorize, and pry. I also added information about my gem, such as its name, version, description, authors, and homepage.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsdivbmnhnnkx8sgopmid.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsdivbmnhnnkx8sgopmid.PNG" alt="dependencies" width="800" height="397"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 5 Pseudocode
&lt;/h3&gt;

&lt;p&gt;I imagined how my CLI would behave and wrote down some steps. Then I thought about what methods I would need to make it behave that way. I also needed to decide what data I wanted to scrape.&lt;/p&gt;

&lt;h3&gt;
  
  
  Step 6 Scrape
&lt;/h3&gt;

&lt;p&gt;I looked for the data’s CSS selectors so that I could access the elements through their children. I decided to scrape the first 100 full stack jobs from remoteok.io and iterated over each job object to grab the job name, company name, and URL. On the second level, I scraped the description of each job. However, instead of scraping many times and potentially being locked out of the website, I only scraped if the job the user chose did not already have a description.&lt;/p&gt;
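&lt;p&gt;The conditional second-level scrape can be sketched as simple memoization (the class and method names here are illustrative, not the project's actual code, and fetch_description stands in for the real Nokogiri scrape):&lt;/p&gt;

```ruby
# A job's description is fetched only the first time it is asked for, so
# repeat lookups never re-hit the site.
class Job
  attr_reader :name, :url, :fetch_count

  def initialize(name, url)
    @name = name
    @url = url
    @fetch_count = 0
  end

  def description
    # ||= memoizes: the right-hand side runs only while @description is nil.
    @description ||= fetch_description
  end

  private

  # Stand-in for the second-level page scrape; counts calls for illustration.
  def fetch_description
    @fetch_count += 1
    "Scraped description for #{name}"
  end
end
```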

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F77ce8rvjbohe0xbrifak.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F77ce8rvjbohe0xbrifak.PNG" alt="scrape" width="800" height="309"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Problems I Ran Into &amp;amp; How I(We) Solved Them
&lt;/h3&gt;

&lt;p&gt;So inevitably, I ran into difficulty. During this project I was not allowed to reach out to technical coaches for help, so I relied on my tried and trusted friend Google, as well as my peers, for advice! When you have to code a project from scratch, you start to realize your weak points. I figured out that my understanding of object orientation was not as strong as I had thought, so I re-watched some videos on the subject and applied what I learned to my project.&lt;/p&gt;

&lt;p&gt;As you can see in the heading above, I said I(We). This was meant to encourage you to ask for help when you get stuck. If you have tried googling everything, searched Stack Overflow, and still can’t come to a solution, don’t be afraid to ask for help. That brings me to the next issue I bumped into: the data from the second scrape was not displaying the way I thought it should. When I ran my program, there was a {linebreak} everywhere I looked in the description. So I asked a fellow peer to take a look and see what he thought was causing the issue. We played around with the code in pry (a debugging tool) and settled on using gsub, which globally substitutes every occurrence of its first argument with its second. This let us replace {linebreak} with a literal line break everywhere it appeared in the description.&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;clean_description = description.gsub("{linebreak}", "\n")&lt;br&gt;
puts clean_description.colorize(:blue)&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;All in all, I am grateful to have gone through this experience and can’t wait till my next project!&lt;/p&gt;

</description>
      <category>beginners</category>
      <category>ruby</category>
      <category>webscraper</category>
      <category>webdev</category>
    </item>
  </channel>
</rss>
