<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Jonathan Fetterolf</title>
    <description>The latest articles on DEV Community by Jonathan Fetterolf (@fetterollie).</description>
    <link>https://dev.to/fetterollie</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F966076%2F9149ce18-eaec-4a11-a3e4-f17bdeaf95ec.jpg</url>
      <title>DEV Community: Jonathan Fetterolf</title>
      <link>https://dev.to/fetterollie</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/fetterollie"/>
    <language>en</language>
    <item>
      <title>Collab Project</title>
      <dc:creator>Jonathan Fetterolf</dc:creator>
      <pubDate>Thu, 25 Sep 2025 03:25:44 +0000</pubDate>
      <link>https://dev.to/fetterollie/collab-project-3j0l</link>
      <guid>https://dev.to/fetterollie/collab-project-3j0l</guid>
      <description>&lt;p&gt;&lt;a href="https://github.com/fetterollie" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt; | &lt;a href="https://www.linkedin.com/in/jonathanfetterolf/" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt; | &lt;a href="https://twitter.com/fetterollie" rel="noopener noreferrer"&gt;Twitter&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I recently had an IRL friend (a concert-going, beer-drinking, game-playing kind of friend) ask me for a reference. He wasn't specific about what type or what it was for, but he is job hunting, so I figured he meant professional. I told him that I'd be more than happy to give a personal reference but couldn't give him a professional one... UNLESS... we did a collab project on the side.&lt;/p&gt;




&lt;h2&gt;
  
  
  WHAT TO DO!?!?
&lt;/h2&gt;

&lt;p&gt;Well, I've been mulling over an elaborate idea: some sort of content-generation application that streamlines and automates most steps in the content pipeline. We had a quick brainstorming meeting and came up with a few wants, needs, and requirements for what we'll be building.&lt;/p&gt;




&lt;h2&gt;
  
  
  WELL, WHAT WILL IT HAVE!?!?
&lt;/h2&gt;

&lt;p&gt;At this point we know that a lot of the services will be backend-heavy, so we started plotting our meat-&amp;amp;-potatoes API. Beyond that, we think we'll have some sort of text-to-speech (TTS) service, an image gathering/generation service, some sort of LLM service, and all of it will sit in front of a PostgreSQL database.&lt;/p&gt;




&lt;h2&gt;
  
  
  WHAT MAKES THIS SO FUN!?
&lt;/h2&gt;

&lt;p&gt;Well, my friend has ZERO, ZILCH, NADA collaborative development experience. So, first things first, we came up with a list of things for him to familiarize himself with, as well as the right ways to do them. We're starting from the start here: IDE, Git (CLI), project documentation, ENVs, etc., etc., etc. Also, most importantly: no cheating with "ChatGerald" or "Geminguy". Rely on documentation and Stack Overflow as much as you can. If you end up at one of those aforementioned helpers, use it to explain concepts to you, not to do the work for you!&lt;/p&gt;




&lt;h2&gt;
  
  
  ANYHOW...
&lt;/h2&gt;

&lt;p&gt;That's where we're at. If you want in, please let me know. I'm sure we'll need an abundance of help. Also, this all happened a month ago, so the project's already underway...&lt;/p&gt;

&lt;p&gt;Peace, love, and shiny things...&lt;/p&gt;




&lt;h2&gt;
  
  
  Want to Follow Along?
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://github.com/fetterollie" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt; | &lt;a href="https://www.linkedin.com/in/jonathanfetterolf/" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt; | &lt;a href="https://twitter.com/fetterollie" rel="noopener noreferrer"&gt;Twitter&lt;/a&gt;&lt;/p&gt;

</description>
      <category>programming</category>
      <category>ai</category>
      <category>socialmedia</category>
      <category>career</category>
    </item>
    <item>
      <title>Coding: Growth, Learning, &amp; Transformation</title>
      <dc:creator>Jonathan Fetterolf</dc:creator>
      <pubDate>Thu, 17 Aug 2023 21:38:44 +0000</pubDate>
      <link>https://dev.to/fetterollie/coding-growth-learning-transformation-24mj</link>
      <guid>https://dev.to/fetterollie/coding-growth-learning-transformation-24mj</guid>
      <description>&lt;p&gt;&lt;a href="https://github.com/fetterollie"&gt;GitHub&lt;/a&gt; | &lt;a href="https://www.linkedin.com/in/jonathanfetterolf/"&gt;LinkedIn&lt;/a&gt; | &lt;a href="https://twitter.com/fetterollie"&gt;Twitter&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;When I started my new position, I figured it would be technical, but I couldn’t have guessed that it would leave me completing coding challenges in my free time. This new role required me to dive into an array of unfamiliar languages, tools, and technologies. As I navigated this terrain, I recognized the importance of reinforcing my foundational coding skills. The constant problem-solving demands of my day-to-day coding motivated me to seek out resources that could aid my growth. I wanted to efficiently master these tools and demonstrate my ability to learn, adapt, and implement. This quest for knowledge led me to explore a trio of online resources recommended by a mentor outside the company. Each of these platforms continues to play a pivotal role in my coding journey.&lt;/p&gt;




&lt;h3&gt;
  
  
  &lt;a href="https://leetcode.com/"&gt;LeetCode&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;LeetCode became my initial port of call. Its array of coding challenges and problems provided me with the perfect arena to hone my skills. As I delved into its challenges, I found myself facing complex problems that demanded strategic thinking. Over time, I noticed a significant shift in how I approached these tasks. I transitioned from stumbling through problems to adopting a more systematic and organized approach. This transition was thanks in part to an enlightening article on algorithms and problem-solving techniques, which revolutionized my perspective.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--MdSViUu1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/s8ye6khdrz0fu5u1d8hc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--MdSViUu1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/s8ye6khdrz0fu5u1d8hc.png" alt="code for is palindrome" width="800" height="487"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--U64455g7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/d7adnvt98tv6qosn2q2e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--U64455g7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/d7adnvt98tv6qosn2q2e.png" alt="results for is palindrome" width="800" height="188"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h3&gt;
  
  
  &lt;a href="https://www.hackerrank.com/"&gt;HackerRank&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;HackerRank, the second platform in my arsenal, complemented my LeetCode experience. What stood out to me were its diverse problem sets and the ability to solve challenges using multiple programming languages. This flexibility not only broadened my repertoire but also exposed me to different problem-solving methodologies. Each challenge I tackled felt like a mini-victory, reaffirming my progress.&lt;/p&gt;




&lt;h3&gt;
  
  
  &lt;a href="https://www.codewars.com/"&gt;Codewars&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;Codewars rounded out my trio of learning resources. Its unique approach of gamifying coding challenges injected an element of fun into my learning journey. The challenges on Codewars ranged in difficulty, allowing me to progressively build my skills. I found that working my way from easier problems to more complex ones was a productive strategy. Through this approach, I honed my problem-solving skills, breaking down intricate problems into manageable steps.&lt;/p&gt;




&lt;h2&gt;
  
  
  Personal Progress and Achievements
&lt;/h2&gt;

&lt;p&gt;Before embracing these platforms, I often found myself overwhelmed by problems and attempted to tackle them all at once. However, as I continued to engage with LeetCode, HackerRank, and Codewars, my coding approach transformed. Concepts that initially seemed overwhelming became clearer as I gained insights from each problem completed.&lt;/p&gt;

&lt;p&gt;Although I can't pinpoint a single problem that brought about a significant leap in my skills, an article on algorithms and problem-solving techniques acted as a turning point. This newfound understanding reshaped how I approached complex coding problems. As I encountered increasingly complex challenges, I found that my anxiety decreased, and my confidence soared. My thought process became more organized, enabling me to code more efficiently while taking smaller, deliberate steps in the problem-solving process.&lt;/p&gt;




&lt;h2&gt;
  
  
  Lessons Learned
&lt;/h2&gt;

&lt;p&gt;These platforms weren't without their challenges. Some problems initially appeared insurmountable. However, I learned that starting with simpler problems and gradually working my way up was an effective strategy. Developing a robust problem-solving methodology provided me with a roadmap to tackle even the most intricate challenges.&lt;/p&gt;

&lt;p&gt;Consistency emerged as a key learning approach that worked best for me. Solving a single problem each day, typically after a full day's work, proved to be an effective way to integrate learning into my routine without overwhelming myself.&lt;/p&gt;

&lt;p&gt;Acknowledging that no one knows everything and that seeking assistance is perfectly acceptable was a transformative realization. Keeping helpful references saved and revisiting them when needed proved instrumental in overcoming these hurdles.&lt;/p&gt;




&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;In retrospect, the benefits of engaging with these coding websites have been threefold: enhanced problem-solving techniques, quicker access to appropriate resources, and reduced anxiety when approaching coding challenges. The combined effect of these advantages has fortified my coding skills and enabled me to face new problems with confidence.&lt;/p&gt;

&lt;p&gt;For those embarking on a coding improvement journey through online resources, my advice is simple: don't feel pressured to know everything from the outset. Dive into tutorials, address knowledge gaps along the way, and embrace the process of continuous learning.&lt;/p&gt;

&lt;p&gt;Looking ahead, my plan involves continual integration of these platforms into my daily routine, leveraging them as tools to refine my skills after a productive day's work. &lt;/p&gt;

&lt;p&gt;As I progress, I have my sights set on deepening my expertise in JavaScript, Java, and potentially exploring C#. The journey continues, and I'm excited to embrace new languages and areas of coding.&lt;/p&gt;

&lt;p&gt;To stay updated with my future coding-related blog posts, feel free to follow my journey on &lt;a href="https://www.linkedin.com/in/jonathanfetterolf/"&gt;LinkedIn&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Let me know which language or tool you think I should delve into next!&lt;/p&gt;




&lt;h2&gt;
  
  
  Want to Follow Along?
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://github.com/fetterollie"&gt;GitHub&lt;/a&gt; | &lt;a href="https://www.linkedin.com/in/jonathanfetterolf/"&gt;LinkedIn&lt;/a&gt; | &lt;a href="https://twitter.com/fetterollie"&gt;Twitter&lt;/a&gt;&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>javascript</category>
      <category>coding</category>
      <category>programming</category>
    </item>
    <item>
      <title>Streamlining Water Well Maintenance in Tanzania: A Logistic Regression Approach</title>
      <dc:creator>Jonathan Fetterolf</dc:creator>
      <pubDate>Mon, 07 Aug 2023 00:36:37 +0000</pubDate>
      <link>https://dev.to/fetterollie/streamlining-water-well-maintenance-in-tanzania-a-logistic-regression-approach-eo9</link>
      <guid>https://dev.to/fetterollie/streamlining-water-well-maintenance-in-tanzania-a-logistic-regression-approach-eo9</guid>
      <description>&lt;p&gt;&lt;a href="https://github.com/fetterollie" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt; | &lt;a href="https://www.linkedin.com/in/jonathanfetterolf/" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt; | &lt;a href="https://twitter.com/fetterollie" rel="noopener noreferrer"&gt;Twitter&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fergmiy7t0pqfblg9w41x.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fergmiy7t0pqfblg9w41x.jpg" alt="Flag of Tanzania"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Access to clean and potable water is a fundamental necessity, yet many regions, including Tanzania, face challenges in providing this essential resource. The IHH Humanitarian Relief Foundation, an NGO dedicated to improving water access, strives to efficiently allocate their maintenance efforts by accurately predicting the functionality of water pumps. By building a classification model, they can optimize their operations, maximize maintenance resources, and ensure clean water is readily available to the people of Tanzania.&lt;/p&gt;




&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0343kyp3p900w5wdbgt0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0343kyp3p900w5wdbgt0.png" alt="Well Status By Year"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  The Cost of Errors
&lt;/h2&gt;

&lt;p&gt;With Tanzania's vast geographical area and limited resources, it is crucial to deploy maintenance and repair efforts judiciously. The country's infrastructure spans over 21,000 miles of roadways, making targeted repairs essential. Constructing a well in Tanzania can cost upwards of $10,000, considering factors such as labor, drilling depth, rock density, location, and fuel costs. Repairing wells, which can range from a few hundred to several thousand dollars, is an expense that should be allocated only to wells in genuine need.&lt;/p&gt;




&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxx4csv21fndu7szp18zx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxx4csv21fndu7szp18zx.png" alt="Distribution of wells in Tanzania"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Baseline Model and Simple Model Performance
&lt;/h2&gt;

&lt;p&gt;The baseline model, which predicts all wells as functional, achieved an accuracy of approximately 54%. However, this approach proved insufficient for accurate predictions. Consequently, several simple models were explored, including Logistic Regression, Decision Tree Classifier, Random Forest Classifier, Gradient Boosting, and XGBoost models.&lt;/p&gt;

&lt;p&gt;Among these models, Logistic Regression emerged as the optimal choice. It achieved an accuracy of approximately 79.1%, outperforming the other models in terms of speed, interpretability, and resistance to overfitting. The Logistic Regression model was further refined using GridSearchCV to identify the best hyperparameters, including the mean imputation strategy for numerical values, a C value of 1.0, penalty 'l2', and the 'liblinear' solver.&lt;/p&gt;
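&lt;p&gt;As a rough illustration of that tuning step (on toy data here, not the actual well dataset), the scikit-learn pipeline-plus-GridSearchCV pattern looks something like this:&lt;/p&gt;

```python
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Toy stand-in for the numeric well features (with missing values)
# and a binary functional / non-functional target.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 4))
X[rng.random(X.shape) > 0.9] = np.nan  # sprinkle in missing values
y = (rng.random(200) > 0.5).astype(int)

# Imputation and scaling live inside the pipeline so GridSearchCV
# can tune them alongside the classifier without leaking data.
pipe = Pipeline([
    ("impute", SimpleImputer()),
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])

param_grid = {
    "impute__strategy": ["mean", "median"],
    "clf__C": [0.1, 1.0, 10.0],
    "clf__penalty": ["l2"],
    "clf__solver": ["liblinear"],
}

search = GridSearchCV(pipe, param_grid, cv=5, scoring="accuracy")
search.fit(X, y)
print(search.best_params_)
```

&lt;p&gt;The &lt;code&gt;step__parameter&lt;/code&gt; key naming is how GridSearchCV reaches hyperparameters inside pipeline steps, which is what lets the imputation strategy be tuned together with C, the penalty, and the solver.&lt;/p&gt;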




&lt;h2&gt;
  
  
  The Final Logistic Regression Model
&lt;/h2&gt;

&lt;p&gt;Considering the similar performance of multiple models, Logistic Regression was selected as the final model due to its efficiency in training, interpretability, and robustness against overfitting. The final Logistic Regression model achieved an accuracy of approximately 79.6%. This model provides IHH with a reliable tool for predicting the functionality of water pumps, aiding in prioritizing maintenance efforts and optimizing resource allocation.&lt;/p&gt;




&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbmab3rjv2qxhu4xqlqae.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbmab3rjv2qxhu4xqlqae.png" alt="ROC/AUC Model Comparison"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Further Exploration and Questions
&lt;/h2&gt;

&lt;p&gt;While the binary classification of functional and non-functional wells is valuable, there is potential benefit in identifying specific wells that require repairs while still being functional. Such insight could enable targeted preventive maintenance, avoiding costly repairs in the future.&lt;/p&gt;

&lt;p&gt;Given more time and resources, it would be worthwhile to create a model that predicts the original status groups ('functional,' 'non-functional,' and 'functional needs repair') instead of converting the target into a binary outcome. This expanded model could provide more detailed information and enhance decision-making processes.&lt;/p&gt;

&lt;p&gt;Understanding the limiting factors in delivering resources to wells that require repairs is vital. Identifying the challenges related to maintenance professionals, time constraints, financial resources, availability of parts, and knowledge gaps will help IHH devise effective strategies for addressing these obstacles.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;The implementation of a logistic regression model has empowered the IHH Humanitarian Relief Foundation to enhance their water well maintenance operations in Tanzania. By accurately predicting the functionality of water pumps, the NGO can efficiently allocate resources and prioritize repairs where they are most needed. As ongoing efforts continue to optimize resource utilization, explore more granular predictions, and address limiting factors, IHH moves closer to its goal of ensuring clean and potable water for all Tanzanians.&lt;/p&gt;




&lt;h2&gt;
  
  
  Want to Follow Along?
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://github.com/fetterollie" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt; | &lt;a href="https://www.linkedin.com/in/jonathanfetterolf/" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt; | &lt;a href="https://twitter.com/fetterollie" rel="noopener noreferrer"&gt;Twitter&lt;/a&gt;&lt;/p&gt;

</description>
      <category>datascience</category>
      <category>python</category>
      <category>linearregression</category>
      <category>data</category>
    </item>
    <item>
      <title>Deploying a PERN Full-Stack CRUD App with Vercel: Hosting and Integration Made Easy!</title>
      <dc:creator>Jonathan Fetterolf</dc:creator>
      <pubDate>Sat, 29 Jul 2023 17:58:47 +0000</pubDate>
      <link>https://dev.to/fetterollie/deploying-a-pern-full-stack-crud-app-with-vercel-hosting-and-integration-made-easy-44fk</link>
      <guid>https://dev.to/fetterollie/deploying-a-pern-full-stack-crud-app-with-vercel-hosting-and-integration-made-easy-44fk</guid>
      <description>&lt;p&gt;&lt;a href="https://github.com/fetterollie"&gt;GitHub&lt;/a&gt; | &lt;a href="https://www.linkedin.com/in/jonathanfetterolf/"&gt;LinkedIn&lt;/a&gt; | &lt;a href="https://twitter.com/fetterollie"&gt;Twitter&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;Over the past few weeks, I’ve been working to create a PERN full-stack CRUD application. I wanted to share it with others and make it available for friends, family, and colleagues to check out. So, I started looking into options to host my application.&lt;/p&gt;




&lt;p&gt;I chose to deploy my application on Vercel with the hobby plan for a few reasons. First and foremost, Vercel is incredibly easy to use. Its user-friendly interface and streamlined deployment process made it a great fit for me, as I didn't want to deal with complex hosting setups. Additionally, the hobby plan offered a free tier, which was perfect for hosting my personal project. Lastly, Vercel's seamless integration with Git repositories, especially GitHub, allowed me to set up automatic deployments whenever I pushed changes to my repository, saving me time and effort.&lt;/p&gt;




&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--CZZBwMPL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/luifunmy278hupp19k5r.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--CZZBwMPL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/luifunmy278hupp19k5r.jpg" alt="Vercel Logo" width="800" height="418"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;Deploying my project on Vercel was a breeze since my code was already on GitHub. All I had to do was import my Git repository to Vercel. I went to the Vercel dashboard, clicked on the "New Project" button, and selected the "GitHub" option. After authorizing Vercel to access my GitHub account, I chose the repository I wanted to deploy. With a few configurations, like selecting the branch to deploy and setting up build options, my application was automatically deployed whenever I made new changes to the selected branch.&lt;/p&gt;




&lt;p&gt;Besides the GitHub integration, Vercel offers other deployment methods too. These include the Vercel CLI, which lets you deploy directly from your local development environment; deploy hooks, which are unique URLs that trigger deployments when accessed; and a RESTful API, which lets you create deployments by making an HTTP POST request.&lt;/p&gt;




&lt;p&gt;Vercel provided me with an excellent hosting solution for my PERN full-stack CRUD application. Its ease of use, free hobby plan, and seamless Git integration made the deployment process smooth and hassle-free. Whether deploying via GitHub, using the CLI, deploy hooks, or the REST API, Vercel offered flexibility and efficiency in hosting my web application.&lt;/p&gt;




&lt;p&gt;In my upcoming post, I’ll walk you through the process of hosting my PostgreSQL database and integrating it with this application on Vercel. This follow-up post will be the last step of deploying a full-stack application with a backend database.&lt;/p&gt;




&lt;h2&gt;
  
  
  Want to Follow Along?
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://github.com/fetterollie"&gt;GitHub&lt;/a&gt; | &lt;a href="https://www.linkedin.com/in/jonathanfetterolf/"&gt;LinkedIn&lt;/a&gt; | &lt;a href="https://twitter.com/fetterollie"&gt;Twitter&lt;/a&gt;&lt;/p&gt;

</description>
      <category>vercel</category>
      <category>react</category>
      <category>webdev</category>
      <category>javascript</category>
    </item>
    <item>
      <title>Unlocking Digital Accessibility: Navigating the World of 508 Compliance</title>
      <dc:creator>Jonathan Fetterolf</dc:creator>
      <pubDate>Sat, 22 Jul 2023 15:18:25 +0000</pubDate>
      <link>https://dev.to/fetterollie/unlocking-digital-accessibility-navigating-the-world-of-508-compliance-4e6c</link>
      <guid>https://dev.to/fetterollie/unlocking-digital-accessibility-navigating-the-world-of-508-compliance-4e6c</guid>
      <description>&lt;p&gt;&lt;a href="https://github.com/fetterollie"&gt;GitHub&lt;/a&gt; | &lt;a href="https://www.linkedin.com/in/jonathanfetterolf/"&gt;LinkedIn&lt;/a&gt; | &lt;a href="https://twitter.com/fetterollie"&gt;Twitter&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  There are...
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;285 million people with some sort of visual impairment...&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;275 million people with moderate-to-profound hearing impairment...&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Many more with physical, speech, cognitive, and neurological disabilities...&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Let's ensure ALL of our users have access to this service regardless of their capabilities.&lt;/strong&gt;&lt;/p&gt;




&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;"The moral test of government is how that government treats those who are in the dawn of life, the children; those who are in the twilight of life, the elderly; and those who are in shadows of life, the sick, the needy, and the handicapped."&lt;/em&gt; &lt;strong&gt;- Hubert H. Humphrey&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;




&lt;h2&gt;
  
  
  So what is 508 compliance anyway…?
&lt;/h2&gt;

&lt;p&gt;508 compliance refers to the standards and guidelines set forth by Section 508 of the Rehabilitation Act of 1973 in the United States. The goal of this standard is to eliminate barriers in technology that could prevent individuals with disabilities from accessing, using, and interacting with digital information and services. It requires federal agencies to ensure that their electronic and information technology (EIT) is accessible to individuals with disabilities. &lt;br&gt;
Specifically, it mandates that all federal agencies must make their EIT accessible to people with disabilities, including employees and members of the public. It also serves as a best practice for non-governmental organizations and businesses to ensure inclusivity and equal access to digital information and services for all individuals, regardless of their disabilities.&lt;/p&gt;


&lt;h2&gt;
  
  
  508 practices:
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;From the start:&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;It is essential to incorporate accessibility considerations from the early stages of the application design and development process, as certain requirements can prove challenging to retrofit onto the complex interactions present in modern user interfaces.&lt;/p&gt;

&lt;ol start="2"&gt;
&lt;li&gt;Providing alternative text for images.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The alt text is read by screen readers and other assistive technologies used by individuals with visual impairments or those who have difficulty viewing images. For example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;img src="fetterollie-best-blog-ever" alt="Best blog that exists"&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol start="3"&gt;
&lt;li&gt;Using captions and transcripts for multimedia.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Captions and transcripts make multimedia content, such as videos and audio, accessible to individuals with hearing impairments.&lt;/p&gt;

&lt;ol start="4"&gt;
&lt;li&gt;Ensuring keyboard accessibility for all functionalities.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Keyboard accessibility makes digital content and applications usable for individuals with various disabilities, including motor impairments and some types of visual impairments.&lt;/p&gt;

&lt;ol start="5"&gt;
&lt;li&gt;Using sufficient color contrast for text and images.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Sufficient contrast makes applications and content usable for individuals with certain visual impairments such as color-blindness.&lt;/p&gt;
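&lt;p&gt;The revised Section 508 standards incorporate WCAG, which defines contrast as a ratio of relative luminances. A minimal Python sketch of that formula:&lt;/p&gt;

```python
def relative_luminance(rgb):
    # WCAG 2.x relative luminance of an sRGB color (0-255 channels).
    def channel(c):
        c = c / 255.0
        # Undo the sRGB gamma encoding to get a linear value.
        if c > 0.03928:
            return ((c + 0.055) / 1.055) ** 2.4
        return c / 12.92
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    # Ratio of the lighter luminance to the darker one,
    # each offset by 0.05 to account for ambient light.
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black on white: the maximum possible contrast, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

&lt;p&gt;WCAG level AA requires a ratio of at least 4.5:1 for normal-size text.&lt;/p&gt;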

&lt;ol start="6"&gt;
&lt;li&gt;Structuring content with appropriate headings and labels.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The use of appropriate headings and labels promotes accessibility, improves navigation, enhances user comprehension, and ensures compliance with web standards, leading to a better overall user experience on the web.&lt;/p&gt;

&lt;ol start="7"&gt;
&lt;li&gt;&lt;p&gt;Avoiding content that could cause seizures or other physical reactions.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Providing resizable text and adaptable layouts.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Adaptable layouts create an accessible and user-friendly digital environment that caters to diverse audiences and devices. &lt;/p&gt;

&lt;ol start="9"&gt;
&lt;li&gt;Making sure forms and interactive elements are accessible.&lt;/li&gt;
&lt;/ol&gt;




&lt;h2&gt;
  
  
  Challenges with 508:
&lt;/h2&gt;

&lt;p&gt;Designing applications for 508 accessibility poses implementation challenges: complex interactions, third-party components, dynamic content, multimedia accessibility, assistive-technology compatibility, responsive design, the need for accessibility expertise, balancing design with accessibility, regular updates, and compliance education. Integrating accessibility early, testing with users, and staying informed about the guidelines can help overcome these challenges.&lt;/p&gt;




&lt;h2&gt;
  
  
  In conclusion:
&lt;/h2&gt;

&lt;p&gt;508 compliance is crucial because it ensures that electronic and information technology is accessible to individuals with disabilities, promoting inclusivity, equal access to digital content and services, better user experiences, and legal compliance. By adhering to these standards, organizations and developers can create technology that caters to a broader audience and supports the rights and needs of individuals with disabilities, fostering a more inclusive and equitable digital environment.&lt;/p&gt;




&lt;p&gt;Sources: &lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.access-board.gov/about/"&gt;U.S. Access Board&lt;/a&gt; - Advancing Full Access and Inclusion for All&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.section508.gov/"&gt;Section508.gov&lt;/a&gt; - Buy. Build. Be Accessible.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.hhs.gov/"&gt;U.S. Department of Health and Human Services&lt;/a&gt; - Enhancing the health and well-being of all Americans&lt;/p&gt;




&lt;h2&gt;
  
  
  Want to Follow Along?
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://github.com/fetterollie"&gt;GitHub&lt;/a&gt; | &lt;a href="https://www.linkedin.com/in/jonathanfetterolf/"&gt;LinkedIn&lt;/a&gt; | &lt;a href="https://twitter.com/fetterollie"&gt;Twitter&lt;/a&gt;&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Resources for Learning React and the PERN Stack: From Setup to Authentication</title>
      <dc:creator>Jonathan Fetterolf</dc:creator>
      <pubDate>Sat, 15 Jul 2023 21:30:18 +0000</pubDate>
      <link>https://dev.to/fetterollie/resources-for-learning-react-and-the-pern-stack-from-setup-to-authentication-92c</link>
      <guid>https://dev.to/fetterollie/resources-for-learning-react-and-the-pern-stack-from-setup-to-authentication-92c</guid>
      <description>&lt;p&gt;This post highlights a variety of resources that can help you learn React and the PERN stack (&lt;a href="https://www.postgresql.org/" rel="noopener noreferrer"&gt;PostgreSQL&lt;/a&gt;, &lt;a href="https://expressjs.com/" rel="noopener noreferrer"&gt;Express&lt;/a&gt;, &lt;a href="https://react.dev/" rel="noopener noreferrer"&gt;React&lt;/a&gt;, and &lt;a href="https://nodejs.org/" rel="noopener noreferrer"&gt;Node.js&lt;/a&gt;) for building fully functional applications. From tools like Create React App for streamlined setup to comprehensive tutorials and crash courses on React, Material UI, and PostgreSQL, these resources cover essential concepts and techniques. Additionally, there are tutorials on working with APIs and implementing user authentication and login functionality in the PERN stack. By utilizing these resources, you can gain a solid foundation in building React applications and understand how to integrate various technologies to create robust full-stack solutions.&lt;/p&gt;




&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0ayebi0zx4f0dx3yturi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0ayebi0zx4f0dx3yturi.png" alt="home page app screenshot"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Create React App
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://create-react-app.dev/" rel="noopener noreferrer"&gt;Create React App&lt;/a&gt; is a popular tool that helps you set up a new React project with a pre-configured development environment. It provides a streamlined setup process and allows you to start building React applications quickly. Learning Create React App is important because it simplifies the initial setup of a React project and you can focus more on learning and building React applications, rather than dealing with complex configuration and tooling choices.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.youtube.com/playlist?list=PL4cUxeGkcC9gZD-Tvwfod2gaISzfRiP9d" rel="noopener noreferrer"&gt;Full Modern React Tutorial - YouTube&lt;/a&gt;:
&lt;/h3&gt;

&lt;p&gt;This YouTube tutorial offers a comprehensive guide to React, covering essential concepts, components, state management, and more. It's a valuable resource for beginners looking to learn React from scratch.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.youtube.com/watch?v=w7ejDZ8SWv8" rel="noopener noreferrer"&gt;React JS Crash Course - YouTube&lt;/a&gt;:
&lt;/h3&gt;

&lt;p&gt;This crash course on React provides a condensed introduction to React's core concepts and features. It's designed to give you a quick overview and get you up to speed with building React applications.&lt;/p&gt;




&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdu0a6vw98z96ysd1mv5m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdu0a6vw98z96ysd1mv5m.png" alt="demo app screenshot car display"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  MUI (Material UI):
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://mui.com/" rel="noopener noreferrer"&gt;Material UI&lt;/a&gt; is a popular React component library that follows Google's Material Design guidelines. The resources you mentioned provide tutorials and crash courses on using Material UI with React, teaching you how to create visually appealing and responsive user interfaces.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.youtube.com/playlist?list=PL4cUxeGkcC9gjxLvV4VEkZ6H6H4yWuS58" rel="noopener noreferrer"&gt;Material UI Tutorial - The Net Ninja&lt;/a&gt;:
&lt;/h3&gt;

&lt;p&gt;This playlist provides a comprehensive introduction to Material UI by guiding you through building out a complete project's interface with its components from start to finish.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.youtube.com/watch?v=_W3uuxDnySQ&amp;amp;list=PLQg6GaokU5CyVrmVsYa9R3g1z2Tsmfpm-" rel="noopener noreferrer"&gt;Material UI Crash Course&lt;/a&gt;:
&lt;/h3&gt;

&lt;p&gt;This video provides a condensed overview of Material UI, showing how to install the library and use its core components to style a React application.&lt;/p&gt;




&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp2rpldf22ptdqghlejoh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp2rpldf22ptdqghlejoh.png" alt="demo app screenshot edit car details"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  APIs:
&lt;/h2&gt;

&lt;p&gt;These resources cover how to work with APIs in React applications. They introduce you to using popular libraries like &lt;a href="https://axios-http.com/" rel="noopener noreferrer"&gt;Axios&lt;/a&gt; for making API requests and guide you through specific examples like fetching weather data from the OpenWeatherMap API.&lt;/p&gt;
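
&lt;p&gt;The kind of request these tutorials cover can be sketched in a few lines. This is a minimal sketch, not code from any of the linked videos; the city, API key, and units below are placeholder values, and the real 5-day forecast endpoint is documented at OpenWeatherMap:&lt;/p&gt;

```javascript
// Build an OpenWeatherMap forecast URL. URLSearchParams handles the
// query-string encoding and separators for us.
function buildForecastUrl(city, apiKey) {
  const params = new URLSearchParams({ q: city, appid: apiKey, units: "metric" });
  return "https://api.openweathermap.org/data/2.5/forecast?" + params.toString();
}

// A minimal request using the built-in fetch API (Node 18+ or any
// modern browser). With Axios, the call would be axios.get(url) instead.
async function getForecast(city, apiKey) {
  const response = await fetch(buildForecastUrl(city, apiKey));
  if (!response.ok) {
    throw new Error("Request failed: " + response.status);
  }
  return response.json();
}
```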

&lt;h3&gt;
  
  
  &lt;a href="https://www.youtube.com/watch?v=MdIfZJ08g2I" rel="noopener noreferrer"&gt;How to Use Weather API for Beginners&lt;/a&gt;
&lt;/h3&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.youtube.com/watch?v=ZEKBDXGnD4s" rel="noopener noreferrer"&gt;React Axios API Requests&lt;/a&gt;
&lt;/h3&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.npmjs.com/package/axios" rel="noopener noreferrer"&gt;Axios - NPM&lt;/a&gt;
&lt;/h3&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://openweathermap.org/forecast5" rel="noopener noreferrer"&gt;OpenWeatherMap&lt;/a&gt;
&lt;/h3&gt;




&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjkn2cwfwocz6n6sz2nq1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fjkn2cwfwocz6n6sz2nq1.png" alt="demo app screenshot vehicle input"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  PostgreSQL:
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://www.postgresql.org/" rel="noopener noreferrer"&gt;PostgreSQL&lt;/a&gt; is a powerful open-source relational database management system. The resources you mentioned include a comprehensive tutorial that covers PostgreSQL from beginner to intermediate level, and a PERN stack course that introduces you to using PostgreSQL along with Express, React, and Node.js.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.youtube.com/watch?v=qw--VYLpxG4" rel="noopener noreferrer"&gt;Comprehensive PostgreSQL&lt;/a&gt;
&lt;/h3&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.youtube.com/watch?v=ldYcgPKEZC8" rel="noopener noreferrer"&gt;PERN Stack Course&lt;/a&gt;
&lt;/h3&gt;




&lt;h2&gt;
  
  
  Login:
&lt;/h2&gt;

&lt;p&gt;These resources focus on implementing user authentication and login functionality in your React application using the PERN stack. They cover topics like JWT (JSON Web Tokens), user registration, and backend authentication implementation.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.digitalocean.com/community/tutorials/how-to-add-login-authentication-to-react-applications" rel="noopener noreferrer"&gt;Add Login Authentication to React Applications&lt;/a&gt;
&lt;/h3&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.youtube.com/watch?v=4TgbH3TM51s" rel="noopener noreferrer"&gt;Frontend PERN authentication&lt;/a&gt;
&lt;/h3&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.youtube.com/watch?v=ykB4KeuxKVI" rel="noopener noreferrer"&gt;Backend PERN authentication&lt;/a&gt;
&lt;/h3&gt;

</description>
      <category>pern</category>
      <category>react</category>
      <category>postgres</category>
      <category>node</category>
    </item>
    <item>
      <title>API's, MUI, PERN, &amp; more...</title>
      <dc:creator>Jonathan Fetterolf</dc:creator>
      <pubDate>Mon, 10 Jul 2023 00:33:18 +0000</pubDate>
      <link>https://dev.to/fetterollie/apis-mui-pern-more-2i19</link>
      <guid>https://dev.to/fetterollie/apis-mui-pern-more-2i19</guid>
      <description>&lt;p&gt;&lt;a href="https://github.com/fetterollie"&gt;GitHub&lt;/a&gt; | &lt;a href="https://www.linkedin.com/in/jonathanfetterolf/"&gt;LinkedIn&lt;/a&gt; | &lt;a href="https://twitter.com/fetterollie"&gt;Twitter&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;This week I worked on several tasks and gained experience with various technologies. Here's a breakdown of what I worked on:&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;a href="https://www.postman.com/"&gt;Postman Application&lt;/a&gt;:
&lt;/h2&gt;

&lt;p&gt;I worked extensively with the Postman application, which is a powerful platform for building and using APIs. It simplifies the entire API lifecycle and promotes collaboration among team members, allowing us to create better APIs faster.&lt;/p&gt;

&lt;h2&gt;
  
  
  API Integration with &lt;a href="https://axios-http.com/docs/intro"&gt;Axios&lt;/a&gt;:
&lt;/h2&gt;

&lt;p&gt;As part of my internship, I utilized Axios, a promise-based HTTP client, for integrating APIs. Axios is commonly used in both Node.js and browser environments. It provided me with a convenient and efficient way to make HTTP requests and handle responses. I used Axios to integrate external APIs and update the weather information within our application based on the user's location. It was exciting to see real-time weather updates within the application.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;a href="https://mui.com/"&gt;Material UI&lt;/a&gt;:
&lt;/h2&gt;

&lt;p&gt;I worked with Material UI, an open-source React component library. Material UI implements Google's Material Design principles and offers a wide range of prebuilt components that are production-ready out of the box. I utilized Material UI components such as Box, Button, Container, TextField, and Typography to enhance the user interface of the application.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;a href="https://mui.com/material-ui/customization/theming/"&gt;Custom Theming&lt;/a&gt; in Material UI:
&lt;/h2&gt;

&lt;p&gt;In addition to using Material UI components, I implemented custom theming to improve the look of the application. It was fascinating to work on defining a custom palette for both light mode and dark mode, ensuring a consistent and visually appealing experience for our users.&lt;/p&gt;
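
&lt;p&gt;A light/dark palette along these lines can be sketched as plain objects. The color values here are illustrative placeholders, not our actual theme; in the app, each object would be passed to MUI's createTheme() and supplied through a ThemeProvider:&lt;/p&gt;

```javascript
// Palette definitions for light and dark mode (placeholder colors).
const lightPalette = {
  mode: "light",
  primary: { main: "#1976d2" },
  background: { default: "#fafafa" },
};

const darkPalette = {
  mode: "dark",
  primary: { main: "#90caf9" },
  background: { default: "#121212" },
};

// Pick a palette from the user's preference, e.g. getPalette(prefersDark)
// inside a component before calling createTheme(getPalette(prefersDark)).
function getPalette(prefersDark) {
  return prefersDark ? darkPalette : lightPalette;
}
```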

&lt;h2&gt;
  
  
  Fourth of July Celebration:
&lt;/h2&gt;

&lt;p&gt;As a fun break from work, I joined my friends in Alexandria, VA, to celebrate the Fourth of July. We enjoyed the festivities associated with this national holiday, which allowed me to relax and recharge.&lt;/p&gt;

&lt;h2&gt;
  
  
  PERN Stack &amp;amp; Interoperability:
&lt;/h2&gt;

&lt;p&gt;In terms of the stack we used, I focused on achieving interoperability between our application and the database using the PERN stack. PERN stands for &lt;a href="https://www.postgresql.org/"&gt;PostgreSQL&lt;/a&gt;, &lt;a href="https://expressjs.com/en/api.html"&gt;Express&lt;/a&gt;, &lt;a href="https://react.dev/"&gt;React&lt;/a&gt;, and &lt;a href="https://nodejs.org/"&gt;Node.js&lt;/a&gt;. I worked with PostgreSQL, a powerful and open-source object-relational database known for its performance. Express, a minimal and flexible Node.js web application framework, provided a robust set of features for building the back end. With React, I created user interfaces by combining reusable components, and Node.js served as the runtime environment for executing JavaScript code.&lt;/p&gt;
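
&lt;p&gt;The interoperability described above can be sketched as a small data-access function. This is a sketch only; the table and column names are illustrative, not the actual schema, and the db argument stands in for a node-postgres Pool so the handler can be exercised without a real database:&lt;/p&gt;

```javascript
// Express-style data path: read rows from PostgreSQL. The db argument is
// any object with an async query() method; in the real app it would be a
// pg Pool, here it is injected so the function is testable in isolation.
async function listVehicles(db) {
  const result = await db.query("SELECT id, model, year FROM vehicles ORDER BY id");
  return result.rows;
}

// Wiring it into Express would look roughly like:
// app.get("/vehicles", async (req, res) => {
//   res.json(await listVehicles(pool));
// });
```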




&lt;p&gt;Throughout this week, I gained valuable experience in API integration, front-end development with React and Material UI, custom theming, and back-end development with the PERN stack. These skills will undoubtedly contribute to my growth.&lt;/p&gt;

&lt;h2&gt;
  
  
  Challenges
&lt;/h2&gt;

&lt;p&gt;One of the biggest challenges I faced this week was working with MUI theming. While implementing the custom theme, I encountered issues where certain components I used in the application would break the theming. This required me to scrap those components and start again to ensure a consistent look and feel across the application.&lt;/p&gt;

&lt;p&gt;Upon closer examination, I realized that the problem stemmed from my approach to importing MUI components. Initially, I was relying on the automatically generated imports suggested by Visual Studio Code. However, I later discovered that I should have been grabbing and using the imports directly from the MUI documentation.&lt;/p&gt;

&lt;p&gt;This realization emphasized the importance of closely following the official documentation when working with third-party libraries like Material UI. The documentation provides precise instructions and guidelines for using each component correctly, including the proper way to import them.&lt;/p&gt;

&lt;p&gt;By adhering to the documentation, I ensured that I was using the appropriate imports, which played a crucial role in maintaining the integrity of the MUI theming and avoiding issues with component compatibility. This experience served as a valuable lesson in the significance of meticulous attention to detail and the benefits of relying on official documentation as a reliable resource.&lt;/p&gt;

&lt;p&gt;Moving forward, I learned to be more cautious with automatically generated imports and to double-check the documentation for accurate import statements.&lt;/p&gt;




&lt;h2&gt;
  
  
  Want to Follow Along?
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://github.com/fetterollie"&gt;GitHub&lt;/a&gt; | &lt;a href="https://www.linkedin.com/in/jonathanfetterolf/"&gt;LinkedIn&lt;/a&gt; | &lt;a href="https://twitter.com/fetterollie"&gt;Twitter&lt;/a&gt;&lt;/p&gt;

</description>
      <category>api</category>
      <category>postgres</category>
      <category>react</category>
      <category>node</category>
    </item>
    <item>
      <title>My Journey into Software Development: First Week of Internship</title>
      <dc:creator>Jonathan Fetterolf</dc:creator>
      <pubDate>Fri, 30 Jun 2023 22:44:54 +0000</pubDate>
      <link>https://dev.to/fetterollie/my-journey-into-software-development-first-week-of-internship-36ki</link>
      <guid>https://dev.to/fetterollie/my-journey-into-software-development-first-week-of-internship-36ki</guid>
      <description>&lt;p&gt;&lt;a href="https://github.com/fetterollie"&gt;GitHub&lt;/a&gt; | &lt;a href="https://www.linkedin.com/in/jonathanfetterolf/"&gt;LinkedIn&lt;/a&gt; | &lt;a href="https://twitter.com/fetterollie"&gt;Twitter&lt;/a&gt;&lt;/p&gt;




&lt;h3&gt;
  
  
  New internship, who dis...?
&lt;/h3&gt;




&lt;h2&gt;
  
  
  Introduction:
&lt;/h2&gt;

&lt;p&gt;Starting my new internship in software development/data management has been an exciting and challenging experience. In this blog post, I will share the details of my first week, including the onboarding process, setting up the development environment, and creating a basic application using Create React App. Let's dive into my journey!&lt;/p&gt;

&lt;h3&gt;
  
  
  Onboarding:
&lt;/h3&gt;

&lt;p&gt;During my first week, the onboarding process played a crucial role in helping me become familiar with the company's culture, policies, and expectations. Here are the key aspects of my onboarding experience:&lt;/p&gt;

&lt;h3&gt;
  
  
  General onboarding and paperwork:
&lt;/h3&gt;

&lt;p&gt;I went through various forms and documents to officially join the company. This involved signing employment agreements, filling out tax forms, and providing necessary identification documents.&lt;/p&gt;

&lt;h3&gt;
  
  
  Computer setup:
&lt;/h3&gt;

&lt;p&gt;Setting up my work computer was an essential step. I was guided through the process of connecting to the company's network, installing necessary software, and configuring security settings.&lt;/p&gt;

&lt;h4&gt;
  
  
  Logging into accounts:
&lt;/h4&gt;

&lt;p&gt;I received credentials for different accounts, such as email, project management tools, and version control systems. I made sure to secure my passwords and enable two-factor authentication whenever possible.&lt;/p&gt;

&lt;h4&gt;
  
  
  Setting up applications:
&lt;/h4&gt;

&lt;p&gt;To streamline my development workflow, I installed and configured applications specific to the company's tech stack. This included code editors, integrated development environments (IDEs), and communication tools.&lt;/p&gt;

&lt;h4&gt;
  
  
  Email Signature:
&lt;/h4&gt;

&lt;p&gt;I created a professional email signature that includes my name, position, and contact information. I followed the guidelines provided by the company to maintain a consistent and professional image in my communications.&lt;/p&gt;




&lt;h2&gt;
  
  
  Trainings:
&lt;/h2&gt;

&lt;p&gt;As an intern, I underwent several training sessions to familiarize myself with important aspects of the company's operations and policies. Here are the key areas I received training in:&lt;/p&gt;

&lt;h3&gt;
  
  
  Ethics:
&lt;/h3&gt;

&lt;p&gt;Understanding and adhering to ethical standards is crucial in any professional setting. I received training on the ethical considerations related to my work, including intellectual property, data privacy, and confidentiality.&lt;/p&gt;

&lt;h3&gt;
  
  
  Harassment:
&lt;/h3&gt;

&lt;p&gt;Promoting a safe and inclusive workplace environment is essential. I participated in harassment training to learn how to recognize and prevent any form of harassment, fostering a respectful work environment for all employees.&lt;/p&gt;

&lt;h3&gt;
  
  
  HIPAA (Health Insurance Portability and Accountability Act):
&lt;/h3&gt;

&lt;p&gt;Since I am interning at an organization that handles healthcare data, I received training on HIPAA regulations. These regulations aim to protect the privacy and security of patient health information.&lt;/p&gt;

&lt;h3&gt;
  
  
  Code of Conduct:
&lt;/h3&gt;

&lt;p&gt;Every organization has a code of conduct that outlines acceptable behavior and expectations from employees. I familiarized myself with the company's code of conduct to ensure I aligned my actions with its principles.&lt;/p&gt;




&lt;h2&gt;
  
  
  Hybrid Work
&lt;/h2&gt;

&lt;p&gt;Understanding my work schedule and attendance requirements was crucial for a successful internship. In this case, the internship follows a hybrid model with mostly remote work but requires me to be in the office one day per week. I made sure to plan my tasks and assignments accordingly.&lt;/p&gt;




&lt;h2&gt;
  
  
  Setting up the Integrated Development Environment (IDE):
&lt;/h2&gt;

&lt;p&gt;As a software development intern, setting up my development environment was essential for efficient coding. Here's a breakdown of the tools and frameworks I worked with:&lt;/p&gt;

&lt;h3&gt;
  
  
  Visual Studio Code:
&lt;/h3&gt;

&lt;p&gt;I installed Visual Studio Code (VS Code), a popular code editor that provides a rich set of features for editing and debugging code. I configured VS Code according to the company's preferences.&lt;/p&gt;

&lt;h3&gt;
  
  
  Node.js:
&lt;/h3&gt;

&lt;p&gt;I installed Node.js, a JavaScript runtime that allows me to run JavaScript code on the server-side. Node.js provides a powerful ecosystem and libraries that I can leverage in my projects.&lt;/p&gt;

&lt;h3&gt;
  
  
  Node Version Manager (NVM):
&lt;/h3&gt;

&lt;p&gt;I utilized Node Version Manager (NVM), a handy tool that helps manage multiple versions of Node.js on my system. It allowed me to switch between different Node.js versions with ease.&lt;/p&gt;

&lt;h3&gt;
  
  
  Yarn:
&lt;/h3&gt;

&lt;p&gt;Yarn helps manage dependencies in projects by providing a reliable and efficient way to install, update, and manage the packages your project depends on.&lt;/p&gt;

&lt;h3&gt;
  
  
  Git:
&lt;/h3&gt;

&lt;p&gt;Git offers numerous benefits and features that make it an essential tool for developers including version control, easy collaboration, creating branches, and merging branches. &lt;/p&gt;




&lt;h2&gt;
  
  
  Learn the Create React App
&lt;/h2&gt;

&lt;p&gt;I primarily used tutorials on YouTube to learn Create React App. These are the two tutorials I found most useful: &lt;br&gt;
&lt;a href="https://dev.tourl"&gt;Tutorial 1&lt;/a&gt;&lt;br&gt;
&lt;a href="https://dev.tourl"&gt;Tutorial 2&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Creating an Application
&lt;/h3&gt;

&lt;p&gt;After learning the basics, I was asked to create a simple application. These were the initial requirements for the application: &lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Add a few pages to navigate between&lt;/li&gt;
&lt;li&gt;Create a clicker widget that allows you to count both up and down&lt;/li&gt;
&lt;li&gt;Make a separate page that has the functionality to filter through an array&lt;/li&gt;
&lt;li&gt;Make a list of car objects (at least 10)  that have the following data fields:
Unique Id,
Model,
Year,
Color,
Image Source&lt;/li&gt;
&lt;li&gt;Create a functionality that will ingest data, and filter results&lt;/li&gt;
&lt;li&gt;Create text fields to allow for user input&lt;/li&gt;
&lt;li&gt;Output results&lt;/li&gt;
&lt;li&gt;Limit results so there are no duplicates&lt;/li&gt;
&lt;/ol&gt;
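
&lt;p&gt;Requirements 4, 5, and 8 above can be sketched in plain JavaScript. The sample data and field names are illustrative, not the actual application code:&lt;/p&gt;

```javascript
// A small list of car objects with the required data fields, including an
// intentional duplicate id to exercise the dedupe step.
const cars = [
  { id: 1, model: "Civic", year: 2020, color: "blue", imageSrc: "civic.png" },
  { id: 2, model: "Mustang", year: 1967, color: "red", imageSrc: "mustang.png" },
  { id: 2, model: "Mustang", year: 1967, color: "red", imageSrc: "mustang.png" },
  { id: 3, model: "Model 3", year: 2022, color: "white", imageSrc: "model3.png" },
];

// Requirement 8: keep only the first car seen for each unique id.
function dedupeById(list) {
  const seen = new Map();
  list.forEach((car) => {
    if (!seen.has(car.id)) seen.set(car.id, car);
  });
  return [...seen.values()];
}

// Requirement 5: case-insensitive filter on the model field, fed by a
// text input in the React app (requirement 6).
function filterByModel(list, text) {
  const needle = text.toLowerCase();
  return list.filter((car) => car.model.toLowerCase().includes(needle));
}
```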




&lt;h2&gt;
  
  
  Summary:
&lt;/h2&gt;

&lt;p&gt;My first week of interning has been an enlightening and invigorating experience. From seamless onboarding to immersive training and hands-on development, I have gained valuable insights and embarked on a remarkable journey. The supportive work environment, the introduction to cutting-edge development tools, and the captivating world of data management have set the stage for an exciting internship filled with growth, learning, and countless opportunities. I eagerly look forward to what lies ahead as I continue to expand my skills and contribute to the company's mission.&lt;/p&gt;




&lt;h2&gt;
  
  
  Want to Follow Along?
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://github.com/fetterollie"&gt;GitHub&lt;/a&gt; | &lt;a href="https://www.linkedin.com/in/jonathanfetterolf/"&gt;LinkedIn&lt;/a&gt; | &lt;a href="https://twitter.com/fetterollie"&gt;Twitter&lt;/a&gt;&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>reactjsdevelopment</category>
      <category>datascience</category>
      <category>employeeexperience</category>
    </item>
    <item>
      <title>Using Data Science to Fight Malaria: A Breakthrough in Blood Cell Classification</title>
      <dc:creator>Jonathan Fetterolf</dc:creator>
      <pubDate>Sun, 04 Jun 2023 13:22:02 +0000</pubDate>
      <link>https://dev.to/fetterollie/using-data-science-to-fight-malaria-a-breakthrough-in-blood-cell-classification-1gcf</link>
      <guid>https://dev.to/fetterollie/using-data-science-to-fight-malaria-a-breakthrough-in-blood-cell-classification-1gcf</guid>
      <description>&lt;p&gt;&lt;a href="https://github.com/fetterollie"&gt;GitHub&lt;/a&gt; | &lt;a href="https://www.linkedin.com/in/jonathanfetterolf/"&gt;LinkedIn&lt;/a&gt; | &lt;a href="https://twitter.com/fetterollie"&gt;Twitter&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Introduction:
&lt;/h2&gt;

&lt;p&gt;Data science has the potential to revolutionize the medical field. I demonstrated this by developing an application to swiftly and accurately identify the presence of Malaria in blood cells. This innovative approach enhances the capabilities of doctors and technicians, allowing them to allocate their valuable resources more effectively and ultimately save more lives. In this blog post, we will explore the development of this application, the challenges encountered, and the exciting future possibilities.&lt;/p&gt;

&lt;h2&gt;
  
  
  Building the Application:
&lt;/h2&gt;

&lt;p&gt;The project began with a dataset consisting of images of blood cells categorized as either Uninfected Blood Cells or Parasitized Blood Cells. To determine where the application could be most impactful, auxiliary data from the World Health Organization was incorporated to identify regions with high Malaria prevalence. Python, along with libraries such as pandas, matplotlib, seaborn, and geopandas, were utilized to analyze and visualize this additional data.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--IY2FAfO6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/i61ezvv50h0lx7a40sn6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--IY2FAfO6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/i61ezvv50h0lx7a40sn6.png" alt="Map of Malaria Deaths" width="800" height="844"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Using the power of Python, Tensorflow, and Keras, I constructed a neural network and trained it using over 15,000 blood cell images. Remarkably, the network achieved an impressive accuracy rate of 96% in classifying the blood cells. To leverage the advantages of GPU computing, Google Colab was employed to create a notebook that facilitated the training process. Subsequently, a live application was developed using streamlit.io, enabling users to submit their own blood cell images and receive predictions in real-time.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--0ovU1HqT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/aic4pa91pkls99d5b0ci.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--0ovU1HqT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/aic4pa91pkls99d5b0ci.png" alt="Example Image Augmentations" width="800" height="794"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The Impact:
&lt;/h2&gt;

&lt;p&gt;The neural network proved to be highly reliable, accurately predicting blood cell classifications in over 96% of cases. This breakthrough technology empowers doctors to make quicker and more informed decisions in treating Malaria cases, thereby halting further transmission and ultimately saving lives. By providing an accurate diagnosis and determining the parasitic burden, this application enhances the efficiency and effectiveness of medical interventions.&lt;/p&gt;

&lt;h2&gt;
  
  
  Challenges Encountered:
&lt;/h2&gt;

&lt;p&gt;While developing the application, several challenges were encountered, demonstrating the complexity of implementing advanced technologies in real-world scenarios. One such challenge was the restriction on Google Colab's compute units, necessitating limitations on the utilization of GPU processing for model training. Given additional resources, such as a higher budget, the model could be trained with more data, leading to even greater accuracy in cell classifications. Furthermore, certain features of Keras did not seamlessly integrate with Colab, specifically in terms of image pre-processing. To circumvent this issue and prevent overtraining the neural network, alternative methods were employed for image pre-processing, omitting the brightness and contrast adjustments.&lt;/p&gt;

&lt;h2&gt;
  
  
  Future Possibilities:
&lt;/h2&gt;

&lt;p&gt;Given more time and resources, I have plans for enhancements to the application. Expanding the dataset and retraining the model would improve its accuracy and robustness. Additionally, a new feature could be introduced, allowing users to submit images of entire blood smears, which would then be automatically split into individual cell images for input to the model. This advancement would enable the model to estimate the parasitic burden, a crucial factor used by clinicians to make informed decisions regarding the treatment of Malaria cases.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion:
&lt;/h2&gt;

&lt;p&gt;The development of an application that utilizes data science to classify Malaria in blood cells represents a significant breakthrough in the medical field. By leveraging advanced technologies such as neural networks and GPU computing, this project demonstrates the potential for data-driven solutions to positively impact healthcare. Despite the challenges faced, the impressive accuracy achieved and the future possibilities outlined highlight the importance of continued exploration and innovation in the field of data science for medical applications. With ongoing advancements, we can look forward to a future where data-driven approaches play a vital role in combating diseases and improving patient outcomes.&lt;/p&gt;

&lt;p&gt;If you want to check out my work, you can find the project details on my &lt;a href="https://github.com/fetterollie/"&gt;GitHub&lt;/a&gt; here: &lt;a href="https://github.com/fetterollie/Malaria-Data-Exploration"&gt;Malaria Blood Cell Classification&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Want to Follow Along?
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://github.com/fetterollie"&gt;GitHub&lt;/a&gt; | &lt;a href="https://www.linkedin.com/in/jonathanfetterolf/"&gt;LinkedIn&lt;/a&gt; | &lt;a href="https://twitter.com/fetterollie"&gt;Twitter&lt;/a&gt;&lt;/p&gt;

</description>
      <category>datascience</category>
      <category>cnn</category>
      <category>neuralnetwork</category>
      <category>python</category>
    </item>
    <item>
      <title>DO YOU YAML?</title>
      <dc:creator>Jonathan Fetterolf</dc:creator>
      <pubDate>Tue, 17 Jan 2023 01:23:17 +0000</pubDate>
      <link>https://dev.to/fetterollie/do-you-yaml-521p</link>
      <guid>https://dev.to/fetterollie/do-you-yaml-521p</guid>
      <description>&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--EN6AXtC5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/pzo62x2t806lxuo1tiaf.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--EN6AXtC5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/pzo62x2t806lxuo1tiaf.jpeg" alt="circuits" width="800" height="533"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Say what?
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--9s5VGIPG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4kvoe5860szmj82iir01.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--9s5VGIPG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4kvoe5860szmj82iir01.jpg" alt="Futurama Fry - Is that an alien language?" width="577" height="433"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Introduction of YAML
&lt;/h2&gt;

&lt;p&gt;YAML stands for "YAML Ain’t Markup Language" - this is known as a recursive acronym. YAML is often used for writing configuration files. It’s human readable, easy to understand, and can be used with other programming languages. Although YAML is commonly used in many disciplines, it has received criticism for the amount of whitespace .yml files require, the difficulty of editing them, and the complexity of the standard. Despite the criticism, properly using YAML ensures that you can reproduce the results of a project and that your virtual environment packages play nicely with system packages. (If you're looking for another way to share environments, alternatives to YAML include &lt;a href="https://github.com/crdoconnor/strictyaml"&gt;StrictYAML&lt;/a&gt;, a type-safe YAML parser, and &lt;a href="https://nestedtext.org/en/stable/"&gt;NestedText&lt;/a&gt;.)&lt;/p&gt;

&lt;p&gt;One of the first steps in joining an existing data science project is setting up your virtual environment. This ensures that the dependencies and packages used for the project do not interfere with each other or overwrite those previously installed. There will often be a file with the .yml extension in the project files so you can quickly get working on the existing project. Below, I’ll quickly run through the steps I take to create a virtual environment on my M2 MacBook with &lt;a href="https://anaconda.org/"&gt;Anaconda&lt;/a&gt; already installed.&lt;/p&gt;
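
&lt;p&gt;For reference, a typical Conda environment file looks something like this. This is a minimal sketch; the environment name and package list are illustrative, not from a specific project:&lt;/p&gt;

```yaml
# environment.yml - a minimal, illustrative Conda environment file.
name: my-project
channels:
  - conda-forge
dependencies:
  - python=3.10
  - pandas
  - matplotlib
  - pip
  - pip:
      - streamlit
```

&lt;p&gt;You would create the environment with &lt;code&gt;conda env create -f environment.yml&lt;/code&gt; and then activate it with &lt;code&gt;conda activate my-project&lt;/code&gt;.&lt;/p&gt;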

&lt;h2&gt;
  
  
  Steps
&lt;/h2&gt;

&lt;p&gt;So where is that YAML file on &lt;a href="https://github.com/"&gt;GitHub&lt;/a&gt; and what do I do with it?:&lt;/p&gt;

&lt;p&gt;First, projects will typically have one .yml file, but sometimes you’ll see special instructions in the project’s README:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--nj9oUXva--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/plxiniuq7evtsk5feae5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--nj9oUXva--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/plxiniuq7evtsk5feae5.png" alt="yml in readme" width="665" height="177"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here’s where the actual YAML file will appear (usually at the root level of the repository, but sometimes further down in the directory):&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--KCrj9uYl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/byx5a72og1901bsmffka.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--KCrj9uYl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/byx5a72og1901bsmffka.png" alt="yml file github" width="775" height="218"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This is what the file will look like on GitHub:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--r-aEZY1r--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/szmxm2rfgcjh2wzejxkv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--r-aEZY1r--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/szmxm2rfgcjh2wzejxkv.png" alt="yml file gihub" width="752" height="222"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To save the .yml file, simply click the Raw button here:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--JVsPwUV4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/246pui5fgxcmgr2wzl3h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--JVsPwUV4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/246pui5fgxcmgr2wzl3h.png" alt="raw button" width="752" height="222"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then, in the newly opened tab, right click and save as:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ZIx6QNyH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/llru9is2elo4sy9oivzv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ZIx6QNyH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/llru9is2elo4sy9oivzv.png" alt="save as yml" width="578" height="401"&gt;&lt;/a&gt;&lt;br&gt;
(Make sure to save this somewhere you can easily find it, as you’ll need to navigate to it!)&lt;/p&gt;

&lt;p&gt;Open up a new terminal session and navigate to the directory where you saved the .yml file.&lt;/p&gt;

&lt;p&gt;To create this new environment, I’ll enter:&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;conda env create -f geoenvironment.yml&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;After the virtual environment finishes installing, you’ll need to activate it before you can use it. To do so, you’ll need to know its name, which should be displayed alongside the suggested activation command. If not, check a list of your environments by entering:&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;conda info --envs&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PSYIhRer--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/enfbw2ny0szziehkm2ej.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PSYIhRer--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/enfbw2ny0szziehkm2ej.png" alt="conda info" width="798" height="150"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then activate the new environment by entering the following (replace 'project-env' with the name of your virtual environment):&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;conda activate project-env&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--wjpHft6d--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ehuxle631q3xabo1mw84.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--wjpHft6d--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ehuxle631q3xabo1mw84.png" alt="conda activate" width="606" height="202"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now you’re ready to start chugging on that existing project. &lt;/p&gt;

&lt;p&gt;When you’re done working in that virtual environment, don’t forget to deactivate and switch to the next environment you want!&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;conda deactivate&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;If you want to start a project from scratch, I prefer to start with a very basic virtual environment and add the packages I need as I go along. My basic framework usually consists of: &lt;br&gt;
&lt;a href="https://www.python.org/"&gt;Python&lt;/a&gt;&lt;br&gt;
&lt;a href="https://numpy.org/"&gt;NumPy&lt;/a&gt;&lt;br&gt;
&lt;a href="https://pandas.pydata.org/"&gt;Pandas&lt;/a&gt;&lt;br&gt;
&lt;a href="https://matplotlib.org/"&gt;Matplotlib&lt;/a&gt;&lt;br&gt;
&amp;amp; sometimes &lt;a href="https://seaborn.pydata.org/"&gt;Seaborn&lt;/a&gt;&lt;/p&gt;
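<p>A minimal environment.yml for that basic framework might look something like this (the environment name, channel, and unpinned package versions here are placeholders; pin versions as your project requires):</p>

```yaml
# Hypothetical starter environment - the name and package list are illustrative.
name: basic-ds-env
channels:
  - conda-forge
dependencies:
  - python
  - numpy
  - pandas
  - matplotlib
  - seaborn
```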

&lt;p&gt;Finally, once you've created your environment and you're ready to unleash it on the world, you can run a simple command to export the .yml file. Once you have your file, you can upload it or share it with whomever you need. Here is the command to export (feel free to replace "environment" with the desired filename; note that this exports whichever environment is currently active):&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;code&gt;conda env export &amp;gt; environment.yml&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--01TZyCEe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/pplefcz1ybxqnr45ujrk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--01TZyCEe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/pplefcz1ybxqnr45ujrk.png" alt="conda env export" width="264" height="28"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The whole process of creating and activating a new virtual environment is pretty simple when it works… However, if you run into errors such as not being able to find the right packages, it can get a little hairy. Luckily, there are great resources out there that are just a quick Google away. The most useful resources I found for these errors were on Stack Overflow and Apple Developer.&lt;/p&gt;

&lt;p&gt;If you want to create a virtual environment from a .yml file, here’s a link to one of my projects (Tanzanian Water Wells: Predicting the Functionality of Water Wells in Tanzania) where you can try it out! &lt;/p&gt;

&lt;h2&gt;
  
  
  In Summary:
&lt;/h2&gt;

&lt;p&gt;YAML isn’t scary (it also ain’t markup language).&lt;br&gt;
The .yml file is an important part of any data science workflow.&lt;br&gt;
The .yml file is used to ensure that packages and versions stay consistent.&lt;br&gt;
The .yml file helps with reproducibility.&lt;br&gt;
Including a .yml file in a project allows for collaboration.&lt;br&gt;
YAML is altogether pretty simple. &lt;/p&gt;

&lt;h2&gt;
  
  
  Resources
&lt;/h2&gt;

&lt;p&gt;The official &lt;a href="https://yaml.org/"&gt;YAML&lt;/a&gt; Web Site&lt;/p&gt;

&lt;p&gt;If you’re looking for further resources on running &lt;a href="https://www.tensorflow.org/"&gt;TensorFlow&lt;/a&gt; and &lt;a href="https://keras.io/"&gt;Keras&lt;/a&gt; on a newer MacBook, I recommend checking out this YouTube video: &lt;a href="https://youtu.be/o4-bI_iZKPA"&gt;How to Install Keras GPU for Mac M1/M2 with Conda&lt;/a&gt; &lt;/p&gt;

&lt;p&gt;If you’re looking for a resource on how to install Anaconda, I highly recommend that you go straight to the source, &lt;a href="https://anaconda.org/"&gt;anaconda.org&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The M2 MacBook gave me some challenges when trying to work with &lt;a href="https://www.tensorflow.org/"&gt;TensorFlow&lt;/a&gt; and Keras due to some fancy chip architecture which you can read about here: &lt;a href="https://medium.com/@sorenlind/tensorflow-with-gpu-support-on-apple-silicon-mac-with-homebrew-and-without-conda-miniforge-915b2f15425b"&gt;TensorFlow with GPU support on Apple Silicon Mac with Homebrew and without Conda / Miniforge&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you want to take a dive into the YAML world, here's an in-depth tutorial: &lt;a href="https://www.cloudbees.com/blog/yaml-tutorial-everything-you-need-get-started"&gt;YAML: Everything You Need to Get Started in Minutes&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Further, Very Serious Notes
&lt;/h2&gt;

&lt;p&gt;Why did the YAML cross the road? &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--9XJycQdT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kexul8x2nngqk22cw7nm.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--9XJycQdT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kexul8x2nngqk22cw7nm.jpg" alt="YAML Camel - Does not approve of your file" width="250" height="203"&gt;&lt;/a&gt;&lt;br&gt;
To get away from the package that broke the YAML's backend.&lt;/p&gt;

&lt;h2&gt;
  
  
  Want to Follow Along?
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://github.com/fetterollie"&gt;GitHub&lt;/a&gt; | &lt;a href="https://www.linkedin.com/in/jonathanfetterolf/"&gt;LinkedIn&lt;/a&gt; | &lt;a href="https://twitter.com/fetterollie"&gt;Twitter&lt;/a&gt;&lt;/p&gt;

</description>
      <category>datascience</category>
      <category>yaml</category>
      <category>python</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>7 Favorite Sites to Find Datasets</title>
      <dc:creator>Jonathan Fetterolf</dc:creator>
      <pubDate>Fri, 16 Dec 2022 02:46:50 +0000</pubDate>
      <link>https://dev.to/fetterollie/7-favorite-sites-to-find-datasets-4pp3</link>
      <guid>https://dev.to/fetterollie/7-favorite-sites-to-find-datasets-4pp3</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;So, you're looking for the next perfect dataset to use in your upcoming project? Look no further. Well, look a little further... I share my favorite sites below. &lt;/p&gt;

&lt;p&gt;I'm always looking for the next dataset to use in solidifying a new concept I'm studying. This list should help cut down the time it takes to find that perfect dataset. The sites included all have access to &lt;strong&gt;free&lt;/strong&gt; datasets. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s---YZJKfFO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xjfh83j0nz322cz7vbge.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s---YZJKfFO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xjfh83j0nz322cz7vbge.png" alt="kaggle logo" width="750" height="350"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Kaggle
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://www.kaggle.com/datasets"&gt;Kaggle&lt;/a&gt; is an online community platform for data scientists and machine learning enthusiasts. Kaggle hosts over 186,000 data sets on topics ranging from games to death rates and everything in between.&lt;/p&gt;

&lt;p&gt;What is nice about Kaggle?&lt;br&gt;
Kaggle suggests categories of datasets such as trending, music, business, computer science, and classification so you can quickly get to a dataset you're interested in. If you know exactly what you're looking for (or what you don't want included), Kaggle also offers options to filter by file size, file type, and license type. My favorite feature that Kaggle offers is that each dataset has a "Usability" rating which takes into account the completeness, credibility, and compatibility of the dataset. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--kNjimPxW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/eg6jczk34lfq5ffhs37z.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--kNjimPxW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/eg6jczk34lfq5ffhs37z.jpg" alt="Data Is Plural" width="378" height="86"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Data Is Plural
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://www.data-is-plural.com/"&gt;Data Is Plural&lt;/a&gt; is a weekly newsletter of useful datasets published by Jeremy Singer-Vine. There have been over 300 editions of this newsletter, the earliest is from October 21, 2015. &lt;/p&gt;

&lt;p&gt;Pros and Cons for Data Is Plural:&lt;br&gt;
This is a super fun site for discovering useful/curious datasets. It's not the easiest site to browse, but once you open the full archive as a spreadsheet, it becomes searchable by keyword (using ctrl+f).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--hNZYeHmI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8uyb7zr285hpim0mjrfk.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--hNZYeHmI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8uyb7zr285hpim0mjrfk.jpeg" alt="FiveThirtyEight" width="800" height="451"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  FiveThirtyEight
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://data.fivethirtyeight.com/"&gt;FiveThirtyEight&lt;/a&gt; shares the data behind some of their articles for you to use to create your own stories and visualizations. FiveThirtyEight offers datasets used in articles about culture, economics, politics, science &amp;amp; health, and sports.&lt;/p&gt;

&lt;p&gt;Pros and Cons for FiveThirtyEight:&lt;br&gt;
Their datasets are listed in order of most recently updated with those in the status of currently being updated first. The most prominent place to click will take you to the article(s) that reference the dataset. The best way to explore what the dataset offers is to click the info link and navigate to the GitHub repository that houses all its available datasets (Repository found &lt;a href="https://github.com/fivethirtyeight/data"&gt;here&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Blt8Esb1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/lipntgdy90ci8dpkch8p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Blt8Esb1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/lipntgdy90ci8dpkch8p.png" alt="UCI Machine Learning Repository" width="384" height="124"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  UCI Machine Learning Repository
&lt;/h2&gt;

&lt;p&gt;The University of California, Irvine hosts its own &lt;a href="https://archive.ics.uci.edu/ml/datasets.php"&gt;Machine Learning Repository&lt;/a&gt; with over 600 datasets. The site is very intuitive but looks a bit dated. It has an open beta for its new site that you can access &lt;a href="https://archive-beta.ics.uci.edu/"&gt;here&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;Pros and Cons for UCI Machine Learning Repository:&lt;br&gt;
Most of its datasets are intended for use with classification. The site is very easy to use and its beta site has an updated look and includes filtering features within the search option. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--JsIZX3Oc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/u159ga6kksbw8yfv8zt5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--JsIZX3Oc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/u159ga6kksbw8yfv8zt5.png" alt="data.gov" width="461" height="109"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Data.gov
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://data.gov/"&gt;Data.gov&lt;/a&gt; Hosts over 245,000 datasets and is managed and hosted by the U.S. General Services Administraion, &lt;a href="https://www.gsa.gov/about-us/organization/federal-acquisition-service/technology-transformation-services"&gt;Technology Transformation Service&lt;/a&gt;. Recently, I used Data.gov to find a dataset for &lt;a href="https://dev.to/fetterollie/meteorite-mania-58ff"&gt;Meteorite Mania&lt;/a&gt;, my personal project that uses geopandas to plot the data of known meteorite landing sites onto a map of the world. &lt;/p&gt;

&lt;p&gt;Pros and Cons for Data.gov:&lt;br&gt;
Data.gov has a basic layout that's easy to use and intuitive. It is easy to filter by location, formats, publishers, and bureaus.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--XFQnw4s5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ao1s2tb56380dfdxbyam.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--XFQnw4s5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ao1s2tb56380dfdxbyam.png" alt="BuzzFeedNews" width="680" height="134"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  BuzzFeedNews
&lt;/h2&gt;

&lt;p&gt;Who doesn't love a good BuzzFeed article? &lt;a href="https://github.com/BuzzFeedNews/everything"&gt;BuzzFeedNews&lt;/a&gt; has made its datasets available with links to their repositories and articles so you can see what is actually going on with the data behind the scenes. If you haven't looked into this source, I highly recommend it. &lt;/p&gt;

&lt;p&gt;Just Love for BuzzFeedNews:&lt;/p&gt;

&lt;p&gt;I love that you can see the notebooks behind the articles. This allows you to see exactly how the authors used the data to reach their stated conclusions. The datasets have already been analyzed, so you may struggle to come up with a new spin on them, but it's a great source for learning new methods. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--RP8OEC21--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fyq6tznm07z12xdtl6u6.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--RP8OEC21--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/fyq6tznm07z12xdtl6u6.jpg" alt="AwesomeData" width="420" height="43"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  AwesomeData
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://github.com/awesomedata/awesome-public-datasets"&gt;AwesomeData&lt;/a&gt; is a topic-centric list of links to open datasets. This is a great place to explore a topic through the lens of data. &lt;/p&gt;

&lt;p&gt;Pros &amp;amp; Cons for Awesome Data: &lt;/p&gt;

&lt;p&gt;If you're looking for a quick and to-the-point link to a dataset, this isn't it. If you're looking to explore and find some unique and interesting places with datasets, this one is for you!&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://www.kaggle.com/datasets"&gt;Kaggle&lt;/a&gt; is my go to for finding a useable dataset as quickly as possible. &lt;a href="https://data.gov/"&gt;Data.gov&lt;/a&gt; is a very close second. If you're looking for a little bit of a detour that can lead you down a tangent but end up somewhere fun in the end: &lt;a href="https://www.data-is-plural.com/"&gt;Data Is Plural&lt;/a&gt; is where it's at. &lt;a href="https://github.com/BuzzFeedNews/everything"&gt;BuzzFeedNews&lt;/a&gt; is great if you want to learn some new techniques or see the 'how' behind some number crunching. Any of these resources should get you to a useable data set, it just depends on how you want to get there...&lt;/p&gt;

</description>
      <category>data</category>
      <category>datascience</category>
      <category>bigdata</category>
      <category>beginners</category>
    </item>
    <item>
      <title>Meteorite Mania</title>
      <dc:creator>Jonathan Fetterolf</dc:creator>
      <pubDate>Fri, 02 Dec 2022 02:34:23 +0000</pubDate>
      <link>https://dev.to/fetterollie/meteorite-mania-58ff</link>
      <guid>https://dev.to/fetterollie/meteorite-mania-58ff</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fslkozhel0qrzno85nvgx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fslkozhel0qrzno85nvgx.png" alt="Image description" width="800" height="450"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;Author:&lt;/strong&gt; Jonathan Fetterolf&lt;/p&gt;

&lt;center&gt;&lt;em&gt;If you want to know how I put this together, I step through my process from start to finish after the conclusions and limitations...&lt;/em&gt;&lt;br&gt;
or view the juicy details here:
&lt;strong&gt;&lt;a href="https://github.com/fetterollie/meteorite-mania"&gt;Meteorite Mania GitHub Repository&lt;/a&gt;&lt;/strong&gt;
&lt;/center&gt;
&lt;h1&gt;
  
  
  Overview
&lt;/h1&gt;

&lt;p&gt;This notebook analyzes a dataset of known meteorite landing sites and the data associated with those meteorites. I derived some statistics about the known meteorites, mapped their landing sites, and looked into where the majority of meteorites fell. In terms of mass, the largest known meteorite is 60,000kg, contrasted by the smallest, which comes in at just 1e-05kg. The mean mass of the dataset is approximately 15.6kg, while the median is approximately 0.029kg. Nearly 74% of known meteorite landing sites are in the southern hemisphere, while 26% are in the northern hemisphere. This southern-hemisphere skew can be explained by information compiled by the &lt;a href="https://sites.wustl.edu/meteoritesite/items/some-meteorite-statistics/" rel="noopener noreferrer"&gt;Department of Earth and Planetary Sciences&lt;/a&gt; at Washington University in St. Louis. &lt;/p&gt;

&lt;p&gt;They state: &lt;/p&gt;
&lt;center&gt;&lt;em&gt;"Nearly all meteorites are found in deserts. (Yes, Antarctica is a desert because the annual precipitation rate is very low.) Deserts are places that accumulate meteorites over thousands of years and then nothing much happens to the meteorite. Also, meteorites are easier to find in deserts than in places with topography, vegetation, and other rocks."&lt;/em&gt;&lt;/center&gt;
&lt;h1&gt;
  
  
  Understanding &amp;amp; Cleaning Data
&lt;/h1&gt;

&lt;p&gt;This comprehensive dataset from The Meteoritical Society contains information on all of the known meteorite landings up to 2013. The Fusion Table was collected by Javier de la Torre. The data covers 34,513 meteorites. This dataset was obtained &lt;a href="https://catalog.data.gov/dataset/meteorite-landings" rel="noopener noreferrer"&gt;here&lt;/a&gt; through &lt;a href="https://data.gov/" rel="noopener noreferrer"&gt;data.gov&lt;/a&gt;.&lt;/p&gt;
&lt;h1&gt;
  
  
  Conclusions
&lt;/h1&gt;

&lt;p&gt;&lt;strong&gt;Mass&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In terms of mass, the largest known meteorite is 60,000kg, contrasted by the smallest at just 1e-05kg. &lt;/li&gt;
&lt;li&gt;The mean mass of the dataset is approximately 15.6kg.&lt;/li&gt;
&lt;li&gt;The median is approximately 0.029kg. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Location&lt;/strong&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feldjqgvkrd41a4dfdbmh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feldjqgvkrd41a4dfdbmh.png" alt="All Strike Locations" width="800" height="409"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Approximately 74% of known meteorite landing sites are in the southern hemisphere.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5l5rc6dfv3w6wxw80p1a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5l5rc6dfv3w6wxw80p1a.png" alt="Southern Hemisphere Known Strike Locations" width="800" height="409"&gt;&lt;/a&gt;&lt;br&gt;
Approximately 26% of known meteorite landing sites are in the northern hemisphere.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgrf2yv94h88v9cjofx7m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgrf2yv94h88v9cjofx7m.png" alt="Northern Hemisphere Known Strike Locations" width="800" height="409"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Limitations&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The dataset only contains data up to 2013.&lt;/li&gt;
&lt;li&gt;There are many known sites exactly on the equator. I did not include these in either the northern or southern hemisphere counts.&lt;/li&gt;
&lt;li&gt;Many sites in the southern hemisphere, specifically in Antarctica, are clustered close together, so the visual representation of the data leads you to believe there are more landing sites in the northern hemisphere when, in actuality, the opposite is true.&lt;/li&gt;
&lt;/ul&gt;
&lt;h1&gt;
  
  
  Walkthrough
&lt;/h1&gt;
&lt;h2&gt;
  
  
  Imports
&lt;/h2&gt;


&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import pandas as pd
import matplotlib.pyplot as plt
%matplotlib inline
import seaborn as sns
import geopandas as gpd
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;I used a dataset with the geometry of landmasses to plot the shapes of land under my scatterplots.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;countries = gpd.read_file(
               gpd.datasets.get_path("naturalearth_lowres"))
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Importing data from file:
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;df = pd.read_csv('data/Meteorite_Landings.csv', index_col=0)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Cleaning &amp;amp; Filtering
&lt;/h2&gt;

&lt;p&gt;I cleaned up the columns, dropped null values, filtered out a data point with an incorrect year, filtered out any entries with a mass of zero (what is a meteorite without mass...?), converted mass(g) to mass(kg) in a new column, and changed values in the fall column from 'Fell' to 'Not Found'.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;newcols = [col.strip().lower().replace(' ','') for col in df.columns]
df.columns = newcols

df = df.dropna()

df = df[ df['year'] &amp;lt;= 2013]

df = df[ df['mass(g)'] &amp;gt; 0]

df['mass(kg)'] = df['mass(g)']/1000

key = {'Not Found' : 'Fell'}
replace_key = {v: k for k, v in key.items()}
df['fall'].replace(replace_key, inplace=True)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  The Juicy Code
&lt;/h2&gt;

&lt;p&gt;I calculated the maximum, minimum, mean, and median for the mass values.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;largest_mass = df['mass(kg)'].max()
smallest_mass = df['mass(kg)'].min()
mean_mass = df['mass(kg)'].mean()
median_mass = df['mass(kg)'].median()
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  The Juiciest Code
&lt;/h2&gt;

&lt;p&gt;I wanted to plot coordinates of the following subsets of known meteorite landing sites. The following code filters for each and subsequently plots a scatterplot over a world map:&lt;/p&gt;

&lt;p&gt;Top 500 Largest by Mass (kg)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;df_top_500_largest_strikes = df.sort_values(by='mass(kg)', ascending=False).head(500)

fig5, ax5 = plt.subplots(figsize=(20, 10))

countries.plot(color="lightgrey", ax=ax5)

sns.scatterplot(data=df_top_500_largest_strikes, x='reclong', y='reclat', hue='mass(kg)', size='mass(kg)', ax=ax5, palette='flare')

ax5.set_title('Landing Sites of 500 Largest Known Meteorites')
ax5.grid(visible=True, alpha=0.5)
ax5.set_xlabel('Longitude')
ax5.set_ylabel('Latitude')
ax5.legend(title='Mass (kg)')
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr4dpgpkonqc1wg25g6jh.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fr4dpgpkonqc1wg25g6jh.png" alt="Landing Sites of 500 Largest Known Meteorites" width="800" height="409"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Top 100 Largest by Mass (kg)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;df_top_100_largest_strikes = df.sort_values(by='mass(kg)', ascending=False).head(100)

fig4, ax4 = plt.subplots(figsize=(20, 10))

countries.plot(color="lightgrey", ax=ax4)

sns.scatterplot(data=df_top_100_largest_strikes, x='reclong', y='reclat', hue='mass(kg)', size='mass(kg)', ax=ax4, palette='flare')

ax4.set_title('Landing Sites of 100 Largest Known Meteorites')
ax4.grid(visible=True, alpha=0.5)
ax4.set_xlabel('Longitude')
ax4.set_ylabel('Latitude')
ax4.legend(title='Mass (kg)')
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkzssdo3i78rw1af72tre.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkzssdo3i78rw1af72tre.png" alt="Landing Sites of 100 Largest Known Meteorites" width="800" height="409"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Top 10 Largest by Mass (kg)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;df_top_10_largest_strikes = df.sort_values(by='mass(kg)', ascending=False).head(10)

fig6, ax6 = plt.subplots(figsize=(20, 10))

countries.plot(color="lightgrey", ax=ax6)

sns.scatterplot(data=df_top_10_largest_strikes, x='reclong', y='reclat', hue='mass(kg)', size='mass(kg)', ax=ax6, palette='flare')

ax6.set_title('Landing Sites of 10 Largest Known Meteorites')
ax6.grid(visible=True, alpha=0.5)
ax6.set_xlabel('Longitude')
ax6.set_ylabel('Latitude')
ax6.legend(title='Mass (kg)')
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzjwlpumzjcjtfr5nvh48.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzjwlpumzjcjtfr5nvh48.png" alt="Landing Sites of 10 Largest Known Meteorites" width="800" height="409"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;I'm happy I wasn't living yet to see any of these largest known meteorite strikes.&lt;/em&gt;&lt;/p&gt;
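&lt;p&gt;The three Top-N plots above differ only in the cutoff, so the shared sort-and-slice step can be factored into a small helper. This is just a sketch: the real &lt;code&gt;df&lt;/code&gt; is loaded earlier in the post, so a tiny stand-in frame is used here to illustrate.&lt;/p&gt;

```python
import pandas as pd

def top_n_by_mass(df, n):
    """Return the n most massive rows (assumes a 'mass(kg)' column, as above)."""
    return df.sort_values(by='mass(kg)', ascending=False).head(n)

# Tiny stand-in frame for illustration; the real df is loaded earlier in the post.
toy = pd.DataFrame({'mass(kg)': [1.0, 60000.0, 23000.0],
                    'reclat': [0.0, -19.6, 76.1],
                    'reclong': [10.0, 17.9, -64.9]})

print(top_n_by_mass(toy, 2)['mass(kg)'].tolist())  # [60000.0, 23000.0]
```

&lt;p&gt;Each returned frame can then be passed straight into the same &lt;code&gt;sns.scatterplot&lt;/code&gt; call, which keeps the three plots consistent with one another.&lt;/p&gt;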

&lt;p&gt;Northern Hemisphere Known Sites&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;df_northern_hem = df[ df['reclat'] &amp;gt; 0]

fig1, ax1 = plt.subplots(figsize=(20, 10))

countries.plot(color="lightgrey", ax=ax1)

sns.scatterplot(data=df_northern_hem, x='reclong', y='reclat',
                hue='fall', ax=ax1, palette='flare', alpha=0.4)

ax1.set_title('Northern Hemisphere Known Meteorite Landings')
ax1.grid(visible=True, alpha=0.5)
ax1.set_xlabel('Longitude')
ax1.set_ylabel('Latitude')
ax1.legend(title='')
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkkcmwaw3dzvacmllizko.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkkcmwaw3dzvacmllizko.png" alt="Northern Hemisphere Known Meteorite Landings" width="800" height="409"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Southern Hemisphere Known Sites&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;df_southern_hem = df[ df['reclat'] &amp;lt; 0]

fig1, ax1 = plt.subplots(figsize=(20, 10))

countries.plot(color="lightgrey", ax=ax1)

sns.scatterplot(data=df_southern_hem, x='reclong', y='reclat', hue='fall', ax=ax1, palette='flare', alpha=0.4)

ax1.set_title('Southern Hemisphere Known Meteorite Landings')
ax1.grid(visible=True, alpha=0.5)
ax1.set_xlabel('Longitude')
ax1.set_ylabel('Latitude')
ax1.legend(title='')
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqvoa9s663vq13jwy9vlo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqvoa9s663vq13jwy9vlo.png" alt="Southern Hemisphere Known Meteorite Landings" width="800" height="409"&gt;&lt;/a&gt;&lt;/p&gt;
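&lt;p&gt;One caveat about the strict &lt;code&gt;&amp;gt; 0&lt;/code&gt; / &lt;code&gt;&amp;lt; 0&lt;/code&gt; filters: rows that sit exactly on the equator or have no recorded latitude end up in neither hemisphere. A quick count makes the gap visible (sketched with stand-in values here, since the real values come from &lt;code&gt;df['reclat']&lt;/code&gt;):&lt;/p&gt;

```python
import numpy as np
import pandas as pd

# Stand-in latitudes; in the post these come from df['reclat'].
reclat = pd.Series([44.8, -72.0, 16.9, -84.0, 0.0, np.nan])

northern = int((reclat > 0).sum())  # strictly north of the equator
southern = int((reclat < 0).sum())  # strictly south of the equator
dropped = len(reclat) - northern - southern  # exactly 0, or latitude missing

print(northern, southern, dropped)  # 2 2 2
```

&lt;p&gt;Comparisons against NaN are always False in pandas, so missing latitudes fall out of both plots silently; checking the counts is a cheap way to confirm nothing unexpected was lost.&lt;/p&gt;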

&lt;h2&gt;
  
  
  Find Out More
&lt;/h2&gt;

&lt;p&gt;If you want to find out more about how I put this together (including some fun exploratory visualizations), you can view the &lt;a href="https://github.com/fetterollie/meteorite-mania" rel="noopener noreferrer"&gt;GitHub Repository&lt;/a&gt; for this project. Feel free to drop a comment or reach out via &lt;a href="https://www.linkedin.com/in/jonathanfetterolf/" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>watercooler</category>
    </item>
  </channel>
</rss>
