<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: CCONLEY-FI</title>
    <description>The latest articles on DEV Community by CCONLEY-FI (@cconley-dev).</description>
    <link>https://dev.to/cconley-dev</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1249747%2Fa7a712be-a5b9-4a4c-aa9d-2fcf66086686.png</url>
      <title>DEV Community: CCONLEY-FI</title>
      <link>https://dev.to/cconley-dev</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/cconley-dev"/>
    <language>en</language>
    <item>
      <title>Building and running javascriptlets for advanced data acquisition.</title>
      <dc:creator>CCONLEY-FI</dc:creator>
      <pubDate>Sat, 17 Feb 2024 01:11:49 +0000</pubDate>
      <link>https://dev.to/cconley-dev/building-and-running-javascriptlets-for-advanced-data-acquisition-181n</link>
      <guid>https://dev.to/cconley-dev/building-and-running-javascriptlets-for-advanced-data-acquisition-181n</guid>
      <description>&lt;p&gt;In an effort to utilize the full potential of an Android device, I decided to make a short scriplet for web scraping. Particularly for finding specific file types like PDFs, EPUBs, or JPGs, the combination of Javascriptlets, Termux, and enhanced browser functionalities offers a compelling solution. This detailed guide walks through setting up the necessary tools and crafting scripts to automate the search for these file types directly from an Android device, illustrating the process with practical examples.&lt;/p&gt;

&lt;h3&gt;
  
  
  Initial Setup with Termux
&lt;/h3&gt;

&lt;p&gt;Termux is the backbone of this operation, providing a powerful Linux environment on Android. After installing Termux (preferably from F-Droid, since the Google Play build is no longer kept up to date), the following commands will prepare the environment for scripting:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pkg update &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; pkg upgrade
pkg &lt;span class="nb"&gt;install &lt;/span&gt;python
pkg &lt;span class="nb"&gt;install &lt;/span&gt;git
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;These steps ensure that the Termux environment is ready for advanced operations, including web scraping tasks.&lt;/p&gt;

&lt;h3&gt;
  
  
  Enhancing Capabilities with Browser Extensions
&lt;/h3&gt;

&lt;p&gt;To augment the web scraping process, installing extensions in a compatible browser like Kiwi Browser or Firefox for Android (Fenix) can significantly streamline operations. An extension like Tampermonkey or Mobile Dev Tools enables the user to manage and execute Javascriptlets with ease, facilitating the automation of web tasks directly from the browser.&lt;/p&gt;

&lt;h3&gt;
  
  
  Crafting Javascriptlets for File Search
&lt;/h3&gt;

&lt;p&gt;Javascriptlets can be designed to initiate searches for specific file types across the web. Here’s a concise script that finds PDFs using Google’s &lt;code&gt;filetype:&lt;/code&gt; search operator:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="nx"&gt;javascript&lt;/span&gt;&lt;span class="p"&gt;:(&lt;/span&gt;&lt;span class="kd"&gt;function&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;var&lt;/span&gt; &lt;span class="nx"&gt;query&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;encodeURIComponent&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;filetype:pdf&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="kd"&gt;var&lt;/span&gt; &lt;span class="nx"&gt;url&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;`https://www.google.com/search?q=&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;query&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="nb"&gt;window&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;open&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;url&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;})();&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Adapting this script to search for EPUBs or JPGs is as straightforward as changing &lt;code&gt;filetype:pdf&lt;/code&gt; to &lt;code&gt;filetype:epub&lt;/code&gt; or &lt;code&gt;filetype:jpg&lt;/code&gt; in the script.&lt;/p&gt;
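
&lt;p&gt;That adaptation can also be parameterized so the script never needs hand-editing. The &lt;code&gt;buildSearchUrl&lt;/code&gt; helper below is my own illustrative sketch, not part of the original Javascriptlet:&lt;/p&gt;

```javascript
// Hypothetical helper: build a Google file-type search URL.
// Pass "pdf", "epub", or "jpg" instead of editing the script each time.
function buildSearchUrl(filetype, keywords) {
  var query = encodeURIComponent(`filetype:${filetype} ${keywords}`.trim());
  return `https://www.google.com/search?q=${query}`;
}

// Wrapped as a bookmarklet it would read:
// javascript:(function(){ window.open(buildSearchUrl('epub', 'gardening')); })();
console.log(buildSearchUrl('epub', 'gardening'));
// → https://www.google.com/search?q=filetype%3Aepub%20gardening
```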

&lt;h3&gt;
  
  
  Advanced Web Scraping with Termux
&lt;/h3&gt;

&lt;p&gt;For more nuanced scraping tasks, such as parsing search results to extract specific URLs or directly downloading files, Python scripts executed within Termux are exceptionally useful. Tools such as Beautiful Soup can parse HTML content to find and list downloadable links. Here's an example script that searches for downloadable PDF links on a webpage:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;requests&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;bs4&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;BeautifulSoup&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;find_downloads&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;url&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="n"&gt;page&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;requests&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;url&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;soup&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;BeautifulSoup&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;page&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;content&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;html.parser&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;links&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;soup&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;find_all&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;a&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;href&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;link&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;links&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;link&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;href&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="nf"&gt;endswith&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;.pdf&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
            &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;link&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;href&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;

&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;__name__&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;__main__&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;target_url&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;https://example.com&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;
    &lt;span class="nf"&gt;find_downloads&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;target_url&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This script can easily be modified to search for &lt;code&gt;.epub&lt;/code&gt; or &lt;code&gt;.jpg&lt;/code&gt; files by replacing &lt;code&gt;'.pdf'&lt;/code&gt; in &lt;code&gt;.endswith('.pdf')&lt;/code&gt; with the desired file extension.&lt;/p&gt;
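
&lt;p&gt;The same filtering idea translates directly into a Javascriptlet running on an already-loaded page. The &lt;code&gt;filterLinks&lt;/code&gt; helper below is an illustrative sketch, not part of the Python script above:&lt;/p&gt;

```javascript
// Illustrative helper: given a list of href strings, keep those that
// end with any of the supplied extensions (e.g. '.pdf', '.epub', '.jpg').
function filterLinks(hrefs, extensions) {
  return hrefs.filter(function (href) {
    return extensions.some(function (ext) {
      return href.toLowerCase().endsWith(ext);
    });
  });
}

// In a browser, the hrefs would come from the live page:
// var hrefs = Array.from(document.querySelectorAll('a[href]'), a => a.href);
var sample = ['a/report.pdf', 'b/cover.JPG', 'c/index.html'];
console.log(filterLinks(sample, ['.pdf', '.jpg'])); // keeps the PDF and the JPG
```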

&lt;h3&gt;
  
  
  Automating and Scheduling with Termux
&lt;/h3&gt;

&lt;p&gt;To automate script execution for recurring data collection, Termux supports scheduling through cron jobs (after installing the &lt;code&gt;cronie&lt;/code&gt; package with &lt;code&gt;pkg install cronie&lt;/code&gt; and starting &lt;code&gt;crond&lt;/code&gt;). This allows scripts to run at specified intervals, ensuring continuous data collection without manual intervention:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"0 * * * * python /path/to/find_downloads.py"&lt;/span&gt; | crontab -
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This command sets the &lt;code&gt;find_downloads.py&lt;/code&gt; script to run hourly, demonstrating Termux’s capability to automate web scraping tasks.&lt;/p&gt;

&lt;h3&gt;
  
  
  Conclusion
&lt;/h3&gt;

&lt;p&gt;By leveraging Javascriptlets to initiate web searches, coupled with the power of Termux for advanced scripting and scheduling, users can effectively automate the search for and collection of specific file types like PDFs, EPUBs, and JPGs on their Android devices. This approach not only makes targeted data collection more accessible but also significantly expands the scope of projects that can be undertaken directly from a mobile device. Use your own creativity to develop other use cases, and keep in mind that you should always respect a site's terms of use, its &lt;code&gt;robots.txt&lt;/code&gt;, and any applicable legal requirements.&lt;/p&gt;

</description>
      <category>building</category>
      <category>and</category>
      <category>running</category>
      <category>javascri</category>
    </item>
    <item>
      <title>Post-Bootcamp Studies as a Software Engineer</title>
      <dc:creator>CCONLEY-FI</dc:creator>
      <pubDate>Mon, 29 Jan 2024 18:31:01 +0000</pubDate>
      <link>https://dev.to/cconley-dev/post-bootcamp-studies-as-a-software-engineer-3o30</link>
      <guid>https://dev.to/cconley-dev/post-bootcamp-studies-as-a-software-engineer-3o30</guid>
      <description>&lt;h1&gt;
  
  
  Introduction
&lt;/h1&gt;

&lt;p&gt;As a current Bootcamp student and aspiring Full-stack Developer, the question on my mind, as well as the minds of my fellow cohort members, is quite simple: when we're done here, how do we land the ideal career? With that in mind, I've started to look at the field of AI integration and development. What follows are a few overarching core concepts for breaking into the industry at an entry level, compiled from the perspective of a nascent software engineer.&lt;/p&gt;

&lt;h2&gt;
  
  
  Core Languages One Should Know (or at Least Be Familiar With)
&lt;/h2&gt;

&lt;p&gt;JavaScript is indispensable for web development, enabling both client-side and server-side applications through frameworks like React and Node.js. As it is considered a foundational language on most development teams, it is not uncommon for technical interviews to involve a demonstration of proficiency in JS, if not in the frameworks as well.&lt;/p&gt;

&lt;p&gt;Python excels in simplicity and versatility, making it ideal for applications ranging from web development to data science. Its built-in data structures are among the most approachable of any language, and it can be found in industries as varied as gaming and medical data management, and everything in between.&lt;/p&gt;

&lt;p&gt;An example of Python code used to fine-tune AI modeling:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;tensorflow&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;tf&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;tensorflow.keras.layers&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Dense&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;tensorflow.keras.models&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;Sequential&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;tensorflow.keras.datasets&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;mnist&lt;/span&gt;

&lt;span class="c1"&gt;# Load dataset (for example, MNIST handwritten digits dataset)
&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;train_images&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;train_labels&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;test_images&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;test_labels&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;mnist&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;load_data&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="c1"&gt;# Normalize the images
&lt;/span&gt;&lt;span class="n"&gt;train_images&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;train_images&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="mf"&gt;255.0&lt;/span&gt;
&lt;span class="n"&gt;test_images&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;test_images&lt;/span&gt; &lt;span class="o"&gt;/&lt;/span&gt; &lt;span class="mf"&gt;255.0&lt;/span&gt;

&lt;span class="c1"&gt;# Build the model
&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;Sequential&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;
    &lt;span class="n"&gt;tf&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;keras&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;layers&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;Flatten&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;input_shape&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;28&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;28&lt;/span&gt;&lt;span class="p"&gt;)),&lt;/span&gt;
    &lt;span class="nc"&gt;Dense&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;128&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;activation&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;relu&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
    &lt;span class="nc"&gt;Dense&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;activation&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;softmax&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;])&lt;/span&gt;

&lt;span class="c1"&gt;# Compile the model
&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;compile&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;optimizer&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;adam&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
              &lt;span class="n"&gt;loss&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;sparse_categorical_crossentropy&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
              &lt;span class="n"&gt;metrics&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;accuracy&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;

&lt;span class="c1"&gt;# Train the model
&lt;/span&gt;&lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;fit&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;train_images&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;train_labels&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;epochs&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="c1"&gt;# Evaluate the model
&lt;/span&gt;&lt;span class="n"&gt;test_loss&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;test_acc&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;model&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;evaluate&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;test_images&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;test_labels&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;verbose&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="s"&gt;Test accuracy:&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;test_acc&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Java and C# have been, and without a doubt will remain, critical for enterprise and mobile app development, particularly for devs maintaining the long-standing legacy codebases found at most large companies.&lt;/p&gt;

&lt;p&gt;An example of Spring Boot, a popular web application framework for Java:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
@RestController
public class ExampleApplication {

    public static void main(String[] args) {
        SpringApplication.run(ExampleApplication.class, args);
    }

    @GetMapping("/hello")
    public String hello(@RequestParam(value = "name", defaultValue = "World") String name) {
        return String.format("Hello %s!", name);
    }
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Directed Study Resources
&lt;/h2&gt;

&lt;p&gt;Focused learning platforms such as Pluralsight and LinkedIn Learning are often cited as prime learning resources with a broad, if somewhat general, scope. edX and Coursera provide access to university courses that deepen foundational knowledge. As a resident of Colorado, I've also recently learned that free Udemy access is available to anyone with a library card, so keep an eye out for local resources and benefits!&lt;/p&gt;

&lt;p&gt;Project-based learning is crucial for any developer. One of the first pieces of advice given to me was "don't stop coding," and I've learned nearly as much from building side projects that interested me as from the curriculum itself.&lt;/p&gt;

&lt;p&gt;Familiarity with agile methodologies and DevOps practices, including tools like Jenkins and Docker, is becoming increasingly important in modern development environments.&lt;/p&gt;

&lt;h2&gt;
  
  
  Essential AI Readings
&lt;/h2&gt;

&lt;p&gt;For those aiming to specialize in AI, reading and understanding the work already done by others is a must. A few books which are widely recommended are:&lt;/p&gt;

&lt;p&gt;"Artificial Intelligence: A Modern Approach" by Stuart Russell and Peter Norvig&lt;br&gt;
"Pattern Recognition and Machine Learning" by Christopher M. Bishop&lt;br&gt;
"Deep Learning" by Ian Goodfellow, Yoshua Bengio, and Aaron Courville&lt;br&gt;
For a broader perspective, especially on the implications and implementations of AI, "Weapons of Math Destruction" by Cathy O'Neil is a must-read. It takes on the challenge of highlighting the impact of widely used algorithms and the effect they have on society as a whole.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>A beginner's experience working with JavaScript: The Superhero Showdown</title>
      <dc:creator>CCONLEY-FI</dc:creator>
      <pubDate>Fri, 05 Jan 2024 17:41:26 +0000</pubDate>
      <link>https://dev.to/cconley-dev/a-beginners-experience-working-with-javascript-the-superhero-showdown-1398</link>
      <guid>https://dev.to/cconley-dev/a-beginners-experience-working-with-javascript-the-superhero-showdown-1398</guid>
      <description>&lt;h3&gt;
  
  
  By &lt;a href="https://github.com/CCONLEY-FI"&gt;Chris Conley&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;As part of my early experiences in the Flatiron School's coding program, I undertook a project that served as my introduction to the fundamentals of web development—specifically, event listeners, DRY programming, and API/JSON fetching. The project, named "Superhero Showdown", was a web application that allowed users to engage with a short list of superheroes in a simulated battle environment. This endeavor provided a hands-on approach to learning JavaScript, HTML, and CSS. Here, I share the process and insights gained from this project.&lt;/p&gt;

&lt;h2&gt;
  
  
  Project Overview
&lt;/h2&gt;

&lt;p&gt;The end goal of the Superhero Showdown project was to create an interactive web app where users could select superheroes, view their statistics, and compare them in a very basic way. The project required front-end development using JavaScript for interactive functionality, HTML for the page's structure, and CSS for its visual design. I also dipped my toes into back-end work when fetching from the API and interacting with our .json file.&lt;/p&gt;

&lt;h3&gt;
  
  
  Data Management and Dynamic Rendering
&lt;/h3&gt;

&lt;p&gt;The first iteration of the project was going to use the free &lt;a href="https://superheroapi.com/"&gt;SuperHeroAPI&lt;/a&gt; for fetching data. After wrestling with &lt;code&gt;fetch&lt;/code&gt; and some unforeseen challenges around importing an API key, I determined it would simply be more efficient to create a local .json file from which to fetch our heroes' data.&lt;/p&gt;
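
&lt;p&gt;The fetch ended up looking something like the sketch below (the field names and &lt;code&gt;heroNames&lt;/code&gt; helper are illustrative, not the project's actual schema):&lt;/p&gt;

```javascript
// Illustrative sketch: read hero data shaped like a local db.json
// and list the hero names. The schema shown here is hypothetical.
function heroNames(data) {
  return data.heroes.map(function (hero) {
    return hero.name;
  });
}

const sampleDb = {
  heroes: [
    { name: 'Batman', strength: 26 },
    { name: 'Storm', strength: 85 }
  ]
};
console.log(heroNames(sampleDb)); // → [ 'Batman', 'Storm' ]

// In the app, the same helper would run on the fetched payload:
// fetch('db.json').then(res => res.json()).then(data => heroNames(data));
```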

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--gDN7uuy3--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8grrra4bahbbgvvkokbt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--gDN7uuy3--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8grrra4bahbbgvvkokbt.png" alt="Image description" width="800" height="776"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Interactive Features
&lt;/h3&gt;

&lt;p&gt;Incorporating interactive elements, i.e., selecting/creating a hero and applying visual effects (mouseover, mouseout) to meet deliverable requirements, represented a notable learning curve. This part of the project helped me understand the importance of scope when working with iterative programming and event listeners in JavaScript to create dynamic user experiences.&lt;/p&gt;
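
&lt;p&gt;The mouseover/mouseout pattern can be sketched like this (the &lt;code&gt;.hero-card&lt;/code&gt; selector and &lt;code&gt;highlighted&lt;/code&gt; class are placeholders of my own, not the project's real names):&lt;/p&gt;

```javascript
// Sketch of the mouseover/mouseout highlight pattern described above.
// nextHighlightState() is a pure helper, so the toggle logic is
// testable outside the browser.
function nextHighlightState(eventType) {
  return eventType === 'mouseover';
}

// Only touch the DOM when one exists (i.e. in a browser).
if (typeof document !== 'undefined') {
  document.querySelectorAll('.hero-card').forEach(function (card) {
    ['mouseover', 'mouseout'].forEach(function (type) {
      card.addEventListener(type, function (event) {
        card.classList.toggle('highlighted', nextHighlightState(event.type));
      });
    });
  });
}
```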

&lt;h2&gt;
  
  
  HTML and CSS: Structure and Design
&lt;/h2&gt;

&lt;p&gt;The project also served as a practical introduction to HTML and CSS, helping me become more comfortable with structuring web content and applying styles.&lt;/p&gt;

&lt;h3&gt;
  
  
  HTML Structure
&lt;/h3&gt;

&lt;p&gt;I furthered my understanding of organizing a webpage using basic HTML elements. While I had some conceptual knowledge of HTML, I quickly learned the difference between creating and modifying HTML, and how to expand my HTML in tandem with my JS and CSS as needed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--7t_EILK2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6l62c51e0d7pc65a1ght.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--7t_EILK2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6l62c51e0d7pc65a1ght.png" alt="Image description" width="800" height="221"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  CSS Styling
&lt;/h3&gt;

&lt;p&gt;Applying CSS was the next step. I learned about simple properties, flex, filter, grid, etc., for layout design and event-listener interactions. Major credit for the CSS refactoring goes to my collaborator on this project, Nick Gallegos.&lt;/p&gt;

&lt;h2&gt;
  
  
  Challenges and Key Learnings
&lt;/h2&gt;

&lt;p&gt;One of the key challenges I faced was managing the state of the application, particularly resetting selections after each showdown. I also went through a handful of fretful refactors and one distressing wipe-and-fresh-start of our project in order to properly implement our core "renderHero" function. While I hesitated during the reconstruction, there's no doubt I learned a great deal from my mistakes.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--3ecy6JAc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dmnwvw0gk1ti9vjlg4c6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--3ecy6JAc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/dmnwvw0gk1ti9vjlg4c6.png" alt="Image description" width="800" height="367"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;The Superhero Showdown project was a foundational experience in my coding journey, being the first time I was able to create something from nothing. It emphasized the importance of JavaScript in adding interactivity, the role of HTML in content structuring, and the impact of CSS in design. This project was instrumental in building my understanding of web application development from the ground up.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/CCONLEY-FI"&gt;Keep an eye on my journey!&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/ngallegos8"&gt;Nick Gallegos&lt;/a&gt; SuperHero-Showdown collaborator and aspiring full stack wizard!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/CCONLEY-FI/superhero-showdown"&gt;SuperHero-Showdown&lt;/a&gt;&lt;/p&gt;

</description>
    </item>
  </channel>
</rss>
