<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Hosea Ngeywo</title>
    <description>The latest articles on DEV Community by Hosea Ngeywo (@hillycyb3r).</description>
    <link>https://dev.to/hillycyb3r</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F540608%2Ff1884bdd-f29d-43be-b5f9-c370c9874a0c.jpg</url>
      <title>DEV Community: Hosea Ngeywo</title>
      <link>https://dev.to/hillycyb3r</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/hillycyb3r"/>
    <language>en</language>
    <item>
      <title>How To Avoid Online Scams - Tips and Tricks.</title>
      <dc:creator>Hosea Ngeywo </dc:creator>
      <pubDate>Tue, 22 Nov 2022 17:37:55 +0000</pubDate>
      <link>https://dev.to/hillycyb3r/how-to-avoid-online-scams-tips-and-tricks-5798</link>
      <guid>https://dev.to/hillycyb3r/how-to-avoid-online-scams-tips-and-tricks-5798</guid>
      <description>&lt;p&gt;While most of us are well aware of the online scam, surprisingly we still fall prey to them. You only need to invest a few minutes on the internet before you realize that there are some really bad people out there who will try anything to make a quick buck on your money. These scammers have learned to be very smart and cunning. Therefore, in this article I will talk about something that might interest you: How to prevent yourself from scams and what you can do while using the internet?&lt;/p&gt;

&lt;p&gt;These days, there are many scams online. With all the information at our fingertips, it's easy to get sucked in by a clever scammer. Here are some tips on how to avoid getting scammed.&lt;/p&gt;

&lt;h2&gt;
  
  
  1. Use a good antivirus program.
&lt;/h2&gt;

&lt;p&gt;Antivirus software is essential for protecting your computer from viruses and other malicious software. Keep it up to date so it can recognize the latest threats.&lt;/p&gt;

&lt;h2&gt;
  
  
  2. Don't click on links from strangers.
&lt;/h2&gt;

&lt;p&gt;The first rule of online security is to be wary of what you click. Never click a link in an email unless you were expecting that email to arrive.&lt;/p&gt;

&lt;p&gt;Don't click on links from strangers or random websites that appear in your email inbox or on social media sites like Facebook or Twitter. If you're not sure where the link leads, don't click on it!&lt;/p&gt;

&lt;h2&gt;
  
  
  3. Avoid opening attachments.
&lt;/h2&gt;

&lt;p&gt;Be careful when opening attachments in emails or messages, especially when the message asks for personal information such as passwords or credit card numbers.&lt;/p&gt;

&lt;h2&gt;
  
  
  4. Trust your instincts.
&lt;/h2&gt;

&lt;p&gt;If a deal sounds too good to be true, it probably is. Trust your instincts and do your research. Promotional links often circulate through WhatsApp groups, purporting to come from well-known companies promising gifts to their customers. Before clicking such a link, check the company's official website to confirm that the promotion actually exists.&lt;/p&gt;

&lt;p&gt;In conclusion, it is always a good idea to approach websites with a healthy dose of skepticism. Cyber criminals are becoming increasingly sophisticated at tricking people into handing over personal information or financial and credit card details. Never assume you are completely safe from online scams simply because you have never been scammed before. The best way to protect yourself is to learn the techniques scammers use and to warn others.&lt;/p&gt;

</description>
      <category>onlinefraud</category>
      <category>scam</category>
      <category>onlinescam</category>
      <category>phishing</category>
    </item>
    <item>
      <title>DNS Enumeration Part 1(HOST)</title>
      <dc:creator>Hosea Ngeywo </dc:creator>
      <pubDate>Sun, 20 Nov 2022 05:37:53 +0000</pubDate>
      <link>https://dev.to/hillycyb3r/dns-enumeration-part-1host-33l4</link>
      <guid>https://dev.to/hillycyb3r/dns-enumeration-part-1host-33l4</guid>
      <description>&lt;h2&gt;
  
  
  Key Questions
&lt;/h2&gt;

&lt;p&gt;What is DNS enumeration? How does it differ from more common forms of reconnaissance? What tools can you use to perform DNS (Domain Name System) enumeration? I'll answer these questions and more in this article.&lt;/p&gt;

&lt;h2&gt;
  
  
  Definition
&lt;/h2&gt;

&lt;p&gt;Although the Internet is more robust than any other form of communication, a few basic techniques can reveal its structure and elements. While these techniques let administrators gather information about an organization, attackers can use them just as well.&lt;/p&gt;

&lt;p&gt;DNS enumeration is one of these techniques. 'DNS' stands for Domain Name System, the service responsible for translating hostnames (such as akarns.com) into IP addresses. For example, when your computer needs to reach akarns.com, it asks a DNS server for the domain's A record, which maps the hostname to an IPv4 address. DNS servers also hold other record types (name servers, mail exchangers, text records, and so on) that reveal how a domain is set up and what infrastructure sits behind it.&lt;/p&gt;

&lt;h2&gt;
  
  
  DNS Enumeration Tools
&lt;/h2&gt;

&lt;p&gt;There are numerous tools for performing DNS enumeration, and which one to use comes down to the attacker's preference. Many prefer automated tools, but in this discussion I will focus on manual command-line tools to better illustrate how they work. This article covers the following tools: &lt;strong&gt;Host&lt;/strong&gt;, &lt;strong&gt;nslookup&lt;/strong&gt;, and &lt;strong&gt;dig&lt;/strong&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Host
&lt;/h2&gt;

&lt;p&gt;The host command is a simple command-line utility for performing DNS lookups. It can convert domain names to IP addresses and resolve IP addresses back to domain names.&lt;/p&gt;

&lt;p&gt;The syntax to perform host enumeration is:&lt;br&gt;
&lt;code&gt;host domain_name&lt;/code&gt;&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--iFdNu457--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xg54e0jibgjvecp81jcx.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--iFdNu457--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xg54e0jibgjvecp81jcx.PNG" alt="Image description" width="680" height="125"&gt;&lt;/a&gt;&lt;br&gt;
Running the command above returns the IP address of the domain.&lt;br&gt;
&lt;em&gt;&lt;strong&gt;Name servers&lt;/strong&gt;&lt;/em&gt;&lt;br&gt;
To find the name servers a domain runs on, add the &lt;strong&gt;-t&lt;/strong&gt; flag followed by &lt;strong&gt;ns&lt;/strong&gt; (name server). The &lt;strong&gt;-t&lt;/strong&gt; denotes the record type.&lt;br&gt;
&lt;code&gt;host -t ns example.com&lt;/code&gt;&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--poKCPcwe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0hrnj77sxykv6o3r27vq.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--poKCPcwe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0hrnj77sxykv6o3r27vq.PNG" alt="Image description" width="695" height="113"&gt;&lt;/a&gt;&lt;br&gt;
&lt;strong&gt;&lt;em&gt;Mail server&lt;/em&gt;&lt;/strong&gt;&lt;br&gt;
To get the mail servers, pass &lt;strong&gt;mx&lt;/strong&gt; to the &lt;strong&gt;-t&lt;/strong&gt; flag.&lt;br&gt;
&lt;code&gt;host -t mx example.com&lt;/code&gt;&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--GBvCP-6V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kya0xxbo1csdf8c1gj6w.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--GBvCP-6V--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kya0xxbo1csdf8c1gj6w.PNG" alt="Image description" width="645" height="104"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;&lt;strong&gt;Reverse lookup&lt;/strong&gt;&lt;/em&gt;&lt;br&gt;
When you only have an IP address and don't know the domain, a reverse lookup is the best method to use. If the output doesn't reveal a domain, a web-based tool such as shodan.io can help.&lt;br&gt;
&lt;code&gt;host ip_address&lt;/code&gt;&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8--NK-Vk--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/bed01u0qyhwt09deeim8.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8--NK-Vk--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/bed01u0qyhwt09deeim8.PNG" alt="Image description" width="647" height="120"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the next article, I will cover &lt;strong&gt;nslookup&lt;/strong&gt; and &lt;strong&gt;dig&lt;/strong&gt; commands. &lt;br&gt;
Thanks for reading; I look forward to your feedback.&lt;/p&gt;

</description>
      <category>dnstool</category>
      <category>dnsenumeration</category>
      <category>host</category>
      <category>reverselookup</category>
    </item>
    <item>
      <title>What is a Robots.txt file?</title>
      <dc:creator>Hosea Ngeywo </dc:creator>
      <pubDate>Sat, 19 Nov 2022 07:58:10 +0000</pubDate>
      <link>https://dev.to/hillycyb3r/robotstxt-file-mf6</link>
      <guid>https://dev.to/hillycyb3r/robotstxt-file-mf6</guid>
      <description>&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--KUEN85P---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/qfcwht22hs4cowlei22h.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--KUEN85P---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/qfcwht22hs4cowlei22h.jpg" alt="Image description" width="720" height="480"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;A robots.txt file is a plain-text file placed at the root of a website that tells web crawlers which URLs they may or may not request. For example, you may want to keep search engines from indexing administrative pages, duplicate content, or other sections that add no value in search results.&lt;/p&gt;

&lt;p&gt;A robots.txt file instructs web crawlers on which parts of a site to crawl, which sections to ignore, and which to follow. By controlling which sections are crawled and indexed by a web crawler, you can influence what content appears in search results for your site.&lt;/p&gt;

&lt;h2&gt;
  
  
  Enumerating robots.txt files
&lt;/h2&gt;

&lt;p&gt;To locate a website's robots.txt file, append /robots.txt to the site's base URL, for example &lt;a href="http://your_domain/robots.txt"&gt;http://your_domain/robots.txt&lt;/a&gt;. If the website uses robots.txt to instruct web crawlers, the output will look like this.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;User-Agent: *
Allow: /admin/
Disallow: /
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;From the output above, it is clear the web crawler has been instructed to crawl only the files under the &lt;strong&gt;/admin/&lt;/strong&gt; directory and ignore the rest.&lt;br&gt;
Using the information gathered from the robots.txt file the attacker can visit &lt;a href="https://your_domain/admin/"&gt;https://your_domain/admin/&lt;/a&gt; to find more information that can aid in achieving malicious intentions.&lt;br&gt;
In the scenario above, the attacker may be directed to the admin panel's login page. This poses a critical security risk, since attackers will try every means available to break into the admin panel.&lt;/p&gt;
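A quick way to review these directives is to fetch the robots.txt (for example with curl) and filter out just the Allow/Disallow lines. A minimal sketch, using a local sample file in place of a live fetch (the file path and rules below are illustrative, not from the article):

```shell
# Recreate the sample robots.txt locally so the sketch needs no network.
printf 'User-Agent: *\nAllow: /admin/\nDisallow: /\n' > /tmp/robots_sample.txt

# Keep only the directive lines a crawler (or an attacker) cares about.
grep -E '^(Allow|Disallow):' /tmp/robots_sample.txt
```

Against a live site you would replace the printf line with something like `curl -s http://your_domain/robots.txt`.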
&lt;h2&gt;
  
  
  Impact
&lt;/h2&gt;

&lt;p&gt;By itself, a robots.txt file poses no security risk. However, if its contents reveal sensitive paths, attackers can take advantage of them to reach unauthorized web directories. It is always advisable not to list important files or directories in the robots.txt file.&lt;/p&gt;
&lt;h2&gt;
  
  
  Prevention
&lt;/h2&gt;

&lt;p&gt;To keep sensitive paths out of the robots.txt file altogether, developers can use the &lt;strong&gt;X-Robots-Tag&lt;/strong&gt; header with appropriate values to instruct web crawlers whether to index files or directories. The &lt;strong&gt;X-Robots-Tag&lt;/strong&gt; is placed in the response header section.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;X-Robots-Tag: googlebot: nofollow
X-Robots-Tag: otherbot: noindex, nofollow

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the example above, Googlebot is told not to follow links on the page, while all other bots are told neither to index the page nor to follow its links.&lt;/p&gt;
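How the header actually gets attached depends on the web server. As one illustration (not from the original article), with Apache and mod_headers enabled you could add a line like the following to the site configuration or an .htaccess file:

```apacheconf
# Send X-Robots-Tag on every response served from this configuration scope.
Header set X-Robots-Tag "noindex, nofollow"
```

On nginx the equivalent would be `add_header X-Robots-Tag "noindex, nofollow";`. Either way, these directives assume the respective header module is available on your server.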

&lt;p&gt;Let's take a practical look at the following scenario...&lt;br&gt;
Suppose users can upload images to a website, and some of those uploads contain adult content. The site's developers will want to keep the affected pages out of search results, which they can do by serving those pages with an &lt;strong&gt;X-Robots-Tag&lt;/strong&gt; header of "noindex, nofollow".&lt;/p&gt;

&lt;p&gt;Thanks for reading. Looking forward to your feedback. &lt;/p&gt;

</description>
      <category>cybersecurity</category>
      <category>enumeration</category>
      <category>activerecon</category>
      <category>webcrawlers</category>
    </item>
    <item>
      <title>Understanding .DS_Store File Part 1</title>
      <dc:creator>Hosea Ngeywo </dc:creator>
      <pubDate>Fri, 09 Sep 2022 10:19:38 +0000</pubDate>
      <link>https://dev.to/hillycyb3r/understanding-dsstore-file-part-1-7o2</link>
      <guid>https://dev.to/hillycyb3r/understanding-dsstore-file-part-1-7o2</guid>
      <description>&lt;p&gt;&lt;strong&gt;Do you own a MacBook?&lt;/strong&gt; &lt;br&gt;
If yes, then you must be careful because you could be exposing critical information about your website or applications through '&lt;strong&gt;.DS_Store&lt;/strong&gt;' files. DS refers to Desktop Services.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What are DS_Store files?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;A '.DS_Store' file is a hidden metadata file that the macOS (Mac OS X) Finder creates in every folder you browse. It stores folder-specific settings such as icon positions, view options, and window layout. Crucially, it also records the names of the files in that folder, which is what makes it interesting to attackers.&lt;/p&gt;

&lt;p&gt;Finder reads and updates .DS_Store files transparently as you create, move, rename, copy, or delete files and folders, so most Mac users never notice they exist. Because the files are hidden, they are also easy to copy to a web server by accident when deploying a site.&lt;/p&gt;

&lt;p&gt;Each .DS_Store file is tied to the folder it lives in. If one is exposed on a web server, its contents are exploitable: an attacker can parse it to recover the names of files that were never meant to be public.&lt;/p&gt;

&lt;p&gt;In other words, an exposed .DS_Store file hands attackers (and security researchers, popularly known as hackers) a ready-made listing of a directory's contents, which can reveal backups, configuration files, or hidden endpoints.&lt;/p&gt;
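Since Finder recreates these files constantly, a common precaution is to strip them from a project before publishing it. A minimal sketch (the directory names below are illustrative):

```shell
# Set up a throwaway project tree containing stray .DS_Store files.
mkdir -p /tmp/site_demo/assets
touch /tmp/site_demo/.DS_Store /tmp/site_demo/assets/.DS_Store

# Delete every .DS_Store file under the project root before deploying.
find /tmp/site_demo -name '.DS_Store' -type f -delete

# Count what is left; this should print 0 when the cleanup worked.
find /tmp/site_demo -name '.DS_Store' | wc -l
```

The same find command run against your real project root achieves the cleanup; adding .DS_Store to your deployment ignore list prevents the files from coming back.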

&lt;p&gt;Stay tuned for part 2 on how DS_Store files are exploited.&lt;/p&gt;

</description>
      <category>ctf</category>
      <category>macfiles</category>
      <category>cybersecurity</category>
    </item>
  </channel>
</rss>
