My DevOps Journey: Insights and Recommended Resources
As a professional navigating the ever-evolving landscape of technology, my journey into DevOps has been both enlightening and transformative. This week, I focused on refining my understanding of the Software Development Life Cycle (SDLC), exploring virtualization tools, learning core networking concepts, and diving deep into the Linux operating system.
Key Learnings from This Week
Software Development Life Cycle (SDLC):
What is SDLC?
A process for planning, creating, testing, and deploying an information system.
SDLC Phases
- Planning
- Requirement Analysis
- Designing
- Implementation
- Testing
- Deployment
Planning
- The purpose of the application.
- Details about the end users of the product.
- Key design elements, such as the application's format and attributes.
- The overall user-interface design of the software.
Requirement Analysis
- Detailed information about each element needed to design the software.
- Validating that each element of the application meets the requirements.
- Defining security protocols and performing risk analysis for the application.
- All details are recorded in the Software Requirement Specification (SRS) document.
Designing
- Devise the system design following the SRS Document.
- Review the overall software architecture and its feasibility against the requirements.
- All details are added to the Design Document Specification (DDS) and shared with analysts.
Implementation / Coding
- Developers start writing code in the languages chosen for the software development.
- Implement the software product.
- Developers use pre-defined guidelines and development tools to implement the code.
Testing
- Development software is deployed in multiple test environments to check the functioning of all the attributes in the software architecture.
- If the testing team finds errors or bugs, they send the software back to the development team.
- This testing process continues until the software is stable and functions as expected.
Deployment / Maintenance
- The software application is ready for deployment and consumer use.
- The development team sets up access links so the application is available to users.
- The application is debugged and maintained regularly as bugs are found.
Virtualization:
What is Virtualization?
Virtualization allows multiple virtual instances (VMs) to run on a single physical machine, maximizing resource utilization and efficiency.
It abstracts hardware resources, enabling easier management and scalability of IT infrastructure.
Types of Virtualization:
Hardware Virtualization: Uses a hypervisor to create and manage VMs that share physical hardware resources.
Operating System Virtualization: Runs multiple isolated user-space instances (containers) on a single OS kernel.
Storage Virtualization: Abstracts logical storage from physical storage devices, improving flexibility and scalability.
Benefits of Virtualization:
Resource Efficiency: Consolidates hardware, reducing costs and energy consumption.
Scalability: Easily scale infrastructure up or down based on demand.
Isolation: Provides strong isolation between VMs and applications, enhancing security.
Networking Core concepts:
Here are simplified definitions of the networking concepts you need to know as a DevOps engineer. They provide a foundational understanding of networking, security, protocols, and operating systems, suitable for beginners.
Subnets: Subnets are subdivisions of a larger network that allow for better organization and management of network resources. They help in optimizing traffic and security within a network.
Public/Private Network: A public network is accessible to anyone, typically over the Internet. A private network is restricted to specific users or devices and is often used within organizations for internal communications.
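To make the public/private distinction concrete, here is a minimal shell sketch that classifies an IPv4 address against the three RFC 1918 private ranges (it deliberately ignores loopback and other special-use ranges):

```shell
# Classify an IPv4 address as private (RFC 1918) or public.
# A minimal sketch: only the three RFC 1918 blocks are checked.
is_private() {
  case "$1" in
    10.*)                                   echo private ;;  # 10.0.0.0/8
    192.168.*)                              echo private ;;  # 192.168.0.0/16
    172.1[6-9].*|172.2[0-9].*|172.3[0-1].*) echo private ;;  # 172.16.0.0/12
    *)                                      echo public ;;
  esac
}

is_private 192.168.1.10   # private
is_private 8.8.8.8        # public
```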
CIDR Notation: Classless Inter-Domain Routing (CIDR) notation is a compact representation of an IP address and its associated network mask. It allows for more flexible allocation of IP addresses.
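A quick way to internalize CIDR notation is to do the arithmetic yourself. The snippet below is a small illustration: a /24 prefix leaves 32 - 24 = 8 host bits, giving 2^8 - 2 = 254 usable addresses (the network and broadcast addresses are reserved):

```shell
# CIDR arithmetic in plain shell.
prefix=24
host_bits=$((32 - prefix))
usable=$(( (1 << host_bits) - 2 ))   # subtract network + broadcast
echo "192.168.1.0/$prefix -> $usable usable hosts"
```

The same formula explains why a /16 gives 65,534 hosts and a /30 point-to-point link gives only 2.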
Static/Dynamic IPs: Static IP addresses remain constant and are manually configured for a device or network interface. Dynamic IP addresses are assigned automatically by a DHCP server and may change over time.
Firewall: A firewall is a network security system that monitors and controls incoming and outgoing network traffic based on predetermined security rules. It acts as a barrier between a trusted internal network and untrusted external networks.
Proxy: A proxy server acts as an intermediary between client devices and other servers. It facilitates indirect network connections and can enhance security, privacy, and performance.
NAT (Network Address Translation): NAT is a technique that modifies IP address information in IP packet headers while they are in transit across a router or firewall, often used to conserve IP addresses or connect private networks to the internet.
Public & Private DNS: Public DNS servers translate domain names (like www.example.com) to IP addresses. Private DNS servers perform the same function but are used within private networks and are not accessible from the public internet.
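You can watch name resolution happen locally with `getent`, which consults `/etc/hosts` and DNS in the order configured in `/etc/nsswitch.conf`, so the example below needs no network access for a locally defined name like `localhost`:

```shell
# Resolve a hostname through the system's configured resolver chain.
# For "localhost" this is answered from /etc/hosts, not a DNS server.
getent hosts localhost
```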
VPN (Virtual Private Network): A VPN extends a private network across a public network (like the internet), enabling users to send and receive data securely as if their devices were directly connected to the private network.
HTTP (Hypertext Transfer Protocol): HTTP is the foundation of data communication for the World Wide Web. It defines how messages are formatted and transmitted, allowing web browsers to retrieve web pages from servers.
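Since HTTP is plain text, you can write a complete request by hand. This sketch prints the bytes a client would send for a minimal GET (www.example.com is just a placeholder host):

```shell
# A minimal HTTP/1.1 request: a request line, header lines, and a
# blank line that terminates the headers. Lines end in CRLF (\r\n).
request=$(printf 'GET / HTTP/1.1\r\nHost: www.example.com\r\nConnection: close\r\n\r\n')
printf '%s\n' "$request"
```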
HTTPS (Hypertext Transfer Protocol Secure): HTTPS is the secure version of HTTP. It encrypts data sent and received between a user's browser and a website, providing a higher level of security against eavesdropping and tampering.
FTP (File Transfer Protocol): FTP is a standard network protocol used to transfer files between a client and a server on a computer network. It operates over TCP/IP and supports both anonymous and authenticated transfers.
TCP (Transmission Control Protocol): TCP is a connection-oriented protocol that provides reliable and ordered delivery of data packets between devices over an IP network. It manages the establishment and termination of a connection.
SSL/TLS (Secure Sockets Layer/Transport Layer Security): SSL and TLS are cryptographic protocols that provide secure communication over a computer network. They encrypt data transmitted between a client (e.g., web browser) and a server (e.g., website).
SSH (Secure Shell): SSH is a cryptographic network protocol used for secure remote access to a computer or server. It provides a secure channel over an unsecured network, typically for command-line access or file transfer.
SMTP (Simple Mail Transfer Protocol): SMTP is a protocol used for sending email messages between servers. It operates on TCP port 25 and handles the transmission of emails over the Internet.
Reverse Proxy: A reverse proxy is a server that sits between client devices and backend servers. It intercepts requests from clients and forwards them to the appropriate server. It can provide load balancing, caching, and security benefits.
Forward Proxy: A forward proxy is a server that sits between client devices and the internet. It handles requests from clients to external servers on their behalf, providing anonymity, security, and content filtering.
I/O Management: Input/Output (I/O) management refers to the process of controlling data transfer between a computer system and external devices (such as storage drives, network interfaces, and peripherals).
File System: A file system is a method and data structure used by an operating system to store, retrieve, and organize files on storage devices such as hard drives, SSDs, and external drives.
Thread and Process Management: Thread and process management involves managing the execution of processes and threads within an operating system. It includes scheduling, synchronization, and resource allocation to optimize performance and efficiency.
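The shell itself is a handy place to see basic process management. This sketch starts two background child processes, records their process IDs, and blocks until both finish:

```shell
# Launch two children in the background; $! holds the PID of the
# most recently started background process. wait blocks until both exit.
sleep 1 & pid1=$!
sleep 1 & pid2=$!
wait "$pid1" "$pid2" && echo "both children finished"
```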
Linux:
What is Linux: A widely used open-source operating system kernel that forms the core of various Linux distributions (like Ubuntu, CentOS, RedHat, Debian, etc.). It provides the foundation for running applications and managing system resources on servers and other computing devices.
For DevOps: In the context of DevOps, Linux serves as the preferred operating system due to its flexibility, scalability, and robustness in handling tasks such as automation, deployment, configuration management, and monitoring. It offers a rich ecosystem of tools and utilities (like shell scripting, package managers, and command-line interfaces) for efficiently managing infrastructure, deploying applications, and orchestrating workflows.
Most Widely used Linux Flavours:
Ubuntu
CentOS
RHEL (Red Hat Enterprise Linux)
File system hierarchy
- / - The top-level (root) directory
- /root - The home directory for the root user
- /home - The home directories for other users
- /boot - Contains bootable files for Linux
- /etc - Contains all configuration files
- /usr - Software is installed in this directory by default
- /bin - Contains commands used by all users
- /sbin - Contains commands used only by the root user
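A quick sanity check of the hierarchy above on any Linux box (directories absent on a given system are simply skipped):

```shell
# List which of the standard top-level directories exist on this system.
for d in / /root /home /boot /etc /usr /bin /sbin; do
  [ -e "$d" ] && echo "$d is present"
done
```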
The Basic Commands I Learned in Linux:
- cat (create & append to a file; print its contents)
- touch (create a blank file)
- nano (create & edit a file)
- vi/vim (create & edit a file)
- ls (list; options like -a, -la)
- cd (change directory)
- pwd (print working directory)
- mkdir (create one or more directories)
- cp (copy)
- mv (move / rename)
- rm (remove a file)
- tree (show directory contents as a tree)
- rm -rf (remove a directory recursively, without prompting)
- grep (search text for a pattern & print matches)
- less (page through output)
- head (show the first 10 lines by default)
- tail (show the last 10 lines by default)
- sort (sort lines alphabetically/numerically)
- tar (to pack files into an archive)
- gzip (to compress)
- wget (to download)
- File/Directory Permissions:
- chmod (change permissions)
- chown (change owner)
- chgrp (change group)
- hostname (to see the hostname)
- ifconfig (to get the IP address)
- cat /etc/os-release (to get the OS version)
- apt-get install package_name (to install a package)
- sudo (to run commands with root privileges)
- whoami (to see the current user)
- awk (pattern scanning & text processing)
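To tie several of these commands together, here is a short practice session run in a throwaway temporary directory, so nothing on the real system is touched (the file names are made up for the exercise):

```shell
# Practice session exercising mkdir, touch, cat, cp, mv, grep,
# chmod, tar/gzip, and rm -rf, all inside a fresh temp directory.
cd "$(mktemp -d)"
mkdir -p project/docs                 # mkdir: create nested directories
touch project/docs/notes.txt          # touch: create an empty file
echo "hello devops" > project/docs/notes.txt
cat project/docs/notes.txt            # cat: print file contents
cp project/docs/notes.txt backup.txt  # cp: copy
mv backup.txt backup-old.txt          # mv: rename
grep devops project/docs/notes.txt    # grep: search text for a pattern
chmod 600 project/docs/notes.txt      # chmod: owner read/write only
tar -czf project.tar.gz project       # tar + gzip: pack and compress
ls -la                                # ls: list what we created
rm -rf project                        # rm -rf: remove the tree
```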
Recommended Resources
As I continue this journey, I find that sharing knowledge is invaluable. Below are resources that I recommend for anyone interested in pursuing or deepening their understanding of DevOps:
YouTube channels for DevOps:
Train with Shubham:
DevOps Sheet with free Resources
Abhishek Vermella:
DevOps 45 Days Full course for free
Cloud Champ:
DevOps Roadmaps with free Resources, links, and books
M Prashant:
Conclusion
Reflecting on my DevOps journey this week has reiterated the importance of continual learning and adaptation. As I integrate these insights into my professional practice, I hope others can find inspiration and guidance from my experiences and the resources I recommend. DevOps is not merely a set of practices; it is a culture that drives innovation and efficiency in our increasingly digital world. Embracing this journey can undoubtedly pave the way for a fruitful career in technology.
Top comments (1)
Ubuntu is really moving away from being "Linux" as we know it... and has become slow and bloated. So a lot of people are moving on to Arch. Or its "easy-install-version" EndeavourOS. In Arch, instead of "apt" to install community programs, you use "pacman -Sy" or "yay -Sy" or "yay -Ss" to search.
As for the SDLC... it rarely happens like that. It's usually an iterative process that is ongoing, until one day the app is decommissioned. And what I have seen break more apps or services, is that on a certain day, a certain invoice doesn't get paid... and the app along with all its backups disappear. So... stay sharp!
Also... use cloud as a frontend or WAF and self host! A beginner friendly start, I've heard, is yunohost. Wow, I just had a look and it really does everything for you... but I guess you can look around the filesystem, and eventually try the expert mode install. And later just install a python app, or maybe nodejs app, or if you have hair on your teeth, apache or nginx... and also play around with wireshark, on your local machine. And then graduate to doing docker... and then kubernetes. And then managing multiple servers. But rule number one is monitoring... and you don't need big nasty multi dependency systems, you can literally just sign up for onlineoronot or something like that, or run curl myurl || (echo Oh no | email myadmin@me.com) in a loop... Boom... you're an expert now :-p
But daily drive Linux... it will teach you more valuable skills than anything else.