A deep understanding of the underlying operating system concepts is essential in the DevOps field. From file systems to process management, these concepts will help you demonstrate your expertise and ace your next interview.
1. Networking:
Computer networking refers to interconnected computing devices that exchange data and share resources with each other. Begin by studying the OSI model; it will help you understand the related topics that follow.
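As a rough sketch of how one connection already touches several OSI layers (DNS at the application layer, TCP at the transport layer, IP below it), the Python snippet below resolves a hostname and opens a TCP connection. It assumes outbound network access; example.com and port 443 are placeholder values.

```python
import socket

# Resolve a hostname (application-layer DNS) and open a TCP connection
# (transport layer) over IP (network layer). "example.com" and port 443
# are placeholder values for illustration.
host, port = "example.com", 443

addr_info = socket.getaddrinfo(host, port, socket.AF_INET, socket.SOCK_STREAM)
family, sock_type, proto, _, sockaddr = addr_info[0]
print(f"{host} resolves to {sockaddr[0]}")

with socket.socket(family, sock_type, proto) as s:
    s.settimeout(5)
    s.connect(sockaddr)   # TCP three-way handshake happens here
    print(f"TCP connection established to {sockaddr}")
```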
2. POSIX Basics:
It is a family of standards for maintaining compatibility between operating systems. It describes utilities, APIs, and services that a compliant OS should provide.
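As a small illustration, Python's os and signal modules wrap several POSIX interfaces. This sketch assumes a Unix-like system; /etc/hosts is used only as an example path.

```python
import os
import signal

# Query process identity and file metadata through POSIX-specified calls.
print("PID:", os.getpid())      # getpid(2)
print("UID:", os.getuid())      # getuid(2); POSIX-only, not on Windows

info = os.stat("/etc/hosts")    # stat(2); /etc/hosts is just an example path
print("Size:", info.st_size, "Mode:", oct(info.st_mode))

# Send a signal to ourselves; SIGUSR1 is part of the POSIX signal set.
signal.signal(signal.SIGUSR1, lambda signum, frame: print("Got SIGUSR1"))
os.kill(os.getpid(), signal.SIGUSR1)
```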
3. Sockets:
It is an endpoint for a two-way communication link between two different processes on the network. The socket mechanism provides a means of inter-process communication by establishing named contact points between the client and server.
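Here is a minimal, self-contained sketch of that idea in Python: an echo server and a client talking over a loopback socket. The address 127.0.0.1 and port 50007 are arbitrary example values.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 50007   # arbitrary example address and port
ready = threading.Event()

def server() -> None:
    # The listening socket is the server's named contact point; accept()
    # returns a new socket that is one end of the two-way link.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()
        conn, _ = srv.accept()
        with conn:
            data = conn.recv(1024)
            conn.sendall(b"echo: " + data)

t = threading.Thread(target=server, daemon=True)
t.start()
ready.wait()

# The client connects to the same (host, port) endpoint and exchanges data.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))
    cli.sendall(b"hello")
    print(cli.recv(1024))   # b'echo: hello'
t.join()
```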
4. Processes:
A process is a program in execution. It generally takes input, processes it, and produces the appropriate output.
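A quick Python example makes this concrete: it spawns a child process, feeds it input, and collects its output and exit code. The standard Unix utility tr is used purely for illustration.

```python
import subprocess

# Run an external program as a child process: feed it input, capture output.
# "tr" is a standard Unix utility used purely as an example.
result = subprocess.run(
    ["tr", "a-z", "A-Z"],
    input="hello from a child process\n",
    capture_output=True,
    text=True,
    check=True,
)
print("stdout:", result.stdout.strip())
print("return code:", result.returncode)
```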
5. Startup Management (init.d):
init is the first process (PID 1) on a Linux system; other processes, services, daemons, and threads are started by it. Under SysV init, you can place your own scripts in /etc/init.d to have them run at system boot.
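On a Linux box you can check which program actually runs as PID 1 and peek at the legacy script directory. This is a Linux-only sketch and assumes /proc is mounted.

```python
from pathlib import Path

# On Linux, /proc/1/comm holds the name of PID 1 (init, systemd, ...).
pid1 = Path("/proc/1/comm").read_text().strip()
print(f"PID 1 is: {pid1}")

# Legacy SysV-style boot scripts, if present, live in /etc/init.d.
init_d = Path("/etc/init.d")
if init_d.is_dir():
    scripts = sorted(p.name for p in init_d.iterdir())
    print(f"{len(scripts)} scripts under /etc/init.d, e.g. {scripts[:5]}")
```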
6. Service Management (systemd):
systemd replaces SysV init as the first process (PID 1) and supersedes the init.d script model. You use the systemctl command to manage services and perform related operations.
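For example, a small Python wrapper around systemctl can query a unit's state. "ssh" is just an example unit name; on your distribution it may be called sshd or not exist at all.

```python
import subprocess

# Query systemd through systemctl; "ssh" is an example unit name.
unit = "ssh"

for args in (["systemctl", "is-active", unit],
             ["systemctl", "is-enabled", unit]):
    result = subprocess.run(args, capture_output=True, text=True)
    print(" ".join(args), "->", result.stdout.strip() or result.stderr.strip())
```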
7. Threads in OS:
A thread is an active entity that executes a part of a process; it is a sequential flow of tasks within that process. Threads are also called lightweight processes because they share the process's resources. A process can have multiple threads.
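A short Python sketch shows several threads updating the same counter inside one process, with a lock coordinating access to the shared resource.

```python
import threading

# Threads within one process share memory, so they can update the same
# counter; the lock coordinates access to that shared resource.
counter = 0
lock = threading.Lock()

def worker(n: int) -> None:
    global counter
    for _ in range(n):
        with lock:
            counter += 1

threads = [threading.Thread(target=worker, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print("counter =", counter)   # 40000: every thread updated shared memory
```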
8. Concurrency in OS:
Concurrency is the execution of multiple instruction sequences at the same time. It happens in the OS when several process threads run in parallel, and the OS must coordinate their execution and memory allocation.
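The sketch below simulates I/O-bound tasks with sleep; run concurrently in a thread pool, their wait times overlap instead of adding up.

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Simulate I/O-bound tasks (e.g. network calls) with sleep; running them
# concurrently lets their wait times overlap instead of adding up.
def task(i: int) -> str:
    time.sleep(1)   # stand-in for blocking I/O
    return f"task {i} done"

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=4) as pool:
    for result in pool.map(task, range(4)):
        print(result)
print(f"elapsed: {time.perf_counter() - start:.1f}s")   # ~1s, not ~4s
```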
9. I/O Management:
The OS manages various I/O devices, including mice, keyboards, touchpads, disk drives, display adapters, USB devices, display screens, etc.
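As a Linux-only illustration, the kernel exposes the block devices it manages under /sys/block, which you can list with a few lines of Python.

```python
from pathlib import Path

# Linux exposes the I/O devices it manages under /sys; this lists block
# devices and their sizes (reported in 512-byte sectors). Linux-only sketch.
for dev in sorted(Path("/sys/block").iterdir()):
    sectors = int((dev / "size").read_text().strip())
    print(f"{dev.name}: {sectors * 512 / 1e9:.1f} GB")
```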
10. Virtualization:
Virtualization is the creation of a virtual version of an OS, server, storage device, or network resource. It uses software that simulates hardware functionality to create a virtual system.
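One way to see this in practice (assuming a systemd-based Linux host) is to ask systemd-detect-virt which virtualization technology, if any, the machine is running under.

```python
import shutil
import subprocess

# systemd-detect-virt reports the virtualization technology (kvm, vmware,
# docker, ...) or "none" on bare metal. Assumes a systemd-based Linux host.
if shutil.which("systemd-detect-virt"):
    result = subprocess.run(["systemd-detect-virt"], capture_output=True, text=True)
    print("virtualization:", result.stdout.strip() or "none")
else:
    print("systemd-detect-virt not available on this host")
```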
11. Memory Management:
Memory stores instructions and the data that processes work on, and memory management is how the OS allocates, tracks, and reclaims it. Since the primary job of a computer is to execute programs, those programs, along with the data they access, must be in memory.
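On Linux you can watch the kernel's memory accounting through /proc/meminfo; this sketch parses a few of the standard fields (values are reported in kB) and is Linux-only.

```python
from pathlib import Path

# /proc/meminfo shows how the kernel is managing physical memory.
meminfo = {}
for line in Path("/proc/meminfo").read_text().splitlines():
    key, value = line.split(":", 1)
    meminfo[key] = int(value.strip().split()[0])   # values are in kB

for key in ("MemTotal", "MemFree", "MemAvailable", "Cached"):
    print(f"{key}: {meminfo[key] / 1024:.0f} MiB")
```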
12. File System:
A file is a named collection of related information recorded on secondary storage (i.e., disk). Generally, a file is a sequence of bits, bytes, lines, or records whose meaning is defined by the file's creator and user.
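A tiny Python example writes a temporary file and reads back the metadata the file system keeps about it: size, permission bits, and modification time.

```python
import os
import tempfile

# Write a small file, then read back its metadata: the bytes are the
# content, while the file system records size, permissions, and timestamps.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write("a file is a named collection of related information\n")
    path = f.name

info = os.stat(path)
print("path:", path)
print("size:", info.st_size, "bytes")
print("mode:", oct(info.st_mode))
print("modified:", info.st_mtime)

os.remove(path)
```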
Thanks for reading this.
If you have an idea and want to build your product around it, schedule a call with me.
If you want to learn more in DevOps and Backend space, follow me.
If you want to connect, reach out to me on Twitter and LinkedIn.
Top comments (2)
Spot on with networking. Sometimes that is what everyone ignores the most.
Absolutely. It's the core concept you must know to be good at debugging infrastructure issues.