
Why learn to work on the command line?

Today we are going to talk about why it is worth learning the GNU/Linux operating system, what the advantages of working at the command line are, and how all of this relates to the Unix philosophy.

In the world of desktop operating systems like Windows, the concept of a program is highly specialized. If you want to watch a video, you go and download a particular program written specifically for that purpose. If you want to listen to music, there is likewise a whole range of programs for that. In general, for any typical task, a huge number of programs have already been written and are distributed under different terms for different purposes.

As a rule, these programs are quite heavy and bring along extra integrations and features you may not need at all. Sometimes it reaches the point of absurdity, when you have to download and install gigabytes just to be able to edit text. For example, in order to show you advertisements from time to time, developers embed the code of full-fledged browsers like Chrome into their programs. Or, if a video needs to play inside an application, instead of relying on the standard programs already installed on your system, they embed a full-fledged video player inside their product, with a lot of unnecessary code that needlessly slows down your system and eats up precious memory.

The problem with this approach starts to emerge when you need to solve a fairly specific task, and you can’t find a ready-made program that implements the desired logic.

“How am I going to run Photoshop on this Linux of yours? I’ve got work to do, no time for nonsense.”

Don’t get me wrong: this approach works quite well when it comes to professional software, engineering and creative packages like AutoCAD, MATLAB, Adobe Photoshop, Blender and many others. These large and rather bloated solutions allow millions of people around the world to learn useful professions and create amazing products, spreading best practices and standardizing workflows.

At the other extreme, we would have to master programming just to implement the necessary functionality each time. That would turn into a mad waste of time for a huge number of people, forcing them to dive into areas far from their core expertise.

Between these two extremes lies an intermediate level: it lets you solve non-standard tasks, for which no ready-made program is provided, while still assembling the solution from relatively large, ready-made building blocks. We are freed from writing program code, yet we still have a fairly flexible system of convenient utilities which, combined into chains, can express the logic we need.

These building blocks are simple programs, utilities that may be fairly useless on their own but were designed from the start to be used in combination with others. The GNU/Linux operating system has about a hundred of these basic utilities, available out of the box in any typical distribution. Thousands more usually sit in the standard repositories and can be installed with a single command. Many of these utilities can run in different modes, controlled through a standardized system of flags. With this flexibility we get a truly vast field for experimentation, where your imagination is the only limit.

And the glue for these utilities is plain text, which they read from standard input, transform as part of their work, and pass on to the next utility. You get a kind of data pipeline. This simple, human-readable text is the universal communication interface.
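To make this concrete, here is a minimal sketch of such a pipeline. The log path and its layout are only assumptions made for the sake of the example; the point is the shape of the chain, where each small utility does one job and plain text flows between them.

    # Hypothetical one-liner: the ten most frequent client addresses in a web server log.
    # The log path and its layout are assumptions made for illustration.
    awk '{print $1}' /var/log/nginx/access.log |  # take the first field of every line
        sort |                                    # bring identical addresses together
        uniq -c |                                 # count the repetitions
        sort -rn |                                # order by that count, descending
        head -n 10                                # keep the top ten

Swap one link in the chain, say a grep for a particular date before the counting, and you get a different report without writing a single line of “real” code.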

If Unix were developed today, I think it would not be very popular. For example, its authors would impose an object system, as Microsoft did with PowerShell, or some serialized data format such as JSON or YAML. They would certainly use encryption and certificates to sign the data packets, and the protocol would be binary and unreadable without special utilities. And working with the file system and with the kernel from user space would feel like working with a database.

But fortunately for us, all these traditions of overcomplication came much later, and the founding fathers of Unix never went down that road. And, of course, they acted wisely.

“I’m learning Python, it’s the future, but your bash is junk.”

Such quick-and-dirty pipelines may never claim to be a complete, long-term solution, beautifully designed and optimally implemented, but all that is compensated by their ease of use: the ability to quickly sketch a rough draft and play with the data using nothing but the standard tools of the operating system. And once you have found the right approach, you can always write a Python script, if you feel like it.

This experience is especially invaluable for engineers, who often have to work under severe constraints and therefore need to use every opportunity to automate and simplify their work.

“Linux has to be filed down by hand before it looks decent. Here are my configs.”

If you’re a programmer, your world is usually limited to your one personal computer, where you can install whatever programs make your work more efficient. So you will probably have dozens of extra utilities on your machine, and the standard Bash will be replaced by the more advanced Zsh. The configuration files of many programs will be customized, and a pile of aliases for the scripts you run will form that comfortable private language only you understand. You may even use something exotic in place of a desktop, like the i3 tiling window manager, with its own settings and hotkeys.
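To give a flavor of that private dialect, here are a few lines one might find in such a ~/.bashrc or ~/.zshrc. Every name and path below is invented for illustration.

    # Purely illustrative shortcuts; all names and paths are made up.
    alias ll='ls -lah'                 # detailed directory listing
    alias gs='git status --short'      # terse git status
    alias k='kubectl'                  # one-letter shorthand for a long command
    # A tiny function: jump straight into a project directory by name.
    proj() { cd ~/projects/"$1" || return; }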

By taking this process to the limit, your virtual workplace can resemble the cockpit of a spacecraft from a sci-fi movie in terms of sophistication and technology.

“Quite right, I don’t work with fewer than three screens. And only with a mechanical keyboard.”

But what about those professionals who build and administer complex systems consisting of thousands of servers? They simply cannot install handy utilities on every machine and tweak everything to their liking. Often there is no way to install anything at all, because Internet access on a production server may be blocked entirely. And some virtual machines will be so old, or so important, that you will be afraid even to breathe on them, lest you accidentally break something.

You have to make do with what is at hand. And this is where the ability to work in a standard environment, the command line, multiplied by the enormous potential built into the operating system itself, becomes the moment of truth that distinguishes a good specialist from a mediocre one. Knowledge of fashionable new trends, of modern hacks that extend the standard utilities and supposedly make them more convenient, will not help you here; it will be useless in practice and will only distract you. The same goes for knowing the graphical desktops and how they differ from one another.

Let me repeat: a graphical desktop or phone interface is neither worse nor better than a command line interface. You just have to understand the strengths and weaknesses of both approaches and use each where it is more appropriate.

“And I work over Remote Desktop to my work machine and write code. Everything runs smoothly.”

Imagine working from home, connecting to your office computer through an encrypted VPN gateway. The company has decided that you cannot connect to the production system directly, only from your office computer’s subnet. So you connect to that computer in order to reconnect to a remote server on the Internet. But that server is only an intermediate gateway, often called a bastion host, with one network interface facing the public Internet and another facing the protected internal network where all the most important servers live. Through it, therefore, you have to reconnect once more to the server you originally wanted to work with.
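For what it is worth, modern OpenSSH makes such multi-hop routes fairly painless. A hedged sketch, with every hostname and username invented for illustration:

    # All hostnames and usernames here are made up.
    # -J (ProxyJump) chains the hops: home -> office PC -> bastion -> target server.
    ssh -J alice@office-pc.corp,alice@bastion.corp alice@db01.internal

The same route can be written down once in ~/.ssh/config with the ProxyJump option, after which a plain “ssh db01.internal” quietly follows the whole chain.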

You will agree that a graphical interface simply cannot work adequately under these conditions. It is possible to push the picture across such distances, through so many intermediate nodes, but it is very inconvenient: at best the image updates with a constant delay, at worst the connection keeps dropping. And sometimes the latency is such that it is simply impossible to work.

However, even if the bandwidth between your computer and the destination server is only a few kilobytes per second, which happens often enough, connecting to a virtual terminal over SSH can still be quite comfortable. From the command line you can make any change, diagnose and fix any problem, test an application, audit the system’s security, read the system logs, and much more. All you need to know is which commands to run and in what order. And the system will responsively do everything you ask of it, without unnecessary questions, as if you were working not on a remote server on the other side of the world but on your home workstation. That is because transmitting a short text command is many times cheaper than a continuous video stream.
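For instance, a single command like the one below, where the host and the service name are again only assumptions, crawls through all those hops in a second or two, because only a short line of text and a screenful of output cross the wire.

    # Hypothetical check: fetch the recent errors of a service on a remote machine.
    # A few kilobytes of text travel over the connection, not a video stream.
    ssh alice@db01.internal 'journalctl -u nginx --since "1 hour ago" | grep -i error | tail -n 20'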

“It’s kind of weird to stare at a black screen in the 21st century, a graphical interface is better.”

Also, unlike a graphical interface, with the command line the information you are working with is always unambiguous and focused. You always see the text exactly where you expect it, not a smear of colors with letters scattered somewhere across the screen. Of course, with some effort you can build well-designed, high-quality graphical interfaces, but they are a long way from standardization: each program often strives to provide its own unique user experience to stand out from its competitors, and its visual elements are under no obligation to match the look of the other applications running on your computer.

You can shout that this is old-fashioned, outdated and useless, but the fact remains that in the IT world, working in the terminal is the main and often the only possible way of working with complex information systems.

And, of course, no one forbids you to combine the command line with a graphical interface, opening the sessions you need in the many tabs of a virtual terminal while keeping dozens of pages of documentation open in your browser.

“Windows is enough for me. I’ve been working with it for many years and I’m fine with it. And for gaming I have a PlayStation.”

Clearly, if you use your computer for entertainment, you probably run an operating system like Microsoft Windows or macOS and are quite happy with it. And if your profession simply leaves you no other choice, you will use the software your colleagues use and have no problems at all.

However, let’s take a broader view. What if you use your computer for tasks for which there is no single, generally accepted toolkit? Imagine hundreds of servers scattered around the globe, dozens of complex systems like Kubernetes, relational and document-oriented databases, message brokers, and assorted supporting infrastructure. In complex microservice architectures consisting of hundreds of interacting services, we deal every day with a mass of abstractions and conventions that are hard enough to remember, let alone work with. Programmers, system administrators, DevOps engineers, network security specialists, data analysts: all these professions require different tools, different skills and different approaches. And among programmers themselves, some develop application software, others build websites or mobile applications; some specialize in frontend, others in backend. Nor should we forget those who work close to the hardware, writing device drivers or developing embedded systems. And these are only some of the possible directions.

Nobody is going to write graphical applications for the whole variety of tasks that arise daily for the people who create and operate such systems. On the contrary, these needs are often covered quite naturally by the interaction of the simple utilities included in the basic build of any GNU/Linux distribution.

Yes, perhaps it is not as pretty or as presentable, but from an engineer’s point of view the command line covers these needs completely and lets you avoid distractions and focus directly on the task at hand.

So the very question of “which is better” makes no sense. For the tasks that the professions mentioned above put in front of us, the command line is the best possible interface, and thanks to its simplicity it can be applied almost anywhere, even to tasks it was never designed for.

“Who even uses this operating system? Mac is better.”

Ordinary people tend to exaggerate the importance of operating systems from Microsoft and Apple, regarding them as a kind of global standard, because in their daily lives they never get a chance to see that anything else exists. Yes, today operating systems based on the Linux kernel have not managed to claim a worthy place among desktop systems, even though they are in no way inferior to them. But let us be honest and not forget to mention that, for some reason, those same corporations have completely lost the market for server solutions. Today the GNU/Linux operating system, and partly the BSD family, dominate that segment outright.

They are used today not only in classical servers housed in data centers all over the world, but also in cell phones, routers, supercomputers, all kinds of household appliances, cars, video cameras, and many other places. And I would not be surprised if even in spacecraft and lunar modules, wherever there are no special hardware requirements, things work much the same as on your home computer.

All those social networks and messengers, as well as the countless Internet resources without which we cannot imagine our lives today, almost certainly run on Unix-like operating systems. And in all these and many other applications, your knowledge will remain current, not obsolete, for a very long time to come.

“Wait, what about .NET? That’s a very popular technology.”

And yes, I know that .NET and some other stacks are actively used in the backends of many reputable companies. But this is more the exception than the rule, and global trends suggest otherwise. They have not been able to shake the dominance of platforms such as Java, and Java itself feels perfectly at home on servers running GNU/Linux with OpenJDK on board, even if the developers themselves work on a Microsoft system and have never seen Linux.

It does not matter which framework or programming language is more modern, faster or more architecturally successful, as long as large corporations still prefer Java for business reasons.

The very existence of projects such as Mono, which makes .NET development cross-platform, and WSL, which lets you run Linux inside Windows, shows how precarious Windows’ position in this market segment is today.

“But I’m just a writer, why would I even need to dive into all this?”

Even if you are not an engineer and none of this touches your work directly, the GNU/Linux operating system offers many interesting and useful tools that have not only narrowly “IT” applications but have become part of the world’s heritage. Among them are, for example, the Git version control system and the TeX typesetting system.

I want to stress once again that the so-called Unix philosophy need not be taken as the ultimate truth. It is merely an abstraction, an idea that in some cases is convenient and in others insufficient. But the fact that it has been in active use for so many years, and that no real alternative has appeared in all that time, makes it clear that it has stood the test of time and is not going anywhere.

“Linux is completely obsolete, and everything has to be rewritten. All the code in there is from the seventies, the nineties at most.”

However, among the variety of opinions you can occasionally meet the view that this whole approach is obsolete and needs to be completely overhauled. The claim goes that all these operating system utilities, such as the text editor vim, the shell bash, the systemd init system and many others (underline whichever you like), are used only because they come preinstalled in most distributions, and that is the only reason they are still popular. If it were not for this brake on progress, which leaves no chance for the younger generation in their quest to “make the world a better place”, we would long ago have been using convenient tools instead of being held hostage by this historical junk.

Unfortunately, this attitude surfaces often enough in discussions that the belief is clearly planted firmly in many people’s minds. Nothing is easier than tearing a statement out of its historical context and elevating it to an absolute. Well, let me continue this line of reasoning: the Linux kernel itself is completely outdated, if only because it is monolithic. Let us rewrite it too and switch to a more “advanced” microkernel architecture, as was done, for instance, in the Minix operating system.

Let us also not forget that the processor architecture we use today is equally obsolete, since it drags along decades of backward compatibility with technologies that no longer exist. Let us urgently create a new architecture free of all this legacy and use it exclusively. Then let’s train millions of specialists to deploy it in every production and server facility around the world. And we must not forget to retool the factories for the new manufacturing process, because the entire global server fleet has to be replaced in the shortest possible time.

All this will require, at best, hundreds of billions of dollars and years of international effort. But what is that compared to the opinion of some experts, really?

Of course, new operating systems, programming languages, tools and libraries must be developed. Progress should by no means be stopped. New and progressive trends, however weak they may be at their birth, have every chance to grow stronger and replace obsolete approaches and practices. But it would be very strange to expect that what is in use now will vanish from history without a trace. More likely, elements of the ideas and code we use today will survive as numerous vestiges and anachronisms that future generations will be forced to live with, including the technologies we now consider advanced and most successful, which will in turn become obsolete in new historical realities.

So while someone else waits and suffers, we will study and use what we have, understanding that behind all these technologies lie decades of hard evolutionary work, the labor and effort of millions of people. Among them are good and bad professionals making good and bad decisions. Some of them toil under tight constraints, following a corporate plan they have no say in; others are free artists spending their spare time on what genuinely interests them.

So let us not forget the principle of historicism and base our careers and lives on this understanding. And let us also remember that if we can achieve anything, it is only because we stand, without a doubt, on the shoulders of giants.