Linux: a retrospective

Adam Brown ’27

I have been using Linux for nearly five years, on my gaming desktop, my Steam Deck, and even my MacBook. What originally started as a quick way to get through virtual school while avoiding Windows’ bloat and excessive pricing quickly became a deep passion of mine, reshaping my understanding of computing.

For those unfamiliar with Linux, it was created by Linus Torvalds in 1991 as an alternative to the expensive, AT&T-owned UNIX operating system. On its own, “Linux” is only a kernel: the foundational layer connecting hardware and software. The kernel manages base-level tasks such as allocating sections of memory to programs, each precisely timed and tracked to prevent conflicts, and translating software commands into hardware commands. When a media player app opens the DVD drive, for instance, a piece of the kernel called a driver makes sure the drive opens no matter what brand or type is attached to the computer. And because a kernel alone can only do so much, Torvalds incorporated simple utilities (such as the command mkdir, which makes a new folder) from Richard Stallman’s GNU Project to create a fully featured operating system. Thus, GNU/Linux was born.

And it was born under the GNU General Public License; it was, and will forever be, free and open-source. The public can view exactly what the developers write, and hobbyists can (and often do) suggest changes to every aspect of the ecosystem.

Linux’s open philosophy quickly resulted in multiple offshoots (distros, short for “distributions”), each one suited to a unique setting. Some are ultra-lightweight, some are more user-friendly, many power enterprise servers at companies as large as Microsoft and Apple, and some are truly just for fun (most notably Hannah Montana Linux).

My Linux experiment started with Ubuntu, the “easiest” Linux distro; with its intuitive tools and desktop, it provided me—and many more newcomers—with a stable, reliable first dip of the toes into Linux. I soon started “distro-hopping” as I saw other people’s flashy setups—they were not using beginner Ubuntu, but more sophisticated distros. I experimented with numerous distros before landing on Fedora, which I used for several years.

Fedora was the perfect distro for me. Everything I needed came pre-configured, from easy installation of NVIDIA drivers to immediate support for typing Chinese with Pinyin. Most distros do not set these up automatically, and doing so by hand can be quite cumbersome. All the while, I had immediate access to the terminal, a place many users of any operating system fear for its unattractive, unintuitive, text-only interface. As the years passed, I would open the terminal for certain tasks or merely to explore, but most of the time I used intuitive graphical apps to configure my settings and install programs.

I have been viewing Linux-related content for as long as I have been using Linux itself. And in learning about the Linux environment, I would constantly hear about one distro in particular. It carried with it a profound sense of elitism: Arch Linux. It was the antithesis of user-friendliness; nearly everything, installation included, is done in the terminal, by no means a walk in the park. Yet this difficulty created a certain allure, a feeling that if I could become part of that top-tier club, I would have “made it” in Linux.

I could rise from using a measly beginner distro to a real one, and I would finally know the real Linux.

This is nonsense, of course, and so for those years, I dismissed Arch Linux as useless. Overcomplicated for no reason, I thought. I stayed comfortable, confining myself—logically—to what worked. “If it ain’t broke, don’t fix it.” Yet the possibility of joining that club planted a seed in my mind. It slowly grew, and what worked well grew stale and boring. My subconscious sought change, and I found myself looking at what other people did with Arch: their perfectly set, customized desktops and their terminals open with that beautiful Arch logo.

A few weeks ago, with some spare time on my hands, I decided to set out and see if I could actually do it.

The fabled process of installing Arch Linux starts the same as any other distro: downloading and flashing the distro’s file to a USB stick, and then booting from it. What’s different, and what makes Arch so infamous, is everything after. Rather than loading a clean, friendly, live desktop to install from, Arch brings you back to the fundamentals of Linux: the command line. Nothing more, nothing less.

A screenshot of the Arch Linux installation screen in a virtual machine – Adam Brown ’27

Staring at the blinking cursor in the terminal, I was instantly intimidated; I knew what hurdles stood between that simple text line and a finished installation. But people have done this before. And so, as I had been advised, I ventured to the ArchWiki for a comprehensive guide. It brought me through everything from setting my region and connecting to the internet to manually partitioning and formatting my hard drive. It was simple, but not easy. As I worked, every command I hand-typed had exactly the effect that the Wiki and I intended. Every setting, every program installed in those few hours of setup was there precisely because I willed it to be. Not because Microsoft was paid to put it in, not because Apple wanted me to buy into a subscription—no, I installed them out of my own decision and reason. I put them there. No one else did.
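For readers curious what those first steps actually look like, here is a rough sketch of the pre-installation commands the ArchWiki walks you through. This is an illustrative fragment, not a complete guide: the device name /dev/sda and the partition layout are examples, and running these commands will erase data on the chosen disk.

```shell
# Confirm internet connectivity from the Arch live environment
ping -c 3 archlinux.org

# Update the system clock over the network
timedatectl set-ntp true

# Partition the target disk interactively (example device: /dev/sda)
fdisk /dev/sda

# Format the new root partition and mount it for installation
mkfs.ext4 /dev/sda2
mount /dev/sda2 /mnt
```

Nothing here is automated for you; each step runs only because you typed it.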

And to emerge with even a bare-bones installation, I had to install everything manually: network and Bluetooth utilities, the kernel, the bootloader, and even the package installer itself. But in completing these tedious tasks, I learned what Linux needs to function. I wasn’t just pressing buttons; I was telling my computer what to do directly—no middleman—and seeing the results happen in real time. I read the names of the programs I installed along the way and tried to comprehend the files I edited. What I didn’t already know, I researched, either through the ArchWiki itself or by talking with an AI. With the Wiki’s guidance, I edited configuration files to name my computer and set my region, and ran base-level commands to create a user and give it a password. And at the consummation of my learning, I rebooted my computer to find a spectacular terminal login window waiting for me, from which I finally installed a desktop environment and considered my install complete.
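The configuration steps described above look roughly like this on a typical install. The hostname, time zone, user name, and disk device below are placeholders, and GRUB is shown as one common bootloader choice among several.

```shell
# Install the base system, kernel, and firmware onto the mounted disk
pacstrap /mnt base linux linux-firmware

# Generate the filesystem table, then enter the new system
genfstab -U /mnt >> /mnt/etc/fstab
arch-chroot /mnt

# Name the computer and set the region (values are placeholders)
echo "my-desktop" > /etc/hostname
ln -sf /usr/share/zoneinfo/America/New_York /etc/localtime

# Create a user and give it a password
useradd -m adam
passwd adam

# Install a bootloader so the system can start on its own
pacman -S grub
grub-install /dev/sda
grub-mkconfig -o /boot/grub/grub.cfg
```

Every line corresponds to something other operating systems do silently on your behalf.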

The beauty of this process comes in its simplicity. From its inception, Arch was created for minimalism—and it shows. Anything you do not explicitly choose to install is left out, which can easily leave you with a non-functional system if you forget a piece. But in advancing through my installation, not once did I sign into an account. Not once did I have to provide my location (it’s optional and stored locally in a simple text file), my age, or my identification, or opt into “optional” telemetry.

In today’s world, major computing companies, notably Apple and Microsoft, have made the process of installing and using an operating system easier than ever. But in the desire for convenience, today’s computer users have lost the power of self-reliance. They have ceded their freedom of choice to what Microsoft deems “fitting.” What if you want to remove Edge from your Windows installation in favor of a different browser? Too bad. Microsoft does not deem you trustworthy enough for this simple task. Arch Linux does not include a browser by default, yet it is trivially easy to install one when you invariably need to. The absence of such software requires the user to choose, and in choosing, the user comes to know their computer.

As a result, when issues arise—as they do on any system—I am better equipped to handle them. A problem doesn’t read as an empty failure, but as a callback to my own knowledge, and to the great hivemind of the internet and the work of millions over decades. By solving these problems myself, even if I simply copy a few commands that worked for someone else, I recognize phrases, perhaps a folder name for a program, or another term I came across during installation, and my knowledge grows stronger.

But why build this knowledge? It’s certainly a compelling question, and I’m not sure I can give a concrete answer—most things nowadays can be solved with an AI query anyway. Nay, the answer to this question is more majestic in scale: the hardware of your laptop, the kernel that rests on it, your desktop environment, your file explorer, and everything in between did not come into existence by chance. They are the result of the marvelous act of true human cooperation. We rest on the shoulders of giants. In a simple Linux install, I contributed hours, but so too did I harvest the hours of countless other people, the developers who have poured their lives into that unimaginable edifice of humanity that is the computer.

The farmer tenderly cultivates his crops, methodically arranging his garden, and then feasts on the food he knows blossomed from his own labor. Every seed that blossoms into a burgeoning flower, he planted; every facet of my Linux desktop comes from my own choice, learning, and input, and it provides me with a feast of accomplishment.

Yet the farmer cannot grow his crops from nothing. His shears come from China, his tractor from Indiana, and his fertilizer from Canada—these he cannot make himself. Still, he uses them to create a beautiful symphony of cooperation. I am the cultivator, and Linux is my garden.