Why I Use Ubuntu

If you are reading this, it is because I finally got around to taking Scott up on his request for Ubuntu content. I am not a well-versed Linux user, and over the years I have had a love/hate relationship with Linux and the open source community. My beef is not with proprietary software so much as with outrageous pricing and having no access to the code to make it work for you. I don't believe Microsoft or Apple are 100% evil, nor do I think Linux or open source is always the best solution for the job - nor do I think Microsoft or Apple are always the best solution. I am just a basic user.

I have an older slim desktop that runs Ubuntu server with a SAMBA share for my personal content storage. Aside from that I dual boot my Vista laptop with 32-bit Ubuntu. So why Ubuntu on my desktop? Well, for that we need to go back in time a bit.

My first introduction to Linux, back in the '90s, came from a friend who helped me set up an early Red Hat install and then attempted to teach me how to use it. It didn't take long for me to hate it. It was during the early days of RPMs, and although I saw the potential, I was annoyed that it was not as easy as a Windows .exe file. My experience with RPMs was that there were always dependencies you never discovered until you were mid-install. It seemed like I had to uncompress files twice and then compile the application, and it was more than I could easily do or remember. Add to that the constant frustration of not knowing how to do things or how to fix what wouldn't work, and for a non-engineer Linux newbie, it was the beginning of my love/hate relationship with Linux.

After a while I started wanting to play with Linux again, a pattern that would repeat itself over the years. I don't know exactly when, but the same friend who first attempted to help me with Red Hat mentioned some "apt-get" system that would check whether you needed any dependencies before attempting to install the application. Not only that, but it would ask if you wanted them installed (DUH!) and would install them ahead of the application if you said yes. This sounded much better than the RPM hell I had experienced. At some point I remember trying Debian - again with said friend's help. About the only thing I remember was that the installer was not that friendly and there were something like 7 freaking install disks! I don't remember at what point in my Linux journey I tried Debian, but it was short-lived. Not because of "apt-get" but again because of the frustration of not being able to fix things or find my way around, from a lack of understanding. To be honest, there really was not a good desktop Linux installation available at the time.
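What hooked me is simple to picture. Here is a toy Python sketch (the package names and the little index are entirely made up) of the thing apt-get did that my early RPM experience didn't: walk the whole dependency graph up front, then install dependencies before the package that needs them.

```python
# Toy model of apt-get style dependency resolution. INDEX is a stand-in
# for the package database; the names are hypothetical.
INDEX = {
    "gnucash": ["guile", "slib"],
    "guile": ["libc"],
    "slib": [],
    "libc": [],
}

def resolve(package, index, seen=None):
    """Return an install order for `package`, dependencies first."""
    if seen is None:
        seen = set()
    order = []
    for dep in index[package]:
        if dep not in seen:
            order += resolve(dep, index, seen)
    if package not in seen:
        seen.add(package)
        order.append(package)
    return order

# The whole plan is known (and can be shown to the user) before
# anything is actually installed.
print(resolve("gnucash", INDEX))  # ['libc', 'guile', 'slib', 'gnucash']
```

The real tool obviously does far more (versions, conflicts, downloading), but that "compute the plan, then ask" shape is the part that felt so different from chasing RPM dependencies by hand.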

Somewhere along the way I developed a friendship with a couple of LAN gamers who were into Linux as well, and through them I was introduced to SUSE Linux. I think my love/hate SUSE journey spanned versions 6 - 8, maybe even 5. Using SUSE Linux was probably a better experience than I had ever had before, but it was also a whole new hell that ultimately caused me to stop using it. I would read about or stumble upon some cool-sounding application and want to use it, but I couldn't find a SUSE-specific installer, or whoever had been maintaining the SUSE package had stopped. I think the final "straw" was when I wanted to try GnuCash. I had read some really cool things about it and wanted to try some of the features, but the newer version was not being maintained for SUSE, and since I did not know how to compile it from source, I gave up in frustration. OpenOffice.org was another issue. It came pre-installed, but I always seemed to have problems updating to the most recent versions.

I remember along the way (maybe even before SUSE) looking at, and even forcing a hard drive install of, Knoppix, which was interesting, and being reintroduced to "apt-get" gave me the warm fuzzies again. But like all the other times before, I gave up out of frustration.

Then I started hearing about Ubuntu. I liked that it was based on Debian, and I was told it was desktop focused. Although I am not sure, I "think" I started trying Ubuntu around version 5 or 6. What I initially liked was that most things just worked. With the exception of cheap laptop wireless cards, most hardware was not a problem. In fairness, this could have been as much a product of time as of Debian, Ubuntu, or kernel driver smarts. All I know is that for the first time I saw something that wasn't bad for a desktop. Today it is uncommon to find a computer that Ubuntu won't run on, and wireless card issues seem to be a thing of the past. I even prefer the Ubuntu live CD to a Knoppix live CD for tech work; I have had more problems booting a computer from Knoppix than I ever have from an Ubuntu disc. This "working out of the box" experience has been largely responsible for my Linux satisfaction level, especially once cheap (Broadcom) wireless adapters worked out of the box.

So why Ubuntu? Well, for me it just seems to install and work on just about any computer I put it on. I have the comfort of using "apt-get", my wireless adapter works, and I don't feel like I need to be a computer engineer to install or use Linux. But am I an Ubuntu fanboy? I'm not so sure anymore. Lately Ubuntu has been feeling more bloated and a bit more buggy. I am especially frustrated with how "beta"-like the most recent LTS (Long Term Support) version is. To me, LTS implies something rock solid and very stable, but I guess that's just my desired reality. Truth be told, even I realize that LTS just means Ubuntu will provide a longer period of support. As a result, I have started looking at other distributions and may consider switching. But I don't see myself switching to a non-Debian-based distro. Fair or not, I just hate RPMs and still like to install my updates using "apt-get".

So maybe the title of this post should really be "Why I use Debian-based distros".



RPMs etc

As I write this I am putting the finishing touches on an Ubuntu install on an eight-year-old computer for a great-grandkid.

During the OS upgrade (from Fedora 10) I was beginning to wonder if the DVD drive was getting faulty. Ubuntu finally suggested that the CD might be dirty. I'm impressed. I managed to get 10.04 installed, then took the drive apart and cleaned it.

I have used Red Hat 7 through 9, Knoppix, SUSE 9 & 10, and Fedora 9 through 13. I have loaded Ubuntu and Knoppix several times but always get frustrated at the sudo [instruction] procedure. It also feels like the Microsoft way, where the user has admin privileges and is lulled into being conned into giving the password to install something they don't understand - say, a trojan. I don't want the 14-year-old to be able to do that. For my computer I just prefer to have a root terminal open where I can get at it as an administrator. I DID find how to set up a root user and password in Ubuntu; I just don't think it's necessary. Obviously, this is just a personal issue.

Back when I was using Red Hat 9 I wanted to install GnuCash but got caught up in the dependency thing. That is the only time I have experienced it, and I have had no trouble installing GnuCash and KMyMoney on Fedora 13. I think we all get used to what we first learned. I got used to unpacking, making, and installing with Red Hat 7 and, with the exception of GnuCash, was comfortable; so when I was presented with apt-get I had no interest in dealing with a different way of installing.

I just finished installing GIMP and like the Ubuntu way of presenting software installs. Other than the presentation, Ubuntu does installs flawlessly, as does Fedora... in every case I've tried.

Next day: curious hardware issues with the old computer. Ubuntu Alternate won't boot if I use a USB keyboard, but Ubuntu 10.04 LTS boots and is now installed. Knoppix 6.2 and Fedora 13/KDE live start to boot, but then the machine locks up. Well, that may be wrong. Fedora 13 live is now up, but it took almost an hour to get there, and it took a couple of minutes to get the file browser up. This may still be a faulty CD/DVD drive, or maybe the ribbon cable.

I give up. Ubuntu it is.

later, john

I may be wrong now
but I don't think so.
Theme from Monk


Thomas's picture

re: Why I use Ubuntu

Ubuntu is easy to use. PERIOD. Seriously though, my two year old girl watches cartoons on an Ubuntu machine and has no problem starting and stopping the shows. :)

Thomas

BTW, it's stable.


response to the "Why I Use Ubuntu" post

Well, I must admit my first Ubuntu experiences were not bad ones, and the problems I have had with Ubuntu relate to proprietary software, not Ubuntu itself. That said, I hate how Ubuntu does system configuration. I seriously dislike that I have to spend a considerable amount of time fixing Ubuntu, removing all the nonsense bloat it contains in startup, or spend the same amount of time reading the manual so I can install it in a correctly optimized way. Essentially, I dislike how Ubuntu assumes I have no idea how to do anything and defaults to doing it for me. And I have had various problems with its attempts at doing things properly - in the same way that the ie4l script sometimes fails. So there are compromises that must be made in order to have "ease of use."

That said, my first experience with Red Hat at 6.x happened quite differently, as I was in a UNIX/Linux class at the time, and quite honestly not a single bit of it was confusing. However, dependency hell first started making sense when I began wanting to try random GPL software from the internet in order to possibly begin working as a bug reporter on various projects; this was, and is, a correctly identified problem with RPM-based distros. I say "is" because yum does not even come close to comparing with the coolness that is apt-get, and the argument should not be made. What you really should have been using back then was NeXTSTEP, which was sort of like staring at your monitor and seeing heaven. Unfortunately, that's all out of style now.


Scott Dowdle's picture

yum vs. apt-get?

You posted anonymously so I'm not sure who you are. I prefer people to get credit for their contributions but this site does allow anonymous for a reason. :)

I just wanted to respond to your comment on dependency hell and your belief that yum does not compare to apt-get. I disagree. Dependency problems are most often caused by trying to install third-party apps, sometimes made available in a distro's native package format.

One advantage that Debian has over many other distributions, including RHEL and Fedora, is that Debian has been around for a very long time and has been both gathering up software and packaging it, as well as making its package library more granular. For example, when Red Hat went from Red Hat Linux to Red Hat Enterprise Linux, they reduced the number of packages offered in an effort to make it more supportable, since they sell binary updates and support services. What was left behind of Red Hat Linux lived on and eventually became Fedora. Fedora has gradually been increasing its package count over the 5 years it has been around and is starting to catch up.

The point I'm trying to make here is that if you have a large library of distro provided packages you are less likely to install a third-party package and encounter a poorly done package that gives dependency problems. When sticking with distro provided packages I'd say that both .deb and .rpm are fairly equal in the experience they provide the end user. Dependency problems with stock packages are quite rare.

At some point Fedora morphed into an "innovator distribution," and as a result they are constantly releasing a large number of updates. I'll admit the large number of updates has on rare occasions introduced incomplete repo syncs on some mirrors that could cause dependency problems for a small window of time... but they have been working on that, and it hasn't really been a widespread problem.

So far as package technology goes, rpm/yum and dpkg/apt-get are very comparable, with neither having a distinct advantage. A package is only as good as the packager made it, and a packaging system cannot make up for a sub-standard third-party package.


yum vs Apt-get

I will simply affirm Scott's thought that rpm/yum and dpkg/apt-get are very comparable, with neither having a distinct advantage - and if anything, I think yum may have the edge. Both the Fedora repos and the Ubuntu repos have had sync errors where updates would not work correctly. I also think Fedora has really improved their repo issues with Fedora 13: while updates come quickly and in number, I have yet to have a problem following any update and have not experienced any failed updates.

I will also give a thumbs up to Linux Mint on repo deployment: perhaps because they "trail" the Ubuntu releases, their updates seem to work more consistently than even Ubuntu's.

Finally, dependency hell is a thing of the past for the most part.

apt vs rpm, a voice for apt

I would disagree somewhat with Dowdle.

apt/dpkg packages are able to span distro versions much more gracefully than RPMs. Simply put, .deb packages from Debian 4 tend to work on Ubuntu 8.04 and up as well as Debian 5, whereas RPMs for RHEL 4 usually don't work on RHEL 5 or Fedora, and when you put SUSE and Mandriva in the mix it's a total crap shoot. Essentially, .deb packages tend to require less effort to maintain because they tend to work in future versions of the distro. Additionally, they are "cross-distro" compatible more often, in that Ubuntu, Voyage, Mint, Debian, etc. can all use the same .deb a lot of the time. Please read this as generally speaking and by no means absolute.

That might just be coincidence, in that Debian is a more often utilized base OS for the Ubuntus and Mints of the world than RH is - considering that most if not all serious RH knock-offs specifically remind us on their websites that they are really just RH minus the artwork (CentOS/Scientific) or with some very minor changes (Oracle). So in effect this isn't a commentary on how GOOD apt or yum is, but on how the ecosystem has utilized them.

So, if you are running Fedora 13 and you want Fedora 13 packages, yum is going to be essentially flawless. The same goes for RHEL 4-6, Debian, Ubuntu, etc. If you are running in a server environment you would likely be considering stock packages for everything anyway, especially if you want paid support.

I would add that I have Ubuntu, Debian, FreeBSD, and CentOS servers that run in a production environment. I rarely have issues with stock packages on anything, but when I'm messing around I bust out a Debian VM because it plays nicer with third-party/out-of-repo packages for me.


Scott Dowdle's picture

Good, insightful commentary

I agree with pretty much everything cttinsley and syadnom have added. Thanks for the insightful commentary. I do have some additional comments though.

1) While there is an upstream rpm (which these days stands for RPM Package Manager rather than Red Hat Package Manager), the fellow who has been doing primary development on it (a former Red Hat employee) has had many public spats with everyone, and I don't know that anyone is using it unmodified. I also believe that both Mandriva and SUSE have their own distinctive rpm forks, as does Red Hat for RHEL and Fedora for Fedora. No one uses the same version of rpm, and there are several higher-level apps besides yum... which include zypper for openSUSE and... oh, I forget the name of the Mandriva one... urpmi?

Anyway, my point is that there is more diversity in .rpm than there is in .deb, and that would explain rpm having more incompatibilities when attempting to use cross-distro and cross-release packages.

There continue to be efforts for all of the rpm stakeholders to work together and share their improvements, but I don't foresee a merging of all of the flavors back into one.

2) Within a flavor of rpm there can be quite a bit of cross-release package reuse. For example, Fedora does not necessarily rebuild all packages for each release. In fact, most packages that did not require updating are carried over each release and retain their previous release markings. For example, in Fedora 11 there were 3 packages with fc8 in the name, 12 packages with fc9, and 47 with fc10. For earlier releases I think the numbers were much bigger.

I believe in Fedora 12 they had a major rpm change where the signing key type changed (moving to SHA-256?), so ALL packages had to be rebuilt.

3) Another factor to consider is that Debian is king at granularity. The made-up example that I usually give is that while distro X might have a package named "alphabet", distro Y might have several... named something like "uppercase", "lowercase", and "numbers-and-symbols". Whereas Debian would probably have, "A", "a", "B", "b", "1", "2", etc. I think this granularity may also enhance cross-distro and cross-release compatibility.

4) The last thing to consider is the vast difference in goals and direction between Debian and Fedora. Debian is better known for its goals of stability and multi-arch support, whereas Fedora is better known for its goals of rapid adoption of new technologies and whipping them into shape so that they can be adopted by others sooner. While Debian supports the most architectures - and that really is a major reason each release takes so long (along with their large package library) - Fedora isn't afraid to tear up the field and plant a new crop. Fedora was first by a long span of time with SELinux, Xen, and KVM... just to name a few.

The vast difference in goals and directions has a big impact on cross-distro and cross-release package compatibility. I know syadnom was speaking more generally so I added some specifics.

In conclusion... having said all of that, it just highlights the great diversity of Linux distros... the diversity of Linux users... and what a fantastic enabler that FOSS is... and how good this diversity and competition is for everyone... even if it can be painful at times.


Pacman does it right

Andrew Niemantsverdriet and I have been using Arch Linux a lot lately, and I am completely convinced that Pacman does it right. Because the build process is so streamlined, rolling updates don't hurt a single thing. Rather than spending time worrying about which distro release we're running, it only looks at individual package dependencies. Much more flexible, and it's super easy to roll a new installer for a package, even if it doesn't conform to configure/make/make install.


Scott Dowdle's picture

Ultimate flexibility

I agree that pacman is a good design... coming from Arch Linux which is sort of a hybrid between source-based and binary-based packaging methods.

I'm sure Gentoo users would also like to pipe up about the flexibility of their source-based packaging system (if it can be called that) and the strengths of a rolling release.

The only problem with those is that, currently at least, they seem tied to the specific distros from which they came and not easily adapted for use outside them. I could be wrong about that last point, but so far it appears to be true.

One other package manager that is really advancing things is Conary... but with its advances comes an increase in complexity, perhaps prohibiting it from being more widely adopted.


lannocc's picture

Distributions and package management: Gentoo

Gentoo bills itself as a "meta-distribution," though the distinction here is kind of meaningless anyway. What is a distribution but a package manager and a selection of packages, provided in one or more default profiles? I'm building a distribution around Portage, leveraging the work the Gentoo folks have put into standardizing the packages. My approach is to find a package management system backed by a team who shares my philosophy for its use. If you need a package manager separate from the rest of the distribution but they don't make that easy, then that would be a likely deal-breaker.


other package manager

I have always liked the way OS X handles apps, and I think conceptually it has a lot going for it.

To revise it, I would go with the folder-as-app system with the following cases.

When an app folder is put in a specific directory (or directories), the OS will recognize this and read a .provides file. This gives the ability to install libraries and different versions. The .provides for libXYZ 1.6.2 would say that it provides libXYZ version 1.6.2, and you could have another app folder for libXYZ 1.7.3.

When a .provides file is parsed, a .requires file is read as well. Be sure to provide full case scenarios in this process: case ver = null, do this; case ver = 1.6, do that; case ver = 1.7, do something else; end case.

When an app folder is executed, i.e. /apps/Firefox.appfolder, the shell knows to run /apps/Firefox.appfolder/.execute, but if you cd into /apps/Firefox.appfolder/ you get into the folder like any other.

Whenever an app folder has a .provides, do something fancy to get those into /var/lib/, whether you link or use unionfs.

Now downloading an app and installing it is just dropping it into a directory.

For dependencies, I'm sure you would want to use the .requires file to invoke apt, yum, pacman, portage, etc.
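To make the cases concrete, here is a rough Python sketch of the idea - the file format, the names, and the prefix-matching rule are all hypothetical, just one way of reading the scheme:

```python
def parse_provides(line):
    """A .provides line like 'libXYZ 1.6.2' -> (name, version)."""
    name, version = line.split()
    return name, version

def requirement_met(required, provided_versions):
    """Apply the case logic: no version pin means any copy will do;
    otherwise match on the version prefix (e.g. '1.6' matches '1.6.2')."""
    name, _, version = required.partition(" ")
    if name not in provided_versions:
        return False
    if not version:                  # case ver = null: anything goes
        return True
    return any(v.startswith(version) for v in provided_versions[name])

# Two app folders dropped into /apps, each announcing what it provides.
provided = {}
for line in ["libXYZ 1.6.2", "libXYZ 1.7.3"]:
    name, version = parse_provides(line)
    provided.setdefault(name, []).append(version)

print(requirement_met("libXYZ 1.6", provided))   # True: 1.6.2 is present
print(requirement_met("libXYZ 2.0", provided))   # False: no 2.x installed
print(requirement_met("libXYZ", provided))       # True: any version accepted
```

A real implementation would need proper version ordering rather than string prefixes, but it shows how .provides and .requires could drive the whole install-by-dropping-a-folder idea.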

Anyway, I may just be crazy, so please ignore. :)


rolling updates

Rolling updates make it pretty much impossible for a third-party developer to support the distro. Dependencies based on version numbers work fine for installing software, but when you have 20 deps and all of them have been updated at different times, new bugs or interoperability issues can come into play between your software and those deps, or even between the deps themselves.

This may be fine much of the time, but if you have to run Oracle DB then Arch is out. You wouldn't pay for that support contract if you couldn't use it, and you must have the contract.

If Arch could put in freeze dates, when the major package versions would be frozen so you could actually target that package set, it would become a supportable distro for servers.

As far as desktops go, Arch needs to get into the Upstart/faster-bootup groove. The simplistic init system is just too slow for laptops. Ubuntu and Fedora are making good progress, and I don't think they are complicating init beyond reason.


Upstart for Arch

syadnom, a quick search for Arch Upstart pretty much confirms that: there is an AUR (universe) package, but it looks like it's not very well maintained. I'll try to see if I can get it working sometime (right, because a full-time student with a full-time development job has plenty of free time to play with Arch).

As for Oracle, people still use that? ;)

I completely agree that businesses should be much more careful when installing large production environments, and should not use cutting edge and experimental work. That's when you get a contract with Oracle and RHEL, and stay with RHEL 4.3 for decades. Desktop users clearly don't need that level of support.

I like Gentoo, Arch, etc, because they force the user to learn exactly what is happening and is required for their distro, and encourages the user to care about each piece of the system. This is clearly not for everyone, but when you're ready to take that step and take full care and responsibility of the computer, Pacman makes it easy. (Disclaimer: I've never used Gentoo, but I know my buddies who do are constantly complaining about it.)


gentoo/arch etc

I think that Linux has a unique opportunity to offer up what Mac offers to Mac users, on the scale of windows.

IMHO Apple would be the major player had they not been so 'Apple', if you know what I mean. OS X is miles ahead of Windows 7 in usability. It has a 'just works' feel to it that is lacking in Windows (95 through 7).

To do this, Linux distros need to make package management a secret thing that only the elitists care about. Installing a package, handling dependencies - everything needs to be a progress bar. When a package breaks, it needs to be handled better. The system needs to be able to hit the bug tracker and find the fix. It needs to tell the user that the issue will be resolved on X date, which is the next update cycle or whatever, or it needs to say that there is no known fix at the moment - would you like to be informed when there is?

Users shouldn't see the BIOS for even a second. They should see a Linux logo and then their desktop.

I think the vast number of distros is a good thing, but only for now. It allows the 'elite' users to find all the possible ways to do things, to hash them out, and to let the strongest survive. At some point there will need to be a convergence if Linux is ever to challenge Microsoft. Someone has to win, either by total defeat or by merging together. To be a really viable platform, users need it to work and they need it to be consistent between computers at home, at work, and at a friend's house.

Microsoft does just that. There is just one Windows, even though there are different versions. Applications that people run pretty much run on all of them, and they are sequential versions, so it's a simple thing to understand that your copy is too old to run program X. Linux can do this one better with more frequent major releases, or much more frequent minor releases.

Gentoo and other source-based distros are just too slow. What I mean is that building packages from source just doesn't make any sense for regular desktop users. Building from source is something like mail order: want it now, get it much later. Ubuntu can have pretty much any package in seconds; same with Fedora. Arch does both, but desktop users don't really care about source packages.

Really, computers aren't things most people want to know the workings of. They are like cars. I want a red one that goes fast, but I have $$$ so I need to be a bit thrifty; I can't spend $$$$$$$. The training you got when you were 15 handles every car on the road. The radio knobs, the shifter, the pedals - all in roughly the same place, operated the same way. Most people want a computer as a tool, not as a toy. They want to walk up and DO something, like browse the web. None of the guts matter. They want to play a game, they want to listen to music. The game is the toy, the music is the toy; the computer isn't fun to use - the game is, the music is.

To get the game or the music, web, chat, etc., they need a search to find it and a button to get it. If it came on a disk, they put the disk in and it runs. They shouldn't have to get the DirectX 11 runtime, or OpenGL, or .NET, or libXYZ; they shouldn't even know what most of those things are.


Scott Dowdle's picture

Try the "Death of the Desktop" posts

I like your comment and I have quite a few things to say in rebuttal but this thread has gotten really off topic.

There were two fairly recent posts about the "Death of the Desktop" where I think this discussion should be continued.

See: Video: Death of the Desktop and Death of the Desktop Take II.


Kindred spirit

I enjoyed reading putz3000's commentary on Ubuntu; I might have to try it. My relationship with Linux is very similar. Nice bit of writing... it seems like investigations into earlier Linux versions were a series of 'hells'... much like Windows Vista or Microsoft Office 2007 - descents into bad places following really significant improvements (Windows XP Pro, Office 2003). Thanks for making me smile.


Linux Mint

You might also take a look at Linux Mint. It is based on Ubuntu but is a little different. It looks pretty nice and seems to have the same advantages as Ubuntu with regard to hardware. There are two different downloads; if you go with the live DVD ISO, the website says you will get the following:

Contains additional software such as Java, VLC, F-Spot, OpenOffice.org-base, Samba, additional wallpapers and ttf-dejaVu fonts

Anyhow, I installed it on a spare drive and briefly played with it. It seems not bad, although I have not sat down with it for a prolonged period of time.


Re: Why I Use Linux

At home and on my netbook, I only use Ubuntu Lucid 10.04, and it is used consistently on a daily basis for everything I do. I also have Linux Mint installed on my 24x7 laptop that I use for a number of tasks on my home network, and it just sits there and runs and runs.

At work, where I am a Linux Support Engineer, I use Fedora 13 on both my desktop and my "work" laptop, as we support CentOS (RHEL 5) servers in the enterprise. In previous positions, I have used Ubuntu servers as well as RHEL and VMware ESX, which is based on an older version of RHEL.

I like both and I am "at home" in both. For personal use, I like Ubuntu. I have been using RPM forever, so I am as comfortable with that as I am with apt-get and aptitude. However, I would not recommend Fedora 13 to a Windows or newbie user, because it is just too technical to configure and use as a viable replacement for the Windows desktop. Ubuntu is easily configured, and if you go with Linux Mint 9, it is ready to go as a viable replacement for the Windows desktop - including playing your favorite DVDs, no configuration required. It runs on just about everything, as you have noted, and printer detection is pretty much automatic for most printers, including networked printers. Can't get much easier to use than Linux Mint 9.

A comment on openSUSE: I am very disappointed in Novell. They know how to build and support a desktop with SUSE Linux Enterprise Desktop as a commercial product, and they would never put their business desktop users through the grief of trying to configure and use openSUSE, especially in its latest incarnation. I did load openSUSE 11.3 on my "work" laptop when it was released and found that applications I needed to use didn't work or crashed, yet were stable and, most important, usable on the Fedora 13 desktop. Oh, those same applications work fine on Ubuntu as well. I doubt that I will ever use openSUSE again - too many issues, plus it is perceptibly "slower" than Ubuntu or Fedora to boot up.
