I had a long conversation today with someone who was ready to buy a new computer on the complaint that theirs was old and therefore slow. I disagree every time this comes up. At some point, yes, computers will need to be replaced, but they can long outlive the lifespan most people give them.
First, computer parts, outside of moving parts (fans, hard drives, etc.), do not “wear out” and lose performance. I can understand a moving part losing efficiency or power. I’ve had many fans wear out, but the hardware behind them, such as my CPU, motherboard, and RAM, functions as it did the day I bought it.
- I believe that, given the right software, your machine should run just as efficiently as it did the day you bought it.
- I believe that most people who complain about slow computers are simply owners of an unmaintained Windows-based machine.
None of the machines in my home are under three years old. I have two laptops, two desktops, and three server machines. All but two of these were given to me in trade for IT work. Most were ready to be thrown away. After installing Ubuntu Linux on these machines, they run perfectly. One machine would no longer recognize its network hardware or CD tray, or boot in less than ten minutes! It has been running Ubuntu for six months without a single hardware issue. My desktops nearly outperform my much newer Windows XP machine at the office. I credit the more efficient operating system. Remember, software was built to work with existing hardware, not to demand more powerful hardware.
Part of the reason this topic came up today was the imminent release of Windows Vista. This is a textbook case of software demanding hardware. I recently ran a hardware test to see if any of my machines would qualify to run Windows Vista. The result? An astounding “no chance.” I lacked the hard drive space, I needed twice my current RAM, and I needed a video card with double the memory. And that was just for the minimum requirements! This is ridiculous.
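If you want to run a similar check yourself, you don’t need a vendor’s “upgrade advisor” tool; a couple of standard commands from a Linux shell will tell you what your machine actually has (a minimal sketch; the `/proc` files shown are standard on Linux, and the numbers will of course vary per machine):

```shell
# Compare this machine's actual specs against an OS's minimum requirements.
grep MemTotal /proc/meminfo                  # installed RAM, in kB
grep "model name" /proc/cpuinfo | head -n 1  # CPU model and clock speed
df -h /                                      # size and free space of the root disk
```

Compare those three numbers against the published requirements and you have your answer.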
On this four-year-old desktop I can maintain a productive environment including office software, graphic editing, 3D acceleration, email, web, chat, games, etc. I have visual effects that rival those of Mac OS X, and I can play the latest games. This machine is four years old! It is a dual-CPU 1800MHz AMD with 512MB of RAM and a 7GB hard drive, with a 128MB nVidia GeForce4 MX 440 video card. Nothing about this machine is extraordinary, or even released within the last four years. Many would consider it old or even obsolete. Why? When did the software start demanding more of the hardware? When did we start needing 10GB simply for the operating system? My entire OS, plus many extras, takes less than 2.5GB of my 7GB drive.
To those of you who are considering a new computer based on apparent performance loss, I’d bet the reason is Windows XP. An unmaintained machine (and even the best-maintained machine) will slow down. The filesystem (NTFS) is inefficient. The registry is even worse. The potential for viruses and related infections is huge. You most likely have something on your machine right now that doesn’t belong there, taking up your resources. Switch to Linux. Your computer will seem brand new. No reason it shouldn’t: it’s the same hardware you bought last year or the year before. It was fast then, so why shouldn’t it be just as fast now? It doesn’t grow old and wear out like people do. It’s simply a matter of using the right software for the hardware. Use an operating system that doesn’t ask for the impossible. Try Ubuntu.
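Before blaming the hardware, it’s worth seeing what is actually eating your resources. A quick sketch using standard tools (the `--sort` option assumes GNU procps, as shipped on Linux; output will differ per machine):

```shell
# List the processes using the most memory and CPU right now;
# on a "slow" machine this usually points at the real culprit.
ps aux --sort=-%mem | head -n 6   # top five memory consumers, plus header
ps aux --sort=-%cpu | head -n 6   # top five CPU consumers, plus header
```

If something you don’t recognize sits at the top of either list, that, and not your three-year-old CPU, is likely what’s slowing you down.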
I always tell people who are throwing away old hardware to make better use of it, or to donate it to a charity or school.
The great thing about Linux is that it takes better advantage of older hardware. I won’t say that Linux *always* works better on old hardware, just that it makes better use of it than Windows does.
But I disagree with you on one thing, although I see your point. I believe that software should push hardware development. Look at gaming, then look at video cards. Video cards are insanely powerful, and it’s the games behind them that drive that innovation. If regular software devs designed like gaming devs, I think computing would be hundreds of times more powerful than it currently is.
Just my $.02.
I do about the same things you can do on your Ubuntu, but with a 733MHz PIII and a 64MB GeForce MX 440. I had Windows XP running here, but it wouldn’t do shit.