One city government around here threw away over 100 HP thin clients. They draw about 10-20 W each. I plan to build a nice cluster out of them.
The point I was trying to make, which doesn't seem to have come across, is that spending $ on new equipment is a shortcut. A quick fix. Spending just a bit of time seeking out the sources of e-waste in your community can, in the right technical hands, pay off big.
You don't have to feed the beast. It doesn't take much vision to see that paying $ into a system that happily externalizes all damage is part of the problem and directly contributes to the ecological and sociological degradation we are all, hopefully, observing.
You can get all the hardware you need for free if you spend the time to seek it out, and you'll be helping clean up the mess at the same time.
Many of us developers have been around since the time when 16K was a lot of memory, a 40 MB hard drive could hold everything you had, and 1 MHz was fast enough to play with almost any of the ideas in computer science. This is still true for many, many tasks. I would argue that learning algorithmic complexity (big O) can be easier when you don't have a massive amount of compute that blasts through O(2^n) in about the same time as O(n) for many data sets. Try that on an Apple II. A ten-year-old computer can have 16 GB of RAM and run dual cores over 2 GHz. You can learn a heck of a lot of computer science with one of those machines. No need to feed the beast.
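To make the complexity point concrete, here's a minimal sketch (my own illustration, not from the original comment) timing a naive O(2^n) recursive Fibonacci against an O(n) iterative one. On modest hardware the exponential version becomes painfully slow at small n, which is exactly what makes the lesson stick:

```python
import time

def fib_exp(n):
    # Naive recursion: roughly O(2^n) calls
    if n < 2:
        return n
    return fib_exp(n - 1) + fib_exp(n - 2)

def fib_lin(n):
    # Iterative: O(n) steps
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

for n in (10, 20, 30):
    t0 = time.perf_counter()
    fib_exp(n)
    t_exp = time.perf_counter() - t0
    t0 = time.perf_counter()
    fib_lin(n)
    t_lin = time.perf_counter() - t0
    print(f"n={n}: exponential {t_exp:.4f}s, linear {t_lin:.6f}s")
```

On a fast modern machine both finish almost instantly at these sizes; on a slow or salvaged machine the gap is visible much sooner, which is the pedagogical advantage being argued for.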