Of course, all computers are getting faster over time. But are computers at different cost levels improving at the same rate? I've read some claims that high-end supercomputers aren't advancing as quickly as consumer-level PCs. Could this be measured objectively over time?
I find Google struggles with this kind of question, as it isn't clear what to call this. It's hard to search for something you don't have a name for.
What I'm hoping someone has done is something like this:
For each year:
- How powerful is a $1000 computer?
- How powerful is a $10 million computer?
- Calculate the ratio, and plot on a graph.
So in 1985, adjusting for inflation, you would be comparing a Cray-2 supercomputer (1.9 GFLOPS) to perhaps a Commodore 64.
Today you would compare a typical PC to something like Frontier, perhaps?
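To make it concrete, here's a rough sketch of the calculation I have in mind (Python, with made-up placeholder numbers except the Cray-2's ~1.9 GFLOPS figure; real data would have to come from published benchmarks and inflation-adjusted prices):

```python
import matplotlib.pyplot as plt

# Placeholder figures only -- real numbers would come from published
# benchmarks and inflation-adjusted prices. Only the Cray-2's ~1.9 GFLOPS
# peak is a real spec; everything else here is made up to show the idea.
# year -> (FLOPS of a ~$1,000 machine, FLOPS of a ~$10 million machine)
data = {
    1985: (1e4, 1.9e9),    # e.g. home computer vs. Cray-2
    2000: (1e9, 5e12),     # placeholder values
    2023: (1e12, 1e18),    # placeholder values
}

years = sorted(data)
ratios = [data[y][1] / data[y][0] for y in years]  # high-end / consumer ratio

plt.semilogy(years, ratios, marker="o")
plt.xlabel("Year")
plt.ylabel("$10M machine / $1,000 machine (performance ratio)")
plt.title("Are the price tiers diverging or converging?")
plt.show()
```

If the ratio is rising over time, the high end is pulling away; if it's falling, consumer hardware is catching up.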
Has a study like this been done?