Scroll down to the bottom; I've included a few advertisements to show the state of the art of hardware at the time
OMG that's great. "4 MB of GRAPHICS MEMORY!!!"

If solid state drives can come close to competing with that kind of pricing
for me, it's not a straight cost/GB issue. i like standard HDDs for exactly the reason you mention: they're really damn cheap. i would have bought a couple of those 500GB drives for a RAID1 array by now, but for now i can't do RAID. really, i can't kick myself enough for deciding at the last second to buy XP home instead of pro. c'est la vie, live and learn. april, or possibly may; i'll probably buy the first drive next month alongside my next graphics card (to split up the cost, and hopefully get a unit from a different batch, though i think newegg does that anyway when they ship multiple units).
beyond that, i'll probably just keep my 150 for windows and programs. i like learning about these various technologies, but i don't want more than 3 drives. maybe, maybe i'd put a WD raptor in there, if i got like a xmas present or something, but i don't think it's a big enough priority for me to shell out money for it when i could spend my extra cash on, well, something else. i want to stick to 3 HDDs to help keep heat and wire clutter down, and i can dedicate the top six bays to direct airflow (if and when i move my optical drive to a slimline deal at the bottom of the case).
I think I read somewhere that a current high end system can render at about a third of the speed required
so then give it about two years. that's moore's law, isn't it? that computing power doubles every 1.5 years. so, 1.5 years to get to 2/3 of the required speed, and a bit under another year to close the remaining gap (the growth is exponential, so that last third takes a little longer than straight-line math suggests). call it somewhere between two and two and a half years for high-end systems.
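just for kicks, here's the same estimate worked out exponentially instead of with straight-line math. only a rough sketch, assuming a clean 1.5-year doubling period and that ~1/3-of-required-speed figure from above, both of which are obviously approximations:

import math

# rough assumptions from the posts above:
doubling_period_years = 1.5   # "computing power doubles every 1.5 years"
speedup_needed = 3.0          # current high-end systems run at ~1/3 the required speed

# with exponential growth, the time to reach a 3x speedup is
# doubling_period * log2(speedup_needed)
years = doubling_period_years * math.log2(speedup_needed)
print(f"roughly {years:.1f} years")   # prints: roughly 2.4 years

which lands in the same ballpark, just a few months past the straight-line guess.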
i don't think it's quite that simple, but i do think there's progress being made. i believe i read or heard that nVidia recently acquired Ageia (sp?), makers of that physics processing unit. i'm hoping we'll see them integrate the technology into their VGA products without requiring a separate expansion slot (which would make some sense for nVidia, since they're trying to push triple sli systems on high-end users, and those leave at most one slot, if any, for other expansion cards).
i also read a year or two ago that some company was working on a chip to help process AI behavior in games for things like pathfinding and (one hopes) tactical movement.
we're not only getting closer to movie quality in terms of the way things look, but also the way they act and react.