The Race of Speed

As companies struggle to keep up with the pace of innovation, a technological arms race has taken hold across many industry sectors.
Take high-frequency stock trading, where algorithms zip in and out of markets, often changing orders within seconds. A good part of all stock trading takes place in a warehouse filled with long avenues of computer servers close to the New Jersey Turnpike. But high-frequency traders do not rely on algorithms alone; they also speed-read news to change strategies within seconds. Just as dot-com companies drove the build-out of bandwidth, high-frequency traders drive the build-out of massive computing infrastructure.
On the other hand, new processors based on hybrid CPU-GPU architectures, such as NVIDIA's Fermi, are on their way. These architectures put supercomputer power under the desk and, later, onto tablet computers.
If we shed our irrational fear of new architectures and put these new computing muscles to use, what does that mean for scientific and technical computing?
First, we can do things we could not do before, like real-time analytics through frequent model calibration and re-calibration (a small sketch follows below).
Second, we might save time by using brute-force approaches instead of developing clever algorithms, like Monte Carlo techniques to solve all types of PDEs, PIDEs, ... or to work in high-dimensional parameter spaces instead of only the real-world one (a second sketch follows below).
Third, to exploit the deluge of data we might use feature extraction techniques other than mathematical modeling.
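
To make the first point concrete, here is a minimal sketch of frequent re-calibration, assuming a toy two-parameter model and synthetic quote batches (the model, parameter names, and data are all hypothetical, not any particular pricing library):

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical two-parameter model: value(x) = a * exp(-b * x).
def model(params, x):
    a, b = params
    return a * np.exp(-b * x)

def recalibrate(params0, x, quotes):
    """Re-fit the model to the latest batch of quotes by least squares."""
    return least_squares(lambda p: model(p, x) - quotes, params0).x

rng = np.random.default_rng(1)
x = np.linspace(0.1, 5.0, 50)         # e.g. maturities or strikes
true_params = np.array([100.0, 0.5])
params = np.array([80.0, 1.0])        # initial guess

# Simulate a stream of quote batches; in real-time analytics each
# batch would arrive from a live feed and the loop would never stop.
for tick in range(5):
    quotes = model(true_params, x) + rng.normal(0.0, 0.5, x.size)
    params = recalibrate(params, x, quotes)   # warm-start from last fit
    print(f"tick {tick}: a = {params[0]:.2f}, b = {params[1]:.2f}")
```

The point is the loop: calibration stops being a nightly batch job and becomes something you can afford to run on every tick once enough computing muscle is available.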
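And a minimal sketch of the second point: by the Feynman-Kac formula, the solution of the heat equation u_t = 0.5 u_xx with initial condition u(x, 0) = f(x) is u(x, t) = E[f(x + W_t)], so it can be estimated at any single point by averaging over random Brownian endpoints. This is the generic textbook construction, not any specific package; it is brute force, grid-free, and trivially parallel:

```python
import numpy as np

def heat_mc(f, x, t, n_paths=1_000_000, seed=0):
    """Monte Carlo estimate of u(x, t) for u_t = 0.5 * u_xx,
    u(x, 0) = f(x), via Feynman-Kac: u(x, t) = E[f(x + W_t)]."""
    rng = np.random.default_rng(seed)
    w_t = rng.normal(0.0, np.sqrt(t), n_paths)   # W_t ~ N(0, t)
    return f(x + w_t).mean()

# Sanity check against a closed form: for f(x) = x**2 the exact
# solution is u(x, t) = x**2 + t.
f = lambda x: x ** 2
x, t = 1.0, 0.25
print(f"Monte Carlo: {heat_mc(f, x, t):.4f}, exact: {x ** 2 + t:.4f}")
```

Every path is independent, so the same estimator maps one-to-one onto thousands of GPU threads; that is exactly why brute force becomes attractive on Fermi-class hardware.
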
But it is my strong belief that as massive computing muscles emerge, they will also open doors for new human-machine intelligence businesses.
To get there, we also need to shed our irrational fear of new hybrid software platforms and programming techniques.
One day we might even say: mathematical code is dead, long live mathematical thinking and free-form programming?