About 3 years ago I wrote One Size Does Not Fit All. And I still find it true in many fields of computer science, especially in machine learning, where you often deal with ill-posed problems.
But then I think of the algorithms that are widely used, like the Monte Carlo method, the Fast Fourier Transform and the Simplex algorithm, and the new computing muscles, hybrid CPU/GPU architectures, which are so strong when you do the same thing with massive data over and over ....
In quantitative finance the new computing muscles might enable us to solve most models with Monte Carlo simulation (improved by the Longstaff-Schwartz technique - least squares Monte Carlo - which is required if you have instruments with early exercise ...).
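To make the Longstaff-Schwartz idea concrete, here is a minimal, self-contained Python sketch of least squares Monte Carlo for an American put: simulate GBM paths, then step backwards, regressing the continuation value on a quadratic basis over the in-the-money paths. The basis, the parameters and the pure-Python normal-equations solver are all my own illustrative choices, not a production pricer:

```python
import math
import random


def lsm_american_put(S0, K, r, sigma, T, steps, n_paths, seed=0):
    """Least squares Monte Carlo (Longstaff-Schwartz) for an American put.

    Toy sketch: GBM paths, regression basis 1, S, S^2 on in-the-money paths.
    """
    rng = random.Random(seed)
    dt = T / steps
    disc = math.exp(-r * dt)
    drift = (r - 0.5 * sigma * sigma) * dt
    vol = sigma * math.sqrt(dt)

    # Simulate geometric Brownian motion paths.
    paths = [[S0] * (steps + 1) for _ in range(n_paths)]
    for p in paths:
        for t in range(1, steps + 1):
            p[t] = p[t - 1] * math.exp(drift + vol * rng.gauss(0.0, 1.0))

    # Cash flows if held to maturity.
    cash = [max(K - p[steps], 0.0) for p in paths]

    # Backward induction: exercise where immediate payoff beats the
    # regressed continuation value.
    for t in range(steps - 1, 0, -1):
        for i in range(n_paths):
            cash[i] *= disc                      # value one step earlier
        itm = [i for i in range(n_paths) if K - paths[i][t] > 0.0]
        if len(itm) < 3:
            continue
        xs = [paths[i][t] for i in itm]
        ys = [cash[i] for i in itm]
        b0, b1, b2 = _lstsq_quadratic(xs, ys)
        for i, x in zip(itm, xs):
            if K - x > b0 + b1 * x + b2 * x * x:  # exercise beats continuation
                cash[i] = K - x
    return disc * sum(cash) / n_paths


def _lstsq_quadratic(xs, ys):
    """Solve the 3x3 normal equations for y ~ b0 + b1*x + b2*x^2."""
    s = [sum(x ** k for x in xs) for k in range(5)]   # power sums
    A = [[s[0], s[1], s[2]],
         [s[1], s[2], s[3]],
         [s[2], s[3], s[4]]]
    b = [sum(ys),
         sum(x * y for x, y in zip(xs, ys)),
         sum(x * x * y for x, y in zip(xs, ys))]
    # Gaussian elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r_: abs(A[r_][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r_ in range(col + 1, 3):
            f = A[r_][col] / A[col][col]
            for c_ in range(col, 3):
                A[r_][c_] -= f * A[col][c_]
            b[r_] -= f * b[col]
    beta = [0.0, 0.0, 0.0]
    for r_ in range(2, -1, -1):
        beta[r_] = (b[r_] - sum(A[r_][c_] * beta[c_]
                                for c_ in range(r_ + 1, 3))) / A[r_][r_]
    return beta
```

With S0 = K = 100, r = 6%, sigma = 20%, T = 1 year, the estimate lands a little above the European Black-Scholes put value (about 5.2), as the early-exercise premium demands.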
Some experts find that the most widely used algorithm is the Simplex algorithm (which emerged in the late 1940s from the work of the US mathematician G. Dantzig). Embedded in commercial optimization packages (for linear programming), the Simplex algorithm has steadily conquered the world. Its services might be called many thousands of times a second. The algorithm that runs the world?
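For a feel of how it works, here is a toy tableau implementation of the Simplex method for problems already in the standard form max c·x subject to Ax <= b, x >= 0 with b >= 0 (a teaching sketch only; real packages add presolve, anti-cycling rules, sparse numerics and much more):

```python
def simplex(c, A, b):
    """Maximize c.x subject to A x <= b, x >= 0 (all b_i >= 0),
    via the classic tableau method with slack variables."""
    m, n = len(A), len(c)
    # Tableau: constraint rows = [A | I | b], last row = negated objective.
    T = [A[i][:] + [1.0 if i == j else 0.0 for j in range(m)] + [b[i]]
         for i in range(m)]
    T.append([-ci for ci in c] + [0.0] * (m + 1))
    basis = [n + i for i in range(m)]            # slacks start in the basis

    while True:
        # Entering variable: most negative objective-row coefficient.
        piv_col = min(range(n + m), key=lambda j: T[-1][j])
        if T[-1][piv_col] >= -1e-12:
            break                                # optimal
        # Leaving variable: minimum ratio test.
        ratios = [(T[i][-1] / T[i][piv_col], i)
                  for i in range(m) if T[i][piv_col] > 1e-12]
        if not ratios:
            raise ValueError("unbounded LP")
        _, piv_row = min(ratios)
        # Pivot: normalize the row, eliminate the column elsewhere.
        p = T[piv_row][piv_col]
        T[piv_row] = [v / p for v in T[piv_row]]
        for i in range(m + 1):
            if i != piv_row and abs(T[i][piv_col]) > 1e-12:
                f = T[i][piv_col]
                T[i] = [v - f * w for v, w in zip(T[i], T[piv_row])]
        basis[piv_row] = piv_col

    x = [0.0] * n
    for i, bv in enumerate(basis):
        if bv < n:
            x[bv] = T[i][-1]
    return x, T[-1][-1]


# A textbook example: maximize 3x + 5y
# subject to x <= 4, 2y <= 12, 3x + 2y <= 18.
x, value = simplex([3.0, 5.0],
                   [[1.0, 0.0], [0.0, 2.0], [3.0, 2.0]],
                   [4.0, 12.0, 18.0])
# -> x = [2.0, 6.0], value = 36.0
```

The whole trick is that the optimum of a linear program sits at a vertex of the feasible polytope, and each pivot walks along an edge to a neighboring vertex that improves the objective.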
There are many applications of the Simplex algorithm.
It can even be applied to solve the collision detection problem of objects. Wrap each object in constraints (a polytope) and test whether the combined constraint system has a common feasible point ... the collision event.
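A minimal sketch of that idea in 2D: two convex objects collide iff their combined half-plane constraints admit a common feasible point. Instead of invoking a full LP solver, this toy version brute-forces candidate vertices of the feasible region (it assumes the objects, and hence the region, are bounded; in practice you would hand the constraint system to a simplex-based LP solver):

```python
from itertools import combinations


def halfplanes_intersect(constraints, eps=1e-9):
    """constraints: list of (a, b, c) meaning a*x + b*y <= c.

    Assumes a bounded feasible region (true when the constraints wrap two
    bounded objects).  A nonempty bounded region has a vertex lying on two
    constraint boundaries, so checking all pairwise line intersections
    suffices for a feasibility (= collision) test.
    """
    for (a1, b1, c1), (a2, b2, c2) in combinations(constraints, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < eps:
            continue                    # parallel boundaries, no vertex here
        # Cramer's rule for the boundary-line intersection point.
        x = (c1 * b2 - c2 * b1) / det
        y = (a1 * c2 - a2 * c1) / det
        if all(a * x + b * y <= c + eps for a, b, c in constraints):
            return True                 # feasible point found: collision
    return False


# Unit square [0,1]^2 as four half-planes.
square = [(1.0, 0.0, 1.0), (-1.0, 0.0, 0.0), (0.0, 1.0, 1.0), (0.0, -1.0, 0.0)]
# Overlapping square [0.5,1.5]^2 ...
overlap = [(1.0, 0.0, 1.5), (-1.0, 0.0, -0.5), (0.0, 1.0, 1.5), (0.0, -1.0, -0.5)]
# ... and a disjoint square [2,3]^2.
far = [(1.0, 0.0, 3.0), (-1.0, 0.0, -2.0), (0.0, 1.0, 3.0), (0.0, -1.0, -2.0)]

print(halfplanes_intersect(square + overlap))   # True  - they collide
print(halfplanes_intersect(square + far))       # False - no common point
```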
What will happen to the special functions - the symbols we know by name, meet quite often, and whose form, shape and behavior we know? Our good friends, which we can elegantly differentiate, integrate ....