Because of our ability to collect and analyze a vast variety of data, scientists now have the potential to solve some of the world's biggest problems. But until very recently, we didn't know how to combine the right data sets and see crucial patterns; that is the idea the article describes in brief.
It was probably the first time since I subscribed to HBR that I read "Machine Learning" in a paragraph header.
Machine learning is an "old" methodology, but its application to real-life problems took a while, because such applications need massive data and enormous computational power.
Why? Because most of the systems that can attack problems in life sciences, energy management, climate change, the process industry, ... need multi-strategy and multi-method approaches.
I share the enthusiasm about the possibilities of the petabyte and petaflop age. But there are some barriers and limitations. In short, we want to extract knowledge and computational models from data, for:
- Analysis
- Prediction
- Control
In analysis, we want to get insight. Usually this is not so difficult.
In prediction, we want to create models that can predict future behavior. In machine learning, generalization is the difficult part.
To do predictive modeling adequately, we usually need to introduce meta-learning algorithms and combine models. Mathematica's programming environment is ideal for this. It also provides built-in parallelism.
We exploit these technologies in our mlf.
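As a minimal sketch of what combining models can look like in Mathematica (this is an illustration, not actual mlf code; the toy data and the Method choices are assumptions), here several built-in learners are trained in parallel and their predictions averaged:

```
(* Hypothetical toy data: noisy samples of an unknown function. *)
data = Table[x -> Sin[x] + RandomReal[{-0.1, 0.1}],
   {x, RandomReal[{0, 2 Pi}, 200]}];

(* Train several base learners in parallel; these Method settings
   are built into Predict. *)
methods = {"RandomForest", "NearestNeighbors", "LinearRegression"};
models = ParallelMap[Predict[data, Method -> #] &, methods];

(* The simplest ensemble: average the base models' predictions. *)
ensemble[x_] := Mean[Map[#[x] &, models]];

ensemble[1.0] (* combined prediction at x = 1.0 *)
```

Averaging is only the simplest form of model combination; stacking, i.e. training a meta-model on the base models' outputs, follows the same pattern.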
In control, we want to manage parameters that lead to desired goals (like quality features). To do this, we need to solve inverse problems, which are ill-posed by nature. The machine learning part is often focused on supporting parameter identification in model calibration; machine learning interacts with theoretical models. To do this, you might need symbolic and numerical computation and machine learning combined in the framework of Mathematica.
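As a hedged sketch of such a parameter identification (the exponential-decay forward model and the "true" value k = 0.7 are stand-ins, not a forming model):

```
(* Hypothetical forward model: one unknown parameter k drives an ODE. *)
forward[k_?NumericQ, t_?NumericQ] :=
  First[y[t] /. NDSolve[{y'[s] == -k y[s], y[0] == 1}, y, {s, 0, 10}]];

(* Synthetic "measurements", generated here with k = 0.7 plus noise. *)
measurements = Table[{t, forward[0.7, t] + RandomReal[{-0.01, 0.01}]},
   {t, 1., 10.}];

(* Inverse problem: find the k that minimizes the squared misfit.
   Ill-posed problems typically also need a regularization term,
   e.g. lambda (k - kPrior)^2 added to the objective. *)
misfit[k_?NumericQ] := Total[(forward[k, #1] - #2)^2 & @@@ measurements];
fit = NMinimize[misfit[k], {{k, 0.1, 2}}]
```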
The metal forming example
Take free bending. You need to understand the framework of elastoplasticity theory and the complexity and limits of its mathematics, derived from continuum mechanics. Deformation is decomposed into elastic and plastic parts, and for simplicity these decompositions shall determine the stress and kinematical quantities.
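In the standard small-strain formulation (a textbook summary, not a claim from the article), this decomposition reads, in LaTeX notation:

```latex
\varepsilon = \varepsilon^{e} + \varepsilon^{p}, \qquad
\sigma = \mathbb{C} : \varepsilon^{e}, \qquad
f(\sigma) \le 0
```

where the total strain splits additively into an elastic and a plastic part, the stress follows from the elastic part via the elasticity tensor, and a yield function bounds the admissible stresses.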
However, the resulting PDEs can be solved by finite element schemes. They "only" need to be calibrated with respect to material properties.
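For instance, Mathematica's NDSolveValue dispatches stationary PDEs on regions to its finite element solver. Here is a minimal Poisson example, a deliberately simple stand-in for the far more involved elastoplastic system:

```
(* Poisson equation on a disk, solved by the built-in FEM machinery. *)
sol = NDSolveValue[
   {-Laplacian[u[x, y], {x, y}] == 1,
    DirichletCondition[u[x, y] == 0, True]},
   u, {x, y} \[Element] Disk[]];

sol[0.2, 0.3] (* evaluate the solution at a point *)
```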
If you buy a metal sheet, you order it by standardized "names". And this is where the headache begins.
Standardized mass-steel grades allow physical properties to vary by up to 15% within one product. But each individual sheet has its own metallurgical fingerprint. Measuring the properties before bending would destroy part of the sheet.
So, you need to recalibrate your models during the forming process ("continuously"). The closer you come to the final shape, the more your system knows about the concrete material, and the better it can control ...
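A rough sketch of such an in-process recalibration loop (measureShape, setBendingParameters, and this two-argument misfit are hypothetical placeholders for the sensor interface, the machine interface, and a model-vs-data objective like the one above):

```
calibrated = {};
Do[
  AppendTo[calibrated, measureShape[step]];     (* new shape measurement *)
  {err, rule} = NMinimize[misfit[k, calibrated], {{k, 0.1, 2}}];
  setBendingParameters[k /. rule],              (* update the control model *)
  {step, 1, 20}
]
```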
Continuous measurements of the shape? The material does not stay in the form the machine brought it to; it uses its elastic memory and springs back.
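One consequence for control, as a hedged one-liner: the machine has to overbend by the predicted elastic recovery (springback here is a hypothetical function of the calibrated parameter k and the target angle):

```
(* Command target + predicted springback, so that after elastic
   recovery the sheet relaxes to the target angle. *)
commandAngle[target_, k_] := target + springback[k, target]
```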
We have used mlf on Mathematica to solve such problems for concrete machine controls.