Will there be a limit to the increasing complexity of models?

Will ever more efficient computing resources allow us to push the limits of digital models indefinitely? Aren’t we in danger of losing control?

I recently took part in presentations of the work carried out by young PhD students in a scientific laboratory whose community I belong to, and at each of these events I was struck (and full of admiration) by the power of these students’ work. I have myself been involved in training PhD students for over twenty years, and I have noticed over the last two decades how physical and instrumental models have become so complex that one cannot help wondering whether we will be able to master them indefinitely; at least, I often ask myself this question.

Let me explain:
Each new thesis is supposed to “capitalize” on previous achievements: completely new subjects are actually quite rare, and every doctoral student must therefore absorb the results of their predecessors’ work before advancing on the prescribed subject. Over the years, one can measure the extent of this work of assimilation, which is obviously necessary.
This body of knowledge, already complex, will then be enriched by the results of the new thesis: the complexity therefore keeps increasing.
It is thus conceivable that only powerful digital tools, capable of handling ever larger numbers of parameters and volumes of data, can carry out such increasingly complex modeling of reality: because, in the end, it is indeed reality that we want to model in order to better understand it, and to predict it; in other words, to control it.
But, conversely, by pushing the complexity of these models ever further, should we not fear losing the thread connecting them to the reality we are trying to approach as closely as possible? Not to mention the risk of software “bugs”, which can only increase while our ability to detect them decreases as these models grow more complex.

I have taken the example of the thesis, but the question also arises, in a slightly different way, for industrial performance models, which become ever more complex for the same reason, that of being tested against reality:
The assimilation of previous results is replaced by learning to use performance management software packages that are just as sophisticated and that often interface with one another.
Demonstrating that performance meets the customer’s requirements then becomes a computing exercise of combining all the models in the appropriate software: only an aberrant result can reveal an error in a model (or in a piece of software, since no software is completely reliable [1]).
But, at a time when the “engineer’s feel” has been banished, for reasons mentioned several times in this blog, can we still “gauge” the relevance, if only in terms of order of magnitude, of the figures “spat out” by a machine? Doubt is permitted.

Added to these questions is that of the carbon footprint of ever more powerful, and therefore more energy-hungry, models. The Computing and Communication Climate Seminar held at MIT last March announced that, given the growing demand for artificial intelligence, blockchain and gaming (yes, gaming!), the energy efficiency of computing methods will need to improve by 2040 by a factor of more than… 1 million!
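To get a sense of what such a factor implies, here is a purely illustrative back-of-the-envelope calculation (the 2022 starting year and the constant annual improvement rate are my own assumptions, not figures from the MIT seminar):

```python
# Illustrative sketch: the constant yearly gain needed to improve
# energy efficiency by a factor of one million between 2022 and 2040.
target_factor = 1_000_000          # "a factor of more than... 1 million"
start_year, end_year = 2022, 2040  # assumed horizon, for illustration only
years = end_year - start_year      # 18 years

annual_gain = target_factor ** (1 / years)
print(f"Required gain: x{annual_gain:.2f} per year, every year, for {years} years")
# Prints roughly "x2.15 per year": more than doubling the energy
# efficiency of computing every single year until 2040.
```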

I will illustrate my concern with the SKA (Square Kilometer Array) project, a huge radio telescope that will comprise 200 antennas in South Africa and 130,000 antennas in Australia. It will generate a data rate of a few tens of terabytes per second, and archiving this data will represent hundreds of petabytes per year [2]. Yet, today, we do not know how we will be able to store all this data: information technology and renewable energy will require radical innovations, but innovation cannot be decreed, and nothing today allows us to say that this ambitious project will not have to be scaled back, at least at the beginning.
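To give an idea of the orders of magnitude involved, here is a minimal sketch based only on the rounded figures quoted above (20 TB/s and 300 PB/year are my own illustrative stand-ins for “a few tens” and “hundreds”, not official SKA specifications):

```python
# Orders of magnitude only: raw data stream vs. yearly archive for SKA.
SECONDS_PER_YEAR = 365 * 24 * 3600      # about 3.15e7 seconds

raw_rate_tb_per_s = 20                  # "a few tens of terabytes per second"
archive_pb_per_year = 300               # "hundreds of petabytes per year"

raw_pb_per_year = raw_rate_tb_per_s * SECONDS_PER_YEAR / 1_000  # 1 PB = 1,000 TB
print(f"Raw data produced:  ~{raw_pb_per_year:,.0f} PB per year")
print(f"Data archived:      ~{archive_pb_per_year} PB per year")
print(f"Reduction factor:   ~{raw_pb_per_year / archive_pb_per_year:,.0f}x")
# Even after throwing away well over 99.9% of the raw stream,
# hundreds of petabytes per year still have to be stored somewhere.
```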

It seems we often forget that the human brain cannot keep up with the evolution of its tools at the same speed: we imagine compensating for this lag by the widespread use of artificial intelligence and robotization, whose consequences are anything but trivial, nothing less than the enslavement to the instrument of the very brain that nature has given us. In this frantic race, I see two possible endings:
Either we come up against limits that we will have to live with, which will probably give humans a chance to reclaim their rights;
Or we carry on indefinitely, and then…
I suspect that the latter is the last thing the readers of this blog (and we ourselves) would want.

[1] As evidenced by the constant updates of all the standard office software.
[2] Tera = 10¹² and Peta = 10¹⁵, i.e. one million billion…
