New Formula Will Make It Possible to Check the Effectiveness of Parallel Algorithms without a Supercomputer

Doctor of Sciences (Physics and Mathematics), Professor Leonid Sokolinsky has derived a formula that will help mathematicians and programmers evaluate the effectiveness of parallel algorithms. The scientist spent four years working on this formula at the SUSU Supercomputer Simulation Laboratory. The research results have been published in the highly rated Journal of Parallel and Distributed Computing (Q1).
 

Parallel algorithms for a supercomputer

Programs based on complex numerical algorithms are used in many spheres of life: for example, to track changes in stock prices on the major exchanges, to solve complex logistics problems when planning routes for cargo shipment and storage, or to compile optimal flight timetables. A standard computer, even a very powerful one, cannot cope with such tasks. That is why supercomputers are used to solve them, and parallel algorithms are created that can engage multiple processors simultaneously. The more processor cores an algorithm uses effectively, the higher its scalability.
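To give a feel for what scalability means, here is a classical textbook illustration (Amdahl's law), not the formula from the research described in this article: the speedup of a program on p cores is limited by the fraction of it that can run in parallel, so even a small serial portion caps how many cores can be used effectively. The parallel fractions and core counts below are purely hypothetical. A minimal Python sketch:

    # Classical Amdahl's law illustration; NOT Professor Sokolinsky's formula.
    # If a fraction f of an algorithm can run in parallel, the speedup on p cores is
    #   S(p) = 1 / ((1 - f) + f / p)
    def amdahl_speedup(f: float, p: int) -> float:
        """Speedup of an algorithm with parallel fraction f on p cores."""
        return 1.0 / ((1.0 - f) + f / p)

    if __name__ == "__main__":
        for f in (0.90, 0.99, 0.999):            # hypothetical parallel fractions
            for p in (10, 100, 1000, 10000):     # hypothetical core counts
                print(f"f={f:>5}  p={p:>5}  speedup={amdahl_speedup(f, p):8.1f}")

Even with 99% of the work parallelized, the speedup on 10,000 cores stays below 100, which is why poorly designed algorithms waste most of a supercomputer's processors.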
 

"Modern science, economics, and industry require the solving of optimization problems of huge computational complexity. For instance, a task of optimizing the operation of all the traffic lights in a city in such a way that would maximize the avoiding of the transportation congestion and jams. To solve this, a mathematician and a programmer will have to write a complex algorithm and split it into parallel parts, which will be performed by different processors. And it's very difficult. The more processors in a supercomputer, the more difficult it is to come up with an algorithm that will effectively use these thousands, dozens of thousands, and millions of processors. However, is a parallel algorithm is designed not that well, it will prove to be inefficient even on a supercomputer," shares Professor Sokolinsky.
 

Highly scalable parallel algorithms are also needed in fundamental science: in physics, for example, for solving problems related to Bell's theorem.
 

Saving time and money

To evaluate the scalability of a new algorithm, it is necessary to run a large-scale series of experiments, launching the implemented parallel program on a supercomputer. Often, after all that effort, it turns out that the algorithm is not efficient. It then has to be amended, or a new algorithm developed, so that a program based on it can perform the required tasks quickly, accurately, and efficiently. That is why the scientist has proposed a new model of parallel computation that allows a new algorithm to be tested before programming begins. Such an approach significantly reduces software development time and allows all the required amendments to be made before the program is written.
 

"My goal was to create such a model of parallel computing, which would allow to conclude whether an algorithm can effectively use a supercomputer or not, and to do it at the earliest stage of the algorithm development, without programming, without launching it on a supercomputer, and without spending time and money on it. Using the formula, which I have proposed, and just a pen, paper and a calculator, can help you understand whether this or that algorithm will turn out to be effective when performed on a supercomputer," explains the scientist. "There are lots of parallel computing models, several dozens of them, but neither of those offers a ready-to-be-used formula that would help accurately assess the predicted scalability of an algorithm. In the model that I have proposed, for the first time in the world, such a formula has been deduced; and moreover, its effectiveness has been proved to be practically relevant during solving a variety of tasks."
 

Next, based on the developed model, it is planned to create a "programming framework" – a template that other mathematicians and programmers will be able to use to quickly create parallel programs for solving optimization problems of high computational complexity.
 

South Ural State University is a university of digital transformations, where innovative research is conducted in most of the priority fields of science and technology. In accordance with the strategy of scientific and technological development of the Russian Federation, the university focuses on large interdisciplinary research projects in the fields of digital industry, materials science, and ecology.
 

Research in the field of new technologies is among the priorities of the World-class Ural Interregional Research and Education Centre for Advanced Industrial Technologies and Materials (UIREC), which was established through the joint efforts of UrFU, SUSU, KSU, the Ural Branch of the Russian Academy of Sciences, and industrial corporations of the Chelyabinsk, Sverdlovsk, and Kurgan regions.

 


Elena Kiriakova, photo by: Oleg Igoshin