A Study of a Neural Network Method for Solving Linear Programming Problems

Nikolay Alexandrovich Olkhovsky

Abstract


The article investigates a method for determining the movement vector along the hyperplanes bounding the feasible polytope of a multidimensional linear programming problem, based on visual images fed to the input of a feedforward neural network. The visualization algorithm constructs a receptive field in the neighborhood of a point located on a bounding hyperplane. For each point of the receptive field, a scalar displacement to the surface of the hyperplane is computed and assigned to that point as its value. The resulting visual image is fed to a feedforward neural network, which computes, on the bounding hyperplane, the direction of the maximum increase of the objective function. The article proposes an improved, cross-shaped form of the receptive field. The construction of a training set from randomly generated bounding hyperplanes and objective functions in multidimensional spaces is described. A scalable neural network architecture with a variable number of hidden layers is developed, and its hyperparameters are tuned. Computational experiments confirm the high accuracy (over 98%) of the cross-shaped receptive field. The dependence of the accuracy of the neural network on the number of hidden layers and the duration of training is also studied.
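
As an illustration of the pipeline outlined above, the following minimal Python/PyTorch sketch builds a cross-shaped receptive field around a point on a bounding hyperplane, computes a scalar displacement for every field point, and feeds the resulting image into a feedforward network with a configurable number of hidden layers. It is not the authors' implementation: the field layout (K points per axis in both directions with step delta), the displacement formula (signed distance to the hyperplane <a, x> = b measured along the normalized objective gradient c), and the names DirectionNet, hidden_dim and num_hidden are assumptions made here for illustration only.

# Minimal sketch of the described pipeline; the field layout, displacement
# formula and all parameter names are illustrative assumptions, not the
# authors' implementation.
import numpy as np
import torch
import torch.nn as nn


def cross_receptive_field(z: np.ndarray, delta: float, K: int) -> np.ndarray:
    """Assumed cross-shaped field: K steps of size delta along each coordinate
    axis, in both directions, starting from the central point z."""
    n = z.shape[0]
    points = [z.copy()]
    for i in range(n):
        for k in range(1, K + 1):
            for sign in (+1.0, -1.0):
                p = z.copy()
                p[i] += sign * k * delta
                points.append(p)
    return np.asarray(points)                # shape: (1 + 2*K*n, n)


def displacement_image(points: np.ndarray, a: np.ndarray, b: float,
                       c: np.ndarray) -> np.ndarray:
    """Assumed displacement: signed distance from a field point to the
    hyperplane <a, x> = b measured along the normalized objective gradient c
    (requires <a, c> != 0)."""
    c_unit = c / np.linalg.norm(c)
    return (b - points @ a) / (a @ c_unit)   # one scalar per field point


class DirectionNet(nn.Module):
    """Feedforward network with a variable number of hidden layers that maps
    the visual image to a movement direction on the bounding hyperplane."""
    def __init__(self, in_dim: int, out_dim: int, hidden_dim: int = 256,
                 num_hidden: int = 3):
        super().__init__()
        layers, d = [], in_dim
        for _ in range(num_hidden):
            layers += [nn.Linear(d, hidden_dim), nn.ReLU()]
            d = hidden_dim
        layers.append(nn.Linear(d, out_dim))
        self.net = nn.Sequential(*layers)

    def forward(self, image: torch.Tensor) -> torch.Tensor:
        return self.net(image)


# Toy usage: n = 5 variables, K = 4 field points per axis direction.
n, K, delta = 5, 4, 0.1
rng = np.random.default_rng(0)
a = rng.normal(size=n); b = 1.0              # bounding hyperplane <a, x> = b
c = rng.normal(size=n)                       # objective-function gradient
z = a * b / (a @ a)                          # a point lying on the hyperplane
field = cross_receptive_field(z, delta, K)
image = displacement_image(field, a, b, c)   # visual image of the field
model = DirectionNet(in_dim=image.size, out_dim=n, num_hidden=3)
direction = model(torch.tensor(image, dtype=torch.float32))
print(direction.shape)                       # torch.Size([5]): raw output
                                             # interpreted as a direction

In this sketch the image contains 1 + 2*K*n scalars, so the input dimension of the network grows linearly with the dimension n of the problem space, which is what allows the number of hidden layers to be varied independently of n.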

Keywords


linear programming; surface movement method; artificial neural network; deep learning


DOI: http://dx.doi.org/10.14529/cmse230402