The article considers the problems caused by the use of improper-quality construction materials in Russia. The concepts of sub-standard, fraudulent and counterfeit construction materials (SFC materials) are defined. The distinctive features of SFC products in the construction materials market and the consequences of non-compliance with general quality requirements are highlighted. The features of digital marking of construction materials in Russia are analyzed. The key problems of incoming inspection performed by the contractor, and their causes, are identified. In conclusion, recommendations are given for improving the efficiency of the incoming inspection system for construction materials.
Keywords: construction materials, quality, fraudulent products, counterfeit products, labeling, safety of buildings and structures, construction control, incoming inspection
The article explores the application of the residue number system in text information processing. The residue number system, based on the principles of modular arithmetic, represents numbers as sets of residues relative to pairwise coprime moduli. This approach enables parallel computation, potential data compression, and increased noise immunity. The study addresses issues such as character encoding, parallel information processing, error detection and correction, computational advantages in implementing polynomial hash functions, as well as practical limitations of the residue number system.
Keywords: residue number system, modular arithmetic, text processing, parallel computing, data compression, noise immunity, Chinese remainder theorem, polynomial hashing, error correction, computational linguistics
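As a minimal illustration of the representation described above, the sketch below maps an integer to its residues and recovers it via the Chinese remainder theorem (a toy example with three small moduli, not the article's encoding scheme):

```python
from math import prod

def to_rns(x, moduli):
    """Encode an integer as its residues modulo pairwise coprime moduli."""
    return [x % m for m in moduli]

def from_rns(residues, moduli):
    """Recover the integer from its residues via the Chinese remainder theorem."""
    M = prod(moduli)
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        # pow(Mi, -1, m) is the modular inverse of Mi modulo m (Python 3.8+)
        x += r * Mi * pow(Mi, -1, m)
    return x % M

moduli = [3, 5, 7]          # pairwise coprime, dynamic range 0..104
code = to_rns(52, moduli)   # → [1, 2, 3]
assert from_rns(code, moduli) == 52
```

Each residue channel can be processed independently, which is the source of the parallelism mentioned in the abstract.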
Ontological modeling is a promising direction in the development of the scientific and methodological base for developing intelligent information systems in the power industry. The article proposes a new approach to using ontological models in creating artificial intelligence systems for forecasting time series in electrical engineering problems. Formal metrics are introduced: the ontological distance between a feature and a target variable, as well as the semantic relevance of a feature. Using examples of domain ontologies for wind energy and for the electricity consumption of an industrial enterprise, algorithms for calculating these metrics are demonstrated, and it is shown how they make it possible to rank features, automate the selection of the most significant features, and provide semantic regularization when training regression models of various types. Recommendations are given for choosing coefficients for calculating the metrics, an analysis of their theoretical properties is carried out, and the applicability limits of the proposed approach are outlined. The results obtained form the basis for further integration of ontological information into mathematical and computer models for forecasting electricity generation and consumption in the development of industry intelligent systems.
Keywords: ontology, ontological distance, feature relevance, systems analysis, explainable artificial intelligence, power industry, generation forecasting, electricity consumption forecasting
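The metrics themselves cannot be reconstructed from the abstract alone; as a hedged illustration, the sketch below treats ontological distance as a shortest-path length over a toy concept graph and derives a relevance score that decays with distance (both the graph and the 1/(1+d) formula are illustrative assumptions, not the article's definitions):

```python
from collections import deque

# A toy domain ontology as an undirected graph of concept links
# (edge types and weights used in the article's metric are not reproduced).
ontology = {
    "wind_speed":      ["weather", "turbine_power"],
    "weather":         ["wind_speed", "air_temperature"],
    "air_temperature": ["weather"],
    "turbine_power":   ["wind_speed"],
}

def ontological_distance(src, dst, graph):
    """Shortest-path length between two concepts (BFS over concept links)."""
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        node, d = queue.popleft()
        if node == dst:
            return d
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, d + 1))
    return float("inf")

def semantic_relevance(feature, target, graph):
    """One plausible relevance score: decays with ontological distance."""
    return 1.0 / (1.0 + ontological_distance(feature, target, graph))

# Features closer to the target concept in the ontology rank higher.
assert ontological_distance("wind_speed", "turbine_power", ontology) == 1
assert semantic_relevance("wind_speed", "turbine_power", ontology) > \
       semantic_relevance("air_temperature", "turbine_power", ontology)
```

Ranking features by such a score is one simple way to implement the automated feature selection the abstract describes.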
In the modern world, when technology is developing at an incredible rate, computers have gained the ability to "see" and perceive the world around them like a human. This has led to a revolution in visual data analysis and processing. One of the key achievements was the use of computer vision to search for objects in photographs and videos. Thanks to these technologies, it is possible not only to find objects such as people, cars or animals, but also to accurately indicate their position using bounding boxes or masks for segmentation. This article discusses in detail modern models of deep neural networks used to detect humans in images and videos taken from a height and a long distance against a complex background. The architectures of the Faster Region-based Convolutional Neural Network (Faster R-CNN), Mask Region-based Convolutional Neural Network (Mask R-CNN), Single Shot Detector (SSD) and You Only Look Once (YOLO) are analyzed, their accuracy, speed and ability to effectively detect objects in conditions of a heterogeneous background are compared. Special attention is paid to studying the features of each model in specific practical situations, where both high-quality target object detection and image processing speed are important.
Keywords: machine learning, artificial intelligence, deep learning, convolutional neural networks, human detection, computer vision, object detection, image processing
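Detectors of this kind are conventionally compared by how well predicted bounding boxes overlap the ground truth; a minimal intersection-over-union (IoU) helper, the standard ingredient of such accuracy evaluations:

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

assert iou((0, 0, 10, 10), (0, 0, 10, 10)) == 1.0
# A detection shifted by half the box width overlaps 50 of 150 joint pixels:
assert abs(iou((0, 0, 10, 10), (5, 0, 15, 10)) - 1/3) < 1e-9
```

A detection is typically counted as correct when its IoU with a ground-truth box exceeds a threshold such as 0.5.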
The relevance of the study is due to the following: firstly, the proposed topic is in line with the Government-approved "Strategy for the development of the construction industry and housing and communal services of the Russian Federation for the period up to 2030 with a forecast up to 2035" dated October 31, 2022, which assumes an increase in the share of industrial housing construction, including panel construction; secondly, the existing methods for calculating the joints of panel buildings have come down to us almost unchanged from the level of construction science and engineering of the 1980s, which leads to increased material costs and rising real estate prices; the current trend in the development of the construction industry therefore requires the improvement of these methods. Modern approaches to nonlinear modeling and calculation of reinforced concrete, applied to panel joints, make it possible to reveal the reserves of their bearing capacity and obtain a more rational design, which will reduce the overall cost of constructing such buildings. PURPOSE. To carry out a comparative analysis of existing methods for calculating panel joints for the two groups of limit states, to identify their advantages and disadvantages, and to identify ways to improve them. METHODS used in the course of the research: theoretical methods – chronological analysis, formalization, classification, analysis, synthesis, generalization, comparison. RESULTS. The analysis of foreign and domestic design standards, as well as various authors' methods, showed that: 1) foreign and domestic standards are based on the limit state method; 2) the nonlinear deformation model is not used in the calculation of joints of reinforced concrete panels; 3) obtaining a more accurate stress-strain state of reinforced concrete panel joints requires processing a large amount of data using computer technology. CONCLUSION.
The analysis shows that the use of computer software systems is the most promising method for calculating building structures, allowing fast and accurate calculations, reducing the cost of construction.
Keywords: large-panel construction, reinforced concrete, panel joint, joint calculation, joint classification, platform joint, deformation model, limiting forces, computer modeling, finite element
This article presents the development of a combined method for summarizing Russian-language texts that integrates extractive and abstractive approaches to overcome the limitations of existing methods. The method comprises the following stages: text preprocessing, comprehensive linguistic analysis using RuBERT, semantic-similarity-based clustering, extractive summarization via the TextRank algorithm, and abstractive refinement using the RuT5 neural network model. Experiments conducted on the Gazeta.Ru news corpus confirmed the method's superiority in terms of precision, recall, F-score, and ROUGE metrics, demonstrating the advantage of the combined approach over purely extractive methods (such as TF-IDF and statistical methods) and abstractive methods (such as RuT5 and mBART).
Keywords: combined method, summarization, Russian-language texts, TextRank, RuT5
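The extractive stage can be sketched as PageRank over a word-overlap sentence-similarity graph, in the spirit of TextRank (a stdlib-only toy, not the article's implementation):

```python
import math, re

def sentence_scores(sentences, d=0.85, iters=50):
    """TextRank-style scoring: PageRank over word-overlap sentence similarity."""
    words = [set(re.findall(r"\w+", s.lower())) for s in sentences]
    n = len(sentences)

    def sim(i, j):
        # Overlap normalized by sentence lengths, as in the original TextRank.
        common = len(words[i] & words[j])
        if common == 0 or len(words[i]) < 2 or len(words[j]) < 2:
            return 0.0
        return common / (math.log(len(words[i])) + math.log(len(words[j])))

    w = [[sim(i, j) if i != j else 0.0 for j in range(n)] for i in range(n)]
    row = [sum(r) for r in w]
    scores = [1.0] * n
    for _ in range(iters):  # power iteration with damping factor d
        scores = [(1 - d) + d * sum(w[j][i] * scores[j] / row[j]
                                    for j in range(n) if row[j] > 0)
                  for i in range(n)]
    return scores

s = sentence_scores(["the cat sat on the mat",
                     "the cat ate fish",
                     "dogs bark loudly"])
# The two related sentences reinforce each other; the isolated one ranks last.
assert s[0] > s[2] and s[1] > s[2]
```

The top-scoring sentences form the extractive summary that the abstractive RuT5 stage would then refine.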
The paper considers a stochastic model of the operation of an automatic information processing system, described by a system of Kolmogorov differential equations for the state probabilities under the assumption that the flow of requests is Poisson (in particular, the simplest flow). A scheme is proposed for solving a high-dimensional system of differential equations with slowly changing initial data, and the parameters of the presented model are compared with those of a simulation model of the Apache HTTP Server. To compare the simulation and stochastic models, a test server was used: requests were generated and their processing simulated with the Apache JMeter program, which was used to estimate the parameters of the incoming and processed request flows. The presented model does not contradict the simulation model; it allows the system's states to be evaluated under different operating conditions and the load on the web server to be calculated for large amounts of data.
Keywords: stochastic modeling, simulation model, Kolmogorov equations, sweep method, queuing system, performance characteristics, test server, request flow, service channels, queue
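For intuition, the sketch below integrates the Kolmogorov equations of a truncated M/M/1 queue with explicit Euler steps and checks convergence to the stationary distribution (a toy scheme; the paper's sweep-method solver for high-dimensional systems is not reproduced):

```python
def kolmogorov_step(p, lam, mu, dt):
    """One explicit Euler step of the Kolmogorov equations for an M/M/1
    queue truncated to states 0..K (K = len(p) - 1)."""
    K = len(p) - 1
    dp = [0.0] * (K + 1)
    for k in range(K + 1):
        out = (lam if k < K else 0.0) + (mu if k > 0 else 0.0)
        dp[k] -= out * p[k]                # probability flow leaving state k
        if k > 0:
            dp[k] += lam * p[k - 1]        # arrival moves k-1 -> k
        if k < K:
            dp[k] += mu * p[k + 1]         # service completion moves k+1 -> k
    return [p[k] + dt * dp[k] for k in range(K + 1)]

lam, mu, K = 1.0, 2.0, 5     # Poisson arrivals, exponential service
p = [1.0] + [0.0] * K        # start with an empty system
for _ in range(50_000):      # integrate to t = 50, well past relaxation
    p = kolmogorov_step(p, lam, mu, 1e-3)

# The solution approaches the stationary geometric distribution
# p_k ∝ (lam/mu)^k of the truncated M/M/1 queue.
rho = lam / mu
stationary = [(1 - rho) * rho**k / (1 - rho**(K + 1)) for k in range(K + 1)]
assert all(abs(a - b) < 1e-6 for a, b in zip(p, stationary))
```

The same birth-death structure scales to many service channels; only the per-state rates change.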
The article considers the parameter identification of adaptive models of linear non-stationary dynamic systems, using a linearized adjustable model of a DC motor with independent excitation as an example. A new method for estimating the parameters of adjustable models from a small number of observations is developed, based on projection identification and the apparatus of linear algebra and analytical geometry. To evaluate the developed method, the transient processes of the adaptive model of the DC motor with the obtained parameter estimates were compared with reference characteristics. The efficiency of the proposed identification method in DC electric drive control problems is shown.
Keywords: DC motor, projection identification, dynamic system parameter estimation, adaptive model of non-stationary dynamic system
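The flavor of estimating model parameters from few observations can be shown with an ordinary least-squares fit of a first-order discrete motor model, i.e. a projection of the response vector onto the span of the regressors (a simplified stand-in, not the paper's projection identification method):

```python
def estimate_ab(omega, u):
    """Least-squares fit of a first-order model w[k+1] = a*w[k] + b*u[k],
    solved via the 2x2 normal equations (the LS solution is the projection
    of the response onto the span of the two regressor vectors)."""
    s_ww = s_wu = s_uu = s_wy = s_uy = 0.0
    for k in range(len(omega) - 1):
        w, v, y = omega[k], u[k], omega[k + 1]
        s_ww += w * w; s_wu += w * v; s_uu += v * v
        s_wy += w * y; s_uy += v * y
    det = s_ww * s_uu - s_wu * s_wu
    a = (s_uu * s_wy - s_wu * s_uy) / det
    b = (s_ww * s_uy - s_wu * s_wy) / det
    return a, b

# Synthetic noiseless data from a known model a=0.9, b=0.5:
true_a, true_b = 0.9, 0.5
u = [1.0, 0.0, 1.0, 1.0, 0.0, 1.0]
omega = [0.0]
for k in range(len(u) - 1):
    omega.append(true_a * omega[-1] + true_b * u[k])

a, b = estimate_ab(omega, u)
assert abs(a - true_a) < 1e-9 and abs(b - true_b) < 1e-9
```

With only six samples the noiseless parameters are recovered exactly, which is the regime (small observation counts) the article targets.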
This article presents the development of an automated system for monitoring the quality of dragees using machine vision and digital simulation technologies. The goal of this work is to create an intelligent system that can detect surface defects of confectionery products, such as chips and cracks, in real time. The proposed approach combines the training of a neural network based on the YOLO architecture using both real and synthetic images, as well as the implementation of digital simulation of the production line for pre-debugging and optimizing the system parameters. The use of a digital model allows testing in conditions close to real, which contributes to an increase in the accuracy of defect classification and a decrease in the costs of equipment setup. The tests carried out confirmed the high efficiency of the proposed system and the expediency of its implementation in the food industry.
Keywords: machine vision, quality control, dragees, YOLO, digital simulation, surface defects
Modern computer systems for controlling chemical-technological processes make it possible to programmatically implement complex control algorithms, including ones using machine learning methods and elements of artificial intelligence. Such algorithms can be applied, among other things, to complex non-stationary multi-product and flexible discrete productions, which include such low-tonnage chemical processes as the production of polymeric materials. The article discusses the production of fluoroplastic in batch reactors. This process occurs under constantly changing parameters such as pressure and temperature. One of the important tasks of the control system is to stabilize the quality of the produced polymer, and for this purpose it is important to predict the quality during the production process, before the fluoroplastic is released. The quality of the product, in turn, strongly depends on both the quality of the initial reagents and the actions of the operator. Under non-stationary process conditions, typical virtual quality analyzers based on regression dependencies show poor results and are not applicable. The article proposes the architecture of a virtual quality analyzer based on mathematical forecasting methods using algorithms such as random forest and gradient boosting.
Keywords: polymerization, multi-product manufacturing, low-tonnage chemistry, quality forecasting, machine learning
The article focuses on the development of a web portal for monitoring and forecasting atmospheric air quality in the Khabarovsk Territory. The study analyzes existing solutions in the field of environmental monitoring, identifying their key shortcomings, such as the lack of real-time data, limited functionality, and outdated interfaces. The authors propose a modern solution based on the Python/Django and PostgreSQL technology stack, which enables the collection, processing, and visualization of air quality sensor data. Special attention is given to the implementation of harmful gas concentration forecasting using a recurrent neural network, as well as the creation of an intuitive user interface with an interactive map based on OpenStreetMap. The article provides a detailed description of the system architecture, including the backend, database, and frontend implementation, along with the methods used to ensure performance and security. The result of this work is a functional web portal that provides up-to-date information on atmospheric air conditions, forecast data, and user-friendly visualization tools. The developed solution demonstrates high efficiency and can be scaled for use in other regions.
Keywords: environmental monitoring, air quality, web portal, forecasting, Django, Python, PostgreSQL, neural networks, OpenStreetMap
This article examines the effectiveness of a digital data transmission system operating over a noisy communication channel using the Huffman compression method and cyclic BCH (Bose–Chaudhuri–Hocquenghem) encoding. Huffman compression reduces data redundancy, which increases the effective transmission rate, while BCH codes detect and correct errors caused by channel noise. The analysis covers parameters such as compression ratio, data transmission rate, error probability after decoding, and computational complexity of the algorithms. The results demonstrate the effectiveness of this combination of techniques in improving data transmission reliability in noisy environments.
Keywords: digital transmission system, cyclic coding, compression ratio, decoding, encoding
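The compression stage can be sketched with a classical Huffman code builder (stdlib only; the BCH encoding/decoding stage is not reproduced here):

```python
import heapq
from collections import Counter

def huffman_codes(text):
    """Build a Huffman code table: frequent symbols get shorter codewords."""
    freq = Counter(text)
    # Heap items: (weight, tie-breaker index, partial code table).
    # The index keeps tuple comparison away from the dicts.
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                      # degenerate single-symbol input
        return {next(iter(freq)): "0"}
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)
        w2, i2, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (w1 + w2, i2, merged))
    return heap[0][2]

codes = huffman_codes("abracadabra")
# 'a' occurs 5 of 11 times and must get the shortest codeword.
assert len(codes["a"]) == min(len(c) for c in codes.values())
# Prefix property: no codeword is a prefix of another.
vals = list(codes.values())
assert not any(x != y and y.startswith(x) for x in vals for y in vals)
```

The prefix property is what lets the decoder recover symbol boundaries from the bit stream without separators.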
The article focuses on developing and testing a methodology for quantitatively assessing the robustness of time series models (SARIMA, Prophet, LSTM) used in inventory management under unstable external conditions. The proposed approach involves modeling input data distortion scenarios, including holiday demand surges, logistical disruptions, inflationary trends, and structural shifts. Robustness metrics—Robustness Index (RI) and Degradation Ratio (D)—are introduced to evaluate forecast degradation. Experiments on synthetic data show that high accuracy on clean data does not guarantee robustness: SARIMA is sensitive to inflationary trends, Prophet is robust to seasonality, and LSTM is vulnerable to structural shifts. The findings are applicable in logistics and retail for optimizing supply planning.
Keywords: inventory management, time series, model robustness, demand forecasting, SARIMA, Prophet, LSTM, Robustness Index, Degradation Ratio, logistical disruptions, seasonal fluctuations
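The article's exact metric definitions are not given in the abstract; one plausible reading, used in the sketch below, takes the Degradation Ratio as the error growth factor under distorted inputs and the Robustness Index as its reciprocal (both formulas are assumptions for illustration):

```python
def mae(y_true, y_pred):
    """Mean absolute error of a forecast."""
    return sum(abs(a - b) for a, b in zip(y_true, y_pred)) / len(y_true)

def degradation_ratio(err_clean, err_perturbed):
    """Assumed definition: how many times the error grows under distortion."""
    return err_perturbed / err_clean

def robustness_index(err_clean, err_perturbed):
    """Its reciprocal: 1.0 means no degradation, values near 0 mean fragile."""
    return err_clean / err_perturbed

y_true     = [100, 110, 120, 130]
pred_clean = [ 98, 112, 119, 131]   # forecast on clean history
pred_shock = [ 90, 100, 140, 150]   # same model after a demand surge in the input

e_c, e_p = mae(y_true, pred_clean), mae(y_true, pred_shock)
assert degradation_ratio(e_c, e_p) > 1      # accuracy degrades under distortion
assert 0 < robustness_index(e_c, e_p) < 1
```

Comparing such ratios across SARIMA, Prophet and LSTM is exactly the kind of scenario-wise robustness audit the abstract describes.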
The paper presents a methodology that includes stages of task performance control, data collection and analysis, determination of reliability and efficiency criteria, reasonable selection, communication, implementation and control of the results of management decisions. A cyclic algorithm for comprehensive verification of compliance with the reliability and efficiency criteria of the system has been developed, allowing for prompt response to changes, increased system stability and adaptation to adverse environmental impacts. Improved mathematical formulas for assessing the state of organizational systems are proposed, including calculation of the readiness factor, level of planned task performance and compliance with established requirements. The application of the methodology is aimed at increasing the validity of decisions made while reducing the time for decision-making, as well as ensuring the relevance, completeness and reliability of information in information resources in the interests of sustainable development of organizational systems.
Keywords: algorithms, time, control, reliability and efficiency criteria, indicators, resources, management decisions, cyclicity
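The readiness factor mentioned above is classically the stationary availability MTBF/(MTBF + MTTR); a minimal sketch, with an illustrative task-performance level (the paper's improved formulas are not reproduced):

```python
def readiness_factor(mtbf, mttr):
    """Stationary availability: the fraction of time the system is operable,
    given mean time between failures and mean time to repair."""
    return mtbf / (mtbf + mttr)

def task_performance_level(completed, planned):
    """Share of planned tasks completed in the control period."""
    return completed / planned if planned else 1.0

# Example: 490 h mean time between failures, 10 h mean time to repair.
assert abs(readiness_factor(490, 10) - 0.98) < 1e-12
assert task_performance_level(45, 50) == 0.9
```

Thresholds on such indicators are what the cyclic verification algorithm would check on each control pass.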
The article discusses the possibility of controlling engines in a multi-engine system using a fuzzy logic algorithm. An example of commands for controlling a system of several engines is given.
Keywords: fuzzy logic, automation, process modeling, PID control, motor control
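A minimal flavor of such control: triangular memberships and weighted-average defuzzification mapping a speed error to a command (the memberships, rule set, and output levels are illustrative assumptions, not the article's):

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_speed_command(error):
    """Map a speed error to a throttle command with three fuzzy rules."""
    rules = [
        (tri(error, -10, -5, 0), 0.2),   # error negative -> slow down
        (tri(error,  -5,  0, 5), 0.5),   # error near zero -> hold
        (tri(error,   0,  5, 10), 0.8),  # error positive -> speed up
    ]
    # Weighted-average (centroid-style) defuzzification.
    num = sum(mu * out for mu, out in rules)
    den = sum(mu for mu, _ in rules)
    return num / den if den else 0.5

assert fuzzy_speed_command(0) == 0.5        # only the "hold" rule fires
assert fuzzy_speed_command(5) == 0.8        # only "speed up" fires fully
assert 0.5 < fuzzy_speed_command(2.5) < 0.8 # blended between two rules
```

For several motors the same rule evaluation runs per motor, with the error taken relative to each motor's setpoint.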
This article discusses the implementation features of named entity recognition models. In the course of the work, a number of experiments were conducted with traditional models, well-known neural network architectures, and a hybrid model; the features of the results, their comparison and possible explanations are considered. In particular, it is shown that a hybrid model that adds a bidirectional long short-term memory network can give better results than the basic bidirectional transformer-based representation model. It is also shown that a bidirectional transformer-based representation model, improved by adding a dropout layer for regularization, a weighted loss function and a linear classifier on top of the outputs, can achieve high metric values. For clarity, the work provides model training graphs and tables with metrics for comparison. Conclusions and recommendations were formed in the process of the work.
Keywords: text analysis, artificial intelligence, named entity recognition, neural networks, deep learning, machine learning
The paper considers the effect of particle size on the dynamics of suspended sediments in a riverbed. The EcoGIS-Simulation computing complex is used to simulate the joint dynamics of surface waters and sediments in a model of the Volga River below the Volga hydroelectric dam. The most important factor in the variability of the riverbed is the spring release of water from the Volgograd reservoir, when the water discharge increases fivefold. Integral and local characteristics of the riverbed are calculated as functions of the particle size coefficient.
Keywords: suspended sediment, soil particle size, sediment dynamics, diffusion, bottom sediments, channel morphology, relief, particle gravitational settling velocity, EcoGIS-Simulation software and hardware complex, Wexler formula, water flow
To ensure the stable and reliable operation of isolated power systems, models based on rapid processing and analysis of non-Gaussian data are needed; such models contribute to increased energy efficiency and improved energy management. Within the framework of the theory of optimal control of power consumption, and based on a comprehensive ranking analysis procedure, a model of regime rationing was developed. It differs from known models in that, for the first time, an R-distribution device based on rank analysis is used, together with a device and method of regime rationing that automatically ensure the required stable power consumption of the regional electrical complex under resource constraints.
Keywords: regime normalization, rank analysis, OLAP data cube, bisection method, entropy, topological data, rank topological measure, resource constraints plan, approximation, regional electric complex, power consumption
The article examines the influence of the data processing direction on the results of the discrete cosine transform (DCT). Based on group theory, the symmetries of the DCT basis functions are considered, and the changes that occur when the direction of signal processing is reversed are analyzed. It is shown that the antisymmetric components of the basis change sign when the sample order is reversed, while the symmetric ones remain unchanged. Modified expressions for the block DCT are proposed that take the processing direction into account. The invariance of the frequency composition of the transform to the data processing direction has been experimentally confirmed. The results demonstrate the applicability of the proposed approach to the analysis of arbitrary signals, including image processing and data compression.
Keywords: discrete transforms, basis functions, invariance, symmetry, processing direction, matrix representation, correlation
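The stated symmetry property is easy to verify directly on the DCT-II basis: reversing the sample order leaves even-index (symmetric) basis vectors unchanged and flips the sign of odd-index (antisymmetric) ones:

```python
import math

def dct_basis(k, n):
    """k-th DCT-II basis vector of length n (unnormalized)."""
    return [math.cos(math.pi * k * (2 * i + 1) / (2 * n)) for i in range(n)]

n = 8
for k in range(n):
    v = dct_basis(k, n)
    rev = v[::-1]                      # samples processed in reverse order
    if k % 2 == 0:
        # symmetric components are unchanged by reversing the sample order
        assert all(abs(a - b) < 1e-12 for a, b in zip(v, rev))
    else:
        # antisymmetric components change sign
        assert all(abs(a + b) < 1e-12 for a, b in zip(v, rev))
```

This follows from cos(πk(2(n-1-i)+1)/2n) = (-1)^k · cos(πk(2i+1)/2n), and it is why the frequency magnitudes are invariant to the processing direction.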
Modern engineering equipment operation necessitates solving optimal control problems based on measurement data from numerous physical and technological process parameters. The analysis of multidimensional data arrays and their approximation with analytical dependencies is a task of both current and practical significance. Existing software solutions demonstrate limitations when working with multidimensional data or provide only fixed sets of basis functions. Objectives. The aim of this study is to develop software for multidimensional regression based on the least squares method and a library of constructible basis functions, enabling users to create and utilize diverse basis functions for approximating multidimensional data. Methods. The development employs a generalized least squares method model with loss function minimization in the form of a multidimensional elliptical paraboloid. LASSO (L1), ridge regression (L2), and Elastic Net regularization mechanisms enhance model generalization and numerical stability. A precomputation strategy reduces asymptotic complexity from O(b²·N·f·log₂(p)) to O(b·N·(b+f·log₂(p))). The software architecture includes recursive algorithms for basis function generation, WebAssembly for computationally intensive operations, and modern web technologies including Vue3, TypeScript, and visualization libraries. Results. The developed web application provides efficient approximation of multidimensional data with 2D and 3D visualization capabilities. Quality assessment employs MSE, R², and AIC metrics. The software supports XLSX data loading and intuitive basis function construction through a user-friendly interface. Conclusion. The practical value lies in creating a publicly accessible tool at https://datapprox.com for analyzing and modeling complex multidimensional dependencies without requiring additional software installation.
Keywords: approximation, least squares method, basis functions, multidimensional regression, L1/L2 regularization, web-based
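The core fitting step can be sketched as the ridge (L2) normal equations solved by Gaussian elimination (a stdlib-only sketch; the application's LASSO and Elastic Net paths, basis construction, and precomputation strategy are not reproduced):

```python
def ridge_fit(X, y, lam=0.0):
    """Least squares with L2 (ridge) regularization via the normal equations
    (X^T X + lam*I) w = X^T y, solved by Gaussian elimination."""
    n, p = len(X), len(X[0])
    A = [[sum(X[k][i] * X[k][j] for k in range(n)) + (lam if i == j else 0.0)
          for j in range(p)] for i in range(p)]
    b = [sum(X[k][i] * y[k] for k in range(n)) for i in range(p)]
    for col in range(p):                       # elimination with partial pivoting
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for c in range(col, p):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    w = [0.0] * p
    for i in reversed(range(p)):               # back substitution
        w[i] = (b[i] - sum(A[i][j] * w[j] for j in range(i + 1, p))) / A[i][i]
    return w

# y = 2*x0 + 3*x1 is recovered exactly when lam = 0:
X = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 1.0]]
y = [2.0, 3.0, 5.0, 7.0]
w = ridge_fit(X, y)
assert all(abs(a - b) < 1e-9 for a, b in zip(w, [2.0, 3.0]))
# L2 regularization shrinks the coefficients toward zero:
w_r = ridge_fit(X, y, lam=10.0)
assert all(abs(c) < t for c, t in zip(w_r, [2.0, 3.0]))
```

Here each column of X would be one evaluated basis function; the regularization term is what keeps the solve numerically stable for near-collinear bases.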
The study addresses the problem of short-term forecasting of ice temperature in engineering systems with high sensitivity to thermal loads. A transformer-based architecture is proposed, enhanced with a physics-informed loss function derived from the heat balance equation. This approach accounts for the inertial properties of the system and aligns the predicted temperature dynamics with the supplied power and external conditions. The model is tested on data from an ice rink, sampled at one-minute intervals. A comparative analysis is conducted against baseline architectures including LSTM, GRU, and Transformer using MSE, MAE, and MAPE metrics. The results demonstrate a significant improvement in accuracy during transitional regimes, as well as robustness to sharp temperature fluctuations—particularly following ice resurfacing. The proposed method can be integrated into intelligent control loops for engineering systems, providing not only high predictive accuracy but also physical interpretability. The study confirms the effectiveness of incorporating physical knowledge into neural forecasting models.
Keywords: short-term forecasting, time series analysis, transformer architecture, machine learning, physics-informed modeling, predictive control
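The physics-informed term can be sketched as a penalty tying consecutive predicted temperatures to a lumped heat-balance model (plain Python for clarity; the balance form and all coefficients here are illustrative placeholders, not the article's calibrated equation):

```python
def physics_informed_loss(t_pred, t_true, power, t_ambient,
                          alpha=0.1, c=50.0, h=2.0):
    """MSE plus a penalty tying predicted temperature dynamics to a lumped
    heat balance  c*dT/dt ≈ -P_cooling + h*(T_amb - T).
    alpha weights the physics term; c and h are illustrative constants."""
    n = len(t_pred)
    mse = sum((p - t) ** 2 for p, t in zip(t_pred, t_true)) / n
    phys = 0.0
    for k in range(n - 1):
        dT = t_pred[k + 1] - t_pred[k]                      # per time step
        balance = (-power[k] + h * (t_ambient[k] - t_pred[k])) / c
        phys += (dT - balance) ** 2                         # residual of the ODE
    phys /= max(n - 1, 1)
    return mse + alpha * phys

t = [-5.0, -5.1, -5.2]                 # predicted ice temperatures, 1 per minute
# With alpha = 0 the loss reduces to plain MSE (zero on a perfect fit);
# with alpha > 0 the physics residual is added as a non-negative penalty.
assert physics_informed_loss(t, t, [1.0, 1.0], [20.0, 20.0], alpha=0.0) == 0.0
assert physics_informed_loss(t, t, [1.0, 1.0], [20.0, 20.0]) >= 0.0
```

In training, the same expression would be written over tensors so the penalty backpropagates through the transformer's predictions.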
This paper is devoted to the construction of a visual-inertial odometry system for an unmanned vehicle using both binocular cameras and inertial sensors as an information source, which would be able to simultaneously determine the vehicle's own position and the relative position of other road users. To ensure accurate and continuous localization, it is proposed to use an inertial navigation system and two types of image keypoints. Deep learning models are used to accurately and reliably track keypoints. To achieve efficient and reliable matching of objects between two frames, a multi-level data association mechanism is proposed that takes into account possible errors of various system components. The experimental results demonstrate the feasibility and application potential of the proposed system.
Keywords: multi-object visual-inertial odometry, localization, data association, tracking of 3D dynamic objects
This paper is devoted to a theoretical analysis of the methods used to verify the dynamics of a signature captured with a graphics tablet. Three fundamental approaches to this problem are classified: matching against a reference; stochastic modeling; and discriminative classification. Each approach is considered using a specific method as an example: dynamic time warping; hidden Markov models; and the support vector machine. For each method, the theoretical foundations are disclosed, the mathematical apparatus is presented, and the main advantages and disadvantages are identified. The results of the comparative analysis can serve as a theoretical basis for developing modern signature dynamics verification systems.
Keywords: verification, biometric authentication, signature dynamics, graphics tablet, classification of methods, matching against a reference, stochastic modeling, discriminative classification, hidden Markov models, dynamic time warping
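The first of the three methods, dynamic time warping, admits a compact sketch (1-D sequences for brevity; real signature verifiers align multi-channel pen trajectories such as position, pressure and velocity):

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two sequences: the minimal
    cumulative cost over all monotone alignments of their samples."""
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the best of: insertion, deletion, or match
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

reference = [0, 1, 2, 3, 2, 1, 0]
attempt   = [0, 1, 1, 2, 3, 2, 1, 0]   # same shape, written slightly slower
forgery   = [0, 3, 0, 3, 0, 3, 0]
# DTW tolerates tempo variation but penalizes a different stroke shape:
assert dtw_distance(reference, attempt) < dtw_distance(reference, forgery)
```

A verifier then accepts the attempt if its DTW distance to the enrolled reference falls below a learned threshold.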