  • Application of Python for intelligent data analysis in an oil refinery

    The article reflects the basic principles of the application of artificial intelligence (AI) and machine learning (ML) technologies at oil refineries, with a particular focus on Russian industrial enterprises. Modern oil refineries are equipped with numerous sensors embedded in technological units, generating vast volumes of heterogeneous data in real time. Effective processing of this data is essential not only for maintaining the stable operation of equipment but also for optimizing energy consumption, which is especially relevant under the increasing global demand for energy resources. The study highlights how AI and ML methods are transforming data management in the oil industry by enabling predictive analytics and real-time decision-making. The Python programming language plays a central role in this process due to its open-source ecosystem, flexibility, and extensive set of specialized libraries. Key libraries are categorized and discussed: for data preprocessing and manipulation (NumPy, SciPy, Pandas, Dask), for visualization (Matplotlib, Seaborn, Plotly), and for building predictive models (Scikit-learn, PyTorch, TensorFlow, Keras, Statsmodels). In addition, the article discusses the importance of model validation, hyperparameter tuning, and the automation of ML workflows using pipelines to improve the accuracy and adaptability of predictions under variable operating conditions (a minimal pipeline sketch follows the keywords below). Through practical examples based on real industrial datasets, the authors demonstrate the capabilities of Python tools in creating interpretable and robust AI solutions that help improve energy efficiency and support digital transformation in the oil refining sector.

    Keywords: machine learning (ML), artificial intelligence (AI), intelligent data analysis, Python, Scikit-learn, forecasting, energy consumption, oil refining, oil and gas industry, oil refinery
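
    The following minimal sketch is not taken from the article; it only illustrates the kind of Scikit-learn workflow the abstract describes: a preprocessing-plus-model pipeline with hyperparameter search over time-ordered splits. The sensor names, values, and parameter grid are invented for illustration.

      import pandas as pd
      from sklearn.pipeline import Pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.ensemble import GradientBoostingRegressor
      from sklearn.model_selection import GridSearchCV, TimeSeriesSplit

      # Hypothetical sensor readings; the target is hourly energy consumption.
      df = pd.DataFrame({
          "feed_rate": [100, 105, 98, 110, 102, 97, 108, 101],
          "furnace_temp": [355, 360, 349, 366, 358, 350, 363, 356],
          "pressure": [2.1, 2.3, 2.0, 2.4, 2.2, 2.0, 2.3, 2.1],
          "energy_kwh": [510, 530, 495, 545, 520, 498, 538, 515],
      })
      X, y = df.drop(columns="energy_kwh"), df["energy_kwh"]

      pipe = Pipeline([
          ("scale", StandardScaler()),
          ("model", GradientBoostingRegressor(random_state=0)),
      ])
      search = GridSearchCV(
          pipe,
          param_grid={"model__n_estimators": [50, 100], "model__max_depth": [2, 3]},
          cv=TimeSeriesSplit(n_splits=3),      # respects the temporal ordering
          scoring="neg_mean_absolute_error",
      )
      search.fit(X, y)
      print(search.best_params_, search.best_score_)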

  • Application of homogeneous nested piecewise linear regression with clustering of variables to model staffing levels of information protection units

    Mathematical modeling of complex systems often requires the use of variable grouping methods to build effective models. This paper considers the problem of constructing a homogeneous nested piecewise linear regression with variable grouping for modeling the staffing of information protection units. A corresponding model for the Social Fund of Russia is constructed using spatial data for the year 2022. The data on the number of employees of the organization, electronic signatures, protected nodes, protected resources, the total number of structural units, individual buildings and IT service specialists are used as independent variables.

    Keywords: information protection, regression model, homogeneous nested piecewise linear regression, parameter estimation, least modulus method, linear-Boolean programming problem, index set, set power, social fund

  • Controlling a plane-parallel robot using sliding mode

    Differential-algebraic equations describing the motion of a plane-parallel robot manipulator are investigated. The dynamic model is constructed using the Lagrange equation and the substructure method. The design of a control system regulator based on the sliding mode method is considered. The control accuracy is tested on a model of a 3-RRR plane-parallel robot, which consists of three kinematic chains, each with two links and three rotational joints. To study the efficiency of the controller, a circular trajectory is used as the target motion for the multibody system. The considered control system for a plane-parallel robot is capable of solving motion problems and ensuring high positioning accuracy (a simplified single-joint sliding mode sketch follows the keywords below).

    Keywords: control, plane-parallel robot, kinematic characteristics, dynamic model, differential-algebraic equations, constraint equation, controller, sliding mode, Lyapunov function, program trajectory
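
    A simplified illustration of the sliding mode idea mentioned in the abstract, reduced to a single rotational joint rather than the authors' 3-RRR multibody model; the inertia, friction, gains, and trajectory amplitude are assumed values.

      # Plant (assumed): J*qdd = u - b*qd. Sliding surface s = de + lam*e.
      import numpy as np

      J, b = 0.05, 0.02            # assumed inertia and viscous friction
      lam, K = 20.0, 2.0           # sliding surface slope and switching gain
      dt, T = 1e-3, 2.0

      q, qd = 0.3, 0.0             # initial state, offset from the trajectory
      for k in range(int(T / dt)):
          t = k * dt
          q_ref = 0.1 * np.sin(2 * np.pi * t)               # target coordinate
          qd_ref = 0.1 * 2 * np.pi * np.cos(2 * np.pi * t)
          qdd_ref = -0.1 * (2 * np.pi) ** 2 * np.sin(2 * np.pi * t)

          e, de = q - q_ref, qd - qd_ref
          s = de + lam * e                                   # sliding variable
          u = J * (qdd_ref - lam * de) + b * qd - K * np.sign(s)

          qdd = (u - b * qd) / J                             # plant dynamics
          qd += qdd * dt
          q += qd * dt

      print("final tracking error:", abs(q - 0.1 * np.sin(2 * np.pi * T)))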

  • Deploying and Integrating Grafana, Loki, and Alloy in a Kubernetes Environment

    This article presents a structured approach to deploying and integrating Grafana, Loki, and Alloy in Kubernetes environments. The work was performed using a cluster managed via Kubespray. The architecture is focused on ensuring external availability, high fault tolerance, and universality of use.

    Keywords: monitoring, orchestration, containerization, Grafana, Loki, Kubernetes, Alloy

  • Simulation of an incremental encoder based speed sensor in a controlled electric drive

    The paper addresses specific issues in the simulation of a controlled electric drive with speed feedback. The incremental encoder, which is in fact an angle sensor, is widely used as a speed feedback sensor in such drives. Because of its discrete operation it has special features as a speed sensor, and these features have to be taken into account in control system development and simulation. A simulation model of the incremental encoder and the speed signal decoder is presented (a simplified decoding sketch follows the keywords below). The model is implemented in the SimInTech simulation system using both visual modeling and a programming-language-based description.

    Keywords: incremental encoder, speed sensor, quadrature decoder, electric drive simulation, incremental encoder simulation, SimInTech
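
    A minimal sketch of the encoder behaviour described in the abstract, written in Python rather than SimInTech: the shaft angle is quantized into encoder counts and speed is recovered by differencing counts per sampling period, which exposes the discrete-resolution effect mentioned above. The pulse count and motor run-up profile are assumptions.

      import numpy as np

      PPR = 1024                      # assumed encoder pulses per revolution
      CPR = 4 * PPR                   # counts per revolution after x4 quadrature decode
      Ts = 1e-3                       # decoder sampling period, s

      t = np.arange(0.0, 0.5, Ts)
      omega_true = 10.0 * (1 - np.exp(-t / 0.05))          # rad/s, motor run-up
      theta = np.cumsum(omega_true) * Ts                    # true shaft angle, rad

      counts = np.floor(theta / (2 * np.pi) * CPR)          # encoder count register
      omega_est = np.diff(counts, prepend=counts[0]) * (2 * np.pi / CPR) / Ts

      # The discrete nature of the sensor shows up as steps of one count per period:
      print("speed resolution, rad/s:", 2 * np.pi / CPR / Ts)
      print("max estimation error, rad/s:", np.max(np.abs(omega_est - omega_true)))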

  • Algorithm for forming a strategy for automatic updating of artificial intelligence models in forecasting tasks in the electric power industry

    Changes in external conditions, in the parameters of object functioning, in the relationships between system elements, and in the connections of the system with its supersystem lead to a decrease in the accuracy of artificial intelligence model results, which is called model degradation. Reducing the risk of model degradation is relevant for electric power engineering tasks, whose peculiarity is multifactor dependencies in complex technical systems and the influence of meteorological parameters. Therefore, automatic updating of models over time is a necessary condition for building user confidence in forecasting systems in power engineering tasks and in industrial implementations of such systems. Various methods are used to prevent degradation, including algorithms for detecting data drift, algorithms for updating models, their retraining, additional training, and fine-tuning. This article presents the results of a study of drift types, their systematization, and their classification by various features. The decisions that developers need to make when creating intelligent forecasting systems to determine a strategy for updating forecast models are formalized, including update trigger criteria, model selection, hyperparameter optimization, and the choice of an update method and of the data set formation (a minimal update-trigger sketch follows the keywords below). An algorithm for forming a strategy for the automatic updating of artificial intelligence models is proposed, and practical recommendations are given for developers of models in time series forecasting problems in the power industry, such as forecasting electricity consumption and forecasting the output of solar, wind, and hydroelectric power plants.

    Keywords: time series forecasting, artificial intelligence, machine learning, trusted AI system, model degradation, data drift, concept drift
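
    As an illustration of an update-trigger criterion of the kind discussed in the abstract (not the authors' algorithm), the sketch below flags model degradation when the rolling forecast error exceeds a multiple of the error observed at deployment; the window, threshold, and residual data are all assumed.

      import numpy as np

      def drift_trigger(errors, baseline_mae, window=168, ratio=1.5):
          """Return True if the mean absolute error over the last `window`
          points exceeds `ratio` times the baseline error."""
          recent = np.abs(np.asarray(errors)[-window:])
          return recent.mean() > ratio * baseline_mae

      # Hypothetical hourly load-forecast residuals: accurate at first, then drifting.
      rng = np.random.default_rng(0)
      errors = np.concatenate([rng.normal(0, 5, 500), rng.normal(8, 5, 200)])

      if drift_trigger(errors, baseline_mae=4.0):
          print("degradation detected -> retrain / fine-tune the model")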

  • Intelligent Vision-Based System for Identifying Predators in Uganda: A Deep Learning Approach for Camera Trap Image Analysis

    This study presents an effective vision-based method for accurately identifying predator species from camera trap images in protected areas of Uganda. To address the challenges of object detection in natural environments, we propose a new multiphase deep learning architecture that combines the extraction of diverse features with concentrated edge detection. Compared to previous approaches, our method achieves 90.9% classification accuracy while requiring significantly fewer manually annotated training samples. Background pixels were systematically filtered to improve model performance under various environmental conditions. This work advances both biology and computer vision, demonstrating an effective and data-oriented approach to automated wildlife monitoring that supports science-based conservation measures.

    Keywords: deep learning, camera trap, convolutional neural network, dataset, predator, Kidepo National Park, wildlife

  • Instrumental and organizational aspects of IntraService implementation in corporate IT environment

    The paper examines the case of IntraService incident management system implementation in an organization operating in the digital infrastructure segment. The study focuses on the assessment of changes that occurred in the functioning of the support service based on quantitative and qualitative indicators. The method of comparative analysis of operational parameters before and after the launch of the system is used, accompanied by expert interpretation of internal processes.

    Keywords: implementation, system, incident, support, automation, platform, organization, infrastructure, process, integration

  • Justification of the efficiency of using waste recycling and disposal technologies based on the WARM model

    The article provides an overview of modern approaches to the study of digital twins and assesses the state of their implementation in transport logistics. The authors show the features of the formation of digitalization and identify barriers and prospects for the development of digital twins in the transport and logistics sector. The analysis and systematization of methods used to define the concept of a digital twin, and of the structure and typology of digital twins in logistics, are carried out. Promising areas and links in product supply chains in which digital twins are being implemented especially actively are highlighted. The paper concludes that the implementation of digital technologies and digital twins in transport logistics can become an effective tool for its transformation in modern conditions, provided that the development and implementation of digital twins is carried out within the framework of product supply chains, based on cooperation between industrial companies and related companies and with the active support of the state.

    Keywords: digital twins, transport and logistics systems, supply chains, intralogistics, digital chain

  • Analysis of the structure and quality of solar radiation data from ERA5 reanalysis for short-term forecasting in the Far North

    The article assesses the suitability of solar radiation data from the ERA5 atmospheric reanalysis for forecasting problems in northern territories. The experimental site of the Mukhrino station (Khanty-Mansiysk Autonomous Okrug), equipped with an autonomous power supply system, was chosen as the object of analysis. A statistical analysis of the annual array of global horizontal insolation data obtained using the PVGIS platform has been carried out. Seasonal and diurnal features of changes in insolation are considered, distribution profiles are constructed, and outliers are estimated using the interquartile range method (a minimal sketch of this check follows the keywords below). It is established that the data are characterized by high variability and a large number of zero values due to polar nights and weather conditions. These features must be taken into account when building short-term forecasting models. The conclusion is made about the acceptable quality of ERA5 data for use in forecasting energy generation and consumption in heating systems.

    Keywords: ERA5, solar radiation, horizontal insolation, the Far North, statistical analysis, forecasting, outlier analysis, renewable energy sources, energy supply to remote areas, time series, intelligent generation management
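
    A minimal sketch of the interquartile-range outlier check mentioned in the abstract, applied to an assumed insolation series rather than the Mukhrino data; structural zeros from night hours are excluded before computing the quartiles.

      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(1)
      ghi = pd.Series(np.clip(rng.normal(150, 90, 24 * 30), 0, None))  # W/m2, assumed hourly series

      nonzero = ghi[ghi > 0]                      # zeros are structural, not outliers
      q1, q3 = nonzero.quantile([0.25, 0.75])
      iqr = q3 - q1
      low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr
      outliers = nonzero[(nonzero < low) | (nonzero > high)]
      print(f"flagged {len(outliers)} of {len(nonzero)} nonzero samples")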

  • Improving the accuracy of unequal-precision measurement results based on data transmission by residues

    The processing of unequal-precision measurement results represented by a binary code and by residues is considered. A technique for improving the accuracy of telemetry results is presented for the case when data are transmitted as a series of measurements by residues together with a binary code. The residues are duplicated in the half-words of a data word. The results of applying the technique are shown for single bit distortions in a series of three measurements: a measurement by residues, then a measurement in a binary code, and one more measurement by residues. When processing a series of three measurements received with a step on the scale ranging from one up to half of the comparison modulus, measurement accuracy is improved under a single bit error in the word with the binary code and in the word with the residues, compared with transmission by a binary code alone (an illustrative consistency check is sketched after the keywords below).

    Keywords: telemetry, unequal-precision measurements, data residues, error variance, measurement accuracy
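
    An illustrative sketch of the consistency checks that transmitting a residue alongside the binary code makes possible; the word layout and modulus are assumptions, and the article's exact processing of the three-measurement series is not reproduced.

      # Illustrative assumptions: an 8-bit binary word, an odd comparison modulus
      # M = 13, and the residue duplicated in both 4-bit half-words of the residue
      # word. A single bit flip in the binary word changes the value by a power of
      # two, which is never a multiple of 13, so the residue check fails; a single
      # bit flip in the residue word makes the two copies disagree.
      M = 13

      def encode(value):
          residue = value % M
          residue_word = (residue << 4) | residue   # duplicate in both half-words
          return value, residue_word

      def check(binary_word, residue_word):
          hi, lo = residue_word >> 4, residue_word & 0x0F
          if hi != lo:
              return "error in the residue word"
          if binary_word % M != lo:
              return "error in the binary word"
          return "consistent"

      binary_word, residue_word = encode(137)
      print(check(binary_word, residue_word))              # consistent
      print(check(binary_word ^ (1 << 3), residue_word))   # bit flip in the binary word
      print(check(binary_word, residue_word ^ (1 << 5)))   # bit flip in one residue copy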

  • Bidirectional Long Short-Term Memory Networks for Automated Source Code Generation

    This paper examines the application of Bidirectional Long Short-Term Memory (Bi-LSTM) networks in neural source code generation. The research analyses how Bi-LSTMs process sequential data bidirectionally, capturing contextual information from both past and future tokens to generate syntactically correct and semantically coherent code. A comprehensive analysis of model architectures is presented, including embedding mechanisms, network configurations, and output layers. The study details data preparation processes, focusing on tokenization techniques that balance vocabulary size with domain-specific terminology handling. Training methodologies, optimization algorithms, and evaluation metrics are discussed with comparative results across multiple programming languages. Despite promising outcomes, challenges remain in functional correctness and complex code structure generation. Future research directions include attention mechanisms, innovative architectures, and advanced training procedures.

    Keywords: code generation, deep learning, recurrent neural networks, transformers, tokenisation
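
    A minimal sketch of the architecture family described in the paper: an embedding layer, a bidirectional LSTM encoder that sees both past and future tokens, and a per-token output layer over the vocabulary. The dimensions and vocabulary size are assumed, and PyTorch is used only as one possible implementation.

      import torch
      import torch.nn as nn

      class BiLSTMCodeModel(nn.Module):
          def __init__(self, vocab_size, embed_dim=128, hidden_dim=256):
              super().__init__()
              self.embed = nn.Embedding(vocab_size, embed_dim)
              self.lstm = nn.LSTM(embed_dim, hidden_dim, num_layers=2,
                                  batch_first=True, bidirectional=True)
              self.out = nn.Linear(2 * hidden_dim, vocab_size)  # both directions

          def forward(self, token_ids):
              x = self.embed(token_ids)          # (batch, seq, embed_dim)
              h, _ = self.lstm(x)                # (batch, seq, 2 * hidden_dim)
              return self.out(h)                 # per-position vocabulary logits

      vocab_size = 5000                          # assumed tokenizer vocabulary
      model = BiLSTMCodeModel(vocab_size)
      tokens = torch.randint(0, vocab_size, (8, 64))   # a dummy batch of token ids
      logits = model(tokens)
      print(logits.shape)                        # torch.Size([8, 64, 5000])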

  • Data Clustering Using Asymmetric Similarity Measures

    The article focuses on developing data clustering algorithms using asymmetric similarity measures, which are relevant in tasks involving directed interactions. Two algorithms are proposed: stepwise cluster formation and a modified version with iterative center refinement. Experiments were conducted, including a comparison with the k-medoids method. The results showed that the fixed-center algorithm is efficient for small datasets, while the center-recalculation algorithm provides more accurate clustering. The choice of algorithm depends on the requirements for speed and quality.

    Keywords: clustering, asymmetric similarity measures, clustering algorithms, iterative refinement, k-medoids, directed interactions, adaptive methods
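
    An illustrative sketch, under assumptions, of k-medoid-style clustering driven by an asymmetric similarity matrix with iterative center refinement, in the spirit of the second algorithm described above; it is not the authors' exact procedure, and the similarity matrix is random.

      import numpy as np

      def cluster_asymmetric(S, k, iters=20, seed=0):
          # S[i, j] is how strongly object i points to object j; S need not be symmetric.
          n = S.shape[0]
          rng = np.random.default_rng(seed)
          centers = rng.choice(n, size=k, replace=False)
          for _ in range(iters):
              # assign each object to the center it is most similar *to*
              labels = np.argmax(S[:, centers], axis=1)
              # refine each center: pick the member the cluster points to most
              new_centers = centers.copy()
              for c in range(k):
                  members = np.where(labels == c)[0]
                  if len(members):
                      scores = S[np.ix_(members, members)].sum(axis=0)
                      new_centers[c] = members[np.argmax(scores)]
              if np.array_equal(new_centers, centers):
                  break
              centers = new_centers
          return labels, centers

      # A small directed-interaction example (rows: source, columns: target).
      rng = np.random.default_rng(1)
      S = rng.random((12, 12))
      labels, centers = cluster_asymmetric(S, k=3)
      print(labels, centers)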

  • Models for Constructing Optimal Container Freight Plans for Complex Logistics Systems

    The article addresses the challenges and proposes mathematical models for optimizing container freight transportation within complex logistics systems, emphasizing the growing importance of digital technologies and artificial intelligence in logistics by 2025. It highlights key industry issues such as decentralized global supply chains, environmental risks, infrastructure deficiencies, safety concerns, and notably, the costly problem of transporting empty containers, which accounts for a significant portion of operational expenses worldwide and in Russia. The core contribution is a modified three-dimensional transport optimization model that incorporates container types, cargo volumes, and transportation costs, including the cost variations due to partially filled or empty containers. The model extends classical transportation problem formulations by introducing a potentials method that accounts for the contributions of suppliers, recipients, and container costs to determine an optimal transport plan minimizing total costs. Constraints ensure that supply and demand conditions, container capacities, and route feasibility are respected. The model uniquely integrates the degree of container filling into cost calculations using a coefficient to adjust transportation costs accordingly. This approach enables more accurate and cost-effective freight planning. Additionally, the article discusses the development of a simulation model and a client-server application to automate the search for optimal transport plans, facilitating practical implementation. The proposed framework can be expanded to include various container types, cargo characteristics, and transport modes, offering a comprehensive tool for improving logistics efficiency in container freight transportation.

    Keywords: diversification of management, production diversification, financial and economic purposes of a diversification, technological purposes of ensuring flexibility of production
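
    A minimal sketch of the underlying transportation-problem structure with a cost adjustment for the degree of container filling, solved here with a general linear programming routine rather than the article's potentials method; all figures and the fill coefficient are invented.

      import numpy as np
      from scipy.optimize import linprog

      supply = np.array([30, 40, 20])          # containers available at suppliers
      demand = np.array([20, 35, 25, 10])      # containers required by recipients
      base_cost = np.array([[8, 6, 10, 9],
                            [9, 12, 13, 7],
                            [14, 9, 16, 5]], dtype=float)
      fill = 0.8                               # assumed average degree of filling
      cost = base_cost / fill                  # cost per unit of useful cargo

      m, n = base_cost.shape
      c = cost.ravel()
      A_eq, b_eq = [], []
      for i in range(m):                       # each supplier ships all its containers
          row = np.zeros(m * n); row[i * n:(i + 1) * n] = 1
          A_eq.append(row); b_eq.append(supply[i])
      for j in range(n):                       # each recipient gets what it requested
          col = np.zeros(m * n); col[j::n] = 1
          A_eq.append(col); b_eq.append(demand[j])

      res = linprog(c, A_eq=np.array(A_eq), b_eq=np.array(b_eq), bounds=(0, None))
      print(res.x.reshape(m, n), res.fun)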

  • Results of the experimental testing of the secure information and monitoring network of test benches

    The article presents the process of verifying the functioning of a secure data transmission network based on broadband wireless access equipment with a seven-element antenna array (ABSD 7) and the same equipment with a single antenna device (ABSD 1). The conditions of the experiment and the composition and completeness of the equipment are described. The results of the checks in various modes of operation are presented. It is concluded that it is possible to use standard on-board communication equipment as a repeater when the appropriate program mode is installed.

    Keywords: data transmission, secure network, data transmission channel, repeater, base station, on-board equipment

  • Prospects of using WAN optimizers in designing a corporate computer network

    The article examines the problem of global network (WAN) optimization, as well as currently existing software and hardware solutions. The purpose of the study is to determine the technological basis for developing a prototype of a domestic WAN optimizer. A study of the subject area showed that there are no freely available domestic solutions in this field. The resulting solution can be adapted to the specific requirements of the customer company by adding the necessary modifications to the prototype.

    Keywords: global network, data deduplication, WAN optimizer, bandwidth

  • Designing a component for classifying objects and interpreting their actions using computer vision and machine learning methods

    The article presents aspects of designing an artificial intelligence module for analyzing video streams from surveillance cameras in order to classify objects and interpret their actions as part of the task of collecting statistical information and recording abnormal activity of surveillance objects. A sequence diagram of the user's process with active monitoring via a Telegram bot is presented, as well as a conceptual diagram of the interaction of the information and analytical system of a pedigree dog kennel on the 1C:Enterprise platform with external services.

    Keywords: computer vision, machine learning, neural networks, artificial intelligence, action recognition, object classification, YOLO, LSTM model, behavioral patterns, keyword search, 1C:Enterprise, Telegram bot

  • Advanced convolutional neural network frameworks for robust multi-angle facial authentication: implementation and comparative evaluation

    This article presents the technical implementation of a convolutional neural network-based face recognition system that is able to work under variable scenarios such as occlusion, angle changes, and camera rotation. Various face identification algorithms were analysed with the purpose of developing a model that can identify faces at different angles. The system was experimentally verified on various datasets and compared in terms of accuracy, processing speed, and robustness to environmental disturbances. Results indicate that our optimized convolutional neural network architecture achieves over 90% accuracy under pristine conditions and maintains decent performance under partial occlusion.

    Keywords: face detection, convolutional neural networks, model, feature extraction, deep learning, face recognition, image

  • Analysis of Machine Learning Algorithm for Processing Text Documents

    The use of machine learning when working with text documents significantly increases the efficiency of work and expands the range of tasks to be solved. The paper provides an analysis of the main methods of presenting data in a digital format and machine learning algorithms, and a conclusion is made about the optimal solution for generative and discriminative tasks.

    Keywords: machine learning, natural language processing, transformer architecture models, gradient boosting, large language models

  • Unveiling Hidden Patterns in Classifying Wildlife Images Using Convolutional Neural Networks for Species Identification in Conservation Initiatives

    This study demonstrates the potential of convolutional neural networks with softmax activation to classify mantis, honey badger, and weasel samples. The model achieved high predictive accuracy with low misclassification, and data augmentation helped reduce the influence of environmental variance. The research shows how deep learning networks can be used to automate taxonomic classification, which in turn supports species identification from images and large-scale conservation monitoring.

    Keywords: deep learning, machine learning, convolutional neural networks, dataset, softmax function, image classification, wildlife, data augmentation

  • Demand forecasting and inventory management using machine learning

    This article is devoted to the study of the possibilities of machine learning technology for forecasting the demand for goods. The study analyzes various models and the possibilities of their application as part of the task of predicting future sales. The greatest attention is focused on modern methods of time series analysis, in particular neural network and statistical approaches. The results obtained during the study clearly demonstrate the advantages and disadvantages of different models, the degree of influence of their parameters on the accuracy of the forecast within the framework of the demand forecasting task. The practical significance of the findings is determined by the possibility of using the results obtained in the analysis of a similar data set. The relevance of the study is due to the need for accurate forecasting of demand for goods to optimize inventory and reduce costs. The use of modern machine learning methods makes it possible to increase the accuracy of predictions, which is especially important in an unstable market and changing consumer demand.

    Keywords: machine learning algorithms, demand estimation, forecasting accuracy, time sequence analysis, sales volume prediction, Python, autoregressive integrated moving average, random forest, gradient boosting, neural networks, long short-term memory
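
    A minimal sketch, on synthetic data rather than the study's dataset, of one of the model families compared in the article: a weekly sales series is turned into lag features and fitted with a random forest, with the last weeks held out to measure forecast error.

      import numpy as np
      import pandas as pd
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.metrics import mean_absolute_error

      rng = np.random.default_rng(0)
      t = np.arange(200)
      sales = 50 + 10 * np.sin(2 * np.pi * t / 52) + rng.normal(0, 3, t.size)

      df = pd.DataFrame({"sales": sales})
      for lag in (1, 2, 52):                          # recent history + seasonal lag
          df[f"lag_{lag}"] = df["sales"].shift(lag)
      df = df.dropna()

      train, test = df.iloc[:-20], df.iloc[-20:]      # hold out the last 20 weeks
      X_cols = [c for c in df.columns if c.startswith("lag_")]
      model = RandomForestRegressor(n_estimators=200, random_state=0)
      model.fit(train[X_cols], train["sales"])
      pred = model.predict(test[X_cols])
      print("MAE:", mean_absolute_error(test["sales"], pred))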

  • Using machine learning methods to convert a scan into elements of a digital information model

    The article discusses machine learning methods, their application areas, limitations, and application possibilities. Achievements in deep learning that make it possible to obtain accurate results with optimal time and effort are additionally highlighted. The promising transformer neural network architecture is also described in detail. As an alternative approach, it is proposed to use a generative adversarial network in the process of converting a scan into elements of a digital information model.

    Keywords: scanning, point cloud, information model, construction, objects, representation, neural network, machine learning

  • Frontend Development Efficiency Based on Builder Analysis

    Modern web applications are becoming more complex and feature-rich, which creates the need for effective tools for dependency management, optimization, and project assembly. Builders allow you to optimize your code, which directly affects the loading and execution speed of applications. The purpose of the work is to conduct a comparative analysis of JavaScript builders: Webpack, Parcel, and Rollup, in order to identify their advantages and disadvantages from the point of view of frontend development ergonomics. This includes evaluating the convenience of configuration, resource efficiency, build speed, and other factors that affect developer productivity and the final quality of web applications. Practical testing of the builders was carried out using the example of a standard web project. The ergonomics of working with the tools is evaluated: criteria are identified and a comparison is made based on the data obtained. Recommendations have been developed for choosing the optimal tool for various types of projects in frontend development. The research results can be used as a basis for training new specialists, as well as for improving existing practices in developing web applications by making informed decisions on the choice of technologies for long-term projects.

    Keywords: web development, development efficiency, ergonomics, frontend development, testing, builder

  • Content-based approach in recommender systems: principles, methods and performance metrics

    This paper explores the content-based filtering approach in modern recommender systems, focusing on its key principles, implementation methods, and evaluation metrics. The study highlights the advantages of content-based systems in scenarios that require deep object analysis and user preference modeling, especially when there is a lack of data for collaborative filtering.

    Keywords: content-oriented filtering, recommendation systems, feature extraction, similarity metrics, personalization
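
    A minimal sketch of the content-based principle described above: items are turned into TF-IDF feature profiles, a user profile is averaged from liked items, and unseen items are ranked by cosine similarity. The catalogue and user history are invented for illustration.

      import numpy as np
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.metrics.pairwise import cosine_similarity

      items = {
          "A": "space opera adventure starship crew",
          "B": "romantic comedy in paris",
          "C": "hard science fiction mars colony survival",
          "D": "courtroom drama based on true events",
      }
      liked = ["A", "C"]                                  # the user's history

      vec = TfidfVectorizer()
      matrix = vec.fit_transform(items.values())          # item feature profiles
      ids = list(items)
      user_profile = np.asarray(matrix[[ids.index(i) for i in liked]].mean(axis=0))

      scores = cosine_similarity(user_profile, matrix).ravel()
      recommendations = sorted(
          (i for i in ids if i not in liked),
          key=lambda i: scores[ids.index(i)],
          reverse=True,
      )
      print(recommendations)   # unseen items ordered by similarity to the user profile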

  • Comprehensive Analysis of Russian-Language Texts Based on Transformer-Type Neural Network Models

    This article presents a comprehensive analysis of Russian-language texts utilizing neural network models based on the Bidirectional Encoder Representations from Transformers (BERT) architecture. The study employs specialized models for the Russian language: RuBERT-tiny, RuBERT-tiny2, and RuBERT-base-cased. The proposed methodology encompasses morphological, syntactic, and semantic levels of analysis, integrating lemmatization, part-of-speech tagging, morphological feature identification, syntactic dependency parsing, semantic role labeling, and relation extraction. The application of BERT-family models achieves accuracy rates exceeding 98% for lemmatization, 97% for part-of-speech tagging and morphological feature identification, 96% for syntactic parsing, and 94% for semantic analysis. The method is suitable for tasks requiring deep text comprehension and can be optimized for processing large corpora.

    Keywords: BERT, Russian-language texts, morphological analysis, syntactic analysis, semantic analysis, lemmatization, RuBERT, natural language processing, NLP
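
    A minimal sketch of the first step such an analysis relies on: obtaining contextual token embeddings for a Russian sentence from a RuBERT-tiny2 encoder via the Hugging Face transformers library. The checkpoint name is an assumption, and the article's tagging and parsing heads are not reproduced; downstream taggers are trained on representations of this kind.

      import torch
      from transformers import AutoTokenizer, AutoModel

      name = "cointegrated/rubert-tiny2"        # assumed public checkpoint
      tokenizer = AutoTokenizer.from_pretrained(name)
      model = AutoModel.from_pretrained(name)

      text = "Мама мыла раму."
      inputs = tokenizer(text, return_tensors="pt")
      with torch.no_grad():
          hidden = model(**inputs).last_hidden_state   # (1, n_tokens, hidden_size)

      for tok, vec in zip(tokenizer.convert_ids_to_tokens(inputs["input_ids"][0]), hidden[0]):
          print(tok, vec.shape)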