Computational Nanotechnology
A quarterly peer-reviewed journal.
About
The journal “Computational Nanotechnology” publishes peer-reviewed scientific research on the mathematical modeling of processes involved in creating nanostructured materials and devices. The development of nanoelectronics devices and nanoprocesses requires quantum computing, which makes it possible to predict the structure of matter. Work on nanoprocesses, in turn, requires the development of quantum computers with a fundamentally new architecture.
The journal publishes peer-reviewed scientific articles on the following scientific specialties:
- Computer Science
- Artificial intelligence and machine learning
- Mathematical modeling, numerical methods and complex programs
- Theoretical informatics, cybernetics
- Cybersecurity
- Information Technology and Telecommunication
- System analysis, management and information processing
- Elements of Computing Systems
- Automation of manufacturing and technological processes
- Management in organizational systems
- Mathematical support and software of computers, complexes and computer networks
- Information security
- Computer modeling and design automation systems
- Informatics and Information Processing
- Nanotechnology and nanomaterials
Indexing
- Russian Science Citation Index (RSCI)
- East View Information Services
- Ulrichsweb Global Periodicals Directory
- Google Scholar
- Dimensions
- CrossRef
- MathNet
VAK of Russia
In accordance with the decision of the Presidium of the Higher Attestation Commission of the Ministry of Education and Science of Russia dated 29.05.2017, the journal “Computational Nanotechnology” is included in the List of leading peer-reviewed scientific journals and publications in which the main scientific results of dissertations for the degrees of candidate and doctor of sciences should be published.
Subject heading list
- Atomistic Simulations - Algorithms and Methods
- Quantum and Molecular Computing, and Quantum Simulations
- Bioinformatics, nanomedicine, and the creation of new drugs and their targeted delivery to the required areas of neurons
- Development of quantum computer architectures based on new principles and the creation of new quantum programming approaches
- Development of new energy units based on renewable energy sources
- Problems of synthesizing nanostructured materials to create new ultra-compact circuits for supercomputers
- Peculiarities of the development of devices based on nanostructured materials
- Development of functional nanomaterials based on nanoparticles and polymer nanostructures
- Multiscale modeling for information control and processing
- Information systems of development of functional nanomaterials
Current Issue



Vol 11, No 4 (2024)
ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING
Application of numerical methods for optimizing visual elements in e-commerce
Abstract
The article discusses the use of numerical methods to optimize the design elements of product cards. The discount block, one of the key elements significantly influencing sales, is selected as the object of study. The aim of the research is to improve the click-through rate (CTR) of product cards by analyzing and optimizing visual parameters such as color, font size, block placement, discount format, and device type. To achieve this goal, a regression model was developed to predict CTR for new parameter combinations without the need for full-cycle testing and to evaluate the significance of the analyzed parameters. The results show that the most impactful factors on CTR are background color, font size, and the placement of the discount block. The proposed approach reduces the number of required tests, accelerates the optimization process, and can be adapted to other design elements, such as call-to-action buttons or stock availability indicators.
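
A minimal sketch of the approach the abstract describes, not the authors' code: fit a regression model on design-parameter combinations of the discount block and predict CTR for an untested combination. All column names and data values here are hypothetical.

```python
# Sketch: predict CTR for unseen combinations of discount-block design
# parameters with a regression model (hypothetical columns and data).
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

df = pd.DataFrame({
    "bg_color":     ["red", "green", "red", "blue"],
    "font_size":    [12, 14, 16, 14],
    "placement":    ["top", "bottom", "top", "corner"],
    "discount_fmt": ["percent", "amount", "percent", "amount"],
    "device":       ["mobile", "desktop", "mobile", "desktop"],
    "ctr":          [0.031, 0.027, 0.042, 0.025],
})

categorical = ["bg_color", "placement", "discount_fmt", "device"]
model = Pipeline([
    ("encode", ColumnTransformer(
        [("onehot", OneHotEncoder(handle_unknown="ignore"), categorical)],
        remainder="passthrough")),   # numeric font_size passes through as-is
    ("regress", LinearRegression()),
])
model.fit(df.drop(columns="ctr"), df["ctr"])

# Predict CTR for a new, untested parameter combination.
new = pd.DataFrame([{"bg_color": "red", "font_size": 16, "placement": "top",
                     "discount_fmt": "percent", "device": "mobile"}])
print(model.predict(new))
```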



Analysis and evaluation of algorithms for personalization of interaction with the user for the development of a social network
Abstract
This article analyzes personalization algorithms for social networks, with key objectives being the enhancement of user interaction and the improvement of recommendation relevance. The goal of this work is to evaluate various personalization approaches, such as recommendation systems and machine learning algorithms, as well as to assess the accuracy of these algorithms. Personalization approaches based on recommendation systems and machine learning methods are discussed, along with the application of artificial intelligence to improve recommendation accuracy. Three primary recommendation system algorithms are presented: collaborative filtering, content-based filtering, and hybrid models. Collaborative filtering was selected as the main personalization method, using the Python library Surprise, which includes algorithms such as Singular Value Decomposition, Slope One, and K-Nearest Neighbors. A comparative analysis of Root Mean Squared Error and Mean Absolute Error metrics revealed that the K-Nearest Neighbors algorithm showed the best results, making it the preferred choice for further implementation. The final model, trained on the full dataset, demonstrated strong accuracy and potential for practical use in real products. The results presented could be valuable for social network developers in choosing optimal algorithms to enhance user experience, as well as for future research in personalization and recommendation systems.
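
A short sketch of the comparison described above, assuming the Surprise library's standard API; the built-in MovieLens sample stands in for the social-network dataset, which is not available here.

```python
# Compare SVD, Slope One and KNN by cross-validated RMSE/MAE with Surprise.
from surprise import Dataset, KNNBasic, SlopeOne, SVD
from surprise.model_selection import cross_validate

data = Dataset.load_builtin("ml-100k")  # placeholder for the real dataset

for name, algo in [("SVD", SVD()), ("SlopeOne", SlopeOne()),
                   ("KNN", KNNBasic())]:
    res = cross_validate(algo, data, measures=["RMSE", "MAE"],
                         cv=5, verbose=False)
    print(f"{name}: RMSE={res['test_rmse'].mean():.4f} "
          f"MAE={res['test_mae'].mean():.4f}")
```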



MATHEMATICAL MODELING, NUMERICAL METHODS AND COMPLEX PROGRAMS
Using graphs to identify asset security compromises
Abstract
Due to the ever-expanding threat landscape, the problem of timely identification of information security risks, their assessment and, as a result, the management of these risks remains urgent. The main components of any quantitative risk assessment are the frequency (or probability) of a risk event occurring and the magnitude of losses from the threat's realization. The purpose of the work is to increase the accuracy of quantitative information security risk assessment, to develop a theoretical model that takes into account all the relationships between assets in the company's information environment, and to compile an effective set of risk management measures. To formalize the company's information security risk assessment model, a set of security breach conditions for the company's information environment was identified, consisting of elements characterizing the possible results of threat realization for each asset. The resulting model demonstrates the interrelation of assets and the versatility of threat scenarios.
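
An illustrative sketch, not the paper's model: once assets and their dependencies are represented as a directed graph, "which assets can a compromise reach" becomes a simple reachability query. Asset names here are invented.

```python
# Model asset dependencies as a directed graph and enumerate every asset
# reachable from a compromised entry point.
import networkx as nx

g = nx.DiGraph()
g.add_edges_from([
    ("workstation", "file_server"),
    ("file_server", "database"),
    ("workstation", "mail_server"),
    ("database", "backup_storage"),
])

compromised = "workstation"
at_risk = nx.descendants(g, compromised)  # all nodes reachable from the entry
print(f"Assets reachable from {compromised}: {sorted(at_risk)}")
```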



SYSTEM ANALYSIS, INFORMATION MANAGEMENT AND PROCESSING, STATISTICS
Identifying focus points with an intelligent custom-built eye tracker
Abstract
When perceiving complex, saturated images, the task is not only to track where the user's gaze is directed, but also to understand how they process the information on the screen, how they switch between objects, and where their attention lingers longer. Perceptual effectiveness is largely determined by the interface and the information model built by the developer, and the developer's work, in turn, can be evaluated through the identified focal points of the human operator's attention. An eye-tracking toolkit with extended functionality has been developed in Python 3.10 using the mediapipe, OpenCV and matplotlib libraries, aimed at improving the accuracy of gaze-behavior interpretation and the methods of presenting the collected data. The developed toolkit makes it possible to determine users' areas of interest and identify focal points of attention, which can serve as a basis for constructing attention maps and, subsequently, for creating an effective user interface.
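
A rough sketch of one plausible building block of such a toolkit, assuming mediapipe's FaceMesh model with iris refinement (landmarks 468-477 appear when refine_landmarks=True); mapping iris position to an on-screen gaze point is simplified here to the raw landmark coordinates.

```python
# Collect approximate gaze points from a webcam with mediapipe FaceMesh.
import cv2
import mediapipe as mp

mp_face = mp.solutions.face_mesh
cap = cv2.VideoCapture(0)
gaze_points = []

with mp_face.FaceMesh(refine_landmarks=True, max_num_faces=1) as mesh:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        res = mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if res.multi_face_landmarks:
            lm = res.multi_face_landmarks[0].landmark
            # Left-iris centre (landmark 468) as a crude gaze proxy.
            gaze_points.append((lm[468].x, lm[468].y))
        cv2.imshow("camera", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to stop
            break

cap.release()
cv2.destroyAllWindows()
# gaze_points can then be binned into a 2D histogram to build an attention map.
```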



Solar water heating system for a country house
Abstract
The paper considers a heating system for a country house, including methods for automating and forecasting heat transfer. The study is based on integrating solar panels with hardware to form an automated control system (ACS) that adapts to climatic conditions, time of day, and the position of the solar panels. The system takes into account temperature changes, weather factors, and the position of the sun, which makes it possible to minimize heat loss and increase energy efficiency. Using this system reduces heating costs and ensures environmental friendliness through the use of renewable energy sources. The automated control and dispatching system for the proposed solar water heating model is designed to monitor equipment condition at individual heating points and makes it possible to: provide automatic control services with up-to-date and accurate information on equipment operation; carry out operational control over the condition of the solar systems and process equipment; track instrumental and process parameters of the system's heat transfer that go beyond permissible limits; and implement modules for changing the system's operating parameters, ensuring integration into a single system of access to process data and the current state of the equipment. Controlling the position of the solar panels and using temperature, pressure, and thermal-energy sensors makes it possible to maintain an optimal microclimate inside the building. The operation of pumps and storage tanks is regulated by the ACS, preventing overloads and minimizing energy consumption. These automation capabilities make the system sustainable and energy efficient, especially under low temperatures and high solar activity.
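
A toy sketch of a single control rule of the kind such an ACS would implement: circulate coolant only while the collector is usefully hotter than the storage tank. The hysteresis thresholds are hypothetical, not the paper's values.

```python
# One control cycle of a differential-thermostat pump rule with hysteresis.
HYSTERESIS_ON = 8.0   # °C collector-tank difference needed to start the pump
HYSTERESIS_OFF = 2.0  # °C difference at which the pump stops

def control_step(t_collector: float, t_tank: float, pump_on: bool) -> bool:
    """Return the new pump state given current sensor readings."""
    delta = t_collector - t_tank
    if not pump_on and delta >= HYSTERESIS_ON:
        return True
    if pump_on and delta <= HYSTERESIS_OFF:
        return False
    return pump_on  # inside the hysteresis band: keep the current state

assert control_step(55.0, 40.0, pump_on=False) is True   # big gap: start
assert control_step(41.0, 40.0, pump_on=True) is False   # gap closed: stop
```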



ELEMENTS OF COMPUTING SYSTEMS
Optimization of quantum computations: the impact of the Doppler effect on qubit coherence
Abstract
The Doppler effect, arising from the relative motion between the source and the observer, plays a significant role in quantum computations, particularly in the context of decoherence and the state of qubits. In quantum systems where information is encoded in the states of qubits, changes in the frequency of photons caused by the Doppler effect can lead to coherence violations and a decrease in computational accuracy. These changes can cause state mixing and complicate the management of quantum interactions, increasing the probability of errors. Understanding and accounting for the Doppler effect is critically important for designing robust quantum systems, as it can manifest in various types of qubits, including photonic, atomic, and ion qubits. To minimize the impact of the Doppler effect, it is necessary to develop error correction methods and utilize technologies such as polarizing filters or feedback systems. Thus, studying the Doppler effect deepens our understanding of the mechanisms of decoherence and contributes to the creation of more stable and efficient quantum computing systems.
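
For orientation, the first-order Doppler shift seen by a qubit moving at speed v along a laser beam of frequency f_0, and the phase error it contributes over a gate of duration t:

```latex
% First-order Doppler shift and the accumulated phase error over time t:
\Delta f = f_0\,\frac{v}{c},
\qquad
\Delta\varphi = 2\pi\,\Delta f\, t = \frac{2\pi f_0\, v\, t}{c}
```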



MATHEMATICAL SUPPORT AND SOFTWARE OF COMPUTERS, COMPLEXES AND COMPUTER NETWORKS
Applying GPU parallel programming for image processing and clustering
Abstract
This paper presents state-of-the-art image processing and structural analysis software tools that use GPU parallel programming to achieve substantial performance gains. The software suite combines advanced preprocessing techniques, object identification methods, clustering algorithms, and analysis tools to facilitate efficient and precise analysis of complex imaging datasets. The case studies illustrate the software’s versatility and effectiveness across diverse scientific domains, including materials science, biological research, and astronomy. By exploiting GPU parallel programming, the tools deliver performance improvements of 5–20x compared to traditional sequential programming, enabling real-time visualization and expedited data processing. The intuitive user interface empowers researchers to fine-tune parameters, visualize results, and interpret data with ease, streamlining the research workflow. The broader impacts of these tools include accelerating scientific discovery, enhancing data analysis accuracy, and driving innovation across diverse scientific fields. A notable example of their effectiveness is the processing and analysis of electron microscopy images of amorphous alloys. The developed algorithms and software tools demonstrate promising results in this area, facilitating detailed studies of atomic structure and degree of orderliness.
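
A minimal sketch of the CPU-versus-GPU pattern behind such speedups (the paper's own code is not reproduced here), using CuPy's drop-in NumPy/SciPy-compatible interfaces:

```python
# Same Gaussian smoothing on CPU (SciPy) and GPU (CuPy), compared for parity.
import numpy as np
import cupy as cp
from scipy import ndimage as cpu_ndi
from cupyx.scipy import ndimage as gpu_ndi

img = np.random.rand(4096, 4096).astype(np.float32)

# CPU baseline: sequential filtering.
smoothed_cpu = cpu_ndi.gaussian_filter(img, sigma=3)

# GPU version: the same operation across thousands of CUDA threads.
img_gpu = cp.asarray(img)                      # host -> device copy
smoothed_gpu = gpu_ndi.gaussian_filter(img_gpu, sigma=3)
cp.cuda.Stream.null.synchronize()              # wait for the kernel to finish
result = cp.asnumpy(smoothed_gpu)              # device -> host copy

print(np.allclose(smoothed_cpu, result, atol=1e-4))
```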



Algorithm for head tremor detection from smartphone video camera data in a biomedical monitoring system
Abstract
Modern conditions demand active digitalization across various spheres of activity and daily life, enabling faster task completion and simplifying processes. Self-diagnosis allows individuals to identify symptoms that can serve as a basis for consulting medical professionals, which is especially crucial in critical situations where lives are at stake. The development of such systems is therefore a relevant challenge. In this context, head tremor plays a significant role, as it may indicate the presence of Parkinson's disease or multiple sclerosis. The aim of this work is to develop a head tremor detection module suitable for integration into smartphone applications. The study employs a method based on analyzing data from an optical sensor, namely the front camera of the smartphone. The method uses an open machine learning model, ML Kit, for facial recognition, along with a specially designed algorithm for processing the results. Testing demonstrated an accuracy of 0.92. This approach offers a novel method for detecting head tremor and highlights the effectiveness of ML Kit's standard model for similar tasks on smartphones; it can also be applied within a larger biomedical diagnostic system.
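
An illustrative sketch of the signal-processing step only, since ML Kit itself is an Android library: given a time series of head yaw angles extracted from the front camera, flag tremor when spectral power concentrates in a band typical of pathological tremor. The band and threshold here are hypothetical, not the paper's values.

```python
# Flag head tremor when the 3-7 Hz band dominates the yaw-angle spectrum.
import numpy as np

FPS = 30.0  # camera frame rate

def detect_tremor(yaw_deg: np.ndarray,
                  band=(3.0, 7.0), power_ratio=0.4) -> bool:
    yaw = yaw_deg - yaw_deg.mean()                    # remove DC offset
    spec = np.abs(np.fft.rfft(yaw)) ** 2              # power spectrum
    freqs = np.fft.rfftfreq(len(yaw), d=1.0 / FPS)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return spec[in_band].sum() / spec[1:].sum() > power_ratio

# Synthetic check: a 5 Hz oscillation is flagged, a slow drift is not.
t = np.arange(0, 4, 1 / FPS)
assert detect_tremor(2.0 * np.sin(2 * np.pi * 5.0 * t))
assert not detect_tremor(0.5 * t)
```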



Monitoring fault tolerance in distributed systems
Abstract
The goal of this study is to develop and verify a monitoring model for reliability and availability in distributed systems, built on probabilistic component characteristics and accounting for dependent failures. Modern distributed systems require accurate failure prediction methods that can account for complex dependencies between nodes and support reliable performance under high loads. Traditional approaches based on empirical data analysis often fall short in predicting system states under changing loads, which limits their applicability. In this research, the developed probabilistic model underwent verification using numerical simulation and accuracy assessment through Kullback–Leibler divergence and mean squared error (MSE), confirming its accuracy and practical value. The model’s versatility was proven experimentally, demonstrating its ability to adapt to various types of distributed systems while providing precise real-time predictions of availability and resilience. Numerical experiments showed that the proposed model can be a reliable tool for managing fault tolerance and load balancing. Thus, the developed model is an effective solution for enhancing the reliability of distributed systems, exhibiting a high degree of versatility and making it valuable for a wide range of applications.
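
A small sketch of the verification step the abstract mentions: comparing a model-predicted availability distribution with one observed in simulation via Kullback-Leibler divergence and MSE. The distributions are made-up placeholders.

```python
# Compare predicted vs. observed distributions of unavailable-node counts.
import numpy as np
from scipy.stats import entropy

predicted = np.array([0.70, 0.20, 0.07, 0.03])  # model: P(k nodes down)
observed  = np.array([0.68, 0.22, 0.08, 0.02])  # numerical simulation

kl = entropy(observed, predicted)               # D_KL(observed || predicted)
mse = np.mean((observed - predicted) ** 2)
print(f"KL divergence: {kl:.5f}, MSE: {mse:.6f}")
```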



Model and methods of analysis of software systems that provide recommendations for reducing the time of research
Abstract
A generalized model is proposed, consisting of the processes, programs, and computational systems used in computational and experimental studies of flutter on a dynamically similar model and on the actual structure of an aircraft. Analysis based on this model made it possible to identify the most time-consuming processes. In the computational studies of flutter, executing the program that calculates aerodynamic forces was identified as the most time-consuming component of the complete package for calculating the critical flutter speed. In the experimental studies, the most time-consuming and most expensive process was conducting frequency tests of the full-scale aircraft structure using a measuring and computing system that implements the traditional method of step-by-step excitation of oscillations by harmonic forces with selection of their amplitudes. When testing dynamically similar models in wind tunnels, in turn, the most time-consuming process is the secondary processing of data recorded over communication wires subject to interference. Significant time expenditures are also noted in the exchange of computational and experimental data. Recommendations on ways to reduce these time costs are given, along with implementation examples and estimates of their effectiveness.



METHODS AND SYSTEMS OF INFORMATION PROTECTION, INFORMATION SECURITY
Improving network security through a deep learning RNN approach
Abstract
Subject of the Study. This article explores the use of Recurrent Neural Networks (RNN), specifically Long Short-Term Memory (LSTM) networks, to improve the effectiveness of Intrusion Detection Systems (IDS). The study focuses on the role of optimizers in enhancing the accuracy of network attack detection and provides a comparative analysis of various optimization algorithms using the NSL-KDD dataset. Method. The article proposes an RNN-LSTM-based approach for detecting intrusions in network traffic. Seven different optimization algorithms were evaluated, including Adamax, SGD, Adagrad, Adam, RMSprop, Nadam, and Adadelta. The method involves a comparative analysis of their performance with varying hidden layer sizes. Main Results. The experiment methodology included training the RNN-LSTM model with hidden layer sizes ranging from 50 to 100 over 500 epochs. The Adamax optimizer achieved the highest accuracy of 99.79%, while Adadelta had the lowest accuracy at 97.29%. Additionally, SGD demonstrated the best True Positive Rate (TPR), while Adamax showed the lowest False Alarm Rate (FAR). The study evaluated metrics such as accuracy, TPR, FAR, precision, and F1-score, with Adamax standing out for its overall performance. Practical Significance. The article is relevant to professionals in the fields of cybersecurity, network security, and IDS development. It provides valuable insights for enhancing IDS configurations to improve the detection and mitigation of network intrusions.
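
A minimal sketch of the experimental setup described above, assuming the standard Keras API; the NSL-KDD preprocessing is omitted and random stand-in data is used, so the printed accuracies are meaningless except as a smoke test. Layer sizes are placeholders within the 50-100 range tested.

```python
# Compare the seven optimizers from the study on an LSTM binary classifier.
import numpy as np
import tensorflow as tf

N_FEATURES, HIDDEN = 41, 80  # NSL-KDD has 41 features; 50-100 units tested

def build_model(optimizer: str) -> tf.keras.Model:
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(1, N_FEATURES)),   # one timestep/record
        tf.keras.layers.LSTM(HIDDEN),
        tf.keras.layers.Dense(1, activation="sigmoid"),  # attack vs. normal
    ])
    model.compile(optimizer=optimizer, loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

x = np.random.rand(256, 1, N_FEATURES).astype("float32")   # stand-in data
y = np.random.randint(0, 2, size=(256, 1)).astype("float32")
for opt in ["adamax", "sgd", "adagrad", "adam", "rmsprop", "nadam", "adadelta"]:
    hist = build_model(opt).fit(x, y, epochs=2, verbose=0)
    print(opt, hist.history["accuracy"][-1])
```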



COMPUTER MODELING AND DESIGN AUTOMATION SYSTEMS
Detailing fuzzy cognitive map models through clustering and nesting for complex concepts
Abstract
Analyzing complex information models, constructed from any collected data, is a challenging task. In recent years, new methodologies have emerged and been proposed to address various problems in this field. However, there is still a need for new, efficient, and user-friendly methods for data presentation and information modeling. This paper proposes a method for creating a nested structure based on fuzzy cognitive maps. In this method, each concept can be represented as another fuzzy cognitive map through clustering, which provides a more detailed and accurate representation of complex data and increases the convenience and efficiency of analyzing such information models. This nested structure is then optimized by applying evolutionary learning algorithms. Through a dynamic optimization process, the entire nested structure based on fuzzy cognitive maps is restructured to obtain important relationships between map elements at each level of nesting, as well as to determine the weight coefficients of these relationships based on available time series. This process allows for the discovery of hidden relationships between important map elements. The article proposes the application of this nested approach using the example of a fuzzy cognitive map of the influence of various social factors on becoming homeless.
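
For reference, the standard fuzzy cognitive map update rule that such models iterate, here with arbitrary toy weights and activations (the paper's maps and learned coefficients are not reproduced): A_i(t+1) = f(A_i(t) + sum_j w_ji A_j(t)), with a sigmoid squashing function f.

```python
# Iterate a tiny fuzzy cognitive map to a fixed point.
import numpy as np

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

W = np.array([[0.0, 0.6, -0.3],
              [0.0, 0.0,  0.8],
              [0.4, 0.0,  0.0]])   # W[j, i] = influence of concept j on i
A = np.array([0.5, 0.2, 0.7])      # initial concept activations

for _ in range(20):                # converges quickly for small maps
    A = sigmoid(A + W.T @ A)       # A_i <- f(A_i + sum_j W[j, i] * A_j)
print(A)
```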



Dynamic rod pump model
Abstract
On the basis of a full kinematic analysis in terms of position functions and velocity analogues, a dynamic model of the drive of the sucker rod deep-well pump PSHGN8-3-5500 is proposed, containing the main geometrical, kinematic, inertial and power parameters. The differential equations of the dynamic model are integrated under the condition that the pump is driven by the asynchronous motor 4AR180M3 U3. The influence of the number of strokes per minute on the main parameters of the rod pump operation is investigated. The laws of change of the angular velocity and angular acceleration of the driving crank in starting and steady-state modes are determined.
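
The abstract does not reproduce the model's equations, but single-degree-of-freedom drive dynamics of this kind are typically reduced to the crank angle φ; a generic form, for orientation only:

```latex
% Generic reduced equation of motion for a single-DOF crank drive:
% J(\varphi) is the reduced moment of inertia (built from the position
% functions), M_m the motor torque, M_r(\varphi) the reduced load torque.
J(\varphi)\,\ddot{\varphi}
  + \frac{1}{2}\,\frac{dJ}{d\varphi}\,\dot{\varphi}^{2}
  = M_m(\dot{\varphi}) - M_r(\varphi)
```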



Analysis of the implementation of the agent-based approach and features of the simulation of a seaport in the AnyLogic environment
Abstract
The relevance of this article stems from the interest in simulation modeling in the context of doing business, as well as of improving it. Modeling gives an in-depth understanding of how a business works, allowing one to experiment with it in a secure digital environment and to identify bottlenecks that require optimization. At the same time, one of the most important steps in building a model is preparing the source data: if the operations typical for the company are not taken into account, the results of experiments on the model will not be plausible and will therefore be inapplicable in practice. It is also important to choose the simulation approach correctly, since with an incorrect choice modeling can take a long time and prove unprofitable. The modern AnyLogic environment is pre-configured for creating simulation models that reflect the real tactics and operation of seaport container terminals: ship sections, cargo transportation, transport by various modes, and personnel composition. As part of the study, the implementation of the agent-based approach was considered and described in the AnyLogic environment using the example of the container terminal of the Vladivostok commercial seaport.



INFORMATICS AND INFORMATION PROCESSING
Aspects of modernizing an ITSM-class information system in software development companies: microservice architecture of the system's search module
Abstract
The article considers the problem of the practical modernization of IT service management (ITSM) systems in software development companies, a problem that has recently been widely considered in the work of researchers worldwide [1–5; 13–15]. To study the need for modernization, typical business process diagrams of a developer company (BPMN 2.0) are given. Based on the analysis of these business processes and the shortcomings identified in them, the components of an architectural microservice solution for modernizing incident handling are developed and presented in schematic form. The result is a microservice architecture for the search module of the ITSM system, given in the article as a model with explanations. As a continuation of the research, pilot tests of the developed architecture are planned on a number of applied tasks, with subsequent evaluation of the results. The proposed modernization variant is characterized by universality and can be considered as a solution in each case of ITSM use in a production enterprise. The article will be useful for IT specialists implementing and maintaining ITSM systems.



NANOTECHNOLOGY AND NANOMATERIALS
Energy balance: from coal to quantum batteries
Abstract
Electronegativity and chemical hardness are important concepts in chemistry that influence the structure, properties, and reactivity of substances. Electronegativity defines an atom’s ability to attract electrons in a chemical bond, which affects the polarity and stability of molecules. Chemical hardness, on the other hand, characterizes a substance’s resistance to changes in its electronic structure and its response to external influences. These two parameters are interconnected and play a key role in understanding the behavior of chemical compounds, particularly in the context of superconductors, catalysts, and materials with unique properties. Understanding the relationship between electronegativity and chemical hardness may contribute to the development of new materials with tailored characteristics and enhanced properties.
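
The connection the abstract relies on is usually written in Mulliken/Parr-Pearson form, where I is the ionization energy and A the electron affinity of a species:

```latex
% Mulliken electronegativity \chi and Parr-Pearson chemical hardness \eta:
\chi = \frac{I + A}{2},
\qquad
\eta = \frac{I - A}{2}
```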



The observer effect in the double-slit experiment: the role of experimental parameters in forming the interference pattern
Abstract
Changes in experimental conditions significantly influence the interference pattern in the double-slit experiment, which is determined by various factors, including the distance and width of the slits, the wavelength, the position of the detector, and the spectral properties of the detector itself. The observer effect, manifested in the alteration of quantum objects’ behavior depending on the measurement conditions, underscores the critical importance of experimental conditions in quantum mechanics and their direct impact on the observed results. Understanding these factors deepens our knowledge of quantum interactions and contributes to the development of more reliable and effective quantum systems, such as quantum computers and quantum communication networks. This knowledge opens new horizons in the study of the nature of light and matter, as well as fostering a deeper understanding of the “observer effect” and the application of quantum technologies to practical problems.
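
For reference, the standard small-angle fringe geometry for slit separation d, slit-to-screen distance L, and wavelength λ:

```latex
% Two-slit intensity (ignoring the single-slit envelope) and the resulting
% bright-fringe spacing on the screen:
I(\theta) \propto \cos^{2}\!\left(\frac{\pi d \sin\theta}{\lambda}\right),
\qquad
\Delta y = \frac{\lambda L}{d}
```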



Fractals and the structure of the universe
Abstract
This article examines the phenomenon of fractals and their role in understanding the structure of the universe. Fractals are complex geometric structures characterized by self-similarity, finding applications in various fields of science, from mathematics to biology. Examples of fractals in nature are provided, including galaxies, clouds, the nervous system, and natural landscapes. The discussion highlights how fractals assist in modeling complex systems, analyzing data, and understanding the evolution of different structures. The article emphasizes the importance of fractals as a tool for studying natural processes and their significance for further research in quantum physics and chaos theory.
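
A one-line formalization of the self-similarity the article describes is the box-counting dimension; for the Koch curve, N = 4 self-similar copies at scale 1/3 give D = log 4 / log 3 ≈ 1.262.

```latex
% Box-counting dimension: N(\varepsilon) boxes of side \varepsilon
% are needed to cover the set.
D = \lim_{\varepsilon \to 0}
    \frac{\log N(\varepsilon)}{\log\bigl(1/\varepsilon\bigr)}
```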


