
Computing and Grid Technologies in Science and Education (GRID'2021)


In this work, the current status of the computing capability, network and engineering infrastructure is shown. It generates various queries (event search, trigger searches, etc.) on sets of the EventIndex data and measures the response times. The Super Charm-Tau (SCT) factory project is a high-brightness electron-positron collider for the study of charmed hadrons and tau leptons.
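
A minimal sketch of how such response-time measurements over EventIndex-style queries could be scripted is given below; the `run_query` helper and the query strings are hypothetical placeholders and not part of the actual EventIndex client.

```python
import statistics
import time

def run_query(query: str) -> int:
    """Hypothetical query runner; a real test would call the EventIndex client API."""
    time.sleep(0.01)          # stand-in for the real server-side work
    return 0

QUERIES = {
    "event_lookup": "run=358031 event=12345678",      # illustrative only
    "trigger_search": "trigger=HLT_mu26_ivarmedium",   # illustrative only
}

for name, query in QUERIES.items():
    timings = []
    for _ in range(10):                                # repeat to average out noise
        start = time.perf_counter()
        run_query(query)
        timings.append(time.perf_counter() - start)
    print(f"{name}: mean {statistics.mean(timings):.3f} s, max {max(timings):.3f} s")
```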

One of the most significant improvements was the migration of the repository for aggregated data. This work describes the design of a digital model of an HPC system for processing data from the "megascience" class electron-positron collider of the Super Charm-Tau factory. The model includes intelligent agents that mimic the behavior of the supercomputer's main subsystems, such as the task scheduler.
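
As an illustration of such an agent-style digital model, the sketch below simulates a pool of compute nodes served by a FIFO scheduler using the SimPy discrete-event library; the node count, job lengths and arrival rates are arbitrary assumptions, not parameters of the actual model.

```python
import random
import simpy

def job(env, name, cluster, duration):
    """A job waits for a free node, occupies it for its runtime, then releases it."""
    arrival = env.now
    with cluster.request() as req:
        yield req                          # queue until the scheduler grants a node
        wait = env.now - arrival
        yield env.timeout(duration)        # occupy the node
        print(f"{name}: waited {wait:5.1f}, ran {duration:5.1f}")

def workload(env, cluster):
    """Generate jobs with random sizes and exponential inter-arrival times."""
    for i in range(20):
        duration = random.expovariate(1 / 5.0)
        env.process(job(env, f"job{i:02d}", cluster, duration))
        yield env.timeout(random.expovariate(1 / 2.0))

random.seed(0)
env = simpy.Environment()
cluster = simpy.Resource(env, capacity=4)  # hypothetical pool of 4 compute nodes
env.process(workload(env, cluster))
env.run(until=200)
```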

In the case of the TAIGA experiment, we have previously studied both the quality and the accuracy of the selection of gamma-ray events. We provide an overview of the CMS experiment activities to apply Machine Learning (ML) techniques to Data Quality Monitoring (DQM). Evolution of the WLCG Computing Infrastructure for the High Luminosity Challenge of the LHC at CERN.
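
One common way to apply ML to DQM is to classify per-run monitoring histograms as good or anomalous. The toy sketch below, with synthetic histograms and a scikit-learn random forest, only illustrates that general idea and does not reflect the actual CMS tooling.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Toy data: each row is the binned content of a monitoring histogram,
# labelled 1 (good) or 0 (anomalous) by shifters.
rng = np.random.default_rng(0)
good = rng.normal(100, 10, size=(500, 50))
bad = rng.normal(100, 10, size=(100, 50))
bad[:, 20:30] *= 0.3                       # simulate a dead detector region
X = np.vstack([good, bad])
y = np.array([1] * len(good) + [0] * len(bad))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("holdout accuracy:", clf.score(X_te, y_te))
```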

The basis of the model is a new quantum inference model based on a quantum genetic algorithm.

EFFECTIVE ALGORITHM OF CALCULATING THE WIGNER FUNCTION FOR A QUANTUM SYSTEM WITH A POLYNOMIAL POTENTIAL
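
For reference, the quantity computed by such algorithms is the standard Wigner quasi-probability distribution of a pure state psi:

```latex
W(x,p) \;=\; \frac{1}{\pi\hbar}\int_{-\infty}^{\infty}
  \psi^{*}(x+y)\,\psi(x-y)\,e^{2ipy/\hbar}\,dy .
```

For a polynomial potential the Moyal evolution equation for W contains only a finite number of derivative terms, which is generally what makes efficient numerical schemes feasible; the specific algorithm of the talk is not reproduced here.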

Studies of the geometrical aspects of quantum information are becoming very relevant for practical purposes. Due to the demand coming from quantum technology, the formulation of quantum estimation theory has become the frontier of modern research. Currently, data from most satellites are in the public domain and are generally multispectral images.
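
A central formula in that quantum estimation context, quoted here only for orientation, is the quantum Cramér-Rao bound with the quantum Fisher information F_Q, which for a pure-state family reduces to the well-known expression below (n is the number of independent measurements and the estimator is unbiased).

```latex
\operatorname{Var}(\hat\theta)\;\ge\;\frac{1}{n\,F_Q[\rho_\theta]},
\qquad
F_Q\big[|\psi_\theta\rangle\big]
  = 4\Big(\langle\partial_\theta\psi_\theta|\partial_\theta\psi_\theta\rangle
      - \big|\langle\psi_\theta|\partial_\theta\psi_\theta\rangle\big|^{2}\Big).
```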

Modern applications of quantum mechanics renewed interest in the properties of the set of finite size density matrices. The search for short-lived particles is an important part of physics research in experiments with relativistic heavy ions. The report provides an overview of the current state and the most important directions for the advanced development of National.

This method allows you to obtain the system's solutions in both symbolic and numerical form. Improvements to the LOOT model for primary vertex discovery based on analysis of development results. These predictions are important when comparing the underlying QRS complex of the ECG wave with the slowly decaying waves (or arrhythmia) in cardiac patients.
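
The combination of symbolic and numerical solutions mentioned in the first sentence can be illustrated with a small SymPy sketch; the system of equations here is an arbitrary stand-in, since the actual equations studied are not reproduced in this summary.

```python
import sympy as sp

x, y = sp.symbols("x y")
# Illustrative system of equations, not the one from the talk.
system = [sp.Eq(x**2 + y**2, 4), sp.Eq(x - y, 1)]

symbolic = sp.solve(system, [x, y])                          # exact symbolic roots
numeric = [tuple(sp.N(value, 10) for value in root) for root in symbolic]

print("symbolic:", symbolic)
print("numeric: ", numeric)
```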

An ontology-based approach in exploratory analysis of textual data can significantly improve the quality of the results obtained. A PIK Data Center was put into use in 2017 as part of the PIK nuclear reactor reconstruction project. The SPD (Spin Physics Detector) is a planned spin physics experiment at the second interaction point of the NICA collider, which is under construction.

The main purpose of the experiment is to test the fundamentals of QCD via the study of the polarized structure of the nucleon and spin-related phenomena. The given architecture of the supercomputer allows users to choose optimal computing facilities to solve their tasks. National data lake research and development, as part of the DOMA project, should deal with the investigation of possible technological solutions.

Some Aspects of Workflow Scheduling in Computerized Continuum Systems (Mr Vladislav Kashansky, University of Klagenfurt and South Ural State University). A model based on the fractal method for describing load dynamics is discussed.
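
Assuming the fractal description of load dynamics is based on a self-similarity (Hurst) exponent, which the summary does not state explicitly, a minimal estimate via the aggregated-variance method could look like this:

```python
import numpy as np

def hurst_aggregated_variance(series, block_sizes):
    """Estimate H from the scaling of block-mean variance:
    for a self-similar process, Var(X^(m)) ~ m**(2H - 2)."""
    series = np.asarray(series, dtype=float)
    log_m, log_var = [], []
    for m in block_sizes:
        n_blocks = len(series) // m
        blocks = series[: n_blocks * m].reshape(n_blocks, m)
        v = blocks.mean(axis=1).var()
        if v > 0:
            log_m.append(np.log(m))
            log_var.append(np.log(v))
    slope, _ = np.polyfit(log_m, log_var, 1)
    return 1.0 + slope / 2.0

rng = np.random.default_rng(1)
load = rng.normal(size=100_000)                # white noise: expect H close to 0.5
print(hurst_aggregated_variance(load, [10, 20, 50, 100, 200, 500]))
```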

APPROACH TO REMOTE PARTICIPATION IN THE ITER EXPERIMENTAL PROGRAM

To monitor hardware resources and services in the growing DICE infrastructure, a system based on Prometheus and Thanos was designed and implemented.
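
Prometheus and Thanos expose the standard Prometheus HTTP query API, so routine availability checks over such monitoring data could be scripted as below; the endpoint URL and the PromQL expression are illustrative assumptions, not details from the talk.

```python
import requests

# Hypothetical endpoint; in a Thanos setup queries usually go to the Thanos Query
# component, which fans out to the Prometheus/Thanos Store instances behind it.
PROM_URL = "http://thanos-query.example.org:9090"

def instant_query(expr: str):
    """Run a PromQL instant query via the standard /api/v1/query endpoint."""
    r = requests.get(f"{PROM_URL}/api/v1/query", params={"query": expr}, timeout=10)
    r.raise_for_status()
    return r.json()["data"]["result"]

# Example: which monitored targets are currently down?
for sample in instant_query("up == 0"):
    print(sample["metric"].get("instance"), sample["value"])
```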

EXPERIENCE FROM MODEL OF RUSSIAN REMOTE PARTICIPATION CENTER (Mr Oleg Semenov, Project Center ITER)

THE ALGORITHM FOR SOLVING THE PROBLEM OF SYNTHESIS OF THE OPTIMAL LOGICAL STRUCTURE OF DISTRIBUTED DATA IN ARCHITECTURE OF GRID SERVICE

In particular, such tasks arise in the organization of systems for processing huge amounts of information from the Large Hadron Collider.

DEVELOPMENT OF EFFECTIVE ACCESS TO THE DISTRIBUTED SCIENTIFIC AND EDUCATIONAL E-INFRASTRUCTURE

Performance analysis of different parallel data processing methods implemented in the ROOT package. This configuration reduces the vulnerability of the entire network, since the failure of a single control element no longer interrupts its operation. In this work, the LinkedIssuesHasStatus plugin for the JIRA service of the ALICE experiment is developed and implemented.
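
One of the parallel data processing methods available in ROOT is the implicitly multithreaded RDataFrame event loop; the sketch below shows the general pattern, with the tree name, file name and branch purely as placeholder assumptions.

```python
import ROOT

# Enable ROOT's implicit multithreading so RDataFrame event loops run in parallel.
ROOT.EnableImplicitMT()

# Hypothetical input: tree and file names are placeholders.
df = ROOT.RDataFrame("Events", "data.root")

# Lazily book a filtered histogram; the event loop runs once, multithreaded,
# when the result is first accessed.
h = df.Filter("pt > 20").Histo1D(
    ("h_pt", "p_{T}; p_{T} [GeV]; events", 100, 0.0, 200.0), "pt")
print("entries passing the cut:", h.GetEntries())
```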

The results of the analysis of the models can be used in the reorganization of the existing ones. Development of information systems for theoretical and applied tasks based on the HybriLIT platform. The report provides an overview of two information systems (IS) that are being developed based on the HybriLIT platform.

The ATLAS experiment uses various tools to monitor and analyze the metadata of major distributed computing applications. One of the tools is fully based on the unified monitoring infrastructure (UMA) provided by the CERN-IT Monit group. In the wake of the successful integration of the Titan supercomputer into the ATLAS computing infrastructure.

This is especially true when studying diseases associated with changes and disorders in the functioning of the brain. The report will present the results of the development of the algorithm block of the Information System (IS) for radiobiological studies, which was created within the framework of the joint project of MLIT and LRB JINR. However, sequential algorithms become unable to handle a given problem as the amount of data representing graph instances increases.

In this paper, we discuss the main principles and architecture of the digital analytics platform that aims to support socio-economic applications. The heterogeneous platform "HybriLIT" is part of the Multifunctional Information and Computer Complex (MICC) of the Laboratory of Information Technologies named after M.G. Meshcheryakov at JINR, Dubna. The article discusses the main provisions (methods, risk models, calculation algorithms, etc.) of organizing personal data (PD) protection based on the application of an anonymization procedure.
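
As a minimal illustration of an anonymization step for personal data, direct identifiers can be replaced by keyed hashes; the record layout and key handling below are assumptions made for the example only, not the risk models discussed in the article.

```python
import hashlib
import hmac

# Illustrative pseudonymization of a direct identifier with a keyed hash;
# the key must be stored separately from the anonymized data set.
SECRET_KEY = b"replace-with-a-securely-stored-key"   # assumption, not from the article

def pseudonymize(identifier: str) -> str:
    """Return a stable, non-reversible token for the given identifier."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"name": "Ivan Ivanov", "age": 42, "diagnosis": "..."}
record["name"] = pseudonymize(record["name"])
print(record)
```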

The authors reveal the relevance of the studied problem based on the trend of the general growth of informatization and the further development of Big Data technology. The most promising is blockchain technology, which provides capabilities for the most effective coordination.
