63 research outputs found

    mABC: multi-Agent Blockchain-Inspired Collaboration for root cause analysis in micro-services architecture

    The escalating complexity of micro-services architecture in cloud-native technologies poses significant challenges for maintaining system stability and efficiency. To conduct root cause analysis (RCA) and resolution of alert events, we propose a pioneering framework, multi-Agent Blockchain-inspired Collaboration for root cause analysis in micro-services architecture (mABC), to revolutionize the AI for IT operations (AIOps) domain. In mABC, multiple agents built on powerful large language models (LLMs) perform blockchain-inspired voting to reach a final agreement, following a standardized process for handling the tasks and queries provided by Agent Workflow. Specifically, seven specialized agents derived from Agent Workflow each provide valuable insights towards root cause analysis based on their expertise and the intrinsic software knowledge of LLMs, collaborating within a decentralized chain. To avoid potential instability issues in LLMs and to fully leverage the transparent and egalitarian advantages of a decentralized structure, mABC adopts a decision-making process inspired by blockchain governance principles that weighs the contribution index and expertise index of each agent. Experimental results on the public AIOps challenge benchmark dataset and our train-ticket dataset demonstrate superior performance in accurately identifying root causes and formulating effective solutions compared to previous strong baselines. The ablation study further highlights the significance of each component within mABC, with Agent Workflow, multi-agent collaboration, and blockchain-inspired voting all being crucial for optimal performance. mABC offers comprehensive automated root cause analysis and resolution in micro-services architecture and achieves a significant improvement in the AIOps domain over existing baselines.
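    The abstract does not give implementation details, but the voting step can be pictured with a minimal sketch: each agent proposes a root-cause candidate, and its vote is weighted by its contribution and expertise indices. All agent names, weights, and the aggregation rule below are illustrative assumptions, not the paper's actual mechanism.

```python
from collections import defaultdict

def weighted_vote(proposals, contribution, expertise):
    """Aggregate agents' root-cause proposals by a blockchain-inspired
    weighted vote; the candidate with the highest total weight wins.

    proposals:    dict mapping agent name -> proposed root-cause label
    contribution: dict mapping agent name -> contribution index (0..1)
    expertise:    dict mapping agent name -> expertise index (0..1)
    """
    tally = defaultdict(float)
    for agent, candidate in proposals.items():
        # Each agent's voting power combines its two indices.
        weight = contribution.get(agent, 0.0) + expertise.get(agent, 0.0)
        tally[candidate] += weight
    # "Agreement" is the candidate with the largest weighted support.
    return max(tally, key=tally.get)

# Illustrative usage with hypothetical agents and candidate root causes.
proposals = {"alert_agent": "db_latency", "trace_agent": "db_latency",
             "metric_agent": "pod_restart"}
contribution = {"alert_agent": 0.6, "trace_agent": 0.8, "metric_agent": 0.5}
expertise = {"alert_agent": 0.7, "trace_agent": 0.9, "metric_agent": 0.4}
print(weighted_vote(proposals, contribution, expertise))  # -> "db_latency"
```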

    A Study on Whether Microblogging Has a Positive Effect on Reduction of Reverse Culture Shock: A Weibo Case

    When sojourners return to their home country, they often suffer symptoms such as feelings of alienation, discomfort, and abnormality; this phenomenon is called "reverse culture shock". However, over the last decade the social media industry has grown rapidly, including microblogs such as China's Weibo. Microblogging has a positive effect on building common ground and mutual understanding and on enhancing feelings of intimacy and connectedness with others. Thanks to social media, sojourners have a bridge connecting them to their home countries and to the people close to them there. This study therefore aims to determine whether these benefits of social media play a role in reducing sojourners' sense of reverse culture shock.

    Pathogenesis of psoriasis complicated with atherosclerosis: a bioinformatics analysis based on transcriptomic data

    Objective To identify comorbid hub genes in psoriasis and atherosclerosis. Methods Transcriptomic datasets of three psoriatic samples and three atherosclerosis samples were downloaded from the GEO database. A deep learning technique (batch normalization) was utilized to merge and batch-correct the datasets of the two diseases. The limma package was employed to intersect the differentially expressed genes between lesions and normal tissues of both diseases. The protein-protein interaction network was constructed using the STRING database and the CytoHubba plugin to identify the hub genes. Results Intersection analysis revealed 132 up-regulated genes and 114 down-regulated genes in the lesions of the two diseases. The interaction network identified 10 hub genes: MX1, OAS1, OAS2, OASL, IFIT1, RSAD2, CXCL10, IFIT3, XAF1 and IL1B, among which the first six were enriched in the type I interferon signaling pathway. Two external validation sets independently verified the expression of CXCL10. Conclusions CXCL10 is a key comorbidity gene for psoriasis and atherosclerosis. The activation pattern of the hub genes is similar to that of the innate immune response to viral invasion.
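    The published pipeline relies on limma and STRING/CytoHubba (typically in R); the sketch below only illustrates the intersection step in Python, assuming hypothetical per-disease differential-expression tables with gene, logFC, and adj_p columns. File names, column names, and thresholds are placeholders, not the study's actual settings.

```python
import pandas as pd

# Hypothetical limma output tables, one per disease, with columns
# "gene", "logFC", and "adj_p" (names are assumptions, not from the paper).
psoriasis = pd.read_csv("psoriasis_limma.csv")
athero = pd.read_csv("atherosclerosis_limma.csv")

def split_deg(df, lfc=1.0, alpha=0.05):
    """Split a differential-expression table into up- and down-regulated gene sets."""
    sig = df[df["adj_p"] < alpha]
    up = set(sig.loc[sig["logFC"] > lfc, "gene"])
    down = set(sig.loc[sig["logFC"] < -lfc, "gene"])
    return up, down

up_pso, down_pso = split_deg(psoriasis)
up_ath, down_ath = split_deg(athero)

# Genes dysregulated in the same direction in both diseases, analogous to
# the 132 up- and 114 down-regulated comorbid genes reported in the abstract.
shared_up = up_pso & up_ath
shared_down = down_pso & down_ath
print(len(shared_up), len(shared_down))
```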

    Digital Transformation and Rule of Law Based on Peak CO₂ Emissions and Carbon Neutrality

    The promotion and implementation of carbon neutrality and peak carbon dioxide emissions urgently need the support of science and technology and the backing of a guarantee of the rule of law. The proposition, logic, and progression of digital responses to peaking carbon dioxide emissions in the pursuit of carbon neutrality are clearly reflected in the current era: big data can address inadequate central–local coordination and interaction, the inadequate application of the rule of law, campaign-style "carbon reduction" promotion, and weak scientific and technological support in the promotion and implementation of peak carbon dioxide emissions and carbon neutrality. Attention must be paid to the coordination of digital technology, the rule-of-law response, and the protection of people's rights. First, in the process of digital carbon dioxide peaking and carbon neutrality, it is necessary to improve the credibility of carbon peaking and carbon neutrality through the "whole-process trace" and storage mechanisms made possible by blockchain technology. Second, it is necessary to refine the management of peak carbon dioxide emissions and carbon neutrality through "decentralization" and consensus mechanisms. Third, it is necessary to improve the effectiveness of governance through "non-falsifiability" and collaboration mechanisms. Finally, the paper draws the following conclusions. First, in smart city construction, it is necessary to promote the coordinated construction of low-carbon and smart cities and to explore the legal ramifications of low-carbon development in urban governance. Second, in corporate governance, a low-carbon-development digital platform should be built to promote the integration of digital technology and corporate compliance. Third, in global governance, the rule of law in cyberspace should be promoted to address global climate change, the low-carbon development of digital technology, and the low-carbon construction of a cyber society. Fourth, the rights and obligations of different parties in the implementation mechanism of the rule of law on digital carbon peaking and carbon neutrality should be emphasized.

    Text simplification based on deep learning

    Final project (TFG) of the double degree in Business Administration and Management and Computer Science Engineering, Facultad de Informática UCM, Departamento de Ingeniería del Software e Inteligencia Artificial, academic year 2023/2024. The final application is available at https://simplificacion.pythonanywhere.com/ and the developed code can be consulted at https://github.com/XinxiangZ/tfg.
    Everyday texts can be difficult to understand for some social groups due to various reasons, such as a low educational level, aging, intellectual disability, or learning disorders. Text simplification arises to facilitate access to information for these groups. Text simplification is understood as the process of transforming a text into an equivalent one that is simpler to understand. This process includes various tasks, such as dividing complex sentences into simpler ones and replacing complex vocabulary with simpler, more everyday vocabulary. Traditionally, text simplification was carried out manually by editors with knowledge of simplification guidelines, but in terms of time and effort, manual text simplification is an expensive task, especially now that information is constantly being generated. With the aim of streamlining the simplification task, the idea of automating part of the work arises, giving rise to automatic text simplification. Language models play an important role in this task as they are currently the basis of Natural Language Processing techniques. In this work we analyze the corpora and Large Language Models that currently exist for text simplification in Spanish. After this analysis, we conclude that the choice of a specific corpus influences the simplification task being studied in each case, since each corpus is created for a specific simplification task. To select the best model for each simplification task, we carry out an experimental study in which, using evaluation metrics, we evaluate the performance of each model on each corpus. Finally, to put the studied models into practice, we create a web application that simplifies texts in Spanish, taking into account the different types of simplification and the conclusions obtained during the study of the corpora and models.
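    As an illustration of how such a web application might call a simplification model, the sketch below uses the Hugging Face transformers text2text-generation pipeline; the model checkpoint name is a placeholder, not one of the models evaluated in the study.

```python
from transformers import pipeline

# The checkpoint below is a placeholder; substitute the Spanish
# simplification model selected in the experimental study.
simplifier = pipeline("text2text-generation",
                      model="your-org/es-simplification-model")

complex_text = ("La implementación de la normativa requiere la colaboración "
                "de múltiples organismos gubernamentales.")
result = simplifier(complex_text, max_length=128)
print(result[0]["generated_text"])  # simplified Spanish output
```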

    GA-Reinforced Deep Neural Network for Net Electric Load Forecasting in Microgrids with Renewable Energy Resources for Scheduling Battery Energy Storage Systems

    The large-scale integration of wind power and PV cells into electric grids alleviates the energy crisis. However, it also introduces technical and management problems in the power grid, such as power fluctuation, scheduling difficulties, and reduced reliability. The microgrid concept has been proposed to locally control and manage a cluster of local distributed energy resources (DERs) and loads. If the net load power can be accurately predicted, the operation of battery energy storage systems (BESSs) can be scheduled and optimized through economic dispatch to cover intermittent renewables. However, the load curve of the microgrid is highly affected by various external factors, resulting in large fluctuations, which makes prediction difficult. This paper predicts the net electric load of the microgrid using a deep neural network to realize a reliable power supply and reduce the cost of power generation. Considering that the backpropagation (BP) neural network has a good approximation effect and strong adaptation ability, a BP deep neural network load prediction model is established. However, the BP neural network has some defects: its predictions are not precise enough and it easily falls into locally optimal solutions. Hence, a genetic algorithm (GA)-reinforced deep neural network is introduced. By optimizing the weights and thresholds of the BP network, the deficiencies of the BP neural network are alleviated and the prediction quality is improved. The results reveal that the mean square error (MSE) of the GA-BP neural network prediction is 2.0221, significantly smaller than the 30.3493 of the BP neural network prediction, an error reduction of 93.3%. The reductions in root mean square error (RMSE) and mean absolute error (MAE) are 74.18% and 51.2%, respectively.
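    As a rough illustration of the GA-BP idea, the toy sketch below uses a genetic algorithm to select the weights of a one-hidden-layer network on synthetic data; the population size, mutation rate, fitness function, and data are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "net load" series: fit y = f(x) with a one-hidden-layer network whose
# weights are chosen by a simple genetic algorithm instead of random init.
X = np.linspace(0, 1, 64).reshape(-1, 1)
y = np.sin(2 * np.pi * X) + 0.1 * rng.standard_normal(X.shape)

H = 8                        # hidden units
DIM = 1 * H + H + H + 1      # W1, b1, W2, b2 flattened into one vector

def unpack(vec):
    W1 = vec[:H].reshape(1, H); b1 = vec[H:2 * H]
    W2 = vec[2 * H:3 * H].reshape(H, 1); b2 = vec[3 * H:]
    return W1, b1, W2, b2

def mse(vec):
    W1, b1, W2, b2 = unpack(vec)
    hidden = np.tanh(X @ W1 + b1)
    pred = hidden @ W2 + b2
    return float(np.mean((pred - y) ** 2))

# Genetic algorithm: truncation selection, uniform crossover, Gaussian mutation.
pop = rng.standard_normal((40, DIM))
for gen in range(200):
    fitness = np.array([mse(ind) for ind in pop])
    parents = pop[np.argsort(fitness)[:10]]            # keep the 10 fittest
    children = []
    for _ in range(30):
        a, b = parents[rng.integers(10, size=2)]
        mask = rng.random(DIM) < 0.5                   # uniform crossover
        children.append(np.where(mask, a, b) + 0.1 * rng.standard_normal(DIM))
    pop = np.vstack([parents, children])

best = pop[np.argmin([mse(ind) for ind in pop])]
print("GA-selected weights give MSE:", mse(best))
# In a GA-BP setup these weights would then seed backpropagation fine-tuning.
```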

    Handling Computation Hardness and Time Complexity Issue of Battery Energy Storage Scheduling in Microgrids by Deep Reinforcement Learning

    With the development of microgrids (MGs), an energy management system (EMS) is required to ensure the stable and economically efficient operation of the MG system. In this paper, an intelligent EMS is proposed that exploits the deep reinforcement learning (DRL) technique. DRL is employed as an effective method for handling the computational hardness of optimally scheduling the charge/discharge of battery energy storage in the MG EMS. Since the optimal charge/discharge decision for the battery depends on its state of charge across consecutive time steps, obtaining the optimum solution demands scheduling over the full time horizon. This, however, increases the time complexity of the EMS and turns it into an NP-hard problem. By considering the energy storage system's charging/discharging power as the control variable, the DRL agent is trained to find the best energy storage control policy for both deterministic and stochastic weather scenarios. The efficiency of the proposed strategy in minimizing the cost of purchased energy is also shown quantitatively through programming verification and comparison with the results of mixed-integer programming and a heuristic genetic algorithm (GA).
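    To make the scheduling idea concrete, the toy sketch below trains a tabular Q-learning agent to choose hourly charge/idle/discharge actions for a discretized battery against a synthetic price curve; the state/action discretization, price signal, and reward are illustrative assumptions, not the paper's DRL formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# 24 hourly steps, 5 discrete state-of-charge levels, toy price curve ($/kWh).
PRICES = 0.1 + 0.1 * np.sin(np.linspace(0, 2 * np.pi, 24))
N_SOC = 5
ACTIONS = [-1, 0, +1]          # discharge one level, idle, charge one level
Q = np.zeros((24, N_SOC, len(ACTIONS)))

alpha, gamma, eps = 0.1, 0.99, 0.1

def step(t, soc, a_idx):
    """Apply an action, return next SoC and reward (negative energy cost)."""
    next_soc = int(np.clip(soc + ACTIONS[a_idx], 0, N_SOC - 1))
    realized = next_soc - soc              # 0 if the action hit a SoC bound
    # Charging buys energy at the current price; discharging avoids buying it.
    reward = -PRICES[t] * realized
    return next_soc, reward

for episode in range(2000):
    soc = 2
    for t in range(24):
        a = rng.integers(len(ACTIONS)) if rng.random() < eps else int(np.argmax(Q[t, soc]))
        next_soc, r = step(t, soc, a)
        target = r if t == 23 else r + gamma * np.max(Q[t + 1, next_soc])
        Q[t, soc, a] += alpha * (target - Q[t, soc, a])
        soc = next_soc

# Greedy schedule read off the learned Q-table.
soc, schedule = 2, []
for t in range(24):
    a = int(np.argmax(Q[t, soc]))
    schedule.append(ACTIONS[a])
    soc, _ = step(t, soc, a)
print("Charge(+1)/idle(0)/discharge(-1) plan:", schedule)
```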

    An execution time prediction model for crew information processing in new special vehicles

    Task execution time prediction modeling is of great significance to the safety of special vehicle crews and the maintenance of overall combat effectiveness. In our study, a task execution time prediction model for crew information processing tasks is proposed, considering the characteristics of information processing tasks as well as the interaction mode of the man-machine system of special vehicles. In addition, a model validation experiment was conducted with 20 subjects performing two kinds of representative tasks, each at two levels of complexity. The result shows a highly positive correlation (r = 0.999, p = 0.001) between the model predictions and the experimental results of the four tasks, which indicates that the model has good applicability. Building on this validation, an application study of the model was conducted. The result shows that for sub-tasks with more objects and greater visual search difficulty, the speech interaction mode can greatly reduce the operation duration; for simple sub-tasks with fewer operation steps and strong coherence, the touch interaction mode has certain advantages.