39 research outputs found

    An Intelligent Decision Support System for Business IT Security Strategy

    Cyber threat intelligence (CTI) is an emerging approach to improving the cyber security of business IT environments. A CTI feed carries information about an affected business IT context. CTI sharing tools are available to subscribers, and CTI feeds are increasingly available. If another business IT context is similar to a CTI feed's context, the threat described in the feed might also occur in that business IT context, so businesses can take proactive defensive actions if relevant CTI is identified. One challenge, however, is how to develop an effective strategy for connecting CTI to business IT contexts. Businesses still make insufficient use of CTI because not all of them have sufficient knowledge from domain experts. Moreover, business IT contexts vary over time: when the business IT contextual state changes, previously relevant CTI may no longer be appropriate or applicable. A further challenge is therefore how a connection strategy can adapt to business IT contextual changes. To fill this gap, this Ph.D. project proposes a dynamic strategy for connecting CTI to business IT contexts and instantiates it as a dynamic connection rule assembly system. The system can identify relevant CTI for a business IT context and can modify its internal configurations and structures to adapt to contextual changes. This thesis presents the system development phases from design to delivery; the contributions to knowledge are as follows. A hybrid representation of the dynamic connection strategy is proposed to generalise and interpret the problem domain and the system development, using selected computational intelligence models and software development models. In the computational intelligence models, a CTI feed context and a business IT context are generalised to the same type, the context object. A grey number model represents the attribute values of context objects.
    Fuzzy sets represent the context objects, and linguistic densities of the attribute values of context objects are reasoned over. To assemble applicable connection knowledge, the system constructs a set of connection objects from the context objects and uses rough set operations to extract the applicable connection objects that contain the connection knowledge. To adapt to contextual changes, a rough set based incremental updating approach with multiple operations is developed to incrementally update the approximations. A set of propositions describes how the system changes based on its previous states and internal structures, and their complexities and efficiencies are analysed. In the software development models, selected unified modelling language (UML) models represent the system in the design phase: an activity diagram represents the business process of the system, a use case diagram represents human interactions with the system, and a class diagram represents the internal components and the relationships between them. Using this representation, developers can rapidly develop a prototype of the system, and an application of the system is developed with mainstream software development techniques. A RESTful software architecture handles the communication of business IT contextual information and CTI-based analysis results between the server and the clients. A script based method deployed on the clients collects the contextual information. The observer pattern and a timer are used to design and build the monitor-trigger mechanism. In summary, the representation generalises real-world cases in the problem domain and interprets the system data.
    A specific business can initialise an instance of the representation as a specific system based on its IT context and CTI feeds, and the knowledge assembled by the system can be used to identify relevant CTI feeds. From the relevant CTI data, the system locates and retrieves the useful information that can inform security decisions and sends it to the client users. When the system needs to modify itself to adapt to business IT contextual changes, it can invoke the corresponding incremental updating functions and avoid a time-consuming re-computation. With this updating strategy, the application can provide its client-side users with timely support and decision-relevant information drawn from CTI.
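    The rough set extraction step described above can be sketched in a few lines. This is a minimal illustration of lower/upper approximations over hypothetical "connection objects"; the attribute names, objects, and the target set of relevant objects are illustrative stand-ins, not the thesis's actual data model.

```python
# Rough-set lower/upper approximations over toy "connection objects".
# Objects are described by discrete attribute values; the equivalence
# relation groups objects indiscernible on the chosen attributes.
from collections import defaultdict

def equivalence_classes(objects, attrs):
    """Partition object ids by their values on the given attributes."""
    classes = defaultdict(set)
    for oid, desc in objects.items():
        key = tuple(desc[a] for a in attrs)
        classes[key].add(oid)
    return list(classes.values())

def approximations(objects, attrs, target):
    """Lower approx: classes fully inside target; upper: classes touching it."""
    lower, upper = set(), set()
    for cls in equivalence_classes(objects, attrs):
        if cls <= target:
            lower |= cls
        if cls & target:
            upper |= cls
    return lower, upper

# Hypothetical connection objects (attribute values are made up):
objects = {
    "c1": {"os": "linux", "svc": "web"},
    "c2": {"os": "linux", "svc": "web"},
    "c3": {"os": "linux", "svc": "db"},
    "c4": {"os": "win",   "svc": "db"},
}
relevant = {"c1", "c3"}     # objects judged relevant to some CTI feed
low, up = approximations(objects, ["os", "svc"], relevant)
print(sorted(low), sorted(up))   # → ['c3'] ['c1', 'c2', 'c3']
```

    Objects in the lower approximation certainly carry applicable connection knowledge; the boundary (upper minus lower) captures the uncertain cases, which is where incremental updating matters most when the context changes.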

    S-adenosyl-L-methionine improves ventricular remodeling after myocardial infarction by regulating angiogenesis and fibrosis

    Purpose: To investigate the effect of S-adenosyl-L-methionine (SAM) on angiogenesis and fibrosis in the heart of rats with myocardial infarction (MI), and to determine the mechanism of action. Methods: Sprague Dawley rats with MI received SAM treatment (15 mg/kg) intraperitoneally. The cumulative survival (%) of the rats was recorded to determine their rate of survival. Hematoxylin-eosin staining, echocardiography, and hemodynamic measurements were also performed. In addition, the effects of SAM on vascular regeneration in the rats were analyzed by determining the expression of vascular endothelial growth factor (VEGF), basic fibroblast growth factor (bFGF) and hypoxia-inducible factor 1-α (HIF1-α). Results: The 8-week survival rate of the MI group was significantly lower than that of the sham group, while SAM significantly improved the survival rate of the rats. In addition, SAM improved the contractile and diastolic heart function in the rats and also increased the ventricular pressure change. Furthermore, SAM elevated the expressions of VEGF, bFGF and HIF1-α in rat myocardium and serum. In myocardial tissues of SAM-treated rats, the expressions of collagen I, collagen III and α-sma were reduced, indicating that SAM inhibited myocardial fibrosis. In addition, SAM promoted cardiac angiogenesis by activating the Jagged1/Notch1 signaling pathway. Conclusion: SAM promotes angiogenesis of the myocardium by activating the Jagged1/Notch1 signaling pathway and inhibits fibrosis in rat myocardium. Therefore, SAM effectively inhibits ventricular remodeling in rats after MI, thereby improving the rats' heart structure and function. The results may provide new targets for the treatment of myocardial infarction.

    CPDG: A Contrastive Pre-Training Method for Dynamic Graph Neural Networks

    Dynamic graph data mining has gained popularity in recent years due to the rich information contained in dynamic graphs and their widespread use in the real world. Despite the advances in dynamic graph neural networks (DGNNs), the rich information and diverse downstream tasks have posed significant difficulties for the practical application of DGNNs in industrial scenarios. To this end, in this paper, we propose to address them by pre-training and present the Contrastive Pre-Training Method for Dynamic Graph Neural Networks (CPDG). CPDG tackles the challenges of pre-training for DGNNs, including generalization and long-short term modeling capability, through a flexible structural-temporal subgraph sampler along with structural-temporal contrastive pre-training schemes. Extensive experiments conducted on both large-scale research and industrial dynamic graph datasets show that CPDG outperforms existing methods in dynamic graph pre-training for various downstream tasks under three transfer settings. Comment: 12 pages, 6 figures
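    Contrastive pre-training schemes like those mentioned above typically build on an InfoNCE-style objective: two views of the same node (e.g. sampled at different times) are pulled together while other nodes act as negatives. The sketch below shows that generic loss; the embeddings, pairing, and temperature are illustrative, and CPDG's exact losses are not specified in this abstract.

```python
# Generic InfoNCE contrastive loss on small hand-made embeddings.
import math

def infonce(anchor, positive, negatives, tau=0.1):
    """-log( exp(sim(a,p)/tau) / sum over {positive} ∪ negatives )."""
    def dot(u, v):
        return sum(x * y for x, y in zip(u, v))
    def cos(u, v):
        return dot(u, v) / (math.sqrt(dot(u, u)) * math.sqrt(dot(v, v)))
    logits = [cos(anchor, positive) / tau] + [cos(anchor, n) / tau for n in negatives]
    m = max(logits)                                   # stabilise the log-sum-exp
    log_denom = m + math.log(sum(math.exp(l - m) for l in logits))
    return -(logits[0] - log_denom)

# Two temporal "views" of the same node embed close; other nodes are negatives.
a = [1.0, 0.0]
p = [0.9, 0.1]
negs = [[0.0, 1.0], [-1.0, 0.0]]
loss = infonce(a, p, negs)      # small: positive is far more similar than negatives
```

    Swapping the positive with a dissimilar vector makes the loss large, which is exactly the pressure that shapes the pre-trained embedding space.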

    Multicluster-Coordination Industrial Internet of Things: The Era of Nonorthogonal Transmission

    The imminent industrial Internet of Things (IIoT) aims to provide massive device connectivity and support ever-increasing data demands, putting today's production environment on the edge of a new era of innovations and changes. In a multicluster IIoT, devices may suffer severe intercluster interference due to the intensive frequency reuse among adjacent access points (APs), thus deteriorating their quality of service (QoS). To address this issue, conventional multicluster coordination in the IIoT provides orthogonal code-, frequency-, time- or spatial-domain multiple access for interference management, but this results in a waste of resources, especially in the context of the explosive increase in the number of devices.
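    The trade-off described above can be made concrete with a back-of-the-envelope SINR calculation: full frequency reuse exposes a device to co-channel interference from adjacent APs, while an orthogonal split removes the interference but shrinks each cluster's bandwidth. All powers, gains, and the 3-way split below are made-up numbers for illustration only.

```python
# Illustrative SINR / rate comparison for full reuse vs. orthogonal allocation.
import math

def sinr(signal_mw, interferers_mw, noise_mw=1e-3):
    """Signal-to-interference-plus-noise ratio (linear)."""
    return signal_mw / (sum(interferers_mw) + noise_mw)

def shannon_rate(bandwidth_hz, sinr_linear):
    """Shannon capacity bound in bit/s."""
    return bandwidth_hz * math.log2(1 + sinr_linear)

serving = 1.0                  # received power from the serving AP (mW)
neighbors = [0.2, 0.1]         # co-channel powers from adjacent APs (mW)

# Full reuse: whole 10 MHz band, but interference-limited.
r_reuse = shannon_rate(10e6, sinr(serving, neighbors))
# Orthogonal 3-way frequency split: no intercluster interference,
# but each cluster only gets a third of the band.
r_orth = shannon_rate(10e6 / 3, sinr(serving, []))
```

    Which option wins depends on the interference level and the reuse factor; nonorthogonal transmission, as the title suggests, aims to escape this dichotomy altogether.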

    Modeling Spatiotemporal Periodicity and Collaborative Signal for Local-Life Service Recommendation

    Online local-life service platforms provide services like nearby daily essentials and food delivery for hundreds of millions of users. Different from other types of recommender systems, local-life service recommendation has the following characteristics: (1) spatiotemporal periodicity, which means a user's preferences for items vary from different locations at different times; (2) spatiotemporal collaborative signal, which indicates similar users have similar preferences at specific locations and times. However, most existing methods either focus on merely the spatiotemporal contexts in sequences, or model the user-item interactions without spatiotemporal contexts in graphs. To address this issue, we design a new method named SPCS in this paper. Specifically, we propose a novel spatiotemporal graph transformer (SGT) layer, which explicitly encodes relative spatiotemporal contexts, and aggregates the information from multi-hop neighbors to unify spatiotemporal periodicity and collaborative signal. With extensive experiments on both public and industrial datasets, this paper validates the state-of-the-art performance of SPCS. Comment: KDAH CIKM'23 Workshop
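    One common way to "explicitly encode relative spatiotemporal contexts" is to bucketize the time gap and the geographic distance between two interactions into discrete ids that the transformer layer embeds (e.g. as an attention bias). The bucket boundaries and coordinates below are illustrative, not taken from the SPCS paper.

```python
# Bucketizing relative spatiotemporal context between two interaction events.
import bisect
import math

TIME_EDGES = [60, 3600, 86400, 7 * 86400]   # 1 min, 1 h, 1 day, 1 week (seconds)
DIST_EDGES = [0.5, 2.0, 10.0, 50.0]         # km

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in km."""
    rlat1, rlat2 = math.radians(lat1), math.radians(lat2)
    dlat = rlat2 - rlat1
    dlon = math.radians(lon2 - lon1)
    h = math.sin(dlat / 2) ** 2 + math.cos(rlat1) * math.cos(rlat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371.0 * math.asin(math.sqrt(h))

def context_bucket(t1, t2, p1, p2):
    """Map a pair of (timestamp, (lat, lon)) events to (time_bucket, dist_bucket)."""
    tb = bisect.bisect_right(TIME_EDGES, abs(t2 - t1))
    db = bisect.bisect_right(DIST_EDGES, haversine_km(*p1, *p2))
    return tb, db

# Two orders half an hour apart from nearby Shanghai locations land in small buckets:
tb, db = context_bucket(0, 1800, (31.23, 121.47), (31.24, 121.48))
```

    A learned embedding table indexed by `(tb, db)` then lets attention treat "same morning, same neighborhood" pairs differently from "last week, across town" pairs, which is the essence of unifying periodicity with the collaborative signal.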

    GliDe with a CaPE: A Low-Hassle Method to Accelerate Speculative Decoding

    Speculative decoding is a relatively new decoding framework that leverages small and efficient draft models to reduce the latency of LLMs. In this study, we introduce GliDe and CaPE, two low-hassle modifications to vanilla speculative decoding to further improve the decoding speed of a frozen LLM. Specifically, GliDe is a modified draft model architecture that reuses the cached keys and values from the target LLM, while CaPE is a proposal expansion method that uses the draft model's confidence scores to help select additional candidate tokens for verification. Extensive experiments on different benchmarks demonstrate that our proposed GliDe draft model significantly reduces the expected decoding latency. Additional evaluation using wall time reveals that GliDe can accelerate Vicuna models up to 2.17x and further extend the improvement to 2.61x with CaPE. We will release our code, data, and the trained draft models.
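    The core speculative-decoding loop that GliDe and CaPE modify can be sketched as follows: the draft model proposes a few tokens, the target model verifies them left to right, and the longest agreed prefix plus one guaranteed target token is kept. This toy uses greedy acceptance (agreement of argmax tokens) rather than the stochastic accept/reject rule, and both "models" are stand-in functions, not GliDe or CaPE themselves.

```python
# Toy greedy-acceptance speculative decoding step over character tokens.
def speculative_step(prefix, draft_next, target_next, k=4):
    """Draft proposes k tokens; target keeps the agreeing prefix + 1 own token."""
    proposal, ctx = [], list(prefix)
    for _ in range(k):                      # cheap draft pass
        t = draft_next(ctx)
        proposal.append(t)
        ctx.append(t)
    accepted, ctx = [], list(prefix)
    for t in proposal:                      # target verifies left to right
        if target_next(ctx) == t:
            accepted.append(t)
            ctx.append(t)
        else:
            break                           # first disagreement stops acceptance
    accepted.append(target_next(ctx))       # one guaranteed token from the target
    return accepted

# Stand-in models: the target continues the alphabet; the draft errs on its 3rd token.
target = lambda ctx: chr(ord(ctx[-1]) + 1)
draft = lambda ctx: "x" if len(ctx) == 3 else chr(ord(ctx[-1]) + 1)
out = speculative_step(["a"], draft, target)    # → ['b', 'c', 'd']
```

    The speedup comes from verifying all k draft tokens in one target forward pass in practice; GliDe cheapens the draft pass by reusing the target's KV cache, and CaPE widens the proposal so more tokens survive verification.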

    Macro Graph Neural Networks for Online Billion-Scale Recommender Systems

    Predicting Click-Through Rate (CTR) in billion-scale recommender systems poses a long-standing challenge for Graph Neural Networks (GNNs) due to the overwhelming computational complexity involved in aggregating billions of neighbors. To tackle this, GNN-based CTR models usually sample hundreds of neighbors out of the billions to facilitate efficient online recommendations. However, sampling only a small portion of neighbors results in a severe sampling bias and the failure to encompass the full spectrum of user or item behavioral patterns. To address this challenge, we name the conventional user-item recommendation graph as "micro recommendation graph" and introduce a more suitable MAcro Recommendation Graph (MAG) for billion-scale recommendations. MAG resolves the computational complexity problems in the infrastructure by reducing the node count from billions to hundreds. Specifically, MAG groups micro nodes (users and items) with similar behavior patterns to form macro nodes. Subsequently, we introduce tailored Macro Graph Neural Networks (MacGNN) to aggregate information on a macro level and revise the embeddings of macro nodes. MacGNN has already served Taobao's homepage feed for two months, providing recommendations for over one billion users. Extensive offline experiments on three public benchmark datasets and an industrial dataset show that MacGNN significantly outperforms twelve CTR baselines while remaining computationally efficient. Besides, online A/B tests confirm MacGNN's superiority in billion-scale recommender systems. Comment: 11 pages, 7 figures, accepted by The Web Conference 202
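    The micro-to-macro reduction described above can be sketched with a simple grouping-and-averaging step: micro nodes sharing a behavior signature collapse into one macro node carrying an aggregate embedding. The grouping key and embeddings below are toy stand-ins; MacGNN's actual grouping and its GNN layers are more involved.

```python
# Collapsing micro nodes into macro nodes by a behavior signature.
from collections import defaultdict

def build_macro_nodes(embeddings, behavior_key):
    """Group micro-node ids by a behavior signature; average member embeddings."""
    groups = defaultdict(list)
    for node_id, emb in embeddings.items():
        groups[behavior_key(node_id)].append(emb)
    macro = {}
    for key, members in groups.items():
        dim = len(members[0])
        macro[key] = [sum(e[d] for e in members) / len(members) for d in range(dim)]
    return macro

micro = {
    "u1": [1.0, 0.0], "u2": [0.8, 0.2],   # two users with similar behavior
    "u3": [0.0, 1.0],                     # one user with a different pattern
}
key = lambda nid: "electronics" if nid in {"u1", "u2"} else "grocery"
macro = build_macro_nodes(micro, key)
# macro["electronics"] ≈ [0.9, 0.1]; node count drops from 3 micro to 2 macro
```

    At billion scale the same idea shrinks the neighborhood a GNN must aggregate from billions of micro neighbors to hundreds of macro neighbors, removing the need for biased sampling.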

    Compressive Sensing based User Activity Detection and Channel Estimation in Uplink NOMA Systems

    Conventional request-grant based non-orthogonal multiple access (NOMA) incurs tremendous overhead and high latency. To enable grant-free access in NOMA systems, user activity detection (UAD) is essential. In this paper, we investigate compressive sensing (CS) aided UAD, utilizing the property of quasi-time-invariant channel tap delays as prior information. Unlike previous approaches, this does not require any prior knowledge of the number of active users, and is therefore more practical. Two UAD algorithms are proposed, referred to as gradient based time-invariant channel tap delays assisted CS (g-TIDCS) and mean value based TIDCS (m-TIDCS), respectively. They achieve much higher UAD accuracy than previous work at low signal-to-noise ratio (SNR). Based on the UAD results, we also propose a low-complexity CS based channel estimation scheme, which achieves higher accuracy than previous channel estimation approaches.
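    The generic CS recovery that such UAD schemes build on can be illustrated with orthogonal matching pursuit (OMP): the received signal is a sparse combination of user pilot signatures, and the recovered support marks which users are active. The tiny pilot matrix below is hand-made for illustration; g-TIDCS and m-TIDCS add delay-domain prior information on top of generic recovery like this.

```python
# Minimal OMP sketch for grant-free user activity detection.
import numpy as np

def omp(A, y, k):
    """Recover a k-sparse x with y ≈ A @ x; returns (support, coefficients)."""
    residual = y.copy()
    support = []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))   # most correlated column
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef          # explain away chosen users
    return sorted(support), coef

# Toy pilot matrix: columns are the pilot signatures of 4 users.
A = np.array([
    [1.0, 0.0, 0.0, 0.6],
    [0.0, 1.0, 0.0, 0.8],
    [0.0, 0.0, 1.0, 0.0],
])
y = 1.0 * A[:, 0] + 0.5 * A[:, 2]       # only users 0 and 2 are active
detected, gains = omp(A, y, k=2)        # → support [0, 2], gains [1.0, 0.5]
```

    Note the stopping criterion here is a fixed sparsity `k`; the appeal of the proposed TIDCS algorithms is precisely that they avoid assuming the number of active users is known.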