
    Inferring Regulatory Networks by Combining Perturbation Screens and Steady State Gene Expression Profiles

    Reconstructing transcriptional regulatory networks is an important task in functional genomics. Data from experiments that perturb genes by knockouts or RNA interference contain useful information for this reconstruction problem. However, such data can be limited in size and/or expensive to acquire. On the other hand, observational data of the organism in steady state (e.g. wild type) are more readily available, but their informational content is inadequate for the task at hand. We develop a computational approach that utilizes both data sources for estimating a regulatory network. The proposed approach is based on a three-step algorithm that estimates the underlying directed, but possibly cyclic, network, using both perturbation screens and steady-state gene expression data as input. In the first step, the algorithm determines causal orderings of the genes that are consistent with the perturbation data, by combining an exhaustive search method with a fast heuristic that in turn couples a Monte Carlo technique with a fast search algorithm. In the second step, a regulatory network is estimated for each obtained causal ordering using a penalized-likelihood-based method, while in the third step a consensus network is constructed from the highest-scoring ones. Extensive computational experiments show that the algorithm performs well in reconstructing the underlying network and clearly outperforms competing approaches that rely on a single data source. Further, it is established that the algorithm produces a consistent estimate of the regulatory network. Comment: 24 pages, 4 figures, 6 tables
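The first two steps of the algorithm can be sketched in miniature. The code below is an illustrative toy, not the paper's implementation: it enumerates causal orderings consistent with a hypothetical perturbation-effect matrix (the exhaustive-search component; the paper pairs this with a Monte Carlo heuristic for larger networks) and scores each ordering with a ridge-penalized least-squares fit as a stand-in for the penalized likelihood.

```python
import itertools
import numpy as np

def consistent_orderings(perturb):
    """Enumerate gene orderings consistent with perturbation effects.

    perturb[i, j] = 1 means knocking out gene i changed gene j's
    expression, so i must precede j in any causal ordering.
    """
    n = perturb.shape[0]
    for order in itertools.permutations(range(n)):
        pos = {g: k for k, g in enumerate(order)}
        if all(pos[i] < pos[j]
               for i in range(n) for j in range(n)
               if i != j and perturb[i, j]):
            yield order

def score_ordering(order, expr, lam=0.1):
    """Regress each gene on its predecessors with a ridge penalty
    (a simple surrogate for the penalized likelihood) and return the
    total penalized residual sum of squares (lower is better)."""
    total = 0.0
    for k, g in enumerate(order):
        parents = list(order[:k])
        y = expr[:, g]
        if not parents:
            total += float(y @ y)
            continue
        X = expr[:, parents]
        beta = np.linalg.solve(X.T @ X + lam * np.eye(len(parents)),
                               X.T @ y)
        r = y - X @ beta
        total += float(r @ r) + lam * float(beta @ beta)
    return total

# Toy data: gene 0 regulates gene 1, which regulates gene 2.
rng = np.random.default_rng(0)
x0 = rng.normal(size=200)
x1 = 2.0 * x0 + rng.normal(scale=0.1, size=200)
x2 = -1.5 * x1 + rng.normal(scale=0.1, size=200)
expr = np.column_stack([x0, x1, x2])
perturb = np.array([[0, 1, 1],
                    [0, 0, 1],
                    [0, 0, 0]])

orders = list(consistent_orderings(perturb))
best = min(orders, key=lambda o: score_ordering(o, expr))
print(best)  # (0, 1, 2) is the only ordering consistent with perturb
```

In the full algorithm, many high-scoring orderings would be retained and their estimated networks merged into a consensus; here the perturbation matrix pins down a single ordering.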

    Life cycle assessment of recycling options for automotive Li-ion battery packs

    Ramping up automotive lithium-ion battery (LIB) production volumes creates an imperative need for the establishment of end-of-life treatment chains for spent automotive traction battery packs. Life Cycle Assessment (LCA) is an essential tool for evaluating the environmental performance of such chains and options. This work synthesises publicly available data to expand upon previously reported LCA studies of LIB recycling and holistically model end-of-life treatment chains for spent automotive traction battery packs with lithium nickel cobalt manganese oxide positive electrodes. The study provides an in-depth analysis of unit-process contributions to the environmental benefits and burdens of battery recycling options and integrates these with the battery production impacts to estimate the net environmental benefit achieved by introducing recycling into the value chain. The attributional LCA model accounts for the whole recycling chain, from the point of end-of-life LIB collection to the provision of secondary materials for battery manufacturing. Pyrometallurgical processing of spent automotive traction battery cells is predicted to have a larger Global Warming Potential (GWP), due to its higher energy intensity, while hydrometallurgical processing is shown to be more environmentally beneficial, due to the additional recovery of lithium as hydroxide. The majority of the environmental benefits arise from the recovery of the aluminium and copper fractions of battery packs, with important contributions also arising from the recovery of nickel and cobalt from the battery cells. Overall, the LCA model presented estimates a net benefit in 11 out of 13 environmental impact categories based on the ReCiPe characterisation method, as compared to battery production without recycling. An investigation of the effect of geographic specificity on the combined production and recycling indicates that it is a key source of GWP impact variability and that the more climate-burdening chains offer a significantly higher potential for GWP reductions through battery recycling. The sensitivity analysis carried out shows that impacts related to air quality are higher when recovering lower-grade materials. This study provides a quantitative and replicable inventory model which highlights the significance of the environmental benefits achieved through the establishment of circular automotive battery value chains.
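The net-benefit accounting the abstract describes can be sketched as follows: per impact category, the net impact of a recycling chain is its process burdens minus the credits for avoided primary material production. All figures below are illustrative placeholders, not the study's inventory data, and the category set is truncated for brevity.

```python
# Illustrative recycling-chain burdens per kg of battery pack treated
# (placeholder values, not the study's data).
recycling_burdens = {
    "GWP (kg CO2-eq)": 1.8,            # process energy, reagents, transport
    "Acidification (kg SO2-eq)": 0.020,
    "Particulate matter (kg PM2.5-eq)": 0.004,
}
# Credits for avoided primary production of recovered Al, Cu, Ni, Co, Li
# (placeholder values).
avoided_production = {
    "GWP (kg CO2-eq)": 5.6,
    "Acidification (kg SO2-eq)": 0.012,
    "Particulate matter (kg PM2.5-eq)": 0.006,
}

# Net impact per category: burdens minus avoided-production credits.
# A negative value means recycling yields a net environmental benefit.
net = {cat: recycling_burdens[cat] - avoided_production[cat]
       for cat in recycling_burdens}
beneficial = [cat for cat, v in net.items() if v < 0]
for cat, v in net.items():
    print(f"{cat}: {v:+.3f}")
print(f"net benefit in {len(beneficial)} of {len(net)} categories")
```

The study performs this comparison across 13 ReCiPe categories with full unit-process inventories; the sketch only shows the sign convention by which "net benefit in 11 out of 13 categories" is counted.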

    Cloud Watching: Understanding Attacks Against Cloud-Hosted Services

    Cloud computing has dramatically changed service deployment patterns. In this work, we analyze how attackers identify and target cloud services, in contrast to traditional enterprise networks and network telescopes. Using a diverse set of cloud honeypots in 5 providers and 23 countries, as well as 2 educational networks and 1 network telescope, we analyze how IP address assignment, geography, network, and service-port selection influence which services are targeted in the cloud. We find that scanners that target cloud compute are selective: they avoid scanning networks without legitimate services and they discriminate between geographic regions. Further, attackers mine Internet-service search engines to find exploitable services and, in some cases, they avoid targeting IANA-assigned protocols, causing researchers to misclassify at least 15% of traffic on select ports. Based on our results, we derive recommendations for researchers and operators. Comment: Proceedings of the 2023 ACM Internet Measurement Conference (IMC '23), October 24-26, 2023, Montreal, QC, Canada
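The misclassification finding boils down to a simple point: labeling traffic by IANA port assignment alone mislabels flows whose payload speaks a different protocol. A minimal sketch, with hypothetical flow records rather than the paper's measurement data:

```python
# IANA-assigned services for a few well-known ports.
IANA = {23: "telnet", 445: "smb", 3389: "rdp"}

# Hypothetical (dst_port, protocol observed in payload) flow records.
flows = [
    (23, "telnet"), (23, "http"), (445, "smb"),
    (445, "smb"), (3389, "rdp"), (3389, "http"),
]

# Count flows whose observed protocol differs from the port's
# IANA assignment, i.e. flows a port-only classifier would mislabel.
mismatch = sum(1 for port, proto in flows if IANA.get(port) != proto)
rate = mismatch / len(flows)
print(f"{rate:.0%} of flows would be mislabeled by port alone")  # 33%
```

The paper's 15% figure comes from payload-level validation on select ports; the sketch only illustrates why port-based labeling alone is insufficient.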

    A critical analysis of the use of AI in HRM in luxury hotels

    Purpose: The aim of this study is to analyze the potential and effectiveness of AI-powered technology in recruitment practices in luxury hotels, focusing on the impact of AI on the efficiency and effectiveness of those practices. Research Methods: In-depth semi-structured interviews with fifteen hotel general and HR managers were conducted via Skype, e-mail and telephone in luxury hotels in Greece, the UK and Belgium. Convenience sampling was used, as very few luxury hotels use AI in recruitment and few people were available to participate in this study due to COVID-19. Results and Discussion: The findings suggest that the use of AI must be measured and evaluated in advance by hotels. AI has changed administrative duties in HRM. After launching AI technology in the recruitment process, luxury hotels reported differences in the way they communicate with candidates and better results in recruitment and in finding talent. Many organizations are reluctant to invest in AI due to concerns about their readiness to implement it and the training required to use it. Implications: This paper addresses a research gap on the use of AI in HRM, as most studies focus on customer service. HR managers use AI in the pre-selection recruitment process as it makes the process faster and smoother. It can also provide better results in identifying a larger pool of talent. Asynchronous video interviews and AI-powered games may be used to help future candidates understand the job requirements and assess whether they have the skills or personality to meet the person specifications. Training on the use of AI should be provided to HR managers so that they acquire the necessary technical skills. Future studies may use quantitative techniques to assess the influence of AI recruitment on business efficiency.

    Detection of Sparse Anomalies in High-Dimensional Network Telescope Signals

    Network operators and system administrators are increasingly overwhelmed with incessant cyber-security threats, ranging from malicious network reconnaissance to attacks such as distributed denial of service and data breaches. A large number of these attacks could be prevented if network operators were better equipped with threat intelligence that would allow them to block or throttle nefarious scanning activities. Network telescopes or "darknets" offer a unique window into observing Internet-wide scanners and other malicious entities, and they could offer early warning signals to operators that would be critical for infrastructure protection and/or attack mitigation. A network telescope consists of unused or "dark" IP spaces that serve no users, and solely passively observes any Internet traffic destined to the "telescope sensor" in an attempt to record ubiquitous network scanners, malware that forages for vulnerable devices, and other dubious activities. Hence, monitoring network telescopes for timely detection of coordinated and heavy scanning activities is an important, albeit challenging, task. The challenges mainly arise due to the non-stationarity and the dynamic nature of Internet traffic and, more importantly, the fact that one needs to monitor high-dimensional signals (e.g., all TCP/UDP ports) to search for "sparse" anomalies. We propose statistical methods to address both challenges in an efficient and "online" manner; our work is validated both with synthetic data as well as real-world data from a large network telescope.
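The two challenges the abstract names, non-stationarity and sparse anomalies in a high-dimensional signal, can be illustrated with a simple online baseline. The sketch below is not the paper's method: it keeps an exponentially weighted moving average (EWMA) of each port's mean and variance to absorb drifting traffic, and raises an alarm only when a small ("sparse") set of ports deviates strongly at the same time. All parameter values are illustrative.

```python
import numpy as np

class SparseAnomalyDetector:
    """Online detector for sparse spikes in high-dimensional count
    signals, e.g. per-port packet counts seen by a network telescope."""

    def __init__(self, n_ports, alpha=0.05, z_thresh=8.0,
                 max_sparse=10, warmup=200):
        self.alpha, self.z_thresh = alpha, z_thresh
        self.max_sparse, self.warmup = max_sparse, warmup
        self.t = 0
        self.mean = np.zeros(n_ports)   # EWMA mean per port
        self.var = np.ones(n_ports)     # EWMA variance per port

    def update(self, counts):
        self.t += 1
        z = (counts - self.mean) / np.sqrt(self.var + 1e-9)
        if self.t <= self.warmup:       # learn the baseline first
            flagged = np.array([], dtype=int)
        else:
            flagged = np.flatnonzero(np.abs(z) > self.z_thresh)
        # Update baselines from non-flagged ports only, so spikes do
        # not contaminate the running statistics.
        mask = np.ones(len(counts), dtype=bool)
        mask[flagged] = False
        d = counts[mask] - self.mean[mask]
        self.mean[mask] += self.alpha * d
        self.var[mask] = (1 - self.alpha) * (self.var[mask]
                                             + self.alpha * d ** 2)
        # Alarm only on sparse deviations: a handful of ports spiking,
        # not a broad shift that the baseline should absorb instead.
        alarm = 0 < len(flagged) <= self.max_sparse
        return alarm, flagged

# Toy run: benign Poisson traffic on all 65536 ports, then a
# coordinated burst on three ports (a "sparse" anomaly).
rng = np.random.default_rng(1)
det = SparseAnomalyDetector(n_ports=65536)
for _ in range(200):
    det.update(rng.poisson(5.0, 65536).astype(float))
burst = rng.poisson(5.0, 65536).astype(float)
burst[[22, 23, 445]] += 500.0
alarm, ports = det.update(burst)
print(alarm, sorted(int(p) for p in ports))
```

A per-port EWMA handles slow drift but not structural change; the paper's statistical methods address the high-dimensional, non-stationary setting more carefully, which this baseline only gestures at.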