
    Association between methionine sulfoxide and risk of moyamoya disease

    Objective: Methionine sulfoxide (MetO) has been identified as a risk factor for vascular diseases and is considered an important indicator of oxidative stress. However, the effects of MetO and its association with moyamoya disease (MMD) remain unclear. We therefore performed this study to evaluate the association between serum MetO levels and the risk of MMD and its subtypes. Methods: We included 353 consecutive MMD patients and 88 healthy controls (HCs) with complete data from September 2020 to December 2021 in our analyses. Serum MetO levels were quantified using liquid chromatography-mass spectrometry (LC–MS). We evaluated the role of MetO in MMD using logistic regression models, confirmed by receiver operating characteristic (ROC) curves and area under the curve (AUC) values. Results: MetO levels were significantly higher in MMD and its subtypes than in HCs (p < 0.001 for all). After adjusting for traditional risk factors, serum MetO levels remained significantly associated with the risk of MMD and its subtypes (p < 0.001 for all). When MetO levels were divided into low and high groups, the high MetO level was significantly associated with the risk of MMD and its subtypes (p < 0.05 for all). When MetO levels were assessed as quartiles, the third (Q3) and fourth (Q4) quartiles carried a significantly increased risk of MMD compared with the lowest quartile (Q3, OR: 2.323, 95% CI: 1.088–4.959, p = 0.029; Q4, OR: 5.559, 95% CI: 2.088–14.805, p = 0.001). Conclusion: A high level of serum MetO was associated with an increased risk of MMD and its subtypes. Our study offers a novel perspective on the pathogenesis of MMD and suggests potential therapeutic targets.
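
    As a rough illustration of the kind of analysis described above, the sketch below fits a quartile-based logistic regression and computes an AUC for the continuous marker, in Python. The input file, column names, and adjustment covariates are hypothetical; they stand in for the study's actual data, which are not reproduced here.

        # Minimal sketch, assuming a hypothetical cohort table with columns
        # MetO, MMD (0/1), age, sex, and hypertension.
        import pandas as pd
        import statsmodels.formula.api as smf
        from sklearn.metrics import roc_auc_score

        df = pd.read_csv("meto_cohort.csv")  # hypothetical file
        df["MetO_q"] = pd.qcut(df["MetO"], 4, labels=["Q1", "Q2", "Q3", "Q4"])

        # Logistic model: MMD status vs. MetO quartile, adjusted for example covariates;
        # odds ratios are exp(coefficients), with Q1 as the reference quartile.
        model = smf.logit("MMD ~ C(MetO_q, Treatment('Q1')) + age + sex + hypertension",
                          data=df).fit()
        print(model.summary())

        # Discrimination of the continuous marker, reported as ROC AUC.
        auc = roc_auc_score(df["MMD"], df["MetO"])
        print(f"AUC for serum MetO alone: {auc:.3f}")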

    Single-cell atlas reveals different immune environments between stable and vulnerable atherosclerotic plaques

    Introduction: Regardless of the degree of stenosis, vulnerable plaque is an important cause of ischemic stroke and thrombotic complications. Changes in the immune microenvironment within plaques appear to be an important factor affecting plaque characteristics. However, the differences in immune microenvironment between stable and vulnerable plaques remain unknown. Methods: In this study, RNA sequencing was performed on superficial temporal arteries from 5 trauma patients and plaques from 3 atherosclerotic patients to preliminarily identify the key immune response processes in plaques. Mass cytometry (CyTOF) was used to explore differences in immune composition between 9 vulnerable plaques and 12 stable plaques. Finally, immunofluorescence was used to validate the findings of the preceding analyses. Results: More CD86+CD68+ M1 pro-inflammatory macrophages were found in vulnerable plaques, while CD4+ T memory cells were mainly found in stable plaques. In addition, a CD11c+ subset of CD4+ T cells with higher IFN-γ secretion was found within vulnerable plaques. Among two B cell subsets, CD19+CD20− B cells in vulnerable plaques secreted more TNF-α and IL-6, while CD19−CD20+ B cells expressed more PD-1. Conclusion: Our study suggests that M1-like macrophages are the major cell subset affecting plaque stability, while functional B cells may also contribute to plaque stability.
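
    The sketch below gives a minimal, hypothetical illustration of how marker-positivity gating could pick out the cell subsets named above from a per-cell intensity table. The input file, the fixed positivity cutoff, and the plaque_type column are assumptions for illustration, not the authors' CyTOF pipeline.

        # Toy gating sketch on a hypothetical per-cell marker-intensity table.
        import pandas as pd

        cells = pd.read_csv("plaque_cytof_intensities.csv")  # one row per cell

        def pos(marker, cut=1.0):
            # Simple positivity gate; the cutoff of 1.0 is an assumed placeholder.
            return cells[marker] > cut

        m1_macrophages = cells[pos("CD68") & pos("CD86")]              # pro-inflammatory M1
        cd11c_t_cells  = cells[pos("CD3") & pos("CD4") & pos("CD11c")]
        b_cd19pos      = cells[pos("CD19") & ~pos("CD20")]             # CD19+CD20- B cells
        b_cd20pos      = cells[~pos("CD19") & pos("CD20")]             # CD19-CD20+ B cells

        # Compare M1 frequencies between vulnerable and stable plaques.
        m1_freq = (m1_macrophages.groupby("plaque_type").size()
                   / cells.groupby("plaque_type").size())
        print(m1_freq)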

    Prediction of protein assemblies, the next frontier: The CASP14-CAPRI experiment

    We present the results of CAPRI Round 50, the fourth joint CASP-CAPRI protein assembly prediction challenge. The Round comprised a total of twelve targets, including six dimers, three trimers, and three higher-order oligomers. Four of these were easy targets, for which good structural templates were available either for the full assembly or for the main interfaces (of the higher-order oligomers). Eight were difficult targets for which only distantly related templates were found for the individual subunits. Twenty-five CAPRI groups, including eight automatic servers, submitted ~1250 models per target. Twenty groups, including six servers, participated in the CAPRI scoring challenge and submitted ~190 models per target. The accuracy of the predicted models was evaluated using the classical CAPRI criteria. Prediction performance was measured by a weighted scoring scheme that takes into account the number of models of acceptable quality or higher submitted by each group among their five top-ranking models. Compared to the previous CASP-CAPRI challenge, top-performing groups submitted such models for a larger fraction (70–75%) of the targets in this Round, but fewer of these models were of high accuracy. Scorer groups achieved stronger performance, with more groups submitting correct models for 70–80% of the targets or achieving high-accuracy predictions. Servers performed less well in general, except for the MDOCKPP and LZERD servers, which performed on par with human groups. In addition to these results, major advances in methodology are discussed, providing an informative overview of where the prediction of protein assemblies currently stands.

    Funding: Cancer Research UK, Grant/Award Number: FC001003; Changzhou Science and Technology Bureau, Grant/Award Number: CE20200503; Department of Energy and Climate Change, Grant/Award Numbers: DE-AR001213, DE-SC0020400, DE-SC0021303; H2020 European Institute of Innovation and Technology, Grant/Award Numbers: 675728, 777536, 823830; Institut national de recherche en informatique et en automatique (INRIA), Grant/Award Number: Cordi-S; Lietuvos Mokslo Taryba, Grant/Award Numbers: S-MIP-17-60, S-MIP-21-35; Medical Research Council, Grant/Award Number: FC001003; Japan Society for the Promotion of Science KAKENHI, Grant/Award Number: JP19J00950; Ministerio de Ciencia e Innovación, Grant/Award Number: PID2019-110167RB-I00; Narodowe Centrum Nauki, Grant/Award Numbers: UMO-2017/25/B/ST4/01026, UMO-2017/26/M/ST4/00044, UMO-2017/27/B/ST4/00926; National Institute of General Medical Sciences, Grant/Award Numbers: R21GM127952, R35GM118078, RM1135136, T32GM132024; National Institutes of Health, Grant/Award Numbers: R01GM074255, R01GM078221, R01GM093123, R01GM109980, R01GM133840, R01GN123055, R01HL142301, R35GM124952, R35GM136409; National Natural Science Foundation of China, Grant/Award Number: 81603152; National Science Foundation, Grant/Award Numbers: AF1645512, CCF1943008, CMMI1825941, DBI1759277, DBI1759934, DBI1917263, DBI20036350, IIS1763246, MCB1925643; NWO, Grant/Award Number: TOP-PUNT 718.015.001; Wellcome Trust, Grant/Award Number: FC00100
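
    As a hedged illustration of the kind of weighted scoring scheme described above, the short Python sketch below scores a group's five top-ranked models for one target by CAPRI quality tier. The tier weights are invented for the example and are not the official CAPRI values.

        # Count a group's top-5 models by quality tier and apply assumed weights.
        from collections import Counter

        TIER_WEIGHTS = {"high": 3, "medium": 2, "acceptable": 1, "incorrect": 0}  # assumed

        def group_score(top5_qualities):
            """top5_qualities: CAPRI quality labels for a group's five top-ranked models."""
            counts = Counter(top5_qualities)
            return sum(TIER_WEIGHTS[q] * n for q, n in counts.items())

        # Example: one target, one group.
        print(group_score(["medium", "acceptable", "incorrect", "acceptable", "incorrect"]))  # 4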

    Strategies for Datacenters Participating in Demand Response by Two-Stage Decisions

    Modern smart grids have introduced a series of demand response (DR) programs and encourage users to participate in them in order to maintain reliability and efficiency and support sustainable demand-side management. As a large load on the smart grid, a datacenter can be regarded as a potential demand response participant. Encouraging datacenters to participate in demand response programs helps the grid achieve better load balancing, while the datacenter can reduce its own power consumption and thus save electricity costs. In this paper, we design a demand response participation strategy based on two-stage decisions to reduce the total cost of the datacenter while satisfying the DR requirements of the grid. In the first stage, the datacenter decides whether to participate in demand response by predicting the grid's real-time electricity prices, and incentive information is sent to encourage users to join the program and help shave the peak load. In the second stage, the datacenter interacts with its users by allowing them to submit bids through a reverse auction; the datacenter then selects the winning users' tasks, postpones their processing, and compensates those users with rewards. Experimental results show that the proposed strategy helps the datacenter reduce its cost while effectively meeting the demand response requirements of the smart grid.
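
    The sketch below illustrates one plausible reading of the two-stage decision: a price-triggered participation check, followed by a reverse auction over user bids. The price threshold, bid fields, and cheapest-first selection rule are assumptions for illustration, not the paper's exact formulation.

        # Two-stage DR decision sketch with hypothetical numbers.
        from dataclasses import dataclass

        @dataclass
        class Bid:
            user: str
            deferrable_kwh: float   # energy the user agrees to postpone
            asking_reward: float    # payment requested per kWh deferred

        def stage_one_participate(predicted_price, price_threshold=0.12):
            # Stage 1: join the DR event only if the predicted real-time price is high.
            return predicted_price >= price_threshold

        def stage_two_select(bids, required_kwh):
            # Stage 2: reverse auction -- accept the cheapest bids until the DR target is met.
            selected, covered = [], 0.0
            for bid in sorted(bids, key=lambda b: b.asking_reward):
                if covered >= required_kwh:
                    break
                selected.append(bid)
                covered += bid.deferrable_kwh
            return selected, covered

        if stage_one_participate(predicted_price=0.18):
            winners, kwh = stage_two_select(
                [Bid("u1", 40, 0.03), Bid("u2", 25, 0.02), Bid("u3", 60, 0.05)],
                required_kwh=60,
            )
            print([w.user for w in winners], kwh)   # ['u2', 'u1'] 65.0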

    Thermal-Aware Hybrid Workload Management in a Green Datacenter towards Renewable Energy Utilization

    The increase in massive data processing and computing in datacenters in recent years has led to severe energy consumption, a significant carbon footprint, and a negative impact on the environment. A growing number of IT companies operating datacenters are adopting renewable energy as part of their energy supply to offset the consumption of brown energy. In this paper, we focus on a green datacenter with a hybrid energy supply, leverage the time flexibility of workloads in the datacenter, and propose a thermal-aware workload management method to maximize the utilization of renewable energy sources while accounting for the power consumption of both computing and cooling devices. The critical knob of our approach is workload shifting, which schedules delay-tolerant workloads and allocates resources in the datacenter according to the availability of the renewable energy supply and the variation of cooling temperature. To evaluate the performance of the proposed method, we conducted simulation experiments using the CloudSim Plus tool. The results demonstrate that the proposed method can effectively reduce the consumption of brown energy while maximizing the utilization of green energy.
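
    A minimal sketch of the workload-shifting knob, assuming invented hourly renewable and cooling profiles: each deferrable job is greedily placed in the feasible hour that minimizes its brown-energy draw. This is an illustration of the idea, not the paper's scheduler or its CloudSim Plus setup.

        # Greedy thermal-aware placement of deferrable jobs (all numbers illustrative).
        renewable_kw   = [20, 35, 80, 120, 150, 90, 40, 10]          # green power per hour
        cooling_factor = [1.3, 1.2, 1.0, 0.9, 0.9, 1.0, 1.2, 1.4]    # PUE-like penalty per hour

        def schedule(jobs, renewable, cooling):
            """jobs: list of (power_kw, deadline_hour); returns the chosen hour per job."""
            renewable = list(renewable)                  # work on a copy
            placement = []
            for power, deadline in jobs:
                def brown(h):
                    # Brown-energy draw if this job runs in hour h.
                    return max(0.0, power * cooling[h] - renewable[h])
                hour = min(range(deadline + 1), key=brown)
                placement.append(hour)
                renewable[hour] = max(0.0, renewable[hour] - power)
            return placement

        print(schedule([(50, 5), (30, 3), (70, 7)], renewable_kw, cooling_factor))  # e.g. [2, 2, 3]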

    Smart-Grid-Aware Load Regulation of Multiple Datacenters towards the Variable Generation of Renewable Energy

    As renewable and distributed power sources proliferate, many such resources are integrated into the smart grid as a clean energy input. However, since the generation of renewable energy is intermittent and unstable, the smart grid needs to regulate its load to maintain stability after integrating renewable energy sources. At the same time, with the development of cloud computing, large-scale datacenters are becoming potentially controllable loads for the smart grid due to their high energy consumption. In this paper, we propose an approach to dynamically adjust the datacenter load to balance the unstable renewable energy input into the grid, meeting demand response requirements by taking advantage of the variable power consumption of datacenters. We examine scenarios in which one or more datacenters are integrated into the grid and adopt a stochastic algorithm to solve the resulting problem. The experimental results show that dynamic load management of multiple datacenters can help the smart grid reduce losses and thus save operational costs. We also analyze the impact of the flexibility and the delay of datacenter actions, which makes the approach applicable to more general scenarios in realistic environments. Furthermore, considering the impact of the action delay, we employ a forecasting method to predict renewable energy generation in advance, eliminating the extra losses caused by the delay as much as possible. By predicting solar power generation, the improved results show that the proposed method is effective and feasible under both sunny and cloudy/rainy/snowy weather conditions.
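
    As a simplified illustration of nudging flexible datacenter load toward a renewable-generation forecast, the sketch below clamps each hour's load to a flexibility band around its baseline. The forecast values and the 30% flexibility are assumptions for illustration, not the paper's stochastic model or prediction method.

        # Clamp datacenter load toward the forecast within a +/-30% flexibility band.
        forecast_renewable_mw = [12.0, 18.5, 25.0, 22.0, 9.5]   # e.g. hourly solar forecast
        baseline_load_mw      = [15.0, 15.0, 15.0, 15.0, 15.0]  # aggregate datacenter baseline
        flexibility           = 0.30                            # assumed adjustable fraction

        def regulate(baseline, renewable, flex):
            regulated = []
            for base, green in zip(baseline, renewable):
                lo, hi = base * (1 - flex), base * (1 + flex)
                regulated.append(min(max(green, lo), hi))   # clamp target to feasible range
            return regulated

        adjusted  = regulate(baseline_load_mw, forecast_renewable_mw, flexibility)
        imbalance = [round(a - g, 2) for a, g in zip(adjusted, forecast_renewable_mw)]
        print(adjusted)    # [12.0, 18.5, 19.5, 19.5, 10.5]
        print(imbalance)   # residual mismatch the grid still has to absorb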

    Automated Spectrophotometric Determination of Carbonate Ion Concentration in Seawater Using a Portable Syringe Pump Based Analyzer

    Observations of seawater carbonate ion concentrations are critical to assess the ecological effects of ocean acidification. Nevertheless, currently available methods are labor intensive or too complex for field applications. Here, we report the design and performance of the first fully automated portable carbonate ion analyzer. Measurements are based on the reaction of carbonate and chloride ions with Pb(II), followed by quantitative UV spectrophotometric detection of the PbCO3⁰ complex. The core hardware is a syringe pump equipped with a multi-position valve that is controlled by software written in LabVIEW. Measurement precision is 1.1% (n = 13) with a measurement frequency of 12 h−1. The analyzer was used to continuously monitor carbonate ion concentration variations in a 2500 L coral reef tank for five days (test 1), and for shipboard underway and vertical profile analysis during a 13-day cruise (test 2). The analyzer attained a combined standard uncertainty of 3.0%, which meets the Global Ocean Acidification Observing Network's "weather level" goal. Through its use of a syringe pump mechanism for mixing seawater and reagent solution, the analyzer is robust, functionally flexible, and well suited for continuous environmental monitoring under harsh conditions.
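
    For illustration only, the sketch below shows how a blank-corrected absorbance reading could be converted to a carbonate concentration through a linear calibration. The calibration points are invented for the example and do not correspond to the analyzer's actual calibration or measurement equation.

        # Linear inverse calibration: absorbance -> carbonate concentration (values assumed).
        import numpy as np

        calib_conc = np.array([50.0, 100.0, 150.0, 200.0, 250.0])   # standards, umol/kg
        calib_abs  = np.array([0.082, 0.161, 0.244, 0.320, 0.402])  # measured absorbance

        slope, intercept = np.polyfit(calib_abs, calib_conc, 1)

        def carbonate_concentration(absorbance):
            """Convert a blank-corrected absorbance reading to concentration (umol/kg)."""
            return slope * absorbance + intercept

        print(round(carbonate_concentration(0.215), 1))   # concentration for a sample reading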

    Biomimetic scaffolds with programmable pore structures for minimum invasive bone repair

    Due to the complexity of surgery for large-area bone injuries, implanting a large volume of material into the injury site remains a major challenge in orthopedics. To address this difficulty, in this study a series of biomimetic hydroxyapatite/shape-memory composite scaffolds with programmable pore structures were designed and synthesized based on poly(ε-caprolactone) (PCL), polytetrahydrofuran (PTMG), and osteoconductive hydroxyapatite (HA). The obtained scaffolds presented various pore structures, high connectivity, tunable mechanical properties, and excellent shape-memory performance. Moreover, the mineralization activity of the developed scaffolds enhanced the formation of hydroxyapatite, and they showed good biocompatibility in vitro. In vivo experiments showed that the scaffolds could promote the formation of new bone in critical-size cranial defects. These programmable porous scaffold biomaterials show promise for application in bone regeneration.