
    Analytical coupled-wave model for photonic crystal quantum cascade lasers

    A coupled-wave model is developed for photonic-crystal quantum cascade lasers. The analytical model provides an efficient analysis of the full three-dimensional, large-area device structure, and its validity is confirmed by simulations and by previous experimental results. Comment: 21 pages and 8 figures.
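
    For orientation only, the textbook one-dimensional coupled-wave (Kogelnik-Shank) equations below illustrate the general form such models take; they are not the paper's equations, whose three-dimensional photonic-crystal treatment couples more partial waves, and sign conventions vary between references.

        \frac{dE_+}{dz} = (g - i\delta)\,E_+ + i\kappa\,E_-
        -\frac{dE_-}{dz} = (g - i\delta)\,E_- + i\kappa\,E_+

    Here E_± are the forward- and backward-travelling field amplitudes, g the net modal gain, δ the detuning from the Bragg condition, and κ the coupling coefficient produced by the periodic modulation.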

    Dimethyl(2-oxo-2-phenylethyl)sulfanium bromide

    Single crystals of the title compound, C10H13OS+·Br−, were obtained from ethyl acetate/ethyl ether after reaction of acetophenone with hydrobromic acid and dimethyl sulfoxide. The carbonyl group is almost coplanar with the neighbouring phenyl ring [O—C—C—C = 178.9 (2)°]. The sulfanium group shows a trigonal-pyramidal geometry at the S atom. The crystal structure is stabilized by C—H⋯Br hydrogen-bonding interactions. Weak π–π interactions link adjacent phenyl rings [centroid–centroid distance = 3.946 (2) Å].

    Function modification of SR-PSOX by point mutations of basic amino acids

    Background: Atherosclerosis (AS) is a common cardiovascular disease. Transformation of macrophages into foam cells through internalization of modified low-density lipoprotein (LDL) via scavenger receptors (SRs) is a key pathogenic process in the onset of AS. It has been demonstrated that SR-PSOX functions either as a scavenger receptor for uptake of atherogenic lipoproteins and bacteria or as a membrane-anchored chemokine mediating adhesion of macrophages and T cells to the endothelium. Therefore, SR-PSOX plays an important role in the development of AS. In this study, the key basic amino acids in the chemokine domain of SR-PSOX required for its functions have been identified. Results: A cell model for studying the functions of SR-PSOX was successfully established. Based on this cell model, a series of mutants of human SR-PSOX were constructed by replacing single basic amino acid residues in the non-conservative region of the chemokine domain (arginine 62, arginine 78, histidine 80, arginine 82, histidine 85, lysine 105, lysine 119, histidine 123) with alanine (designated R62A, R78A, H80A, R82A, H85A, K105A, K119A and H123A, respectively). Functional studies showed that the H80A, H85A and K105A mutants significantly increased the activities of oxLDL uptake and bacterial phagocytosis compared with wild-type SR-PSOX. In addition, we found that mutation of any one of these basic amino acids strongly reduced the adhesive activity of SR-PSOX, implicating a largely non-overlapping set of basic residues in adhesion. Conclusion: Our study demonstrates that basic amino acid residues in the non-conservative region of the chemokine domain of SR-PSOX are critical for its functions. Mutation of H80, H85 and K105 is responsible for increased SR-PSOX binding of oxLDL and bacteria, while all the basic amino acids in this region are important for cell adhesion via SR-PSOX. These findings suggest that mutagenesis of the basic amino acids in the chemokine domain of SR-PSOX may contribute to atherogenesis.

    Duo Recital

    Program listing performers and works performed.

    GPUMemSort: A High Performance Graphics Co-processors Sorting Algorithm for Large Scale In-Memory Data

    In this paper, we present a GPU-based sorting algorithm, GPUMemSort, which achieves high performance in sorting large-scale in-memory data by taking advantage of GPU processors. It consists of two algorithms: an in-core algorithm, which is responsible for sorting data in GPU global memory efficiently, and an out-of-core algorithm, which is responsible for dividing large-scale data into multiple chunks that fit in GPU global memory. GPUMemSort is implemented on NVIDIA's CUDA framework, and some critical and detailed optimization methods are also presented. Tests of different algorithms were run on multiple data sets. The experimental results show that our in-core sort outperforms other comparison-based algorithms and that GPUMemSort is highly effective in sorting large-scale in-memory data.
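
    As a minimal sketch of the out-of-core idea described above (not the authors' CUDA implementation), the Python fragment below splits the data into chunks that fit GPU memory, sorts each chunk on the GPU using the CuPy library as a stand-in for a hand-tuned in-core sort, and merges the sorted chunks on the host; the chunk size and the k-way merge strategy are illustrative assumptions.

        # Illustrative sketch only: chunked GPU sorting with a host-side k-way merge.
        # Assumes the CuPy library is installed; GPUMemSort itself is a CUDA implementation.
        import heapq
        import numpy as np
        import cupy as cp

        def chunked_gpu_sort(data: np.ndarray, chunk_size: int = 1 << 24) -> np.ndarray:
            sorted_chunks = []
            for start in range(0, data.size, chunk_size):
                chunk = cp.asarray(data[start:start + chunk_size])  # host -> GPU copy
                chunk.sort()                                        # "in-core" sort in GPU global memory
                sorted_chunks.append(cp.asnumpy(chunk))             # GPU -> host copy
            # Combine the sorted chunks on the CPU (the "out-of-core" step).
            merged = heapq.merge(*sorted_chunks)
            return np.fromiter(merged, dtype=data.dtype, count=data.size)

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            values = rng.integers(0, 1 << 30, size=1_000_000, dtype=np.int64)
            assert np.array_equal(chunked_gpu_sort(values, chunk_size=1 << 18), np.sort(values))

    The fragment only mirrors the chunk-sort-merge structure; GPUMemSort's own in-core algorithm and optimizations are those described in the paper.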

    Is ChatGPT Good at Search? Investigating Large Language Models as Re-Ranking Agents

    Large Language Models (LLMs) have demonstrated a remarkable ability to generalize zero-shot to various language-related tasks. This paper explores generative LLMs such as ChatGPT and GPT-4 for relevance ranking in Information Retrieval (IR). Surprisingly, our experiments reveal that properly instructed ChatGPT and GPT-4 can deliver results competitive with, and even superior to, supervised methods on popular IR benchmarks. Notably, GPT-4 outperforms monoT5-3B fully fine-tuned on MS MARCO by an average of 2.7 nDCG on TREC datasets, an average of 2.3 nDCG on eight BEIR datasets, and an average of 2.7 nDCG on the ten low-resource languages of Mr.TyDi. Subsequently, we delve into the potential for distilling the ranking capability of ChatGPT into a specialized model. Our small specialized model, trained on 10K ChatGPT-generated data, outperforms monoT5 trained on 400K annotated MS MARCO data on BEIR. The code to reproduce our results is available at www.github.com/sunnweiwei/RankGPT.
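
    As a rough Python sketch of listwise LLM re-ranking in the spirit of the paper (not the authors' RankGPT code), the fragment below numbers the candidate passages, asks a chat model to return them in order of relevance to the query, and reorders them from the reply; the prompt wording, model name, and parsing logic are illustrative assumptions.

        # Hedged illustration of listwise re-ranking with a chat LLM; not the authors' implementation.
        # Uses the official openai Python client (>= 1.0); model name and prompt are assumptions.
        import re
        from openai import OpenAI

        client = OpenAI()  # reads OPENAI_API_KEY from the environment

        def llm_rerank(query: str, passages: list[str], model: str = "gpt-4o-mini") -> list[str]:
            # Present the candidates as a numbered list and request a ranking such as "[2] > [3] > [1]".
            numbered = "\n".join(f"[{i + 1}] {p}" for i, p in enumerate(passages))
            prompt = (
                "Rank the following passages by their relevance to the query.\n"
                f"Query: {query}\n{numbered}\n"
                "Answer only with the passage numbers in descending order of relevance, e.g. [2] > [1] > [3]."
            )
            reply = client.chat.completions.create(
                model=model,
                messages=[{"role": "user", "content": prompt}],
                temperature=0,
            ).choices[0].message.content
            # Parse the ordering; keep each index once and append anything the model omitted.
            order = []
            for m in re.findall(r"\[(\d+)\]", reply):
                i = int(m) - 1
                if 0 <= i < len(passages) and i not in order:
                    order.append(i)
            order += [i for i in range(len(passages)) if i not in order]
            return [passages[i] for i in order]

    A distilled specialized ranker trained on such LLM-produced orderings, as the abstract describes, would replace the API call at inference time.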