
    Reproductive Risk Factors for Breast Cancer: A Case Control Study

    Background: Breast cancer is the second most common cancer among Indian women. Although risk factors are not as prevalent as in western countries, the incidence rate is increasing in India. The study was undertaken to examine various risk factors associated with breast cancer. Methods: A hospital-based, group-matched case-control study was undertaken to identify risk factors. The study consisted of 105 hospitalized cases confirmed on histopathology and 210 group-matched controls without any malignancy, selected from the urban field practice area, Sadar. Bivariate analyses included the odds ratio (OR) and its 95% confidence interval (CI). Results: Earlier age at menarche (≤ 12 years), late age at first full-term delivery, nulliparity, and lack of breast-feeding were significantly associated with the risk of breast cancer in both pre-menopausal and post-menopausal women, while age at menopause at or after 50 years was significantly associated with the risk in post-menopausal women. Conclusions: The study suggests that changes in menstrual and reproductive patterns among women, i.e. early age at menarche and late age at first childbirth, together with some environmental factors in Central India, may have contributed to the increase in breast cancer risk, particularly among younger women.
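    As an aside on the statistics reported above, the sketch below shows how an odds ratio and its 95% confidence interval are computed from a 2x2 exposure table in a case-control design; the counts and the exposure chosen (age at menarche ≤ 12 years) are purely illustrative and are not taken from the study.

```python
import math

def odds_ratio_ci(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls, z=1.96):
    """Odds ratio with a Woolf (log-OR) confidence interval for a 2x2 table."""
    or_ = (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)
    se_log_or = math.sqrt(1 / exposed_cases + 1 / unexposed_cases +
                          1 / exposed_controls + 1 / unexposed_controls)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

# Hypothetical counts for one exposure, NOT data from the study:
# 105 cases and 210 controls split by exposure status.
print(odds_ratio_ci(exposed_cases=40, unexposed_cases=65,
                    exposed_controls=45, unexposed_controls=165))
```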

    A methodology pruning the search space of six compiler transformations by addressing them together as one problem and by exploiting the hardware architecture details

    Today’s compilers have a plethora of optimization transformations to choose from, and the choice, order, and parameters of these transformations have a large impact on performance. Choosing the correct order and parameters of optimizations has been a long-standing problem in compilation research and remains unsolved: optimizing the sub-problems separately gives a different schedule/binary for each sub-problem, and these schedules cannot coexist, as refining one degrades the others. Researchers have tried to solve this problem with iterative compilation techniques, but the search space is so large that it cannot be searched even by using modern supercomputers. Moreover, compiler transformations do not take hardware architecture details and data reuse into account in an efficient way. In this paper, a new iterative compilation methodology is presented which reduces the search space of six compiler transformations by addressing the above problems; the search space is reduced by many orders of magnitude, so an efficient solution can now be found. The transformations are the following: loop tiling (including the number of levels of tiling), loop unroll, register allocation, scalar replacement, loop interchange, and data array layouts. The search space is reduced (a) by addressing the aforementioned transformations together as one problem and not separately, and (b) by taking into account the custom hardware architecture details (e.g., cache size and associativity) and algorithm characteristics (e.g., data reuse). The proposed methodology has been evaluated against iterative compilation and the gcc/icc compilers, on both embedded and general-purpose processors; it achieves significant performance gains at many orders of magnitude lower compilation time.
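    To make the pruning idea concrete, here is a minimal sketch of how joint, hardware-aware constraints shrink a combined search space of tile sizes and unroll factors. It illustrates the general principle only, not the paper's methodology; the cache size, element size, candidate values, and working-set estimate are all assumed.

```python
from itertools import product

CACHE_BYTES = 32 * 1024   # assumed L1 data cache size
ELEM_BYTES = 8            # assumed double-precision elements

tile_sizes = [8, 16, 32, 64, 128, 256]
unroll_factors = [1, 2, 4, 8]

def fits_in_cache(ti, tj, tk):
    # Rough working-set estimate for a tiled loop nest touching three arrays.
    working_set = (ti * tk + tk * tj + ti * tj) * ELEM_BYTES
    return working_set <= CACHE_BYTES

# Full cross-product of parameter values versus the subset that satisfies the
# joint constraints (tiles fit in cache, unroll factor bounded by the tile size).
full_space = list(product(tile_sizes, tile_sizes, tile_sizes, unroll_factors))
pruned = [(ti, tj, tk, u) for ti, tj, tk, u in full_space
          if fits_in_cache(ti, tj, tk) and u <= ti]

print(f"full space: {len(full_space)} points, pruned: {len(pruned)} points")
```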

    Augmenting the 6-3-5 method with design information

    This paper describes a comparative study between the 6-3-5 Method and the ICR Grid. The ICR Grid is an evolved variant of 6-3-5 intended to better integrate information into the concept generation process. Unlike a conventional 6-3-5 process, where participants continually sketch concepts, the ICR Grid (the name derived from its Inform, Create, Reflect activities and structured, grid-like output) additionally requires participants to undertake information search tasks, use specific information items for concept development, and reflect on the merit of concepts as the session progresses. The results indicate that although the quantity of concepts was lower, the use of information had a positive effect in a number of areas, principally the quality and variety of output. Although grounded in the area of product development, this research is applicable to any organisation undertaking idea generation and problem solving. As well as providing insights on the transference of information to concepts, it holds additional interest for studies on the composition and use of digital libraries.

    Experimental Investigation on the Operation Performance of a Liquid Desiccant Air-conditioning System

    A large share of energy consumption is taken by air-conditioning systems, which worsens the electricity load of the power network. Therefore, more and more scholars are paying attention to research on new types of air-conditioning systems that are energy-saving and environment-friendly. A liquid desiccant air-conditioning system is among them, as it has a tremendous ability for power storage and low requirements for heat resources. Low-temperature heat, such as excess heat, waste heat, and solar energy, is suitable for the liquid desiccant air-conditioning system. The feasibility and economic efficiency of the system are studied in this experimental research. The results show that when the regeneration temperature is about 80 °C, the thermodynamic coefficient of the system is about 0.6, the supply air temperature remains stable at 21 °C, and the air-conditioning system can meet human comfort levels.
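    Assuming the reported thermodynamic coefficient is the thermal coefficient of performance (cooling output divided by regeneration heat input), the small sketch below reproduces the order of magnitude of the 0.6 figure; the 12 kW and 20 kW operating point is hypothetical, not measured data from the experiment.

```python
def thermal_cop(cooling_capacity_kw, regeneration_heat_kw):
    """Thermal COP = useful cooling delivered / heat supplied for regeneration."""
    return cooling_capacity_kw / regeneration_heat_kw

# Hypothetical operating point: 12 kW of cooling for 20 kW of ~80 °C regeneration
# heat gives a thermal COP of 0.6, matching the order of magnitude reported.
print(thermal_cop(12.0, 20.0))  # -> 0.6
```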

    A comparison of transgenic rodent mutation and in vivo comet assay responses for 91 chemicals.

    A database of 91 chemicals with published data from both transgenic rodent mutation (TGR) and rodent comet assays has been compiled. The objective was to compare the sensitivity of the two assays for detecting genotoxicity. Critical aspects of study design and results were tabulated for each dataset. There were fewer datasets from rats than mice, particularly for the TGR assay, and therefore results from both species were combined for further analysis. TGR and comet responses were compared in liver and bone marrow (the most commonly studied tissues), and in stomach and colon evaluated either separately or in combination with other GI tract segments. Overall positive, negative, or equivocal test results were assessed for each chemical across the tissues examined in the TGR and comet assays using two approaches: 1) overall calls based on weight of evidence (WoE) and expert judgement, and 2) curation of the data based on a priori acceptability criteria prior to deriving final tissue-specific calls. Since the database contains a high prevalence of positive results, overall agreement between the assays was determined using statistics adjusted for prevalence (AC1 and PABAK). These coefficients showed fair or moderate to good agreement for liver and the GI tract (predominantly stomach and colon data) using WoE, reduced agreement for stomach and colon evaluated separately using data curation, and poor or no agreement for bone marrow using both the WoE and data curation approaches. Confidence in these results is higher for liver than for the other tissues, for which there were fewer data. Our analysis finds that the comet and TGR assays generally identify the same compounds (mainly potent mutagens) as genotoxic in liver, stomach, and colon, but not in bone marrow. However, the current database content precluded drawing assay concordance conclusions for weak mutagens and non-DNA-reactive chemicals.
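    For readers unfamiliar with the prevalence-adjusted statistics named above, the sketch below computes PABAK and Gwet's AC1 for two sets of binary calls (one per assay); the ten positive/negative calls are invented for illustration and do not come from the database.

```python
def pabak(calls_a, calls_b):
    """PABAK = 2 * observed agreement - 1, for two binary raters."""
    p_o = sum(a == b for a, b in zip(calls_a, calls_b)) / len(calls_a)
    return 2 * p_o - 1

def gwet_ac1(calls_a, calls_b):
    """Gwet's AC1 with chance agreement p_e = 2*pi*(1-pi), pi = mean positive rate."""
    n = len(calls_a)
    p_o = sum(a == b for a, b in zip(calls_a, calls_b)) / n
    pi = (sum(calls_a) + sum(calls_b)) / (2 * n)
    p_e = 2 * pi * (1 - pi)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical positive(1)/negative(0) calls for ten chemicals in one tissue.
tgr =   [1, 1, 1, 0, 1, 1, 1, 0, 1, 1]
comet = [1, 1, 1, 0, 1, 0, 1, 0, 1, 1]
print(pabak(tgr, comet), gwet_ac1(tgr, comet))
```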

    A high-performance matrix-matrix multiplication methodology for CPU and GPU architectures

    Current compilers cannot generate code that can compete with hand-tuned code in efficiency, even for a simple kernel like matrix–matrix multiplication (MMM). A key step in program optimization is the estimation of optimal values for parameters such as tile sizes and the number of levels of tiling. Selecting the scheduling parameter values is a very difficult and time-consuming task, since the parameter values depend on each other; this is why they are usually found by using searching methods and empirical techniques. To overcome this problem, the scheduling sub-problems must be optimized together, as one problem and not separately. In this paper, an MMM methodology is presented where the optimum scheduling parameters are found by decreasing the search space theoretically, while the major scheduling sub-problems are addressed together as one problem and not separately, according to the hardware architecture parameters and input size; for different hardware architecture parameters and/or input sizes, a different implementation is produced. This is achieved by fully exploiting the software characteristics (e.g., data reuse) and hardware architecture parameters (e.g., data cache sizes and associativities), giving high-quality solutions and a smaller search space. This methodology applies to a wide range of CPU and GPU architectures.
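    As background to the tiling parameter discussed above, the following sketch shows a straightforwardly tiled MMM in which the tile size T bounds the working set of each block; the fixed T = 64 is an assumed value, not one selected by the paper's methodology.

```python
import numpy as np

def tiled_mmm(A, B, T=64):
    """Blocked matrix-matrix multiply; each tile's working set can stay cache-resident."""
    n, k = A.shape
    _, m = B.shape
    C = np.zeros((n, m))
    for ii in range(0, n, T):
        for kk in range(0, k, T):
            for jj in range(0, m, T):
                C[ii:ii+T, jj:jj+T] += A[ii:ii+T, kk:kk+T] @ B[kk:kk+T, jj:jj+T]
    return C

A = np.random.rand(256, 256)
B = np.random.rand(256, 256)
assert np.allclose(tiled_mmm(A, B), A @ B)  # same result as the untiled product
```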

    Supernova 2007bi as a pair-instability explosion

    Stars with initial masses 10 M_{solar} < M_{initial} < 100 M_{solar} fuse progressively heavier elements in their centres, up to inert iron. The core then gravitationally collapses to a neutron star or a black hole, leading to an explosion -- an iron-core-collapse supernova (SN). In contrast, extremely massive stars (M_{initial} > 140 M_{solar}), if such exist, have oxygen cores which exceed M_{core} = 50 M_{solar}. There, high temperatures are reached at relatively low densities. Conversion of energetic, pressure-supporting photons into electron-positron pairs occurs prior to oxygen ignition, and leads to a violent contraction that triggers a catastrophic nuclear explosion. Tremendous energies (>~ 10^{52} erg) are released, completely unbinding the star in a pair-instability SN (PISN), with no compact remnant. Transitional objects with 100 M_{solar} < M_{initial} < 140 M_{solar}, which end up as iron-core-collapse supernovae following violent mass ejections, perhaps due to short instances of the pair instability, may have been identified. However, genuine PISNe, perhaps common in the early Universe, have not been observed to date. Here, we present our discovery of SN 2007bi, a luminous, slowly evolving supernova located within a dwarf galaxy (~1% the size of the Milky Way). We measure the exploding core mass to be likely ~100 M_{solar}, in which case theory unambiguously predicts a PISN outcome. We show that >3 M_{solar} of radioactive 56Ni were synthesized, and that our observations are well fit by PISN models. A PISN explosion in the local Universe indicates that nearby dwarf galaxies probably host extremely massive stars, above the apparent Galactic limit, perhaps resulting from star formation processes similar to those that created the first stars in the Universe. Comment: Accepted version of the paper appearing in Nature, 462, 624 (2009), including all supplementary information.
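    As a rough illustration of how a 56Ni mass translates into the luminosity that powers a supernova's radioactive tail, the sketch below evaluates the standard 56Ni/56Co decay-heating relation; it assumes full gamma-ray trapping and an arbitrary epoch of 100 days, and is not the PISN modelling used in the paper.

```python
import math

MSUN_G = 1.989e33   # grams per solar mass
EPS_NI = 3.9e10     # erg s^-1 g^-1, 56Ni decay heating rate
EPS_CO = 6.8e9      # erg s^-1 g^-1, 56Co decay heating rate
TAU_NI = 8.8        # days, 56Ni decay timescale
TAU_CO = 111.3      # days, 56Co decay timescale

def decay_luminosity(m_ni_msun, t_days):
    """Radioactive heating luminosity from an initial 56Ni mass, full trapping assumed."""
    m = m_ni_msun * MSUN_G
    return m * ((EPS_NI - EPS_CO) * math.exp(-t_days / TAU_NI)
                + EPS_CO * math.exp(-t_days / TAU_CO))

# ~3 Msun of 56Ni evaluated 100 days after explosion (illustrative epoch).
print(f"{decay_luminosity(3.0, 100.0):.2e} erg/s")
```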

    Histone deacetylase adaptation in single ventricle heart disease and a young animal model of right ventricular hypertrophy.

    Background: Histone deacetylase (HDAC) inhibitors are promising therapeutics for various forms of cardiac diseases. The purpose of this study was to assess cardiac HDAC catalytic activity and expression in children with single ventricle (SV) heart disease of right ventricular morphology, as well as in a rodent model of right ventricular hypertrophy (RVH). Methods: Homogenates of right ventricle (RV) explants from non-failing controls and children born with a SV were assayed for HDAC catalytic activity and HDAC isoform expression. Postnatal 1-day-old rat pups were placed in hypoxic conditions, and echocardiographic analysis, gene expression, HDAC catalytic activity, and isoform expression studies of the RV were performed. Results: Class I, IIa, and IIb HDAC catalytic activity and protein expression were elevated in the hearts of children born with a SV. Hypoxic neonatal rats demonstrated RVH, abnormal gene expression, elevated class I and class IIb HDAC catalytic activity, and protein expression in the RV compared with those in the control. Conclusions: These data suggest that myocardial HDAC adaptations occur in the SV heart and could represent a novel therapeutic target. Although further characterization of the hypoxic neonatal rat is needed, this animal model may be suitable for preclinical investigations of pediatric RV disease and could serve as a useful model for future mechanistic studies.

    PTF11rka: an interacting supernova at the crossroads of stripped-envelope and H-poor superluminous stellar core collapses

    The hydrogen-poor supernova PTF11rka (z = 0.0744), reported by the Palomar Transient Factory, was observed with various telescopes starting a few days after the estimated explosion time of 2011 Dec. 5 UT and up to 432 rest-frame days thereafter. The rising part of the light curve was monitored only in the R_PTF filter band, and maximum in this band was reached ~30 rest-frame days after the estimated explosion time. The light curve and spectra of PTF11rka are consistent with the core-collapse explosion of a ~10 Msun carbon-oxygen core evolved from a progenitor of main-sequence mass 25--40 Msun, which liberated a kinetic energy (KE) ~ 4 x 10^{51} erg, expelled ~8 Msun of ejecta (Mej), and synthesised ~0.5 Msun of 56Ni. The photospheric spectra of PTF11rka are characterised by narrow absorption lines that point to suppression of the highest ejecta velocities, >~15,000 km/s. This would be expected if the ejecta impacted a dense, clumpy circumstellar medium. This in turn caused them to lose a fraction of their energy (~5 x 10^50 erg), less than 2% of which was converted into radiation that sustained the light curve before maximum brightness. This is reminiscent of the superluminous SN 2007bi, the light-curve shape and spectra of which are very similar to those of PTF11rka, although the latter is a factor of 10 less luminous and evolves faster in time. PTF11rka is in fact more similar to gamma-ray burst supernovae (GRB-SNe) in luminosity, although it has a lower energy and a lower KE/Mej ratio.
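    As a quick consistency check on the quantities quoted above, the sketch below derives a characteristic ejecta velocity from the stated KE and Mej via v ≈ sqrt(2 KE / Mej); this simple uniform-ejecta scaling is an assumption for illustration, not the paper's modelling, and it lands well below the ~15,000 km/s suppression threshold.

```python
import math

MSUN_G = 1.989e33    # grams per solar mass
KE = 4e51            # erg, kinetic energy quoted in the abstract
M_EJ = 8 * MSUN_G    # g, ejecta mass quoted in the abstract

# Characteristic velocity of uniformly expanding ejecta with this KE and mass.
v_cm_s = math.sqrt(2 * KE / M_EJ)
print(f"characteristic velocity ~ {v_cm_s / 1e5:.0f} km/s")  # roughly 7,000 km/s
```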

    Lack of the Long Pentraxin PTX3 Promotes Autoimmune Lung Disease but not Glomerulonephritis in Murine Systemic Lupus Erythematosus

    The long pentraxin PTX3 has multiple roles in innate immunity. For example, PTX3 regulates C1q binding to pathogens and dead cells and regulates their uptake by phagocytes. It also inhibits P-selectin-mediated recruitment of leukocytes. Both of these mechanisms are known to be involved in autoimmunity and autoimmune tissue injury, e.g. in systemic lupus erythematosus, but a contribution of PTX3 has remained hypothetical. To evaluate a potential immunoregulatory role of PTX3 in autoimmunity, we crossed Ptx3-deficient mice with Fas-deficient (lpr) C57BL/6 (B6) mice with mild lupus-like autoimmunity. PTX3 was found to be increasingly expressed in kidneys and lungs of B6lpr mice along disease progression. Lack of PTX3 impaired the phagocytic uptake of apoptotic T cells into peritoneal macrophages and selectively expanded CD4/CD8 double-negative T cells, while other immune cell subsets and lupus autoantibody production remained unaffected. Lack of PTX3 also aggravated autoimmune lung disease, i.e. peribronchial and perivascular CD3+ T cell and macrophage infiltrates, of B6lpr mice. In contrast, histomorphological and functional parameters of lupus nephritis remained unaffected by the Ptx3 genotype. Together, these findings show that PTX3 specifically suppresses the autoimmune lung disease that is associated with systemic lupus erythematosus. Vice versa, loss-of-function mutations in the Ptx3 gene might represent a genetic risk factor for pulmonary (but not renal) manifestations of systemic lupus or other autoimmune diseases.