
    Risk Stratification Before and During Treatment in Newly Diagnosed Multiple Myeloma: From Clinical Trials to the Real-World Setting

    Multiple Myeloma (MM) is a hematologic malignancy characterized by wide clinical and biological heterogeneity leading to different patient outcomes. Various prognostic tools to stratify newly diagnosed (ND)MM patients into different risk groups have been proposed. At baseline, the standard-of-care prognostic score is the Revised International Staging System (R-ISS), which stratifies patients according to widely available serum markers (i.e., albumin, β2-microglobulin, lactate dehydrogenase) and high-risk cytogenetic abnormalities detected by fluorescence in situ hybridization. Though this score clearly identifies a low-risk and a high-risk population, the majority of patients are categorized as “intermediate risk”. Although new prognostic factors identified through molecular assays (e.g., gene expression profiling, next-generation sequencing) are now available and may improve risk stratification, most of them require specialized centers and bioinformatic expertise, which may preclude their broad application in the real-world setting. In recent years, new tools have been developed to monitor response and measurable residual disease (MRD) with very high sensitivity after the start of treatment. MRD analyses both inside and outside the bone marrow have a strong prognostic impact, and the achievement of MRD negativity may counterbalance the high-risk behavior identified at baseline. All these techniques have been developed in clinical trials; however, their efficient application in real-world clinical practice and their potential role in guiding treatment decision-making are still open issues. This mini review covers currently known prognostic factors identified before and during first-line treatment, with a particular focus on their potential applications in real-world clinical practice.
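The R-ISS described in this abstract is, in essence, a short decision rule over three serum markers and the FISH result. A minimal sketch of that rule, assuming the published R-ISS thresholds (β2-microglobulin in mg/L, albumin in g/dL); the function and parameter names are illustrative, not from the paper:

```python
def r_iss_stage(b2m, albumin, ldh_elevated, high_risk_fish):
    """Revised ISS stage (1, 2 or 3) from baseline markers.

    b2m            : serum beta-2-microglobulin in mg/L
    albumin        : serum albumin in g/dL
    ldh_elevated   : LDH above the upper limit of normal
    high_risk_fish : del(17p), t(4;14) or t(14;16) detected by FISH
    """
    # Standard ISS stage from the two serum markers
    if b2m < 3.5 and albumin >= 3.5:
        iss = 1
    elif b2m >= 5.5:
        iss = 3
    else:
        iss = 2

    # R-ISS refines ISS with LDH and cytogenetics
    if iss == 1 and not ldh_elevated and not high_risk_fish:
        return 1  # R-ISS I: low risk
    if iss == 3 and (ldh_elevated or high_risk_fish):
        return 3  # R-ISS III: high risk
    return 2      # R-ISS II: everything else

print(r_iss_stage(2.0, 4.0, ldh_elevated=False, high_risk_fish=False))  # stage 1
```

Note how most marker combinations fall through to stage 2, mirroring the abstract's point that the majority of patients end up in the “intermediate risk” group.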

    An investigation into the use of CALNN capped gold nanoparticles for improving microwave heating

    The use of microwaves for both therapeutic and diagnostic applications has become an accepted alternative in the clinic. For diagnostics, gold (Au) nanoparticles have been used for imaging tumour vasculature and have also served as potential diagnostic markers for cancer. [1] In high-frequency therapeutic applications, two different treatments exist: hyperthermia and ablation. [2] In hyperthermia, the tumour tissue is heated to supra-physiological temperatures, making it more susceptible to traditional treatment methods such as chemotherapy and radiotherapy. This type of treatment can be administered externally. However, it remains a challenge to focus the heating on the particular areas that need to be treated while avoiding unwanted hotspots. To date, numerous methods have been used to focus the heat from different antennas. A novel technique under investigation is the use of nanoparticles to improve focusing and thus achieve a better localised heating effect. A previous study by Cardinal et al. [3] showed that, at RF frequencies, remarkable improvements resulted from using Au nanoparticles. In this work, the use of CALNN peptide-capped Au nanoparticles for the focusing of microwaves at 2.45 GHz is investigated. The CALNN-capped Au nanoparticles were prepared as described elsewhere. [4] Au nanoparticles were added to tissue-mimicking solutions (such as muscle, liver or fat) to compare their dielectric properties with those of the control (without Au nanoparticles). The frequency range investigated was from 400 MHz to 20 GHz. During this study, various concentrations, particle sizes and shapes of Au nanoparticles were considered. The study also investigated the heating rates of the pseudo-biological tissue samples and how these varied with the addition of the nanoparticles. The outcome of this study will determine the viability of using CALNN-capped Au nanoparticles to assist in the focusing of microwave radiation during microwave hyperthermia.

    Mental and behavioral disorders due to substance abuse and perinatal outcomes: A study based on linked population data in New South Wales, Australia

    Background: The effects of mental and behavioral disorders (MBD) due to substance use during peri-conception and pregnancy on perinatal outcomes are unclear. The adverse perinatal outcomes of primiparous mothers admitted to hospital with MBD due to substance use before and/or during pregnancy were investigated. Method: This study linked birth and hospital records in NSW, Australia. Subjects included primiparous mothers admitted to hospital for MBD due to use of alcohol, opioids or cannabinoids during peri-conception and pregnancy. Results: There were 304 primiparous mothers admitted to hospital for MBD due to alcohol use (MBDA), 306 for MBD due to opioid use (MBDO) and 497 for MBD due to cannabinoid use (MBDC) between the start of the 12-month peri-conception period and the end of pregnancy. Primiparous mothers admitted to hospital for MBDA during pregnancy or during both peri-conception and pregnancy were significantly more likely to give birth to a baby of low birthweight (AOR = 4.03, 95%CI: 1.97-8.24 for pregnancy; AOR = 9.21, 95%CI: 3.76-22.57 for both periods); preterm birth (AOR = 3.26, 95%CI: 1.52-6.97 for pregnancy; AOR = 4.06, 95%CI: 1.50-11.01 for both periods) and admission to SCN or NICU (AOR = 2.42, 95%CI: 1.31-4.49 for pregnancy; AOR = 4.03, 95%CI: 1.72-9.44 for both periods). Primiparous mothers admitted to hospital for MBDO, MBDC or a combined diagnosis were almost three times as likely to give birth to preterm babies compared to mothers without hospital admissions for psychiatric or substance use disorders. Babies whose mothers were admitted to hospital with MBDO before and/or during pregnancy were six times more likely to be admitted to SCN or NICU (AOR = 6.29, 95%CI: 4.62-8.57). Conclusion: Consumption of alcohol, opioids or cannabinoids during peri-conception or pregnancy significantly increased the risk of adverse perinatal outcomes. © 2014 by the authors; licensee MDPI, Basel, Switzerland.

    Multilevel Structured Low-Density Parity-Check Codes

    Low-Density Parity-Check (LDPC) codes are typically characterized by a relatively high-complexity description, since a considerable amount of memory is required to store their code description, which can be represented either by the connections of the edges in their Tanner graph or by the non-zero entries in their parity-check matrix (PCM). This problem becomes more pronounced for pseudo-random LDPC codes, where literally every non-zero entry of the PCM has to be enumerated and stored in a look-up table. They therefore become inadequate for employment in memory-constrained transceivers. Motivated by this, we propose a novel family of structured LDPC codes, termed Multilevel Structured (MLS) LDPC codes, which benefit from reduced storage requirements and hardware-friendly implementations, as well as from low-complexity encoding and decoding. Our simulation results demonstrate that these advantages accrue without any compromise in the attainable Bit Error Ratio (BER) performance, when compared to previously proposed, more complex counterparts of the same code length. In particular, we characterize a half-rate quasi-cyclic (QC) MLS LDPC code having a block length of 8064 that can be uniquely and unambiguously described by as few as 144 edges, while exhibiting a BER performance over both Additive White Gaussian Noise (AWGN) and uncorrelated Rayleigh (UR) channels identical to that of a pseudo-random construction, which requires the enumeration of a significantly higher number of 24,192 edges.
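The storage saving that quasi-cyclic structure provides comes from describing the PCM by a small base matrix of circulant shift exponents and expanding it on demand, instead of enumerating every edge. A minimal sketch of that expansion, using a hypothetical toy base matrix (the 8064-bit code in the abstract would use a far larger base matrix and lift size; the names here are illustrative):

```python
def circulant(shift, z):
    # z-by-z identity matrix cyclically right-shifted by `shift`;
    # a shift of -1 conventionally denotes the all-zero block
    if shift < 0:
        return [[0] * z for _ in range(z)]
    return [[1 if (c - r) % z == shift else 0 for c in range(z)]
            for r in range(z)]

def expand(base, z):
    # Expand a base matrix of shift exponents into the full binary PCM
    H = []
    for block_row in base:
        blocks = [circulant(s, z) for s in block_row]
        for r in range(z):
            H.append([v for b in blocks for v in b[r]])
    return H

# Hypothetical toy base matrix: 2 block-rows x 4 block-columns, lift size z = 4
base = [[0, 1, -1, 2],
        [3, -1, 0, 1]]
H = expand(base, 4)  # full PCM is 8 x 16 with 24 non-zero entries
```

Here only the six non-negative shift exponents need storing, while the expanded PCM contains 24 edges; the same principle is what allows the 8064-bit code above to be described by 144 edges rather than 24,192.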

    Pilot symbol assisted coding


    Failure Under Stress: The Effect of the Exotic Herbivore Adelges tsugae on Biomechanics of Tsuga canadensis

    Background and Aims Exotic herbivores that lack a coevolutionary history with their host plants can benefit from poorly adapted host defences, potentially leading to rapid population growth of the herbivore and severe damage to its plant hosts. The hemlock woolly adelgid (Adelges tsugae) is an exotic hemipteran that feeds on the long-lived conifer eastern hemlock (Tsuga canadensis), causing rapid mortality of infested trees. While the mechanism of this mortality is unknown, evidence indicates that A. tsugae feeding causes a hypersensitive response and alters wood anatomy. This study investigated the effect of A. tsugae feeding on biomechanical properties at different spatial scales: needles, twigs and branches. Methods Uninfested and A. tsugae-infested samples were collected from a common garden experiment as well as from naturally infested urban and rural field sites. Tension and flexure mechanical tests were used to quantify biomechanical properties of the different tissues. In tissues that showed a significant effect of herbivory, the potential contributions of lignin and tissue density on the results were quantified. Key Results Adelges tsugae infestation decreased the abscission strength, but not flexibility, of needles. A. tsugae feeding also decreased mechanical strength and flexibility in currently attacked twigs, but this effect disappeared in older, previously attacked branches. Lignin and twig tissue density contributed to differences in mechanical strength but were not affected by insect treatment. Conclusions Decreased strength and flexibility in twigs, along with decreased needle strength, suggest that infested trees experience resource stress. Altered growth patterns and cell wall chemistry probably contribute to these mechanical effects. Consistent site effects emphasize the role of environmental variation in mechanical traits. 
The mechanical changes measured here may increase susceptibility to abiotic physical stressors in hemlocks colonized by A. tsugae. Thus, the interaction between herbivory and physical stresses is probably accelerating the decline of eastern hemlock as the hemlock woolly adelgid continues to expand its range.

    Is platelet inhibition due to thienopyridines increased in elderly patients, in patients with previous stroke and patients with low body weight as a possible explanation of an increased bleeding risk?

    Background The TRITON-TIMI 38 study identified three subgroups of patients with a higher risk of bleeding during treatment with the thienopyridine prasugrel: patients with a history of stroke or transient ischaemic attack (TIA), patients ≥75 years and patients with a body weight <60 kg. However, the underlying pathobiology leading to this increased bleeding risk remains to be elucidated. The higher bleeding rate may be due to a stronger prasugrel-induced inhibition of platelet aggregation in these subgroups. The aim of the present study was to determine whether on-treatment platelet reactivity is lower in these risk subgroups than in other patients, in a large cohort on the thienopyridine clopidogrel undergoing elective coronary stenting. Methods A total of 1069 consecutive patients were enrolled. On-clopidogrel platelet reactivity was measured in parallel by light transmittance aggregometry, the VerifyNow® P2Y12 assay and the PFA-100 collagen/ADP cartridge. Results Fourteen patients (1.5%) had a prior history of stroke or TIA, 138 patients (14.5%) were older than 75 years and 30 patients (3.2%) had a body weight <60 kg. Age ≥75 years and a history of stroke were independent predictors of higher on-treatment platelet reactivity. In contrast, a body weight <60 kg was significantly associated with lower on-treatment platelet reactivity. Conclusion In two high-risk subgroups for bleeding, patients ≥75 years and patients with previous stroke, on-clopidogrel platelet reactivity is increased. In contrast, in patients with a low body weight, on-clopidogrel platelet reactivity is decreased, suggesting that a stronger response to a thienopyridine might only lead to more bleeding in patients with low body weight.