
    A mechanistic study on the phototoxicity of atorvastatin: singlet oxygen generation by a phenanthrene-like photoproduct

    Atorvastatin calcium (ATV) is one of the most frequently prescribed drugs worldwide. Among the adverse effects observed for this lipid-lowering agent, clinical cases of cutaneous adverse reactions have been reported and associated with photosensitivity disorders. Previous work on ATV photochemistry has shown that exposure to natural sunlight in aqueous solution leads to photoproducts resulting from oxidation of the pyrrole ring and from cyclization to a phenanthrene derivative. Laser flash photolysis of ATV, at both 266 and 308 nm, led to a transient spectrum with two maxima at λ = 360 and λ = 580 nm (τ = 41 μs), which was assigned to the primary intermediate of the stilbene-like photocyclization. On the basis of the absence of a triplet-triplet absorption, the role of the parent drug as a singlet oxygen photosensitizer can be discarded. By contrast, a stable phenanthrene-like photoproduct would be a good candidate to play this role. Laser flash photolysis of this compound showed a triplet-triplet transient absorption at λmax = 460 nm with a lifetime of 26 μs, which was efficiently quenched by oxygen (kq = (3.0 ± 0.2) × 10⁹ M⁻¹ s⁻¹). Its potential to photosensitize formation of singlet oxygen was confirmed by spin trapping experiments, through conversion of TEMP to the stable free radical TEMPO. The photoreactivity of the phenanthrene-like photoproduct was investigated using Trp as a marker. The disappearance of the amino acid fluorescence (λmax = 340 nm) after increasing irradiation times at 355 nm was taken as a measure of photodynamic oxidation. To confirm the involvement of a type II mechanism, the same experiment was also performed in D2O; this resulted in a significant enhancement of the reaction rate. On the basis of the photophysical and photochemical results obtained, the phototoxicity of atorvastatin can be attributed to singlet oxygen formation with the phenanthrene-like photoproduct as a photosensitizer.
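The abstract's triplet lifetime (26 μs) and oxygen quenching constant (kq ≈ 3 × 10⁹ M⁻¹ s⁻¹) are enough for a back-of-the-envelope estimate of how efficiently dissolved oxygen intercepts the photoproduct's triplet state. The sketch below assumes a typical air-saturated aqueous O2 concentration of about 2.7 × 10⁻⁴ M, which is not stated in the abstract:

```python
# Estimate the fraction of photoproduct triplets intercepted by dissolved
# oxygen, using the abstract's values: triplet lifetime tau = 26 us and
# quenching constant kq = 3e9 M^-1 s^-1. The air-saturated aqueous O2
# concentration (~2.7e-4 M) is an assumed, typical literature value.

tau = 26e-6          # triplet lifetime in s (from the abstract)
kq = 3e9             # O2 quenching rate constant in M^-1 s^-1
o2 = 2.7e-4          # assumed dissolved O2 concentration in M

k0 = 1.0 / tau                         # intrinsic triplet decay rate, s^-1
f_quenched = kq * o2 / (k0 + kq * o2)  # Stern-Volmer-type branching ratio
print(f"fraction of triplets quenched by O2: {f_quenched:.2f}")
```

Under these assumptions roughly 95% of the triplets are quenched by oxygen, consistent with the photoproduct acting as an efficient singlet oxygen sensitizer.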

    Exploring virtual reality object perception following sensory-motor interactions with different visuo-haptic collider properties.

    Interacting with the environment often requires the integration of visual and haptic information. Notably, perceiving external objects depends on how our brain binds sensory inputs into a unitary experience. The feedback provided by objects when we interact with them (through our movements) might then influence our perception. In VR, the interaction with an object can be dissociated from the size of the object itself by means of 'colliders' (interactive spaces surrounding the objects). The present study investigates possible after-effects in size discrimination for virtual objects after prolonged interaction characterized by visual and haptic incongruencies. A total of 96 participants took part in this virtual reality study. Participants were distributed into four groups and performed a size discrimination task between two cubes before and after 15 min of a visuomotor task involving interaction with the same virtual cubes. Each group interacted with a different cube whose visual (normal vs. small collider) and haptic (vibration vs. no vibration) features were manipulated. The quality of interaction (number of touches and trials performed) was used as a dependent variable to assess performance in the visuomotor task. To measure bias in size perception, we compared changes in the point of subjective equality (PSE) before and after the task in the four groups. The results showed that a small visual collider decreased manipulation performance, regardless of the presence or absence of the haptic signal. However, a change in PSE was found only in the group exposed to the small visual collider with haptic feedback, leading to an increased perceived cube size. This after-effect was absent in the visual-only incongruency condition, suggesting that haptic information and multisensory integration played a crucial role in inducing perceptual changes. The results are discussed in light of recent findings on visual-haptic integration during multisensory information processing in real and virtual environments.
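The PSE measure the abstract relies on is the 50% point of a psychometric function. A minimal sketch of how such a PSE is typically estimated (not the authors' analysis code; all data below are synthetic):

```python
# Fit a logistic psychometric function to the proportion of "comparison
# judged larger" responses and read off the 50% point (the PSE). The cube
# sizes, trial counts, and true parameters are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, pse, slope):
    # Probability of judging the comparison cube larger than the standard.
    return 1.0 / (1.0 + np.exp(-(x - pse) / slope))

sizes = np.linspace(4.0, 6.0, 9)       # comparison cube sizes (cm), synthetic
true_pse, true_slope = 5.2, 0.25       # assumed values for the simulation
rng = np.random.default_rng(0)
p_larger = logistic(sizes, true_pse, true_slope)
responses = rng.binomial(40, p_larger) / 40   # 40 simulated trials per size

(pse_hat, slope_hat), _ = curve_fit(logistic, sizes, responses, p0=[5.0, 0.3])
print(f"estimated PSE: {pse_hat:.2f} cm")
```

Comparing PSE estimates fitted before and after the visuomotor task, per group, yields the bias measure described in the abstract.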

    Stack-CNN algorithm: A new approach for the detection of space objects

    We present a new trigger algorithm combining a stacking procedure and a convolutional neural network that can be applied to any space object moving linearly, or with a known trajectory, in the field of view of a telescope. This includes the detection of high-velocity fragmentation debris in orbit. A possible implementation is as a trigger system for an orbiting space debris remediation system. The algorithm was initially developed as an offline system for the Multiwavelength Imaging New Instrument for the Extreme Universe Space Observatory (Mini-EUSO) on the International Space Station. We evaluated the performance of the algorithm on simulated data and compared it with that of a more conventional trigger algorithm. The results indicate that this method can recognise signals with a 1% signal-over-background ratio (SBR) on Poissonian random fluctuations with a negligible fake trigger rate. These promising results lead us to consider this technique not only as an online trigger system but also as an offline method for searching for moving signals and characterizing them (speed and direction). More generally, any kind of telescope, on the ground or in space, such as those used for space debris, meteor monitoring, or cosmic-ray science, could benefit from this automated technique. The content of this paper is part of a recent Italian patent proposal submitted by the authors (patent application number: 102021000009845).
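The stacking half of the method can be illustrated with a toy shift-and-stack: shifting each frame back along a candidate velocity before summing makes a moving source add coherently while Poissonian background averages out. This is not the Mini-EUSO pipeline, and all numbers are invented (the injected source here is ~6% of the background, stronger than the 1% SBR quoted above, to keep the demo short):

```python
# Toy shift-and-stack: a source too faint to see in any single frame
# becomes significant once frames are co-added along its trajectory.
import numpy as np

rng = np.random.default_rng(1)
n_frames, size = 100, 128
vx, vy = 1, 0                        # candidate per-frame shift (pixels)

frames = rng.poisson(100.0, (n_frames, size, size)).astype(float)
for t in range(n_frames):            # inject a faint source moving at (vx, vy)
    frames[t, 32 + vy * t, 10 + vx * t] += 6.0

# Shift each frame back along the candidate trajectory, then sum.
stack = np.zeros((size, size))
for t in range(n_frames):
    stack += np.roll(frames[t], shift=(-vy * t, -vx * t), axis=(0, 1))

snr_single = 6.0 / np.sqrt(100.0)    # per-frame significance of the source
snr_stack = (stack[32, 10] - n_frames * 100.0) / np.sqrt(n_frames * 100.0)
print(f"single-frame SNR ~{snr_single:.1f}, stacked significance ~{snr_stack:.1f}")
```

In the full algorithm this stacking is repeated over a grid of candidate speeds and directions, and the CNN classifies the resulting stacked images.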

    What drives the valuation of entrepreneurial ventures? A map to navigate the literature and research directions

    The drivers of the valuations of entrepreneurial ventures are an important issue in entrepreneurial finance, but related research is fragmented. The theoretical perspectives and the drivers highlighted by previous studies differ based on the financial milestone in a venture's lifecycle at which the valuation is performed (e.g., venture capital investments, initial public offerings, acquisitions). The introduction of new digital financing channels (e.g., crowdfunding, initial coin offerings) that allow retail investors to invest directly in entrepreneurial ventures challenges our understanding of the drivers of valuation. This change has also increased the diversity in the sequence of financial milestones that ventures go through, with important implications for valuation. We conduct a systematic literature review and develop a map highlighting how and why the drivers of venture valuations and their underlying theoretical lenses vary across the different milestones that ventures go through. The map allows us to outline promising new avenues for future research.
    Plain English Summary: In this paper, we conduct a systematic literature review on the drivers of entrepreneurial ventures' valuations and their underlying theoretical lenses, highlighting how and why they vary along firms' life cycles. The valuation of entrepreneurial ventures is a challenging task for practitioners and a relevant issue that attracts the attention of scholars in entrepreneurship, finance, management, and economics. The literature on the topic is highly fragmented. Indeed, the context in which venture valuations are observed (e.g., in private deals or public offerings) differs across financial milestones. The introduction of new digital financing channels (e.g., crowdfunding, initial coin offerings) and the increased diversity in the sequence of financial milestones that ventures go through further challenge our understanding of valuation drivers. This study is primarily aimed at scholars, offering them a map to create order in what we know about the drivers of entrepreneurial venture valuations and indicating promising avenues for future research.

    Serratiopeptidase reduces the invasion of osteoblasts by Staphylococcus aureus

    Finding new strategies to counteract periprosthetic infection and implant failure is a main target in orthopedics. Staphylococcus aureus, the leading etiologic agent of orthopedic implant infections, is able to enter and kill osteoblasts, stimulate pro-inflammatory chemokine secretion, recruit osteoclasts, and cause inflammatory osteolysis. Moreover, by entering eukaryotic cells, staphylococci hide from the host immune defenses and shelter from extracellular antibiotics. Thus, infection persists, inflammation thrives, and a highly destructive osteomyelitis develops around the implant. We evaluated the ability of serratiopeptidase (SPEP), a metalloprotease produced by Serratia marcescens, to control S. aureus invasion of osteoblastic MG-63 cells and secretion of the pro-inflammatory chemokine MCP-1. Human osteoblast cells were infected with staphylococcal strains in the presence and in the absence of SPEP; cell proliferation and cell viability were also evaluated. The release of MCP-1 was measured after exposure of the osteoblast cells to the staphylococcal strains. The significance of the differences between each test condition and the corresponding control values was determined with Student's t-test. SPEP impaired staphylococcal invasion of osteoblasts and reduced MCP-1 production, without affecting the viability or proliferation of bone cells. We recognize SPEP as a potential tool against S. aureus bone infection and destruction.
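The statistical comparison named above can be sketched as a Student's t-test between treated and control conditions. The counts below are synthetic placeholders, not study data:

```python
# Hypothetical example of the test-vs-control comparison: intracellular
# CFU counts per well with and without SPEP (all numbers are invented).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
control = rng.normal(1.0e5, 1.0e4, 6)   # S. aureus alone, 6 replicates
spep    = rng.normal(4.0e4, 1.0e4, 6)   # S. aureus + SPEP (assumed reduction)

t_stat, p_value = stats.ttest_ind(control, spep)
print(f"t = {t_stat:.2f}, p = {p_value:.3g}")  # small p -> significant difference
```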

    Alkali-activation of marble sludge: Influence of curing conditions and waste glass addition

    The use of marble sludge as a precursor for new alkali-activated materials was assessed by studying three different curing conditions (air, humid, and water immersion), after an initial curing at 60 °C for 24 h, and two glass powder additions (2.5 and 5.0 vol%). Microstructural, physical (drying shrinkage, Fourier-transform infrared (FT-IR) spectroscopy, X-ray photoelectron spectroscopy (XPS)), thermal (differential thermal analysis – thermogravimetric analysis, DTA-TGA), and mechanical (flexural and compressive strength) properties were investigated. Air curing was the most favourable atmosphere for the development of mechanical properties because it promotes Si-O-Si polymerization and gel densification, as demonstrated by FT-IR and FE-SEM observations, respectively. Satisfactory mechanical properties were achieved (18 MPa and 45 MPa for flexural and compressive strength, respectively), in particular for glass-containing mixtures. Moreover, the glass powder addition significantly reduced the drying shrinkage of air-cured samples because it acted as a rigid aggregate in the matrix and strengthened the formed gel.

    Two-sided estimates of minimum-error distinguishability of mixed quantum states via generalized Holevo-Curlander bounds

    We prove a concise factor-of-2 estimate for the failure rate of optimally distinguishing an arbitrary ensemble of mixed quantum states, generalizing work of Holevo [Theor. Probab. Appl. 23, 411 (1978)] and Curlander [Ph.D. Thesis, MIT, 1979]. A modification of the minimal principle of Concha and Poor [Proceedings of the 6th International Conference on Quantum Communication, Measurement, and Computing (Rinton, Princeton, NJ, 2003)] is used to derive a suboptimal measurement whose error rate is within a factor of 2 of the optimal by construction. This measurement is quadratically weighted and has appeared as the first iterate of a sequence of measurements proposed by Jezek et al. [Phys. Rev. A 65, 060301 (2002)]. Unlike the so-called pretty good measurement, it coincides with Holevo's asymptotically optimal measurement in the case of nonequiprobable pure states. A quadratically weighted version of the measurement bound of Barnum and Knill [J. Math. Phys. 43, 2097 (2002)] is proven. Bounds on the distinguishability of syndromes in the sense of Schumacher and Westmoreland [Phys. Rev. A 56, 131 (1997)] appear as a corollary. An appendix relates our bounds to the trace-Jensen inequality.
    Comment: It was not realized at the time of publication that the lower bound of Theorem 10 has a simple generalization using matrix monotonicity (see [J. Math. Phys. 50, 062102]). Furthermore, this generalization is a trivial variation of a previously obtained bound of Ogawa and Nagaoka [IEEE Trans. Inf. Theory 45, 2486-2489 (1999)], which had been overlooked by the author.
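For two states the factor-of-2 claim can be checked numerically, since the optimal error is given in closed form by the Helstrom bound. The sketch below assumes the standard construction of the quadratically weighted measurement, E_i = S^(-1/2) (p_i ρ_i)² S^(-1/2) with S = Σ_j (p_j ρ_j)²; it is an illustrative sanity check, not code from the paper:

```python
# Compare the quadratically weighted measurement's error rate against the
# optimal (Helstrom) error for two random equiprobable mixed qubit states.
import numpy as np

rng = np.random.default_rng(7)

def random_density(d):
    a = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = a @ a.conj().T          # positive semidefinite by construction
    return rho / np.trace(rho).real

def inv_sqrt(m):
    w, v = np.linalg.eigh(m)      # m assumed positive definite here
    return v @ np.diag(w ** -0.5) @ v.conj().T

p = [0.5, 0.5]
rhos = [random_density(2), random_density(2)]

# Optimal two-state error (Helstrom): (1 - ||p0*rho0 - p1*rho1||_1) / 2.
delta = p[0] * rhos[0] - p[1] * rhos[1]
err_opt = 0.5 * (1.0 - np.abs(np.linalg.eigvalsh(delta)).sum())

# Quadratically weighted measurement and its error rate.
S = sum((pi * ri) @ (pi * ri) for pi, ri in zip(p, rhos))
Sm = inv_sqrt(S)
povm = [Sm @ (pi * ri) @ (pi * ri) @ Sm for pi, ri in zip(p, rhos)]
err_quad = 1.0 - sum(pi * np.trace(ri @ Ei).real
                     for pi, ri, Ei in zip(p, rhos, povm))

print(f"optimal error {err_opt:.4f}, quadratic-measurement error {err_quad:.4f}")
```

By optimality the Helstrom error lower-bounds the suboptimal one, and the factor-of-2 theorem upper-bounds it.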

    Vulnerabilità del sistema bancario italiano. Diagnosi e rimedi

    The major vulnerabilities of the Italian banking system are the overhang of non-performing loans (NPLs) and low profitability. In contrast to the attention given to the NPL overhang, the profitability problem is normally considered a matter to be left to bank management rather than an area for explicit, direct regulatory action. Although they focus on capital requirements, regulators and supervisors seldom pose the question of where capital comes from. Using a large sample of 410 Italian domestic banking groups and individual banks, we propose an NPL stress test integrated with a viability test. The results show that the system's vulnerability is a widespread phenomenon, not confined to a few "bad apples" as is sometimes claimed; that a further recapitalisation of around ten billion euro is necessary; and, more importantly, that limiting interventions to the NPL overhang does not put the majority of Italian banks onto a viable path, because of the inefficiencies of their current business models. The analysis of the Italian case strengthens the critique of current regulation and supervision: by neglecting bank profitability, they fail to avert the threats to solvency that arise from the accumulation of NPLs. We thus argue that the structural changes necessary to put the Italian banking system onto a viable path require new regulatory and supervisory approaches.
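The two-step logic of the analysis, an NPL stress test followed by a viability screen, can be sketched with a stylized single-bank example. All numbers and thresholds below are invented for illustration and are not taken from the paper's 410-bank dataset:

```python
# Toy NPL stress test + viability screen for one hypothetical bank
# (figures in EUR bn; thresholds are illustrative assumptions).
def npl_shortfall(cet1, rwa, npl_net, recovery_rate, cet1_min=0.085):
    """Capital needed (if any) after writing net NPLs down to recovery value."""
    loss = npl_net * (1.0 - recovery_rate)
    cet1_stressed = cet1 - loss
    required = cet1_min * (rwa - loss)   # toy: RWA shrink as NPLs are written off
    return max(0.0, required - cet1_stressed)

def viable(pre_provision_profit, assets, threshold=0.005):
    """Crude viability screen: pre-provision profit over assets above a floor."""
    return pre_provision_profit / assets >= threshold

gap = npl_shortfall(cet1=2.0, rwa=20.0, npl_net=3.0, recovery_rate=0.55)
ok = viable(pre_provision_profit=0.08, assets=30.0)
print(f"recapitalisation need: {gap:.2f} bn;", "viable" if ok else "not viable")
```

The point of the paper mirrored here is that a bank can be recapitalised against its NPL losses and still fail the viability screen because of weak underlying profitability.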

    Dealing with the vulnerability of the Italian banking system

    The major vulnerabilities of the Italian banking system are the overhang of non-performing loans (NPLs) and low profitability. In contrast to the attention given to the NPL overhang, the profitability problem is normally considered a matter to be left to bank management rather than an area for explicit, direct regulatory action. Although they focus on capital requirements, regulators and supervisors seldom pose the question of where capital comes from. Using a large sample of 410 Italian domestic banking groups and individual banks, we propose an NPL stress test integrated with a viability test. The results show that the system's vulnerability is a widespread phenomenon, not confined to a few "bad apples" as is sometimes claimed; that a further recapitalisation of around ten billion euro is necessary; and, more importantly, that limiting interventions to the NPL overhang does not put the majority of Italian banks onto a viable path, because of the inefficiencies of their current business models. The analysis of the Italian case strengthens the critique of current regulation and supervision: by neglecting bank profitability, they fail to avert the threats to solvency that arise from the accumulation of NPLs. We thus argue that the structural changes necessary to put the Italian banking system onto a viable path require new regulatory and supervisory approaches.

    Nonlinear Model Predictive Control for Integrated Energy-Efficient Torque-Vectoring and Anti-Roll Moment Distribution

    Get PDF
    This study applies nonlinear model predictive control (NMPC) to the torque-vectoring (TV) and front-to-total anti-roll moment distribution control of a four-wheel-drive electric vehicle with in-wheel motors, a brake-by-wire system, and active suspension actuators. The NMPC cost function formulation is based on energy-efficiency criteria and strives to minimize the power losses caused by the longitudinal and lateral tire slips, friction brakes, and electric powertrains, while enhancing the vehicle cornering response in steady-state and transient conditions. The controller is assessed through simulations using an experimentally validated high-fidelity vehicle model, along ramp steer and multiple step steer maneuvers, including and excluding the direct yaw moment and active anti-roll moment distribution actuations. The results show: 1) the substantial enhancement of energy saving and vehicle stabilization performance brought by the integration of the active suspension contribution and TV; 2) the significance of the power loss terms of the NMPC formulation on the results; and 3) the effectiveness of the NMPC with respect to the benchmark feedback and rule-based controllers.
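The kind of power-loss terms such a cost function penalises can be sketched per wheel: longitudinal slip dissipates F_x·(ωR − v_x), and powertrain copper losses grow roughly with torque squared. This is a simplified illustration with invented parameters, not the paper's NMPC formulation:

```python
# Toy stage cost summing tire-slip and motor power losses over four wheels
# (all operating-point values and the loss coefficient are assumptions).
def wheel_slip_loss(fx, omega, wheel_radius, vx):
    """Power dissipated by longitudinal tire slip, in W."""
    return abs(fx * (omega * wheel_radius - vx))

def motor_loss(torque, k_copper=0.02):
    """Toy copper-loss model: resistive losses grow with torque squared."""
    return k_copper * torque ** 2

def stage_cost(wheels):
    """Sum slip and powertrain losses over all wheels."""
    return sum(wheel_slip_loss(w["fx"], w["omega"], w["R"], w["vx"])
               + motor_loss(w["torque"]) for w in wheels)

# Hypothetical operating point: front wheels pushed harder than the rear.
wheels = [dict(fx=900.0, omega=65.0, R=0.3, vx=19.2, torque=270.0),
          dict(fx=900.0, omega=65.0, R=0.3, vx=19.2, torque=270.0),
          dict(fx=400.0, omega=64.5, R=0.3, vx=19.2, torque=120.0),
          dict(fx=400.0, omega=64.5, R=0.3, vx=19.2, torque=120.0)]
print(f"stage power loss: {stage_cost(wheels):.0f} W")
```

In an NMPC setting a cost of this shape would be evaluated at every step of the prediction horizon, so the optimizer trades torque distribution against total dissipated power.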