255 research outputs found

    A methodology to support strategic decisions in future human space exploration: from scenario definition to building blocks assessment

    The human exploration of multiple deep space destinations (e.g. cis-lunar space, NEAs), in view of the final challenge of sending astronauts to Mars, is a current and active study domain, especially in terms of possible scenarios and mission architecture assessments, as shown by the numerous ongoing activities on this topic and by the Global Exploration Roadmap. After exploring and analysing different possible solutions to identify the most flexible path, a detailed characterisation of several Design Reference Missions (DRMs) is necessary in order to evaluate the feasibility and affordability of deep space exploration missions, specifically in terms of enabling technological capabilities. The study presented in this paper aimed at defining an evolutionary scenario for deep space exploration over the next 30 years, with the final goal of sending astronauts to the surface of Mars by the end of the 2030s. Different destinations were considered as targets for the human exploration scenario, with particular attention to the Earth-Moon Lagrangian points, NEAs and the Moon. For all the destinations selected as part of the exploration scenario, the relative Design Reference Missions were assessed and characterised; specifically, they were defined in terms of strategies, architectures and mission elements. All the analyses were based on a purely technical approach, with the objective of evaluating the feasibility of a long-term strategy for capability achievement and technological development to enable future space exploration. This paper describes the process followed within the study, focusing on the adopted methodology, and reports the major results in terms of scenario and mission analysis.

    Speeding up Simplification of Polygonal Curves using Nested Approximations

    We develop a multiresolution approach to the problem of polygonal curve approximation. We show theoretically and experimentally that, if the simplification algorithm A used between any two successive levels of resolution satisfies some conditions, the multiresolution algorithm MR will have a complexity lower than that of A. In particular, we show that if A has O(N²/K) complexity (the complexity of a reduced-search dynamic programming approach), where N and K are respectively the initial and the final number of segments, the complexity of MR is in O(N). We experimentally compare the outcomes of MR with those of the optimal "full search" dynamic programming solution and of classical merge and split approaches. The experimental evaluations confirm the theoretical derivations and show that the proposed approach, evaluated on 2D coastal maps, either has a lower complexity or provides polygonal approximations closer to the initial curves. Comment: 12 pages + figures
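    The kind of curve simplifier the abstract builds on can be illustrated with a minimal sketch. The code below is a classic split-based simplifier (Douglas-Peucker style), not the paper's algorithm A or its multiresolution scheme MR; MR would apply a simplifier like this between successive nested levels of resolution. All names are illustrative.

```python
import math

def point_segment_dist(p, a, b):
    # Perpendicular distance from point p to the segment a-b.
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def simplify(points, tol):
    # Recursively split at the point farthest from the chord; drop the
    # interior points of any chord whose maximum deviation is within tol.
    if len(points) < 3:
        return points[:]
    a, b = points[0], points[-1]
    i, dmax = max(((k, point_segment_dist(points[k], a, b))
                   for k in range(1, len(points) - 1)), key=lambda kv: kv[1])
    if dmax <= tol:
        return [a, b]
    left = simplify(points[:i + 1], tol)
    right = simplify(points[i:], tol)
    return left[:-1] + right  # avoid duplicating the split point
```

    On a nearly flat segment with one spike, the simplifier drops the near-collinear point but keeps the spike, which is the behaviour a coastline simplifier needs.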

    Free circulating ICAM-1 in serum and cerebrospinal fluid of HIV-1 infected patients correlate with TNF-α and blood-brain barrier damage

    The mechanism for the initiation of blood-brain barrier damage and intrathecal inflammation in patients infected with the human immunodeficiency virus (HIV) is poorly understood. We have recently reported that tumour necrosis factor-α (TNF-α) mediates active neural inflammation and blood-brain barrier damage in HIV-1 infection. Stimulation of endothelial cells by TNF-α induces the expression of intercellular adhesion molecule-1 (ICAM-1), which is an important early marker of immune activation and response. We report herein for the first time the detection of high levels of free circulating ICAM-1 in serum and cerebrospinal fluid of patients with HIV-1 infection. Free circulating ICAM-1 in these patients correlated with TNF-α concentrations and with the degree of blood-brain barrier damage, and was detected predominantly in patients with neurologic involvement. These findings have important implications for the understanding and investigation of the intrathecal inflammatory response in HIV-1 infection.

    The Agile Alert System For Gamma-Ray Transients

    In recent years, a new generation of space missions has offered great opportunities for discovery in high-energy astrophysics. In this article we focus on the scientific operations of the Gamma-Ray Imaging Detector (GRID) onboard the AGILE space mission. The AGILE-GRID, sensitive in the energy range of 30 MeV-30 GeV, has detected many gamma-ray transients of galactic and extragalactic origin. This work presents the AGILE innovative approach to fast gamma-ray transient detection, which is a challenging task and a crucial part of the AGILE scientific program. The goals are to describe: (1) the AGILE Gamma-Ray Alert System, (2) a new algorithm for blind-search identification of transients within a short processing time, (3) the AGILE procedure for gamma-ray transient alert management, and (4) the likelihood ratio tests that are necessary to evaluate the post-trial statistical significance of the results. Special algorithms and an optimized sequence of tasks are necessary to reach our goal. Data are automatically analyzed at every orbital downlink by an alert pipeline operating on different timescales. When flux thresholds are exceeded, alerts are automatically generated and sent as SMS messages to cellular telephones, e-mails, and push notifications of an application for smartphones and tablets. These alerts are crosschecked with the results of two pipelines, and a manual analysis is performed. Being a small scientific-class mission, AGILE is characterized by optimization of both scientific analysis and ground-segment resources. The system is capable of generating alerts within two to three hours of a data downlink, an unprecedented reaction time in gamma-ray astrophysics. Comment: 34 pages, 9 figures, 5 tables
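    The threshold-triggered alerting step described above can be sketched very simply. The code below is an illustration only: the threshold values, field names, and function names are invented for the example and are not AGILE's actual configuration.

```python
# Hypothetical per-timescale flux thresholds, photons cm^-2 s^-1 (E > 100 MeV).
# Values are placeholders, not AGILE's real alert thresholds.
FLUX_THRESHOLDS = {
    "1d": 100e-8,
    "2d": 50e-8,
}

def check_alerts(measurements):
    """measurements: dicts with 'source', 'timescale', 'flux'.

    Return the measurements whose flux exceeds the threshold for their
    analysis timescale; each result carries the threshold it crossed.
    """
    alerts = []
    for m in measurements:
        thr = FLUX_THRESHOLDS.get(m["timescale"])
        if thr is not None and m["flux"] > thr:
            alerts.append({**m, "threshold": thr})
    return alerts
```

    In an operational pipeline, each alert record would then be dispatched to the notification channels (SMS, e-mail, push) and cross-checked against the other pipelines before manual follow-up.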

    Radar observation and reconstruction of Cosmos 1408 fragmentation

    The population of objects in space has increased dramatically over recent decades. Space debris now represents the majority of objects in space, resulting from inactive satellites, breakups, collisions and fragmentations. It has become a concern for institutions all over the world and, as such, has led to the fostering of several programmes to counter the issue. Among these, the use of ground-based sensors for Space Surveillance and Tracking (SST) activities, and of services and tools for analysing fragmentations, plays a crucial role. This work presents the activities carried out by Politecnico di Milano, the Italian Space Agency and the Italian National Institute of Astrophysics in this framework, using data from SST networks and the observation measurements from the Bistatic Radar for LEo Survey (BIRALES), an Italian bistatic radar belonging to the EUropean Space Surveillance and Tracking (EUSST) network, which contributed most to the monitoring of the cloud of fragments. Exploiting Two-Line Elements (TLEs) of observed fragments, a reverse engineering approach is used to reconstruct a fragmentation in orbit through the software suite PUZZLE developed at Politecnico di Milano. The analyses focus on the fragmentation of the Cosmos 1408 satellite, which occurred on November 15th 2021 following an Anti-SATellite (ASAT) missile test. More than 1000 trackable pieces and millions of smaller debris (estimated from numerical analysis) were produced by this event, increasing the population of inactive objects around the Earth and threatening nearby orbiting objects. First, the processing method adopted by BIRALES in observing the Cosmos debris is presented and discussed, and a critical analysis of the derivable information is conducted. Then, these data and those from SST network observations are used to identify the epoch and the location of the fragmentation. In this procedure, the software toolkit PUZZLE, developed by Politecnico di Milano within a project funded by the Italian Space Agency and extended through the European Research Council, is used.
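    The epoch-identification step can be illustrated with a deliberately simplified toy: if fragments are assumed to move in straight lines (no gravity), back-propagating their observed states and finding the time at which they cluster most tightly estimates the breakup epoch. Real tools such as PUZZLE use full orbital dynamics and TLE propagation; only the minimization idea carries over, and all names here are ours.

```python
def spread_at(dt, states):
    # states: list of (position, velocity) 3-vectors at the observation epoch.
    # Back-propagate each fragment by dt and return the summed squared
    # distance of the back-propagated points from their centroid.
    pts = [[p[i] - v[i] * dt for i in range(3)] for p, v in states]
    c = [sum(p[i] for p in pts) / len(pts) for i in range(3)]
    return sum(sum((p[i] - c[i]) ** 2 for i in range(3)) for p in pts)

def estimate_breakup_dt(states, t_max, steps=1000):
    # Grid search over the elapsed time since breakup: the dt that minimizes
    # the spread of the back-propagated cloud is the estimated epoch offset.
    best = min(range(steps + 1),
               key=lambda k: spread_at(t_max * k / steps, states))
    return t_max * best / steps
```

    With fragments synthesized from a common origin, the minimum of the spread recovers the true elapsed time; with real debris, perturbed orbital motion replaces the straight-line model.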

    PRACTICAL DESIGN EXAMPLES FOR HUMAN HABITATS IN SPACE, OFF-GRID, AND IN LOW-IMPACT COMMUNITIES

    All human habitat problems fall into three major categories: the environment, the habitat itself, and the occupants. By breaking these problems down into common themes and addressing them directly, we can build a common knowledge base for all three challenges faced by humanity. A crew living in space has the new problems of coping with radiation, microgravity, and vacuum. All the while, they are dealing with the usual issues of eating, sleeping, and getting along with the rest of the occupants. By isolating the differences between space and Earth habitats, we can create common architectural styles for each human habitat challenge where commonality is appropriate. We can then examine the differences, and isolate and modularize the secondary systems where possible. This simplifies experimentation and testing of the physical and psychological design of a structure on Earth prior to attempting use in space. It also allows spin-off architectures for extreme environments, off-grid settlements, research bases, and low-impact communities on Earth. By isolating and testing each attribute of the system in parallel with control groups, we can scientifically refine the systems for human shelter regardless of environment. This paper will show numerous examples of architectures designed for space or space-analog research bases. These designs can be both de-scoped to off-grid sustainable architecture and scoped up for space habitat applications. Concepts such as internal greenhouses, enclosed permaculture, thermal protection, energy management, and radiation shielding are included for both minimal habitats and large bases. These systems can then be applied for disaster first responders, research bases in extreme environments, off-grid homes, and low-impact communities.

    AGILE detection of a strong gamma-ray flare from the blazar 3C 454.3

    We report the first blazar detection by the AGILE satellite. AGILE detected 3C 454.3 during a period of strongly enhanced optical emission in July 2007. AGILE observed the source with a dedicated repointing during the period 2007 July 24-30 with its two co-aligned imagers, the Gamma-Ray Imaging Detector and the hard X-ray imager Super-AGILE, sensitive in the 30 MeV-50 GeV and 18-60 keV ranges, respectively. Over the entire period, AGILE detected gamma-ray emission from 3C 454.3 at a significance level of 13.8σ with an average flux (E > 100 MeV) of (280 ± 40) × 10^-8 photons cm^-2 s^-1. The gamma-ray flux appears to be variable towards the end of the observation. No emission was detected by Super-AGILE in the energy range 20-60 keV, with a 3σ upper limit of 2.3 × 10^-3 photons cm^-2 s^-1. The gamma-ray flux level of 3C 454.3 detected by AGILE is the highest ever detected for this quasar and among the most intense gamma-ray fluxes ever detected from Flat Spectrum Radio Quasars. Comment: Accepted by Astrophysical Journal Letters; 14 pages, 3 EPS figures, 1 table

    Functional Progression after Dose Suspension or Discontinuation of Nintedanib in Idiopathic Pulmonary Fibrosis: A Real-Life Multicentre Study

    Background: Idiopathic pulmonary fibrosis (IPF) is a chronic interstitial lung disease with rapidly progressive evolution and an unfavorable outcome. Nintedanib (NTD) is an antifibrotic drug that has been shown to be effective in slowing down the progression of the disease. The aim of our study was to examine the efficacy, especially in terms of functional decline, and the safety profile of NTD in patients treated with the recommended dose and in subjects who reduced or suspended the therapy due to the occurrence of adverse reactions. Methods: We conducted a real-life retrospective study based on the experience of NTD use in two centers between 2015 and 2022. Clinical data were evaluated at baseline and at 6 and 12 months after the introduction of NTD in the whole population and in subgroups of patients who continued treatment at the full dose, at a reduced dosage, or discontinued treatment. The following data were recorded: demographic features, IPF clinical features, NTD therapeutic dosage, tolerability and adverse events, pulmonary function tests (PFTs), the duration of treatment upon discontinuation, and the causes of interruption. Results: Fifty-four IPF patients were included (29.6% female; median (IQR) age at baseline 75 (69.0-79.0) years). Twelve months after the introduction of NTD therapy, 20 (37%) patients were still taking the full dose, 11 (20.4%) had reduced it to 200 mg daily, and 15 (27.8%) had stopped treatment. Gastrointestinal intolerance predominantly led to dose reduction (13.0%) and treatment cessation (20.4%). There were two deaths within the initial 6 months (3.7%) and seven (13.0%) within 12 months. Compared to baseline, the results of the PFTs remained stable at 6 and 12 months for the entire NTD-treated population, except for a significant decline in the DLCO (% predicted value) at both 6 (38.0 ± 17.8 vs. 43.0 ± 26.0; p = 0.041) and 12 months (41.5 ± 15.3 vs. 44.0 ± 26.8; p = 0.048).
    The patients who continued treatment at the full dose or a reduced dosage showed no significant differences in the FVC and the DLCO at 12 months. Conversely, those discontinuing NTD exhibited a statistically significant decline in the FVC (% predicted value) at 12 months compared to baseline (55.0 ± 13.5 vs. 70.0 ± 23.0; p = 0.035). Conclusions: This study highlights the functional decline of the FVC at 12 months after NTD initiation among patients discontinuing therapy, but not among those reducing their dosage.

    Trends and patterns in the use of computed tomography in children and young adults in Catalonia — results from the EPI-CT study

    Background Although there are undeniable diagnostic benefits of CT scanning, its increasing use in paediatric radiology has become a topic of concern regarding patient radioprotection. Objective To assess the rate of CT scanning in Catalonia, Spain, among patients younger than 21 years old at the time of the scan. Materials and methods This is a sub-study of a larger international cohort study (EPI-CT, the international pediatric CT scan study). Data were retrieved from the radiological information systems (RIS) of eight hospitals in Catalonia from the implementation of digital registration (between 1991 and 2010) until 2013. Results The absolute number of CT scans increased by 4.5% annually between 1991 and 2013, an increase that was less accentuated once RIS had been implemented in most hospitals. Because the population attending the hospitals also increased, however, the rate of scanned patients changed little (8.3 to 9.4 per 1,000 population). The proportions of patients with more than one CT and more than three CTs showed a 1.51- and 2.7-fold increase, respectively, over the 23 years. Conclusion Gradual increases in the numbers of examinations and scanned patients were observed in Catalonia, potentially explained by new CT scanning indications and increases in the availability of scanners, the number of scans per patient and the size of the attended population. Supported in part by the Seventh Framework Programme of the European Community (Grant agreement no. 269912) and the Consejo de Seguridad Nuclear.
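    The rate and growth figures quoted above reduce to simple arithmetic: a rate per 1,000 population and a compound annual growth rate between two annual counts. A minimal sketch (function names are ours, values illustrative):

```python
def rate_per_1000(scanned, population):
    # Number of scanned patients per 1,000 people in the attended population.
    return 1000.0 * scanned / population

def annual_growth(first, last, years):
    # Compound annual growth rate between counts `years` apart,
    # e.g. 0.045 means a 4.5% average annual increase.
    return (last / first) ** (1.0 / years) - 1.0
```

    For example, 83 scanned patients in a population of 10,000 gives a rate of 8.3 per 1,000, matching the units used in the abstract.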