135 research outputs found

    Evaluation of the application of security activities proposed by information systems security risk management methods in a software development life cycle context

    This thesis concerns computer security as applied to software. Specifically, it addresses the integration of information systems security risk management concepts into the software development life cycle. After a general presentation of these topics, the research addresses the problem of security vulnerabilities present in deployed software. The proposed solution for limiting their presence rests on the hypothesis that activities related to security risk management can be integrated into the stages of the software development life cycle so that their benefits reduce the presence of security vulnerabilities in the resulting software. The research presented in this thesis builds on proven concepts from the fields studied: the Unified Process serves as the reference software development life cycle, while the EBIOS, MEHARI and OCTAVE methods were used for security risk management. The analytical approach begins with a study of the risk management methods to extract a generalized list of security activities. It then details the activities performed in each stage of the Unified Process. Finally, it integrates the general risk management activities, one by one, into the stages of the software development life cycle. The results showed that a small number of general risk management activities had a direct link to the software development life cycle, while the others could be integrated and carried out through their interdependencies with the linked activities.
    By showing that some activities had a real anchor in a project activity performed during a stage of the software development life cycle, and that strong interdependencies exist among the risk management activities, it is reasonable to conclude that risk management activities can be carried out jointly with project activities in a software development life cycle. Since security risk management aims to reduce security vulnerabilities and is thus integrated into software development, the hypothesis was confirmed.
    AUTHOR KEYWORDS: security risk management methods, information systems security, software development life cycle, computer security, software security vulnerabilities, EBIOS, MEHARI, OCTAVE, Unified Process
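The integration step described above, anchoring some activities directly to life-cycle stages and scheduling the rest through their interdependencies, can be sketched roughly as follows. The activity names and phase assignments below are hypothetical illustrations, not the thesis's actual activity list.

```python
# Illustrative sketch: some security activities anchor directly to a
# Unified Process phase; the rest inherit a phase from the activities
# they depend on. All names below are made up for illustration.
PHASES = ["Inception", "Elaboration", "Construction", "Transition"]

DIRECT_ANCHORS = {
    "identify_assets": "Inception",
    "analyze_threats": "Elaboration",
    "validate_controls": "Transition",
}

DEPENDS_ON = {
    "estimate_risk": ["analyze_threats"],
    "select_controls": ["estimate_risk"],
}

def phase_for(activity):
    """Phase of an activity: its direct anchor, or the latest phase among
    its prerequisites (a deliberately simplified scheduling rule)."""
    if activity in DIRECT_ANCHORS:
        return DIRECT_ANCHORS[activity]
    return max((phase_for(d) for d in DEPENDS_ON[activity]), key=PHASES.index)
```

Under this toy rule, an activity with no direct anchor lands in the phase of its latest prerequisite, mirroring the thesis's point that interdependencies determine where unanchored activities fit.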

    GPI PSF subtraction with TLOCI: the next evolution in exoplanet/disk high-contrast imaging

    To directly image exoplanets and faint circumstellar disks, the noisy stellar halo must be suppressed to a high level. To achieve this feat, the angular differential imaging observing technique and the least-squares Locally Optimized Combination of Images (LOCI) algorithm have become the standard in single-band direct imaging observations and data reduction. With the development and commissioning of new high-order, high-contrast adaptive optics systems equipped with integral field units, the image subtraction algorithm needs to be modified to allow the optimal use of polychromatic images, field-rotated images and archival data. A new algorithm, TLOCI (for Template LOCI), is designed to achieve this task by maximizing a companion's signal-to-noise ratio instead of simply minimizing the noise as in the original LOCI algorithm. The TLOCI technique uses an input spectrum and template Point Spread Functions (PSFs, generated from unocculted and unsaturated stellar images) to optimize the reference-image least-squares coefficients to minimize the planet self-subtraction, thus maximizing its throughput per wavelength, while simultaneously providing maximum suppression of the speckle noise. The new algorithm has been developed using on-sky GPI data and has achieved impressive contrast. This paper presents the TLOCI algorithm and its on-sky performance, and discusses the challenges in recovering the planet spectrum with high fidelity.
    Comment: 13 pages, 8 figures, to appear in Proceedings of SPIE 914
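The least-squares core that LOCI-style algorithms share can be sketched as follows: express the target image as a linear combination of reference images and subtract. This is a bare illustration on synthetic data, without TLOCI's template and throughput terms.

```python
import numpy as np

# Toy illustration of the least-squares step shared by LOCI-style
# algorithms; all data here are synthetic, flattened "images".
rng = np.random.default_rng(0)
n_ref, n_pix = 10, 500
refs = rng.normal(size=(n_ref, n_pix))            # reference PSF images
target = 0.3 * refs[0] + 0.2 * refs[3] + 0.01 * rng.normal(size=n_pix)

# Coefficients minimizing ||target - coeffs @ refs||^2
coeffs, *_ = np.linalg.lstsq(refs.T, target, rcond=None)
residual = target - coeffs @ refs                 # speckle-subtracted image
```

TLOCI modifies the quantity being minimized, trading some noise suppression for planet throughput, but the linear-algebra machinery is the same.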

    A DELPHI STUDY OF OBSOLETE ASSUMPTIONS IN FREE/LIBRE AND OPEN SOURCE SOFTWARE

    Free/libre and open source software (FLOSS) has evolved significantly over the past 20 years, and estimates suggest that it accounts for 80-90% of any given piece of modern software. A consequence of this evolution is that many of the assumptions made by FLOSS researchers may be obsolete, which would have major negative implications for research validity and hamper theory generation on FLOSS. This study sought to identify significant obsolete assumptions that persist in FLOSS research. Using a Delphi research design with a panel of 20 expert researchers, 21 obsolete assumptions about FLOSS were identified and ranked. We performed a thematic analysis and grouped these obsolete assumptions into six themes: Sampling, Project/Community, Product, Contributor, Evaluation, and Development Process. The Sampling theme was ranked as having the most significant obsolete assumptions, although only two assumptions were associated with it. The Project/Community theme contained six obsolete assumptions, the most of any theme.

    Towards solving social and technical problems in open source software ecosystems : using cause-and-effect analysis to disentangle the causes of complex problems

    Managing large-scale development projects in open source software ecosystems involves dealing with an array of technical and social problems. To disentangle the causes of such problems, we interviewed experts and performed a cause-and-effect analysis. Our findings demonstrate that loss of contributors is the most important social problem, while poor code quality is the most important technical problem, and that both problems result from complex socio-technical interrelations of causes. Our approach suggests that cause-and-effect analysis can help to better understand problems in open source software ecosystems.
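A cause-and-effect analysis of this kind can be represented as a directed graph and walked back to root causes. The graph below is a made-up illustration, not the causes elicited in the paper's interviews.

```python
# Hypothetical cause-and-effect graph: keys are problems, values are
# their direct causes. The actual graph in the study comes from expert
# interviews; these entries are illustrative only.
CAUSES_OF = {
    "loss of contributors": ["burnout", "poor onboarding"],
    "poor code quality": ["time pressure", "poor onboarding"],
    "burnout": ["time pressure"],
}

def root_causes(problem):
    """Walk the graph to the causes with no recorded causes of their own."""
    direct = CAUSES_OF.get(problem, [])
    if not direct:
        return {problem}
    roots = set()
    for cause in direct:
        roots |= root_causes(cause)
    return roots
```

Note how a shared root cause ("poor onboarding" here) surfaces under both the social and the technical problem, the kind of socio-technical interrelation the abstract describes.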

    Machine learning workflow for edge computed arrhythmia detection in exploration class missions

    Deep-space missions require preventative care methods based on predictive models for identifying in-space pathologies. Deploying such models requires flexible edge computing, which the Open Neural Network Exchange (ONNX) format enables by optimizing inference directly on wearable edge devices. This work demonstrates an innovative approach to point-of-care machine learning model pipelines by combining this capacity with an advanced self-optimizing training scheme to classify periods of Normal Sinus Rhythm (NSR), Atrial Fibrillation (AFIB), and Atrial Flutter (AFL). 742 h of electrocardiogram (ECG) recordings were pre-processed into 30-second normalized samples, where variational mode decomposition purged muscle artifacts and instrumentation noise. Seventeen heart rate variability and morphological ECG features were extracted by convolving peak detection with Gaussian distributions and delineating QRS complexes using discrete wavelet transforms. The decision tree classifier's features, parameters, and hyperparameters were self-optimized through stratified triple nested cross-validation ranked on F1-scoring against cardiologist labeling. The selected model achieved a macro F1-score of 0.899, with 0.993 for NSR, 0.938 for AFIB, and 0.767 for AFL. The most important features included median P-wave amplitudes, PRR20, and mean heart rates. The ONNX-translated pipeline took 9.2 s/sample. This combination of our self-optimizing scheme and the deployment use case of ONNX demonstrated overall accurate operational tachycardia detection.
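A minimal sketch of stratified nested cross-validation for a decision tree ranked on macro F1, using synthetic stand-in features: the paper's triple-nested scheme, real ECG features, and search space are more elaborate than this two-level illustration.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV, StratifiedKFold, cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the 17 ECG features and 3 rhythm classes.
X, y = make_classification(n_samples=300, n_features=17, n_informative=8,
                           n_classes=3, random_state=0)

inner = StratifiedKFold(n_splits=3, shuffle=True, random_state=0)
outer = StratifiedKFold(n_splits=3, shuffle=True, random_state=1)

# Inner loop tunes the tree, outer loop estimates generalization,
# both scored with macro F1 as in the paper.
search = GridSearchCV(DecisionTreeClassifier(random_state=0),
                      {"max_depth": [3, 5, None]},
                      scoring="f1_macro", cv=inner)
scores = cross_val_score(search, X, y, scoring="f1_macro", cv=outer)
```

Keeping tuning strictly inside the inner folds is what prevents the reported macro F1 from being optimistically biased by hyperparameter selection.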

    The fragmentation of protostellar discs: the Hill criterion for spiral arms

    We present a new framework to explain the link between cooling and fragmentation in gravitationally unstable protostellar discs. This framework consists of a simple model for the formation of spiral arms, as well as a criterion, based on the Hill radius, to determine whether a spiral arm will fragment. This detailed model of fragmentation is based on the results of numerical simulations of marginally stable protostellar discs, including those found in the literature, as well as our new suite of 3-D radiation hydrodynamics simulations of an irradiated, optically thick protostellar disc surrounding an A star. Our set of simulations probes the transition to fragmentation through a scaling of the physical opacity. This model allows us to directly calculate the critical cooling time of Gammie (2001), with results that are consistent with those found from numerical experiment. We demonstrate how this model can be used to predict fragmentation in irradiated protostellar discs. These numerical simulations, as well as the model that they motivate, provide strong support for the hypothesis that gravitational instability is responsible for creating systems with giant planets on wide orbits.
    Comment: 11 pages, 10 figures, submitted to MNRA
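The Hill radius underlying such a criterion is the classical one, r_H = a (m / 3M)^(1/3) for a clump of mass m at orbital radius a around a star of mass M. A quick numerical sketch, with illustrative masses rather than values from the paper:

```python
def hill_radius(a, m, m_star):
    """Classical Hill radius of a clump of mass m at orbital radius a
    around a star of mass m_star (same units for m and m_star)."""
    return a * (m / (3.0 * m_star)) ** (1.0 / 3.0)

# Illustrative numbers: a Jupiter-mass clump (~1e-3 M_sun) at 100 au
# around a 2 M_sun A star gives a Hill radius of roughly 5.5 au.
r_h = hill_radius(100.0, 1e-3, 2.0)
```

A Hill-type fragmentation criterion then compares a length scale of the spiral arm against this radius: a self-gravitating clump can only survive if it dominates its own Hill sphere.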

    Gemini Planet Imager Observational Calibrations VI: Photometric and Spectroscopic Calibration for the Integral Field Spectrograph

    The Gemini Planet Imager (GPI) is a new facility instrument for the Gemini Observatory designed to provide direct detection and characterization of planets and debris disks around stars in the solar neighborhood. In addition to its extreme adaptive optics and coronagraphic systems, which give access to high angular resolution and high-contrast imaging capabilities, GPI contains an integral field spectrograph providing low resolution spectroscopy across five bands between 0.95 and 2.5 μm. This paper describes the sequence of processing steps required for the spectro-photometric calibration of GPI science data, and the necessary calibration files. Based on calibration observations of the white dwarf HD 8049B we estimate that the systematic error in spectra extracted from GPI observations is less than 5%. The flux ratio of the occulted star and fiducial satellite spots within coronagraphic GPI observations, required to estimate the magnitude difference between a target and any resolved companions, was measured in the H-band to be Δm = 9.23 ± 0.06 in laboratory measurements and Δm = 9.39 ± 0.11 using on-sky observations. Laboratory measurements for the Y, J, K1 and K2 filters are also presented. The total throughput of GPI, Gemini South and the atmosphere of the Earth was also measured in each photometric passband, with a typical throughput in H-band of 18% in the non-coronagraphic mode, with some variation observed over the six-month period for which observations were available. We also report ongoing development and improvement of the data cube extraction algorithm.
    Comment: 15 pages, 6 figures. Proceedings of the SPIE, 9147-30
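Magnitude differences like the quoted Δm convert to flux ratios via the standard Pogson relation, Δm = −2.5 log10(F_spot / F_star); a quick sketch:

```python
import math

def delta_mag(flux_ratio):
    """Magnitude difference corresponding to a flux ratio (Pogson relation)."""
    return -2.5 * math.log10(flux_ratio)

def flux_ratio(dm):
    """Flux ratio corresponding to a magnitude difference."""
    return 10.0 ** (-dm / 2.5)

# The on-sky satellite-spot value of Δm = 9.39 corresponds to a
# spot-to-star flux ratio of roughly 1.75e-4.
r = flux_ratio(9.39)
```

This is the conversion that lets the faint satellite spots serve as photometric fiducials for the occulted star behind the coronagraph.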

    Constraints on the architecture of the HD 95086 planetary system with the Gemini Planet Imager

    We present astrometric monitoring of the young exoplanet HD 95086 b obtained with the Gemini Planet Imager between 2013 and 2016. A small but significant position angle change is detected at constant separation; the orbital motion is confirmed with literature measurements. Efficient Monte Carlo techniques place preliminary constraints on the orbital parameters of HD 95086 b. With 68% confidence, a semimajor axis of 61.7 (+20.7/−8.4) au and an inclination of 153.0 (+9.7/−13.5) deg are favored, with eccentricity less than 0.21. Under the assumption of a co-planar planet-disk system, the periastron of HD 95086 b is beyond 51 au with 68% confidence. Therefore, HD 95086 b cannot carve the entire gap inferred from the measured infrared excess in the SED of HD 95086. We use our sensitivity to additional planets to discuss specific scenarios presented in the literature to explain the geometry of the debris belts. We suggest that either two planets on moderately eccentric orbits or three to four planets with inhomogeneous masses and orbital properties are possible. The sensitivity to additional planetary companions within the observations presented in this study can be used to help further constrain future dynamical simulations of the planet-disk system.
    Comment: Accepted for publication in ApJ
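The periastron constraint follows from q = a(1 − e). A toy rejection-sampling sketch of how such a constraint filters orbital draws; the uniform priors and bounds here are illustrative, not the paper's actual posteriors:

```python
import random

random.seed(0)

# Illustrative rejection sampling over (a, e); the study's Monte Carlo
# fit is driven by the measured astrometry, not these toy priors.
kept = []
for _ in range(10_000):
    a = random.uniform(40.0, 110.0)   # semimajor axis, au
    e = random.uniform(0.0, 0.5)      # eccentricity
    q = a * (1.0 - e)                 # periastron distance, au
    if q > 51.0:                      # the co-planar periastron constraint
        kept.append((a, e))
```

A lower bound on the periastron directly limits how close to the star the planet ever gets, which is why it constrains what part of the gap the planet can dynamically clear.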

    The Peculiar Debris Disk of HD 111520 as Resolved by the Gemini Planet Imager

    Using the Gemini Planet Imager (GPI), we have resolved the circumstellar debris disk around HD 111520 at a projected range of ~30-100 AU in both total and polarized H-band intensity. The disk is seen edge-on at a position angle of ~165° along the spine of emission. A slight inclination and asymmetric warping are covariant and alter the interpretation of the observed disk emission. We employ three point spread function (PSF) subtraction methods to reduce the stellar glare and instrumental artifacts to confirm that there is a roughly 2:1 brightness asymmetry between the NW and SE extensions. This specific feature makes HD 111520 one of the most extreme examples of asymmetric debris disks observed in scattered light among similar highly inclined systems, such as HD 15115 and HD 106906. We further identify a tentative localized brightness and scale height enhancement associated with the disk at ~40 AU from the star on the SE extension. We also find that the fractional polarization rises from 10 to 40% from 0.5" to 0.8" from the star. The combination of a large brightness asymmetry and a symmetric polarization fraction leads us to believe that an azimuthal dust density variation is causing the observed asymmetry.
    Comment: 9 pages, 8 figures, 1 table. Accepted to Ap
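The fractional polarization quoted is the standard linear polarization fraction built from the Stokes parameters, p = sqrt(Q² + U²) / I; a quick sketch (the Stokes values below are made up to land at the ~10% level seen at 0.5"):

```python
import math

def polarization_fraction(i, q, u):
    """Linear polarization fraction from Stokes I, Q, U."""
    return math.sqrt(q * q + u * u) / i

# Illustrative Stokes values chosen to give a 10% polarization fraction.
p = polarization_fraction(1.0, 0.08, 0.06)
```

Because p is a ratio, an azimuthal variation in dust density changes both the total and polarized intensity together, leaving the polarization fraction symmetric, the signature the authors use to favor a density (rather than grain-property) explanation.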

    Psychophysiological models of hypovigilance detection: A scoping review

    Hypovigilance represents a major contributor to accidents. In operational contexts, the burden of monitoring and managing vigilance often rests on operators. Recent advances in sensing technologies allow for the development of psychophysiology-based (hypo)vigilance prediction models. Still, these models remain scarcely applied to operational situations and need to be better understood. The current scoping review provides a state of knowledge regarding psychophysiological models of hypovigilance detection. Records evaluating vigilance measuring tools with gold-standard comparisons and hypovigilance prediction performances were extracted from MEDLINE, PsycINFO, and Inspec. Exclusion criteria comprised aspects related to language, non-empirical papers, and sleep studies. The Quality Assessment tool for Diagnostic Accuracy Studies (QUADAS) and the Prediction model Risk Of Bias ASsessment Tool (PROBAST) were used for bias evaluation. Twenty-one records were reviewed. They were mainly characterized by participant selection and analysis biases. Papers predominantly focused on driving and employed several common psychophysiological techniques, yet prediction methods and gold standards varied widely. Overall, we outline the main strategies used to assess hypovigilance and their principal limitations, and we discuss applications of these models.
