86 research outputs found
A Bayesian network-based process for assessing the application of Scrum in software projects.
The use of Agile Software Development (ASD) is increasing to satisfy the need to respond to fast-moving market demand and gain market share. In contrast with traditional plan-driven processes, ASD is people- and communication-oriented, flexible, fast, lightweight, responsive, and driven by learning and continuous improvement. As a consequence, subjective factors such as collaboration, communication and self-management are key to evaluating the maturity of agile adoption. Scrum, which is focused on project management, is the most popular agile method. Once adopted, the usage of Scrum must be continuously improved by complementing it with development and management practices and processes. Even though the Retrospective Meeting, a Scrum event, is a period at the end of each sprint for the team to assess the development method, there are no clear and specific procedures to conduct it. In the literature there are several proposed solutions to assist with ASD adoption and assessment, but none is consolidated. Therefore, the research problem is: how to instrument Scrum to assist in the continuous improvement of the development method, focusing on the requirements engineering process, the development team and the product increments? In this thesis, we propose a Bayesian network-based process to assist in the assessment of Scrum-based projects, instrumenting the software development method to support its continuous improvement with a focus on the requirements engineering process, development team and product increments. We built the Bayesian network using a Knowledge Engineering of Bayesian Networks (KEBN) process; the network estimates customer satisfaction given factors of the software development method. To evaluate its prediction accuracy, we collected data from 18 industry projects of one organization through a questionnaire. The prediction was correct for fourteen projects (78% accuracy). Therefore, we conclude that the model predicts customer satisfaction with satisfactory accuracy and is useful for decision support on Scrum projects.
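The prediction pipeline described in the abstract can be illustrated with a toy discrete Bayesian network. The sketch below is a hypothetical three-node model (the factor nodes, their probabilities and the NPT values are illustrative assumptions, not the thesis's actual network); it performs exact inference by enumeration to estimate customer satisfaction given observed factors, and reproduces the 14-of-18 accuracy arithmetic.

```python
from itertools import product

# Hypothetical toy network (NOT the thesis's model): two binary factors
# influence binary customer satisfaction.
p_comm = {True: 0.7, False: 0.3}      # prior on good communication
p_collab = {True: 0.6, False: 0.4}    # prior on good collaboration
p_sat = {                             # NPT: P(sat=True | comm, collab)
    (True, True): 0.9, (True, False): 0.6,
    (False, True): 0.5, (False, False): 0.2,
}

def predict_satisfaction(evidence):
    """Exact inference by enumeration: P(sat=True | evidence)."""
    num = den = 0.0
    for comm, collab, sat in product([True, False], repeat=3):
        # Skip assignments inconsistent with the observed evidence.
        if any(evidence.get(k) is not None and evidence[k] != v
               for k, v in [("comm", comm), ("collab", collab)]):
            continue
        p_s = p_sat[(comm, collab)] if sat else 1 - p_sat[(comm, collab)]
        joint = p_comm[comm] * p_collab[collab] * p_s
        den += joint
        if sat:
            num += joint
    return num / den

# A project with good communication but unknown collaboration:
p = predict_satisfaction({"comm": True})  # 0.6*0.9 + 0.4*0.6 = 0.78

# The thesis's reported accuracy: 14 correct predictions out of 18 projects.
accuracy = round(14 / 18 * 100)  # -> 78
```

In a real KEBN-style model the NPT values would come from expert elicitation or data rather than the made-up numbers above, but the inference step is the same enumeration (or a more scalable algorithm such as variable elimination).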
Issues in the Probability Elicitation Process of Expert-Based Bayesian Networks
A major challenge in constructing a Bayesian network (BN) is defining the node probability tables (NPT), which can be learned from data or elicited from domain experts. In practice, it is common not to have enough data for learning, and elicitation from experts is the only option. However, the complexity of defining NPTs grows exponentially with the number of parent nodes, making the elicitation process costly and error-prone. In this research, we conducted an exploratory study through a literature review that identified the main issues related to the task of probability elicitation and solutions to construct large-scale NPTs while reducing the exposure to these issues. In this chapter, we present in detail three semiautomatic methods that reduce the burden for experts. We discuss the benefits and drawbacks of these methods, and present directions on how to improve them.
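The exponential growth mentioned above is easy to quantify: a node needs one probability distribution over its own states for every combination of its parents' states. A minimal sketch (the state counts in the example are illustrative assumptions):

```python
def npt_size(child_states: int, parent_states: list) -> int:
    """Number of probability entries in a node probability table:
    one distribution over the child's states per combination of parent states."""
    combos = 1
    for s in parent_states:
        combos *= s
    return child_states * combos

# A 5-state node with four 5-state parents already needs 5 * 5**4 = 3125 entries,
# which is far beyond what an expert can elicit one by one.
size = npt_size(5, [5, 5, 5, 5])
```

This is why the semiautomatic methods discussed in the chapter ask experts for a handful of parameters (such as weights or ranked-node modes) and generate the full table from them, instead of eliciting every entry directly.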
Continuous Learning of the Structure of Bayesian Networks: A Mapping Study
Bayesian networks can be built from knowledge, data, or both. Independent of the source of information used to build the model, inaccuracies might occur or the application domain might change. Therefore, there is a need to continuously improve the model during its usage. As new data are collected, algorithms that continuously incorporate the updated knowledge can play an essential role in this process. With regard to the continuous learning of the Bayesian network’s structure, the current solutions are based on its structural refinement or adaptation. Recent research aims to reduce complexity and memory usage, making it possible to solve complex, large-scale practical problems. This study aims to identify and evaluate solutions for the continuous learning of Bayesian network structures, as well as to outline related future research directions. Our attention remains on the structure because even accurate parameters are useless if the structure is not representative.
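Score-based structural refinement, one of the solution families the mapping study covers, can be sketched for binary variables with a BIC-style score: each candidate edge change is kept only if it improves the penalized log-likelihood on the data gathered so far. Everything below (the variable names, the single-child restriction, the toy data generator) is an illustrative assumption, not an algorithm taken from the study.

```python
import math
import random

def family_score(data, child, parents):
    """BIC-style score of one node given its parents (binary variables):
    maximum log-likelihood minus a complexity penalty."""
    counts = {}
    for row in data:
        key = (tuple(row[p] for p in parents), row[child])
        counts[key] = counts.get(key, 0) + 1
    totals = {}
    for (pv, _), n in counts.items():
        totals[pv] = totals.get(pv, 0) + n
    ll = sum(n * math.log(n / totals[pv]) for (pv, _), n in counts.items())
    n_params = 2 ** len(parents)  # one free parameter per parent combination
    return ll - 0.5 * math.log(len(data)) * n_params

def refine_parents(data, child, candidates):
    """Greedy refinement: toggle each candidate parent, keeping the change
    only if the penalized score improves."""
    parents = []
    for cand in candidates:
        trial = (parents + [cand]) if cand not in parents \
                else [p for p in parents if p != cand]
        if family_score(data, child, trial) > family_score(data, child, parents):
            parents = trial
    return parents

# Toy stream: B copies A 90% of the time, C is independent noise.
random.seed(0)
data = []
for _ in range(500):
    a = random.random() < 0.5
    b = a if random.random() < 0.9 else not a
    c = random.random() < 0.5
    data.append({"A": a, "B": b, "C": c})

# With this data, the refinement should keep A and reject the noise node C.
parents_of_b = refine_parents(data, "B", ["A", "C"])
```

Continuous-learning approaches differ mainly in how they avoid rescoring from scratch as batches arrive (caching sufficient statistics, restricting the search neighbourhood), but the accept-if-score-improves core is the same.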
Effects of exercise on physical and mental health, and cognitive and brain functions in schizophrenia: clinical and experimental evidence
Exercise promotes several health benefits, such as cardiovascular, musculoskeletal and cardiorespiratory improvements. It is believed that the practice of exercise by individuals with psychiatric disorders, e.g. schizophrenia, can cause significant changes. Schizophrenic patients have more problematic lifestyle habits than the general population, which may contribute to a high mortality rate, mainly from cardiovascular and metabolic diseases. Thus, the aim of this study is to investigate changes in physical and mental health and in cognitive and brain functioning due to the practice of exercise in patients with schizophrenia. Although little is still known about the benefits of exercise on the mental health and on the cognitive and brain functioning of schizophrenic patients, exercise training has been shown to be a beneficial intervention in the control and reduction of disease severity. Type of training, form of execution, duration and intensity need to be better studied, as the effects on physical and mental health, cognition and brain activity depend on interconnected factors, such as the combination of exercise and medication. However, one should understand that exercise is not only an effective non-drug alternative, but also acts as a supporting intervention that can be combined with other treatments to promote improvements. In general, the positive effects on mental health, cognition and brain activity as a result of an exercise program are quite evident. Few studies have been published on the effects of exercise in patients with schizophrenia, but there is increasing evidence that positive and negative symptoms can be improved. Therefore, it is important that further studies be undertaken to expand knowledge of the effects of physical exercise on mental health in people with schizophrenia, as well as its dose-response relationship and the most effective type of exercise.
AI is a viable alternative to high throughput screening: a 318-target study
High throughput screening (HTS) is routinely used to identify bioactive small molecules. This requires physical compounds, which limits coverage of accessible chemical space. Computational approaches combined with vast on-demand chemical libraries can access far greater chemical space, provided that the predictive accuracy is sufficient to identify useful molecules. Through the largest and most diverse virtual HTS campaign reported to date, comprising 318 individual projects, we demonstrate that our AtomNet® convolutional neural network successfully finds novel hits across every major therapeutic area and protein class. We address historical limitations of computational screening by demonstrating success for target proteins without known binders, high-quality X-ray crystal structures, or manual cherry-picking of compounds. We show that the molecules selected by the AtomNet® model are novel drug-like scaffolds rather than minor modifications to known bioactive compounds. Our empirical results suggest that computational methods can substantially replace HTS as the first step of small-molecule drug discovery.
Testing a global standard for quantifying species recovery and assessing conservation impact.
Recognizing the imperative to evaluate species recovery and conservation impact, in 2012 the International Union for Conservation of Nature (IUCN) called for development of a "Green List of Species" (now the IUCN Green Status of Species). A draft Green Status framework for assessing species' progress toward recovery, published in 2018, proposed 2 separate but interlinked components: a standardized method (i.e., measurement against benchmarks of species' viability, functionality, and preimpact distribution) to determine current species recovery status (herein species recovery score) and application of that method to estimate past and potential future impacts of conservation based on 4 metrics (conservation legacy, conservation dependence, conservation gain, and recovery potential). We tested the framework with 181 species representing diverse taxa, life histories, biomes, and IUCN Red List categories (extinction risk). Based on the observed distribution of species' recovery scores, we propose the following species recovery categories: fully recovered, slightly depleted, moderately depleted, largely depleted, critically depleted, extinct in the wild, and indeterminate. Fifty-nine percent of tested species were considered largely or critically depleted. Although there was a negative relationship between extinction risk and species recovery score, variation was considerable. Some species in lower risk categories were assessed as farther from recovery than those at higher risk. This emphasizes that species recovery is conceptually different from extinction risk and reinforces the utility of the IUCN Green Status of Species to more fully understand species conservation status. Although extinction risk did not predict conservation legacy, conservation dependence, or conservation gain, it was positively correlated with recovery potential. 
Only 1.7% of tested species were categorized as zero across all 4 of these conservation impact metrics, indicating that conservation has played, or will play, a role in improving or maintaining species status for the vast majority of these species. Based on our results, we devised an updated assessment framework that introduces the option of using a dynamic baseline to assess future impacts of conservation over the short term, to avoid the misleading results that were generated in a small number of cases, and that redefines the short term as 10 years to better align with conservation planning. These changes are reflected in the IUCN Green Status of Species Standard.
Observation of the B0 → ρ0ρ0 decay from an amplitude analysis of B0 → (π+π−)(π+π−) decays
Proton–proton collision data recorded in 2011 and 2012 by the LHCb experiment, corresponding to an integrated luminosity of 3.0 fb−1, are analysed to search for the charmless B0→ρ0ρ0 decay. More than 600 B0→(π+π−)(π+π−) signal decays are selected and used to perform an amplitude analysis, under the assumption of no CP violation in the decay, from which the B0→ρ0ρ0 decay is observed for the first time with 7.1 standard deviations significance. The fraction of B0→ρ0ρ0 decays yielding a longitudinally polarised final state is measured to be fL = 0.745 +0.048 −0.058 (stat) ± 0.034 (syst). The B0→ρ0ρ0 branching fraction, using the B0→ϕK*(892)0 decay as reference, is also reported as B(B0→ρ0ρ0) = (0.94 ± 0.17 (stat) ± 0.09 (syst) ± 0.06 (BF)) × 10−6.
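When the statistical, systematic and normalisation uncertainties quoted above are treated as independent, the conventional way to quote a single total uncertainty is to add them in quadrature. The snippet below applies that convention to the branching fraction reported in the abstract (the combination is a standard reader-side calculation, not a step performed in the paper):

```python
import math

def combine_in_quadrature(*uncertainties: float) -> float:
    """Total uncertainty for independent error sources: sqrt of the sum of squares."""
    return math.sqrt(sum(u * u for u in uncertainties))

# B(B0 -> rho0 rho0) = (0.94 ± 0.17 (stat) ± 0.09 (syst) ± 0.06 (BF)) x 10^-6
total = combine_in_quadrature(0.17, 0.09, 0.06)  # ~0.20, i.e. (0.94 ± 0.20) x 10^-6
```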
Study of the rare Bs0 and B0 decays into the π+π−μ+μ− final state
A search for the rare decays Bs0→π+π−μ+μ− and B0→π+π−μ+μ− is performed in a data set corresponding to an integrated luminosity of 3.0 fb−1 collected by the LHCb detector in proton–proton collisions at centre-of-mass energies of 7 and 8 TeV. Decay candidates with pion pairs that have invariant mass in the range 0.5–1.3 GeV/c2 and with muon pairs that do not originate from a resonance are considered. The first observation of the decay Bs0→π+π−μ+μ− and the first evidence of the decay B0→π+π−μ+μ− are obtained and the branching fractions, restricted to the dipion-mass range considered, are measured to be B(Bs0→π+π−μ+μ−) = (8.6 ± 1.5 (stat) ± 0.7 (syst) ± 0.7 (norm)) × 10−8 and B(B0→π+π−μ+μ−) = (2.11 ± 0.51 (stat) ± 0.15 (syst) ± 0.16 (norm)) × 10−8, where the third uncertainty is due to the branching fraction of the decay B0→J/ψ(→μ+μ−)K*(892)0(→K+π−), used as a normalisation.
Angular analysis of the B0 → K*0 e+e− decay in the low-q2 region
An angular analysis of the B0→K*0e+e− decay is performed using a data sample, corresponding to an integrated luminosity of 3.0 fb−1, collected by the LHCb experiment in pp collisions at centre-of-mass energies of 7 and 8 TeV during 2011 and 2012. For the first time several observables are measured in the dielectron mass squared (q2) interval between 0.002 and 1.120 GeV2/c4. The angular observables FL and AT^Re, which are related to the K*0 polarisation and to the lepton forward-backward asymmetry, are measured to be FL = 0.16 ± 0.06 ± 0.03 and AT^Re = 0.10 ± 0.18 ± 0.05, where the first uncertainty is statistical and the second systematic. The angular observables AT^(2) and AT^Im, which are sensitive to the photon polarisation in this q2 range, are found to be AT^(2) = −0.23 ± 0.23 ± 0.05 and AT^Im = 0.14 ± 0.22 ± 0.05. The results are consistent with Standard Model predictions.
Measurement of the Z + b-jet cross-section in pp collisions at √s = 7 TeV in the forward region
The associated production of a Z boson or an off-shell photon with a bottom quark in the forward region is studied using proton–proton collisions at a centre-of-mass energy of 7 TeV. The Z bosons are reconstructed in the μ+μ− final state from muons with transverse momentum above a threshold, while two transverse momentum thresholds are considered for jets. Both muons and jets are reconstructed in the forward pseudorapidity range, and the cross-section is measured to be σ(Z/γ*(μ+μ−) + b-jet) = 167 ± 47 (stat) ± 29 (syst) ± 6 (lumi) fb for pT(jet) …