8,706 research outputs found

    Experimental Study Using Functional Size Measurement in Building Estimation Models for Software Project Size

    Get PDF
    This paper reports on an experiment that investigates the predictability of software project size from software product size. The predictability research problem is analyzed at the early requirements stage by accounting for the size of both functional and non-functional requirements. The experiment was carried out with 55 graduate students in Computer Science from Concordia University in Canada. In the experiment, a functional size measure and a project size measure were used in building estimation models for sets of web application development projects. The results show that project size is predictable from product size. Further replications of the experiment are, however, planned to obtain more results to confirm or disconfirm our claim
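
    To make the modeling step concrete, here is a minimal sketch of one way such an estimation model could be fit: an ordinary least-squares line relating functional size to project size. The data and variable names are invented for illustration; the paper's actual measures and dataset are not reproduced.

```python
# Minimal sketch of a product-size -> project-size estimation model.
# Illustrative data only; not the paper's measures or dataset.
import numpy as np

# Hypothetical measurements: functional size (e.g., in function points)
# and observed project size (e.g., person-hours) for a set of web projects.
functional_size = np.array([120, 85, 200, 150, 95], dtype=float)
project_size = np.array([480, 310, 790, 600, 365], dtype=float)

# Fit a simple linear model: project_size = a * functional_size + b.
a, b = np.polyfit(functional_size, project_size, deg=1)

def predict_project_size(fs: float) -> float:
    """Predict project size from functional size using the fitted line."""
    return a * fs + b

print(predict_project_size(130))  # estimate for a new 130-point project
```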

    PeSOTIF: a Challenging Visual Dataset for Perception SOTIF Problems in Long-tail Traffic Scenarios

    Full text link
    Perception algorithms in autonomous driving systems confront great challenges in long-tail traffic scenarios, where problems of Safety of the Intended Functionality (SOTIF) can be triggered by performance insufficiencies of the algorithms and by the dynamic operational environment. However, such scenarios are not systematically included in current open-source datasets, and this paper fills that gap. Based on an analysis and enumeration of trigger conditions, a high-quality, diverse dataset is released, comprising long-tail traffic scenarios collected from multiple sources. With the development of probabilistic object detection (POD) in mind, the dataset marks the trigger sources that may cause perception SOTIF problems in each scenario as key objects. In addition, an evaluation protocol is proposed to verify the effectiveness of POD algorithms in identifying the key objects via uncertainty. The dataset continues to expand, and the first batch of open-source data includes 1126 frames with an average of 2.27 key objects and 2.47 normal objects per frame. To demonstrate how the dataset can be used for SOTIF research, the paper further quantifies a perception SOTIF entropy to determine whether a scenario is unknown and unsafe for a perception system. Experimental results show that the quantified entropy effectively and efficiently reflects failures of the perception algorithm.
    Comment: 7 pages, 5 figures, 4 tables, submitted to 2023 ICR
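
    As a rough illustration of the final step, the sketch below scores a scenario by the mean Shannon entropy of a probabilistic detector's class posteriors on its key objects. The paper's exact SOTIF-entropy definition is not reproduced here, so the formula, the thresholding idea, and all names should be read as assumptions.

```python
# Hypothetical illustration: score a scenario by the Shannon entropy of a
# probabilistic detector's class posteriors on its key objects.
# This is NOT the paper's exact SOTIF-entropy definition.
import math

def shannon_entropy(probs):
    """Entropy (nats) of one detection's class probability vector."""
    return -sum(p * math.log(p) for p in probs if p > 0.0)

def scenario_entropy(detections):
    """Average per-detection entropy; higher = less certain perception."""
    if not detections:
        return 0.0
    return sum(shannon_entropy(d) for d in detections) / len(detections)

# Two key objects: one confident, one ambiguous (long-tail trigger source).
dets = [[0.95, 0.03, 0.02], [0.40, 0.35, 0.25]]
score = scenario_entropy(dets)
print(f"scenario entropy: {score:.3f}")  # flag as unknown-unsafe above a threshold
```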

    Variable-rate pumping tests for radially symmetric nonuniform aquifers

    Get PDF
    This is the published version. Copyright American Geophysical Union.
    Conventional pumping test methodology is of limited effectiveness for defining the spatial distribution of aquifer properties because of the nonuniqueness of the parameter estimates. Sensitivity analysis can be used to develop a pumping test procedure that significantly decreases the uncertainty associated with the estimated parameters. This approach employs systematic variations in pumpage rates to achieve reductions in parameter uncertainty. These reductions are obtained by increasing the sensitivity of drawdown to flow properties while simultaneously constraining the growth in the correlation between the effects of different flow properties on observation well drawdown. Numerical experiments demonstrate the importance of the magnitude and frequency of the rate variations, the spatial and temporal pattern of data collection, as well as the dependence of the technique on the total duration of the pumping test. Significant decreases in parameter uncertainty can be expected in any flow system in which the primary component of flow is in the radial direction. This study demonstrates that sensitivity analysis can be an important tool in the development of methodology for the characterization of subsurface properties
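
    For intuition about how rate variations enter the drawdown signal, here is a minimal sketch of drawdown under a step-varying pumping schedule, using superposition of Theis solutions for a uniform confined aquifer. The paper's nonuniform radially symmetric setting requires a numerical model; this analytic special case is only illustrative, and all parameter values are invented.

```python
# Sketch: drawdown at an observation well for a step-varying pumping rate,
# via superposition of Theis solutions in a uniform confined aquifer.
import numpy as np
from scipy.special import exp1  # Theis well function W(u) = E1(u)

T, S, r = 5e-3, 1e-4, 30.0  # transmissivity (m^2/s), storativity, radius (m)

def drawdown(t, steps):
    """steps: list of (start_time_s, pumping_rate_m3_per_s). Returns s(t) in m."""
    s = 0.0
    q_prev = 0.0
    for t0, q in steps:
        if t <= t0:
            break
        u = r**2 * S / (4.0 * T * (t - t0))
        s += (q - q_prev) / (4.0 * np.pi * T) * exp1(u)  # each rate change adds a Theis term
        q_prev = q
    return s

# Systematic rate variation: pump at 0.01 m^3/s, then step up to 0.02 after 1 h.
schedule = [(0.0, 0.01), (3600.0, 0.02)]
print(drawdown(7200.0, schedule))
```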

    Highly Irregular Functional Generalized Linear Regression with Electronic Health Records

    Full text link
    This work presents a new approach, called MISFIT, for fitting generalized functional linear regression models with sparsely and irregularly sampled data. Current methods do not allow for consistent estimation unless one assumes that the number of observed points per curve grows sufficiently quickly with the sample size. In contrast, MISFIT is based on a multiple imputation framework, which has the potential to produce consistent estimates without such an assumption. Just as importantly, it propagates the uncertainty of not having completely observed curves, allowing for a more accurate assessment of the uncertainty of parameter estimates, something that most current methods cannot accomplish. This work is motivated by a longitudinal study on macrocephaly, or atypically large head size, in which electronic medical records allow for the collection of a great deal of data. However, the sampling is highly variable from child to child. Using MISFIT we are able to clearly demonstrate that the development of pathologic conditions related to macrocephaly is associated with both the children's overall head circumference and the velocity of their head growth.
    Comment: 5 figures, 17 tables (including supplementary material), 34 pages (including supplementary material)
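
    A minimal sketch of the pooling step that any multiple-imputation approach of this kind relies on (Rubin's rules) is shown below. Whether MISFIT pools exactly this way is an assumption; the imputation model itself is paper-specific and omitted.

```python
# Sketch of the multiple-imputation pooling step (Rubin's rules) that a
# MISFIT-style procedure would use after imputing each sparsely observed
# curve several times and refitting the regression on each completed set.
import numpy as np

def pool_estimates(betas, variances):
    """betas: (M, p) coefficient estimates from M imputed datasets;
    variances: (M, p) their estimated sampling variances."""
    betas, variances = np.asarray(betas), np.asarray(variances)
    m = betas.shape[0]
    beta_bar = betas.mean(axis=0)          # pooled point estimate
    within = variances.mean(axis=0)        # average within-imputation variance
    between = betas.var(axis=0, ddof=1)    # between-imputation variance
    total = within + (1 + 1 / m) * between # extra term: incomplete-curve uncertainty
    return beta_bar, total

b, v = pool_estimates([[1.1], [0.9], [1.0]], [[0.04], [0.05], [0.04]])
print(b, v)
```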

    Quality assurance of CT scanning for industrial applications

    Get PDF

    Concept of operations for ATM by managing uncertainty through multiple metering points

    Full text link
    This paper presents an operational concept for Air Traffic Management, and in particular arrival management, in which aircraft are permitted to operate in a manner consistent with current optimal aircraft operating techniques. The proposed concept allows aircraft to descend in the fuel-efficient path-managed mode, with arrival time not actively controlled. It is demonstrated how the associated uncertainty in the time dimension of the trajectory can be managed by applying multiple metering points strategically chosen along the trajectory. The concept makes no assumptions about aircraft equipage (e.g. time-of-arrival control), but aims at handling the mixed-equipage scenarios that will most likely persist well into the next decade and arguably beyond
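
    The following toy simulation illustrates the underlying idea, assuming arrival-time uncertainty accumulates along an uncontrolled descent and is largely absorbed at each metering point. The growth model and all numbers are invented for illustration and are not from the paper.

```python
# Toy illustration of the concept: arrival-time uncertainty grows along an
# uncontrolled trajectory, and each metering point where the time error is
# absorbed (e.g. by small speed adjustments) resets it. Numbers are invented.
import math

GROWTH_PER_100NM = 10.0  # std-dev growth of time error, seconds per 100 NM (assumed)
RESIDUAL_SIGMA = 2.0     # residual error after control at a metering point (assumed)

def time_uncertainty(route_nm, metering_points_nm):
    """Std dev (s) of arrival-time error at the runway, with resets at meter fixes."""
    fixes = sorted(metering_points_nm) + [route_nm]
    sigma, pos = 0.0, 0.0
    for fix in fixes:
        # independent growth over the leg combines in quadrature
        sigma = math.hypot(sigma, GROWTH_PER_100NM * (fix - pos) / 100.0)
        pos = fix
        if fix < route_nm:
            sigma = RESIDUAL_SIGMA
    return sigma

print(time_uncertainty(400, []))          # no metering: large error at the runway
print(time_uncertainty(400, [150, 300]))  # two metering points: much smaller
```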

    Data-Driven Energy Storage Scheduling to Minimise Peak Demand on Distribution Systems with PV Generation

    Get PDF
    The growing adoption of decentralised renewable energy generation (such as solar photovoltaic panels and wind turbines) and low-carbon technologies will increase the strain on distribution networks in the near future. In such a scenario, energy storage is becoming a key alternative to traditional, expensive reinforcements of network infrastructure, owing to its flexibility, decreasing costs and fast deployment capabilities. In this work, an end-to-end data-driven solution is presented for optimally designing the control of a battery unit with the aim of reducing peak electricity demand. The proposed solution uses state-of-the-art machine learning methods for forecasting electricity demand and PV generation, combined with an optimisation strategy to maximise the use of photovoltaic energy to charge the energy storage unit. To this end, historical demand, weather, and solar energy generation data collected at the Stentaway Primary substation near Plymouth, UK, and at six other locations were employed
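
    A minimal sketch of the scheduling stage is given below, assuming demand and PV forecasts are already available (the machine learning forecasting step is not shown) and using a convex program that minimises peak grid import. All parameters are illustrative and are not the Stentaway case-study configuration.

```python
# Minimal sketch of the peak-shaving optimisation stage given forecasts.
# Illustrative parameters only; not the paper's case-study configuration.
import cvxpy as cp
import numpy as np

np.random.seed(0)
T = 48                                   # half-hourly periods in one day
demand = 2.0 + np.random.rand(T)         # forecast substation demand (MW)
pv = np.clip(np.sin(np.linspace(0, np.pi, T)), 0, None)  # forecast PV (MW)

charge = cp.Variable(T, nonneg=True)     # battery charging power (MW)
discharge = cp.Variable(T, nonneg=True)  # battery discharging power (MW)
soc = cp.Variable(T + 1)                 # state of charge (MWh)

net = demand - pv + charge - discharge   # power drawn from the grid
constraints = [
    soc[0] == 1.0, soc[-1] >= 1.0,       # start/end energy levels
    soc[1:] == soc[:-1] + 0.5 * (0.95 * charge - discharge / 0.95),
    soc >= 0, soc <= 2.0,                # capacity limits (MWh)
    charge <= 0.5, discharge <= 0.5,     # power limits (MW)
]
prob = cp.Problem(cp.Minimize(cp.max(net)), constraints)  # minimise peak import
prob.solve()
print(f"peak grid demand: {prob.value:.2f} MW")
```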

    Estimating Agile Software Project Effort: An Empirical Study

    Get PDF
    This paper describes an empirical study of effort estimation in agile software development. Estimated and actual effort for a 46-iteration project were collected and analyzed. The results show that estimation in agile development is more accurate than in traditional development, even though agile developers still underestimate effort. However, contrary to the expectations of the agile community, estimation accuracy does not improve over time
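
    For reference, the sketch below computes the magnitude of relative error (MRE) per iteration and its mean (MMRE), the accuracy metrics such studies conventionally report. Whether this paper uses exactly these metrics is an assumption, and the data are invented.

```python
# Sketch of standard estimation-accuracy metrics: per-iteration magnitude of
# relative error (MRE) and its mean (MMRE). Data are invented for illustration.
def mre(actual: float, estimated: float) -> float:
    """Relative estimation error for one iteration."""
    return abs(actual - estimated) / actual

iterations = [(40, 35), (38, 30), (42, 39)]  # (actual, estimated) hours
errors = [mre(a, e) for a, e in iterations]
mmre = sum(errors) / len(errors)
print(f"MMRE = {mmre:.2%}")  # estimates below actuals indicate underestimation
```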

    Parameter Estimation Methods for Comprehensive Pyrolysis Modeling

    Get PDF
    This dissertation documents a study on parameter estimation methods for comprehensive pyrolysis modeling. There are four parts to this work: (1) evaluating the effects of applying different kinetic models to pyrolysis modeling of fiberglass reinforced polymer composites; (2) evaluating pyrolysis parameters for fiberglass reinforced polymer composites based on multi-objective optimization; (3) guidance and critical observations for parameter estimation in comprehensive pyrolysis modeling; and (4) an engineering guide for estimating material pyrolysis properties for fire modeling.

    In the first part (Section 1), evaluation work is conducted to determine the effects of applying different kinetic models (KMs), developed from thermal analysis of TGA data, when used in typical 1D pyrolysis models of fiberglass reinforced polymer (FRP) composites. The study shows that increasing the complexity of KMs used in pyrolysis modeling is unnecessary for the FRP samples investigated. Additionally, the findings indicate that the basic assumption that the thermal decomposition of each computational cell in a comprehensive pyrolysis model is equivalent to that in a TGA experiment becomes inapplicable at depth and at higher heating rates.

    The second part (Section 2) reports the results of a study investigating the ability of global, multi-objective, multi-variable optimization methods to estimate material parameters for comprehensive pyrolysis models. The research materials are two fiberglass reinforced polymer (FRP) composites that share the same fiberglass mats but use two different resin systems: one composed of a single component, the other of two components (resin and fire-retardant additive). The results show that, for a well-configured parameter estimation exercise using the optimization method described above, (1) estimated results are generally within ±100% of the measurements; (2) increasing the complexity of the kinetic model for a single-component system has an insignificant effect on estimated values; (3) increasing the complexity of the kinetic model for a multi-component system whose components have different thermal characteristics has a positive effect on estimated values; and (4) parameter estimation using an optimization method with an appropriate level of complexity in the kinetic model and optimization targets can produce estimates that serve as effective material property values.

    The third part (Section 3) proposes a process for conducting parameter estimation for comprehensive pyrolysis models. The work describes the underlying concepts behind the proposed process and discusses its limitations. Example parameter estimation cases illustrate the application of the process to four materials: a thermoplastic (PMMA), corrugated cardboard, fiberglass reinforced polymer composites, and plywood.

    In the last part (Section 4), the Guide itself, a standardized procedure for obtaining material parameters for input into a wide range of pyrolysis models, is presented. It is a step-by-step process that provides a brief description of modeling approaches and assumptions; a typical mathematical formulation identifying the model parameters in the equations; and methods of estimating those parameters either by independent measurements or by optimization paired with the model. The Guide includes example cases showing how the process can be applied to different types of real-world materials
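
    To make the notion of a kinetic model concrete, here is a minimal sketch of the simplest case referred to above: a single first-order Arrhenius reaction integrated over a constant-heating-rate TGA run. The parameter values are illustrative, not estimates from this dissertation.

```python
# Sketch of the simplest kinetic model (KM): a single first-order Arrhenius
# reaction integrated over a constant-heating-rate TGA run.
# Parameter values are illustrative only.
import numpy as np
from scipy.integrate import solve_ivp

A, E, R = 1e12, 1.6e5, 8.314   # pre-exponential (1/s), activation energy (J/mol), gas constant
beta = 10.0 / 60.0             # heating rate: 10 K/min expressed in K/s
T0 = 300.0                     # initial temperature (K)

def dalpha_dt(t, alpha):
    """First-order conversion rate: d(alpha)/dt = A * exp(-E/RT) * (1 - alpha)."""
    T = T0 + beta * t
    return A * np.exp(-E / (R * T)) * (1.0 - alpha)

sol = solve_ivp(dalpha_dt, (0.0, 3600.0), [0.0], max_step=5.0)
mass_fraction = 1.0 - sol.y[0]  # normalised residual mass vs. time
print(mass_fraction[-1])        # near zero once decomposition completes
```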

    Long-term Informative Path Planning with Autonomous Soaring

    Get PDF
    The ability of UAVs to cover large areas efficiently is valuable for information gathering missions. For long-term information gathering, a UAV may extend its endurance by accessing energy sources present in the atmosphere. Thermals are a favourable source of wind energy, and thermal soaring is adopted in this thesis to enable long-term information gathering. This thesis proposes energy-constrained path planning algorithms for a gliding UAV to maximise information gain given a mission time that greatly exceeds the UAV's endurance. It is motivated by the problem of probabilistic target search performed by an energy-constrained UAV tasked to simultaneously search for a lost ground target and explore for thermals to regain energy. This problem is termed informative soaring (IFS) and combines informative path planning (IPP) with energy constraints. IFS is shown to be NP-hard by showing that it has a problem structure similar to the weight-constrained shortest path problem with replenishments. While an optimal solution may not be computable in polynomial time, this thesis proposes path planning algorithms based on informed tree search that find high-quality plans at low computational cost. This thesis addresses complex probabilistic belief maps, and three primary contributions are presented:
    • First, IFS is formulated as a graph search problem by observing that any feasible long-term plan must alternate between 1) gathering information between thermals and 2) replenishing energy within thermals. This is a first step to reducing the large search state space.
    • The second contribution is the observation that a complex belief map can be viewed as a collection of information clusters, together with a divide-and-conquer approach, cluster tree search (CTS), that efficiently finds high-quality plans in the large search state space. In CTS, near-greedy tree search is used to find locally optimal plans, and two global planning versions are proposed to combine local plans into a full plan. Monte Carlo simulation studies show that CTS produces plans similar to those from variations of exhaustive search but runs five to twenty times faster. The more computationally efficient version, CTSDP, uses dynamic programming (DP) to optimally combine local plans; a toy sketch of this combination step is given after this list. CTSDP is executed in real time on board a UAV to demonstrate computational feasibility.
    • The third contribution is an extension of CTS to unknown drifting thermals. A thermal exploration map is created to detect new thermals that will eventually intercept clusters, and therefore be valuable to the mission. Time windows are computed for known thermals, and an optimal cluster visit schedule is formed. A tree search algorithm called CTSDrift combines CTS and thermal exploration. Using 2400 Monte Carlo simulations, CTSDrift is evaluated against a Full Knowledge method, which has full knowledge of the thermal field, and a Greedy method. On average, CTSDrift outperforms Greedy in one-third of trials and achieves performance similar to Full Knowledge when environmental conditions are favourable
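
    As flagged in the second contribution above, here is a toy sketch of the plan-combination idea: given candidate local plans per information cluster, each with a time cost and an information gain, a knapsack-style dynamic program selects at most one plan per cluster to maximise total gain within the mission time. The thesis's actual CTSDP formulation also accounts for energy and thermal replenishment, which are omitted here.

```python
# Toy sketch of combining locally optimal plans with dynamic programming:
# pick at most one candidate plan per cluster to maximise information gain
# within a mission time budget (knapsack-style DP). Data are invented.
def combine_local_plans(clusters, budget):
    """clusters: list of [(time_cost, info_gain), ...] options per cluster.
    Returns the best total information gain achievable within the budget."""
    best = [0.0] * (budget + 1)         # best[t] = max gain using t time units
    for options in clusters:
        nxt = best[:]                   # default: skip this cluster entirely
        for cost, gain in options:
            for t in range(cost, budget + 1):
                # take this option on top of the best plan for earlier clusters
                nxt[t] = max(nxt[t], best[t - cost] + gain)
        best = nxt
    return best[budget]

plans = [[(3, 5.0), (5, 7.5)], [(4, 6.0)], [(2, 2.5), (6, 9.0)]]
print(combine_local_plans(plans, budget=10))  # -> 14.0
```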