Endless Data
Small and Medium Enterprises (SMEs), as well as micro teams, face an uphill
task when delivering software to the Cloud. While rapid release methods
such as Continuous Delivery can speed up the delivery cycle, software quality,
application uptime and information management remain key concerns. This
work looks at four aspects of software delivery: crowdsourced testing, Cloud
outage modelling, collaborative chat discourse modelling, and collaborative
chat discourse segmentation. For each aspect, we consider business related
questions around how to improve software quality and gain deeper insights
into collaborative data while respecting the rapid release paradigm.
Index to 1984 NASA Tech Briefs, volume 9, numbers 1-4
Short announcements of new technology derived from the R&D activities of NASA are presented. These briefs emphasize information considered likely to be transferable across industrial, regional, or disciplinary lines and are issued to encourage commercial application. This index for the 1984 Tech Briefs contains abstracts and four indexes: subject, personal author, originating center, and Tech Brief number. The following areas are covered: electronic components and circuits, electronic systems, physical sciences, materials, life sciences, mechanics, machinery, fabrication technology, and mathematics and information sciences.
Ribosome recycling induces optimal translation rate at low ribosomal availability
Funding statement: The authors thank BBSRC (BB/F00513/X1, BB/I020926/1 and DTG) and SULSA for funding. Acknowledgement: The authors thank R. Allen, L. Ciandrini, B. Gorgoni and P. Greulich for very helpful discussions and careful reading of the manuscript.
Stationary uphill currents in locally perturbed Zero Range Processes
Uphill currents are observed when mass diffuses in the direction of the
density gradient. We study this phenomenon in stationary conditions in the
framework of locally perturbed 1D Zero Range Processes (ZRP). We show that the
onset of currents flowing from the reservoir with smaller density to the one
with larger density can be caused by a local asymmetry in the hopping rates on
a single site at the center of the lattice. For fixed injection rates at the
boundaries, we prove that a suitable tuning of the asymmetry in the bulk may
induce uphill diffusion at arbitrarily large, finite volumes. We also deduce
heuristically the hydrodynamic behavior of the model and connect the local
asymmetry characterizing the ZRP dynamics to a matching condition relevant for
the macroscopic problem.
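The mechanism can be illustrated with a small Monte Carlo sketch. This is an illustrative toy, not the model analysed in the paper: here any occupied site fires at unit rate, and the boundary injection rates and the bias value at the defect site are arbitrary choices.

```python
import random

def simulate_zrp(L=21, steps=200_000, alpha=0.3, beta=0.7, p_defect=0.9, seed=1):
    """Toy 1D Zero Range Process with open boundaries and one asymmetric
    ('defect') site at the center of the lattice.

    - Bulk sites: a particle leaves an occupied site symmetrically
      (probability 1/2 each way); at the center site hops are biased to
      the right with probability p_defect.
    - The left reservoir injects at rate alpha, the right at rate beta
      (beta > alpha means the right reservoir feeds more strongly).
    Returns the net rightward crossings of the central bond per step.
    """
    rng = random.Random(seed)
    n = [0] * L              # occupation numbers
    mid = L // 2
    crossings = 0            # signed crossings of the bond mid -> mid+1
    for _ in range(steps):
        site = rng.randrange(-1, L + 1)   # -1 and L act as the reservoirs
        if site == -1:                     # injection from the left
            if rng.random() < alpha:
                n[0] += 1
        elif site == L:                    # injection from the right
            if rng.random() < beta:
                n[L - 1] += 1
        elif n[site] > 0:                  # an occupied bulk site fires
            bias = p_defect if site == mid else 0.5
            step = 1 if rng.random() < bias else -1
            n[site] -= 1
            if site == mid and step == 1:
                crossings += 1
            elif site == mid + 1 and step == -1:
                crossings -= 1
            target = site + step
            if 0 <= target < L:
                n[target] += 1             # otherwise the particle exits
    return crossings / steps

cur = simulate_zrp()
print(f"net rightward current across the defect bond: {cur:+.4f}")
```

A positive value here means mass flows toward the more strongly fed side, i.e. against the density gradient set by the reservoirs.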
Wavelet-based filtration procedure for denoising the predicted CO2 waveforms in smart home within the Internet of Things
The operating cost of smart homes can be minimized by optimizing the management of the building's technical functions based on the current occupancy status of the individual monitored spaces. To respect the privacy of smart home residents, indirect methods (without cameras or microphones) can be used for occupancy recognition. This article describes a newly proposed indirect method to increase the accuracy of occupancy recognition in the monitored spaces of smart homes. The proposed procedure predicts the course of CO2 concentration from operationally measured quantities (indoor temperature and indoor relative humidity) using artificial neural networks with a multilayer perceptron algorithm. The mathematical wavelet transformation method is used to cancel additive noise from the predicted CO2 concentration signal, with the objective of increasing the prediction accuracy. The calculated accuracy of CO2 concentration waveform prediction with additive noise canceling applied was higher than 98% in selected experiments.
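The wavelet filtration step can be sketched as follows: a minimal NumPy implementation assuming a Haar wavelet with soft universal thresholding applied to a synthetic CO2-like waveform. The article's actual mother wavelet, decomposition depth, and thresholding rule may differ.

```python
import numpy as np

def haar_denoise(signal, levels=3):
    """Denoise a 1-D signal by orthonormal Haar wavelet decomposition and
    soft thresholding of the detail coefficients (universal threshold).
    Signal length is truncated to a multiple of 2**levels for simplicity."""
    n = (len(signal) // 2**levels) * 2**levels
    a = np.asarray(signal[:n], dtype=float)
    details = []
    for _ in range(levels):
        approx = (a[0::2] + a[1::2]) / np.sqrt(2)   # low-pass branch
        detail = (a[0::2] - a[1::2]) / np.sqrt(2)   # high-pass branch
        details.append(detail)
        a = approx
    # universal threshold sigma * sqrt(2 log n), sigma from finest details
    sigma = np.median(np.abs(details[0])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(n))
    details = [np.sign(d) * np.maximum(np.abs(d) - thr, 0) for d in details]
    for detail in reversed(details):                # inverse transform
        out = np.empty(2 * len(a))
        out[0::2] = (a + detail) / np.sqrt(2)
        out[1::2] = (a - detail) / np.sqrt(2)
        a = out
    return a

# synthetic CO2-like waveform: slow occupancy-driven swing plus sensor noise
t = np.linspace(0, 1, 512)
clean = 600 + 300 * np.sin(2 * np.pi * 2 * t)       # ppm-scale oscillation
noisy = clean + np.random.default_rng(0).normal(0, 25, t.size)
denoised = haar_denoise(noisy)
print("RMSE noisy:   ", np.sqrt(np.mean((noisy - clean) ** 2)))
print("RMSE denoised:", np.sqrt(np.mean((denoised - clean) ** 2)))
```

Because the Haar transform is orthonormal, thresholding the detail coefficients removes broadband noise while the smooth CO2 trend survives in the approximation coefficients.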
Parallelized Integrated Time-Correlated Photon Counting System for High Photon Counting Rate Applications
Time-correlated single-photon counting (TCSPC) applications usually deal with a high counting rate, which leads to a decrease in system efficiency. This problem is further complicated by the random nature of photon arrivals, which makes it harder to avoid counting loss while the system is busy dealing with previous arrivals. In order to increase the rate of detected photons and improve the signal quality, many parallelized structures and imaging arrays have been reported, but this trend leads to an increased data bottleneck requiring complex readout circuitry and very high output frequencies. In this paper, we present simple solutions that improve the signal-to-noise ratio (SNR) and mitigate counting loss through a parallelized TCSPC architecture and the use of an embedded memory block. These solutions are presented, and their impact is demonstrated by means of behavioral and mathematical modeling, potentially allowing a maximum signal-to-noise ratio improvement of 20 dB and a system efficiency as high as 90% without the need for extremely high readout frequencies.
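The counting-loss problem and the benefit of parallelization can be illustrated with a small Monte Carlo sketch. This is an idealized model with hypothetical rate and dead-time values and a perfect router that sends each photon to any free channel; real parallel front-ends add losses not modelled here.

```python
import random

def tcspc_efficiency(rate, dead_time, channels=1, t_total=1_000.0, seed=0):
    """Monte Carlo estimate of TCSPC detection efficiency.

    Photons arrive as a Poisson process with the given rate; a channel that
    registers a photon is 'dead' for dead_time (nonparalyzable dead time).
    With several channels, an arriving photon is routed to any channel that
    is currently free. Returns detected / arrived.
    """
    rng = random.Random(seed)
    free_at = [0.0] * channels      # time at which each channel becomes free
    t, arrived, detected = 0.0, 0, 0
    while t < t_total:
        t += rng.expovariate(rate)              # next Poisson arrival
        arrived += 1
        for i in range(channels):
            if free_at[i] <= t:                 # channel i is idle: count it
                free_at[i] = t + dead_time
                detected += 1
                break
    return detected / arrived

# hypothetical operating point: rate * dead_time = 1, i.e. heavy pile-up
for ch in (1, 4, 16):
    eff = tcspc_efficiency(rate=10.0, dead_time=0.1, channels=ch)
    print(f"{ch:2d} channel(s): efficiency = {eff:.2f}")
```

For a single nonparalyzable channel the expected efficiency is 1/(1 + rate * dead_time), i.e. 50% at this load; spreading the same photon stream over idle channels recovers most of the lost counts.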
Models, Techniques, and Metrics for Managing Risk in Software Engineering
The field of Software Engineering (SE) is the study of systematic and quantifiable approaches to software development, operation, and maintenance. This thesis presents a set of scalable and easily implemented techniques for quantifying and mitigating risks associated with the SE process. The thesis comprises six papers corresponding to SE knowledge areas such as software requirements, testing, and management. The techniques for risk management are drawn from stochastic modeling and operational research.
The first two papers relate to software testing and maintenance. The first paper describes and validates a novel iterative-unfolding technique for filtering a set of execution traces relevant to a specific task. The second paper analyzes and validates the applicability of some entropy measures to the trace classification described in the previous paper. The techniques in these two papers can speed up problem determination of defects encountered by customers, leading to improved organizational response and thus increased customer satisfaction and to easing of resource constraints.
The third and fourth papers are applicable to maintenance, overall software quality and SE management. The third paper uses Extreme Value Theory and Queuing Theory tools to derive and validate metrics based on defect rediscovery data. The metrics can aid the allocation of resources to service and maintenance teams, highlight gaps in quality assurance processes, and help assess the risk of using a given software product. The fourth paper characterizes and validates a technique for automatic selection and prioritization of a minimal set of customers for profiling. The minimal set is obtained using Binary Integer Programming and prioritized using a greedy heuristic. Profiling the resulting customer set enhances comprehension of user behaviour, leading to improved test specifications and clearer quality assurance policies, hence reducing the risks associated with unsatisfactory product quality.
The fifth and sixth papers pertain to software requirements. The fifth paper both models the relation between requirements and their underlying assumptions and measures the risk associated with failure of those assumptions using Boolean networks and stochastic modeling. The sixth paper models the risk associated with injection of requirements late in the development cycle with the help of stochastic processes.
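The customer-selection idea in the fourth paper can be sketched with hypothetical data. The thesis obtains the minimal set exactly via Binary Integer Programming; the classic greedy set-cover approximation below is a stand-in, and its pick order doubles as a prioritization (earlier picks cover more of what is still missing).

```python
def greedy_cover(profiles, features):
    """Pick customers until their combined usage profiles cover every
    feature of interest; returns the customers in pick order."""
    remaining = set(features)
    order = []
    while remaining:
        # customer whose profile covers the most still-uncovered features
        best = max(profiles, key=lambda c: len(profiles[c] & remaining))
        gained = profiles[best] & remaining
        if not gained:
            raise ValueError(f"uncoverable features: {sorted(remaining)}")
        order.append(best)
        remaining -= gained
    return order

# hypothetical usage profiles: customer -> set of product features exercised
profiles = {
    "acme": {"import", "export", "sync"},
    "globex": {"sync", "report"},
    "initech": {"import", "report", "audit"},
    "umbrella": {"audit"},
}
order = greedy_cover(profiles, {"import", "export", "sync", "report", "audit"})
print("profile these customers, in this order:", order)
```

The greedy heuristic guarantees only a logarithmic approximation of the optimum, which is why an exact Binary Integer Programming formulation is preferable when the customer population is small enough to solve exactly.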
Impact of stakeholder type and collaboration on issue resolution time in OSS Projects
Born as a movement of collective contribution by volunteer developers, open source software (OSS) attracts increasing involvement from commercial firms. Many OSS projects are composed of a mixed group of firm-paid and volunteer developers, with different motivations, collaboration practices, and working styles. As OSS is collaborative work in nature, it is important to know whether these differences have an impact on project outcomes. In this paper, we empirically investigate firm-paid participation in resolving OSS evolution issues, stakeholder collaboration, and its impact on OSS issue resolution time. The results suggest that although a firm-paid developer resolves many more issues than a volunteer developer does, there is no difference in issue resolution time between firm-paid and volunteer developers. Moreover, the more important factor influencing issue resolution time is the collaboration among stakeholders rather than measures of individual characteristics.
In-situ crack and keyhole pore detection in laser directed energy deposition through acoustic signal and deep learning
Cracks and keyhole pores are detrimental defects in alloys produced by laser
directed energy deposition (LDED). Laser-material interaction sound may hold
information about underlying complex physical events such as crack propagation
and pore formation. However, due to the noisy environment and intricate signal
content, acoustic-based monitoring in LDED has received little attention. This
paper proposes a novel acoustic-based in-situ defect detection strategy in
LDED. The key contribution of this study is to develop an in-situ acoustic
signal denoising, feature extraction, and sound classification pipeline that
incorporates convolutional neural networks (CNN) for online defect prediction.
Microscope images are used to identify locations of the cracks and keyhole
pores within a part. The defect locations are spatiotemporally registered with
the acoustic signal. Various acoustic features corresponding to defect-free
regions, cracks, and keyhole pores are extracted and analysed in time-domain,
frequency-domain, and time-frequency representations. The CNN model is trained
to predict defect occurrences using the Mel-Frequency Cepstral Coefficients
(MFCCs) of the laser-material interaction sound. The CNN model is compared to
various classic machine learning models trained on the denoised acoustic
dataset and the raw acoustic dataset. The validation results show that the CNN
model trained on the denoised dataset outperforms others with the highest
overall accuracy (89%), keyhole pore prediction accuracy (93%), and AUC-ROC
score (98%). Furthermore, the trained CNN model can be deployed into an
in-house developed software platform for online quality monitoring. The
proposed strategy is the first study to use acoustic signals with deep learning
for insitu defect detection in LDED process.Comment: 36 Pages, 16 Figures, accepted at journal Additive Manufacturin