Printed Circuit Board Quality Assurance
PCB Assurance Summary: PCB assurance activities are informed by risk in the context of the project, and lessons learned are being applied across projects for continuous improvement. Newer component technologies and smaller, finer-pitch devices are driving tighter, more demanding PCB designs and opening new research areas: new materials, designs, structures, and test methods
Evaluation of High Density Air Traffic Operations with Automation for Separation Assurance, Weather Avoidance and Schedule Conformance
In this paper we discuss the development and evaluation of our prototype technologies and procedures for far-term air traffic control operations with automation for separation assurance, weather avoidance and schedule conformance. Controller-in-the-loop simulations in the Airspace Operations Laboratory at the NASA Ames Research Center in 2010 have shown very promising results. We found the operations to provide high airspace throughput, excellent efficiency and schedule conformance. The simulation also highlighted areas for improvements: Short-term conflict situations sometimes resulted in separation violations, particularly for transitioning aircraft in complex traffic flows. The combination of heavy metering and growing weather resulted in an increased number of aircraft penetrating convective weather cells. To address these shortcomings technologies and procedures have been improved and the operations are being re-evaluated with the same scenarios. In this paper we will first describe the concept and technologies for automating separation assurance, weather avoidance, and schedule conformance. Second, the results from the 2010 simulation will be reviewed. We report human-systems integration aspects, safety and efficiency results as well as airspace throughput, workload, and operational acceptability. Next, improvements will be discussed that were made to address identified shortcomings. We conclude that, with further refinements, air traffic control operations with ground-based automated separation assurance can routinely provide currently unachievable levels of traffic throughput in the en route airspace
The evaluation of ontologies: Editorial review vs. democratic ranking
Increasingly, the high-throughput technologies used by biomedical researchers are bringing about a situation in which large bodies of data are being described using controlled structured vocabularies—also known as ontologies—in order to support the integration and analysis of these data. Annotation of data by means of ontologies is already contributing in significant ways to the accumulation of scientific knowledge and, prospectively, to the applicability of cross-domain algorithmic reasoning in support of scientific advance. This very success, however, has led to a proliferation of ontologies of varying scope and quality. We define one strategy for achieving quality assurance of ontologies—a plan of action already adopted by a large community of collaborating ontologists—which consists in subjecting ontologies to a process of peer review analogous to that which is applied to scientific journal articles
Development of the Integrated Model of the Automotive Product Quality Assessment
Issues in building an integrated model of automotive product quality assessment are studied herein, based on widely applicable methods and models of quality assessment. A conceptual model of an automotive product quality system meeting customer requirements has been developed. Typical characteristics of modern industrial production are an increase in the production dynamism that determines product properties; a continuous increase in the volume of information required for decision-making; and an increased role of knowledge and high technologies implementing entirely new scientific and technical ideas. To solve the problem of increasing automotive product quality, a conceptual structural and hierarchical model is offered to ensure quality as a closed system with feedback between the regulatory, manufacturing, and information modules, responsible for formation of product quality at all stages of its life cycle. The three-module model of the industrial product quality assurance system is considered universal, giving the opportunity to explore processes of any complexity while solving theoretical and practical problems of quality assessment and prediction for products for various purposes, including automotive
Small-scale automation on the "Ural-11B" digital computer
Wireless communication technologies like GPRS, UMTS and WLAN, combined with the availability of high-end, affordable mobile devices, enable the development of advanced and innovative mobile services. Devices such as mobile phones and Personal Digital Assistants let users access a wide range of new offerings whenever and wherever they happen to be. A strategic approach to the quality assurance of these mobile data services should take into account a number of characteristics unique to the mobile paradigm: the increased complexity of emerging handheld devices, the greater sensitivity to security and load-related problems in wireless infrastructure, and increased complexities of scale. This paper identifies the major factors influencing the development and testing strategies for these applications and accordingly elaborates effective quality assurance principles to ensure productive and scalable mobile data services
Extending TLS with mutual attestation for platform integrity assurance
Normally, secure communication between client-server applications is established using secure channel technologies such as Transport Layer Security (TLS). TLS is a cryptographic protocol that ensures secure transmission of data and authenticity of communication at each endpoint platform. However, the protocol does not provide any trustworthiness assurance of the involved endpoints. This paper incorporates remote attestation into the TLS key exchange protocol to solve this issue. The proposed embedded attestation extension to the TLS protocol provides assurance of the sender's platform integrity to the receiver, and vice versa. The CA role in TLS is replaced by our own Trusted Certificate Authority (TCA) in our protocol. The credibility of the proposed protocol is studied for security against replay and collusion attacks. The proof is performed using AVISPA with the High Level Protocol Specification Language (HLPSL) through a Dolev-Yao intruder model implementation of the proposed protocol
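The core idea of layering mutual attestation on a TLS-style handshake can be sketched as follows. This is a minimal illustration, not the paper's actual message format: the `TCA`, `Platform`, and `Quote` names are hypothetical, HMAC stands in for TPM quote signatures, and the fresh nonce per challenge is what defeats replay.

```python
import hashlib
import hmac
import secrets

class TCA:
    """Hypothetical Trusted Certificate Authority: certifies each
    platform's attestation key and verifies its quotes."""
    def __init__(self):
        self._registry = {}  # platform_id -> enrolled attestation key

    def enroll(self, platform_id, key):
        self._registry[platform_id] = key

    def verify(self, platform_id, nonce, pcr_digest, quote):
        key = self._registry.get(platform_id)
        if key is None:
            return False
        expected = hmac.new(key, nonce + pcr_digest, hashlib.sha256).digest()
        return hmac.compare_digest(expected, quote)

class Platform:
    """An endpoint with a measured software state and an attestation key
    (a stand-in for a TPM attestation identity key)."""
    def __init__(self, platform_id, software_state):
        self.platform_id = platform_id
        self.key = secrets.token_bytes(32)
        self.pcr = hashlib.sha256(software_state).digest()  # measured state

    def quote(self, nonce):
        # Sign (nonce || PCR digest); the challenger's fresh nonce
        # binds the quote to this handshake and defeats replay.
        sig = hmac.new(self.key, nonce + self.pcr, hashlib.sha256).digest()
        return self.pcr, sig

def mutual_attestation(client, server, tca):
    """Each endpoint challenges the other during the handshake; both
    quotes must verify against the TCA for the session to proceed."""
    n_c, n_s = secrets.token_bytes(16), secrets.token_bytes(16)
    s_pcr, s_quote = server.quote(n_c)
    c_pcr, c_quote = client.quote(n_s)
    return (tca.verify(server.platform_id, n_c, s_pcr, s_quote)
            and tca.verify(client.platform_id, n_s, c_pcr, c_quote))
```

A quote captured from one handshake fails verification against any other nonce, which is the replay-resistance property the paper checks formally with AVISPA/HLPSL.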
Review of in-situ process monitoring and in-situ metrology for metal additive manufacturing
Lack of assurance of quality with additively manufactured (AM) parts is a key technological barrier that prevents manufacturers from adopting AM technologies, especially for high-value applications where component failure cannot be tolerated. Developments in process control have allowed significant enhancement of AM techniques and marked improvements in surface roughness and material properties, along with a reduction in inter-build variation and the occurrence of embedded material discontinuities. As a result, the exploitation of AM processes continues to accelerate. Unlike established subtractive processes, where in-process monitoring is now commonplace, factory-ready AM processes have not yet incorporated monitoring technologies that allow discontinuities to be detected in process. Researchers have investigated new forms of instrumentation and adaptive approaches which, when integrated, will allow further enhancement to the assurance that can be offered when producing AM components. The state-of-the-art with respect to inspection methodologies compatible with AM processes is explored here. Their suitability for the inspection and identification of typical material discontinuities and failure modes is discussed with the intention of identifying new avenues for research and proposing approaches to integration into future generations of AM systems
The development of an in-vivo dosimeter for the application in radiotherapy
This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. The expectation of continual improvement in the treatment of cancer has brought quality assurance in radiotherapy under scrutiny in recent years. After a cancer diagnosis, a custom treatment plan is devised to meet the particular needs of the patient's condition based on their prognosis. A treatment plan will typically comprise several cancer treatment technologies combining to form a comprehensive programme to fight the malignant growth. Inherent in each treatment technology is a percentage error in treatment accuracy, and quality assurance is the medical practice of minimising that error. Radiotherapy is one of the several cancer treatment technologies a patient might receive as part of their treatment plan, and in-vivo dosimetry is a quality assurance technology specifically designed to minimise the percentage error in the treatment accuracy of radiotherapy. This thesis outlines the work completed in the design of a next-generation dosimeter for in-vivo dosimetry. The proposed dosimeter is intended to modernise the process of measuring the absorbed dose of ionising radiation received by the target volume during a radiotherapy session. To accomplish this goal, the new dosimeter amalgamates specialist technologies from the field of particle physics and reapplies them to the field of medical physics. This thesis describes the design of a new implantable in-vivo dosimeter comprising several individual stages of electronics working together to modernise quality assurance in radiotherapy. Presented within this thesis are results demonstrating the performance of two critical stages of this new dosimeter: the floating-gate metal oxide field effect transistor, a radiation-sensitive electronic component measuring an absorbed dose of radiation; and the micro antenna, a highly specialised wireless communications device transmitting a high-frequency radio signal. This was a collaborative project between Rutherford Appleton Laboratory and Brunel University. The presented work was completed between March 2007 and January 2011. This study was funded by the Science and Technology Facilities Council
Tools and methods for providing assurance of clonality for legacy cell lines
Over the last several years demonstration of cell line clonality has been a topic of many industry and regulatory presentations and papers. Guidance has been provided by the regulatory authorities, especially the FDA, on a path forward for providing evidence of clonality with high probability. It has been recommended that two-rounds of limiting dilution cloning (LDC) at sufficiently low seeding densities (≤0.5 cells/well) provides sufficient evidence that a cell line is clonal. Furthermore, one-round of LDC may also suffice if supplemental data from a characterized FACS or plate-imaging workflow are also included in the package.
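The ≤0.5 cells/well recommendation has a simple statistical rationale that can be made explicit. The following is a back-of-envelope sketch, assuming cells are deposited into wells independently so that occupancy is Poisson-distributed; the function names are illustrative and the independence of successive cloning rounds is an idealisation.

```python
import math

def p_monoclonal_given_growth(lam):
    """P(a well received exactly 1 cell | it received >= 1 cell)
    for wells seeded at a mean density of lam cells/well (Poisson)."""
    return (lam * math.exp(-lam)) / (1.0 - math.exp(-lam))

def p_monoclonal_after_rounds(lam, rounds):
    """Approximate probability of clonal derivation after the given
    number of independent limiting-dilution rounds at density lam."""
    p = p_monoclonal_given_growth(lam)
    return 1.0 - (1.0 - p) ** rounds
```

At 0.5 cells/well a single round gives roughly a 77% chance that an outgrowing well started from one cell, and two independent rounds push the assurance of clonality above 94%; lowering the seeding density raises the single-round probability further, at the cost of more empty wells.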
Cell lines generated by methods that do not demonstrate high probability of clonal derivation, including legacy cell lines, may require additional studies to provide assurance and/or process control strategies to satisfy regulatory expectations.
Within the Biologics function of the IQ Consortium the “Clonality” Working Group is focusing on methods and tools which could be utilized to provide a high assurance of clonality for legacy cell lines.
The presentation will outline a three-tier approach to addressing legacy cell line clonality assurance: standard practices already used in industry to support limit of in vitro cell age studies, enhanced control strategies to ensure process consistency, and emerging technologies that could be used to further support cell line clonality