
    Development of an intelligent geometry measurement procedure for coordinate measuring machines

    A Coordinate Measuring Machine (CMM) is a highly accurate electronic instrument for the automatic measurement of 2- and 3-dimensional geometries. In a typical operation the CMM measures a set of user-defined points and then applies internal logic to ascertain whether the inspected part meets its specifications. CMMs have received widespread acceptance in the manufacturing community and in many instances are required by supplier contract. Applications of CMMs range from the measurement of simple 2D parts to complex 3D spatial frames (for example, verifying the integrity of automobile frames). The primary objective of the proposed research is to investigate procedures for the efficient use of CMMs. Two of the key parameters in CMM usage are the number of points measured and the relative location of those points. In this thesis we first show that when these two inspection parameters are varied for the same part, different conclusions about the part's geometry may be drawn. Next we investigate the relationship between these two parameters and the reliability of the concluded data, focusing on a 2D circle, a 2D rectangle, and a plane. The experiments were conducted on a Brown & Sharpe Coordinate Measuring Machine.
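    The abstract's central observation, that the number and placement of probed points change the geometric conclusion, can be illustrated with a minimal sketch. The algebraic (Kåsa) least-squares circle fit below is a standard textbook method, not the thesis's actual procedure; the noise level and point counts are illustrative assumptions.

    ```python
    import numpy as np

    def fit_circle(points):
        """Algebraic (Kasa) least-squares circle fit.
        Solves x^2 + y^2 = 2*a*x + 2*b*y + c for center (a, b),
        then recovers the radius as sqrt(c + a^2 + b^2)."""
        x, y = points[:, 0], points[:, 1]
        A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
        rhs = x ** 2 + y ** 2
        (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
        return (a, b), np.sqrt(c + a ** 2 + b ** 2)

    # Probe a nominally unit circle with small measurement noise,
    # once with 4 points and once with 12 points on the same arc:
    # the fitted radius (the "conclusion") differs between the two runs.
    rng = np.random.default_rng(0)
    for n in (4, 12):
        theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
        pts = np.column_stack([np.cos(theta), np.sin(theta)])
        pts += rng.normal(scale=0.01, size=(n, 2))
        center, r = fit_circle(pts)
        print(n, round(r, 4))
    ```

    With more points, and points spread over the full arc, the estimate is less sensitive to any single noisy probe, which is the reliability relationship the thesis investigates empirically.
    
    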

    Towards More Efficient 5G Networks via Dynamic Traffic Scheduling

    Department of Electrical Engineering

    5G communications adopt various advanced technologies, such as mobile edge computing and unlicensed-band operation, to meet the goals of 5G services such as enhanced Mobile Broadband (eMBB) and Ultra-Reliable Low-Latency Communications (URLLC). Specifically, by placing cloud resources at the edge of the radio access network (the so-called mobile edge cloud), mobile devices can be served with lower latency than traditional remote-cloud-based services provide. In addition, by utilizing unlicensed spectrum, 5G can mitigate the scarcity of spectrum resources and thus realize higher-throughput services. To enhance user-experienced service quality, however, these approaches should be fine-tuned with various network performance metrics considered together. For instance, mechanisms for mobile edge computing, e.g., computation offloading to the edge cloud, should not be optimized for a single metric such as latency, since actual user satisfaction depends on factors from multiple domains, including latency, throughput, and monetary cost. Moreover, blindly combining unlicensed spectrum resources with licensed ones does not always guarantee a performance enhancement, since it is crucial for unlicensed-band operation to achieve peaceful yet efficient coexistence with other competing technologies (e.g., Wi-Fi). This dissertation proposes a focused resource management framework for more efficient 5G network operation, as follows. First, Quality of Experience (QoE) is adopted to quantify user satisfaction in mobile edge computing, and an optimal transmission scheduling algorithm is derived to maximize user QoE in computation offloading scenarios. Next, regarding unlicensed-band operation, two efficient mechanisms are introduced to improve the coexistence performance of LTE-LAA and Wi-Fi networks.
    In particular, we develop a dynamic energy-detection thresholding algorithm for LTE-LAA so that LTE-LAA devices can detect Wi-Fi frames in a lightweight way. In addition, we propose an AI-based network configuration for an LTE-LAA network with which an LTE-LAA operator can fine-tune its coexistence parameters (e.g., the CCA threshold) to better protect coexisting Wi-Fi while achieving better performance than the legacy LTE-LAA of the standards. Via extensive evaluations using computer simulations and a USRP-based testbed, we have verified that the proposed framework can enhance the efficiency of 5G.
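    The idea of a dynamic energy-detection threshold can be sketched as a simple feedback rule: lower the threshold when the device is missing too many Wi-Fi frames, raise it when detection is adequate, within regulatory bounds. This is only an illustrative caricature, not the dissertation's algorithm; the target ratio, step size, and bounds are hypothetical parameters.

    ```python
    def adapt_ed_threshold(current_dbm, wifi_detect_ratio,
                           target=0.9, step=1.0,
                           lo=-82.0, hi=-62.0):
        """Illustrative dynamic energy-detection threshold update.

        current_dbm: current ED threshold in dBm.
        wifi_detect_ratio: fraction of known Wi-Fi transmissions that
            energy detection recently caught (hypothetical feedback signal).
        If too few Wi-Fi frames are detected, lower the threshold to be
        more sensitive (protecting Wi-Fi); otherwise raise it to reclaim
        transmission opportunities. The [lo, hi] bounds are assumptions.
        """
        if wifi_detect_ratio < target:
            current_dbm -= step   # more sensitive: defer to Wi-Fi more often
        else:
            current_dbm += step   # less conservative: more LAA airtime
        return max(lo, min(hi, current_dbm))

    # Missing Wi-Fi frames -> threshold drops by one step.
    print(adapt_ed_threshold(-72.0, 0.5))   # -73.0
    ```

    The trade-off this encodes is exactly the coexistence tension in the abstract: a lower threshold protects Wi-Fi at the cost of LTE-LAA throughput.
    
    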

    Extended Topics in the Integration of Data Envelopment Analysis and the Analytic Hierarchy Process in Decision Making.

    The Analytic Hierarchy Process (AHP) is a procedure that considers only relative priorities as estimated by decision-makers. A Data Envelopment Analysis (DEA) model is a data-oriented approach for evaluating the relative efficiency of a group of entities referred to as Decision Making Units (DMUs). This research integrates the positive aspects of AHP's estimated qualitative data and DEA's quantitative data. The combination is accomplished by specifying two variants of the DEA methodology for selecting the best DMU. Initially, the priority weights of AHP are integrated into the DEA methodology to provide results that are logic based. Next, a method is developed to work backwards through the DEA model to derive the values an AHP formulation would require in order to give the same result in DEA. The objective of the research is to propose variants of DEA that could improve the results and also integrate subjective data. Through the application of the methods developed in this research, it is believed that the acceptability of the results obtained from DEA analysis can be improved.
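    For readers unfamiliar with the DEA side of this integration, the classical input-oriented CCR multiplier model scores a DMU by choosing the weights most favorable to it, subject to no DMU scoring above 1. A minimal sketch using `scipy.optimize.linprog` follows; it shows the standard textbook model, not the thesis's AHP-augmented variants.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def ccr_efficiency(X, Y, k):
        """Input-oriented CCR multiplier model for DMU k.

        X: (n_dmus, n_inputs) input matrix, Y: (n_dmus, n_outputs) outputs.
        Solve:  max  u . y_k
                s.t. v . x_k = 1
                     u . y_j - v . x_j <= 0  for every DMU j
                     u, v >= 0
        Decision vector is [u (outputs), v (inputs)]; linprog minimizes,
        so we negate the objective."""
        n, m = X.shape
        s = Y.shape[1]
        c = np.concatenate([-Y[k], np.zeros(m)])       # maximize u . y_k
        A_ub = np.hstack([Y, -X])                      # u.y_j - v.x_j <= 0
        b_ub = np.zeros(n)
        A_eq = np.concatenate([np.zeros(s), X[k]]).reshape(1, -1)
        b_eq = [1.0]                                   # normalize v . x_k = 1
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                      bounds=[(0, None)] * (s + m))
        return -res.fun

    # Two DMUs, one input, one output: DMU 1 produces the same output
    # from half the input, so DMU 0 is only 50% efficient.
    X = np.array([[2.0], [1.0]])
    Y = np.array([[1.0], [1.0]])
    print(ccr_efficiency(X, Y, 0))   # 0.5
    ```

    The thesis's first variant can be thought of as constraining or steering the free weights u, v with AHP priority weights, rather than letting each DMU pick its own most favorable weights.
    
    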

    The role of the N-end rule pathway in mammalian development and innate immunity

    The N-end rule pathway is a proteolytic system in which single N-terminal amino acids of proteins act as a class of degrons (N-degrons) that determine the half-lives of proteins. We have previously identified a family of mammalian N-recognins whose conserved UBR boxes bind N-degrons to facilitate substrate ubiquitination, leading to proteolysis via the ubiquitin-proteasome system or autophagy. Among these N-recognins, UBR4 binds both type-1 and type-2 residues but lacks a known ubiquitylation domain. N-terminal Arg is one of the principal degrons; it can be generated through post-translational conjugation of L-Arg from Arg-tRNA to N-terminal Asp or Glu, mediated by ATE1-encoded Arg-tRNA transferases. In this dissertation study, we addressed the roles of UBR4 in mammalian development and of ATE1 in the innate immune response. First, we generated UBR4-deficient mice in which the UBR box of UBR4 was deleted and characterized the null phenotypes. UBR4-deficient mice exhibit severe embryonic lethality and pleiotropic abnormalities, including accumulated autophagic vacuoles in the yolk sac endoderm. UBR4 also modulates early endosomal maturation and trafficking through its interaction with Ca2+-bound calmodulin. UBR4-/- embryos have multiple developmental defects, including in neurogenesis and the cardiovascular system, which are at least in part attributable to impaired cell adhesion and depletion of cell surface proteins. Collectively, these data reveal that the N-recognin UBR4 plays important roles in multiple developmental processes associated with angiogenesis, neurogenesis, and the cardiovascular system, most commonly through non-proteolytic processes such as endosomal maturation, trafficking, and cellular adhesion. Second, we show that cytosolic foreign DNA induces N-terminal arginylation of ER chaperones, which is required for the host defense system, including IFN-β-mediated gene induction and IRF3 phosphorylation.
    Cytosolic dsDNA facilitates relocation of ATE1 to the ER, where ATE1 colocalizes with STING. Interfering with Nt-arginylation, which is important for host defense, increases virion production. Our results suggest that N-terminal arginylation is essential for the cellular immune response against foreign DNA and viral infection. This work provides meaningful evidence that the N-end rule pathway plays a pivotal role in mammalian development and innate immunity beyond its proteolytic function.

    Toward efficient maximum likelihood algorithms

    Motivated by recent extensive studies of maximum likelihood (ML) algorithms, especially EM-type schemes, the author proposes a class of generalized conditional maximization (GCM) algorithms that pursues dimension reduction and algorithmic stability simultaneously. This model-dependent approach to developing ML algorithms applies an appropriate, possibly different, approximation to each selected subset of parameters so as to ensure fast and stable convergence to a candidate local maximum. In the first part of this dissertation, the author illustrates the application of the algorithm to several examples: the random effects model, variance components, the normal finite mixture, the t-distribution model, and the contingency table, and compares the performance on each to conventional EM-type algorithms using numerical studies. In the remainder of the dissertation, new models emphasizing the variance components model, which might be helpful in data analysis, are studied, and new GCM algorithms for their ML estimation are developed.
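    The normal finite mixture is the easiest of the listed examples to make concrete. Below is the conventional EM baseline that GCM-type schemes are compared against, for a two-component univariate Gaussian mixture: the E-step computes posterior responsibilities, the M-step performs weighted ML updates. The initialization and the simulated data are illustrative assumptions, not from the dissertation.

    ```python
    import numpy as np

    def norm_pdf(x, mu, sigma):
        """Univariate normal density."""
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

    def em_two_normals(x, iters=300):
        """Standard EM for a two-component univariate Gaussian mixture.
        Returns (mixing weight of component 0, means, std devs)."""
        mu = np.array([x.min(), x.max()], dtype=float)   # crude spread init
        sigma = np.array([x.std(), x.std()]) + 1e-6
        pi = 0.5
        for _ in range(iters):
            # E-step: responsibility of component 0 for each observation
            p0 = pi * norm_pdf(x, mu[0], sigma[0])
            p1 = (1 - pi) * norm_pdf(x, mu[1], sigma[1])
            r = p0 / (p0 + p1)
            # M-step: responsibility-weighted ML updates
            pi = r.mean()
            mu[0] = (r * x).sum() / r.sum()
            mu[1] = ((1 - r) * x).sum() / (1 - r).sum()
            sigma[0] = np.sqrt((r * (x - mu[0]) ** 2).sum() / r.sum()) + 1e-9
            sigma[1] = np.sqrt(((1 - r) * (x - mu[1]) ** 2).sum() / (1 - r).sum()) + 1e-9
        return pi, mu, sigma

    rng = np.random.default_rng(1)
    data = np.concatenate([rng.normal(-3, 1, 500), rng.normal(3, 1, 500)])
    pi, mu, sigma = em_two_normals(data)
    ```

    A GCM-style scheme would replace this single monolithic iteration with conditional maximizations over chosen parameter subsets, using possibly different approximations per subset; the point of the numerical comparisons is speed and stability relative to this baseline.
    
    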