The technology base for agile manufacturing
The effective use of information is a critical problem faced by manufacturing organizations that must respond quickly to market changes. As product runs become shorter, rapid and efficient development of product manufacturing facilities becomes crucial to commercial success. Effective information utilization is a key element to successfully meeting these requirements. This paper reviews opportunities for developing technical solutions to information utilization problems within a manufacturing enterprise and outlines a research agenda for solving these problems
The effect of orientation of retinal configuration upon accommodation and convergence
Automatic design of 3-d fixtures and assembly pallets
This paper presents an implemented algorithm that automatically designs fixtures and assembly pallets to hold three-dimensional parts. All fixtures generated by the algorithm employ round side locators, a side clamp, and cylindrical supports; depending on the value of an input control flag, the fixture may also include swing-arm top clamps. Using these modular elements, the algorithm designs fixtures that rigidly constrain and locate the part, obey task constraints, are robust to part shape variations, are easy to load, and are economical to produce. For the class of fixtures that are considered, the algorithm is guaranteed to find the global optimum design that satisfies these and other pragmatic conditions. The authors present the results of the algorithm applied to several practical manufacturing problems. For these complex problems the algorithm typically returns initial high-quality fixture designs in less than a minute, and identifies the global optimum design in just over an hour. The algorithm is also capable of solving difficult design problems where a single fixture is desired that can hold either of two parts
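The guarantee of a global optimum suggests an exhaustive search over discrete placements of the modular fixture elements. Below is a minimal sketch of that search pattern only, not the paper's actual algorithm; `best_fixture`, `feasible`, and `quality` are hypothetical names for the candidate enumeration, the task-constraint check, and the design-quality metric:

```python
from itertools import product

def best_fixture(candidates_per_element, feasible, quality):
    """Exhaustively score every combination of candidate placements,
    one list of candidates per modular element (locators, clamps,
    supports). Because all combinations are visited, the best feasible
    design returned is the global optimum under the given metric."""
    best, best_q = None, float("-inf")
    for design in product(*candidates_per_element):
        if not feasible(design):          # e.g. task and loading constraints
            continue
        q = quality(design)               # e.g. robustness to shape variation
        if q > best_q:
            best, best_q = design, q
    return best, best_q
```

In practice, pruning and incremental feasibility tests would make such a search tractable at the scale the paper reports (high-quality designs in under a minute).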
Experimental evaluation of confidence interval procedures in sequential steady-state simulation
Sequential analysis of simulation output is generally accepted as the most efficient way of securing representativeness of samples of collected observations. In this scenario a simulation experiment is stopped when the relative precision of estimates, defined as the relative width of confidence intervals at an assumed confidence level, reaches the required level. This paper deals with the statistical correctness of the methods proposed for estimating confidence intervals for mean values in sequential steady-state stochastic simulation. We formulate basic rules that should be followed in proper experimental analysis of the coverage of different steady-state interval estimators. Our main argument is that such analysis should be done sequentially. The numerical results of our preliminary coverage analysis of the method of Spectral Analysis (SA/HW) and Non-overlapping Batch Means are presented and compared with those obtained by traditional, non-sequential approaches
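The sequential stopping rule described in the abstract can be sketched with non-overlapping batch means; this is a simplified illustration, and the function name, parameter names, and defaults are our own assumptions:

```python
import math

def sequential_mean(stream, batch_size=100, rel_precision=0.05,
                    z=1.96, min_batches=10, max_batches=10_000):
    """Consume observations from `stream` and stop when the relative
    half-width of the confidence interval for the steady-state mean,
    estimated via non-overlapping batch means, drops below
    rel_precision (or max_batches is reached)."""
    batch_means = []
    while True:
        batch = [next(stream) for _ in range(batch_size)]
        batch_means.append(sum(batch) / batch_size)
        k = len(batch_means)
        if k < min_batches:
            continue
        mean = sum(batch_means) / k
        var = sum((b - mean) ** 2 for b in batch_means) / (k - 1)
        half_width = z * math.sqrt(var / k)   # CI half-width from batch means
        if half_width <= rel_precision * abs(mean) or k >= max_batches:
            return mean, half_width, k
```

The paper's point is precisely that the coverage of such intervals (how often they actually contain the true mean) must itself be evaluated sequentially.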
Measuring worst-case errors in a robot workcell
Errors in model parameters, sensing, and control are inevitably present in real robot systems. These errors must be considered in order to automatically plan robust solutions to many manipulation tasks. Lozano-Perez, Mason, and Taylor proposed a formal method for synthesizing robust actions in the presence of uncertainty; this method has been extended by several subsequent researchers. All of these results presume the existence of worst-case error bounds that describe the maximum possible deviation between the robot's model of the world and reality. This paper examines the problem of measuring these error bounds for a real robot workcell. These measurements are difficult because of the need to completely contain all possible deviations while avoiding bounds that are overly conservative. The authors present a detailed description of a series of experiments that characterize and quantify the possible errors in visual sensing and motion control for a robot workcell equipped with standard industrial robot hardware. In addition to providing a means for measuring these specific errors, these experiments shed light on the general problem of measuring worst-case errors
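At its simplest, an empirical worst-case bound is the largest deviation observed over a series of calibration trials, optionally inflated by a safety margin; the sketch below (function and parameter names are our own) illustrates the containment-versus-conservatism trade-off the abstract describes:

```python
def worst_case_bound(commanded, measured, margin=0.0):
    """Empirical worst-case error bound: the largest observed deviation
    between commanded and measured values across trials, plus an
    optional margin. A larger margin improves containment of unseen
    deviations at the cost of a more conservative bound."""
    return max(abs(c - m) for c, m in zip(commanded, measured)) + margin
```

A real characterization, as in the paper, must argue that the trials actually exercise the extremes of the error distribution, since a finite sample can only lower-bound the true worst case.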
A statistical method for retrospective cardiac and respiratory motion gating of interventional cardiac x-ray images
Purpose: Image-guided cardiac interventions involve the use of fluoroscopic images to guide the insertion and movement of interventional devices. Cardiorespiratory gating can be useful for 3D reconstruction from multiple x-ray views and for reducing misalignments between 3D anatomical models overlaid onto fluoroscopy. Methods: The authors propose a novel and potentially clinically useful retrospective cardiorespiratory gating technique. The principal component analysis (PCA) statistical method is used in combination with other image processing operations to make our proposed masked-PCA technique suitable for cardiorespiratory gating. Unlike many previously proposed techniques, our technique is robust to varying image content, and thus does not require specific catheters or any other optically opaque structures to be visible. Therefore, it works without any knowledge of catheter geometry. The authors demonstrate the application of our technique for the purposes of retrospective cardiorespiratory gating of normal and very low dose x-ray fluoroscopy images. Results: For normal dose x-ray images, the algorithm was validated using 28 clinical electrophysiology x-ray fluoroscopy sequences (2168 frames), from patients who underwent radiofrequency ablation (RFA) procedures for the treatment of atrial fibrillation and cardiac resynchronization therapy procedures for heart failure. The authors established end-systole, end-expiration, and end-inspiration success rates of 97.0%, 97.9%, and 97.0%, respectively. For very low dose applications, the technique was tested on ten x-ray sequences from the RFA procedures with added noise at signal to noise ratio (SNR) values of √50, √10, √8, √6, √5, √2, and √1 to simulate the image quality of increasingly lower dose x-ray images. Even at the low SNR value of √2, representing a dose reduction of more than 25 times, gating success rates of 89.1%, 88.8%, and 86.8% were established.
Conclusions: The proposed technique can therefore extract useful information from interventional x-ray images while minimizing exposure to ionizing radiation. © 2014 American Association of Physicists in Medicine
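The core of a PCA-derived gating signal can be sketched with plain PCA on the frame stack; this omits the paper's masking and other pre-processing steps, and `pca_gating_signal` and `gate_frames` are hypothetical names:

```python
import numpy as np

def pca_gating_signal(frames):
    """frames: array of shape (n_frames, H, W). Returns the score of
    each frame along the first principal component of the pixel-time
    matrix, a surrogate signal that tends to track the dominant
    periodic (cardiac or respiratory) motion in the sequence."""
    X = frames.reshape(len(frames), -1).astype(float)
    X -= X.mean(axis=0)                       # centre each pixel over time
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, 0] * s[0]                     # per-frame first-PC score

def gate_frames(signal, n_gates):
    """Retrospective gating: keep the n_gates frames with the largest
    signal value (one extreme of the recovered motion cycle)."""
    return np.argsort(signal)[-n_gates:]
```

The sign of a principal component is arbitrary, so in practice the extremum corresponding to, say, end-systole has to be identified from the data.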
Requirements of the publishing department of the M. T. Rylsky IMFE for the formatting of authors' manuscripts
Industrial parts are manufactured to tolerances, as no production process is capable of delivering perfectly identical parts. It is unacceptable that a plan for a manipulation task that was determined on the basis of a CAD model of a part fails on some manufactured instance of that part, and therefore it is crucial that the admitted shape variations are systematically taken into account during the planning of the task. We study the problem of orienting a part with given admitted shape variations by means of pushing with a single frictionless jaw. We use a very general model for admitted shape variations that only requires that any valid instance must contain a given convex polygon P_I while it must be contained in another convex polygon P_E. The problem that we solve is to determine, for a given h, the sequence of h push actions that puts all valid instances of a part with given shape variation into the smallest possible interval of final orientations. The resulting algorithm runs in O(hn) time, where n = |P_I| + |P_E|
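A push function in this setting maps an initial part orientation to the orientation the part settles in after one push, and a plan is a sequence of jaw angles that shrinks the set of possible orientations. The sketch below uses a strongly simplified model: we assume the part simply snaps to the nearest of a fixed set of stable orientations, ignoring centre-of-mass effects and the paper's shape-variation machinery; `push_function`, `apply_plan`, and `stable_angles` are our own names:

```python
import math

TWO_PI = 2 * math.pi

def push_function(stable_angles):
    """Simplified push model: a push from direction 0 rotates the part
    to the nearest stable orientation (e.g. an edge flush with the jaw).
    stable_angles would be derived from the part geometry."""
    def f(theta):
        theta %= TWO_PI
        return min(stable_angles,
                   key=lambda a: min(abs(theta - a), TWO_PI - abs(theta - a)))
    return f

def apply_plan(f, pushes, orientations):
    """Apply a sequence of jaw angles; a push with jaw angle phi maps
    orientation theta to f(theta - phi) + phi (mod 2*pi)."""
    for phi in pushes:
        orientations = {(f(th - phi) + phi) % TWO_PI for th in orientations}
    return orientations
```

Each push collapses the current orientation set onto a smaller one; the paper's contribution is computing, in O(hn) time, the h-push sequence that minimizes the final interval over all valid part instances.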
Identification of a Bacterial Type III Effector Family with G Protein Mimicry Functions
Many bacterial pathogens use the type III secretion system to inject “effector” proteins into host cells. Here, we report the identification of a 24 member effector protein family found in pathogens including Salmonella, Shigella, and enteropathogenic E. coli. Members of this family subvert host cell function by mimicking the signaling properties of Ras-like GTPases. The effector IpgB2 stimulates cellular responses analogous to GTP-active RhoA, whereas IpgB1 and Map function as the active forms of Rac1 and Cdc42, respectively. These effectors do not bind guanine nucleotides or have sequences corresponding to the conserved GTPase domain, suggesting that they are functional but not structural mimics. However, several of these effectors harbor intracellular targeting sequences that contribute to their signaling specificities. The activities of IpgB2, IpgB1, and Map are dependent on an invariant WxxxE motif found in numerous effectors, leading to the speculation that they all function by a similar molecular mechanism
Turbulent flow at 190 m height above London during 2006-2008: A climatology and the applicability of similarity theory
Flow and turbulence above urban terrain are more complex than above rural terrain, due to the different momentum and heat transfer characteristics that are affected by the presence of buildings (e.g. pressure variations around buildings). The applicability of similarity theory (as developed over rural terrain) is tested using observations of flow from a sonic anemometer located at 190.3 m height in London, U.K., using about 6500 h of data. Turbulence statistics—dimensionless wind speed and temperature, standard deviations and correlation coefficients for momentum and heat transfer—were analysed in three ways. First, turbulence statistics were plotted as a function only of a local stability parameter z/Λ (where Λ is the local Obukhov length and z is the height above ground); the σ_i/u_* values (i = u, v, w) for neutral conditions are 2.3, 1.85 and 1.35, respectively, similar to canonical values. Second, analysis of urban mixed-layer formulations during daytime convective conditions over London was undertaken, showing that atmospheric turbulence at high altitude over large cities might not behave dissimilarly from that over rural terrain. Third, correlation coefficients for heat and momentum were analysed with respect to local stability. The results give confidence in using the framework of local similarity for turbulence measured over London, and perhaps other cities. However, the following caveats for our data are worth noting: (i) the terrain is reasonably flat, (ii) building heights vary little over a large area, and (iii) the sensor height is above the mean roughness sublayer depth
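The quantities in this analysis follow standard micrometeorological definitions; a sketch of how the friction velocity u_*, the local Obukhov length Λ, and the stability parameter z/Λ are computed from sonic-anemometer covariances (function names are ours):

```python
import math

KAPPA = 0.4   # von Karman constant
G = 9.81      # gravitational acceleration, m s^-2

def friction_velocity(uw, vw):
    """Local friction velocity u* from the kinematic momentum fluxes
    <u'w'> and <v'w'> (m^2 s^-2): u* = (<u'w'>^2 + <v'w'>^2)^(1/4)."""
    return (uw**2 + vw**2) ** 0.25

def obukhov_length(u_star, T_mean, wT):
    """Local Obukhov length Lambda (m) from u* (m s^-1), the mean
    temperature (K), and the kinematic heat flux <w'T'> (K m s^-1)."""
    return -u_star**3 * T_mean / (KAPPA * G * wT)

def stability_parameter(z, Lam):
    """z/Lambda: negative in unstable, positive in stable conditions."""
    return z / Lam
```

With these in hand, a statistic such as σ_w/u_* can be binned against z/Λ to reproduce the kind of similarity analysis described above.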