
    The viability of Weibull analysis of small samples in process manufacturing

    This research deals with Statistical Quality Control (SQC) methods used in quality testing, and investigates the problems encountered with statistical process control (SPC) tools when small sample sizes are used. Small-sample testing is an area of growing concern, especially for expensive or large products produced in small batches (low-volume production). A critical review of the literature and of current SPC technologies and methods shows that small samples do not conform to conventional SPC techniques: the confidence limits for averages and standard deviations become too wide, so such sample sizes yield unreliable and inaccurate results. The research demonstrates these problems through manufacturing examples that expose the difficulties faced by conventional SPC tools (control charts). The Weibull distribution, by contrast, has consistently given clear and acceptable predictions of failure and life behaviour for small-sample batches, and its use provides the accuracy needed with small sample sizes. With small samples, conventional control charts generate inaccurate confidence limits; Weibull theory, on the contrary, suggests that small samples can still yield accurate confidence limits. This research examines both aspects and explains their features in depth. An outline of the overall problem and its solution points to the success of Weibull analysis once the Weibull distribution is modified to overcome the problems encountered with small sample sizes. The work shows the viability of the Weibull distribution as a quality tool for constructing new control charts that provide accurate results and detect nonconformance and variability with small sample sizes. The proposed Weibull-deduction control charts are therefore a successful replacement for conventional control charts, compensating for the errors in quality testing with small samples.
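    The Weibull-based limits described above can be sketched as follows. This is a minimal illustration only, assuming SciPy's `weibull_min` and a made-up six-piece sample; it is not the paper's exact deduction procedure:

```python
# Minimal sketch (not the paper's exact procedure): fit a two-parameter
# Weibull to a small sample and derive chart limits from its percentiles.
from scipy.stats import weibull_min

# Hypothetical small sample, e.g. a quality characteristic of a 6-piece batch
sample = [41.2, 43.8, 39.5, 44.1, 40.7, 42.3]

# Fit shape (beta) and scale (eta); location fixed at 0 for the 2-parameter form
shape, loc, scale = weibull_min.fit(sample, floc=0)

# Use the 0.135% / 99.865% percentiles as Weibull-based control limits,
# mirroring the 3-sigma coverage of a conventional Shewhart chart
lcl = weibull_min.ppf(0.00135, shape, loc=loc, scale=scale)
ucl = weibull_min.ppf(0.99865, shape, loc=loc, scale=scale)

print(f"shape={shape:.2f}, scale={scale:.2f}, LCL={lcl:.2f}, UCL={ucl:.2f}")
```

    Because the limits come from the fitted distribution's own percentiles rather than from a normal approximation, they remain usable even when only a handful of observations are available.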

    Web-Based Point of Sale Application for Segarmart

    A web-based point-of-sale (POS) application makes sale services easily accessible. Consumer POS applications are those that consumers encounter directly or indirectly at the point of sale; examples include cashier terminals, ATMs, and in-store kiosks. This research introduces a web-based point-of-sale application prototype that lets cashiers at Segarmart calculate fruit sales through the system, saving time and effort
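    The core checkout calculation such a prototype would perform can be sketched as follows; the function, product names, and prices are hypothetical illustrations, not part of the Segarmart system:

```python
# Hypothetical sketch of the checkout calculation a web-based POS
# prototype might perform; product names and prices are illustrative.
def sale_total(cart, price_list):
    """Sum price * weight (kg) for each fruit in the cart."""
    return round(sum(price_list[item] * kg for item, kg in cart.items()), 2)

prices = {"apple": 2.50, "banana": 1.20, "mango": 4.00}  # price per kg
cart = {"apple": 1.5, "mango": 0.5}                      # kg purchased

total = sale_total(cart, prices)
print(total)  # 1.5*2.50 + 0.5*4.00 = 5.75
```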

    Physicians’ Interest Measurement towards Islamic Document for Medicine and Health Ethics in Jordanian Public Hospitals

    The current study examined one of the important aspects of health-service provision in Jordan, namely the application of the Islamic Document of Medicine and Public Health Ethics in Jordanian hospitals. The study aimed to measure physicians' interest in the levels of application of the Islamic Document of medicine and public health ethics in Jordanian public hospitals. This is an observational, analytical study whose sampling unit was patients in eight Jordanian government hospitals. Data were collected through a questionnaire and analyzed statistically using the Statistical Package for the Social Sciences (SPSS, Ver. 15). The results indicated that physicians are interested in applying the Islamic Document of medical and health ethics. The study offers several recommendations, the most important of which is to educate and motivate employees to practice the health ethics stipulated in the Islamic Document through their general behavior and the performance of their tasks. Keywords: medical ethics; patient satisfaction; Islamic document

    Brain Tumors Classification by Using Gray Level Co-occurrence Matrix, Genetic Algorithm and Probabilistic Neural Network

    Background: Brain tumor classification by MRI (Magnetic Resonance Imaging) is important in medical diagnosis because it provides information about anatomical structures as well as potentially abnormal tissues, which is necessary for treatment planning and patient follow-up. A number of techniques exist for medical image classification. In this paper, a brain tumor detection and classification system covering seven tumor types is developed. Image-processing techniques such as preprocessing with a mean filter and feature extraction are implemented for the detection of brain tumors in MRI images. Texture features are extracted using the GLCM (Gray Level Co-occurrence Matrix), and a classification technique using a Probabilistic Neural Network (PNN) combined with a Genetic Algorithm (GA) and a K-Nearest Neighbor (K-NN) classifier for feature selection is proposed. Objective: To build an MRI brain-tumor detection and classification system using GA and PNN that is able to diagnose different types of tumors in the human brain. Patients and Methods: Medical imaging techniques are used to image the internal structures of the human body for medical diagnosis, and image processing is an active field of research in medicine. The MRI dataset was obtained from the Atlas Website of Harvard University. Results: Using the genetic algorithm, the total number of features (20) was reduced to the 10 strongest features for classification. Conclusion: MRI is one of the best methods for brain-tumor detection and classification, but specialists cannot keep up with diagnosis by observing MRI images alone; hence, computer-based diagnosis is necessary for correct brain-tumor classification

    The Evaluation of Calcium Score Validity in the Diagnosis of Patients with Coronary Artery Disease by Using CT Angiography

    Background: Coronary artery disease is an important disease, as many cases end in death. Among its forms is lipoprotein plaque deposition on the artery wall. Many reports in the literature concern the causes, investigation, and treatment of coronary artery disease. As computed tomography scanners developed, a new non-invasive procedure was introduced that uses the calcium present in plaque as an indicator of the amount of plaque in the coronary artery. Objective: To investigate the validity of the calcium score in the diagnosis of coronary artery disease, and to find the relation between the calcium score and calcification and plaque. Patients and Methods: Sixty-one patients (40 men and 21 women) were evaluated for calcium score. They had symptoms of chest pain and underwent electrocardiogram examination to determine their eligibility for computed tomography angiography to investigate coronary calcification as a marker of atherosclerosis. Histories of diseases including hypertension and diabetes were recorded, and renal function tests were checked. Anthropometric measurements and fasting lipid profiles of patients and normal subjects were tested. Patients were advised to come fasting prior to the examination and were given a beta blocker to reduce the heart rate to the range of 55-65 beats/min. Contrast medium was injected intravenously by means of an injector immediately before scanning. Results: The results reveal that not all patients suffering from chest pain with electrocardiogram changes show a high calcium score; on the other hand, patients with a high calcium score do have increased plaque in their coronary arteries. At low calcium scores (0-100), cholesterol, triglycerides, and high-density lipoprotein are generally inversely proportional to the calcium score, with the exception of LDL, which remains virtually unchanged throughout the whole range of calcium scores (0 to >300), as shown in the figures. At high calcium scores (>300), lipoproteins are directly proportional to the calcium score, in contrast with high-density lipoprotein, which is inversely proportional to it

    Comparison of weighted and unweighted methods of wealth indices for assessing socio-economic status

    Due to some of the limitations of monetary measures, various non-monetary approaches for assessing household wealth have been developed as alternative tools for classifying household socio-economic status. Among them, wealth indices based on household durable assets are widely used. The literature reveals two basic methods of constructing wealth indices: an unweighted method, where assets are weighted equally, and a weighted method, where specific weights are assigned to assets using various techniques. The overall objective of the study is to compare wealth indices constructed by weighted and unweighted methods for assessing the socio-economic status of households in rural Bangladesh. Firstly, the study constructs wealth indices based on durable assets using the unweighted method and two techniques of the weighted method: a weighted index using the inverse of proportion, and a weighted index using principal component analysis (PCA). The study then compares the distributional characteristics of these indices as well as monetary indicators, and examines some attractive properties of the indices, such as the extent of clumping and truncation and consistency with traditional monetary measures. Comparative analysis revealed that the unweighted asset index, as well as the weighted asset index using PCA, can be treated as an efficient alternative to monetary measures for evaluating the living standard of the households in the present study. Due to some advantages, the asset index using PCA can be considered somewhat better than the unweighted index. However, as the unweighted asset index is not very different from the weighted asset index using PCA, it can also be used as an alternative to monetary measures without the need for weighting
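    The PCA weighting technique described above can be sketched as follows, assuming a hypothetical binary asset matrix; the asset list and ownership data are illustrative, not the study's survey data:

```python
# Illustrative sketch of a PCA-based wealth index: the first principal
# component of a binary household-asset matrix serves as the index
# (asset names and ownership data are hypothetical).
import numpy as np
from sklearn.decomposition import PCA

# Rows = households; columns = owns radio, TV, bicycle, phone (1/0)
assets = np.array([
    [1, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
    [1, 1, 1, 0],
])

# Standardise each asset column, then take the first principal component
std = (assets - assets.mean(axis=0)) / assets.std(axis=0)
index = PCA(n_components=1).fit_transform(std).ravel()

# Orient the sign so that asset-rich households get higher scores
if np.corrcoef(index, assets.sum(axis=1))[0, 1] < 0:
    index = -index

# Households can now be ranked (e.g. into wealth quintiles) by this score
print(index.round(2))
```

    The component loadings play the role of the asset weights, so assets that vary together across households dominate the index.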

    Numerical investigation on the effect of solder paste rheological behaviour and printing speed on stencil printing

    The purpose of this paper was to investigate the effect of different viscosity models (Cross and Al-Ma’aiteh) and different printing speeds on the numerical results (e.g., pressure over the stencil) of a numerical model of stencil printing. A finite volume model was established to describe the printing process, and two viscosity models for non-Newtonian fluid properties were compared. The Cross model was fitted to measurements of a lead-free solder paste in its initial state, while the parameters of an Al-Ma’aiteh material model were fitted to the stabilised state of the same paste. Four printing speeds from 20 to 200 mm/s were also investigated. Noteworthy differences in pressure were found between the Cross and Al-Ma’aiteh viscosity models: the difference reached 33–34% at printing speeds of 20 and 70 mm/s, and 31% and 27% at 120 and 200 mm/s, respectively. The variation in the difference is explained by the higher shear rates arising at higher printing speeds. The parameters of the viscosity model should therefore be determined for the stabilised state of the solder paste; neglecting the thixotropic nature of the paste in modelling the printing can cause a calculation error of up to ~30%. Using the Al-Ma’aiteh viscosity model fitted to the stabilised state of solder pastes can provide more accurate results in the modelling of printing, which is necessary for the effective optimisation of this process and for eliminating soldering failures in highly integrated electronic devices
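    The Cross model referred to above has a standard closed form, sketched below; the parameter values are illustrative placeholders, not the values fitted in the paper:

```python
# Sketch of the Cross viscosity model for a shear-thinning fluid such as
# solder paste; parameter values are illustrative, not the fitted ones.
def cross_viscosity(shear_rate, eta_0, eta_inf, lam, n):
    """Cross model: eta = eta_inf + (eta_0 - eta_inf) / (1 + (lam*rate)^n)."""
    return eta_inf + (eta_0 - eta_inf) / (1.0 + (lam * shear_rate) ** n)

# Illustrative parameters: zero-shear and infinite-shear viscosities (Pa.s),
# time constant (s), and rate index
eta_0, eta_inf, lam, n = 3000.0, 1.0, 10.0, 0.8

for rate in (0.1, 1.0, 10.0, 100.0):  # shear rates in 1/s
    print(f"{rate:7.1f} 1/s -> {cross_viscosity(rate, eta_0, eta_inf, lam, n):10.2f} Pa.s")
```

    The model plateaus at eta_0 for low shear rates and falls toward eta_inf at high rates, which is why the discrepancy between viscosity models shifts as the printing speed (and thus the shear rate under the squeegee) increases.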

    Predicting the Transfer Efficiency of Stencil Printing by Machine Learning Technique

    An experiment was carried out to acquire data on the transfer efficiency of stencil printing, and a machine learning technique (an artificial neural network) was trained to predict that parameter. The input parameter space of the experiment included the printing speed at five levels (between 20 and 120 mm/s) and the area ratio of the stencil apertures from 0.34 to 1.69. Three types of lead-free solder paste were also investigated: Type-3 (particle size range 20–45 μm), Type-4 (20–38 μm), and Type-5 (10–25 μm). The output parameter space included the height and area of the print deposits and the respective transfer efficiency, which is the ratio of the deposited paste volume to the aperture volume. An artificial neural network was then trained on the empirical data using the Levenberg–Marquardt training algorithm. The optimal tuning factor for fine-tuning the network size was found to be approximately 9, resulting in 160 hidden neurons. The trained network was able to predict the output parameters with a mean absolute percentage error (MAPE) lower than 3%. However, the prediction error depended on the values of the input parameters, as elaborated in detail in the paper. The research proved the applicability of machine learning techniques to yield prediction for the stencil printing process
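    The two quantities defined above, transfer efficiency and MAPE, can be computed directly as sketched below; the numbers are illustrative, not the study's measurements:

```python
# Sketch of the two quantities the abstract defines: transfer efficiency
# (deposited volume / aperture volume) and the MAPE used to score the
# network's predictions. All numbers are illustrative.
def transfer_efficiency(deposit_volume, aperture_volume):
    """Ratio of deposited paste volume to stencil aperture volume."""
    return deposit_volume / aperture_volume

def mape(actual, predicted):
    """Mean absolute percentage error over paired observations."""
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

# Illustrative measured vs. network-predicted transfer efficiencies
measured  = [0.92, 0.85, 0.78, 1.05]
predicted = [0.90, 0.87, 0.76, 1.07]

print(transfer_efficiency(0.45, 0.50))      # 0.9
print(round(mape(measured, predicted), 2))  # 2.25
```

    A value above 1.0 is possible when the squeegee deposits more paste than the nominal aperture volume, which is why the output space is reported as an efficiency rather than a bounded fraction.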