
    Estimation of the Rayleigh Parameters Based On Interval Grouped Data

    In this thesis the performance, efficiency, accuracy and validity of statistical estimation using interval grouped data derived from an intermittent-inspection life-testing experiment are tested, improved and modified. To achieve these objectives, several estimation methods are investigated, employing the Rayleigh distribution as the underlying survival model. Based on the interval grouped data, the likelihood functions of the unknown Rayleigh parameters are constructed using the unconditional probability and the conditional probability (in the case of censoring) of failure in the corresponding intervals. The existence and uniqueness of the MLEs are proved. In the equidistant-partitioning case the MLE of the scale parameter is bounded, and hence bisection and secant numerical methods can be applied to arrive at a faster solution. The interval endpoints and the cumulative numbers of failures at these endpoints are used to derive the mid-interval and compound grouped estimators. These estimators are in explicit form and are evaluated in terms of their bias and consistency. Applying maximum likelihood estimation to real lifetime data yields relatively better estimates of the survival and hazard functions than the classical nonparametric estimates. In the least squares estimation, based on the multinomial distribution of failures, the resulting estimators are compared with the corresponding estimators obtained by fitting regression models to the nonparametric estimates of both the survival and hazard functions at a pre-specified time. In the Bayesian approach, the conjugate priors are derived using both the complete data and the interval grouped data. High posterior credible intervals are obtained, and mathematical improvements of the Bayesian estimators obtained from the interval grouped data are made to increase their relative efficiency and performance. Applying the modified Bayesian estimation procedures to generated Rayleigh lifetime data shows a significant efficiency of the Bayesian estimation method. Despite the considerable loss of information about the exact, unobservable lifetimes, simulation studies at different settings of the life-testing experiment show a high relative efficiency of the estimators obtained using the interval grouped data in comparison with estimators obtained using type I and right-censored data. To measure the loss of information due to the intermittent-inspection life-testing experiment, Shannon information and distance-divergence measures are considered. Modifications of the Shannon information measure and derivation of a new information measure based on the sufficient statistics are investigated to reflect the actual loss of information. A criterion for minimizing the loss of information and selecting a suitable number of intervals, inspection times and sample size is derived. The performance of the estimation procedures is also tested on some known survival-analysis issues, with application to real lifetime data. Modifications of the conventional methods and formulation of alternative models are introduced to guarantee existence of the solution, improve performance and reduce computation. Finally, general conclusions on the thesis as a whole are given, together with highlights for further research.
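
    The grouped-data likelihood and the bisection step can be pictured with a small sketch. Assuming the Rayleigh survival function S(t) = exp(−t²/(2θ)) with θ = σ², the log-likelihood for counts of failures between inspection times plus a right-censored remainder can be maximized by bisection on its derivative; the inspection times, counts, and bracketing heuristic below are illustrative assumptions, not the thesis's data or bounds.

    # Illustrative sketch (not the thesis's code): MLE of the Rayleigh scale from
    # interval-grouped data via bisection on the score function.  Inspection
    # times, counts, and the bracketing heuristic are assumptions.
    import math

    def rayleigh_surv(t, theta):
        # Rayleigh survival S(t) = exp(-t^2 / (2*theta)), with theta = sigma^2.
        return math.exp(-t * t / (2.0 * theta))

    def score(theta, times, counts, n_censored):
        # Derivative of the grouped-data log-likelihood with respect to theta.
        # times: inspection times t_1 < ... < t_k (t_0 = 0 implicit);
        # counts: failures observed in (t_{i-1}, t_i]; n_censored: survivors at t_k.
        edges = [0.0] + list(times)
        u = 0.0
        for i, n_i in enumerate(counts):
            a, b = edges[i], edges[i + 1]
            p = rayleigh_surv(a, theta) - rayleigh_surv(b, theta)   # P(failure in interval)
            dp = (rayleigh_surv(a, theta) * a * a
                  - rayleigh_surv(b, theta) * b * b) / (2.0 * theta ** 2)
            u += n_i * dp / max(p, 1e-300)
        u += n_censored * times[-1] ** 2 / (2.0 * theta ** 2)       # censored contribution
        return u

    def mle_theta(times, counts, n_censored, tol=1e-10):
        # Bracket from a crude moment estimate (E[T^2] = 2*theta for the Rayleigh),
        # widened until the score changes sign, then plain bisection.
        mids = [0.5 * (a + b) for a, b in zip([0.0] + list(times[:-1]), times)]
        n_total = sum(counts) + n_censored
        crude = (sum(n * m * m for n, m in zip(counts, mids))
                 + n_censored * times[-1] ** 2) / (2.0 * n_total)
        lo, hi = crude / 10.0, crude * 10.0
        while score(lo, times, counts, n_censored) < 0.0:
            lo /= 10.0
        while score(hi, times, counts, n_censored) > 0.0:
            hi *= 10.0
        while hi - lo > tol * hi:
            mid = 0.5 * (lo + hi)
            if score(mid, times, counts, n_censored) > 0.0:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    if __name__ == "__main__":
        times = [1.0, 2.0, 3.0, 4.0]       # equidistant inspection times (assumed)
        counts = [11, 29, 31, 17]          # failures seen in each interval (made up)
        sigma2 = mle_theta(times, counts, n_censored=12)
        print("MLE of the Rayleigh scale sigma:", math.sqrt(sigma2))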

    EVALUATING THE IMPACT OF PREPARATION CONDITIONS AND FORMULATION ON THE ACCELERATED STABILITY OF TRETINOIN LOADED LIPOSOMES PREPARED BY HEATING METHOD

    Objective: The aim of this study was to prepare a stable all-trans retinoic acid (tretinoin, RA) loaded liposome formulation with high encapsulation efficiency, intended for immediate application on the skin, using a simple preparation method. Methods: Formulas were prepared by the heating method. The effect of formulation variables on liposome properties was investigated. Dynamic light scattering (DLS) and electrophoretic mobility measurements were used to determine the mean size and zeta potential, respectively. The encapsulation efficiency (EE%) was determined spectrophotometrically. Six liposome formulas (Fs) were prepared using three types of phospholipids and two ratios of cholesterol. Sodium cholesteryl sulfate (SCS) was added to all formulas except formula 1. The six formulas and a topical ethanolic solution of RA were stored at accelerated storage conditions for six months. EE% was determined by a validated HPLC method during the stability study. Results: A pH of 6.5 gave the highest EE%. A substantial decrease in zeta potential and EE% was observed with increasing ionic strength. SCS had a positive effect on the stability of the liposomes. Hydrogenated soybean phosphatidylcholine (HSPC) gave the best results for the stability of the liposomes and of RA. No leakage of RA from the liposomes was observed when cholesterol was added to the formulas. Conclusion: The phospholipid with a phase transition temperature (Tm) above the storage temperature showed the best stability over time and prevented leakage of the active substance without the need to add cholesterol. It is necessary to add a charge-inducing substance to the formula to improve its stability.

    Obesity and Hypertension in Students of Jahangirnagar University: Alarming Issues

    The prevalence of obesity and hypertension (HTN) among university students in Bangladesh has not been reported yet. With the proper health maintenance of this population in mind, this study aimed to determine the prevalence of obesity and HTN, as well as the relationship between them, among students of a residential university of Bangladesh, Jahangirnagar University. This descriptive cross-sectional study included 500 randomly selected students (250 males and 250 females). Participants completed a questionnaire on physical activity, sedentary behaviour, dietary factors, smoking, and family history of obesity, HTN, and coronary artery disease. Blood pressure and anthropometric parameters such as height, weight, and waist and hip circumferences were measured following standard procedures. Statistical analyses were performed using SPSS. The prevalence of overweight was 25% (31.1% males, 15.6% females) and of obesity 7.2% (9.4% males, 4% females). Pre-HTN was found in 27.1% (38% males, 11.2% females) and HTN in 2.2% (3.3% males, 0.4% females). A high rate of smoking, sedentary behaviour, physical inactivity, and excessive consumption of unhealthy food and caffeine-rich drinks was also observed. A significant correlation was found between parameters of obesity and HTN. A high prevalence of pre-HTN in males and of central obesity in females was found, which needs to be controlled immediately for better health maintenance of this population.
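
    As a rough illustration of how overweight and obesity prevalence figures are obtained from anthropometric measurements, the sketch below classifies body mass index (BMI) into the common WHO bands and tallies prevalence; the cut-offs and the toy records are assumptions, since the study's exact classification criteria are not restated here.

    # Illustrative sketch: BMI categories and prevalence from height/weight records.
    # The WHO cut-offs and the toy records are assumptions, not the study's data.

    def bmi(weight_kg, height_m):
        return weight_kg / (height_m ** 2)

    def category(b):
        # Common WHO bands (assumed): <18.5 underweight, 18.5-24.9 normal,
        # 25-29.9 overweight, >=30 obese.
        if b < 18.5:
            return "underweight"
        if b < 25.0:
            return "normal"
        if b < 30.0:
            return "overweight"
        return "obese"

    def prevalence(records):
        # records: list of (sex, weight_kg, height_m); returns percentage per category.
        counts = {}
        for _sex, w, h in records:
            c = category(bmi(w, h))
            counts[c] = counts.get(c, 0) + 1
        n = len(records)
        return {c: 100.0 * k / n for c, k in counts.items()}

    if __name__ == "__main__":
        toy = [("M", 82, 1.70), ("F", 55, 1.58), ("M", 95, 1.75), ("F", 48, 1.60)]
        print(prevalence(toy))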

    Removal of Mercury from Wastewater by Nanoparticle Pyrite (FeS2) and Ultrafiltration (UF) Membrane System

    This research investigated the removal of mercury by a Reactive Adsorbent Membrane (RAM) hybrid filtration process to obtain high-quality water from wastewater or water resources contaminated with Hg(II), while producing stable final residuals. Pyrite (FeS2) nanoparticles were employed as the reactive adsorbent, and the FeS2-contacted mercury residuals were separated by either a Dead-End Ultrafiltration (DE-UF) or a Cross-Flow Ultrafiltration (CF-UF) system. The first task of this research was to synthesize pyrite nanoparticles with high purity in a short reaction time. A microwave irradiation process was used to synthesize the pyrite, as microwave digestion has the advantage of producing fine particles of highly pure pyrite with minimal reaction time. Scanning electron microscopy (SEM) and energy-dispersive X-ray spectroscopy (EDS/EDX) were used to characterize the pyrite. The synthesized pyrite was used in all experiments. The reaction mechanism for Hg(II) removal by pyrite and the behavior of the treatment system were characterized by observing flux decline, pH change, and Hg and Fe concentrations in the permeate water over time. The effects of the presence of anions (Cl-, SO42-, NO3-) and natural organic matter (humic acid, HA) on Hg(II) removal were investigated. The stability of the final residuals was also evaluated using a thiosulfate solution (Na2S2O3) as the desorbing reagent. This study additionally examined the possibility of continuous removal of mercury by reusing the Hg/pyrite-laden membrane to treat additional Hg(II)-contaminated water. Analytical techniques used in this study included cold vapor atomic absorption spectrometry (CV-AAS) for mercury measurement, inductively coupled plasma optical emission spectrometry (ICP-OES) for Fe measurement, and a Thermo triode pH meter calibrated with pH 4, 7 and 10 buffers for pH measurement. The surfaces of the Hg/pyrite-deposited membranes were characterized using surface analysis techniques, including scanning electron microscopy (SEM) for surface topography and X-ray photoelectron spectroscopy (XPS) for the surface chemistry (oxidation state) of the solids. The results indicated that the Hg(II)-contacted FeS2 was completely rejected by both the dead-end and cross-flow ultrafiltration membrane systems regardless of the presence of anions and humic acid. However, Hg(II) removal was accompanied by considerable flux decline and pH change. Desorption tests were conducted using thiosulfate, and almost no release of Hg(II) or iron into the permeate water was observed, indicating that the formed residuals are very stable. A recycle test showed that this system successfully achieved the goal of continuous and complete removal of mercury from water.

    Program Flow Graph Decomposition

    The purpose of this thesis involved the implementation, validation, complexity analysis, and comparison of two graph decomposition approaches: Forman's algorithm for prime decomposition of a program flow graph, and Cunningham's approach for decomposing a program digraph into graph-oriented components. To validate the two implementations, each was tested with six inputs. The comparison of the two approaches was based on the following dimensions: time and space complexities, composability, repeated decomposition, and uniqueness. Forman's algorithm appears to have four advantages over Cunningham's algorithm: 1. the algorithm overhead (i.e., the time and space complexities) was lower in Forman's algorithm; 2. Forman's algorithm yields a unique set of decomposed units, whereas Cunningham's does not; 3. in Forman's algorithm, reconstructing the original graph from the decomposed prime graphs results in the original graph that was decomposed, whereas in Cunningham's algorithm, the attempt to reconstruct the original graph from the decomposed parts does not always yield the graph that was decomposed; 4. Forman's approach can be used to decompose a graph until it is irreducible (all its parts are primes), whereas Cunningham's algorithm decomposes the graph only once, even if it is still decomposable. Thus, Forman's approach can be recommended as a program flow graph decomposition algorithm. Implementation of the decomposition techniques could help in better software comprehension and could be used in the development of software reusability tools.
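
    As a rough illustration of the kind of unit a program flow graph decomposition isolates, the sketch below brute-forces node subsets of a tiny flow graph whose incoming edges all target a single node and whose outgoing edges all leave a single node. It is neither Forman's nor Cunningham's algorithm (both use stronger formal definitions and far more efficient procedures); the toy graph and the loose region definition are assumptions for illustration only.

    # Illustrative sketch only: naive brute-force search for single-entry/single-exit
    # node subsets in a tiny program flow graph.  This is NOT Forman's or Cunningham's
    # algorithm; the region definition here is deliberately loose.
    from itertools import combinations

    def sese_regions(graph, entry, exit_node):
        # graph: dict node -> list of successor nodes.
        # A subset qualifies when every edge entering it targets one node and
        # every edge leaving it departs from one node.
        nodes = list(graph)
        regions = []
        for size in range(2, len(nodes)):                # proper subsets with > 1 node
            for subset in combinations(nodes, size):
                s = set(subset)
                if entry in s or exit_node in s:         # keep regions strictly internal
                    continue
                entries = {v for u in graph for v in graph[u] if u not in s and v in s}
                exits = {u for u in s for v in graph[u] if v not in s}
                if len(entries) == 1 and len(exits) == 1:
                    regions.append(sorted(s))
        return regions

    if __name__ == "__main__":
        # Toy flow graph (assumed): an if/else diamond inside a straight-line sequence.
        g = {
            "start": ["a"],
            "a": ["b", "c"],   # branch
            "b": ["d"],
            "c": ["d"],        # join
            "d": ["end"],
            "end": [],
        }
        print(sese_regions(g, "start", "end"))           # -> [['a', 'b', 'c', 'd']]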

    Per Unit Cost Calculation of a Stand Alone PV System Considering the Equipment Cost

    This paper presents the per-unit cost calculation of a standalone photovoltaic (PV) system (home solar electricity) for residential use. The overall approach is to first calculate the cost of each major component in terms of user-specified variables: the peak power required to run the appliances, the total energy consumed per day on average, and the hours of sunshine. After calculating the component costs, they are added up to produce simple formulas with which to answer each of these questions. The major components of a solar PV system are the inverter, the solar panels, and the batteries. An inverter is a device that converts DC power into AC power at the desired output voltage and frequency; it allows home appliances to be used and is a major part of a solar PV system.
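
    The component-cost roll-up described above can be sketched in a few lines; the prices, lifetimes, and sizing rules below are placeholder assumptions rather than the paper's figures, and the three user-specified inputs mirror the ones listed above (peak power, daily energy, hours of sunshine).

    # Illustrative sketch of a per-unit (per-kWh) cost roll-up for a standalone PV
    # system.  All prices, component lifetimes, and sizing rules are placeholders.

    def pv_per_unit_cost(peak_load_kw, daily_energy_kwh, sun_hours,
                         panel_cost_per_kw=60000.0,      # currency units/kWp (assumed)
                         inverter_cost_per_kw=15000.0,   # currency units/kW (assumed)
                         battery_cost_per_kwh=12000.0,   # currency units/kWh (assumed)
                         autonomy_days=1.0, system_life_years=20.0):
        # Roll up panel, inverter and battery costs into a cost per kWh served.
        panel_kw = daily_energy_kwh / sun_hours           # array sized from daily energy
        inverter_kw = peak_load_kw                        # inverter sized from peak load
        battery_kwh = daily_energy_kwh * autonomy_days    # storage for one day's load

        capital = (panel_kw * panel_cost_per_kw
                   + inverter_kw * inverter_cost_per_kw
                   + battery_kwh * battery_cost_per_kwh)
        lifetime_energy = daily_energy_kwh * 365.0 * system_life_years
        return capital / lifetime_energy                  # cost per kWh delivered

    if __name__ == "__main__":
        # Example household (assumed): 1.5 kW peak load, 6 kWh/day, 5 sun hours.
        print(round(pv_per_unit_cost(1.5, 6.0, 5.0), 2), "per kWh")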

    Computer Assisted Drug Design of Tinosporide for treatment of Cancer: a Combined Density Functional and Molecular Docking Study

    This article discusses the theory behind the most important methods and recent successful applications of computer-aided drug design for halogen-directed tinosporide. Ligand-based methods use only ligand information to predict activity based on similarity or dissimilarity to previously known active ligands. We review widely used ligand-based methods such as ligand-based pharmacophores, molecular descriptors, and quantitative structure-activity relationships (QSAR). In addition, important tools necessary for the successful implementation of various computer-aided drug discovery/design (CADD) methods in the discovery of the best tinosporide analogues, such as target/ligand databases, homology modeling, and ligand ADMET prediction, are discussed. Finally, computational methods for toxicity prediction and for optimization toward favorable physiological properties are discussed, together with successful leads for tinosporides from the literature. The therapeutic potential of tinosporide has been studied extensively, and its active compounds are shown to be involved in modulating multiple physiological responses. Moreover, this article reviews the structures of a series of halogen-directed tinosporides before illustrating how the molecules exert their functions via interactions with various signal transducer and activator of transcription proteins, which were built by homology modeling. Strategies for CADD vary depending on the extent of structural and other information available regarding the target (enzyme/receptor) and the ligands. The process by which a new tinosporide product is brought to the market stage is referred to by a number of names, most commonly the development chain, and consists of a number of distinct stages. Keywords: CADD; ADMET; Molecular Modeling; Tinosporide
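
    One small, concrete piece of the descriptor/ADMET filtering mentioned above can be sketched as a Lipinski rule-of-five screen; the analogue names and descriptor values below are invented placeholders, not computed properties of tinosporide or its halogen-directed derivatives.

    # Illustrative sketch: Lipinski rule-of-five screen over candidate analogues.
    # Names and descriptor values are invented placeholders, not tinosporide data.

    def lipinski_pass(mw, logp, h_donors, h_acceptors, max_violations=1):
        # Classic rule of five: MW <= 500, logP <= 5, H-bond donors <= 5,
        # H-bond acceptors <= 10; at most `max_violations` rules may be broken.
        violations = sum([mw > 500.0, logp > 5.0, h_donors > 5, h_acceptors > 10])
        return violations <= max_violations

    if __name__ == "__main__":
        candidates = {                     # (MW, logP, HBD, HBA), placeholder values
            "analogue_A": (374.4, 1.2, 2, 6),
            "analogue_B": (545.9, 4.8, 3, 9),
            "analogue_C": (612.1, 5.6, 4, 12),
        }
        for name, props in candidates.items():
            print(name, "passes rule of five:", lipinski_pass(*props))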

    Automatic Hemorrhage Segmentation In Brain CT Scans Using Curriculum-based Semi-Supervised Learning

    One of the major neuropathological consequences of traumatic brain injury (TBI) is intracranial hemorrhage (ICH), which requires swift diagnosis to avert perilous outcomes. We present a new automatic hemorrhage segmentation technique based on curriculum-based semi-supervised learning. It employs a pre-trained lightweight encoder-decoder framework (MobileNetV2) on labeled and unlabeled data. The model integrates consistency regularization for improved generalization, encouraging stable predictions for original and augmented versions of the unlabeled data. The training procedure employs curriculum learning to progressively train the model at diverse complexity levels. We use the PhysioNet dataset to train and evaluate the proposed approach. The performance results surpass those of the supervised model, with an average Dice coefficient and Jaccard index of 0.573 and 0.428, respectively. Additionally, the method achieves 87.86% accuracy in hemorrhage classification and a Cohen's Kappa value of 0.81, indicating substantial agreement with the ground truth.
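
    A minimal sketch of the consistency-regularization term is given below, assuming a PyTorch segmentation model that outputs per-pixel logits; the flip-based augmentation and the loss weighting are assumptions, and the MobileNetV2 encoder-decoder, curriculum schedule, and data pipeline of the paper are omitted.

    # Minimal sketch of a consistency-regularization loss, assuming a PyTorch
    # segmentation model that returns per-pixel logits.  The flip augmentation and
    # loss weight are assumptions; the paper's MobileNetV2 encoder-decoder,
    # curriculum schedule, and data pipeline are omitted.
    import torch
    import torch.nn.functional as F

    def semi_supervised_loss(model, x_labeled, y_labeled, x_unlabeled, lam=0.5):
        # Supervised part: per-pixel binary cross-entropy on labeled hemorrhage masks.
        sup = F.binary_cross_entropy_with_logits(model(x_labeled), y_labeled)

        # Consistency part: predictions on original vs. flipped unlabeled scans
        # should agree once the flipped prediction is flipped back.
        with torch.no_grad():
            p_orig = torch.sigmoid(model(x_unlabeled))      # pseudo-target, no gradient
        x_aug = torch.flip(x_unlabeled, dims=[-1])          # simple horizontal flip
        p_aug = torch.flip(torch.sigmoid(model(x_aug)), dims=[-1])
        cons = F.mse_loss(p_aug, p_orig)

        return sup + lam * cons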

    Status and economic valuation of ecosystem services of Tanguar haor: A wetland of Bangladesh

    The Tanguar haor wetland is one of the listed Ramsar sites, rich in biodiversity, and provides several ecosystem services that contribute significantly to the national economy of Bangladesh. However, these services are declining day by day due to natural and anthropogenic activities. The purposes of this study were to identify the ecosystem services utilized by communities, estimate the economic value of those services, and determine the basic reasons for their depletion. Data were collected through a baseline survey, checklists, face-to-face questionnaire surveys, and focus group discussions (FGD) from 120 resident respondents and 50 tourist respondents, selected by random sampling, between November 2016 and September 2017. Physicochemical characteristics of the water were determined; the mean temperature and pH were 28.26°C and 7.72, respectively. The highest TDS, EC, DO, NO3 and PO4 values were 1020 mg L−1, 1460 μS cm−1, 8.56 mg L−1, 1.769 mg L−1 and 0.078 mg L−1, respectively. Commonly utilized ecosystem services were crops, vegetables, fuel, fresh water, fish and migratory birds, climate regulation, water purification, natural hazard protection, aesthetics, social relations, recreation and tourism, health benefits, primary production, nutrient cycling, water cycling, habitats for species and provision of habitat. The Market Value Method (MVM) and the Contingent Valuation Method (CVM) were applied to measure the economic value of the Tanguar haor wetland services. The DPSIR framework and an Impact Matrix (IM) were applied for conceptual analysis to identify the effects on ecosystem services. The total economic value of 39 ecosystem services of Tanguar haor was estimated at 174,039,980 BDT year−1. Mismanagement of biodiversity, over-exploitation, sedimentation of the haor, climate change, illegal hunting, land-use changes and habitat changes were the factors responsible for depleting ecosystem services. An impact factor of 4.161 was attributed to natural and anthropogenic factors acting on ecosystem services. The Tanguar haor wetland services hold vast prospects for nearby communities. Finally, the research suggests several sustainable management approaches with the potential to protect the services of the wetland.
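
    The valuation step can be pictured with a short sketch that adds market values (quantity times price) for provisioning services to a contingent-valuation term (stated willingness to pay scaled to the beneficiary population); every service name, quantity, price, and willingness-to-pay figure below is a placeholder, not the study's data.

    # Illustrative sketch of a Market Value Method plus Contingent Valuation Method
    # roll-up.  All service names, quantities, prices, and WTP figures are placeholders.

    def market_value(provisioning):
        # provisioning: {service: (annual_quantity, unit_price_BDT)}.
        return sum(q * p for q, p in provisioning.values())

    def contingent_value(wtp_per_household, n_households):
        # Mean stated willingness to pay scaled to the beneficiary population.
        return wtp_per_household * n_households

    if __name__ == "__main__":
        provisioning = {              # placeholder quantities and prices
            "fish (kg)": (120000, 250),
            "fuel wood (kg)": (80000, 12),
            "fresh water (m3)": (50000, 5),
        }
        total = (market_value(provisioning)
                 + contingent_value(wtp_per_household=450, n_households=20000))
        print("Estimated total economic value:", total, "BDT per year")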

    Mass media utilization to promote public behavior change during COVID-19 situation: A population survey of Dhaka city

    A huge amount of information associated with the novel coronavirus disease 2019 (COVID-19) outbreak was circulated by the mass media in Bangladesh. There are few examples so far of how media interventions during an epidemic can affect public behaviour in Bangladesh. We aimed to assess the mass media's influence on changing public behaviour during the second wave of COVID-19. An online cross-sectional survey of 416 Bangladeshi respondents was conducted between August and September 2021. Besides descriptive statistics, the datasets were analyzed with statistical methods such as Pearson's correlation coefficient and a stepwise multiple regression model. The results showed that change in knowledge level towards COVID-19 (10 items) had the strongest association with behaviour change towards COVID-19 (16 items), indicating a high adoption of public behaviour change. There was a significant positive relationship between behaviour change towards COVID-19 (16 items) and the media's role in raising awareness regarding COVID-19 (r = 0.342, p < .001), while there was a negative relationship between behaviour change towards COVID-19 (16 items) and the age of the participants (r = −0.234, p < .001). The results also showed that knowledge level, media credibility, and media check-in made the largest contributions to influencing public behaviour change. We also found that social media was, as expected, the most heavily used medium during the COVID-19 outbreak. The outcomes of the survey have vital implications for public behaviour change and may support infectious disease suppression and control. Our outcomes also stress the importance of the reliability of information shared via mass media outlets and of practical strategies to counter misinformation during the COVID-19 outbreak.
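
    The correlation step reported above can be sketched with toy score vectors; the scores below are invented, and the study's actual 16-item behaviour scale, 10-item knowledge scale, and stepwise regression specification are not reproduced.

    # Illustrative sketch: Pearson correlation between two toy score vectors.
    # The scores are invented; they are not the survey's item totals.
    import math

    def pearson_r(x, y):
        # Plain Pearson product-moment correlation coefficient.
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    if __name__ == "__main__":
        behaviour_change = [52, 61, 48, 70, 66, 55, 43, 68]   # toy behaviour totals
        media_awareness = [30, 36, 27, 41, 39, 33, 25, 40]    # toy awareness scores
        print("r =", round(pearson_r(behaviour_change, media_awareness), 3))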