
    Methods of Cultivation of Hyperthermophiles that Utilize Crude Oil

    This study demonstrated the presence of hyperthermophilic organisms in the upper Jurassic Smackover formation in Womack Hills, AL. Evidence for the presence of these organisms was shown by the cultivation of an aerobic and an anaerobic, oil-degrading hyperthermophilic culture from the cuttings of an oil well in the Jurassic Smackover at 90°C. Viability of microorganisms in the formation was established through electron microscopy, by carbon dioxide production, and by protein production during incubation in medium at 90°C. Not only was the presence of viable microorganisms in the reservoir established, but as a result of this study, new cultivation methods were also developed that may prove useful in future studies of these types of organisms.

    Assessment of Management Factors Prior to Breeding and their Impact on Bovine Fertility

    Management of female infertility is a primary determinant of economic efficiency in the cattle industry. Management factors that impact fertility include identification of females with suboptimal fertility and reduction of the period of anestrus prior to pubescence and after parturition. The use of anti-Müllerian hormone in the identification of females with suboptimal follicular populations allows for selection of females with optimal follicular populations and could reduce infertility resulting from a decrease in the quantity of follicles. A reduction in the period of anestrus also impacts fertility, and management strategies that induce an ovulatory response in anestrous females improve fertility. Biostimulation has advanced pubescence in heifers and reduced the length of postpartum anestrus in cows. Advancing the understanding of anti-Müllerian hormone and the biostimulatory effect allows for further assessment of these management factors and their impact on infertility. Improved management of female infertility increases the profitability of cattle production.

    Adolescent Delinquency and Family Processes among Single Parent Families

    This study used secondary data from the National Longitudinal Study of Adolescent to Adult Health (Add Health) to examine the relationship between adolescent delinquency and family processes (i.e., relationship to residential parents and autonomy) among single-mother and single-father families. The findings indicate that adolescents in single-mother families reported a higher-quality relationship to residential parents than those living with single fathers. Additionally, the relationship to residential parents variable was modestly predictive of adolescent delinquency. However, the results indicate there is no statistically significant difference between rates of adolescent delinquency among single-mother and single-father families. Research and practical implications of this study are discussed.

    Guest Intercalation Into Metal Halide Inorganic-Organic Layered Perovskite Hybrid Solids And Hydrothermal Synthesis Of Tin Oxide Spheres

    The work presented in this thesis is divided into two research areas. In part I, the synthesis and guest intercalation of inorganic-organic metal halide ammonium layered perovskites is discussed. Comparisons are made between the solid matrix before and after the intercalations, and all solids are characterized using thermogravimetric analysis (TGA) and powder X-ray diffraction (XRD). In part II, “template-free” hydrothermal synthesis of tin oxide spheres in the presence of different ortho-substituted anilines is discussed. The aim is to determine whether there are differences in the structures, shapes, and surface morphology of the tin oxide spheres that correspond to the identity/shape of the ortho-substituent on the anilines. Solids were characterized by XRD, transmission electron microscopy (TEM), and scanning electron microscopy (SEM) techniques.

    Feed Quality Effects on Modern Heavy Broiler Performance

    Commercial broilers are fed exclusively pelleted diets; this is due to research that has demonstrated numerous benefits to feeding pellets. The first objective was to investigate the effects of modest improvements in pellet quality on two modern broiler strains. Regardless of strain, feeding 80% pellets improved broiler performance from d 28 to 42. The second objective was to investigate the effects of feed form and liquid application method on feed augering segregation and subsequent broiler performance. In general, percent pellets steadily decreased across location throughout feed augering. Also, phytase segregation occurred throughout augering and was exacerbated in post-pellet liquid application diets. When the augered diets were fed to broilers, 75% pellets and post-pellet liquid application diets improved performance. The final objective was to investigate the change in percent pellets as feed was augered throughout an entire commercial poultry house. Ultimately, creating high-quality pellets decreases pellet attrition and improves broiler performance.

    Quantification of Storm Surge Probability using Ensemble SLOSH Model Data

    One of the greatest hazards from hurricanes is the flooding due to storm surge. Emergency managers traditionally plan for storm surge by looking at the worst possible impact and design their plans accordingly. This is a safe course of action, but can also be a wasted expense if the worst case does not occur. Risk-based planning is a way to incorporate the likelihood or probability of an impact occurring into emergency planning. With respect to storm surge, though, there is very little information regarding probability of occurrence. This research uses data from a commonly accepted storm surge model, SLOSH from the National Weather Service, to develop probabilities of impact. The process and products are prototypes utilizing data from the 2007 SLOSH model run for the New Orleans basin. Products developed include a map of probability, probabilities of exceedance, and a list of model storms that generate surge at given locations.
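    The core of the probability-of-exceedance product can be sketched as a simple frequency count over the ensemble of model storms. The following is a minimal illustration of that idea; the surge values and thresholds are invented for the example and are not actual SLOSH output.

```python
# Hypothetical sketch: empirical exceedance probabilities at one
# location, estimated from an ensemble of storm surge model runs.
import numpy as np

def exceedance_probability(ensemble_surges, thresholds):
    """For each threshold (ft), the fraction of ensemble members
    whose peak surge at this location meets or exceeds it."""
    surges = np.asarray(ensemble_surges, dtype=float)
    return {t: float(np.mean(surges >= t)) for t in thresholds}

# Peak surge (ft) at one grid cell across 20 hypothetical model storms
ensemble = [2.1, 3.4, 5.0, 1.2, 6.8, 4.4, 3.9, 7.5, 2.8, 5.6,
            4.1, 3.0, 6.2, 2.5, 5.9, 4.8, 1.9, 3.7, 6.5, 4.0]

probs = exceedance_probability(ensemble, thresholds=[2.0, 4.0, 6.0])
for t, p in probs.items():
    print(f"P(surge >= {t} ft) = {p:.2f}")
```

Mapping these per-cell probabilities across the basin would yield the kind of probability map and exceedance products the abstract describes.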

    Introducing Transferability and the UPMDS Usability Framework in a Multiple-Device System

    This research introduces the concept of transferability into the usability construct and creates the Usability Paradigm for Multiple Device System (UPMDS) to conceptualize and quantify usability in multiple-device scenarios. This study fills a gap in the literature: no effective method exists for measuring transferability or for quantifying usability in a multiple-device context. The study also answers research questions regarding the impact of task complexity, user experience, and device order on the total usability of the system. Study one follows a systematic approach to develop, validate, and apply a new questionnaire tailored specifically to measure transferability within a multiple-device system. The System Transferability Questionnaire (STQ) is obtained after validation with 15 question items. In a software usability study, the STQ demonstrated excellent internal reliability and validity. Results show that the STQ is effective in capturing four factors regarding transferability: transfer experience (TE), overall experience (OE), consistency perception (CP), and functionality perception (FP). Validation results show good convergent, discriminant, criterion, and nomological validity. Study two adopts a systematic tool to consolidate usability constructs into a total usability score. The study utilizes principal component analysis (PCA) to determine the weights of the four usability components (satisfaction, transferability, effectiveness, and efficiency), which are used to obtain the total usability score. Results show slightly different weights for the four components. This quantitative tool can be applied in different usability contexts in which multiple devices are involved. Usability specialists are encouraged to adjust the tool for different usability scenarios. Study three investigates the impact of task complexity, user experience, and device order on total system usability. Results show that the total usability score is not affected by task complexity, user experience, or device order. However, lower physical task complexity leads to longer performance time and fewer user errors, and highly experienced users made significantly fewer errors in tasks. The machine order also produced divergent results: when the mini-lathe machine was used first, users had better transferability results but poorer performance outcomes compared to when the drill press was used first.
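    The PCA-weighting step in study two can be sketched as follows. This is only an illustration of the general technique (loadings of the first principal component, normalized into weights), not the dissertation's actual procedure or data; all scores below are invented.

```python
# Hypothetical sketch: derive weights for four usability components
# (satisfaction, transferability, effectiveness, efficiency) from the
# first principal component of participant scores, then combine them
# into one total usability score per participant. Data are made up.
import numpy as np

def pca_weights(scores):
    """First-principal-component loadings, normalized to sum to 1."""
    X = scores - scores.mean(axis=0)              # center each component
    cov = np.cov(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    pc1 = np.abs(eigvecs[:, np.argmax(eigvals)])  # PC1 loadings
    return pc1 / pc1.sum()

# Rows: participants; columns: satisfaction, transferability,
# effectiveness, efficiency (all rescaled to a 0-100 range).
scores = np.array([[80, 75, 90, 70],
                   [65, 60, 72, 55],
                   [90, 85, 95, 80],
                   [70, 72, 80, 60],
                   [85, 78, 88, 74]], dtype=float)

w = pca_weights(scores)
total = scores @ w        # one total usability score per participant
```

Because the weights are nonnegative and sum to one, each total score is a convex combination of that participant's component scores, which keeps the total on the same 0-100 scale.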

    Deterministic and Random Isogeometric Analysis of Fluid Flow in Unsaturated Soils

    The main objective of this research is to use isogeometric analysis (IGA) as an efficient and robust alternative for numerical simulation of unsaturated seepage problems. Moreover, this research develops an IGA-based probabilistic framework that can properly account for the variability of soil hydraulic properties in the simulations. In the first part, IGA is used in a deterministic framework to solve a head-based form of Richards’ equation. It is shown that IGA is able to properly simulate changes in pore pressure at the interface between soil layers. In the second part of this research, a new probabilistic framework, named random IGA (RIGA), is developed. A joint lognormal distribution function is used with IGA to perform Monte Carlo simulations. The results depict the statistical outputs relating to seepage quantities and pore water pressure. It is shown that pore water pressure, flow rate, and related quantities change considerably with respect to the standard deviation and correlation of the model parameters.
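    The random-parameter step of such a Monte Carlo framework can be sketched as drawing correlated lognormal samples of the soil hydraulic parameters, with each sample feeding one deterministic run. The parameter names, means, coefficients of variation, and correlation below are illustrative assumptions, not values from the dissertation.

```python
# Hypothetical sketch: correlated lognormal sampling of two soil
# hydraulic parameters (e.g. saturated conductivity k_s and a
# van Genuchten parameter alpha) for Monte Carlo seepage runs.
import numpy as np

def correlated_lognormal(mean, cov_frac, corr, n, seed=0):
    """Draw n samples of correlated lognormal variables.

    mean     : arithmetic means of each variable
    cov_frac : coefficients of variation (std / mean)
    corr     : correlation matrix of the underlying normals
    """
    mean = np.asarray(mean, float)
    cv = np.asarray(cov_frac, float)
    # Moment-match the underlying normal distribution so that
    # exp(N(mu, sigma^2)) has the requested arithmetic mean.
    sigma2 = np.log(1.0 + cv**2)
    mu = np.log(mean) - 0.5 * sigma2
    sigma = np.sqrt(sigma2)
    cov = corr * np.outer(sigma, sigma)
    rng = np.random.default_rng(seed)
    return np.exp(rng.multivariate_normal(mu, cov, size=n))

samples = correlated_lognormal(
    mean=[1e-5, 0.08],     # k_s (m/s), alpha (1/cm) -- illustrative
    cov_frac=[0.5, 0.3],
    corr=np.array([[1.0, 0.5], [0.5, 1.0]]),
    n=10_000,
)
# Each row is one parameter set for a deterministic IGA seepage run.
```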

    Generalizing List Scheduling for Stochastic Soft Real-time Parallel Applications

    Advanced architecture processors provide features such as caches and branch prediction that result in improved, but variable, execution time of software. Hard real-time systems require tasks to complete within timing constraints. Consequently, hard real-time systems are typically designed conservatively through the use of tasks' worst-case execution times (WCET) in order to compute deterministic schedules that guarantee task execution within given time constraints. This use of pessimistic execution time assumptions provides real-time guarantees at the cost of decreased performance and resource utilization. In soft real-time systems, however, meeting deadlines is not an absolute requirement (i.e., missing a few deadlines does not severely degrade system performance or cause catastrophic failure). In such systems, a guaranteed minimum probability of completing by the deadline is sufficient. Therefore, there is considerable latitude in such systems for improving resource utilization and performance as compared with hard real-time systems, through the use of more realistic execution time assumptions. Given probability distribution functions (PDFs) representing tasks' execution time requirements, and tasks' communication and precedence requirements represented as a directed acyclic graph (DAG), this dissertation proposes and investigates algorithms for constructing non-preemptive stochastic schedules. New PDF manipulation operators developed in this dissertation are used to compute tasks' start and completion time PDFs during schedule construction. PDFs of the schedules' completion times are also computed and used to systematically trade the probability of meeting end-to-end deadlines for schedule length and jitter in task completion times. Because of the NP-hard nature of the non-preemptive DAG scheduling problem, the new stochastic scheduling algorithms extend traditional heuristic list scheduling and genetic list scheduling algorithms for DAGs by using PDFs instead of fixed time values for task execution requirements. The stochastic scheduling algorithms also account for delays caused by communication contention, typically ignored in prior DAG scheduling research. Extensive experimental results demonstrate the efficacy of the new algorithms in constructing stochastic schedules. Results also show that, through the use of the techniques developed in this dissertation, the probability of meeting deadlines can be usefully traded for performance and jitter in soft real-time systems.
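    One basic PDF-manipulation operator of the kind described above can be illustrated with discrete distributions: if two tasks run back-to-back and their execution times are independent, the completion-time PDF of the pair is the convolution of the two. This is a generic sketch of that operation, not the dissertation's specific operators; the example PDFs are invented.

```python
# Hypothetical sketch: completion-time PDF of two sequential tasks
# whose execution times are independent discrete PDFs (probability
# mass per unit time slot, indexed from t = 0).
import numpy as np

def convolve_pdfs(pdf_a, pdf_b):
    """PDF of the sum of two independent discrete execution times."""
    return np.convolve(pdf_a, pdf_b)

# Task A takes 1, 2, or 3 slots; Task B takes 2 or 3 slots
a = np.array([0.0, 0.2, 0.5, 0.3])   # index = number of slots
b = np.array([0.0, 0.0, 0.6, 0.4])

c = convolve_pdfs(a, b)              # completion-time PDF of A then B

# Probability of meeting an end-to-end deadline of 5 slots:
# cumulative mass at t <= 5.
p_meet = c[:6].sum()
```

Chaining such convolutions (plus max-type operators at join points of the DAG) is what lets a scheduler compute start and completion time PDFs during schedule construction and trade deadline-miss probability against schedule length.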

    A Domain Specific Language Based Approach for Generating Deadlock-Free Parallel Load Scheduling Protocols for Distributed Systems

    In this dissertation, the concept of using a domain specific language to develop error-free parallel asynchronous load scheduling protocols for distributed systems is studied. The motivation for this study is rooted in addressing the high cost of verifying parallel asynchronous load scheduling protocols. Asynchronous parallel applications are prone to subtle bugs such as deadlocks and race conditions due to the possibility of non-determinism. Because of this non-deterministic behavior, traditional testing methods are less effective at finding software faults. One approach that can eliminate these software bugs is to employ model checking techniques that can verify that non-determinism will not cause software faults in parallel programs. Unfortunately, model checking requires the development of a verification model of a program in a separate verification language, which can be an error-prone procedure and may not properly represent the semantics of the original system. The model checking approach can provide true-positive results only if the semantics of the implementation code and the verification model are represented under a single framework, such that the verification model closely represents the implementation and the automation of the verification process is natural. In this dissertation, a domain specific language based verification framework is developed to design parallel load scheduling protocols and automatically verify their behavioral properties through model checking. A specification language, LBDSL, is introduced that facilitates the development of parallel load scheduling protocols. The LBDSL verification framework uses model checking techniques to verify the asynchronous behavior of the protocol, and it allows the same protocol specification to be used for both verification and code generation. Support for automatic verification during protocol development reduces post-development verification cost. The applicability of the LBDSL verification framework is illustrated through case studies on three different types of load scheduling protocols. The study shows that the LBDSL-based verification approach removes the need to debug for deadlocks and race bugs, which has the potential to significantly lower software development costs.
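    The essence of the model-checking step can be sketched at toy scale: exhaustively explore the reachable state space of a small concurrent protocol and flag non-terminal states with no enabled move (deadlocks). The two-process, two-lock protocol below is a classic illustrative example, not a protocol from the dissertation.

```python
# Hypothetical toy sketch of explicit-state model checking for
# deadlock: process 0 acquires lock A then B; process 1 acquires
# B then A (a deadlock-prone acquisition order).
# pc 0-1: acquire ORDERS[p][pc]; pc 2-3: release in reverse; pc 4: done.
ORDERS = {0: ("A", "B"), 1: ("B", "A")}
TERMINAL = (4, 4, None, None)

def successors(state):
    """Yield all states reachable in one step from
    state = (pc0, pc1, holder_of_A, holder_of_B)."""
    pc0, pc1, hA, hB = state
    holders = {"A": hA, "B": hB}
    pcs = [pc0, pc1]
    for p in (0, 1):
        pc = pcs[p]
        if pc < 2:                        # try to acquire next lock
            lock = ORDERS[p][pc]
            if holders[lock] is not None:
                continue                  # blocked: lock held by peer
            nh = dict(holders); nh[lock] = p
        elif pc < 4:                      # release locks in reverse order
            nh = dict(holders); nh[ORDERS[p][3 - pc]] = None
        else:
            continue                      # process finished
        npcs = list(pcs); npcs[p] += 1
        yield (npcs[0], npcs[1], nh["A"], nh["B"])

def find_deadlocks():
    """Depth-first search over all reachable states."""
    start = (0, 0, None, None)
    seen, stack, deadlocks = {start}, [start], []
    while stack:
        s = stack.pop()
        succ = list(successors(s))
        if not succ and s != TERMINAL:
            deadlocks.append(s)           # stuck, but not finished
        for t in succ:
            if t not in seen:
                seen.add(t)
                stack.append(t)
    return deadlocks

deadlocks = find_deadlocks()
# Finds the circular wait where each process holds one lock and
# waits for the other: state (1, 1, 0, 1).
```

A DSL-based framework like the one described generates both this kind of verification model and the implementation from a single protocol specification, which is what keeps the two in sync.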