
    A step-by-step guide to non-linear regression analysis of experimental data using a Microsoft Excel spreadsheet

    The objective of this study was to introduce a simple, easily understood method for carrying out non-linear regression analysis based on user-input functions. While it is relatively straightforward to fit data with simple functions such as linear or logarithmic functions, fitting data with more complicated non-linear functions is more difficult. Commercial specialist programmes are available that will carry out this analysis, but they are expensive and not intuitive to learn. An alternative, described here, is to use the SOLVER function of the ubiquitous spreadsheet programme Microsoft Excel, which employs an iterative least-squares fitting routine to produce the optimal goodness of fit between data and function. This paper leads the reader through an easily understood, step-by-step guide to implementing this method, which can be applied to any function of the form y = f(x) and is well suited to fast, reliable analysis of data in all fields of biology.
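
    The paper's worksheets are not reproduced here, but the core idea SOLVER applies, iteratively adjusting parameters to minimise the sum of squared residuals between the data and a user-defined y = f(x), can be sketched in Python. The model, data and starting values below are illustrative assumptions, and the simple shrinking-step search stands in for SOLVER's own minimisation routine:

```python
import math

def sse(params, xs, ys, f):
    """Sum of squared errors between the data and model f(x, params)."""
    return sum((y - f(x, params)) ** 2 for x, y in zip(xs, ys))

def solver_fit(f, xs, ys, start, step=0.5, tol=1e-9):
    """Iteratively nudge each parameter to reduce the SSE, halving the
    step when no move helps -- the same idea SOLVER applies to a target cell."""
    params = list(start)
    best = sse(params, xs, ys, f)
    while step > tol:
        improved = False
        for i in range(len(params)):
            for delta in (step, -step):
                trial = list(params)
                trial[i] += delta
                s = sse(trial, xs, ys, f)
                if s < best:
                    params, best = trial, s
                    improved = True
        if not improved:
            step /= 2.0
    return params, best

# Hypothetical saturation-type data generated from y = 2.0 * (1 - exp(-0.8 x))
model = lambda x, p: p[0] * (1.0 - math.exp(-p[1] * x))
xs = [0.5, 1, 2, 3, 4, 6]
ys = [model(x, [2.0, 0.8]) for x in xs]
params, best = solver_fit(model, xs, ys, start=[1.0, 0.1])
# parameters recovered close to a = 2.0, b = 0.8
```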

    Introductory of Microsoft Excel SOLVER function-Spreadsheet method for isotherm and kinetics modelling of metals biosorption in water and wastewater

    This paper aims to introduce a simple method for running a complicated non-linear analysis of isotherm and kinetics models for metals biosorption based on spreadsheet input functions. A robust method is demonstrated here that exploits the SOLVER function available in the Microsoft (MS) Excel spreadsheet; it is more economical and user-friendly than specialised computer programmes. An iterative method was proposed to produce the optimal goodness of fit between experimental and predicted data. The implementation was described for a set of real data (garden grass as biosorbent), and the predicted results were compared with linear analysis and MATLAB analysis. The R² values found from the MS Excel spreadsheet were 0.995, 0.999 and 0.996, versus 0.997, 1.000 and 0.999 by MATLAB, for copper, lead and cadmium adsorption onto garden grass, respectively. The predictions of maximum adsorption, qm, by Excel (59.336, 63.663 and 42.310 mg/g) were very similar to those by MATLAB (59.889, 63.509 and 41.560 mg/g), and the predicted kinetics parameters were also close to the MATLAB analysis. Hence, the MS Excel spreadsheet method could be a handy tool for biosorption modelling.
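
    The R² values the abstract compares are straightforward to compute once the isotherm has been fitted. A minimal sketch follows, using the Langmuir isotherm; the equilibrium data and the fitted parameters are illustrative assumptions, not the paper's garden-grass values:

```python
def langmuir(ce, qm, b):
    """Langmuir isotherm: qe = qm * b * Ce / (1 + b * Ce)."""
    return qm * b * ce / (1.0 + b * ce)

def r_squared(observed, predicted):
    """Coefficient of determination between data and model predictions."""
    mean = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

# Illustrative equilibrium data (Ce in mg/L, qe in mg/g)
ce = [5, 10, 20, 40, 80]
qe = [18.2, 28.6, 39.5, 48.9, 54.1]
# Assumed fitted parameters qm (mg/g) and b (L/mg)
fitted = [langmuir(c, qm=60.0, b=0.08) for c in ce]
r2 = r_squared(qe, fitted)
```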

    LEAF-E: a tool to analyze grass leaf growth using function fitting

    In grasses, leaf growth is often monitored to gain insights into growth processes, biomass accumulation, regrowth after cutting, etc. To study the growth dynamics of the grass leaf, its length is measured at regular time intervals to derive the leaf elongation rate (LER) profile over time. From the LER profile, parameters such as maximal LER and leaf elongation duration (LED), which are essential for detecting inter-genotype growth differences and/or quantifying plant growth responses to changing environmental conditions, can be determined. As growth is influenced by the circadian clock and, especially in grasses, by changes in environmental conditions such as temperature and evaporative demand, the LER profiles show considerable experimental variation and thus often do not follow a smooth curve. Hence, it is difficult to quantify the duration and timing of growth. For these reasons, the measured data points should be fitted with a suitable mathematical function, such as the beta sigmoid function for leaf elongation. In the context of high-throughput phenotyping, we implemented the fitting of leaf growth measurements in a user-friendly Microsoft Excel-based macro, a tool called LEAF-E. LEAF-E allows the user to perform non-linear regression modeling of leaf length measurements, suitable for robust and automated extraction of leaf growth parameters such as LER and LED from large datasets. LEAF-E is particularly useful for quantifying the timing of leaf growth, which is an important added value for detecting differences in leaf growth development. We illustrate the broad application range of LEAF-E using published and unpublished datasets of maize, Miscanthus spp. and Brachypodium distachyon, generated in independent experiments and for different purposes. In addition, we show that LEAF-E can also be used to fit datasets of other growth-related processes that follow a sigmoidal profile, such as cell length measurements along the leaf axis. Given its user-friendliness, its ability to quantify the duration and timing of leaf growth, and its broad application range, LEAF-E is a tool that could be routinely used to study growth processes following a sigmoidal profile.
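
    LEAF-E itself is an Excel macro, but the extraction of maximal LER and its timing from a fitted sigmoid can be sketched independently. The functional form below is the commonly used beta growth function of Yin et al., which is one beta sigmoid variant; the parameter values (final length, timing) are illustrative assumptions:

```python
def beta_growth(t, l_max, t_m, t_e):
    """Beta growth function (Yin et al.): leaf length at time t, reaching
    l_max at time t_e, with maximal growth rate at time t_m."""
    if t >= t_e:
        return l_max
    return l_max * (1.0 + (t_e - t) / (t_e - t_m)) * (t / t_e) ** (t_e / (t_e - t_m))

# Illustrative parameters: final length 800 mm, fastest growth at day 4, growth ends day 10
l_max, t_m, t_e = 800.0, 4.0, 10.0
dt = 0.01
times = [i * dt for i in range(int(t_e / dt))]
# Numerical LER profile (mm/day) from the fitted curve
ler = [(beta_growth(t + dt, l_max, t_m, t_e) - beta_growth(t, l_max, t_m, t_e)) / dt
       for t in times]
max_ler = max(ler)
t_at_max = times[ler.index(max_ler)]   # maximal LER occurs near t_m; LED is ~t_e
```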

    Ca analysis: an Excel-based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis

    Here I present an Excel-based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the steps needed to convert recorded raw voltage changes into meaningful physiological information. It performs two fundamental processes: (1) it can prepare the raw signal by several methods, and (2) it can then analyze the prepared data to provide information such as absolute intracellular Ca levels. In addition, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs as well as commercially available software but has numerous advantages, notably a simplified, self-contained analysis workflow.
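
    The abstract does not detail the conversion step, but programs of this kind typically derive absolute Ca levels from a ratiometric signal via the standard Grynkiewicz calibration equation; a minimal sketch follows, with calibration constants that are illustrative assumptions for a fura-2-style dye:

```python
def ca_from_ratio(r, r_min, r_max, beta, kd):
    """Grynkiewicz equation: [Ca2+] = Kd * beta * (R - Rmin) / (Rmax - R)."""
    return kd * beta * (r - r_min) / (r_max - r)

# Illustrative calibration constants (assumed, not from the paper)
kd = 224.0               # nM, dye dissociation constant
r_min, r_max = 0.5, 5.0  # ratios at zero and saturating Ca
beta = 2.0               # Sf2/Sb2, free/bound fluorescence at the second wavelength
ca = ca_from_ratio(1.5, r_min, r_max, beta, kd)  # 128.0 nM for a measured ratio of 1.5
```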

    Estimating uncertainty in ecosystem budget calculations

    © The Authors, 2010. This article is distributed under the terms of the Creative Commons Attribution-Noncommercial License. The definitive version was published in Ecosystems 13 (2010): 239-248, doi:10.1007/s10021-010-9315-8. Ecosystem nutrient budgets often report values for pools and fluxes without any indication of uncertainty, which makes it difficult to evaluate the significance of findings or make comparisons across systems. We present an example, implemented in Excel, of a Monte Carlo approach to estimating error in calculating the N content of vegetation at the Hubbard Brook Experimental Forest in New Hampshire. The total N content of trees was estimated at 847 kg ha−1 with an uncertainty of 8%, expressed as the standard deviation divided by the mean (the coefficient of variation). The individual sources of uncertainty were as follows: uncertainty in allometric equations (5%), uncertainty in tissue N concentrations (3%), uncertainty due to plot variability (6%, based on a sample of 15 plots of 0.05 ha), and uncertainty due to tree diameter measurement error (0.02%). In addition to allowing estimation of uncertainty in budget estimates, this approach can be used to assess which measurements should be improved to reduce uncertainty in the calculated values. This exercise was possible because the uncertainty in the parameters and equations that we used was made available by previous researchers. It is important to provide the error statistics with regression results if they are to be used in later calculations; archiving the data makes resampling analyses possible for future researchers. When conducted using a Monte Carlo framework, the analysis of uncertainty in complex calculations does not have to be difficult and should be standard practice when constructing ecosystem budgets.
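
    The Monte Carlo approach described here is simple to reproduce outside Excel: sample each uncertain input from its distribution, recompute the budget, and summarise the spread. The sketch below keeps only two of the paper's four error sources (allometry and tissue N concentration) and uses illustrative input values chosen to echo, not reproduce, the reported ~847 kg ha−1 total:

```python
import random

random.seed(1)
N = 10_000
samples = []
for _ in range(N):
    # Illustrative inputs: biomass with ~5% allometric uncertainty,
    # tissue N concentration with ~3% uncertainty (assumed values)
    biomass = random.gauss(180_000.0, 9_000.0)   # kg/ha
    n_conc = random.gauss(0.0047, 0.00014)       # kg N per kg biomass
    samples.append(biomass * n_conc)             # kg N/ha for this draw

mean = sum(samples) / N
sd = (sum((s - mean) ** 2 for s in samples) / (N - 1)) ** 0.5
cv = sd / mean   # coefficient of variation, ~6% under these two error sources
```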

    Naturalistic Allocation: Working Memory and Cued-Attention Effects on Resource Allocation

    The allocation of resources is a ubiquitous decision making task. In the workplace, resource allocation, in the context of multiple task and/or work demands, is significantly related to task performance as the commitment of more resources generally results in better performance on a given task. I apply both resource and naturalistic decision making theories to better understand resource allocation behavior and related performance. Resource theories suggest that individuals have limited cognitive capacity: limited capacity may limit performance in dynamic situations such as situations that involve the allocation of attentional resources. Additionally, the naturalistic decision making framework highlights the role of context cues as key aids to effective decision making. Therefore, I proposed an interactive relationship between working memory, a cognitive resource, and allocation cue, a contextual variable. Specifically, I conducted an experimental study in which I manipulated allocation cue type and examined the individual difference of working memory on allocation behavior and task performance. I hypothesized a moderated-mediated effect including cue type, working memory, and proportion of time on task on task performance (i.e., accuracy and efficiency). The effect of cue type on both the proportion of time spent on task and task performance was expected to be contingent on working memory capacity. As working memory increased, both time on task and performance were expected to increase for participants exposed to either goal- or both task- and goal-related cues, as opposed to task cues. Conversely, as working memory decreased both time on task and performance were expected to increase for participants exposed to task cues in comparison to those exposed to either goal- or both task- and goal-related cues. Additionally, as proportion of time on task increased, performance was expected to improve. 
    Results from this study did not support the hypothesized moderated-mediated effect. However, results indicated an effect of task cue on task efficiency: individuals cued to allocate their attention based on stimulus-related features (i.e., task cue) completed the task more quickly. Theoretical and practical implications, as well as study limitations, are discussed in detail.

    Application of a new high performance liquid chromatography method to the pharmacokinetics of dibudipine in rats

    Purpose: To develop an HPLC method for the assay of dibudipine in biological fluids and to study its pharmacokinetics in the rat. Methods: For the HPLC assay, 2 μl (20 μg/ml) of mebudipine as internal standard, 0.2 ml of 1 M NaOH and 2 ml of ethyl acetate were added to 0.2 ml of rat plasma. The mixture was shaken for 10 min and centrifuged, and the supernatant was dried under nitrogen. The dissolved residue was injected onto a C18 analytical column. The mobile phase flowed at 1 ml/min with a composition of methanol-water-acetonitrile (70:25:5). The eluent was monitored at 238 nm. For the pharmacokinetic study, plasma samples were collected periodically after intravenous (0.5 mg/kg) or oral (10 mg/kg) administration of dibudipine to rats (n = 4/group). In addition, separate groups of animals were administered 0.5 mg/kg doses of the drug for serial collection of brain, heart, kidney and liver (n = 4/time). The concentration of the drug in tissue or plasma was assayed using the above HPLC method. Results: Calibration curves were linear over a concentration range of 10-1000 ng/ml and the CV was less than 10%. Dibudipine showed a bi-exponential decline after IV injection in the rats, with a t1/2β of 2.5 ± 0.5 (mean ± SE) hr. Oral bioavailability was low. Distribution of dibudipine to the examined tissues was rapid and, with the exception of the brain, the concentrations of the drug in all tissues were higher than the plasma levels. Conclusions: The HPLC method was simple and convenient, and it could be applied to investigations of the pharmacokinetics of dibudipine in the rat.
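
    The bi-exponential decline and the reported t1/2β are linked by the standard relation t1/2 = ln 2 / β; a short sketch follows. Only the 2.5 hr half-life comes from the abstract, and the distribution-phase parameters A, α and B are illustrative assumptions:

```python
import math

def biexponential(t, a, alpha, b, beta):
    """Two-compartment plasma profile: C(t) = A*exp(-alpha*t) + B*exp(-beta*t)."""
    return a * math.exp(-alpha * t) + b * math.exp(-beta * t)

# Terminal elimination rate constant from the reported half-life
t_half = 2.5                    # hr, as reported in the abstract
beta = math.log(2) / t_half     # ~0.277 per hr

# Illustrative distribution-phase parameters (assumed, not from the paper)
a, alpha, b = 400.0, 2.0, 100.0  # ng/ml, per hr, ng/ml
c0 = biexponential(0.0, a, alpha, b, beta)   # initial concentration A + B
```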

    Regression Analysis to Determine the Reserve Strength Ratio of Fixed Offshore Structures

    This study estimates the Reserve Strength Ratio (RSR) of fixed offshore structures using regression analysis, which is capable of replacing the expensive and time-consuming conventional methods currently adopted in the oil and gas industry. Offshore structures from the three regions of Malaysian waters, namely Peninsular Malaysia, Sarawak and Sabah, were used to perform the analysis for different jacket configurations and sea states. The pushover analysis in this study was performed with the SACS program version 5.3, and the regression analysis was done using Microsoft Excel. The findings revealed that regression analysis is able to produce regression coefficients for non-linear regression equations fitted to the dataset, which can be used to estimate the platform RSR.
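
    One common way regression coefficients yield a non-linear predictive equation, as described here, is to fit a power-law model by linear regression on log-transformed data. The sketch below is illustrative only: the predictor (design wave height), the data points and the model form are assumptions, not the study's Malaysian platform data:

```python
import math

# Hypothetical pushover results: design wave height (m) vs. computed RSR
wave_height = [6.0, 8.0, 10.0, 12.0, 14.0]
rsr = [3.10, 2.45, 2.05, 1.78, 1.58]

# Fit the non-linear model RSR = a * H**b via least squares on logs
xs = [math.log(h) for h in wave_height]
ys = [math.log(r) for r in rsr]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
intercept = my - slope * mx
a, b = math.exp(intercept), slope   # regression coefficients of the power law

# Estimate RSR for an intermediate 9 m design wave from the fitted equation
rsr_9m = a * 9.0 ** b
```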