
    Extrapolating Monte Carlo Simulations to Infinite Volume: Finite-Size Scaling at ξ/L ≫ 1

    We present a simple and powerful method for extrapolating finite-volume Monte Carlo data to infinite volume, based on finite-size-scaling theory. We discuss carefully its systematic and statistical errors, and we illustrate it using three examples: the two-dimensional three-state Potts antiferromagnet on the square lattice, and the two-dimensional O(3) and O(∞) σ-models. In favorable cases it is possible to obtain reliable extrapolations (errors of a few percent) even when the correlation length is 1000 times larger than the lattice size.
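The iterative extrapolation idea can be sketched in a few lines. The toy finite-size law, the lattice sizes, and the polynomial fit below are all illustrative assumptions, not the paper's actual data or scaling functions; the point is only the mechanics of fitting the step-ratio scaling function F(x) = ξ(2L)/ξ(L) as a function of x = ξ(L)/L and applying it repeatedly:

```python
import numpy as np

# Toy "infinite-volume" correlation length and a toy finite-size law
# xi(L) = xi_inf * L / (L + xi_inf), for which the step ratio
# xi(2L)/xi(L) happens to be an exact function of x = xi(L)/L.
# All numbers below are illustrative, not from the paper.
xi_inf = 1000.0

def xi_finite(L):
    return xi_inf * L / (L + xi_inf)

# "Measure" the scaling function F(x) = xi(2L)/xi(L) on pairs (L, 2L)
L_train = 16.0 * 2.0 ** np.linspace(0.0, 13.0, 60)
x_train = xi_finite(L_train) / L_train
r_train = xi_finite(2.0 * L_train) / xi_finite(L_train)
coef = np.polyfit(x_train, r_train, 8)      # empirical fit of F

# Iteratively extrapolate starting from a single small lattice
L, xi = 32.0, xi_finite(32.0)
while xi / L > 0.005:
    xi *= np.polyval(coef, xi / L)          # xi(2L) = F(xi/L) * xi(L)
    L *= 2.0
print(f"extrapolated xi_inf ~ {xi:.1f} (true {xi_inf})")
```

Because the ratio depends only on x, data taken on modest lattices determine F everywhere it is needed, and the extrapolation compounds one fitted function instead of requiring simulations at ever-larger L.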

    Synergistic effect of simvastatin and ezetimibe on lipid and pro-inflammatory profiles in pre-diabetic subjects

    Background: Ezetimibe specifically blocks the absorption of dietary and biliary cholesterol and plant sterols. Synergism of ezetimibe-statin therapy on LDL-cholesterol has been demonstrated, but data concerning the pleiotropic effects of this combination are controversial. Objective: This open-label trial evaluated whether the combination of simvastatin and ezetimibe also results in a synergistic effect that reduces the pro-inflammatory status of pre-diabetic subjects. Methods: Fifty pre-diabetic subjects were randomly assigned to one of two groups, one receiving ezetimibe (10 mg/day), the other simvastatin (20 mg/day), for 12 weeks, followed by an additional 12-week period of combined therapy. Blood samples were collected at baseline, 12 and 24 weeks. Results: Total cholesterol, LDL-cholesterol and apolipoprotein B levels decreased in all the periods analyzed (p < 0.01), but triglycerides declined significantly only after combined therapy. Both drugs induced reductions in C-reactive protein, reaching statistical significance after combining ezetimibe with the simvastatin therapy (baseline 0.59 ± 0.14 mg/dL, simvastatin monotherapy 0.48 ± 0.12 mg/dL, combined therapy 0.35 ± 0.12 mg/dL; p < 0.023). This reduction was independent of the LDL-cholesterol change. However, mean levels of TNF-α, interleukin-6 and leukocyte count did not vary during the whole study. Conclusion: The expected synergistic lowering effects of a simvastatin and ezetimibe combination on LDL-cholesterol, apolipoprotein B and triglyceride levels were confirmed in subjects with early disturbances of glucose metabolism. We suggest an additive effect of this combination on inflammatory status as well, based on the reduction of C-reactive protein. Attenuation of pro-inflammatory conditions may be relevant in reducing cardiometabolic risk. Trial registration: Effect of simvastatin and ezetimibe on lipid and inflammation/NCT01103648.

    First and second fundamental solutions of the time-fractional telegraph equation with Laplace or Dirac operators

    In this work, we obtain the first and second fundamental solutions (FS) of the multidimensional time-fractional telegraph equation with Laplace or Dirac operators, where the two time-fractional derivatives of orders α ∈ ]0, 1] and β ∈ ]1, 2] are in the Caputo sense. We obtain representations of the FS in terms of the Hankel transform, double Mellin-Barnes integrals, and H-functions of two variables. As an application, the FS are used to solve Cauchy problems of Laplace and Dirac type.
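For orientation, the equation class in question can be sketched as follows; this is the standard two-term form of the time-fractional telegraph equation, with the constants a and c² as generic placeholders (the paper's exact normalization may differ):

```latex
{}^{C}D_t^{\beta} u(x,t) \;+\; a\,{}^{C}D_t^{\alpha} u(x,t)
\;=\; c^{2}\,\Delta u(x,t),
\qquad \alpha \in\,]0,1],\quad \beta \in\,]1,2],
```

where \({}^{C}D_t^{\gamma}\) denotes the Caputo fractional derivative of order γ. In the Dirac case, the Laplacian Δ is replaced by a Dirac operator D that factorizes it (D² = −Δ in the usual Clifford-analysis convention), yielding a first-order spatial operator.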

    Improved identification of abdominal aortic aneurysm using the Kernelized Expectation Maximization algorithm

    Monitoring of abdominal aortic aneurysm (AAA) currently assumes that the risk of rupture correlates with the aneurysm diameter. Aneurysm growth, however, has been demonstrated to be unpredictable. Using PET to measure uptake of [18F]-NaF in calcified lesions of the abdominal aorta has been shown to be useful for identifying AAA and for predicting its growth. The low spatial resolution of PET, however, can affect the accuracy of the diagnosis. Advanced edge-preserving reconstruction algorithms can overcome this issue. The kernel method has been demonstrated to provide noise suppression while retaining emission and edge information. Nevertheless, these findings were obtained using simulations, phantoms and a limited amount of patient data. In this study, the authors aim to investigate the usefulness of the anatomically guided kernelized expectation maximization (KEM) and hybrid KEM (HKEM) methods and to assess the statistical significance of the related improvements. Sixty-one datasets of patients with AAA and 11 from control patients were reconstructed with ordered subsets expectation maximization (OSEM), HKEM and KEM, and the analysis was carried out using the target-to-blood-pool ratio and a series of statistical tests. The results show that all algorithms have similar diagnostic power, but HKEM and KEM can significantly recover the uptake of lesions and improve the accuracy of the diagnosis by up to 22% compared to OSEM. The same improvements are likely to be obtained in clinical applications based on the quantification of small lesions, such as cancer.
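The kernel method referred to here can be sketched as kernelized MLEM: the image is parameterized as x = Kα, where the kernel matrix K is built from anatomical features, and the standard MLEM update is applied to the coefficients α. The toy system below (sizes, kernel width, activity values) is entirely illustrative and not a model of the PET data used in the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny toy system: y = Poisson(A @ x_true), with x = K @ alpha and
# K built from a 1-D "anatomical" feature image via a Gaussian kernel.
n_pix, n_det = 30, 60
A = rng.uniform(0.0, 1.0, (n_det, n_pix))          # system matrix
anat = np.linspace(0.0, 1.0, n_pix)                # anatomical feature
d = np.abs(anat[:, None] - anat[None, :])
K = np.exp(-d**2 / (2 * 0.05**2))
K /= K.sum(axis=1, keepdims=True)                  # row-normalized kernel

x_true = np.where(anat > 0.5, 4.0, 1.0)            # piecewise activity
y = rng.poisson(A @ x_true).astype(float)

# Kernelized MLEM: ordinary MLEM on the coefficients alpha, x = K @ alpha
alpha = np.ones(n_pix)
sens = K.T @ (A.T @ np.ones(n_det))                # sensitivity term
for _ in range(200):
    proj = A @ (K @ alpha)
    alpha *= (K.T @ (A.T @ (y / np.maximum(proj, 1e-12)))) / sens
x_hat = K @ alpha                                  # reconstructed image
```

The anatomical kernel regularizes the update toward images that share edges with the anatomical prior, which is what lets these methods recover small-lesion uptake while suppressing noise.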

    Polyvalent horse F(ab′)2 snake antivenom: Development of a process to produce polyvalent horse F(ab′)2 antibodies against African snake venoms

    A method to obtain polyvalent anti-Bitis and anti-Naja antibodies was developed by immunizing horses with B. arietans, B. nasicornis, B. rhinoceros, N. melanoleuca and N. mossambica crude venoms. Antibody production was followed by ELISA during the immunization procedure. Once the desired anti-venom antibody titers were attained, horses were bled and the immunoglobulins were separated from the sera by (NH4)2SO4 precipitation, cleaved with pepsin and filtered through a 30 kDa ultrafiltration membrane. F(ab′)2 fragments were further purified by Q-Fast Flow chromatography, concentrated by molecular ultrafiltration and sterilized by filtration through 0.22 μm membranes. The resulting F(ab′)2 preparations were rich in intact L chains and in fragments of H IgG(T) chains, as demonstrated by electrophoresis and Western blot, and exhibited high antibody titers, as assayed by ELISA. In addition, the preparations possess a significant capacity to neutralize the lethality of the venoms, as estimated by ED50 determination in a mouse assay, and are free of toxic substances, pyrogens and bacterial or fungal contamination.

    Thermographic imaging in sports and exercise medicine: A Delphi study and consensus statement on the measurement of human skin temperature

    This is an accepted manuscript of an article published by Elsevier in Journal of Thermal Biology on 18/07/2017, available online at https://doi.org/10.1016/j.jtherbio.2017.07.006. The accepted version of the publication may differ from the final published version. © 2017 Elsevier Ltd. The importance of using infrared thermography (IRT) to assess skin temperature (tsk) is increasing in clinical settings. Recently, its use has been increasing in sports and exercise medicine; however, no consensus guideline exists to address the methods for collecting data in such situations. The aim of this study was to develop a checklist for the collection of tsk using IRT in sports and exercise medicine. We carried out a Delphi study to set a checklist based on consensus agreement from leading experts in the field. Panelists (n = 24) representing the areas of sport science (n = 8; 33%), physiology (n = 7; 29%), physiotherapy (n = 3; 13%) and medicine (n = 6; 25%), from 13 different countries, completed the Delphi process. An initial list of 16 points was proposed, which was rated and commented on by panelists in three rounds of anonymous surveys following a standard Delphi procedure. The panel reached consensus on 15 items, which encompassed the participants' demographic information, camera/room or environment setup, and recording/analysis of tsk using IRT. The results of the Delphi process produced the checklist entitled "Thermographic Imaging in Sports and Exercise Medicine (TISEM)", which is a proposal to standardize the collection and analysis of tsk data using IRT. It is intended that the TISEM can also be applied to evaluate bias in thermographic studies and to guide practitioners in the use of this technique.

    Filter-based stochastic algorithm for global optimization

    We propose the general Filter-based Stochastic Algorithm (FbSA) for the global optimization of nonconvex and nonsmooth constrained problems. Under certain conditions on the probability distributions that generate the sample points, almost sure convergence is proved. In order to optimize problems with computationally expensive black-box objective functions, we develop the FbSA-RBF algorithm, based on the general FbSA and assisted by radial basis function (RBF) surrogate models that approximate the objective function. At each iteration, the resulting algorithm constructs/updates a surrogate model of the objective function and generates trial points using a dynamic coordinate search strategy similar to the one used in the Dynamically Dimensioned Search method. To identify the most promising trial point, a non-dominance concept based on the values of the surrogate model and the constraint violation at the trial points is used. Theoretical results concerning sufficient conditions for the almost sure convergence of the algorithm are presented. Preliminary numerical experiments show that the FbSA-RBF is competitive when compared with other methods known in the literature. The authors are grateful to the anonymous referees for their fruitful comments and suggestions. The first and second authors were partially supported by Brazilian funds through CAPES and CNPq, grants PDSE 99999.009400/2014-01 and 309303/2017-6. The research of the third and fourth authors was partially financed by Portuguese funds through FCT (Fundação para a Ciência e a Tecnologia) within projects UIDB/00013/2020 and UIDP/00013/2020 of CMAT-UM, and UIDB/00319/2020.
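A minimal sketch of the surrogate-assisted step: fit a cubic RBF interpolant with a linear polynomial tail to the evaluated points, generate DDS-style trial points by perturbing a random subset of coordinates of the incumbent, and rank the trials by the surrogate. The objective, bounds, perturbation scale and subset probability below are illustrative placeholders, not the paper's FbSA-RBF specification:

```python
import numpy as np

rng = np.random.default_rng(1)

# Cubic-RBF surrogate phi(r) = r^3 with a linear polynomial tail,
# a common choice in surrogate-assisted coordinate search.
def rbf_fit(X, f):
    n, d = X.shape
    r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    Phi = r**3
    P = np.hstack([np.ones((n, 1)), X])            # linear tail
    M = np.block([[Phi, P], [P.T, np.zeros((d + 1, d + 1))]])
    rhs = np.concatenate([f, np.zeros(d + 1)])
    sol = np.linalg.solve(M, rhs)
    lam, c = sol[:n], sol[n:]
    def s(x):                                      # surrogate prediction
        rr = np.linalg.norm(X - x, axis=1)
        return rr**3 @ lam + c[0] + c[1:] @ x
    return s

def objective(x):                                  # "expensive" toy black box
    return np.sum((x - 0.3)**2)

X = rng.uniform(0.0, 1.0, (15, 2))                 # evaluated sample points
f = np.array([objective(x) for x in X])
s = rbf_fit(X, f)

# DDS-style trial generation: perturb a random subset of coordinates
best = X[np.argmin(f)]
trials = []
for _ in range(50):
    mask = rng.random(2) < 0.5
    if not mask.any():
        mask[rng.integers(2)] = True               # always move something
    t = np.clip(best + mask * rng.normal(0.0, 0.2, 2), 0.0, 1.0)
    trials.append(t)
cand = min(trials, key=s)                          # surrogate-ranked best trial
```

Only `cand` would then be evaluated with the true objective, which is how the surrogate keeps the number of expensive evaluations small.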

    Combining filter method and dynamically dimensioned search for constrained global optimization

    In this work we present an algorithm that combines the filter technique and the dynamically dimensioned search (DDS) for solving nonlinear and nonconvex constrained global optimization problems. The DDS is a stochastic global algorithm for solving bound constrained problems that, in each iteration, randomly generates a trial point by perturbing some coordinates of the current best point. The filter technique controls the progress related to optimality and feasibility by defining a forbidden region of points refused by the algorithm. This region can be given by the flat or the slanting filter rule. The proposed algorithm does not compute or approximate any derivatives of the objective and constraint functions. Preliminary experiments show that the proposed algorithm gives competitive results when compared with other methods. The first author acknowledges a scholarship supported by the International Cooperation Program CAPES/COFECUB at the University of Minho. The second and third authors thank the support given by FCT (Fundação para a Ciência e a Tecnologia, Portugal) within the projects UID/MAT/00013/2013 and UID/CEC/00319/2013. The fourth author was partially supported by CNPq-Brazil grants 308957/2014-8 and 401288/2014-5.
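The filter idea can be sketched as follows; the aggregation of constraint violations, the margin `GAMMA`, and the slanting acceptance rule below are one common variant, given as an illustration rather than the paper's exact rule:

```python
# Slanting-filter sketch: a trial point with objective f and constraint
# violation h is refused if some filter entry dominates it; GAMMA is a
# small margin (an illustrative choice, not the paper's value).
GAMMA = 1e-4

def h_violation(g_values):
    """Aggregate constraint violation for constraints g_i(x) <= 0."""
    return sum(max(g, 0.0) for g in g_values)

def acceptable(f, h, filt):
    # Slanting rule: versus every entry (fi, hi), the trial must either
    # improve the combination f + GAMMA*h or reduce violation enough.
    return all(f + GAMMA * h < fi or h < (1.0 - GAMMA) * hi
               for fi, hi in filt)

def add_to_filter(f, h, filt):
    # Drop entries dominated by the new pair, then insert it.
    filt = [(fi, hi) for fi, hi in filt if not (f <= fi and h <= hi)]
    filt.append((f, h))
    return filt

filt = [(5.0, 0.0), (3.0, 0.5), (1.0, 2.0)]
print(acceptable(0.5, 1.0, filt))   # True: beats every entry's objective
```

Because acceptance only compares objective and violation values, the scheme needs no derivatives of the objective or constraints, which is what makes it compatible with the stochastic DDS search.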

    The BINGO project: VII. Cosmological forecasts from 21 cm intensity mapping

    Context. The 21 cm line of neutral hydrogen (HI) opens a new avenue in our exploration of the structure and evolution of the Universe. It provides complementary data to current large-scale structure (LSS) observations with different systematics, and thus it will be used to improve our understanding of the Λ cold dark matter (ΛCDM) model. This will ultimately constrain our cosmological models, address unresolved tensions, and test our cosmological paradigm. Among the several radio cosmological surveys designed to measure this line, BINGO is a single-dish telescope mainly designed to detect baryon acoustic oscillations (BAOs) at low redshifts (0.127 < z < 0.449). Aims. Our goal is to assess the fiducial BINGO setup and its capability to constrain the cosmological parameters, and to analyze the effect of different instrument configurations. Methods. We used the 21 cm angular power spectra to extract cosmological information about the HI signal and the Fisher matrix formalism to study BINGO's projected constraining power. Results. We used the Phase 1 fiducial configuration of the BINGO telescope to perform our cosmological forecasts. In addition, we investigated the impact of several instrumental setups, taking into account some instrumental systematics, and of different cosmological models. Combining BINGO with Planck temperature and polarization data, the projected constraint improves from a 13% and 25% precision measurement at the 68% confidence level with Planck alone to 1% and 3% for the Hubble constant and the dark energy (DE) equation of state (EoS), respectively, within the wCDM model. Assuming a Chevallier-Polarski-Linder (CPL) parameterization, the EoS parameters have standard deviations of σ(w0) = 0.30 and σ(wa) = 1.2, improvements on the order of 30% with respect to Planck alone.
We also compared BINGO's fiducial forecast with future SKA measurements and found that, although it will not provide competitive constraints on the DE EoS, significant information about the HI distribution can be acquired. We can access information about the HI density and bias, obtaining 8.5% and 6% precision, respectively, assuming they vary with redshift in three independent bins. BINGO can also help constrain alternative models, such as interacting dark energy and modified gravity models, improving the cosmological constraints significantly. Conclusions. The fiducial BINGO configuration will be able to extract significant cosmological information from the HI distribution and provide constraints competitive with current and future cosmological surveys. It will also help in understanding the HI physics and systematic effects.
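The Fisher-matrix machinery behind such forecasts can be sketched on a toy angular power spectrum. The model C_ℓ(θ), sky fraction, noise level and parameter values below are illustrative placeholders, not BINGO's instrument model:

```python
import numpy as np

# Toy Fisher forecast for an angular power spectrum C_ell(theta) with
# two parameters (amplitude, tilt). All numbers are illustrative.
ells = np.arange(10, 300)
f_sky = 0.1          # observed sky fraction (placeholder)
noise = 1e-3         # flat noise spectrum N_ell (placeholder)

def c_ell(theta):
    amp, tilt = theta
    return amp * (ells / 100.0) ** tilt

theta0 = np.array([1.0, -1.5])   # fiducial parameters (placeholder)
eps = 1e-5

def dC(i):
    # Central finite difference of C_ell with respect to parameter i
    tp, tm = theta0.copy(), theta0.copy()
    tp[i] += eps; tm[i] -= eps
    return (c_ell(tp) - c_ell(tm)) / (2 * eps)

# Gaussian per-ell covariance: 2 (C_ell + N_ell)^2 / ((2l+1) f_sky)
cov = (c_ell(theta0) + noise) ** 2 * 2.0 / ((2 * ells + 1) * f_sky)
F = np.array([[np.sum(dC(i) * dC(j) / cov) for j in range(2)]
              for i in range(2)])
sigma = np.sqrt(np.diag(np.linalg.inv(F)))   # 1-sigma forecast errors
```

Combining probes (e.g., BINGO with Planck) amounts to adding their Fisher matrices before inverting, which is how the joint percent-level constraints quoted above are obtained.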

    A methodology for parameter estimation in seaweed productivity modelling

    This paper presents a combined approach for parameter estimation in models of primary production. The focus is on gross primary production and nutrient assimilation by seaweeds. A database of productivity determinations, biomass and mortality measurements and nutrient uptake rates obtained over one year for Gelidium sesquipedale in the Atlantic Ocean off Portugal has been used. Annual productivity was estimated by harvesting methods, and empirical relationships using mortality/wave energy and respiration rates have been derived to correct for losses and to convert the estimates to gross production. In situ determinations of productivity have been combined with data on the light climate (radiation periods, intensity, mean turbidity) to give daily and annual productivity estimates. The theoretical nutrient uptake calculated using a 'Redfield ratio' approach and determinations of in situ N and P consumption by the algae during incubation periods have also been compared. The results of the biomass difference and incubation approaches are discussed in order to assess the utility of coefficients determined in situ for parameter estimation in seaweed production models.
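The 'Redfield ratio' step can be sketched directly: given an estimate of gross carbon fixation, the implied nitrogen and phosphorus demand follows from the molar ratio C:N:P = 106:16:1. The production figure below is an illustrative placeholder, not a value from the study:

```python
# Redfield-ratio estimate of nutrient demand from gross primary
# production (molar C:N:P = 106:16:1).
C_N_P = (106, 16, 1)

def nutrient_demand(gpp_mol_c):
    """N and P uptake (mol) implied by gpp_mol_c moles of fixed carbon."""
    n = gpp_mol_c * C_N_P[1] / C_N_P[0]
    p = gpp_mol_c * C_N_P[2] / C_N_P[0]
    return n, p

n_up, p_up = nutrient_demand(10.6)   # e.g. 10.6 mol C fixed
```

Comparing such theoretical demands with in situ incubation measurements is precisely the cross-check the paper uses to judge whether field-derived coefficients are usable in production models.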