    Antitumor effect of salidroside on mice bearing HepA hepatocellular carcinoma

    Salidroside, a phenylpropanoid glycoside extracted from Rhodiola rosea L., has antiproliferative effects on tumour cells in mice, but its antitumor mechanism remains largely unknown. In this study, four groups of mice bearing HepA hepatocellular carcinoma cells were treated for 14 days with vehicle alone, cyclophosphamide (25 mg/kg, i.p.), or salidroside at 100 or 200 mg/kg (p.o.). The morphology of tumour specimens was analysed by transmission electron microscopy, apoptotic cells in sections of mouse tumour tissue were detected with an in situ apoptosis kit, and the expression of Bcl-2, Bax and caspase 3 mRNA was examined by RT-PCR. The results showed that tumour weights in the 100 and 200 mg/kg/day salidroside groups were reduced significantly compared with the vehicle group (by 45.34% and 52.48%, respectively). Salidroside also increased the apoptotic cell index; in the 200 mg/kg group it was about four times that of the control group. In addition, salidroside treatment decreased Bcl-2 mRNA expression and increased Bax and caspase 3 mRNA expression. These results indicate that salidroside may exert its antitumor effect by inducing tumour cell apoptosis in mice through the mitochondrial-dependent pathway and activation of caspase 3.
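    As a rough illustration of the tumour-inhibition arithmetic behind the reported 45.34% and 52.48% reductions, the sketch below computes an inhibition rate from mean tumour weights; the control weight used here is a hypothetical placeholder, not a value from the study.

```python
# Minimal sketch of the tumour-inhibition arithmetic implied by the abstract.
# The mean tumour weights below are hypothetical placeholders; only the
# reported inhibition rates (45.34% and 52.48%) come from the abstract.

def inhibition_rate(control_weight: float, treated_weight: float) -> float:
    """Tumour inhibition rate (%) = (control - treated) / control * 100."""
    return (control_weight - treated_weight) / control_weight * 100.0

control = 2.0                           # hypothetical mean tumour weight (g), vehicle group
treated_100 = control * (1 - 0.4534)    # back-calculated from the reported 45.34%
treated_200 = control * (1 - 0.5248)    # back-calculated from the reported 52.48%

print(f"100 mg/kg: {inhibition_rate(control, treated_100):.2f}% inhibition")
print(f"200 mg/kg: {inhibition_rate(control, treated_200):.2f}% inhibition")
```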

    Simulation of random field samples directly from sparse measurements using Bayesian compressive sampling and Karhunen-Loève expansion

    Geotechnical materials (e.g., soils and rocks) are natural materials that are affected by many spatially varying factors during the geological process, such as the properties of their parent materials, weathering and erosion processes, transportation agents, and sedimentation conditions. Geotechnical data therefore exhibit spatial variability and, to some extent, are unique to every site. In recent years, random field theory has been increasingly used to model the spatial variability of geotechnical data. In the conventional frequentist approach, measurement data at a specific site are used to estimate random field parameters, such as the mean and standard deviation, as well as the parameters (e.g., correlation length) of a pre-determined parametric form of correlation function (e.g., an exponential correlation function). Estimating these random field parameters, particularly the correlation length, and selecting a suitable parametric form of correlation function generally require extensive measurements from a specific site, which are usually not available in geotechnical engineering practice. This paper presents a random field generator that is able to simulate random field samples directly from sparse measurements, bypassing the difficulty of estimating the correlation function and its parameters. The proposed generator is based on Bayesian compressive sensing/sampling and Karhunen–Loève expansion. The proposed method is illustrated and validated using simulated geotechnical data, and it is also compared with conventional random field models. The results show that the proposed generator can rationally simulate the geotechnical spatial variability at a specific site from sparse measurements. The work described in this paper was supported by grants from the Research Grants Council of the Hong Kong Special Administrative Region, China (Project No. 9042331 (CityU 11225216) and Project No. 9042516 (CityU 11213117)). The financial support is gratefully acknowledged.
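    The sketch below illustrates, under simplifying assumptions, the two-step idea described above: a Bayesian regression on a cosine basis stands in for Bayesian compressive sampling to reconstruct a dense profile from sparse measurements, and a Karhunen–Loève expansion of the resulting covariance is used to draw new random field samples. The grid size, basis, noise level and prior are assumptions, not the authors' formulation.

```python
# A minimal sketch (not the authors' code) of the two-step idea: reconstruct a
# dense signal from sparse measurements with a Bayesian regression on a cosine
# basis (a simplified stand-in for Bayesian compressive sampling), then use a
# Karhunen-Loeve expansion of the resulting covariance to draw new random field
# samples. Grid size, noise level and basis count are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, m, n_basis = 200, 20, 40                      # dense grid, sparse measurements, basis size
z = np.linspace(0.0, 1.0, n)                     # normalized depth/coordinate
idx = np.sort(rng.choice(n, m, replace=False))   # sparse measurement locations

# Hypothetical "true" spatially varying property and noisy sparse measurements.
truth = 10.0 + 2.0 * np.sin(4 * np.pi * z) + 0.8 * np.sin(9 * np.pi * z)
y = truth[idx] + rng.normal(0.0, 0.2, m)

# Cosine (DCT-like) basis evaluated on the dense grid.
B = np.cos(np.pi * np.outer(z, np.arange(n_basis)))

# Bayesian linear regression with a zero-mean Gaussian prior on the weights
# (a crude stand-in for the hierarchical sparsity prior used in BCS).
sigma2, tau2 = 0.2 ** 2, 4.0
A = B[idx, :]
S_inv = A.T @ A / sigma2 + np.eye(n_basis) / tau2
S = np.linalg.inv(S_inv)
mu = S @ A.T @ y / sigma2

# Posterior samples of the weights give posterior samples of the dense field.
w_samples = rng.multivariate_normal(mu, S, size=500)
fields = w_samples @ B.T                              # (500, n) field samples

# Karhunen-Loeve expansion of the sample covariance, then new field samples.
mean_field = fields.mean(axis=0)
cov = np.cov(fields, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)
eigval = np.clip(eigval, 0.0, None)
k = 20                                                # truncated KL terms (assumption)
phi = eigvec[:, -k:] * np.sqrt(eigval[-k:])
new_samples = mean_field + rng.normal(size=(5, k)) @ phi.T
print(new_samples.shape)                              # (5, 200) simulated random fields
```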

    Determination of efficient sampling locations in geotechnical site characterization using information entropy and Bayesian compressive sampling

    Site characterization is indispensable in geotechnical practice, and measurements of soil properties are obtained through in-situ tests, laboratory tests, or other methods. However, due to time or budget limits, technical or access constraints, etc., measurements are usually taken at only a limited number of locations. This leads to the question of how to select efficient measurement/sampling locations so that as much information as possible on the spatial variability of soil properties can be obtained from a given number of measurements. In addition, site characterization is a multi-stage process, and additional measurements might be required at a later stage. In this case, how should the additional sampling locations be selected so that the pre-existing measurements from the preliminary stages are best used and as much further information as possible on soil properties is obtained? This paper addresses these two problems using information entropy and Bayesian compressive sampling (BCS). Real cone penetration test data along the vertical and horizontal directions are used to illustrate and validate the proposed methods. The results show that the proposed methods are effective and robust in selecting efficient sampling locations for geotechnical site characterization.
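    A minimal sketch of the sampling-location idea is given below: the next measurement location is chosen where the predictive uncertainty, and hence the information entropy of the prediction, is largest. A simple Gaussian process posterior is used here as a stand-in for the BCS predictor; the grid, kernel and noise parameters are assumptions.

```python
# Greedy selection of additional sampling locations by maximum predictive
# uncertainty (a stand-in for the entropy-based BCS selection in the paper).
import numpy as np

rng = np.random.default_rng(1)
z = np.linspace(0.0, 30.0, 301)               # candidate depths (m), assumption
sampled = [50, 150, 250]                      # indices of existing measurements
corr_len, sigma_f, sigma_n = 2.0, 1.0, 0.1    # assumed kernel and noise parameters

def kernel(a, b):
    """Squared-exponential correlation between coordinate vectors a and b."""
    return sigma_f ** 2 * np.exp(-0.5 * ((a[:, None] - b[None, :]) / corr_len) ** 2)

def next_location(sampled_idx):
    """Greedy choice: candidate with the largest posterior predictive variance."""
    zs = z[sampled_idx]
    K = kernel(zs, zs) + sigma_n ** 2 * np.eye(len(zs))
    Ks = kernel(z, zs)
    # Posterior variance at every candidate location.
    var = sigma_f ** 2 - np.einsum("ij,ij->i", Ks @ np.linalg.inv(K), Ks)
    # Entropy of a Gaussian grows with log(variance), so argmax(var) = argmax(entropy).
    return int(np.argmax(var))

for _ in range(3):                            # plan three additional soundings
    new_idx = next_location(sampled)
    sampled.append(new_idx)
    print(f"next sampling depth: {z[new_idx]:.1f} m")
```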

    CPT-based subsurface soil classification and zonation in a 2D vertical cross-section using Bayesian compressive sampling

    A novel method is developed in this study for soil classification and zonation in a two-dimensional (2D) vertical cross-section using cone penetration tests (CPTs). CPTs are usually performed vertically, and the number of CPT soundings at a site is often limited in geotechnical engineering practice. It is therefore difficult to properly interpret CPT results along the horizontal direction or to accurately estimate the horizontal correlation length of CPT data. The method proposed in this study bypasses the difficulty of estimating the horizontal correlation length and properly identifies subsurface soil stratification (i.e., the number of soil layers is constant along the horizontal direction) and zonation (i.e., the number of soil layers varies along the horizontal direction) in a 2D vertical cross-section directly from a limited number of CPT soundings. The proposed method consists of three key elements: 2D interpolation of CPT data using 2D Bayesian compressive sampling; determination of the soil behavior type (SBT) using an SBT chart at every location in the 2D cross-section, including locations with measurements and unsampled locations; and soil layer/zone delineation using an edge detection method. Both simulated and real data examples are used to illustrate the proposed method. The results show that the method performs well even when only five CPT soundings are available.
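    The sketch below illustrates, with a toy example, the last two elements of the method: mapping an interpolated 2D cross-section to discrete soil behavior type classes and delineating layer/zone boundaries with a simple gradient-based edge detector. The interpolated field, SBT thresholds and grid are hypothetical placeholders, not the paper's SBT chart or data.

```python
# Toy illustration of SBT classification and edge-based zone delineation on an
# already-interpolated 2D cross-section (the BCS interpolation step is skipped).
import numpy as np

rng = np.random.default_rng(2)
nz, nx = 60, 40                                    # depth x horizontal grid (assumption)
depth = np.linspace(0.0, 30.0, nz)[:, None]

# Hypothetical interpolated cone resistance field with two layers and a lens.
qc = np.where(depth < 12.0, 2.0, 8.0) + 0.2 * rng.normal(size=(nz, nx))
qc[20:30, 10:20] += 5.0                            # an embedded stiffer zone

# Step 2: map the interpolated values to discrete SBT classes (toy thresholds).
sbt = np.digitize(qc, bins=[4.0, 10.0])            # 0: soft, 1: medium, 2: stiff

# Step 3: edge detection -- a boundary exists where the class index changes.
gz, gx = np.gradient(sbt.astype(float))
edges = (np.abs(gz) + np.abs(gx)) > 0.0
print(f"{edges.sum()} boundary cells detected out of {edges.size}")
```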

    Sample size determination in geotechnical site investigation considering spatial variation and correlation

    Site investigation is a fundamental element of geotechnical engineering practice, but only a small portion of geo-materials is sampled and tested during site investigation. This leads to the question of sample size determination: how many samples are needed to achieve a target level of accuracy for the results inferred from them? Sample size determination is a well-known topic in statistics and has applications in a wide variety of areas. However, conventional statistical methods, which mainly deal with independent data, have only limited application in geotechnical site investigation, because geotechnical data are not independent but spatially varying and correlated. Existing design codes around the world (e.g., Eurocode 7) only provide conceptual principles for sample size determination, and no scientific or quantitative method is available for determining sample size in site investigation while accounting for the spatial variation and correlation of geotechnical properties. This study performs an extensive parametric study and develops a statistical chart for sample size determination that accounts for spatial variation and correlation using Bayesian compressive sensing/sampling. Real cone penetration test data and real laboratory test data are used to illustrate the application of the proposed statistical chart, and the method is shown to perform well.
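    The sketch below gives a crude picture of the kind of parametric study that underlies such a chart: correlated profiles are simulated for an assumed correlation length, subsampled, and reconstructed (here by linear interpolation rather than BCS), and the smallest sample size meeting a target reconstruction accuracy is reported. All parameters are assumptions.

```python
# Crude parametric study for sample size determination under spatial correlation.
# Linear interpolation stands in for the BCS reconstruction used in the paper.
import numpy as np

rng = np.random.default_rng(3)
m, corr_len, target_error = 200, 0.2, 0.2                    # grid size, correlation length, target
z = np.linspace(0.0, 1.0, m)
cov = np.exp(-np.abs(z[:, None] - z[None, :]) / corr_len)    # exponential correlation
L = np.linalg.cholesky(cov + 1e-10 * np.eye(m))

def mean_error(n, trials=50):
    """Average normalized reconstruction error using n equally spaced samples."""
    idx = np.linspace(0, m - 1, n).astype(int)
    errs = []
    for _ in range(trials):
        field = L @ rng.normal(size=m)                       # correlated random profile
        recon = np.interp(z, z[idx], field[idx])             # reconstruct from n samples
        errs.append(np.sqrt(np.mean((recon - field) ** 2)) / np.std(field))
    return float(np.mean(errs))

required = next((n for n in range(5, 201, 5) if mean_error(n) <= target_error), None)
print("approximate number of samples needed:", required)
```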

    Determination of soil property characteristic values from standard penetration tests

    The characteristic value of a soil property is a key element in geotechnical design guidelines, particularly in probability-based design codes, and it is usually defined as a pre-specified quantile, such as the lower 5% quantile in Eurocode 7, of the probability distribution of the soil property. Such a probabilistic characterization requires a large number of data points from laboratory and/or in-situ tests, which are usually not available for most geotechnical projects, especially medium- or small-sized ones. For most projects, only a limited number of standard penetration test (SPT) N values are generally available, so it is rather challenging to determine the probability distribution and characteristic values for geotechnical design. To address this challenge, this paper presents a Bayesian equivalent sample approach that determines the probability distribution and characteristic value of the effective friction angle and Young's modulus using only a limited number of SPT N values. Two case histories are used to illustrate the approach. This paper appears in the proceedings of ICASP12, the 12th International Conference on Applications of Statistics and Probability in Civil Engineering, held in Vancouver, Canada, on July 12-15, 2015.
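    As a rough illustration of the end product of such an approach, the sketch below generates a large set of equivalent friction-angle samples from a few SPT N values through a hypothetical empirical transformation and takes the lower 5% quantile as the characteristic value. The SPT values, transformation model and its uncertainty are placeholders, not the paper's Bayesian equivalent sample formulation.

```python
# Illustrative "equivalent samples" and 5% quantile characteristic value from a
# handful of SPT N values; the transformation model below is hypothetical.
import numpy as np

rng = np.random.default_rng(4)
spt_n = np.array([12, 15, 18, 14, 20, 16])        # hypothetical measured SPT N values

# Hypothetical empirical transformation: friction angle = a + b*sqrt(N) + error.
a, b, sigma_eps = 20.0, 3.5, 2.0

# Propagate both the scatter of N and the transformation error by resampling.
equivalent_samples = []
for _ in range(10_000):
    n_draw = rng.choice(spt_n)                    # bootstrap a measured N value
    equivalent_samples.append(a + b * np.sqrt(n_draw) + rng.normal(0.0, sigma_eps))
equivalent_samples = np.array(equivalent_samples)

characteristic_value = np.percentile(equivalent_samples, 5)   # lower 5% quantile
print(f"characteristic friction angle: about {characteristic_value:.1f} degrees")
```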

    Battery Pack State of Health Prediction Based on the Electric Vehicle Management Platform Data

    In electric vehicle technologies, state of health (SOH) prediction and safety assessment of battery packs are key issues to be solved. In this paper, battery system data collected on an electric vehicle data management platform are used to model the state of health of the battery pack during charging and discharging. The capacity increment over a fixed voltage range is used as the battery state of health indicator. To improve the modeling accuracy, the influence of ambient temperature on the capacity performance of the battery pack is considered, and a temperature correction coefficient is added to the battery state of health model. A double exponential function is then used to describe the decline of battery health. In addition, for cases where the amount of data is relatively small, model migration is applied. A particle swarm optimization algorithm is used to calibrate the model parameters. Based on the migrated battery pack model and the parameter identification method, the proposed approach obtains accurate battery pack SOH predictions, and it is simple and easy to deploy on the electric vehicle data management platform.
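    The sketch below illustrates the model structure described above under simplifying assumptions: SOH decline over cycles is modelled as a double exponential, a multiplicative temperature correction is applied to the capacity-increment indicator, and the parameters are fitted by least squares (scipy's curve_fit) as a stand-in for the particle swarm calibration and the platform data used in the paper.

```python
# Double exponential SOH decline with a temperature correction on the
# capacity-increment indicator; synthetic data and least-squares fitting are
# stand-ins for the platform data and particle swarm calibration in the paper.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(5)

def soh_model(cycle, a, b, c, d):
    """Double exponential state-of-health decline with cycle number."""
    return a * np.exp(b * cycle) + c * np.exp(d * cycle)

def temperature_correction(capacity_increment, temp_c, k=0.004, ref_temp=25.0):
    """Scale the measured capacity increment toward the reference temperature."""
    return capacity_increment * (1.0 + k * (ref_temp - temp_c))

# Synthetic "measured" SOH trajectory with noise (placeholder for platform data).
cycles = np.arange(0, 800, 20)
true_soh = soh_model(cycles, 0.05, -0.004, 0.95, -0.0002)
measured = true_soh + rng.normal(0.0, 0.003, cycles.size)

params, _ = curve_fit(soh_model, cycles, measured,
                      p0=[0.05, -0.003, 0.9, -0.0001], maxfev=20_000)
print("fitted parameters:", np.round(params, 5))
print("predicted SOH at cycle 1000:", round(soh_model(1000, *params), 3))
```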

    Energy Consumption Estimation for Electric Buses Based on a Physical and Data-Driven Fusion Model

    The energy consumption of electric vehicles is closely related to the problems of charging station planning and vehicle route optimization. However, due to various factors, such as vehicle performance, driving habits and environmental conditions, it is difficult to estimate vehicle energy consumption accurately. In this work, a physical and data-driven fusion model was designed for electric bus energy consumption estimation. The basic energy consumption of the electric bus was modeled by a simplified physical model that accounts for rolling resistance, braking consumption and air-conditioning consumption. To capture the fluctuation in energy consumption caused by other factors, a CatBoost decision tree model was constructed, and the two were combined into a fusion model. The performance of the energy consumption model was verified using electric bus data from the big data platform. The results show that the model has high accuracy, with an average relative error of 6.1%. The fusion model provides a powerful tool for optimizing the energy consumption of electric buses, vehicle scheduling and the rational layout of charging facilities.
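    The sketch below is a minimal version of the physical half of such a fusion model: a simplified per-trip energy estimate from rolling resistance, aerodynamic drag, braking loss and air-conditioning load. All vehicle parameters are hypothetical; in the paper, the residual between this kind of physical estimate and the measured consumption is learned by the CatBoost model.

```python
# Simplified physical energy model for an electric bus trip; all parameters are
# hypothetical placeholders, not values from the paper.
import numpy as np

def physical_energy_kwh(distance_km, avg_speed_kmh, mass_kg=13_000,
                        c_rr=0.008, c_d=0.6, frontal_area_m2=7.0,
                        ac_power_kw=4.0, brake_loss_ratio=0.10,
                        drivetrain_eff=0.85, air_density=1.2):
    """Simplified per-trip traction + auxiliary energy estimate (kWh)."""
    g = 9.81
    distance_m = distance_km * 1000.0
    v = avg_speed_kmh / 3.6
    rolling = c_rr * mass_kg * g * distance_m                                # rolling resistance (J)
    aero = 0.5 * air_density * c_d * frontal_area_m2 * v ** 2 * distance_m   # aerodynamic drag (J)
    traction = (rolling + aero) / drivetrain_eff
    braking = brake_loss_ratio * traction                                    # unrecovered braking loss
    ac = ac_power_kw * 1000.0 * (distance_km / avg_speed_kmh) * 3600.0       # air conditioning (J)
    return (traction + braking + ac) / 3.6e6                                 # J -> kWh

# Example trip; in the fusion model a data-driven term would correct this value.
print(f"estimated consumption: {physical_energy_kwh(12.0, 18.0):.1f} kWh")
```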

    Simulation of non-stationary non-Gaussian random fields from sparse measurements using Bayesian compressive sampling and Karhunen-Loève expansion

    The first step in simulating random fields in practice is usually to obtain or estimate random field parameters, such as the mean, standard deviation and correlation function, among others. However, it is difficult to estimate these parameters, particularly the correlation length and correlation function, from sparse measurement data. In such cases, assumptions are often made about the probability distribution and correlation structure (e.g., a Gaussian distribution and stationarity), and the sparse measurement data are only used to estimate the parameters of these assumed forms. The uncertainty associated with this estimation process is then not taken into account in the random field simulation. This paper addresses the challenge of properly simulating non-stationary, non-Gaussian random fields when only sparse data are available. A novel method is proposed to simulate non-stationary and non-Gaussian random field samples directly from sparse measurement data, bypassing the difficulty of estimating random field parameters from such data. It is based on Bayesian compressive sampling and Karhunen–Loève expansion. The formulation of the proposed generator is first described; it is then illustrated through simulated examples and tested with wind speed time series data. The results show that the proposed method is able to accurately depict the underlying spatial correlation from sparse measurement data for both non-Gaussian and non-stationary random fields. In addition, the proposed method quantifies the uncertainty in random field parameter estimation from sparse measurement data and propagates it to the generated random field.
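    The sketch below illustrates, with an assumed covariance rather than one obtained from BCS, how a Karhunen–Loève expansion yields non-stationary, non-Gaussian field samples: a depth-dependent standard deviation provides the non-stationarity, and an exponential (lognormal) mapping of the Gaussian samples provides the non-Gaussian marginal.

```python
# KL-based simulation of non-stationary, non-Gaussian field samples from an
# assumed covariance; in the paper this covariance would come from the BCS
# posterior rather than the analytical form below.
import numpy as np

rng = np.random.default_rng(6)
n, corr_len = 200, 0.1
z = np.linspace(0.0, 1.0, n)

std = 0.2 + 0.3 * z                                   # depth-dependent std (non-stationarity)
corr = np.exp(-np.abs(z[:, None] - z[None, :]) / corr_len)
cov = np.outer(std, std) * corr                       # non-stationary covariance

eigval, eigvec = np.linalg.eigh(cov)
eigval = np.clip(eigval, 0.0, None)
k = 30                                                # truncated KL terms (assumption)
phi = eigvec[:, -k:] * np.sqrt(eigval[-k:])

gaussian_samples = rng.normal(size=(10, k)) @ phi.T   # zero-mean Gaussian fields
fields = np.exp(1.5 + gaussian_samples)               # lognormal (non-Gaussian) fields
print(fields.shape, float(fields.std(axis=0)[0]), float(fields.std(axis=0)[-1]))
```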

    Effects of flow regimes on the interaction between granular flow and flexible barrier

    Flexible barriers are widely used to mitigate granular flows. In practice, the flow regime may keep changing along the flow path after a granular flow is initiated, so the effects of flow regimes should be considered in the design of flexible barriers to intercept granular flows. In this study, flow regimes are divided into three types: dilute flow, dense flow and quasi-static flow. The impact mechanisms of dense and dilute granular flows against flexible barriers are investigated using flume tests and the discrete element method. The influences of particle segregation and of the ratio of the average particle size to the mesh size of the flexible barrier on the flow-barrier interaction are revealed, and the impact mechanisms of rockfall and granular flow are compared. The results show that the impact force of a dense granular flow against a flexible barrier does not increase linearly with the average particle size, and the tensile force in the bottom cable is usually the maximum tensile force among all cables of the flexible barrier. Particle segregation leads to an increase in the impact force of dense flows and in the tensile force of the upper cables. The impact force of a dilute granular flow increases with the average particle size, and, unlike the failure of a flexible barrier under the impact of a dense flow, the middle and upper cables are more likely to break. Based on these findings, a reference for the future design of flexible barriers is proposed.