4 research outputs found

    i-Cellulo: a SaaS platform for the automatic statistical analysis of cell impedance signals

    Poster presentation. International audience. Label-free methods such as cell impedance assays are in vitro tests increasingly used in drug development. An indirect difficulty with these technologies is the large amount of kinetic responses to be processed. Our objective is to automate the processing and analysis of these data with a web computational server available to all biologists and able to perform multivariate tests, response profile clustering and dynamic AC50 estimation. The proposed solution relies on a SaaS platform in which R-language algorithms have been implemented for the on-line processing of cell impedance signals. Three generic statistical problems are addressed: clustering of response profiles to screen compounds, multivariate testing to compare their activity, and AC50 estimation to determine their concentration effects. ANOVA, Kruskal-Wallis and Tukey's range tests have been implemented for the multivariate testing. Hierarchical clustering based on Singular Spectrum Analysis was used for the unsupervised classification of response profiles. A Hill model structure and a maximum likelihood estimator were adopted for the AC50 calculation. Hundreds of tests were carried out on real in vitro signals to assess the practical relevance of i-Cellulo for the fast analysis and characterization of anti-cancer activities in the early steps of drug development. The results clearly show the ability of this web-based solution to correctly discriminate, classify, compare and rank the anti-cancer responses of tested compounds against gold standards. With the advent of real-time cell measurement technologies in preclinical tests, new services for the analysis of high-content data are needed. i-Cellulo is a solution to that challenge: it allows biologists to speed up their data analysis and facilitates the interpretation of their results.
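    The dynamic AC50 estimation described above can be sketched in a few lines. This is a minimal illustration, not the platform's R code: it assumes a four-parameter Hill model fitted by least squares (the maximum likelihood estimator under Gaussian noise) with SciPy, on synthetic dose-response data invented for the example.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ac50, h):
    """Four-parameter Hill model: response as a function of concentration."""
    return bottom + (top - bottom) / (1.0 + (ac50 / conc) ** h)

def estimate_ac50(conc, response):
    """Fit the Hill model by least squares (ML estimator under Gaussian
    noise) and return the estimated AC50."""
    p0 = [response.min(), response.max(), np.median(conc), 1.0]
    popt, _ = curve_fit(hill, conc, response, p0=p0, maxfev=10000)
    return popt[2]

# Synthetic dose-response curve (hypothetical values for illustration)
rng = np.random.default_rng(0)
conc = np.logspace(-2, 2, 9)                 # concentrations, e.g. in µM
clean = hill(conc, 0.0, 1.0, 1.5, 1.2)       # true AC50 = 1.5
obs = clean + rng.normal(0.0, 0.02, conc.size)

ac50 = estimate_ac50(conc, obs)
```

    The same fit, applied per concentration-response series, yields the concentration-effect ranking the abstract refers to.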

    Computer-aided design of ECG telemetry systems for online cardiac monitoring

    Poster presentation. International audience. Telemetry is rapidly becoming standard practice for clinical studies. Remote monitoring may be performed at multiple subject locations over WiFi or 3G cell phone networks. Nevertheless, the accuracy and reliability of monitoring systems still have to be improved for clinical acceptability. The main issues are to extend the battery lifetime of IoT-enabled devices and to allow the transmission of ECG signals of acceptable quality. The objective is to propose an in silico method to optimize ECG telemetry systems by identifying critical telecommunication parameters and optimizing both battery lifetime and the quality of the transmitted ECG. Fifteen simulation parameters were selected to test their criticality, and three quality attributes were examined: battery lifetime, number of received packets and ECG distortion. The methodological framework relies on the Quality-by-Design guidelines (ICH Q8-Q12). Its partial implementation was based on two consecutive sets of simulations. In a first step, a screening design of numerical experiments (Plackett-Burman design) was carried out to identify the critical network parameters. In a second step, an optimization campaign based on a central composite design was implemented to identify the design space, i.e. the region of interest within which all the telecommunication parameters have to be kept to fulfil the quality specifications. Simulations were carried out in Omnet++ and the statistical analysis in the software environment R. For the number of packets received, two critical parameters were identified: message length and bit rate. For ECG distortion, the two most critical factors are background noise power and the energy detection of the radio receiver. For the battery lifetime, preliminary results tend to show that background noise power and bit rate are critical. This study demonstrates the relevance of using numerical simulations both to design a telemetry system and to check its compatibility with predefined quality-of-service specifications in a given, constrained context. Such an approach can be implemented in the early steps of development and can save a lot of time and money by preventing malfunctions, scrap and a posteriori restructuring of telemetry networks.
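    The Plackett-Burman screening step works by running the simulator at two levels of each factor and ranking main effects. The sketch below (a hypothetical illustration, not the study's R analysis) builds the standard 12-run design for up to 11 factors and computes main effects on a simulated response in which only the first two factors matter; factor names and response values are invented.

```python
import numpy as np

def plackett_burman_12():
    """12-run Plackett-Burman design for up to 11 two-level factors:
    cyclic shifts of the standard generator row, plus a row of -1s."""
    gen = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])
    rows = [np.roll(gen, i) for i in range(11)]
    rows.append(-np.ones(11, dtype=int))
    return np.array(rows)

def main_effects(design, response):
    """Main effect of each factor: mean response at +1 minus mean at -1
    (columns are balanced and orthogonal in a Plackett-Burman design)."""
    return design.T @ response / (design.shape[0] / 2)

X = plackett_burman_12()
# Hypothetical simulated output (e.g. packets received), dominated by
# factor 0 (message length) and factor 1 (bit rate), plus noise.
rng = np.random.default_rng(1)
y = 100.0 + 8.0 * X[:, 0] + 5.0 * X[:, 1] + rng.normal(0.0, 0.5, 12)
effects = main_effects(X, y)
```

    Factors whose effect estimates stand out from the noise band are flagged as critical and carried forward to the central composite design.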

    Simulation and sensitivity analysis of sensors network for cardiac monitoring

    International audience. This study's aim was to model and simulate a wireless sensor network and to propose a two-step sensitivity analysis for a targeted application related to home cardiac monitoring. After an initial research phase to design the appropriate network simulator, implemented in Omnet++, 13 simulation parameters were selected to test their criticality. The sensitivity analysis relies on two consecutive steps carried out in Matlab: a screening phase (Plackett-Burman design) and a global sensitivity analysis (space-filling design). Two output variables are considered: the number of packets received by the sink and the reception cache hit percentage. Four critical simulation parameters have been identified: message length, bit rate, background noise power and the energy detection of the radio receiver. As a perspective, this sensitivity analysis will be included as a component of a Quality-by-Design approach to network development. This contribution is the early-stage result obtained by Y. Kolasa during his MSc and PhD work (begun at the end of 2017) [2].
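    The abstract names a space-filling design for the global sensitivity step but not the estimator. As a hypothetical illustration (not the study's Matlab code), the sketch below pairs a simple Latin hypercube sample with standardized regression coefficients, a cheap global sensitivity measure that works when the response is close to linear; the response surrogate and factor roles are invented for the example.

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """Latin hypercube sample of n points in [0, 1]^d: one point per
    stratum in each dimension, strata independently permuted."""
    strata = np.tile(np.arange(n), (d, 1))
    return (rng.permuted(strata, axis=1).T + rng.random((n, d))) / n

def src_indices(X, y):
    """Standardized regression coefficients: regress the standardized
    response on the standardized inputs; |beta_j| ranks factor influence."""
    Xc = (X - X.mean(0)) / X.std(0)
    yc = (y - y.mean()) / y.std()
    beta, *_ = np.linalg.lstsq(Xc, yc, rcond=None)
    return beta

rng = np.random.default_rng(2)
X = latin_hypercube(200, 4, rng)
# Hypothetical response surrogate: output driven mostly by x0
# (e.g. message length) and x2 (e.g. background noise power).
y = 3.0 * X[:, 0] + 1.5 * X[:, 2] + 0.1 * rng.normal(size=200)
s = src_indices(X, y)
```

    Ranking the absolute coefficients reproduces, on this toy surrogate, the kind of critical-factor shortlist the study extracts from its simulator.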

    Quality-by-design-engineered pBFT consensus configuration for medical device development

    International audience. Health product development has lately been tainted by wariness toward manufacturers, which has reduced trust in the system. This also affects Digital Health, where patients' big data flows generated by numerous sensors are subject to increased security and confidentiality requirements to lower the risks incurred. Our aim is to restore trust in the system by implementing a dedicated Blockchain solution in which data are automatically stored, and where each actor in the development process can access and host them. Blockchain has its downsides, such as inefficient management of big data flows. This study is a first step toward defining a Blockchain solution that will not deteriorate the Quality of Service in this particular context, using the Quality-by-Design approach. We mainly focus on the time-to-consensus attribute, which affects both security and Quality of Service. From experimental results generated by running a screening design and a response surface design on a practical Byzantine Fault Tolerance (pBFT) simulator, we find that the transmission time and the message processing time are the most influential factors.
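    Why transmission time and processing time dominate can be seen from a back-of-the-envelope model of the pBFT message flow. The function below is a deliberately rough sketch (not the study's simulator): it assumes three phases (pre-prepare, prepare, commit) whose latency is one message transmission plus the processing of every message a single replica handles in that phase, ignoring network contention and view changes.

```python
def pbft_time_to_consensus(n_replicas, t_tx, t_proc):
    """Rough analytical model of pBFT time to consensus.

    Per replica, per phase, the number of messages to process is roughly:
      pre-prepare: 1 message from the primary
      prepare:     n-1 prepare messages
      commit:      n commit messages
    Each phase also costs one message transmission time t_tx.
    """
    msgs_handled = [1, n_replicas - 1, n_replicas]
    return sum(t_tx + m * t_proc for m in msgs_handled)

# Example: 4 replicas (tolerating f = 1 fault), 10 ms transmission,
# 1 ms processing per message (hypothetical values).
t = pbft_time_to_consensus(4, 0.010, 0.001)
```

    Since both t_tx and t_proc enter every phase, and the processing term is multiplied by the replica count, the model is consistent with the screening result that these two factors drive time to consensus.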