
    BμG@Sbase - a microarray database and analysis tool

    The manufacture and use of a whole-genome microarray is a complex process, and it is essential that all data surrounding the process are stored, accessible and easily associated with the data generated following hybridization and scanning. As part of a program funded by the Wellcome Trust, the Bacterial Microarray Group at St. George's Hospital Medical School (BμG@S) will generate whole-genome microarrays for 12 bacterial pathogens for use in collaboration with specialist research groups. BμG@S will collaborate with these groups at all levels, including the experimental design, methodology and analysis. In addition, we will provide informatic support in the form of a database system (BμG@Sbase). BμG@Sbase will provide access through a web interface to the microarray design data and will allow individual users to store their data in a searchable, secure manner. Tools developed by BμG@S in collaboration with specific research groups investigating analysis methodology will also be made available to those groups using the arrays and submitting data to BμG@Sbase.

    Exploring the Interplay between CAD and FreeFem++ as an Energy Decision-Making Tool for Architectural Design

    The energy modelling software tools commonly used for architectural purposes do not allow a straightforward real-time implementation within architectural design programs. In addition, the surrounding exterior spaces of the building, including the inner courtyards, hardly receive a specific treatment distinguishing these spaces from the general external temperature in the thermal simulations. This is a clear disadvantage when it comes to streamlining the design process in relation to whole-building energy optimization. In this context, the present study aims to demonstrate the advantages of the FreeFem++ open-source program for performing simulations in architectural environments. These simulations include microclimate tests that describe the interactions between a building architecture and its local exterior. The great potential of this mathematical tool can be realized through its complete system integration within CAD (Computer-Aided Design) software such as SketchUp or AutoCAD. In order to establish the suitability of FreeFem++ for the performance of simulations, the most widely employed energy simulation tools able to consider a proposed architectural geometry in a specific environment are compared. On the basis of this analysis, it can be concluded that FreeFem++ offers the best features for the thermal performance simulation of these specific outdoor spaces, its one remaining drawback being the lack of easy interaction with architectural drawing programs. The main contribution of this research is, in fact, the enhancement of FreeFem++ usability by proposing a simple, intuitive method for the creation of building geometries and their respective meshing (pre-processing). FreeFem++ is also considered a tool for data analysis (post-processing) able to help engineers and architects with building energy-efficiency-related tasks.
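The microclimate tests described above amount to solving heat-transfer problems over a building geometry and its surrounding exterior. As a rough illustration of the kind of computation involved (FreeFem++ uses finite elements; this sketch uses a much simpler finite-difference Jacobi iteration, and all temperatures and the domain are made-up assumptions):

```python
import numpy as np

def solve_steady_heat(nx=40, ny=40, t_wall=20.0, t_exterior=5.0, n_iter=5000):
    """Jacobi iteration for the steady 2D heat (Laplace) equation on a
    toy courtyard-like domain: two facades held at t_wall, two open
    boundaries held at t_exterior. Purely illustrative numbers."""
    t = np.full((ny, nx), t_exterior)
    for _ in range(n_iter):
        t[:, 0] = t_wall        # heated facade (west)
        t[0, :] = t_wall        # heated facade (north)
        t[:, -1] = t_exterior   # open boundary (east)
        t[-1, :] = t_exterior   # open boundary (south)
        # Each interior point relaxes toward the mean of its four neighbours.
        t[1:-1, 1:-1] = 0.25 * (t[1:-1, :-2] + t[1:-1, 2:] +
                                t[:-2, 1:-1] + t[2:, 1:-1])
    return t

field = solve_steady_heat()
```

The resulting field interpolates smoothly between the facade and exterior temperatures; a finite-element tool such as FreeFem++ solves the same class of problem on unstructured meshes derived from the CAD geometry.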

    An academic design methodology for electrical mobility products - from necessity to functional prototype

    The undergraduate program in Product Design Engineering at EAFIT University-Colombia includes an applied project course during eight semesters with different topics -- Students attend their last project course in the seventh and eighth semesters, integrated into a one-year topic -- In this project, they have to design a new high-tech consumer product in electrical mobility for different types of transportation needs and to construct a completely functional prototype -- The objectives of these courses are to focus on the triad of “Product-User-Context” as well as to foster design, engineering, manufacturing, management and entrepreneurship skills -- In order to offer a systematic way of working, and to obtain better results, a systematic design methodology has been adopted, adapted and applied during the whole product development process in order to facilitate representation, analysis, calculation, management and control of the information related to the product -- The methodology is broadly explained through activities, tools, information and results related to four main stages: 1) Need research & problem statement, 2) Conceptual design, 3) Detailed design and 4) Prototype construction & testing -- A successful case study is presented following all the stages of the presented methodology for the development of an Electric Power-Assisted Bicycle.

    A Preliminary Design Tool For Radial Compressors

    The aim of this thesis has been to implement a computer program capable of modelling radial compressors. The work was conducted at the Division of Thermal Power Engineering, Department of Energy Sciences at the Faculty of Engineering, Lund University. With such a large field as radial compressors, some decomposition of the subject was necessary. This thesis focuses on modelling the impeller and diffuser of the compressor, where the components are individually studied. By making this division between different parts of the compressor, the individual components can be designed in isolation. If, for instance, only a vaned diffuser is to be designed, no impeller design must first be found. The resulting computer program, called LURC (Lund University Radial Compressor), has two different design tools for the impeller, one in-depth impeller flow analysis, one vaneless diffuser design tool and one vaned diffuser design tool. The compressor designer can easily perform different analyses with LURC, all design tools being easily accessible at all times. Also, several of the analysis tools are interconnected, allowing easy data exchange. The so-called preliminary impeller design tool requires minimal input data from the user, quickly yielding a promising impeller design. It is a one-dimensional analysis method, where most parameters are specified on the mean streamline. The complete impeller geometry is directly generated, making it a perfect tool for the inexperienced compressor designer. Further, the result can be used as input data in either of the two diffuser analyses. In the detailed impeller design tool, more input data must be specified. Also, the geometry and blade shapes are under the full control of the designer. This analysis is also one-dimensional but has the capability of creating three-dimensional impeller geometries. Such a tool is well suited to the more experienced designer, who knows what he or she wants to achieve.
The result can be used in both diffuser analyses, but also in the in-depth impeller flow analysis. The in-depth impeller flow analysis is based on a quasi-three-dimensional calculation method. In this case, a streamline curvature method is coupled with a stream function flow analysis, allowing the three-dimensional problem to be decomposed into two two-dimensional ones. From this analysis tool the velocity distributions along the blades may be found, giving the experienced designer a vital piece of information. Vaneless diffusers are very common in radial compressors, making it important to incorporate these in LURC. Input data may be entered manually or imported from a previous impeller analysis. The analysis is based on one-dimensional theory, including the effects of wall friction. The complete geometry is generated and the key results, such as discharge pressure and velocity, are calculated. The vaned diffuser design tool is focused on channel diffusers of the wedge type. Also included in the analysis is the vaneless space right after the impeller discharge. As was the case for the vaneless diffuser design tool, inlet data can be entered either manually or fetched from an impeller analysis conducted earlier. The whole diffuser geometry is generated, that is, both the vaneless space and the channel diffuser part. One-dimensional compressible channel flow is the basis for this analysis. LURC fulfils its main objective: being a fast design tool for establishing promising compressor stage designs. Although mainly based on one-dimensional analyses, it is a very efficient and powerful tool for determining important design parameters in a timely fashion.
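A one-dimensional mean-line calculation of the kind the preliminary design tool performs can be sketched as follows. This is not LURC's code: the slip factor, polytropic efficiency and all numbers are illustrative assumptions, and the work estimate follows the standard Euler turbomachinery equation with zero inlet swirl.

```python
import math

def impeller_meanline(rpm, r2, slip_factor=0.9, cp=1005.0, t01=288.15,
                      gamma=1.4, eta_poly=0.85):
    """Mean-line estimate of impeller tip speed, specific work and stage
    pressure ratio. With no inlet swirl, the Euler work is U2 * C_theta2,
    and C_theta2 is approximated as slip_factor * U2 (a Wiesner-style
    slip assumption). All default values are illustrative."""
    u2 = rpm * 2.0 * math.pi / 60.0 * r2   # blade tip speed [m/s]
    dh0 = slip_factor * u2 * u2            # Euler specific work [J/kg]
    t02 = t01 + dh0 / cp                   # exit stagnation temperature [K]
    # Polytropic efficiency links the temperature and pressure ratios.
    pr = (t02 / t01) ** (gamma * eta_poly / (gamma - 1.0))
    return u2, dh0, pr

u2, dh0, pr = impeller_meanline(rpm=40000, r2=0.1)
```

Given just a shaft speed and a tip radius, such a calculation yields a first estimate of the achievable pressure ratio, which is the spirit of a preliminary design tool requiring minimal input.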


    Application Development using Compositional Performance Analysis

    A parallel programming archetype [Cha94, CMMM95] is an abstraction that captures the common features of a class of problems with a similar computational structure and combines them with a parallelization strategy to produce a pattern of dataflow and communication. Such abstractions are useful in application development, both as a conceptual framework and as a basis for tools and techniques. The efficiency of a parallel program can depend a great deal on how its data and tasks are decomposed and distributed. This thesis describes a simple performance evaluation methodology that includes an analytic model for predicting the performance of parallel and distributed computations developed for multicomputer machines and networked personal computers. This analytic model can be supplemented by a simulation infrastructure for application writers to use when developing parallel programs using archetypes. These performance evaluation tools were developed with the following restricted goal in mind: We require accuracy of the analytic model and simulation infrastructure only to the extent that they suggest directions for the programmer to make the appropriate optimizations. This restricted goal sacrifices some accuracy, but makes the tools simpler and easier to use. A programmer can use these tools to design programs with decomposition and distribution specialized to a given machine configuration. By instantiating a few architecture-based parameters, the model can be employed in the performance analysis of data-parallel applications, guiding process generation, communication, and mapping decisions. The model is language-independent and machine-independent; it can be applied to help programmers make decisions about performance-affecting parameters as programs are ported across architectures and languages. 
    Furthermore, the model incorporates both platform-specific and application-specific aspects, and it allows programmers to experiment with tradeoffs better than either strictly simulation-based or purely theoretical models. In addition, the model was designed to be simple. In summary, this thesis outlines a simple method for benchmarking a parallel communication library and for using the results to model the performance of applications developed with that communication library. We use compositional performance analysis - decomposing a parallel program into its modular parts and analyzing their respective performances - to gain perspective on the performance of the whole program. This model is useful for predicting parallel program execution times for different types of program archetypes (e.g., mesh and mesh-spectral), using communication libraries built with different message-passing schemes (e.g., Fortran M and Fortran with MPI) running on different architectures (e.g., IBM SP2 and a network of Pentium personal computers).
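The flavor of such an analytic model can be sketched as a computation term plus a communication term costed by a benchmarked latency/bandwidth (alpha-beta) model. The function and parameter names below are illustrative, not taken from the thesis:

```python
def predict_runtime(n_tasks, t_task, n_msgs, msg_bytes,
                    latency_s, bandwidth_bps, n_procs):
    """Compositional runtime estimate: work divided evenly across
    processors, plus per-message latency and per-byte transfer cost.
    latency_s and bandwidth_bps would come from benchmarking the
    communication library on the target machine."""
    t_comp = (n_tasks / n_procs) * t_task                    # computation
    t_comm = n_msgs * (latency_s + msg_bytes / bandwidth_bps)  # communication
    return t_comp + t_comm

# A mesh-archetype-style tradeoff: doubling the processors halves the
# computation term but doubles the message count (assumed numbers).
t8 = predict_runtime(10_000, 1e-4, 200, 8_192, 50e-6, 1e8, 8)
t16 = predict_runtime(10_000, 1e-4, 400, 8_192, 50e-6, 1e8, 16)
```

Comparing such estimates across processor counts is exactly the kind of decomposition-and-distribution decision the model is meant to guide.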

    Selective Dynamic Analysis of Virtualized Whole-System Guest Environments

    Dynamic binary analysis is a prevalent and indispensable technique in program analysis. While several dynamic binary analysis tools and frameworks have been proposed, all suffer from one or more of: prohibitive performance degradation, a semantic gap between the analysis code and the execution under analysis, architecture/OS specificity, being user-mode only, and lacking flexibility and extendability. This dissertation describes the design of the Dynamic Executable Code Analysis Framework (DECAF), a virtual machine-based, multi-target, whole-system dynamic binary analysis framework. In short, DECAF seeks to address the shortcomings of existing whole-system dynamic analysis tools and extend the state of the art by utilizing a combination of novel techniques to provide rich analysis functionality without crippling amounts of execution overhead. DECAF extends the mature QEMU whole-system emulator, a type-2 hypervisor capable of emulating every instruction that executes within a complete guest system environment. DECAF provides a novel, hardware event-based method of just-in-time virtual machine introspection (VMI) to address the semantic gap problem. It also implements a novel instruction-level taint tracking engine at a bitwise level of granularity, ensuring that taint propagation is sound and highly precise throughout the guest environment. A formal analysis of the taint propagation rules is provided to verify that most instructions introduce neither false positives nor false negatives. DECAF’s design also provides a plugin architecture with a simple-to-use, event-driven programming interface that makes it both flexible and extendable for a variety of analysis tasks. The implementation of DECAF consists of 9550 lines of C++ code and 10270 lines of C code. Its performance is evaluated using CPU2006 SPEC benchmarks, which show an average overhead of 605% for system-wide tainting and 12% for VMI.
    Three platform-neutral DECAF plugins - Instruction Tracer, Keylogger Detector, and API Tracer - are described and evaluated in this dissertation to demonstrate the ease of use and effectiveness of DECAF in writing cross-platform and system-wide analysis tools. This dissertation also presents the Virtual Device Fuzzer (VDF), a scalable fuzz testing framework for discovering bugs within the virtual devices implemented as part of QEMU. Such bugs could be used by malicious software executing within a guest under analysis by DECAF, so the discovery, reproduction, and diagnosis of such bugs helps to protect DECAF against attack while improving QEMU and any analysis platforms built upon QEMU. VDF uses selective instrumentation to perform targeted fuzz testing, which explores only the branches of execution belonging to virtual devices under analysis. By leveraging record and replay of memory-mapped I/O activity, VDF quickly cycles virtual devices through an arbitrarily large number of states without requiring a guest OS to be booted or present. Once a test case is discovered that triggers a bug, VDF reduces the test case to the minimum number of reads/writes required to trigger the bug and generates source code suitable for reproducing the bug during debugging and analysis. VDF is evaluated by fuzz testing eighteen QEMU virtual devices, generating 1014 crash or hang test cases that reveal bugs in six of the tested devices. Over 80% of the crashes and hangs were discovered within the first day of testing. VDF covered an average of 62.32% of virtual device branches during testing, and the average test case was minimized to a reproduction case only 18.57% of its original size.
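Bitwise taint propagation of the kind the dissertation describes can be sketched for a couple of operators. This is a simplified illustration, not DECAF's actual rules or code; the function names are hypothetical. The key idea is that precision requires looking at concrete operand values, not just taint bits: a bit of `a & b` cannot be influenced by taint in `a` if the corresponding bit of `b` is a concrete 0.

```python
def taint_and(val_a, taint_a, val_b, taint_b):
    """Bit-precise taint rule for AND. A result bit is tainted only if a
    tainted input bit can influence it: taint from a matters where b's
    bit is 1 or itself tainted, and symmetrically for b."""
    return (taint_a & (val_b | taint_b)) | (taint_b & (val_a | taint_a))

def taint_xor(taint_a, taint_b):
    """XOR flips the result bit whenever either input bit flips, so
    taint simply unions: every tainted input bit taints the output bit."""
    return taint_a | taint_b

# Low nibble of a is tainted, but b's low nibble is a concrete 0x0, so
# masking with b kills that taint entirely.
t = taint_and(0xFF, 0x0F, 0xF0, 0x00)
```

Value-aware rules like these are what keep bit-level tainting from over-approximating (false positives) while remaining sound (no false negatives), the property the dissertation verifies formally.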

    Endometrial receptivity revisited: endometrial transcriptome adjusted for tissue cellular heterogeneity

    STUDY QUESTION Does cellular composition of the endometrial biopsy affect the gene expression profile of endometrial whole-tissue samples? SUMMARY ANSWER The differences in epithelial and stromal cell proportions in endometrial biopsies modify the whole-tissue gene expression profiles and affect the results of differential expression analyses. WHAT IS KNOWN ALREADY Each cell type has its unique gene expression profile. The proportions of epithelial and stromal cells vary in endometrial tissue during the menstrual cycle, along with individual and technical variation due to the method and tools used to obtain the tissue biopsy. STUDY DESIGN, SIZE, DURATION Using cell-population-specific transcriptome data and a computational deconvolution approach, we estimated the epithelial and stromal cell proportions in whole-tissue biopsies taken during early secretory and mid-secretory phases. The estimated cellular proportions were used as covariates in whole-tissue differential gene expression analysis. Endometrial transcriptomes before and after deconvolution were compared and analysed in a biological context. PARTICIPANTS/MATERIALS, SETTING, METHODS Paired early- and mid-secretory endometrial biopsies were obtained from 35 healthy, regularly cycling, fertile volunteers, aged 23–36 years, and analysed by RNA sequencing. Differential gene expression analysis was performed using two approaches. In one of them, computational deconvolution was applied as an intermediate step to adjust for the proportions of epithelial and stromal cells in the endometrial biopsy. The results were then compared to conventional differential expression analysis. Ten paired endometrial samples were analysed with qPCR to validate the results.
MAIN RESULTS AND THE ROLE OF CHANCE The estimated average proportions of stromal and epithelial cells in the early secretory phase were 65% and 35%, and during the mid-secretory phase, 46% and 54%, respectively, correlating well with the results of histological evaluation (r = 0.88, P = 1.1 × 10⁻⁶). Endometrial tissue transcriptomic analysis showed that approximately 26% of transcripts (n = 946) differentially expressed in receptive endometrium in cell-type-unadjusted analysis also remain differentially expressed after adjustment for biopsy cellular composition. However, the other 74% (n = 2645) become statistically non-significant after adjustment for biopsy cellular composition, underlining the impact of tissue heterogeneity on differential expression analysis. The results suggest new mechanisms involved in endometrial maturation, involving genes such as LINC01320, SLC8A1 and GGTA1P, described for the first time in the context of endometrial receptivity. LARGE-SCALE DATA The RNA-seq data presented in this study are deposited in the Gene Expression Omnibus database with accession number GSE98386. LIMITATIONS, REASONS FOR CAUTION Only dominant endometrial cell types were considered in gene expression profile deconvolution; however, other less frequent endometrial cell types also contribute to the whole-tissue gene expression profile. WIDER IMPLICATIONS OF THE FINDINGS A better understanding of the molecular processes during the transition from pre-receptive to receptive endometrium serves to improve the effectiveness and personalization of assisted reproduction protocols. Biopsy cellular composition should be taken into account in future endometrial ‘omics’ studies, where tissue heterogeneity could potentially influence the results.
STUDY FUNDING/COMPETING INTEREST(S) This study was funded by: Estonian Ministry of Education and Research (grant IUT34-16); Enterprise Estonia (EU48695); the EU-FP7 Eurostars program (NOTED, EU41564); the EU-FP7 Marie Curie Industry-Academia Partnerships and Pathways (SARM, EU324509); Horizon 2020 innovation program (WIDENLIFE, EU692065); MSCA-RISE-2015 project MOMENDO (No 691058) and the Miguel Servet Program Type I of Instituto de Salud Carlos III (CP13/00038); Spanish Ministry of Economy, Industry and Competitiveness (MINECO) and European Regional Development Fund (FEDER): grants RYC-2016-21199 and ENDORE SAF2017-87526. The authors confirm no competing interests.
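The deconvolution step used to estimate cell proportions can be illustrated with a toy least-squares sketch. This is a simplified analogue, not the study's actual method or data: the synthetic signature matrix, the noiseless mixture, and the clip-and-renormalize step are all assumptions made for the example.

```python
import numpy as np

def estimate_proportions(bulk, signatures):
    """Least-squares deconvolution of a bulk expression profile into
    cell-type proportions. `signatures` has one column of reference
    expression values per cell type. This toy version solves an
    unconstrained lstsq, then clips negatives and renormalizes so the
    proportions are non-negative and sum to 1."""
    coef, *_ = np.linalg.lstsq(signatures, bulk, rcond=None)
    coef = np.clip(coef, 0.0, None)
    return coef / coef.sum()

# Toy data: 50 genes, two cell types (say, stromal and epithelial),
# bulk sample mixed 65/35 without noise.
rng = np.random.default_rng(0)
sig = rng.uniform(1.0, 10.0, size=(50, 2))
bulk = sig @ np.array([0.65, 0.35])
props = estimate_proportions(bulk, sig)
```

With noiseless data the true mixing fractions are recovered exactly; the estimated proportions can then be used as covariates in the differential expression model, as the study does.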

    Elementary Teachers' Experiences with Technology Professional Development and Classroom Technology Integration: Influences of Elements of Diffusion and Support

    Lack of teacher technology integration is a documented concern within education. Effective staff development practices, the need for ongoing support, and the presence of elements of diffusion are all recognized as factors that lead to higher rates of technology integration. These elements have not previously been studied as a whole in research on technology education. This study sought to examine all three of these factors within a southern metropolitan school district’s technology teacher development initiative. The following questions guided the research: 1. How do teachers experience the five elements of diffusion (complexity, trialability, observability, relative advantage, and compatibility) in the area of technology integration in elementary schools? 2. How do teachers experience instructional technology support and the impact of support on their technology integration instruction? 3. How do teachers experience technology staff development and the impact of staff development on their classroom technology integration? Data were collected from 81 online survey participants, 16 oral interview and web log analysis participants, and an interview with the project director at the completion of the first year of a two-year initiative. Participants received updated technology tools within their classrooms and were required to take technology-related courses, keep web logs, and complete technology projects. Research was conducted within a mixed-methods triangulation design using a pragmatic paradigm, with descriptive statistics and correlations as forms of quantitative analysis and a phenomenological approach applied in qualitative analysis. Findings showed the presence of elements of diffusion and support across all data sources. Teachers’ experiences with the program were positive and led to frequent and varied technology integration. Correlations indicated high levels of interrelatedness among the variables of support, elements of diffusion, and impact on instruction.
    Teachers reported enhanced engagement in learning among themselves and their students. The fact that teachers chose to be in the staff development program and had choices within the program to fulfill the requirements appeared to engage and motivate them. Even though teachers self-reported that they were early adopters of technology, the program support structure was highly valued. The program could be used as a model for effective technology staff development.