
    A network approach for managing and processing big cancer data in clouds

    Translational cancer research requires integrative analysis of multiple levels of big cancer data to identify and treat cancer. Such data are decentralised, growing, and continually being updated, and the content held on different information sources partially overlaps, creating redundancies as well as contradictions and inconsistencies. To address these issues, we develop a data network model and technology for constructing and managing big cancer data. To support this data network approach to data processing and analysis, we employ a semantic content network and adopt the CELAR cloud platform. A prototype implementation shows that the CELAR cloud can satisfy the on-demand needs of various data resources for managing and processing big cancer data.
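
    The abstract gives no implementation details; as a rough illustration of the idea of a data network over partially overlapping sources, the Python sketch below links dataset nodes whose record sets intersect, which is exactly where the redundancies and contradictions the authors mention would surface. All names and data are hypothetical; this is not the authors' CELAR-based system.

        # Hypothetical sketch: a "data network" linking overlapping datasets.
        # Node, field, and record names are invented for illustration.

        class DatasetNode:
            def __init__(self, name, records):
                self.name = name          # source name, e.g. one repository
                self.records = records    # record_id -> value for one attribute
                self.links = []           # overlapping sources in the network

        def build_network(nodes):
            """Link every pair of sources whose record IDs overlap."""
            for i, a in enumerate(nodes):
                for b in nodes[i + 1:]:
                    shared = a.records.keys() & b.records.keys()
                    if shared:
                        a.links.append((b, shared))
                        b.links.append((a, shared))

        def find_contradictions(node):
            """Report records where two linked sources disagree on the value."""
            for other, shared in node.links:
                for rid in shared:
                    if node.records[rid] != other.records[rid]:
                        yield rid, node.name, other.name

        a = DatasetNode("source_A", {"p1": "BRCA1+", "p2": "BRCA1-"})
        b = DatasetNode("source_B", {"p2": "BRCA1+", "p3": "BRCA1-"})
        build_network([a, b])
        print(list(find_contradictions(a)))   # p2 is recorded inconsistently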

    THE IMMERSION PROGRAM AS A BASIS FOR DESIGNING ENGLISH-MEDIUM LEARNING IN BILINGUAL JUNIOR HIGH SCHOOLS IN THE SPECIAL REGION OF YOGYAKARTA

    This research aims at constructing an ideal learning model for bilingual classes through an immersion program. The objects of the research are students, parents, teachers, and the principal of SMPN 1 Bantul. Several field observations found that although the bilingual-class teachers have received annual training, they still have major problems managing the bilingual class. They are too preoccupied with the language of instruction, which leaves the students less active because the learning process does not run smoothly. Careful data analysis shows that this problem can be addressed by constructing a training model that upgrades teachers' ability to manage classroom learning and motivates students to participate actively in classroom activities. FBS, 2007 (PEND. BHS INGGRIS)

    Analyzing library collections with starfield visualizations

    This paper presents a qualitative and formative study of the uses of a starfield-based visualization interface for the analysis of library collections. The evaluation process produced feedback that suggests ways to significantly improve starfield interfaces and the interaction process, improving their learnability and usability. The study also gave us a clear indication of additional potential uses of starfield visualizations that can be exploited through further functionality and interface development. We report the resulting implications for the design and use of starfield visualizations, which affect their graphical interface features, their use for managing data quality, and their potential for various forms of visual data mining. Although the current implementation and analysis focus on the collection of a physical library, the most important contributions of our work will be in digital libraries, in which the volume, complexity, and dynamism of collections are increasing dramatically and tools are needed for visualization and analysis.
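
    The paper describes the interface rather than code; for orientation, a starfield display is essentially a zoomable scatterplot in which each point is one item and the axes are item attributes. The matplotlib sketch below plots a toy library collection by publication year against classification number; all data, attribute choices, and colour coding are invented, not taken from the paper's system.

        # Hypothetical starfield-style view of a library collection (toy data).
        # Each point is one item; axes are item attributes, colour marks a facet.
        import matplotlib.pyplot as plt

        items = [  # (publication year, Dewey class, circulating?)
            (1978, 510, True), (1992, 4, False), (2001, 25, True),
            (1985, 820, True), (2010, 4, True), (1999, 510, False),
        ]
        years = [y for y, _, _ in items]
        dewey = [d for _, d, _ in items]
        colors = ["tab:blue" if c else "tab:red" for _, _, c in items]

        fig, ax = plt.subplots()
        ax.scatter(years, dewey, c=colors, s=40)
        ax.set_xlabel("Publication year")
        ax.set_ylabel("Dewey classification")
        ax.set_title("Starfield view of a (toy) library collection")
        plt.show()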

    Exploring positive adjustment in people with spinal cord injury.

    This study explored adjustment in people with spinal cord injury; data from four focus groups are presented. Thematic analysis revealed four themes: managing goals and expectations, comparison with others, feeling useful, and acceptance. Participants engaged positively in life, interpreted social comparison information positively, and set realistic goals and expectations. These positive strategies support adjustment theories such as Cognitive Adaptation Theory, Control Process Theory, and Response Shift Theory. The results also provide insight into the adjustment process of a person with spinal cord injury and may be useful in tailoring support during rehabilitation.

    The Workflow of Data Analysis Using Stata

    The Workflow of Data Analysis Using Stata, by J. Scott Long, is a productivity tool for data analysts. Long guides you toward streamlining your workflow, because a good workflow is essential for replicating your work, and replication is essential for good science. A workflow of data analysis is a process for managing all aspects of data analysis. Planning, documenting, and organizing your work; cleaning the data; creating, renaming, and verifying variables; performing and presenting statistical analyses; producing replicable results; and archiving what you have done are all integral parts of your workflow. Long shows how to design and implement efficient workflows for both one-person projects and team projects. Keywords: Stata, data management, workflow
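
    The book's own examples are in Stata; as a language-neutral illustration of the workflow steps the abstract lists (clean, derive and verify variables, analyze, archive), here is a minimal Python sketch. The file names, variables, and cleaning rule are hypothetical, not taken from the book.

        # Minimal sketch of the clean -> derive -> verify -> analyze -> archive
        # workflow the abstract describes. File and variable names are
        # hypothetical; the book's own examples use Stata, not Python.
        import csv, os, shutil, statistics

        def clean(rows):
            """Drop records with missing income; document the rule as you go."""
            return [r for r in rows if r["income"] not in ("", "NA")]

        def derive(rows):
            """Create new variables explicitly rather than overwriting old ones."""
            for r in rows:
                r["income_positive"] = float(r["income"]) > 0
            return rows

        def verify(rows):
            """Verify variables before analysis; fail loudly, not silently."""
            assert all(float(r["income"]) >= 0 for r in rows), "negative income"

        with open("survey_raw.csv", newline="") as f:   # hypothetical input file
            data = derive(clean(list(csv.DictReader(f))))
        verify(data)
        print("mean income:", statistics.mean(float(r["income"]) for r in data))

        os.makedirs("archive", exist_ok=True)           # archive what you did
        shutil.copy("survey_raw.csv", "archive/survey_raw_v1.csv")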

    On the Behaviour of General-Purpose Applications on Cloud Storages

    Managing data over cloud infrastructures raises novel challenges with respect to existing and well-studied approaches such as ACID and long-running transactions. One of the main requirements is to provide availability and partition tolerance in a scenario with replicas and distributed control. This comes at the price of a weaker consistency, usually called eventual consistency. These weak memory models have proved suitable in a number of scenarios, such as the analysis of large data with MapReduce. However, due to the widespread availability of cloud infrastructures, weak storages are used not only by specialised applications but also by general-purpose applications. We provide a formal approach, based on process calculi, to reason about the behaviour of programs that rely on cloud stores. For instance, one can check that the composition of a process with a cloud store ensures 'strong' properties through a wise usage of asynchronous message-passing.
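
    The paper's formal model is a process calculus; as a down-to-earth illustration of the eventual consistency it reasons about, the Python sketch below shows two replicas that accept writes locally and exchange them by asynchronous message-passing, so reads can disagree until the replicas converge. The replica design (last-writer-wins on a timestamp) is a common textbook scheme assumed here for illustration, not the paper's model.

        # Illustrative sketch of an eventually consistent store: two replicas,
        # asynchronous propagation, last-writer-wins merge on a timestamp.
        from collections import deque

        class Replica:
            def __init__(self):
                self.store = {}        # key -> (timestamp, value)
                self.inbox = deque()   # asynchronously delivered updates

            def write(self, key, value, ts, peers):
                self.store[key] = (ts, value)
                for p in peers:                 # send and continue: asynchronous
                    p.inbox.append((key, ts, value))

            def read(self, key):
                return self.store.get(key, (0, None))[1]

            def deliver(self):
                """Apply queued updates; keep the write with the newest timestamp."""
                while self.inbox:
                    key, ts, value = self.inbox.popleft()
                    if ts > self.store.get(key, (0, None))[0]:
                        self.store[key] = (ts, value)

        a, b = Replica(), Replica()
        a.write("x", 1, ts=1, peers=[b])
        print(a.read("x"), b.read("x"))   # 1 None -> replicas disagree
        b.deliver()
        print(a.read("x"), b.read("x"))   # 1 1    -> eventually consistent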

    Application of BIM in sustainability analysis

    Building Information Modeling (BIM) is the process of generating and managing building data throughout a building's life cycle. Typically it uses three-dimensional, real-time, dynamic building-modeling software to increase productivity in building design and construction. The process produces the Building Information Model, which encompasses building geometry, spatial relationships, geographic information, and the quantities and properties of building components. The Green Building Index (GBI), Malaysia's localised sustainability rating system for buildings, assesses the impact of a building on the environment in terms of energy efficiency, indoor environmental quality, sustainable site planning & management, materials & resources, water efficiency, and innovation. By integrating the GBI assessment criteria with BIM technology, this research proposes a comparative case-study analysis of Residential New Construction (RNC) and Non-Residential New Construction (NRNC) based on Autodesk Ecotect Analysis (a concept-to-detail sustainable-design analysis tool that provides a wide range of simulation and analysis functionality through desktop and web-service platforms) and Autodesk Green Building Studio (a web-based energy-analysis service that performs whole-building analysis, optimises energy efficiency, and works toward carbon neutrality earlier in the design process) to investigate the influence of construction material on energy consumption, lifecycle energy cost, and carbon emissions.
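
    The abstract names the tools but not the calculation; as a rough, hypothetical illustration of how a material choice feeds into energy use, lifecycle cost, and carbon, the sketch below compares two wall constructions with a simple degree-day heat-loss formula. Every number is an invented placeholder, not an output of Ecotect or Green Building Studio.

        # Hypothetical comparison of two wall materials via a degree-day
        # heat-loss model: Q = U * A * HDD * 24 (Wh/year). All figures below
        # are invented placeholders, not Ecotect/Green Building Studio results.

        WALL_AREA_M2 = 200.0      # hypothetical envelope area
        HDD = 300.0               # heating degree-days per year (placeholder)
        TARIFF_PER_KWH = 0.10     # currency units per kWh (placeholder)
        CO2_PER_KWH = 0.5         # kg CO2 per kWh (placeholder grid factor)
        YEARS = 30                # assumed analysis horizon

        materials = {             # name -> U-value in W/(m^2*K), placeholders
            "brick_wall": 1.5,
            "insulated_panel": 0.3,
        }

        for name, u_value in materials.items():
            kwh_per_year = u_value * WALL_AREA_M2 * HDD * 24 / 1000.0
            cost = kwh_per_year * TARIFF_PER_KWH * YEARS
            co2_tonnes = kwh_per_year * CO2_PER_KWH * YEARS / 1000.0
            print(f"{name}: {kwh_per_year:,.0f} kWh/yr, "
                  f"{cost:,.0f} lifecycle energy cost, {co2_tonnes:.1f} t CO2")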

    Backing the horse or the jockey? Due diligence, agency costs, information and the evaluation of risk by business angel investors

    This paper explores the argument that business angel investors are more concerned with managing and minimising agency risk than market risk. Based on data on the due diligence process from a survey of business angels in the UK, the paper concludes that business angels do view entrepreneur characteristics and experience as having the greatest impact on the perceived riskiness of an investment opportunity. Further, they emphasise personal and informal sources of information over formal ones in the due diligence process, and seek information on both the entrepreneur and the venture when determining valuation. Indeed, business angels' reliance on short-term and subjective information to value investment opportunities leads to the conclusion that their approach to valuation is a function not of the conventional protocols of financial analysis but of personal relations and assessment.