
    User oriented access to secure biomedical resources through the grid

    The life science domain is typified by heterogeneous data sets that are evolving at an exponential rate. Numerous post-genomic databases and areas of post-genomic life science research have been established and are being actively explored. Whilst many of these databases are public and freely accessible, researchers often hold data that is not so freely available, and access to this data needs to be strictly controlled when distributed collaborative research is undertaken. Grid technologies provide one mechanism by which access to and integration of federated data sets is possible. Combining such data access and integration technologies with fine-grained security infrastructures facilitates the establishment of virtual organisations (VOs). However, experience has shown that the general research (non-Grid) community is not comfortable with the Grid and its associated security models based upon public key infrastructures (PKIs). The Internet2 Shibboleth technology helps to overcome this: users only have to log in to their home site to gain access to resources across a VO, or in Shibboleth terminology a federation. In this paper we outline how we have applied the combination of Grid technologies, advanced security infrastructures and the Internet2 Shibboleth technology in several biomedical projects to provide a user-oriented model for secure access to and usage of Grid resources. We believe that this model may well become the de facto mechanism for undertaking e-Research on the Grid across numerous domains, including the life sciences.
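
    A minimal sketch of the attribute-based authorisation pattern that a Shibboleth federation enables (not code from the paper; the attribute names follow the standard eduPerson schema used by Shibboleth, while all values and the policy itself are invented for illustration):

```python
# Sketch: the user authenticates once at their home site, which releases
# signed attributes; the Grid resource decides access from those attributes
# rather than from a per-user PKI certificate. All values are hypothetical.

HOME_SITE_ATTRIBUTES = {            # attributes asserted by the home identity provider
    "eduPersonPrincipalName": "researcher@example.ac.uk",
    "eduPersonAffiliation": "staff",
    "projectRole": "clinical-data-reader",
}

RESOURCE_POLICY = {                 # policy held by the Grid data resource
    "required_affiliations": {"staff", "faculty"},
    "required_roles": {"clinical-data-reader"},
}

def authorise(attrs: dict, policy: dict) -> bool:
    """Grant access if the released attributes satisfy the resource policy."""
    return (attrs.get("eduPersonAffiliation") in policy["required_affiliations"]
            and attrs.get("projectRole") in policy["required_roles"])

print(authorise(HOME_SITE_ATTRIBUTES, RESOURCE_POLICY))  # True
```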

    Deep Learning for Genomics: A Concise Overview

    Advancements in genomic research such as high-throughput sequencing techniques have driven modern genomic studies into "big data" disciplines. This data explosion is constantly challenging conventional methods used in genomics. In parallel with the urgent demand for robust algorithms, deep learning has succeeded in a variety of fields such as vision, speech, and text processing. Yet genomics poses unique challenges for deep learning, since we expect deep learning to provide a superhuman intelligence that explores beyond our knowledge to interpret the genome. A powerful deep learning model should rely on insightful utilization of task-specific knowledge. In this paper, we briefly discuss the strengths of different deep learning models from a genomic perspective so as to fit each particular task with a proper deep architecture, and remark on practical considerations of developing modern deep learning architectures for genomics. We also provide a concise review of deep learning applications in various aspects of genomic research, and point out potential opportunities and obstacles for future genomics applications.
    Comment: Invited chapter for the Springer book Handbook of Deep Learning Applications
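
    As one example of fitting a genomic task with a proper deep architecture, a common baseline is a 1D convolutional network over one-hot-encoded DNA, whose learned filters act like position weight matrices for sequence motifs. A minimal sketch (hyperparameters illustrative, not taken from the chapter):

```python
import torch
import torch.nn as nn

class MotifCNN(nn.Module):
    """1D CNN scanning one-hot DNA; filters behave like learned motifs."""
    def __init__(self, n_filters=64, motif_len=19):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(4, n_filters, kernel_size=motif_len),  # 4 channels: A, C, G, T
            nn.ReLU(),
            nn.AdaptiveMaxPool1d(1),   # strongest motif match per filter
            nn.Flatten(),
            nn.Linear(n_filters, 1),   # e.g. probability of TF binding
            nn.Sigmoid(),
        )

    def forward(self, x):              # x: (batch, 4, seq_len), one-hot DNA
        return self.net(x)

model = MotifCNN()
batch = torch.zeros(8, 4, 1000)        # 8 dummy one-hot sequences of length 1000
print(model(batch).shape)              # torch.Size([8, 1])
```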

    AVENTIS - An architecture for event data analysis

    Time-stamped event data is being generated at an exponential rate from various sources (sensor networks, e-markets, etc.) and stored in event logs that are made available to researchers. Despite the data deluge and the evolution of a plethora of tools and technologies, the science behind exploratory analysis and knowledge discovery lags. There are several reasons for this. In conducting event data analysis, researchers typically detect a pattern or trend in the data through computation of time-series measures and apply the computed measures to several mathematical models to glean information from the data. This is a complex and time-consuming process covering a range of activities, from data capture (from a broad array of data sources) to interpretation and dissemination of experimental results, forming a pipeline of activities. Further, data analysis is conducted by domain users, who are typically not IT experts, whereas data-processing tools and applications are largely developed by application developers. End-users not only lack the critical skills to build a structured analysis pipeline, but are also perplexed by the number of different ways available to derive the necessary information. Consequently, this thesis proposes AVENTIS (Architecture for eVENT Data analysIS), a novel framework to guide the design of analytic solutions that facilitate time-series analysis of event data and is tailored to the needs of domain users. The framework comprises three components: a knowledge base, a model-driven analytic methodology, and an accompanying software architecture that provides the necessary technical and operational requirements. Specifically, the research contribution lies first in the ability of the framework to express analysis requirements at a level of abstraction consistent with the domain users and to readily make available the information sought without the users having to build the analysis process themselves. Second, the framework provides an abstract design space in which domain experts can build conceptual models of their experiments as sequences of structured tasks in a technology-neutral manner, and transparently translates these abstract process models into executable implementations. To evaluate the AVENTIS framework, a prototype based on AVENTIS is implemented and tested with case studies taken from the financial research domain.
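
    A minimal sketch of the kind of time-series measure computation such a pipeline automates, here using pandas; the event-log format and values are invented for illustration:

```python
# Turn a raw time-stamped event log into windowed time-series measures
# (events per minute and mean price) of the kind downstream models consume.
import pandas as pd

events = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2024-01-01 09:30:05", "2024-01-01 09:30:42",
        "2024-01-01 09:31:10", "2024-01-01 09:33:59",
    ]),
    "price": [100.0, 100.5, 100.2, 101.0],   # e.g. trades in a financial event log
})

per_minute = (events.set_index("timestamp")
                    .resample("1min")
                    .agg(n_events=("price", "size"),
                         mean_price=("price", "mean")))
print(per_minute)
```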

    Distributional Impacts of a U.S. Greenhouse Gas Policy: A General Equilibrium Analysis of Carbon Pricing

    Abstract and PDF report are also available on the MIT Joint Program on the Science and Policy of Global Change website (http://globalchange.mit.edu/).
    We develop a new model of the U.S. economy, the U.S. Regional Energy Policy (USREP) model, which is resolved for large states and regions of the U.S. and by income class, and apply the model to investigate a $15 per ton CO2-equivalent price on greenhouse gas emissions. Previous estimates of the distributional impacts of carbon pricing have been made outside of the model simulation and have been based on the energy expenditure patterns of households in different regions and of different income levels. By estimating distributional effects within the economic model, we include the effects of changes in capital returns and wages on distribution, and find that these effects are significant and work against the expenditure effects. We find the following. First, while results based only on energy expenditure have shown carbon pricing to be regressive, we find the full distributional effect to be neutral or slightly progressive. This demonstrates the importance of tracing through all economic impacts rather than focusing only on spending-side impacts. Second, the ultimate impact of such a policy on households depends on how allowances, or the revenue raised from auctioning them, is used. Free distribution to firms would be highly regressive, benefiting higher-income households and forcing lower-income households to bear the full cost of the policy and what amounts to a transfer of wealth to higher-income households. Lump-sum distribution through equal-sized household rebates would make lower-income households absolutely better off while shifting the costs to higher-income households. Schemes that would cut taxes are generally slightly regressive but somewhat improve the overall efficiency of the program. Third, proposed legislation would distribute allowances to local distribution companies (electricity and natural gas distributors), and public utility commissions would then determine how the value of those allowances was used. A significant risk in such a plan is that distribution to households might be perceived as lowering utility rates; that perception reduced the efficiency of the policy we examined by 40 percent. Finally, the states on the coasts bear little cost or can even benefit because of the distribution of allowance revenue, while mid-American and southern states bear the highest costs. This regional pattern reflects differences in energy consumption and energy production among states. Use of allowance revenue to cut taxes generally exacerbates these regional differences, because coastal states are also generally higher-income states, and those with higher incomes benefit more from tax cuts.
    This work was supported by the MIT Joint Program on the Science and Policy of Global Change through a combination of government, industry, and foundation funding, by the MIT Energy Initiative, and by additional support from a coalition of industrial sponsors.
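
    A toy arithmetic illustration of the lump-sum rebate finding (all household numbers are invented, not USREP output): because spending on carbon-intensive goods rises less than proportionally with income, an equal per-household rebate leaves lower-income households better off on net:

```python
# Hypothetical two-household economy under a $15/ton CO2 price with
# full lump-sum revenue recycling. Numbers are illustrative only.
CO2_PRICE = 15.0                       # $/ton, as in the policy studied

households = {                         # annual tons of embodied CO2 (hypothetical)
    "low-income":  {"income": 20_000,  "tons": 15},
    "high-income": {"income": 200_000, "tons": 60},
}

total_revenue = CO2_PRICE * sum(h["tons"] for h in households.values())
rebate = total_revenue / len(households)    # equal per-household rebate

for name, h in households.items():
    net_cost = CO2_PRICE * h["tons"] - rebate
    print(f"{name}: net cost ${net_cost:,.0f} "
          f"({100 * net_cost / h['income']:+.2f}% of income)")
# The low-income household comes out ahead; the high-income household
# bears the net cost, i.e. the rebate makes the policy progressive.
```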

    A domain-specific language based approach to component composition, error-detection, and fault prediction

    Current methods of software production are resource-intensive and often require a number of highly skilled professionals. To develop a well-designed and effectively implemented system requires a large investment of resources, often running into millions of pounds. The time required may also prove to be prohibitive. However, many parts of the new systems being developed already exist, either as whole systems or as parts of existing systems. It is therefore attractive to reuse existing code when developing new software, in order to reduce the time and resources required. This thesis proposes the application of a domain-specific language (DSL) to automatic component composition, testing and fault prediction. The DSL is inherently based on a domain model, which should aid users of the system in knowing how the system is structured and what responsibilities the system fulfils. The DSL structure proposed in this thesis uses a type system and grammar, hence enabling the early detection of syntactically incorrect system usage. Each DSL construct's behaviour can also be defined in a testing DSL, described here as DSL-test. This can take the form of input and output parameters, which should suffice for specifying stateless components, or may necessitate the use of a special method call, described here as a White-Box Test (WBT), which allows the external observer to view the abstract state of a component. Each DSL construct can be mapped to its implementing components, i.e. the component, or amalgamation of components, that implements the behaviour prescribed by the DSL construct. User requirements are described using the DSL, and appropriate implementing components (if sufficient exist) are automatically located and integrated. That is to say, given a requirement described in terms of the DSL and sufficient components, the architecture (named Hydra) will generate an executable which should behave as desired. The DSL-construct behaviour description language (DSL-test) is designed in such a way that it can be translated into a computer programming language, so code can be inserted into the system automatically to verify that an implementing component is acting in a way consistent with the model of its expected behaviour. Upon detection of an error, the system examines the available data (i.e. where the error occurred, what sort of error it was, and what the structure of the executable was) to attempt to predict the location of the fault and, where possible, take remedial action. A number of case studies have been investigated, and it was found that, if applied to an appropriate problem domain, the approach proposed in this thesis shows promise in terms of full automation and integration of black-box or grey-box software. However, further work is required before it can be claimed that this approach should be used in real-scale systems.
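
    A minimal sketch of the core composition idea, invented rather than taken from the thesis's Hydra implementation: a DSL construct carries its DSL-test as input/output pairs, and candidate components are checked against it before one is selected automatically:

```python
# Sketch: select a component that conforms to a DSL construct's DSL-test.
# All names (DslConstruct, conforms, the candidate list) are illustrative.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class DslConstruct:
    name: str
    tests: list = field(default_factory=list)   # (input, expected_output) pairs

    def conforms(self, component: Callable) -> bool:
        """Run the DSL-test: does the component behave as specified?"""
        return all(component(inp) == out for inp, out in self.tests)

sort_construct = DslConstruct("sort-ascending",
                              tests=[([3, 1, 2], [1, 2, 3]), ([], [])])

candidates = [reversed, sorted]                 # available implementing components
chosen = next(c for c in candidates if sort_construct.conforms(c))
print(chosen.__name__)                          # 'sorted'
```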

    04451 Abstracts Collection -- Future Generation Grids

    The Dagstuhl Seminar 04451 "Future Generation Grid" was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl, from 1st to 5th November 2004. The focus of the seminar was on open problems and future challenges in the design of next-generation Grid systems. A total of 45 participants presented their current projects, research plans, and new ideas in the area of Grid technologies. Several evening sessions with vivid discussions on future trends complemented the talks. This report gives an overview of the background and the findings of the seminar.

    The Golden Age of Software Architecture: A Comprehensive Survey

    A Pattern-based Approach towards Modular Safety Analysis and Argumentation

    Safety standards recommend (if not dictate) performing many analyses during the concept phase of development, as well as the early adoption of multiple measures at the architectural design level. In practice, the reuse of architectural measures or safety mechanisms is widespread, especially in well-understood domains, as is reuse of the corresponding safety cases, which aim to document and prove the fulfilment of the underlying safety goals. However, safety cases in the automotive domain are not well integrated into architectural models, and as such provide neither comprehensible and reproducible argumentation nor any evidence of argument correctness. The reuse is mostly ad hoc, with loss of knowledge and traceability and a lack of consistency and process maturity being the most widely cited drawbacks.
    Using a simplified description of software functions and their most common error-management subtypes (avoidance, detection, handling, etc.), we propose to define a pattern library covering known solution algorithms and architectural measures/constraints in a seamless, holistic, model-based approach with corresponding tool support. The pattern library would comprise the requirement each pattern covers and the architectural elements/measures/constraints required, and may include deployment or scheduling strategies as well as a supporting safety-case template, which would then be integrated into existing development environments. This paper explores this approach using an illustrative example.
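
    A minimal sketch of what one entry in such a pattern library might record (the data structure, field names, and example pattern are illustrative, not the paper's tooling):

```python
# Sketch: a safety pattern links the requirement it covers, its
# error-management subtype, the architectural measures it introduces,
# and a reusable safety-case argument template. All content is hypothetical.
from dataclasses import dataclass

@dataclass
class SafetyPattern:
    name: str
    error_management: str          # "avoidance" | "detection" | "handling" | ...
    covered_requirement: str
    architectural_measures: list
    safety_case_template: str

library = [
    SafetyPattern(
        name="HeartbeatMonitor",
        error_management="detection",
        covered_requirement="Loss of the control function shall be detected within 100 ms",
        architectural_measures=["watchdog component", "heartbeat channel"],
        safety_case_template=("Goal: {req}. Strategy: argue over heartbeat coverage. "
                              "Evidence: watchdog timing analysis."),
    ),
]

detection_patterns = [p for p in library if p.error_management == "detection"]
print(detection_patterns[0].name)   # HeartbeatMonitor
```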