
    Underground CO2 storage: demonstrating regulatory conformance by convergence of history-matched modeled and observed CO2 plume behavior using Sleipner time-lapse seismics

    One of the three key regulatory requirements in Europe for transfer of storage site liability is to demonstrate conformity between predictive models of reservoir performance and monitoring observations. This is a challenging requirement because a perfect and unique match between observed and modeled behavior is near impossible to achieve. This study takes the time-lapse seismic monitoring data from the Sleipner storage operation to demonstrate that, as more seismic data become available with time, predictive models can be matched more accurately to observations and become more reliable predictors of future performance. Six simple performance measures were defined: plume footprint area, maximum lateral migration distance of CO2 from the injection point, area of the CO2 accumulation trapped at top reservoir, volume of the CO2 accumulation trapped at top reservoir, summed area of all CO2 layers, and spreading coefficient. Model scenarios were developed to predict plume migration up to 2008. Scenarios were developed for 1996 (baseline), 2001, and 2006 conditions, with models constrained by the information available at those times, and compared with monitoring datasets obtained up to 2008. The 1996 predictive range generally encompassed the future observed plume behavior, but with such a wide range of uncertainty as to render it of only marginal practical use. The 2001 predictions (which used the 1999 and 2001 seismic monitoring datasets) had a much lower uncertainty range, and the 2006 uncertainties were somewhat lower again. There are still deficiencies in the quality of the match, but a robust convergence, with time, of predicted and observed plume behavior is clearly demonstrated. We propose modeling-monitoring convergence as a generic approach to demonstrating conformance.
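    To make the performance measures concrete, here is a minimal sketch (not taken from the paper) of how two of them, plume footprint area and maximum lateral migration distance from the injection point, could be computed from a binary plume mask picked from a time-lapse seismic survey; the grid spacing, injection-point location, and synthetic mask are assumptions for illustration only.

    ```python
    import numpy as np

    # Illustrative sketch (not from the paper): two of the six plume performance
    # measures, computed from a binary plume mask on a regular map-view grid.
    # Grid spacing, injection-point indices and the mask itself are hypothetical.

    def plume_footprint_area(mask: np.ndarray, dx: float, dy: float) -> float:
        """Total map-view area (m^2) of cells where CO2 is imaged."""
        return mask.sum() * dx * dy

    def max_lateral_migration(mask: np.ndarray, dx: float, dy: float,
                              inj_ij: tuple[int, int]) -> float:
        """Maximum horizontal distance (m) from the injection point to any plume cell."""
        ii, jj = np.nonzero(mask)
        if ii.size == 0:
            return 0.0
        d = np.hypot((ii - inj_ij[0]) * dy, (jj - inj_ij[1]) * dx)
        return float(d.max())

    # Hypothetical example: 200 x 200 grid, 50 m cells, plume imaged as a disc.
    dx = dy = 50.0
    yy, xx = np.mgrid[0:200, 0:200]
    mask = (xx - 100) ** 2 + (yy - 100) ** 2 < 30 ** 2
    print(plume_footprint_area(mask, dx, dy) / 1e6, "km^2")
    print(max_lateral_migration(mask, dx, dy, (100, 100)), "m")
    ```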

    Short cavity InGaAsP/InP lasers with dielectric mirrors

    Short cavity length (38 µm) lasers have been fabricated using a recently developed microcleavage technique. SiO2-amorphous Si multilayer coatings have been evaporated on the lasers to obtain high-reflectivity mirrors. The lasers have threshold currents as low as 3.8 mA with an 85%-reflecting front mirror and a high-reflectivity rear mirror, and 2.9 mA with two high-reflectivity mirrors. Single longitudinal mode operation is observed over a wide range of driving currents and temperatures.
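    The benefit of high-reflectivity mirrors on such a short cavity can be seen from the standard Fabry-Perot threshold-gain relation g_th = alpha_i + (1/(2L)) ln(1/(R1 R2)). The sketch below is a back-of-the-envelope illustration, not the authors' analysis; the internal-loss value and the 95% figure used for the "high reflectivity" mirrors are assumed placeholders.

    ```python
    import math

    # Back-of-the-envelope sketch (not from the paper): the standard Fabry-Perot
    # threshold-gain relation g_th = alpha_i + (1 / (2L)) * ln(1 / (R1 * R2)),
    # showing why a 38 um cavity needs high-reflectivity dielectric mirrors.
    # The internal loss and the 95% "high reflectivity" value are assumptions.

    def mirror_loss(L_cm: float, R1: float, R2: float) -> float:
        """Distributed mirror loss (cm^-1) of a Fabry-Perot cavity of length L."""
        return math.log(1.0 / (R1 * R2)) / (2.0 * L_cm)

    L = 38e-4                      # 38 um cavity length, in cm
    alpha_i = 20.0                 # assumed internal loss, cm^-1

    for R1, R2, label in [(0.31, 0.31, "as-cleaved facets"),
                          (0.85, 0.95, "85% front / HR rear"),
                          (0.95, 0.95, "two HR mirrors")]:
        g_th = alpha_i + mirror_loss(L, R1, R2)
        print(f"{label:>22s}: threshold gain ~ {g_th:6.0f} cm^-1")
    ```

    With as-cleaved facets the mirror loss of a 38 µm cavity is several hundred cm^-1, while the dielectric mirror combinations bring the required threshold gain down by roughly an order of magnitude, consistent with the low threshold currents reported.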

    How Unsplittable-Flow-Covering helps Scheduling with Job-Dependent Cost Functions

    Generalizing many well-known and natural scheduling problems, scheduling with job-specific cost functions has gained a lot of attention recently. In this setting, each job incurs a cost depending on its completion time, given by a private cost function, and one seeks to schedule the jobs so as to minimize the total sum of these costs. The framework captures many important scheduling objectives such as weighted flow time or weighted tardiness. Still, the general case as well as the mentioned special cases are far from being well understood, even on a single machine. Aiming for a better general understanding of this problem, in this paper we focus on the case of uniform job release dates on one machine, for which the state of the art is a 4-approximation algorithm. This is true even for a special case that is equivalent to the covering version of the well-studied and prominent unsplittable flow on a path problem, which is interesting in its own right. For that covering problem, we present a quasi-polynomial time (1+ϵ)-approximation algorithm that yields an (e+ϵ)-approximation for the above scheduling problem. Moreover, for the latter we devise the best possible resource augmentation result regarding speed: a polynomial time algorithm which computes a solution with optimal cost at 1+ϵ speedup. Finally, we present an elegant QPTAS for the special case where the cost functions of the jobs fall into at most log n many classes. This algorithm even allows the jobs to have up to log n many distinct release dates. Comment: 2 pages, 1 figure
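    For readers unfamiliar with the objective, the sketch below spells out the cost Σ_j f_j(C_j) for one machine with uniform release dates, evaluated by brute force over job orderings; the processing times and cost functions are invented examples, and this is only a baseline illustration of the problem, not the approximation algorithms from the paper.

    ```python
    from itertools import permutations
    from typing import Callable, Sequence

    # Tiny illustrative sketch (not the paper's algorithm): the objective for
    # one-machine scheduling with job-specific cost functions f_j of the
    # completion time C_j, evaluated by brute force over all orderings.
    # Processing times and cost functions below are made-up examples.

    def schedule_cost(order: Sequence[int], p: Sequence[int],
                      f: Sequence[Callable[[int], float]]) -> float:
        """Sum of f_j(C_j) when jobs run back-to-back in the given order."""
        t, total = 0, 0.0
        for j in order:
            t += p[j]          # completion time of job j
            total += f[j](t)
        return total

    p = [3, 1, 4]                                   # processing times
    f = [lambda C: 2 * C,                           # weighted completion time
         lambda C: max(0, C - 2),                   # tardiness with due date 2
         lambda C: C ** 2 / 10]                     # convex cost

    best = min(permutations(range(len(p))), key=lambda o: schedule_cost(o, p, f))
    print("best order:", best, "cost:", schedule_cost(best, p, f))
    ```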

    SPEDEN: Reconstructing single particles from their diffraction patterns

    SPEDEN is a computer program that reconstructs the electron density of single particles from their X-ray diffraction patterns, using a single-particle adaptation of the Holographic Method in crystallography (Szoke, A., Szoke, H., and Somoza, J.R., 1997. Acta Cryst. A53, 291-313). The method, like its parent, is unique in that it does not rely on "back"-transformation from the diffraction pattern into real space, nor on interpolation within measured data. It is designed to deal successfully with sparse, irregular, incomplete and noisy data. It is also designed to use prior information to ensure sensible results and reliable convergence. This article describes the theoretical basis for the reconstruction algorithm, its implementation, and quantitative results of tests on synthetic and experimentally obtained data. The program could be used for determining the structure of radiation-tolerant samples and, eventually, of large biological molecular structures without the need for crystallization. Comment: 12 pages, 10 figures
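    The sketch below illustrates, in one dimension, the general idea of fitting a density model directly to intensities sampled at irregular q-points instead of back-transforming them onto a grid. It uses a NumPy/SciPy least-squares misfit with a positivity bound as a crude stand-in for the holographic-method cost and priors; all data are synthetic, so it is conceptual only and not the SPEDEN algorithm.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Conceptual sketch only (not SPEDEN): fit a 1-D "electron density" directly
    # to diffraction intensities sampled at irregular q-points, rather than
    # back-transforming the data onto a grid. Synthetic data; the least-squares
    # misfit with a positivity bound merely stands in for the real cost/priors.

    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 1.0, 48)                  # real-space sample points
    q = np.sort(rng.uniform(-40.0, 40.0, 120))     # irregular, incomplete q sampling
    A = np.exp(-1j * np.outer(q, x))               # direct Fourier operator, no FFT grid

    rho_true = np.exp(-((x - 0.4) ** 2) / 0.005)   # synthetic "particle"
    I_obs = np.abs(A @ rho_true) ** 2              # measured intensities

    def misfit(rho):
        F = A @ rho
        r = np.abs(F) ** 2 - I_obs
        grad = 4.0 * np.real(A.conj().T @ (r * F))  # analytic gradient of sum(r**2)
        return float(np.sum(r ** 2)), grad

    x0 = np.full_like(x, rho_true.mean())          # flat starting guess
    res = minimize(misfit, x0, jac=True, bounds=[(0.0, None)] * x.size,
                   method="L-BFGS-B")
    print("final misfit:", res.fun)
    print("relative error:",
          np.linalg.norm(res.x - rho_true) / np.linalg.norm(rho_true))
    ```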

    Utilising semantic technologies for intelligent indexing and retrieval of digital images

    The proliferation of digital media has led to a huge interest in classifying and indexing media objects for generic search and usage. In particular, we are witnessing colossal growth in digital image repositories that are difficult to navigate using free-text search mechanisms, which often return inaccurate matches because they rely, in principle, on statistical analysis of query keyword recurrence in the image annotation or surrounding text. In this paper we present a semantically-enabled image annotation and retrieval engine that is designed to satisfy the requirements of the commercial image collections market in terms of both accuracy and efficiency of the retrieval process. Our search engine relies on methodically structured ontologies for image annotation, thus allowing for more intelligent reasoning about the image content and subsequently obtaining a more accurate set of results and a richer set of alternatives matching the original query. We also show how our well-analysed and designed domain ontology contributes to the implicit expansion of user queries, as well as the exploitation of lexical databases for explicit semantic-based query expansion.
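    As a minimal illustration of ontology-driven query expansion (not the paper's engine or ontology), the sketch below expands a query term with synonyms and transitive subclasses from a toy ontology before matching against image annotations; the ontology, image index, and tags are invented.

    ```python
    # Minimal illustration (not the paper's engine): ontology-driven query
    # expansion over image annotations. The toy "ontology" (subclass and synonym
    # links) and the image index are invented for the example.

    ONTOLOGY = {
        "vehicle": {"subclasses": ["car", "bicycle", "bus"], "synonyms": ["transport"]},
        "car":     {"subclasses": ["taxi"],                  "synonyms": ["automobile"]},
    }

    IMAGE_INDEX = {
        "img_001.jpg": {"taxi", "street", "night"},
        "img_002.jpg": {"bicycle", "park"},
        "img_003.jpg": {"mountain", "lake"},
    }

    def expand(term: str) -> set[str]:
        """Expand a query term with its synonyms and all transitive subclasses."""
        terms, frontier = {term}, [term]
        while frontier:
            entry = ONTOLOGY.get(frontier.pop(), {})
            for new in entry.get("synonyms", []) + entry.get("subclasses", []):
                if new not in terms:
                    terms.add(new)
                    frontier.append(new)
        return terms

    def search(query: str) -> list[str]:
        """Return images whose annotations intersect the expanded query terms."""
        expanded = set().union(*(expand(t) for t in query.lower().split()))
        return [img for img, tags in IMAGE_INDEX.items() if tags & expanded]

    print(search("vehicle"))    # matches the taxi and bicycle images via the ontology
    ```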

    The Influence of Financial Literacy, Self-Control and Parents' Socio-Economic Status on Students' Consumptive Behavior

    This study was conducted to examine the effect of financial literacy, self-control, and parents' socio-economic status on the consumptive behavior of Korean Pop (K-Pop) fan students in Jayapura City. The population of this study comprised 207 students in Jayapura City who are fans of K-Pop music and follow the Army_Jayapura Instagram account. Based on purposive sampling, a sample of 104 people was obtained. This study uses descriptive quantitative methods, with data collected through questionnaires distributed via Google Forms. The analysis technique used is multiple linear regression. The results of the study show that: (1) financial literacy has a significant effect on consumptive behavior; (2) self-control has a significant effect on consumptive behavior; (3) parents' socio-economic status has no effect on consumptive behavior; and (4) financial literacy, self-control, and parents' socio-economic status simultaneously have a significant effect on the consumptive behavior of Korean Pop (K-Pop) music fan students in Jayapura City.
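    The kind of analysis reported, multiple linear regression with significance tests for each predictor and for the model as a whole, can be sketched as follows; the data here are simulated rather than the authors' questionnaire responses, and the statsmodels call is just one common way to run such a regression.

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Illustrative sketch only: multiple linear regression of the form
    # consumptive behavior ~ financial literacy + self-control + parental SES.
    # The data are simulated (n = 104), not the study's questionnaire responses.

    rng = np.random.default_rng(1)
    n = 104
    financial_literacy = rng.normal(size=n)
    self_control = rng.normal(size=n)
    parent_ses = rng.normal(size=n)

    # Simulated outcome: literacy and self-control matter, parental SES does not,
    # mirroring the reported pattern purely for demonstration.
    consumptive = (-0.4 * financial_literacy - 0.5 * self_control
                   + 0.0 * parent_ses + rng.normal(scale=0.8, size=n))

    X = sm.add_constant(np.column_stack([financial_literacy, self_control, parent_ses]))
    model = sm.OLS(consumptive, X).fit()
    print(model.summary(xname=["const", "fin_literacy", "self_control", "parent_ses"]))
    ```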

    The devices, experimental scaffolds, and biomaterials ontology (DEB): a tool for mapping, annotation, and analysis of biomaterials' data

    The size and complexity of the biomaterials literature make systematic data analysis an arduous manual task. A practical solution is creating databases and information resources. Implant design and biomaterials research can greatly benefit from an open database for systematic data retrieval. Ontologies are pivotal to knowledge base creation, serving to represent and organize domain knowledge. To name but two examples, GO (the Gene Ontology) and ChEBI (Chemical Entities of Biological Interest), together with their associated databases, are central resources for their respective research communities. The creation of the devices, experimental scaffolds, and biomaterials ontology (DEB), an open resource for organizing information about biomaterials, their design, manufacture, and biological testing, is described. It is developed using text analysis to identify ontology terms from a biomaterials gold-standard corpus, systematically curated to represent the domain's lexicon. The topics covered are validated by members of the biomaterials research community. The ontology may be used for term searching, annotation for machine-learning applications, standardized metadata indexing, and other cross-disciplinary data exploitation. The input of the biomaterials community to this effort to create data-driven open-access research tools is encouraged and welcomed. Preprint
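    As a toy illustration of the corpus-driven term identification step (not the DEB pipeline itself), the sketch below ranks recurring unigrams and bigrams from a few invented abstract-like sentences; the corpus and stop-word list are made up for the example.

    ```python
    from collections import Counter
    import re

    # Toy sketch only (not the DEB pipeline): pull candidate ontology terms out
    # of a corpus by counting recurring unigrams and bigrams after stop-word
    # removal. The "abstracts" below are invented stand-ins for the biomaterials
    # gold-standard corpus.

    CORPUS = [
        "Porous titanium scaffolds were seeded with mesenchymal stem cells.",
        "Collagen scaffolds improved cell adhesion compared to titanium scaffolds.",
        "Mesenchymal stem cells attached to the collagen coated implant surface.",
    ]
    STOP = {"were", "with", "the", "to", "of", "and", "compared", "improved"}

    def candidate_terms(texts, top=8):
        """Rank unigram and bigram candidates by frequency across the corpus."""
        counts = Counter()
        for text in texts:
            tokens = [t for t in re.findall(r"[a-z]+", text.lower()) if t not in STOP]
            counts.update(tokens)
            counts.update(" ".join(pair) for pair in zip(tokens, tokens[1:]))
        return counts.most_common(top)

    for term, freq in candidate_terms(CORPUS):
        print(f"{freq:2d}  {term}")
    ```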