Chemical investigation of a biologically active schinus molle L. leaf extract
The pepper tree Schinus molle L. is an evergreen ornamental plant belonging to the Anacardiaceae family, native to South America and widespread throughout the world. It has biological activities and is used in folk medicine. This paper aims to contribute to a deeper knowledge of its chemical composition and biological properties. S. molle leaf extracts were obtained by sequential extraction with solvents of different polarities and subsequently tested on the HL-60 human leukaemia cell line to assess possible cytotoxic activity. Among the investigated extracts, the petroleum ether extract showed high cytotoxic activity, and its chemical composition was further investigated. By silica column chromatography, eight fractions were obtained, and their compositions were determined by GC-MS analysis. Compounds and relative abundances differed widely among the fractions; sesquiterpenes were the main components, with alcoholic sesquiterpenes the most abundant.
Variability Flagging in the Wide-field Infrared Survey Explorer Preliminary Data Release
The Wide-field Infrared Survey Explorer Preliminary Data Release Source Catalog contains over 257 million objects. We describe the method used to flag variable source candidates in the Catalog. Using a method based on the chi-square of single-exposure flux measurements, we generated a variability flag for each object, and have identified almost 460,000 candidate sources that exhibit significant flux variability with greater than ~7σ confidence. We discuss the flagging method in detail and describe its benefits and limitations. We also present results from the flagging method, including example light curves for several types of variable sources: Algol-type eclipsing binaries, RR Lyrae stars, W UMa contact binaries, and a blazar candidate.
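The core idea of chi-square variability flagging can be sketched as follows. This is an illustrative reconstruction, not the WISE pipeline code: the function names and the reduced chi-square threshold are assumptions, and the real pipeline converts chi-square to a sigma-equivalent confidence rather than using a fixed cut.

```python
def variability_chi2(fluxes, errors):
    """Chi-square of single-exposure fluxes about their inverse-variance
    weighted mean, plus degrees of freedom."""
    weights = [1.0 / e ** 2 for e in errors]
    mean = sum(w * f for w, f in zip(weights, fluxes)) / sum(weights)
    chi2 = sum((f - mean) ** 2 / e ** 2 for f, e in zip(fluxes, errors))
    return chi2, len(fluxes) - 1

def is_variable(fluxes, errors, threshold=3.0):
    """Flag a source whose reduced chi-square exceeds a chosen threshold
    (threshold is an illustrative assumption, not the WISE cut)."""
    chi2, dof = variability_chi2(fluxes, errors)
    return chi2 / dof > threshold
```

A source whose fluxes scatter consistently with their quoted errors yields a reduced chi-square near one and is not flagged; genuine variability inflates the statistic far beyond the threshold.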
Using PVS to support the analysis of distributed cognition systems
The rigorous analysis of socio-technical systems is challenging, because people are inherent parts of the system, together with devices and artefacts. In this paper, we report on the use of PVS as a way of analysing such systems in terms of distributed cognition. Distributed cognition is a conceptual framework that allows us to derive insights about plausible user trajectories in socio-technical systems by exploring what information in the environment provides resources for user action, but its application has traditionally required substantial craft skill. DiCoT adds structure and method to the analysis of socio-technical systems from a distributed cognition perspective. In this work, we demonstrate how PVS can be used with DiCoT to conduct a systematic analysis. We illustrate how a relatively simple use of PVS can help a field researcher to (i) externalise assumptions and facts, (ii) verify the consistency of the logical argument framed in the descriptions, (iii) help uncover latent situations that may warrant further investigation, and (iv) verify conjectures about potential hazards linked to the observed use of information resources. Evidence is also provided that formal methods and empirical studies are not alternative approaches for studying a socio-technical system, but that they can complement and refine each other. The combined use of PVS and DiCoT is illustrated through a case study concerning a real-world emergency medical dispatch system.
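The idea of externalising field observations as logical constraints and checking them for consistency can be illustrated with a toy analogue. PVS does this with a full proof assistant; the sketch below brute-forces truth assignments instead, and the propositions and constraints are invented for illustration, not taken from the dispatch case study.

```python
from itertools import product

# Hypothetical propositions a field researcher might record about the
# dispatch setting (illustrative assumptions, not the paper's model).
PROPS = ["call_active", "address_known", "ambulance_dispatched"]

CONSTRAINTS = [
    # Dispatch requires a known address.
    lambda v: (not v["ambulance_dispatched"]) or v["address_known"],
    # An address is only gathered during an active call.
    lambda v: (not v["address_known"]) or v["call_active"],
]

def consistent(constraints, props):
    """Return True if some truth assignment satisfies every constraint,
    i.e. the externalised facts and assumptions do not contradict."""
    for values in product([False, True], repeat=len(props)):
        v = dict(zip(props, values))
        if all(c(v) for c in constraints):
            return True
    return False
```

Adding an observation that contradicts the recorded assumptions (e.g. a dispatch without a known address) makes the set unsatisfiable, which is the kind of latent inconsistency the PVS analysis surfaces.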
Balancing the formal and the informal in user-centred design
This paper explores the role of formal methods as part of the user-centred design of interactive systems. An iterative process is described in which prototypes are developed incrementally: user-centred requirements are proved while, at the same time, the prototypes, which are executable forms of the developed models, are evaluated using ‘traditional’ techniques for user evaluation. Formal analysis thus complements user evaluation. This approach enriches user-centred design, which typically focuses on understanding context and producing sketch designs. These sketches are often non-functional (e.g. paper) prototypes that provide a means of exploring candidate design possibilities using techniques such as cooperative evaluation. This paper describes a further step in the process using formal analysis techniques. The use of formal methods provides a systematic approach to checking plausibility and consistency during early design stages, while at the same time enabling the generation of executable prototypes. The technique is illustrated through an example based on a pill dispenser.

This work is financed by National Funds through the Portuguese funding agency, FCT -- Fundação para a Ciência e a Tecnologia, within project UIDB/50014/2020.
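The pairing of an executable prototype with a formal check can be sketched with a hypothetical pill-dispenser state machine. The states, actions, and the property below are illustrative assumptions, not the models from the paper, and the exhaustive enumeration stands in for the proof work a formal tool would do.

```python
# Hypothetical pill-dispenser model: the same transition table serves as
# an executable prototype and as the object of analysis.
TRANSITIONS = {
    ("idle", "select_dose"): "dose_chosen",
    ("dose_chosen", "confirm"): "dispensing",
    ("dose_chosen", "cancel"): "idle",
    ("dispensing", "done"): "idle",
}

def reachable(start="idle"):
    """Exhaustively enumerate reachable states (the 'formal' part)."""
    seen, frontier = {start}, [start]
    while frontier:
        state = frontier.pop()
        for (src, _), dst in TRANSITIONS.items():
            if src == state and dst not in seen:
                seen.add(dst)
                frontier.append(dst)
    return seen

def confirm_before_dispense():
    """Illustrative user-centred requirement: the dispensing state is
    only ever entered via an explicit 'confirm' action."""
    return all(action == "confirm"
               for (_, action), dst in TRANSITIONS.items()
               if dst == "dispensing")
```

Because the model is executable, the same table can drive an interactive prototype for cooperative evaluation while the checks above are run over every state, which is the complementarity the paper argues for.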
Developing and Verifying User Interface Requirements for Infusion Pumps: A Refinement Approach
It is common practice, when describing criteria for the acceptable safety of systems, for the regulator to set out safety requirements that the system should satisfy. These requirements are typically described precisely but in natural language, and it is often unclear how the regulator can be assured that the given requirements are satisfied. This paper is concerned with a rigorous refinement process that demonstrates that a precise requirement is satisfied by the specification of a given device. It focuses on a particular class of requirements that relate to the user interface of the device. For user interface requirements, refinement is made more complex by the fact that systems can use different interaction devices that have very different characteristics. The described refinement process recognises an input/output hierarchy.
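The shape of such a refinement check can be sketched as follows: an abstract safety requirement is verified against a concrete device model through an abstraction function. Both models, the requirement, and all names are illustrative assumptions, not the paper's specifications.

```python
# Concrete device states of a hypothetical infusion pump model.
CONCRETE_STATES = [
    {"door": "closed", "motor": "on"},
    {"door": "closed", "motor": "off"},
    {"door": "open", "motor": "off"},
]

def abstract(state):
    """Map a concrete state into the abstract vocabulary in which the
    regulator's requirement is phrased."""
    return {"infusing": state["motor"] == "on",
            "door_open": state["door"] == "open"}

def requirement(abs_state):
    """Illustrative abstract requirement: never infusing with the
    pump door open."""
    return not (abs_state["infusing"] and abs_state["door_open"])

def refines(states):
    """The concrete model satisfies the requirement if every reachable
    concrete state maps to an abstract state that satisfies it."""
    return all(requirement(abstract(s)) for s in states)
```

A real refinement proof also relates the transitions of the two models, not just their states; this sketch shows only the state-mapping half of the argument.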
Modelling and systematic analysis of interactive systems
Two aspects of our research concern the application of formal methods in human-computer interaction. The first is the modelling and analysis of interactive devices, with a particular emphasis on the user-device dyad. The second is the modelling and analysis of ubiquitous systems where there are many users (one might say crowds of users). The common thread of both is to articulate and prove properties of interactive systems and to explore interactive behaviour as it influences the user, with a particular emphasis on interaction failure. The goal is to develop systematic techniques that can be packaged in such a way that they can be used effectively by developers. This “whitepaper” briefly describes the two approaches and their potential value, as well as their limitations and development opportunities.
An improved ontological representation of dendritic cells as a paradigm for all cell types
The Cell Ontology (CL) is designed to provide a standardized representation of cell types for data annotation. Currently, the CL employs multiple is_a relations, defining cell types in terms of histological, functional, and lineage properties, and the majority of definitions are written with sufficient generality to hold across multiple species. This approach limits the CL’s utility for cross-species data integration. To address this problem, we developed a method for the ontological representation of cells and applied this method to develop a dendritic cell ontology (DC-CL). DC-CL subtypes are delineated on the basis of surface protein expression, systematically including both species-general and species-specific types and optimizing DC-CL for the analysis of flow cytometry data. This approach brings benefits in the form of increased accuracy, support for reasoning, and interoperability with other ontology resources.
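Delineating subtypes by surface-protein expression, as DC-CL does, amounts to matching a measured marker pattern against defining profiles. The sketch below is a toy illustration in that spirit; the two profiles are simplified placeholders, not DC-CL definitions (real definitions involve many more markers and species qualifiers).

```python
# Simplified marker-based subtype definitions (illustrative only).
SUBTYPE_PROFILES = {
    "plasmacytoid DC": {"CD123": True, "CD11c": False},
    "myeloid DC": {"CD123": False, "CD11c": True},
}

def classify(cell_markers):
    """Return every subtype whose defining marker profile matches the
    measured surface-protein expression of a cell."""
    return [name for name, profile in SUBTYPE_PROFILES.items()
            if all(cell_markers.get(m) == v for m, v in profile.items())]
```

Because the definitions are stated as explicit marker conditions, a reasoner (or flow cytometry gating software) can apply them mechanically, which is the interoperability benefit the abstract describes.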
104. Barry Smith, “Toward a Realistic Science of Environments”, Ecological Psychology, 2009, 21 (2), April-June, 121-130.
Abstract: The perceptual psychologist J. J. Gibson embraces a radically externalistic view of mind and action. We have, for Gibson, not a Cartesian mind or soul, with its interior theater of contents and the consequent problem of explaining how this mind or soul and its psychological environment can succeed in grasping physical objects external to itself. Rather, we have a perceiving, acting organism, whose perceptions and actions are always already tuned to the parts and moments, the things and surfaces, of its external environment. We describe how on this basis Gibson sought to develop a realist science of environments which will be ‘consistent with physics, mechanics, optics, acoustics, and chemistry’
The Calibration of the WISE W1 and W2 Tully-Fisher Relation
In order to explore local large-scale structures and velocity fields, accurate galaxy distance measures are needed. We now extend the well-tested recipe for calibrating the correlation between galaxy rotation rates and luminosities -- capable of providing such distance measures -- to the all-sky, space-based imaging data from the Wide-field Infrared Survey Explorer (WISE) W1 (3.4 μm) and W2 (4.6 μm) filters. We find a linewidth to absolute magnitude correlation (known as the Tully-Fisher Relation, TFR) with 0.54 magnitudes rms scatter in W1 and 0.56 magnitudes rms in W2, from 310 galaxies in 13 clusters. We update the I-band TFR using a sample 9% larger than in Tully & Courtois (2012), deriving a relation with 0.46 magnitudes rms scatter. The WISE TFRs show evidence of curvature; quadratic fits give 0.52 magnitudes rms in W1 and 0.55 magnitudes rms in W2. We apply an I-band -- WISE color correction to lower the scatter, deriving relations with 0.46 magnitudes rms in both bands. Using our three independent TFRs (W1 curved, W2 curved and I-band), we calibrate the UNION2 supernova Type Ia sample distance scale and derive the Hubble constant, with quoted statistical and systematic uncertainties in km/s/Mpc, to 4% total error.

Comment: 22 pages, 21 figures, accepted to ApJ, Table 1 data at http://spartan.srl.caltech.edu/~neill/tfwisecal/table1.tx
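How a calibrated TFR yields a distance can be sketched briefly: the HI linewidth predicts an absolute magnitude, and the distance modulus then follows from the apparent magnitude. The slope and zero point below are placeholder assumptions, not the calibration derived in the paper.

```python
# Illustrative (NOT the paper's) linear W1 TFR calibration.
SLOPE, ZERO_POINT = -9.5, -20.0

def absolute_magnitude(log_linewidth):
    """TFR prediction: M = zero_point + slope * (log W - 2.5),
    with the pivot at log W = 2.5 as a common convention."""
    return ZERO_POINT + SLOPE * (log_linewidth - 2.5)

def distance_mpc(apparent_mag, log_linewidth):
    """Distance modulus mu = m - M, then d = 10**((mu + 5) / 5) pc,
    converted to Mpc."""
    mu = apparent_mag - absolute_magnitude(log_linewidth)
    return 10 ** ((mu + 5.0) / 5.0) / 1.0e6
```

With many such distances in hand, peculiar velocities follow by subtracting the Hubble-flow expectation, which is the use case motivating the calibration.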