
    Building an Argument for the Use of Science Fiction in HCI Education

    Science fiction literature, comics, cartoons and, in particular, audio-visual materials such as science fiction movies and shows can be a valuable addition to Human-Computer Interaction (HCI) education. In this paper, we present an overview of research related to future directions in HCI education, distinct crossings of science fiction in HCI and Computer Science teaching, and the Framework for 21st Century Learning. Next, we provide examples where science fiction can add to the future of HCI education. In particular, we argue herein, first, that science fiction, as a tangible and intangible cultural artifact, can serve as a trigger for creativity and innovation and thus support us in exploring the design space. Second, science fiction, as a means to analyze yet-to-come HCI technologies, can assist us in developing an open-minded and reflective dialogue about technological futures, thus creating a singular base for critical thinking and problem solving. Provided that one is cognizant of its potential and limitations, we reason that science fiction can be a meaningful extension of selected aspects of HCI curricula and research. Comment: 6 pages, 1 table, IHSI 2019 accepted submission

    Dynamic Set Intersection

    Consider the problem of maintaining a family $F$ of dynamic sets subject to insertions, deletions, and set-intersection reporting queries: given $S,S'\in F$, report every member of $S\cap S'$ in any order. We show that in the word RAM model, where $w$ is the word size, given a cap $d$ on the maximum size of any set, we can support set intersection queries in $O(\frac{d}{w/\log^2 w})$ expected time, and updates in $O(\log w)$ expected time. Using this algorithm we can list all $t$ triangles of a graph $G=(V,E)$ in $O(m+\frac{m\alpha}{w/\log^2 w}+t)$ expected time, where $m=|E|$ and $\alpha$ is the arboricity of $G$. This improves a 30-year-old triangle enumeration algorithm of Chiba and Nishizeki running in $O(m\alpha)$ time. We provide an incremental data structure on $F$ that supports intersection \emph{witness} queries, where we only need to find \emph{one} $e\in S\cap S'$. Both queries and insertions take $O(\sqrt{\frac{N}{w/\log^2 w}})$ expected time, where $N=\sum_{S\in F}|S|$. Finally, we provide time/space tradeoffs for the fully dynamic set intersection reporting problem. Using $M$ words of space, each update costs $O(\sqrt{M\log N})$ expected time, each reporting query costs $O(\frac{N\sqrt{\log N}}{\sqrt{M}}\sqrt{op+1})$ expected time, where $op$ is the size of the output, and each witness query costs $O(\frac{N\sqrt{\log N}}{\sqrt{M}}+\log N)$ expected time. Comment: Accepted to WADS 201
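    The operations this abstract defines are easy to state even though the paper's word-RAM speedups are not. Below is a minimal plain-Python baseline sketch of a dynamic set family supporting insertions, deletions, intersection reporting, and witness queries; it only illustrates the interface, not the paper's packed word-RAM data structure, and all class and method names are hypothetical.

```python
# Baseline sketch of the operations the abstract defines: maintain a family F
# of sets under insertions/deletions and answer intersection reporting and
# witness queries. Plain hash-set implementation, NOT the paper's word-RAM
# structure; names are illustrative only.

class DynamicSetFamily:
    def __init__(self):
        self.family = {}            # set id -> Python set of elements

    def insert(self, sid, element):
        self.family.setdefault(sid, set()).add(element)

    def delete(self, sid, element):
        self.family.get(sid, set()).discard(element)

    def report_intersection(self, sid_a, sid_b):
        """Report every member of S_a ∩ S_b (any order); O(min(|S_a|, |S_b|)) here."""
        a = self.family.get(sid_a, set())
        b = self.family.get(sid_b, set())
        small, large = (a, b) if len(a) <= len(b) else (b, a)
        return [e for e in small if e in large]

    def witness(self, sid_a, sid_b):
        """Return one element of S_a ∩ S_b, or None if the sets are disjoint."""
        for e in self.report_intersection(sid_a, sid_b):
            return e
        return None

# usage
F = DynamicSetFamily()
F.insert("S", 1); F.insert("S", 2); F.insert("T", 2); F.insert("T", 3)
print(F.report_intersection("S", "T"))   # [2]
print(F.witness("S", "T"))               # 2
```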

    Biodiversity Loss and the Taxonomic Bottleneck: Emerging Biodiversity Science

    Human domination of the Earth has resulted in dramatic changes to global and local patterns of biodiversity. Biodiversity is critical to human sustainability because it drives the ecosystem services that provide the core of our life-support system. As we, the human species, are the primary factor leading to the decline in biodiversity, we need detailed information about the biodiversity and species composition of specific locations in order to understand how different species contribute to ecosystem services and how humans can sustainably conserve and manage biodiversity. Taxonomy and ecology, the two fundamental sciences that generate knowledge about biodiversity, are associated with a number of limitations that prevent them from providing the information needed to fully understand the relevance of biodiversity in its entirety for human sustainability: (1) biodiversity conservation strategies that tend to be overly focused on research and policy on a global scale, with little impact on local biodiversity; (2) the small knowledge base of extant global biodiversity; (3) a lack of much-needed site-specific data on the species composition of communities in human-dominated landscapes, which hinders ecosystem management and biodiversity conservation; (4) biodiversity studies that lack taxonomic precision; (5) a lack of taxonomic expertise and trained taxonomists; (6) a taxonomic bottleneck in biodiversity inventory and assessment; and (7) neglect of taxonomic resources and a lack of taxonomic service infrastructure for biodiversity science. These limitations are directly related to contemporary trends in research, conservation strategies, environmental stewardship, environmental education, sustainable development, and local site-specific conservation. Today's biological knowledge is built on the known global biodiversity, which represents barely 20% of what is currently extant on planet Earth (based on the commonly accepted estimate of 10 million species). Much remains unexplored and unknown, particularly in hotspot regions of Africa, Southeast Asia, and South and Central America, including many developing or underdeveloped countries, where localized biodiversity is scarcely studied or described. "Backyard biodiversity", defined as local biodiversity near human habitation, refers to the natural resources and capital for ecosystem services at the grassroots level, which urgently need to be explored, documented, and conserved, as they are the backbone of sustainable economic development in these countries. Beginning with the early identification and documentation of local flora and fauna, taxonomy has documented global biodiversity and natural history based on the collection of "backyard biodiversity" specimens worldwide. However, this branch of science suffered a continuous decline in the latter half of the twentieth century and has now reached a point of potential demise. At present there are very few professional taxonomists and trained local parataxonomists worldwide, while the need for, and demands on, taxonomic services by conservation and resource management communities are rapidly increasing. Systematic collections, the material basis of biodiversity information, have been neglected and abandoned, particularly at institutions of higher learning.
Considering the rapid increase in the human population and urbanization, human sustainability requires new conceptual and practical approaches to refocusing and energizing the study of the biodiversity that forms the core of natural resources for sustainable development and the biotic capital for sustaining our life-support system. In this paper we aim to document and extrapolate the essence of biodiversity, discuss the state and nature of the taxonomic demise and the trends of recent biodiversity studies, and suggest reasonable approaches to a biodiversity science that facilitates the expansion of global biodiversity knowledge and creates useful data on backyard biodiversity worldwide towards human sustainability.

    Migrations and habitat use of the smooth hammerhead shark (Sphyrna zygaena) in the Atlantic Ocean

    The smooth hammerhead shark, Sphyrna zygaena, is a cosmopolitan semipelagic shark captured as bycatch in pelagic oceanic fisheries, especially pelagic longlines targeting swordfish and/or tunas. From 2012 to 2016, eight smooth hammerheads were tagged with Pop-up Satellite Archival Tags in the inter-tropical region of the Northeast Atlantic Ocean, with successful transmissions received from seven tags (319 tracking days in total). The results confirmed that the smooth hammerhead is a highly mobile species, as the longest migration ever documented for this species (> 6600 km) was recorded. No diel vertical movement behavior was noted, with the sharks spending most of their time in surface waters (0-50 m) warmer than 23 °C. The operating depth of the pelagic longline gear was measured with Minilog Temperature and Depth Recorders, and its overlap with the species' vertical distribution was calculated. The overlap takes place mainly during the night and is higher for juveniles (approximately 40% of overlap time). The novel information presented here can be used to contribute to the provision of sustainable management tools and to serve as input for Ecological Risk Assessments for smooth hammerheads caught in Atlantic pelagic longline fisheries. Funding: Oceanario de Lisboa through Project "SHARK-TAG: Migrations and habitat use of the smooth hammerhead shark in the Atlantic Ocean"; Investigador-FCT from the Portuguese Foundation for Science and Technology (FCT, Fundacao para a Ciencia e Tecnologia) [Ref: IF/00253/2014]; EU European Social Fund; Programa Operacional Potencial Humano
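    The overlap figure in this abstract comes from comparing the gear's operating depth band with the sharks' recorded depths. The sketch below shows one simple way such a percentage-of-time overlap could be computed from depth records; the function, the depth values, and the 0-50 m gear band are illustrative assumptions, not the study's actual data or method.

```python
# Hypothetical sketch: fraction of tracked time a shark spends within the
# depth band occupied by the longline gear. Depth records and the gear band
# are illustrative placeholders, not data from the study.

def overlap_fraction(depth_records_m, gear_min_m, gear_max_m):
    """Share of depth records falling inside [gear_min_m, gear_max_m]."""
    if not depth_records_m:
        return 0.0
    inside = sum(1 for d in depth_records_m if gear_min_m <= d <= gear_max_m)
    return inside / len(depth_records_m)

# e.g. night-time depth records (m) for one shark, gear assumed to fish at 0-50 m
night_depths = [5, 12, 30, 48, 60, 75, 20, 10, 44, 52]
print(f"overlap: {overlap_fraction(night_depths, 0, 50):.0%}")  # 70% for this toy record
```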

    TRY plant trait database - enhanced coverage and open access

    Plant traits (the morphological, anatomical, physiological, biochemical and phenological characteristics of plants) determine how plants respond to environmental factors, affect other trophic levels, and influence ecosystem properties and their benefits and detriments to people. Plant trait data thus represent the basis for a vast area of research spanning from evolutionary biology, community and functional ecology, to biodiversity conservation, ecosystem and landscape management, restoration, biogeography and earth system modelling. Since its foundation in 2007, the TRY database of plant traits has grown continuously. It now provides unprecedented data coverage under an open access data policy and is the main plant trait database used by the research community worldwide. Increasingly, the TRY database also supports new frontiers of trait-based plant research, including the identification of data gaps and the subsequent mobilization or measurement of new data. To support this development, in this article we evaluate the extent of the trait data compiled in TRY and analyse emerging patterns of data coverage and representativeness. The best species coverage is achieved for categorical traits, with almost complete coverage for 'plant growth form'. However, most traits relevant for ecology and vegetation modelling are characterized by continuous intraspecific variation and trait-environment relationships. These traits have to be measured on individual plants in their respective environment. Despite unprecedented data coverage, we observe a humbling lack of completeness and representativeness of these continuous traits in many aspects. We therefore conclude that reducing data gaps and biases in the TRY database remains a key challenge and requires a coordinated approach to data mobilization and trait measurements. This can only be achieved in collaboration with other initiatives.

    Harmonizing semantic annotations for computational models in biology

    Life science researchers use computational models to articulate and test hypotheses about the behavior of biological systems. Semantic annotation is a critical component for enhancing the interoperability and reusability of such models, as well as for the integration of the data needed for model parameterization and validation. Encoded as machine-readable links to knowledge resource terms, semantic annotations describe the computational or biological meaning of what models and data represent. These annotations help researchers find and repurpose models, accelerate model composition, and enable knowledge integration across model repositories and experimental data stores. However, realizing the potential benefits of semantic annotation requires the development of model annotation standards that adhere to a community-based annotation protocol. Without such standards, tool developers must account for a variety of annotation formats and approaches, a situation that can become prohibitively cumbersome and can defeat the purpose of linking model elements to controlled knowledge resource terms. Currently, no consensus protocol for semantic annotation exists among the larger biological modeling community. Here, we report on the landscape of current annotation practices among the COmputational Modeling in BIology NEtwork (COMBINE) community and provide a set of recommendations for building a consensus approach to semantic annotation.
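    As a concrete illustration of a machine-readable link from a model element to a knowledge resource term, the sketch below builds one annotation triple with rdflib using a biomodels.net biology qualifier as the predicate; the model URI and the ontology term are hypothetical placeholders, and this is not a prescribed COMBINE format.

```python
# Minimal sketch of a semantic annotation: link a model element to a
# knowledge-resource term via a qualifier predicate. Uses rdflib; the model
# URI and the ontology term below are illustrative placeholders only.
from rdflib import Graph, Namespace, URIRef

BQBIOL = Namespace("http://biomodels.net/biology-qualifiers/")

g = Graph()
model_species = URIRef("http://example.org/model.xml#species_glucose")  # hypothetical model element
ontology_term = URIRef("https://identifiers.org/CHEBI:17234")           # example ChEBI term (glucose)

# bqbiol:is qualifier: the annotated element *is* the referenced entity
g.add((model_species, BQBIOL["is"], ontology_term))

print(g.serialize(format="turtle"))
```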

    Pair Interaction Potentials of Colloids by Extrapolation of Confocal Microscopy Measurements of Collective Structure

    A method for measuring the pair interaction potential between colloidal particles by extrapolation of measurements of collective structure to infinite dilution is presented and explored using simulation and experiment. The method is particularly well suited to systems in which the colloid is fluorescent and refractive-index matched with the solvent. The method involves characterizing the potential of mean force between colloidal particles in suspension by measuring the radial distribution function using 3D direct visualization. The potentials of mean force are extrapolated to infinite dilution to yield an estimate of the pair interaction potential, U(r). We use Monte Carlo (MC) simulation to test and establish our methodology, as well as to explore the effects of polydispersity on its accuracy. We use poly-12-hydroxystearic acid-stabilized poly(methyl methacrylate) (PHSA-PMMA) particles dispersed in the solvent dioctyl phthalate (DOP) to test the method and assess its accuracy for three different repulsive systems, in which the range of the repulsion has been manipulated by the addition of electrolyte. Comment: 35 pages, 14 figures
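    The extrapolation rests on the standard relation between the radial distribution function and the potential of mean force, w(r) = -k_B T ln g(r), which reduces to the pair potential U(r) in the limit of infinite dilution. The sketch below illustrates the idea on synthetic data by fitting w(r) linearly in volume fraction at each separation and taking the intercept; the numbers and functional forms are placeholders, not the paper's measurements or analysis code.

```python
# Sketch of the extrapolation idea: the potential of mean force
#   w(r; phi) = -kT * ln g(r; phi)
# approaches the pair potential U(r) as the volume fraction phi -> 0, so we
# fit w(r) against phi at each r and take the intercept. Synthetic data only.
import numpy as np

kT = 1.0                                   # energies in units of k_B T
r = np.linspace(1.0, 3.0, 50)              # pair separation (particle diameters)
phis = np.array([0.05, 0.10, 0.20])        # volume fractions "measured"

# g_of_r[i, j] = g(r_j) at volume fraction phis[i]; synthetic stand-in data
true_U = 2.0 * np.exp(-3.0 * (r - 1.0))    # placeholder repulsive potential
g_of_r = np.exp(-true_U / kT) * (1.0 + 0.5 * phis[:, None] * np.exp(-(r - 1.5) ** 2))

w_of_r = -kT * np.log(g_of_r)              # potential of mean force at each phi

# linear fit of w(r) against phi at every r; the intercept estimates U(r)
coeffs = np.polyfit(phis, w_of_r, deg=1)   # shape (2, len(r)): slope row, intercept row
U_estimate = coeffs[1]

print(np.max(np.abs(U_estimate - true_U)))  # small residual for this synthetic case
```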

    Search for new phenomena in final states with an energetic jet and large missing transverse momentum in pp collisions at √s = 8 TeV with the ATLAS detector

    Results of a search for new phenomena in final states with an energetic jet and large missing transverse momentum are reported. The search uses 20.3 fb^-1 of √s = 8 TeV data collected in 2012 with the ATLAS detector at the LHC. Events are required to have at least one jet with p_T > 120 GeV and no leptons. Nine signal regions are considered with increasing missing transverse momentum requirements between E_T^miss > 150 GeV and E_T^miss > 700 GeV. Good agreement is observed between the number of events in data and the Standard Model expectations. The results are translated into exclusion limits on models with either large extra spatial dimensions, pair production of weakly interacting dark matter candidates, or production of very light gravitinos in a gauge-mediated supersymmetric model. In addition, limits on the production of an invisibly decaying Higgs-like boson leading to similar topologies in the final state are presented.
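    The event selection quoted in the abstract amounts to a simple filter: a leading jet with p_T > 120 GeV, a lepton veto, and one of nine increasing E_T^miss thresholds between 150 and 700 GeV. The sketch below is only a schematic restatement of that selection; the intermediate thresholds, the event dictionaries, and the field names are assumptions, not the ATLAS data format or analysis software.

```python
# Schematic sketch of the monojet-style selection quoted in the abstract.
# Only the 150 and 700 GeV endpoints are quoted; the intermediate thresholds
# and the event records below are illustrative placeholders, not ATLAS data.

MET_THRESHOLDS_GEV = [150, 200, 250, 300, 350, 400, 500, 600, 700]  # nine regions (intermediate values assumed)

def passes_signal_region(event, met_cut_gev):
    """Leading-jet pT > 120 GeV, no leptons, and E_T^miss above the region's cut."""
    leading_jet_pt = max(event["jet_pts_gev"], default=0.0)
    return (leading_jet_pt > 120.0
            and event["n_leptons"] == 0
            and event["met_gev"] > met_cut_gev)

events = [
    {"jet_pts_gev": [310.0, 45.0], "n_leptons": 0, "met_gev": 420.0},
    {"jet_pts_gev": [95.0],        "n_leptons": 0, "met_gev": 180.0},
    {"jet_pts_gev": [260.0],       "n_leptons": 1, "met_gev": 300.0},
]

for cut in MET_THRESHOLDS_GEV:
    n_pass = sum(passes_signal_region(ev, cut) for ev in events)
    print(f"SR E_T^miss > {cut} GeV: {n_pass} event(s) pass")
```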

    A hippocampal Cdk5 pathway regulates extinction of contextual fear

    Treatment of emotional disorders involves the promotion of extinction processes, which are defined as the learned reduction of fear. The molecular mechanisms underlying extinction have only begun to be elucidated. By employing genetic and pharmacological approaches in mice, we show here that extinction requires downregulation of Rac-1 and cyclin-dependent kinase 5 (Cdk5) and upregulation of p21-activated kinase-1 (PAK-1) activity. This is physiologically achieved by a Rac-1-dependent relocation of the Cdk5 activator p35 from the membrane to the cytosol and dissociation of p35 from PAK-1. Moreover, our data suggest that Cdk5/p35 activity prevents extinction in part by inhibition of PAK-1 activity in a Rac-1-dependent manner. We propose that extinction of contextual fear is regulated by counteracting components of a molecular pathway involving Rac-1, Cdk5 and PAK-1. Our data suggest that this pathway could provide a suitable target for the therapeutic treatment of emotional disorders. Funding: National Institutes of Health (U.S.) (Grant NS051874); Alexander von Humboldt-Stiftung (German Research Foundation Fellowship); European Neuroscience Institute Goettingen

    Sociomateriality Implications of Software As a Service Adoption on IT-workers’ Roles and Changes in Organizational Routines of IT Systems Support

    This paper aims to deepen our understanding of how sociomaterial practices influence IT workers' roles and skill-set requirements, and of the changes to the organizational routines of IT systems support, when an organization migrates an on-premise IT system to a software as a service (SaaS) model. This conceptual paper is part of an ongoing study investigating organizations that migrated on-premise IT email systems to SaaS business models, such as Google Apps for Education (GAE) and Microsoft Office 365, in New Zealand tertiary institutions. We present initial findings from interpretive case studies. The findings are, first, that technological artifacts are entangled in sociomaterial practices, which change the way humans respond to the performative aspects of organizational routines; human and material agencies are interwoven in ways that reinforce or change existing routines. Second, materiality, the virtual realm, and the spirit of the technology provide the elementary levels at which human and material agencies become entangled. Lastly, the elementary level at which human and material agencies become entangled depends on the capabilities or skill set of an individual.