Developing "humane" interfaces to data clearinghouses for improving the delivery of spatial information to marine resource managers
Web geographic information systems (GIS) and the Internet now provide the connectivity necessary to support large-scale data access by a wide variety of users, including not just scientific researchers but also policy-makers and marine resource managers. However, connectivity alone cannot ensure that those who need natural resource information will be able to locate relevant data. Data clearinghouses need to provide user interfaces that help a broad user community understand what spatial data are available, how they can be obtained, and how they should be interpreted. The Metadata Humane Society project, conducted by researchers at Oregon State University, combined traditional interface engineering knowledge, scientific research, and geographic information science (www.nacse.org/mhs). The researchers sought to improve access to spatial information by identifying the primary barriers to usability in the National Geospatial Data Clearinghouse (NGDC) interfaces. The project included developing an understanding of the current state of usability among GIS-related Web sites that provide metadata search facilities and identifying promising approaches to "learnability" and navigability that might be exploited in improving the NGDC interfaces. To accomplish these goals, three types of usability evaluations were performed: an initial predictive evaluation of existing sites, user testing of the existing NGDC interface, and a user expectations survey. The evaluations involved actual users from a range of disciplinary backgrounds and user communities, as well as different levels of expertise. The project found that different levels of user expertise require distinct subsets of the usability criteria. It is recommended that at least two interfaces be made available for the NGDC, addressing different target audiences, and that each interface focus on certain criteria.
To improve the delivery of spatial information to marine resource managers, these usability recommendations should also be applied to data clearinghouses such as Virtual Oregon, the Oregon Coast Geospatial Clearinghouse, and the Geography Network.
Metadata subsetting framework
The amount of research data online is growing exponentially, scattered across a multitude of locations and stored in various formats on a wide variety of platforms. The value of metadata, which helps us determine the relevance, location, and accessibility of data and related research, has become correspondingly more important. Although a number of international standards govern the format and structure of metadata in specific disciplines, the complexity of these standards makes it difficult for data owners and clearinghouses to create and manage metadata effectively. The Metadata Subsetting Framework is a system that allows metadata clearinghouses to define a subset of a metadata standard containing the fields most relevant to a specific domain. The system automatically generates an interface customized for the subset, with capabilities for validating user inputs against constraints expressed in the subset. Implemented as a collection of Java components, the system produces a complete in-memory representation of XML Schema documents expressing the structural and data constraints of a subsetted metadata standard. A knowledgeable user builds the subset by manipulating these constraints via a web-based interface. Subsetting makes the metadata entry process easier and less error-prone, thereby improving the quality of metadata.
Keywords: Subsetting, Metadata subsetting framework, Metadata standard
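The subsetting-and-validation idea described above can be sketched in miniature: a clearinghouse keeps only the fields of a standard relevant to its domain, each paired with a constraint, and user entries are checked before acceptance. This is a minimal Python illustration of the concept, not the framework's actual Java/XML Schema implementation; the field names and constraints are hypothetical.

```python
# Minimal sketch of metadata subsetting: a subset of a (hypothetical)
# metadata standard is a mapping from field name to validation constraint.
import re

SUBSET = {
    "title":    lambda v: isinstance(v, str) and len(v) > 0,
    "pub_date": lambda v: isinstance(v, str)
                          and re.fullmatch(r"\d{4}-\d{2}-\d{2}", v) is not None,
    "keywords": lambda v: isinstance(v, list)
                          and all(isinstance(k, str) for k in v),
}

def validate(record):
    """Return a list of (field, message) errors for a metadata record."""
    errors = []
    for field, check in SUBSET.items():
        if field not in record:
            errors.append((field, "missing required field"))
        elif not check(record[field]):
            errors.append((field, "constraint violated"))
    return errors

record = {"title": "Coastal bathymetry survey",
          "pub_date": "2003-07-14",
          "keywords": ["bathymetry"]}
print(validate(record))  # []
```

Because entry forms are generated from the subset, a data owner only ever sees and fills the handful of fields their domain needs, which is what makes entry easier and less error-prone.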
Public Commons for Geospatial Data: A Conceptual Model
A wide variety of spatial data collection efforts are ongoing throughout local, state, and federal agencies, private firms, and non-profit organizations. Each effort is established for a different purpose, but organizations and individuals often collect and maintain the same or similar information. The United States federal government has undertaken many initiatives, such as the National Spatial Data Infrastructure, the National Map, and Geospatial One-Stop, to reduce duplicative spatial data collection and promote the coordinated use, sharing, and dissemination of spatial data nationwide. A key premise in most of these initiatives is that no national government will be able to gather and maintain more than a small percentage of the geographic data that users want and desire. Thus, national initiatives typically depend on the cooperation of those already gathering spatial data and those using GIS to meet specific needs to help construct and maintain these spatial data infrastructures and geo-libraries for their nations (Onsrud 2001). Some of the impediments to widespread spatial data sharing are well known from directly asking GIS data producers why they are not currently involved in creating datasets of common or compatible formats, documenting their datasets in a standardized metadata format, or making their datasets more readily available to others through data clearinghouses or geo-libraries. The research described in this thesis addresses the impediments to wide-scale spatial data sharing faced by GIS data producers and explores a new conceptual data-sharing approach, the Public Commons for Geospatial Data, that supports user-friendly metadata creation, open access licenses, archival services, and documentation of the parent lineage of the contributors and value-adders of digital spatial data sets.
An earthquake emergency management model integrating remote sensing with geographic information systems: lessons learned from the 1999 Kocaeli, Turkey earthquake
An emergency management system integrating remote sensing and geographic information systems applied to emergency preparation, response, and recovery could have reduced casualties and damage after the August 17, 1999 Kocaeli, Turkey earthquake. Model system components include a horizontally and vertically distributed knowledge base, LIDAR 3D urban modeling, automatic feature extraction, hazard and vulnerability mapping, improved public presentations, damage detection, remote sensing and GIS in emergency coordination centers, mapping for emergency responders, tent city monitoring, and debris management.
The three dimensions of inclusive design: A design framework for a digitally transformed and complexly connected society
This thesis attempts to answer the following meta-design challenge: In this digitally transformed and increasingly connected society, how can we design in such a way that we include the full range of human diversity? How can we use design to both circumvent the new barriers that escalate exclusion and leverage the new affordances of emerging sociotechnical systems to reduce disparity?
This thesis documents the formulation, application and testing of a guiding framework for Inclusive Design, suitable for a digitally transformed and increasingly connected context. During the course of my doctoral studies I have iteratively formalised and refined this framework. As a doctoral student, Founder/Director of the Inclusive Design Research Centre of Canada (1993–), and co-Director of the sister European lab, the Inclusive Design Research Centre of Ireland (2008–), I have implemented the inclusive design framework in applied research with colleagues. I have also taught the framework in the graduate programme that I launched at OCAD University in Toronto in 2011. These framework applications have helped to develop tools and design methods that support the framework. The thesis conveys the formulation, implementation, and communication of the framework to several application domains.
The fields of knowledge drawn upon are diverse and post-disciplinary. If a primary field must be chosen, it would be the field of Design, not only in terms of Design Engineering but also in the broader scope of Design for Society: both are explored and developed in tandem. The impact of the work in the "real world", and within the industry sector that can support community change, is the most important aim and contribution of this research. The evolving framework is already being applied by a global collaborating community and has formed the basis of the corporate transformation of companies such as Microsoft. The applied research has delved into many cognate fields, including Systems Thinking, Deeper Learning, Economics, Machine Learning, Human-Computer Interfaces, and Critical Disability Studies.
The thesis makes an original and substantial contribution to knowledge, articulating a guiding framework for Inclusive Design in a digitally transformed and complexly connected global society. The framework applies Systems Thinking to the area of digital inclusion for people experiencing disabilities and adds the consideration of the design process to inclusive or accessible Design. Examples taken from years of intensive practice that support the thesis are provided as use cases, to support future research and implementation.
The thesis also attempts to provide a bridge between scholarly study and community action, in part by using clear language to prevent or overcome any conceptual divide between scholars and the diverse individuals who must participate in co-designing a more inclusive society. The thesis includes translations of the concepts inherent in the proposed framework, expressed clearly and succinctly, for a variety of co-designers. The thesis posits that Diversity is Strength: a concept that can be applied in many cognate fields as well.
Is What Works Working? Thinking Evaluatively About the What Works Clearinghouse
Since the mid-twentieth century, the U.S. Department of Education has drafted and enacted policies to bridge the research-practice gap, that is, the gap between "what works" according to educational research and what is actually practiced by teachers and their administrators (e.g., Dirkx, 2006; Joyce & Cartwright, 2019; Tseng, 2012). One of the latest manifestations of this "what works" political legacy is the What Works Clearinghouse (WWC), which took shape as part of the Institute of Education Sciences (IES) in 2002. The WWC's mission is to be a "central and trusted source of scientific evidence for what works in education" (WWC, 2020d, p. 1) while, at the same time, helping the IES "…increase [the] use of data and research in education decision-making" (IES, n.d.-a). The purpose of this dissertation is to evaluate the extent to which the WWC has realized its own mission as well as contributed to the IES's larger goal.
Guided by principles of evaluative thinking (Vo & Archibald, 2018) and premises of the Two-Communities theoretical tradition (Caplan, 1979; Farley-Ripple et al., 2018), this project used a theory-based evaluation approach called contribution analysis (Mayne, 2008, 2012b, 2019) to investigate three guiding questions. Those questions inquired into (a) the extent of the WWC's impact among educators, (b) the reasons why its impact may be wanting, and (c) the changes it could make to maximize its impact. To investigate these questions, a six-step procedure was used to both articulate and scrutinize the WWC's theory of change according to available evidence. An array of evidence was considered, including existing publications (e.g., previously published evaluations, literature reviews, and large-scale surveys), analyses of publicly available data (e.g., public data exports, data requested through the Freedom of Information Act, transcripts from congressional hearings), and findings from a preservice teacher survey conducted for this project.
The results of this contribution analysis offered compelling answers to each of the three guiding questions. First, given the WWC's original benchmark for success (e.g., Baldwin et al., 2008), evidence suggested that it is likely failing to fully reach educators and guide their decision-making. This was especially true for teachers. Second, the evidence suggested that the WWC's impact may be wanting because its theory of change depends on several unsupported assumptions. Not only were many of the WWC's causal assumptions refuted by the evidence, but some of its foundational assumptions, such as the belief that systematic research review would be an effective way of bringing educational research to practice, were refuted as well. Finally, because several of its foundational assumptions were refutable, the WWC may only be able to maximize its impact if it fundamentally retools its approach to systematic research review or to educational research more generally. Suggestions for doing so are discussed.
Information Security Governance Simplified
Security practitioners must be able to build cost-effective security programs while also complying with government regulations. Information Security Governance Simplified: From the Boardroom to the Keyboard lays out these regulations in simple terms and explains how to use control frameworks to build an air-tight information security (IS) program and governance structure. Defining the leadership skills required by IS officers, the book examines the pros and cons of different reporting structures and highlights the various control frameworks available. It details the functions of the security department and considers the control areas, including physical, network, application, business continuity/disaster recovery, and identity management. Todd Fitzgerald explains how to establish a solid foundation for building your security program and shares time-tested insights about what works and what doesn't when building an IS program. Highlighting security considerations for managerial, technical, and operational controls, it provides helpful tips for selling your program to management. It also includes tools to help you create a workable IS charter and your own IS policies. Based on proven experience rather than theory, the book gives you the tools and real-world insight needed to secure your information while ensuring compliance with government regulations.
Assessing the Scholarly Value of Online Texts
Publishing discipline-specific scholarly articles in refereed print journals is a traditional and especially important professional requirement for post-secondary faculty seeking initial employment, tenure, and promotion. Online writing, particularly web-based online journal publications that incorporate the unique hypertextual and/or hypermedia allowances of the medium, is expanding the boundaries of print-based scholarship and engaging academicians within English Studies in ongoing discussions that attempt to resolve issues of parity between print-based and web-based scholarship. A review of the relevant literature shows a persistent perception within English Studies that online journal publications lack scholarly value in comparison to traditional print publications, and therefore they may not be recognized as equal evidence of scholarly achievement for tenure, promotion, and review purposes. Scholars generally agree upon traditional scholarly standards for assessing print-based texts; however, no grounding rationale for understanding and valuing web-based texts as equally valid scholarship is readily available. This study aims to provide such a rationale.
Specifically, this dissertation addresses the need for valuing web-based journal publications as legitimate scholarship, particularly among scholars in the subfield of Computers and Writing. The study provides a rhetorical analysis of a select group of "webtexts" published in the Computers and Writing subfield's premier online journal, Kairos: A Journal of Rhetoric, Technology, and Pedagogy. The analysis identifies common characteristics of webtexts and determines the extent to which these characteristics fail to meet, meet, and/or extend traditional conventions of scholarship, thus contributing to the ongoing conversation of online scholarship assessment. The findings from the analysis lead to the development of an example assessment heuristic that may be useful for tenure, promotion, and review participants, online journal editors, and scholars within the Computers and Writing subfield to assess and defend the scholarly value of web-based journal publications.