1,807 research outputs found

    Pyrgeometer data reduction and calibration procedures

    May 1976. Pyrgeometer measurements from aircraft, by Bruce Albrecht, Michael Poellot, Stephen K. Cox. Includes bibliographical references (page 48).

    The Large-scale response of the tropical atmosphere to radiative heating

    March 1974. Includes bibliographical references. Sponsored by the National Science Foundation under grant GA-36302.

    Emotion Classification of Indonesian Tweets using Bidirectional LSTM

    Emotion classification can be a powerful tool for deriving narratives from social media data. Traditional machine learning models that perform emotion classification on Indonesian Twitter data exist but rely on closed-source features. Recurrent neural networks can meet or exceed the performance of state-of-the-art traditional machine learning techniques using exclusively open-source data and models. Specifically, the results show that recurrent neural network variants can produce an accuracy gain of more than 8% over logistic regression and SVM techniques, and a 15% gain over random forest, when using FastText embeddings. This research found a statistically significant performance advantage for a single-layer bidirectional long short-term memory model over a two-layer stacked bidirectional long short-term memory model. This research also found that a single-layer bidirectional long short-term memory recurrent neural network matched the performance of a state-of-the-art logistic regression model with supplemental closed-source features from a study by Saputri et al. [8] when classifying the emotion of Indonesian tweets.
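    As a rough illustration of the single-layer bidirectional LSTM approach described above, the following sketch builds a classifier on top of pre-trained FastText word vectors. The class name, hidden size, number of emotion labels, and the 300-dimensional embedding matrix are illustrative assumptions, not details taken from the study.

    ```python
    # Minimal sketch of a single-layer bidirectional LSTM emotion classifier,
    # assuming a pre-built FastText embedding matrix (vocab_size x 300) and
    # integer-encoded token sequences; hyperparameters are illustrative.
    import torch
    import torch.nn as nn

    class BiLSTMEmotionClassifier(nn.Module):
        def __init__(self, embedding_matrix, hidden_size=128, num_classes=5):
            super().__init__()
            # Embedding layer initialized from FastText vectors and kept frozen here.
            self.embedding = nn.Embedding.from_pretrained(
                torch.as_tensor(embedding_matrix, dtype=torch.float), freeze=True)
            embed_dim = self.embedding.embedding_dim
            self.lstm = nn.LSTM(embed_dim, hidden_size,
                                batch_first=True, bidirectional=True)
            self.fc = nn.Linear(2 * hidden_size, num_classes)

        def forward(self, token_ids):
            embedded = self.embedding(token_ids)      # (batch, seq_len, embed_dim)
            _, (hidden, _) = self.lstm(embedded)      # hidden: (2, batch, hidden_size)
            # Concatenate the final forward and backward hidden states.
            features = torch.cat([hidden[0], hidden[1]], dim=-1)
            return self.fc(features)                  # unnormalized emotion scores
    ```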

    A Comparison of Quaternion Neural Network Backpropagation Algorithms

    This research paper focuses on quaternion neural networks (QNNs), a type of neural network in which the weights, biases, and input values are all represented as quaternions. Previous studies have shown that QNNs outperform real-valued neural networks on basic tasks and have potential in high-dimensional problem spaces. However, research on QNNs has been fragmented, with contributions from different mathematical and engineering domains leading to unintentional overlap in the QNN literature. This work aims to unify existing research by evaluating four distinct QNN backpropagation algorithms, including the novel GHR-calculus backpropagation algorithm, and by providing concise, scalable implementations of each algorithm in a modern compiled programming language. Additionally, the authors apply a robust Design of Experiments (DoE) methodology to compare the accuracy and runtime of each algorithm. The experiments demonstrate that the Clifford Multilayer Perceptron (CMLP) learning algorithm yields statistically significant improvements in network test-set accuracy while maintaining runtime performance comparable to the other three algorithms across four distinct regression tasks. By unifying existing research and comparing different QNN training algorithms, this work establishes a state-of-the-art baseline and provides important insights into the potential of QNNs for solving high-dimensional problems.
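    For context on what makes a network quaternion-valued, the sketch below shows the Hamilton product that replaces ordinary real multiplication inside a QNN layer, plus a single quaternion neuron with a component-wise ("split") tanh activation. The function names and the split-activation choice are assumptions for illustration; none of the four backpropagation algorithms compared in the paper are reproduced here.

    ```python
    # Quaternions are stored as arrays [w, x, y, z]; the Hamilton product below
    # is the standard non-commutative quaternion multiplication used by QNN layers.
    import numpy as np

    def hamilton_product(p, q):
        """Multiply two quaternions p and q, each given as [w, x, y, z]."""
        pw, px, py, pz = p
        qw, qx, qy, qz = q
        return np.array([
            pw*qw - px*qx - py*qy - pz*qz,
            pw*qx + px*qw + py*qz - pz*qy,
            pw*qy - px*qz + py*qw + pz*qx,
            pw*qz + px*qy - py*qx + pz*qw,
        ])

    def quaternion_neuron(weights, inputs, bias):
        """One quaternion neuron: a sum of Hamilton products plus a quaternion
        bias, followed by a component-wise ("split") tanh activation."""
        s = np.array(bias, dtype=float)
        for w, x in zip(weights, inputs):
            s = s + hamilton_product(w, x)
        return np.tanh(s)
    ```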

    Workplace-Based Practicum: Enabling Expansive Practices

    Effective pre-service teacher education integrates theoretical and practical knowledge. One means of integration is a practicum in a school workplace. In a time of variable approaches to, and models of, practicum, we outline an innovative model of school immersion as part of a teacher preparation program. We apply Fuller and Unwin’s (2004) expansive and restrictive conceptual framework of workplace learning to a case study of an immersive practicum experience to discuss themes of participation, personal development and institutional arrangements in relation to school-based practicum. Enablers and constraints are identified for our immersion model of workplace-based practicum. Based on the data analysis, a number of implications for structuring an expansive practicum learning experience are outlined.

    Archaeological and Historical Investigations at the West End of the Martin and Bowie Streets Connections, San Antonio, Bexar County, Texas

    Test excavations were conducted in November 1987 by the Center for Archaeological Research, The University of Texas at San Antonio, at the site of a planned relocation of the eastbound lanes of Martin Street in downtown San Antonio. The lots in question were located on the north side of the early town site and were known to have contained the residence of Thaddeus Smith, the county clerk for Bexar County in the last half of the 19th century. Testing was done by backhoe and by controlled hand excavations, in relation to each of the three structures known to have existed in the area. The foundations of a house at 409 North Flores Street, built in 1868, were examined to record methods of construction, the structural evolution of the house, and its conversion into a commercial establishment ca. 1927. Part of the foundations of Smith's elaborate 1898 home at 403 North Flores Street were uncovered and recorded. In search of the source of the late 18th-century artifacts from the levels beneath an 1868 house, test units were excavated at 401 North Flores Street in what appeared on the 1873 Koch map to be the back yard of a one-story adobe house. Artifacts recovered there, however, indicated post-1800 occupation. A jacal wall trench was located and recorded at this address. As a result of the comparatively undisturbed nature of the archaeological deposits at 401 North Flores Street, it is recommended that the area bounded by Flores, Salinas, and Rossy Streets and the new location of Martin Street be made a State Archeological Landmark. A number of structures located in this area in 1873 undoubtedly represent the first expansion of the town to the north of the plazas.

    Comparing Greedy Constructive Heuristic Subtour Elimination Methods for the Traveling Salesman Problem

    Purpose: This paper aims to divide the class of fragment constructive heuristics used to compute feasible solutions for the traveling salesman problem (TSP) into edge-greedy and vertex-greedy subclasses. As these subclasses of heuristics can create subtours, two known methodologies for subtour elimination on symmetric instances are reviewed and expanded to cover asymmetric problem instances. This paper introduces a third, novel subtour elimination methodology, the greedy tracker (GT), and compares it to both known methodologies. Design/methodology/approach: Computational results for all three subtour elimination methodologies are generated across 17 symmetric instances ranging in size from 29 to 5,934 vertices, as well as 9 asymmetric instances ranging in size from 17 to 443 vertices. Findings: The results demonstrate that the GT is the fastest method for preventing subtours on instances below 400 vertices. Additionally, distinguishing between a fragment constructive heuristic and the subtour elimination methodology used to ensure the feasibility of the resulting solutions enables the introduction of a new vertex-greedy fragment heuristic called ordered greedy. Originality/value: This research has two main contributions: first, it introduces a novel subtour elimination methodology; second, it introduces the concept of ordered lists, which remaps the TSP into a new space, with promising initial computational results.
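    To make the edge-greedy setting concrete, the sketch below implements a classical edge-greedy fragment heuristic for the symmetric TSP that prevents subtours with a union-find check and vertex-degree limits. It is shown only as a familiar baseline for the methodologies discussed above; it is not the paper's greedy tracker, and the function name is an illustrative assumption.

    ```python
    # Edge-greedy fragment heuristic: consider edges in order of increasing cost
    # and accept one only if both endpoints have degree < 2 and it does not close
    # a cycle before all n vertices are included (union-find tracks the fragments).
    def greedy_edge_tour(dist):
        """dist: symmetric n x n cost matrix; returns the tour as a list of edges."""
        n = len(dist)
        parent = list(range(n))

        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]   # path halving
                v = parent[v]
            return v

        edges = sorted((dist[i][j], i, j) for i in range(n) for j in range(i + 1, n))
        degree = [0] * n
        tour = []
        for _, i, j in edges:
            if degree[i] == 2 or degree[j] == 2:
                continue                        # would give a vertex three tour edges
            ri, rj = find(i), find(j)
            if ri == rj and len(tour) < n - 1:
                continue                        # would close a subtour prematurely
            parent[ri] = rj
            degree[i] += 1
            degree[j] += 1
            tour.append((i, j))
            if len(tour) == n:                  # final edge closes the Hamiltonian cycle
                break
        return tour
    ```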

    Analysis of the Effects of Spatiotemporal Demand Data Aggregation Methods on Distance and Volume Errors

    Purpose: Large, stochastic spatiotemporal demand data sets can prove intractable for location optimization problems, motivating the need for aggregation. However, demand aggregation induces errors. Significant theoretical research has been performed on the modifiable areal unit problem and the zone definition problem, but minimal research has addressed the specific issues inherent to spatiotemporal demand data, such as search and rescue (SAR) data. This study provides a quantitative comparison of various aggregation methodologies and their relation to distance- and volume-based aggregation errors. Design/methodology/approach: This paper introduces and applies a framework for comparing both deterministic and stochastic aggregation methods using distance- and volume-based aggregation error metrics. It additionally applies weighted versions of these metrics to account for the reality that demand events are nonhomogeneous. These metrics are applied to a large, highly variable, spatiotemporal demand data set of SAR events in the Pacific Ocean. Comparisons using these metrics are conducted between six quadrat aggregations of varying scales and two zonal distribution models using hierarchical clustering. Findings: As quadrat fidelity increases, the distance-based aggregation error decreases; the two deliberate zonal approaches further reduce this error while using fewer zones. However, the higher-fidelity aggregations detrimentally affect volume error. Additionally, by splitting the SAR data set into training and test sets, this paper shows that the stochastic zonal distribution aggregation method is effective at simulating actual future demands. Originality/value: This study indicates that no single best aggregation method exists; by quantifying trade-offs in aggregation-induced errors, practitioners can use the method that minimizes the errors most relevant to their study. The study also quantifies the ability of a stochastic zonal distribution method to effectively simulate future demand data.
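    As a toy illustration of quadrat aggregation and a distance-based aggregation error, the sketch below bins planar demand points into square grid cells and measures how far each point sits from its cell's centroid (the aggregated demand location). The cell size, planar-distance assumption, and error definition are illustrative; they are not the weighted metrics or the SAR data set used in the study.

    ```python
    # Quadrat aggregation of (x, y) demand points and a simple distance-based
    # aggregation error: mean distance from each point to its cell's centroid.
    import numpy as np

    def quadrat_aggregate(points, cell_size):
        """Group (x, y) demand points into square cells of side cell_size.
        Returns a dict mapping cell index -> array of member points."""
        cells = {}
        for p in np.asarray(points, dtype=float):
            key = tuple((p // cell_size).astype(int))
            cells.setdefault(key, []).append(p)
        return {k: np.array(v) for k, v in cells.items()}

    def mean_distance_error(cells):
        """Average distance from each demand point to its cell centroid."""
        errors = []
        for pts in cells.values():
            centroid = pts.mean(axis=0)
            errors.extend(np.linalg.norm(pts - centroid, axis=1))
        return float(np.mean(errors))
    ```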