Accelerated Training via Incrementally Growing Neural Networks using Variance Transfer and Learning Rate Adaptation
We develop an approach to efficiently grow neural networks, within which
parameterization and optimization strategies are designed by considering their
effects on the training dynamics. Unlike existing growing methods, which follow
simple replication heuristics or utilize auxiliary gradient-based local
optimization, we craft a parameterization scheme which dynamically stabilizes
weight, activation, and gradient scaling as the architecture evolves, and
maintains the inference functionality of the network. To address the
optimization difficulty resulting from imbalanced training effort distributed
to subnetworks fading in at different growth phases, we propose a learning rate
adaptation mechanism that rebalances the gradient contribution of these separate
subcomponents. Experimental results show that our method achieves comparable or
better accuracy than training large fixed-size models, while saving a
substantial portion of the original computation budget for training. We
demonstrate that these gains translate into real wall-clock training speedups.
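The "maintains the inference functionality of the network" property can be illustrated with a minimal sketch: when widening a hidden layer, draw the new fan-in weights with statistics matched to the existing units and zero out the new fan-out weights, so the grown network computes exactly the same function at the moment of growth. This is a simplified stand-in, not the paper's actual variance-transfer parameterization.

```python
import numpy as np

def grow_width(W1, W2, k, rng):
    """Grow the hidden layer of a 2-layer net (h = tanh(W1 x), y = W2 h)
    by k units. New rows of W1 are drawn with the same std as the existing
    rows (a crude variance-matching stand-in for the paper's variance
    transfer); new columns of W2 are zero, so the network's output is
    unchanged immediately after growing."""
    std = W1.std()
    W1_new = rng.normal(0.0, std, size=(k, W1.shape[1]))
    W1g = np.vstack([W1, W1_new])                      # widen fan-in
    W2g = np.hstack([W2, np.zeros((W2.shape[0], k))])  # zero fan-out
    return W1g, W2g

rng = np.random.default_rng(0)
W1 = rng.normal(0.0, 0.5, size=(8, 4))   # 8 hidden units, 4 inputs
W2 = rng.normal(0.0, 0.5, size=(3, 8))   # 3 outputs
W1g, W2g = grow_width(W1, W2, 5, rng)

x = rng.normal(size=4)
# function is preserved at growth time
assert np.allclose(W2 @ np.tanh(W1 @ x), W2g @ np.tanh(W1g @ x))
```

The zeroed fan-out columns then receive gradient and train away from zero, which is where the paper's learning-rate rebalancing across old and new subcomponents would come into play.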
Growing Efficient Deep Networks by Structured Continuous Sparsification
We develop an approach to training deep networks while dynamically adjusting
their architecture, driven by a principled combination of accuracy and sparsity
objectives. Unlike conventional pruning approaches, our method adopts a gradual
continuous relaxation of discrete network structure optimization and then
samples sparse subnetworks, enabling efficient deep networks to be trained in a
growing and pruning manner. Extensive experiments across CIFAR-10, ImageNet,
PASCAL VOC, and Penn Treebank, with convolutional models for image
classification and semantic segmentation, and recurrent models for language
modeling, show that our training scheme yields efficient networks that are
smaller and more accurate than those produced by competing pruning methods.
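The "gradual continuous relaxation of discrete network structure optimization" can be sketched with a temperature-controlled sigmoid gate per structural unit (e.g., per channel): at low temperature the gate is smooth and differentiable, and as training sharpens it, thresholding yields a discrete sparse subnetwork. The gate form and schedule here are illustrative, not the paper's exact relaxation.

```python
import numpy as np

def gate(s, beta):
    """Soft keep/prune gate in (0, 1): a continuous relaxation of the
    binary structure decision, parameterized by a learnable score s and
    a sharpness (inverse temperature) beta."""
    return 1.0 / (1.0 + np.exp(-beta * s))

s = np.array([-2.0, -0.1, 0.3, 1.5])        # learnable architecture scores
soft = gate(s, beta=1.0)                    # early training: smooth gates
hard = (gate(s, beta=100.0) > 0.5).astype(int)  # late training: near-binary
print(hard)   # → [0 0 1 1]
```

Annealing `beta` upward over training moves the gates from soft (useful for gradient-based search) toward the discrete subnetwork that is ultimately sampled and kept.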
Making Scholarly Activity Available to the Masses: The Scaffolding of Scholarship Throughout the Undergraduate Curriculum
Florida Gulf Coast University’s Quality Enhancement Plan (QEP) focuses on improving student critical thinking, information literacy, and written communication. Rather than developing these skills through traditional methods (e.g., through senior-level, independent research), these learning outcomes are practiced through scholarly experiences. Traditional undergraduate scholarship manifests itself through terminal, senior capstone or research experiences. These, because of the economy of scale, typically reach a minority of students, often just honors students or those approached by faculty mentors. At FGCU, however, scholarly experiences are a part of the curriculum throughout the program of study, and scaffolded to build greater depth and sophistication. Presented here are examples from both a program in STEM (Marine Science) and the humanities (Music Performance).
Students in Marine Science receive their first exposure to the vetting of literature and expository scientific writing within their general education science courses. Students are presented with an exercise to evaluate the credibility of web-based literature using the CRAAP test. A semester-long writing assignment has them investigate an earth-process-related problem that has societal consequences. They review and evaluate the secondary literature, prepare a first draft that is critiqued, and then submit a final version while meeting a number of milestones along the way. Students enter the major’s curriculum through a course entitled “Scientific Process”, which introduces them to all aspects of scientific research and culminates with them writing and defending a research proposal they may eventually work to completion. Numerous courses at the upper-class level are designed as scholarly focused or enriched, a branding requiring that certain criteria are met. In these courses, students often participate in genuine collaborative research projects that can lead to student publication and enhance faculty productivity. Finally, as a senior, the capstone course requires that they produce a scholarly poster or oral presentation that is either given in the class or within a university forum.
Music Performance students’ experiences track towards demonstration of content mastery in the artifact of a senior recital. In this public display of scholarly achievement a student presents repertoire from major historical eras on his or her instrument or voice for an hour or more. Additionally the students complete a comprehensive document analyzing music in terms of performance practice (how and why certain music should be performed to meet historically appropriate creations and recreations). Students enter this major their freshman year after an audition process and immediately begin developing the skills required to demonstrate proficiency as professional musicians. Experiences performing in ensembles and in private lessons cultivate listening skills to make informed musical judgments. Theory courses develop students’ abilities to hear music with their eyes. Upper level courses require students to clearly articulate in writing their thoughts about music’s formal properties, why certain music requires particular performance considerations, and how to execute those performance requirements in their technique. The conundrum for collection of data is how to assess university-wide learning outcomes in the context of a performance. Without a tangible artifact, FGCU relies on artist teams to develop assessment procedures that accurately capture if students meet targets as demonstrated in performance.
Though too early for us to have extensive assessment data, anecdotal evidence suggests students enjoy this approach and are honing their skills within these learning outcomes. We anticipate these improvements will increase graduates’ life-long learning potential, as well as their competitiveness for employment and further education.
Storm-Generated Molluscan Thanatocoenosis along a Carbonate Paleoshoreline: Southern Eleuthera Island, The Bahamas
Concentrations of large mollusc shells in coastal deposits provide important information about the local malacofauna and potential transport agents, including extreme events [1-4]. Such accumulations are common in the rock record [5,6], with Quaternary examples serving as good time-averaged examples by combining aspects of both the modern biocoenoses and the fossil record. Death assemblages of local organisms (thanatocoenosis) and their preserved record (taphocoenosis) in carbonate settings, where the granulometric spectrum may be very limited (e.g., oolitic sand), can serve as important paleo-environmental indicators, especially when considered in combination with primary sedimentary structures (in outcrops or geophysical images) and in situ biogenic structures (trace fossils) [7]. Along prograded beach/dune ridge complexes (strandplains) [8], extensive accumulations of large nearshore mollusc shells are likely related to extreme events, such as intense storms [1]. This study reports on an anomalous accumulation of mostly juvenile conch shells (Aliger sp.) along one of the oldest (landwardmost) paleoshorelines of the Plum Creek Beach in Freetown, southern Eleuthera Island, The Bahamas (Fig. 1). Shell preservation is assessed using semi-quantitative taphonomic grades.
Cylindrical Mega-Voids in Quaternary Aeolianites, Little Exuma Island, The Bahamas: Georadar Response
In addition to karst features, tropical carbonates contain a wide range of smaller cylindrical voids (“pipes”) attributed to bioturbation, tree molds, or dissolution, among others. During geophysical investigation of Little Exuma Island, The Bahamas, several sites with enigmatic voids were investigated using high-frequency ground-penetrating radar (GPR) imaging.
The aim of the paper is to assess the feasibility of GPR to detect voids within lithified Holocene calcarenites of the Hannah Bay Member.
Mechanical Search: Multi-Step Retrieval of a Target Object Occluded by Clutter
When operating in unstructured environments such as warehouses, homes, and
retail centers, robots are frequently required to interactively search for and
retrieve specific objects from cluttered bins, shelves, or tables. Mechanical
Search describes the class of tasks where the goal is to locate and extract a
known target object. In this paper, we formalize Mechanical Search and study a
version where distractor objects are heaped over the target object in a bin.
The robot uses an RGBD perception system and control policies to iteratively
select, parameterize, and perform one of 3 actions -- push, suction, grasp --
until the target object is extracted, a time limit is exceeded, or no
high-confidence push or grasp is available. We present a study of 5 algorithmic
policies for mechanical search, with 15,000 simulated trials and 300 physical
trials for heaps ranging from 10 to 20 objects. Results suggest that success
can be achieved in this long-horizon task with algorithmic policies in over 95%
of instances and that the number of actions required scales approximately
linearly with the size of the heap. Code and supplementary material can be
found at http://ai.stanford.edu/mech-search .
Comment: To appear in IEEE International Conference on Robotics and Automation (ICRA), 2019. 9 pages with 4 figures.
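The iterative select/execute loop described in the abstract can be sketched as a simple skeleton: repeat until the target is extracted, the step budget is exhausted, or the policy abstains because no high-confidence action exists. The heap representation and the toy policy below are stand-ins; the paper's system uses an RGBD perception pipeline and learned/engineered action policies.

```python
ACTIONS = ("push", "suction", "grasp")

def mechanical_search(heap, target, policy, max_steps=20):
    """Skeleton of the Mechanical Search loop: select, parameterize, and
    perform an action until the target is extracted, the time (step) limit
    is exceeded, or the policy reports no high-confidence action."""
    for step in range(max_steps):
        action, obj = policy(heap, target)
        if action is None:          # no high-confidence push/grasp available
            return ("abstained", step)
        heap.remove(obj)            # simulate extracting the chosen object
        if obj == target:
            return ("extracted", step + 1)
    return ("timeout", max_steps)

# toy policy: always grasp the topmost (last) object in the heap
policy = lambda heap, tgt: ("grasp", heap[-1]) if heap else (None, None)
heap = ["mug", "box", "ball", "target_obj"]
result = mechanical_search(heap, "target_obj", policy)
print(result)   # → ('extracted', 1)
```

The reported finding that action count scales roughly linearly with heap size corresponds, in this skeleton, to how many loop iterations occur before the target surfaces.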
Modeling Dynamic Environments with Scene Graph Memory
Embodied AI agents that search for objects in large environments such as
households often need to make efficient decisions by predicting object
locations based on partial information. We pose this as a new type of link
prediction problem: link prediction on partially observable dynamic graphs. Our
graph is a representation of a scene in which rooms and objects are nodes, and
their relationships are encoded in the edges; only parts of the changing graph
are known to the agent at each timestep. This partial observability poses a
challenge to existing link prediction approaches, which we address. We propose
a novel state representation -- Scene Graph Memory (SGM) -- which captures the
agent's accumulated set of observations, as well as a neural net architecture
called a Node Edge Predictor (NEP) that extracts information from the SGM to
search efficiently. We evaluate our method in the Dynamic House Simulator, a
new benchmark that creates diverse dynamic graphs following the semantic
patterns typically seen in homes, and show that NEP can be trained to predict
the locations of objects in a variety of environments with diverse object
movement dynamics, outperforming baselines both in terms of new scene
adaptability and overall accuracy. The codebase and more can be found at
https://www.scenegraphmemory.com
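The core idea of accumulating partial observations and querying them to predict an object's location can be sketched minimally: store, per object, how often it has been observed at each location, and predict the highest-count location. The paper's SGM is a richer room/object graph and NEP is a learned neural predictor; the frequency count below is only a hypothetical stand-in for the learned link-prediction score.

```python
from collections import defaultdict

class SceneGraphMemory:
    """Toy sketch of an accumulated-observation memory: per-object counts
    of observed locations, queried for a most-likely-location prediction."""
    def __init__(self):
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe(self, obj, location):
        """Record one (partial) observation of obj at location."""
        self.counts[obj][location] += 1

    def predict(self, obj):
        """Return the most frequently observed location, or None if the
        object has never been observed (the partial-observability case)."""
        locs = self.counts.get(obj)
        if not locs:
            return None
        return max(locs, key=locs.get)

sgm = SceneGraphMemory()
for loc in ["kitchen", "kitchen", "living_room"]:
    sgm.observe("mug", loc)
print(sgm.predict("mug"))   # → kitchen
```

An agent using such a memory would query `predict` to prioritize where to search next, updating the memory with whatever it observes along the way; the dynamic-graph setting means stale counts must eventually be revised, which is where a learned predictor outperforms raw frequencies.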