Application of remote sensing for fishery resources assessment and monitoring
The author has identified the following significant results. The distribution and abundance of white marlin correlated with the chlorophyll, water temperature, and Secchi depth sea-truth measurements. Results of correlation analyses for dolphin were inconclusive. Prediction models for white marlin were developed using stepwise multiple regression and discriminant function analysis techniques, which demonstrated a potential for increasing the probability of game-fishing success. The S190A and B imagery was density-sliced and color-enhanced with white marlin locations superimposed on the image, but no density/white marlin relationship could be established.
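The abstract names discriminant function analysis as one of its prediction techniques. As a purely illustrative sketch — the study's actual data and coefficients are not reproduced here, and the synthetic numbers below are invented — a Fisher discriminant over the three sea-truth variables might look like this:

```python
import numpy as np

# Hypothetical sketch: classifying white marlin presence from the three
# sea-truth measurements named in the study (chlorophyll, water temperature,
# Secchi depth). All data below are synthetic, not the study's measurements.
rng = np.random.default_rng(0)

# Rows = [chlorophyll, temperature, Secchi depth]; two invented classes.
present = rng.normal([0.8, 26.0, 18.0], [0.1, 1.0, 2.0], size=(50, 3))
absent  = rng.normal([0.3, 22.0, 10.0], [0.1, 1.0, 2.0], size=(50, 3))

def fisher_direction(a, b):
    """Fisher discriminant direction w = Sw^{-1} (mu_a - mu_b)."""
    sw = np.cov(a.T) + np.cov(b.T)      # pooled within-class scatter
    return np.linalg.solve(sw, a.mean(0) - b.mean(0))

w = fisher_direction(present, absent)
# Decision threshold: midpoint of the two class-mean projections.
threshold = 0.5 * (present @ w).mean() + 0.5 * (absent @ w).mean()

def predict(x):
    return (x @ w) > threshold          # True = "marlin likely present"

accuracy = (predict(present).mean() + (~predict(absent)).mean()) / 2
```

On well-separated synthetic clusters like these the discriminant classifies nearly all samples correctly; real sea-truth data would of course overlap far more.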
SoK: On the Impossible Security of Very Large Foundation Models
Large machine learning models, or so-called foundation models, aim to serve
as base models for application-oriented machine learning. Although these models
showcase impressive performance, they have been empirically found to pose
serious security and privacy issues. We may however wonder if this is a
limitation of the current models, or if these issues stem from a fundamental
intrinsic impossibility of the foundation model learning problem itself. This
paper aims to systematize our knowledge supporting the latter. More precisely,
we identify several key features of today's foundation model learning problem
which, given the current understanding in adversarial machine learning, suggest
incompatibility of high accuracy with both security and privacy. We begin by
observing that high accuracy seems to require (1) very high-dimensional models
and (2) huge amounts of data that can only be procured through user-generated
datasets. Moreover, such data is fundamentally heterogeneous, as users
generally have very specific (easily identifiable) data-generating habits. More
importantly, users' data is filled with highly sensitive information and may be
heavily polluted by fake users. We then survey lower bounds on accuracy in
privacy-preserving and Byzantine-resilient heterogeneous learning that, we
argue, constitute a compelling case against the possibility of designing a
secure and privacy-preserving high-accuracy foundation model. We further stress
that our analysis also applies to other high-stakes machine learning
applications, including content recommendation. We conclude by calling for
measures to prioritize security and privacy, and to slow down the race for ever
larger models.
A multi-level predictive methodology for terminal area air traffic flow
Over the past few decades, the air transportation system has grown significantly. In particular, the number of passengers using air transportation has greatly increased. As the demand for air travel expands, airport departure/arrival demand is approaching capacity. In consequence, the level of delays increases, since the system capacity cannot manage the increased demand. With this trend, the national airspace system (NAS) will become saturated, and congestion at airports will become even more severe. As a result of congestion, a considerable number of flights experience delays. According to the Bureau of Transportation Statistics (BTS), over 1 million flights are operated in a year, and about twenty percent of all scheduled commercial flights are delayed more than 15 minutes. These delays cost billions of dollars annually for airlines, passengers, and the US economy. Therefore, this study seeks to find out why delays occur and to analyze the patterns in which they occur. Analysis of airport operations generally falls into a macro or micro perspective. From the macro point of view, very few details are considered, and delays are aggregated at the airport level. In particular, shortfalls in airport capacity and a capacity-demand imbalance are the primary causes of delays in this respect. From the micro perspective, each aircraft is modeled individually, and the causes of delays are reproduced as precisely as possible. Micro-level reasons for air traffic delays include inclement weather, mechanical problems, and operational issues. In this regard, this research proposes a methodology that can efficiently and practically predict macro- and micro-level air traffic flow in the terminal area. For a macro-level analysis of delays, artificial neural network models are proposed to predict the hourly airport capacity.
A multi-layer perceptron (MLP), a recurrent neural network (RNN), and a long short-term memory (LSTM) network are trained with historical weather and airport capacity data from Hartsfield-Jackson Atlanta International Airport (ATL). In the performance evaluation, the models presented decent predictive performance and successfully predicted the test data as well as the training data. On the other hand, Random Forests and AdaBoost are implemented in the micro-level modeling of the air traffic. The micro-level models, trained with on-time flight performance data and corresponding weather data, focus on the classification of individual flight delays. The models provide interpretability and imbalanced-data handling while achieving accuracy as good as existing methods. Lastly, the predictive model for individual flight delays is refined using the cost-proportionate rejection sampling (costing) method. With the integration of the costing method, general machine learning algorithms are converted into cost-sensitive classifiers. The cost-sensitive classifiers were able to account for asymmetric misclassification costs without losing their diagnostic functionality as binary classifiers. This study presents a data-driven approach to air traffic flow management that can effectively utilize air traffic data accumulated over decades. Through data analysis from the macro and micro perspectives, an integrated methodology for terminal air traffic flow prediction is provided. Accurate prediction of airport capacity and individual flight delays will assist stakeholders in making more informed decisions.
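The costing method named in this abstract is generally described as accepting each training example with probability proportional to its misclassification cost, training an ordinary learner on each accepted subsample, and averaging the resulting classifiers. A minimal sketch, assuming synthetic imbalanced data and a nearest-centroid base learner — both illustrative stand-ins, not the thesis's actual data or models:

```python
import numpy as np

# Hedged sketch of cost-proportionate rejection sampling ("costing").
# The data, the cost values, and the base learner are invented for
# illustration; the thesis's setup uses real flight and weather data.
rng = np.random.default_rng(1)

# Imbalanced synthetic "flights": label 1 = delayed (rare, costly to miss).
X = np.vstack([rng.normal(0.0, 1.0, (900, 2)), rng.normal(2.5, 1.0, (100, 2))])
y = np.array([0] * 900 + [1] * 100)
cost = np.where(y == 1, 9.0, 1.0)           # missing a delay costs 9x more

def nearest_centroid(Xtr, ytr):
    """Trivial base learner: assign the class of the nearer centroid."""
    c0, c1 = Xtr[ytr == 0].mean(0), Xtr[ytr == 1].mean(0)
    return lambda Z: (np.linalg.norm(Z - c1, axis=1)
                      < np.linalg.norm(Z - c0, axis=1)).astype(int)

def costing(X, y, cost, rounds=25):
    """Accept each example with probability cost / max(cost), train the
    base learner on each accepted subsample, and majority-vote."""
    models = []
    for _ in range(rounds):
        keep = rng.random(len(y)) < cost / cost.max()
        models.append(nearest_centroid(X[keep], y[keep]))
    return lambda Z: (np.mean([m(Z) for m in models], axis=0) >= 0.5).astype(int)

clf = costing(X, y, cost)
recall = clf(X[y == 1]).mean()              # sensitivity on the rare class
```

The key property is that the wrapper is learner-agnostic: any off-the-shelf binary classifier can be dropped in place of the nearest-centroid stub without modifying its training procedure.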
Effective Visualizations of the Uncertainty in Hurricane Forecasts
The track forecast cone developed by the U.S. National Hurricane Center is the one most universally adopted by the general public, the news media, and governmental officials to enhance viewers' understanding of the forecasts and their underlying uncertainties. However, current research has experimentally shown that it has limitations that result in misconceptions of the uncertainty included. Most importantly, the area covered by the cone tends to be misinterpreted as the region affected by the hurricane. In addition, the cone summarizes forecasts for the next three days into a single representation and, thus, makes it difficult for viewers to accurately determine crucial time-specific information. To address these limitations, this research develops novel alternative visualizations. It begins by developing a technique that generates and smoothly interpolates robust statistics from ensembles of hurricane predictions, thus creating visualizations that inherently include the spatial uncertainty by displaying three levels of positional storm strike risk at a specific point in time. To address the misconception of the area covered by the cone, this research develops time-specific visualizations depicting spatial information based on a sampling technique that selects a small, representative subset from an ensemble of points. It also allows depictions of such important storm characteristics as size and intensity. Further, this research generalizes the representative sampling framework to process ensembles of forecast tracks, selecting a subset of tracks accurately preserving the original distributions of available storm characteristics and keeping appropriately defined spatial separations. This framework supports an additional hurricane visualization portraying prediction uncertainties implicitly by directly showing the members of the subset without the visual clutter.
We collaborated on cognitive studies that suggest that these visualizations enhance viewers' ability to understand the forecasts because they are potentially interpreted more like uncertainty distributions. In addition to benefiting the field of hurricane forecasting, this research potentially enhances the visualization community more generally. For instance, the representative sampling framework for processing 2D points developed here can be applied to enhancing standard scatter plots and density plots by reducing the sizes of data sets. Further, as the idea of direct ensemble displays can possibly be extended to more general numerical simulations, it has potential impacts on a wide range of ensemble visualizations.
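The representative-sampling idea described above — selecting a small, spatially well-separated subset from an ensemble of 2D points — can be illustrated with greedy farthest-point sampling. This is a generic stand-in under invented data, not the dissertation's actual algorithm, which also preserves distributions of storm characteristics:

```python
import numpy as np

# Illustrative sketch only: pick a small, spread-out subset of 2D
# "storm position" ensemble members. Data are synthetic.
rng = np.random.default_rng(2)
ensemble = rng.normal(size=(500, 2))        # synthetic ensemble of positions

def farthest_point_subset(points, k):
    """Greedily pick k points, each maximizing its distance to the
    already-chosen set, which keeps the subset spatially separated."""
    # Seed with the point nearest the ensemble centroid.
    chosen = [int(np.argmin(np.linalg.norm(points - points.mean(0), axis=1)))]
    dist = np.linalg.norm(points - points[chosen[0]], axis=1)
    for _ in range(k - 1):
        nxt = int(np.argmax(dist))          # farthest from all chosen so far
        chosen.append(nxt)
        dist = np.minimum(dist, np.linalg.norm(points - points[nxt], axis=1))
    return points[chosen]

subset = farthest_point_subset(ensemble, 20)
```

Plotting the 20 selected members directly, instead of all 500, gives the kind of clutter-free direct ensemble display the dissertation argues for.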
Art and the artist in the literary works of Elsa Triolet
This thesis takes a representative selection of Triolet's works to study the themes of writing and creativity as they are presented in the novels. These are all portraits of artists and accounts of the search for a synthesis of aesthetic freedom and ethical responsibility. It considers Triolet's position as a foreign writer, adopting a new creative language within a different cultural
environment, to be essential to understanding her importance to the French literary tradition. By emphasising her formative years in the avant-garde circles of pre-revolutionary Russia, my study demonstrates her considerable contribution to the meeting of Russian and French aesthetic theories. I extend this with close textual
readings of certain works to demonstrate her techniques of novelistic construction, which reveal many Formalist practices before Formalist works in translation
exerted their official influence on creative methods.
The introduction considers the reasons for Triolet's neglect as a writer. It then considers various contemporary and recent critical appraisals, which indicate
the interest she has received to the present and allow me to define my own critical approach. Part One traces Triolet's literary evolution from her formative
years in Russia, through exile, to her first publications in Russian. It then considers her entry into French literary activity and her association with the schools of
socialist realism and the "nouveau roman".
Part Two examines two traditional novels which portray the creative and metaphorical roles of the artist and his work, showing the constant conflict between private and public lives. In Part Three, I show how aspects of novelistic
traditionalism are gradually foregrounded, so that the work develops a dual-sided character in which it both narrates and examines the processes of its own narration. In Part Four, this move to highly self-conscious aesthetics demonstrates an idiosyncratic exploration of new paths for the novel that bring visual, auditory and cinematographic media into the traditional domain of written art. Accompanying
this highly post-modernist experimentation, I show how this research within the novel into the novel's own future has an ethical and redemptive purpose, whose final conclusion is that creativity and human freedom are inexorably interwoven.
Obsessions Semblables: The Creation of Two American Gothic Authors in the French Imagination
Senior Project submitted to The Division of Languages and Literature of Bard College
Model of models -- Part 1
This paper proposes a new cognitive model, acting as the main component of an
AGI agent. The model is introduced in its mature intelligence state, and as an
extension of previous models, DENN, and especially AKREM, by including
operational models (frames/classes) and will. This model's core assumption is
that cognition is about operating on accumulated knowledge, with the guidance
of an appropriate will. Also, we assume that the actions, part of knowledge,
are learned to be aligned with will during the evolution phase that precedes
the mature intelligence state. In addition, this model is mainly based on the
duality principle in every known intelligent aspect, such as exhibiting both
top-down and bottom-up model learning, generalization versus specialization, and
more. Furthermore, a holistic approach is advocated for AGI design, and
cognition under constraints or efficiency is proposed, in the form of
reusability and simplicity. Finally, reaching this mature state is described
via a cognitive evolution from infancy to adulthood, utilizing a consolidation
principle. The final product of this cognitive model is a dynamic operational
memory of models and instances. Lastly, some examples and preliminary ideas for
the evolution phase to reach the mature state are presented.