23,576 research outputs found
A Minimal Architecture for General Cognition
A minimalistic cognitive architecture called MANIC is presented. The MANIC
architecture requires only three function-approximating models and one state
machine. Even with so few major components, it is theoretically sufficient to
achieve functional equivalence with all other cognitive architectures, and can
be practically trained. Instead of seeking to transfer architectural
inspiration from biology into artificial intelligence, MANIC seeks to minimize
novelty and follow the most well-established constructs that have evolved
within various sub-fields of data science. From this perspective, MANIC offers
an alternate approach to a long-standing objective of artificial intelligence.
This paper provides a theoretical analysis of the MANIC architecture.
Comment: 8 pages, 8 figures, conference, Proceedings of the 2015 International Joint Conference on Neural Networks.
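A hypothetical skeleton of what "three function-approximating models and one state machine" might look like in code; the roles assigned to the models (observation, transition, utility) and the two-mode state machine are illustrative assumptions, not details taken from the abstract.

```python
# Hypothetical sketch only: model roles and the state machine are assumptions.
import numpy as np


class Approximator:
    """Stand-in for any trainable regressor (e.g. a small neural network)."""

    def __init__(self, out_dim: int):
        self.out_dim = out_dim

    def predict(self, x: np.ndarray) -> np.ndarray:
        return np.zeros(self.out_dim)  # placeholder; would be learned from experience


class MinimalAgent:
    def __init__(self, state_dim: int = 4):
        self.observation_model = Approximator(state_dim)  # observation -> belief state
        self.transition_model = Approximator(state_dim)   # (state, action) -> next state
        self.utility_model = Approximator(1)              # state -> estimated utility
        self.mode = "observe"                             # the single state machine
        self.state = np.zeros(state_dim)

    def step(self, observation, candidate_actions):
        if self.mode == "observe":
            self.state = self.observation_model.predict(observation)
            self.mode = "act"
            return None  # first phase: just update beliefs
        # "act" phase: score candidate actions by imagining one step ahead.
        scores = [
            self.utility_model.predict(
                self.transition_model.predict(np.concatenate([self.state, a])))[0]
            for a in candidate_actions
        ]
        self.mode = "observe"
        return candidate_actions[int(np.argmax(scores))]
```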
Missing Value Imputation With Unsupervised Backpropagation
Many data mining and data analysis techniques operate on dense matrices or
complete tables of data. Real-world data sets, however, often contain unknown
values. Even many classification algorithms that are designed to operate with
missing values still exhibit deteriorated accuracy. One approach to handling
missing values is to fill in (impute) the missing values. In this paper, we
present a technique for unsupervised learning called Unsupervised
Backpropagation (UBP), which trains a multi-layer perceptron to fit the
manifold sampled by a set of observed point-vectors. We evaluate UBP with the
task of imputing missing values in datasets, and show that UBP is able to
predict missing values with significantly lower sum-squared error than other
collaborative filtering and imputation techniques. We also demonstrate with 24
datasets and 9 supervised learning algorithms that classification accuracy is
usually higher when randomly-withheld values are imputed using UBP, rather than
with other methods.
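A minimal sketch of the UBP-style idea, assuming the usual setup of learning one latent vector per row jointly with a small decoder by gradient descent on the observed entries only; the function name, network size, and hyperparameters below are illustrative, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def ubp_like_impute(X, mask, k=3, hidden=16, lr=0.01, epochs=2000):
    """Fill missing entries of X (mask is True where a value is observed).

    Learns one latent vector per row plus a one-hidden-layer decoder by
    minimizing squared error on the observed entries only.
    """
    n, d = X.shape
    V = 0.1 * rng.standard_normal((n, k))                   # per-row latent inputs
    W1 = 0.1 * rng.standard_normal((k, hidden)); b1 = np.zeros(hidden)
    W2 = 0.1 * rng.standard_normal((hidden, d)); b2 = np.zeros(d)
    Xf = np.where(mask, X, 0.0)                             # zero out missing cells

    for _ in range(epochs):
        H = np.tanh(V @ W1 + b1)                            # (n, hidden)
        Y = H @ W2 + b2                                     # (n, d) reconstruction
        err = np.where(mask, Y - Xf, 0.0)                   # error on observed cells only
        # Backpropagate through the decoder and into the latent vectors.
        gW2 = H.T @ err; gb2 = err.sum(axis=0)
        gH = err @ W2.T * (1 - H**2)
        gW1 = V.T @ gH; gb1 = gH.sum(axis=0)
        gV = gH @ W1.T
        for p, g in ((W2, gW2), (b2, gb2), (W1, gW1), (b1, gb1), (V, gV)):
            p -= lr * g / n
    Y = np.tanh(V @ W1 + b1) @ W2 + b2
    return np.where(mask, X, Y)                             # keep observed values, impute the rest
```

With missing cells stored as NaN, `X_filled = ubp_like_impute(X, ~np.isnan(X))` would return the matrix with observed cells kept and missing cells predicted.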
The Restructuring and Privatisation of British Rail: Was it really that bad?
Following the government's decision to place Railtrack into administration (October 2001), attention has focused on what went wrong with privatisation, and how crucial network investment will be financed in future. This paper uses a social cost-benefit analysis framework to assess whether the restructuring and privatisation of British Rail has produced savings in operating costs. The paper shows that major efficiencies have been achieved, consumers have benefited through lower prices, whilst the increased government subsidy has been largely recouped through privatisation proceeds. We find that output quality has also improved (pre-Hatfield). The achievement of further savings will be key to delivering improved rail services in the future. This paper finds that a privatised structure, where shareholders demand a return on their investment, has led to significant improvements in operating efficiency - it remains to be seen whether the new regime, with a not-for-profit infrastructure owner, will deliver the same efficiency improvements.
Big Bang Nucleosynthesis with Independent Neutrino Distribution Functions
We have performed new Big Bang Nucleosynthesis calculations which employ
arbitrarily-specified, time-dependent neutrino and antineutrino distribution
functions for each of up to four neutrino flavors. We self-consistently couple
these distributions to the thermodynamics, the expansion rate and scale
factor-time/temperature relationship, as well as to all relevant weak,
electromagnetic, and strong nuclear reaction processes in the early universe.
With this approach, we can treat any scenario in which neutrino or antineutrino
spectral distortion might arise. These scenarios might include, for example,
decaying particles, active-sterile neutrino oscillations, and active-active
neutrino oscillations in the presence of significant lepton numbers. Our
calculations allow lepton numbers and sterile neutrinos to be constrained with
observationally-determined primordial helium and deuterium abundances. We have
modified a standard BBN code to perform these calculations and have made it
available to the community.
Comment: 9 pages, 5 figures.
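As a rough illustration of the kind of input such a treatment allows (not the released code's interface), the sketch below integrates an arbitrarily specified, possibly distorted, occupation function to obtain the energy density of one massless neutrino species, the sort of quantity that feeds the expansion rate; the function names and the degeneracy-parameter example are assumptions.

```python
import numpy as np
from scipy.integrate import quad

def neutrino_energy_density(f, T):
    """Energy density of one massless neutrino species, natural units (hbar = c = kB = 1).

    rho = (T^4 / (2*pi^2)) * integral_0^inf x^3 f(x) dx, with x = epsilon / T.
    `f` is an arbitrary occupation function of x, e.g. a distorted spectrum.
    """
    integral, _ = quad(lambda x: x**3 * f(x), 0.0, np.inf)
    return T**4 / (2.0 * np.pi**2) * integral

# Example: Fermi-Dirac occupation with a degeneracy parameter xi = mu/T (assumed form).
fermi_dirac = lambda x, xi=0.0: 1.0 / (np.exp(x - xi) + 1.0)

rho_standard = neutrino_energy_density(fermi_dirac, T=1.0)
rho_degenerate = neutrino_energy_density(lambda x: fermi_dirac(x, xi=0.5), T=1.0)
```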
The durability of oral diabetic medications: Time to A1c baseline and a review of common oral medications used by the primary care provider.
Introduction: The cost of generic medications has risen more in the past few years than at any other time in history. While medical insurance covers much of these costs, health care professionals can better select medications that have the longest duration of action compared with placebo-treated controls. This will save health care costs and improve prescribing accuracy. Methods: Papers were identified in PubMed with keywords including placebo. Studies had to be at least 2 years in length to evaluate the change in A1c over time. The primary endpoint was time to A1c neutrality (return of A1c to baseline at the maximum dose of a single oral agent). A medication was considered at neutrality if the 95% CI crossed baseline. Time to neutrality was averaged for each medication within its class and then summarized as a class effect. Results: The effective duration of therapy for the DPP-4 and sulfonylurea classes of medications is 3-4 years, compared with a 5-year time to A1c neutrality for metformin. In comparison, the projected time to A1c neutrality was approximately 6-8 years for rosiglitazone and pioglitazone. While only a few studies have been published for the SGLT-2 class of medications, the time to A1c neutrality was also 6-8 years with canagliflozin and the full dosage of empagliflozin. Conclusion: Metformin appears to have a 5-year duration of effect before the A1c returns to baseline. The sulfonylurea and DPP-4 inhibitor classes of medications have among the shortest durability, ranging from 3.3 to 4.4 years. In contrast, the SGLT-2 and TZD classes of medications have a projected time to A1c neutrality of 6-8 years. The durability of diabetic therapy as compared to placebo should be listed with the medications tested so that the provider can choose wisely.
The Impact of Nuclear Reaction Rate Uncertainties on Evolutionary Studies of the Nova Outburst
The observable consequences of a nova outburst depend sensitively on the
details of the thermonuclear runaway which initiates the outburst. One of the
more important sources of uncertainty is the nuclear reaction data used as
input for the evolutionary calculations. A recent paper by Starrfield, Truran,
Wiescher, & Sparks (1998) has demonstrated that changes in the reaction rate
library used within a nova simulation have significant effects, not just on the
production of individual isotopes (which can change by an order of magnitude),
but on global observables such as the peak luminosity and the amount of mass
ejected. We present preliminary results of systematic analyses of the impact of
reaction rate uncertainties on nova nucleosynthesis.
Comment: 4 pages, 3 figures. To appear in "Cosmic Explosions", proceedings of the 10th Annual October Astrophysics Conference in Maryland (ed. S. S. Holt and W. W. Zhang).
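As a generic illustration of how rate uncertainties can be propagated into abundances (not the authors' method), the sketch below samples a rate multiplier from a log-normal uncertainty factor and pushes it through a toy one-reaction burn; all names and numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def toy_yield(rate, t_burn=1.0, y0=1.0):
    """Toy one-reaction burn: dY/dt = -rate * Y, so Y(t) = y0 * exp(-rate * t)."""
    return y0 * np.exp(-rate * t_burn)

# Sample rate multipliers from a log-normal "uncertainty factor" (illustrative value).
median_rate, uncertainty_factor, n_samples = 1.0, 2.0, 10_000
multipliers = np.exp(rng.normal(0.0, np.log(uncertainty_factor), n_samples))
yields = toy_yield(median_rate * multipliers)

print(f"final abundance spans {yields.min():.3f} to {yields.max():.3f} "
      f"(median {np.median(yields):.3f})")
```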
A Workforce Divided: Community, Labor, and the State in Saint-Nazaire's Shipbuilding Industry, 1880-1910, by Leslie A. Schuster
A review of A Workforce Divided: Community, Labor, and the State in Saint-Nazaire's Shipbuilding Industry, 1880-1910, by Leslie A. Schuster.
The Glassworkers of Carmaux: French Craftsmen and Political Action in a 19th-Century City, by Joan Wallach Scott
A review of The Glassworkers of Carmaux: French Craftsmen and Political Action in a 19th-Century City, by Joan Wallach Scott.
The Alliance of Iron and Wheat in the Third French Republic, 1860-1914: Origins of the New Conservatism, by Herman Lebovics
A review of The Alliance of Iron and Wheat in the Third French Republic, 1860-1914: Origins of the New Conservatism, by Herman Lebovics.