Surprises in High-Dimensional Ridgeless Least Squares Interpolation
Interpolators -- estimators that achieve zero training error -- have
attracted growing attention in machine learning, mainly because
state-of-the-art neural networks appear to be models of this type. In this
paper, we study
minimum norm (``ridgeless'') interpolation in high-dimensional least
squares regression. We consider two different models for the feature
distribution: a linear model, where the feature vectors $x_i \in \mathbb{R}^p$
are obtained by applying a linear transform to a vector of i.i.d.\ entries,
$x_i = \Sigma^{1/2} z_i$ (with $z_i \in \mathbb{R}^p$); and a nonlinear model,
where the feature vectors are obtained by passing the input through a random
one-layer neural network, $x_i = \varphi(W z_i)$ (with $z_i \in \mathbb{R}^d$,
$W \in \mathbb{R}^{p \times d}$ a matrix of i.i.d.\ entries, and $\varphi$ an
activation function acting componentwise on $W z_i$). We recover -- in a
precise quantitative way -- several phenomena that have been observed in
large-scale neural networks and kernel machines, including the "double descent"
behavior of the prediction risk, and the potential benefits of
overparametrization.
Comment: 68 pages; 16 figures. This revision contains a non-asymptotic version
of earlier results, and results for general coefficients.
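To make the setup concrete, here is a minimal numerical sketch (not the paper's experiments; the isotropic features, dimensions, and noise level are illustrative assumptions): the min-norm interpolator is $X^+ y$, with $X^+$ the Moore-Penrose pseudoinverse, and sweeping the number of features $p$ past the sample size $n$ traces the double descent shape of the risk.

import numpy as np

rng = np.random.default_rng(0)
n, p_max, sigma = 100, 300, 0.5
beta_star = rng.normal(size=p_max) / np.sqrt(p_max)  # assumed ground truth

Z = rng.normal(size=(n, p_max))            # i.i.d. feature entries (Sigma = I)
y = Z @ beta_star + sigma * rng.normal(size=n)

for p in (25, 50, 90, 100, 110, 200, 300):
    X = Z[:, :p]
    beta_hat = np.linalg.pinv(X) @ y       # min-norm ("ridgeless") solution
    pad = np.zeros(p_max)
    pad[:p] = beta_hat
    # Out-of-sample risk for isotropic test features, plus the noise floor.
    risk = np.sum((pad - beta_star) ** 2) + sigma ** 2
    print(f"p={p:4d}  risk={risk:.3f}")    # risk spikes near p = n, then descends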
Connections Between Adaptive Control and Optimization in Machine Learning
This paper demonstrates many immediate connections between adaptive control
and optimization methods commonly employed in machine learning. Starting from
common output error formulations, similarities in update law modifications are
examined. Concepts in stability, performance, and learning common to both
fields are then discussed. Building on the similarities in update laws and
common concepts, new intersections and opportunities for improved algorithm
analysis are provided. In particular, a specific problem related to higher
order learning is solved through insights obtained from these intersections.
Comment: 18 pages
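One standard way to see the update-law parallel the abstract builds on (a sketch for a linear-in-parameters output-error model, not necessarily the paper's exact formulation): the classic gradient adaptive law is continuous-time gradient flow on the instantaneous squared output error, and stochastic gradient descent is its Euler discretization.

e(t) = \theta(t)^\top \varphi(t) - y(t), \qquad L(\theta) = \tfrac{1}{2}\, e(t)^2,
\dot{\theta}(t) = -\gamma\, \nabla_\theta L = -\gamma\, e(t)\, \varphi(t) \quad \text{(gradient adaptive law)},
\theta_{k+1} = \theta_k - \eta\, e_k\, \varphi_k \quad \text{(Euler discretization: SGD/LMS)}.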
Towards an Early Software Estimation Using Log-Linear Regression and a Multilayer Perceptron Model
Software estimation is a tedious and daunting task in project management and software development. Software estimators are notoriously poor at predicting software effort and have struggled over the past decades to provide new models that enhance software estimation. The most critical part of software estimation comes in the early stages of the software life cycle, when the problem to be solved has not yet been completely revealed. This paper presents a novel log-linear regression model based on the use case point model (UCP) to calculate software effort from use case diagrams. A fuzzy logic approach is used to calibrate the productivity factor in the regression model. Moreover, a multilayer perceptron (MLP) neural network model was developed to predict software effort based on software size and team productivity. Experiments show that the proposed approach outperforms the original UCP model. Furthermore, a comparison between the MLP and log-linear regression models was conducted based on project size. Results demonstrate that the MLP model can surpass the regression model on small projects, but the log-linear regression model gives better results when estimating larger projects.
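The abstract does not give the exact regression form, but a common log-linear effort model takes Effort = a * UCP^b, fit by ordinary least squares in log space. A minimal sketch with hypothetical (UCP, person-hours) pairs, not the paper's data:

import numpy as np

# Hypothetical (use case points, person-hours) pairs for illustration only.
ucp    = np.array([120.0, 180.0, 240.0, 310.0, 400.0])
effort = np.array([2400.0, 3900.0, 5200.0, 7100.0, 9800.0])

# Fit log(Effort) = b0 + b1 * log(UCP) by ordinary least squares.
A = np.column_stack([np.ones_like(ucp), np.log(ucp)])
b0, b1 = np.linalg.lstsq(A, np.log(effort), rcond=None)[0]

print(f"Effort ~= {np.exp(b0):.2f} * UCP^{b1:.3f}")
print("Predicted effort for UCP=250:", np.exp(b0) * 250.0 ** b1)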
Sobolev Acceleration and Statistical Optimality for Learning Elliptic Equations via Gradient Descent
In this paper, we study the statistical limits in terms of Sobolev norms of
gradient descent for solving inverse problems from randomly sampled noisy
observations using a general class of objective functions. Our class of
objective functions includes Sobolev training for kernel regression, Deep Ritz
Methods (DRM), and Physics-Informed Neural Networks (PINN) for solving elliptic
partial differential equations (PDEs) as special cases. We consider a
potentially infinite-dimensional parameterization of our model using a suitable
Reproducing Kernel Hilbert Space and a continuous parameterization of problem
hardness through the definition of kernel integral operators. We prove that
gradient descent over this objective function can also achieve statistical
optimality, and that the optimal number of passes over the data increases with
the sample size. Based on our theory, we explain the implicit acceleration
afforded by using a Sobolev norm as the training objective, inferring that the
optimal number of epochs for DRM becomes larger than that for PINN as both the
data size and the hardness of the task increase, although both DRM and PINN
can achieve statistical optimality.
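For the model elliptic problem $-\Delta u^* = f$ on $\Omega$ with $u^* = 0$ on $\partial\Omega$, the two PDE objectives named in the abstract take their standard forms below (a sketch; the paper itself works with randomly sampled noisy observations and an RKHS parameterization, and $\lambda$ is a boundary-penalty weight):

E_{\mathrm{DRM}}(u) = \int_\Omega \Big( \tfrac{1}{2}\, |\nabla u(x)|^2 - f(x)\, u(x) \Big)\, dx,
L_{\mathrm{PINN}}(u) = \int_\Omega \big( \Delta u(x) + f(x) \big)^2\, dx + \lambda \int_{\partial\Omega} u(x)^2\, ds.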
Comparative Analysis of Software Effort Estimation Using Data Mining Technique and Feature Selection
Software development involves several interrelated factors that influence development effort and productivity. Improving the estimation techniques available to project managers will facilitate more effective time and budget control in software development. Software effort estimation, or software cost/effort estimation, can help a software development company overcome the difficulties experienced in estimating software development effort. This study compares the machine learning methods of Linear Regression (LR), Multilayer Perceptron (MLP), Radial Basis Function (RBF), and Decision Tree Random Forest (DTRF) for calculating estimated software cost/effort. These approaches are tested on 10 datasets of software development projects, producing new knowledge about which methods are the most accurate for software effort estimation, and about whether using Particle Swarm Optimization (PSO) for attribute selection increases the accuracy of the estimates. The data mining algorithm that produced the most optimal software effort estimate is Linear Regression, with an average RMSE of 1603.024 over the 10 datasets tested. Using PSO feature selection increases the accuracy, reducing the average RMSE to 1552.999. This result indicates that, compared with the original linear regression model, the accuracy of software effort estimation improved by 3.12% by applying PSO feature selection.
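A minimal sketch of the reported comparison (synthetic data and illustrative PSO constants, not the study's ten datasets or settings): linear regression with all features versus a feature subset chosen by a simple binary PSO, scored by holdout RMSE.

import numpy as np

rng = np.random.default_rng(1)
n, d = 60, 8
X = rng.normal(size=(n, d))
# Hypothetical effort target: only three of the eight features are informative.
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + X[:, 5] + 0.3 * rng.normal(size=n)

def holdout_rmse(mask):
    """Fit linear regression on the first half, return RMSE on the second half."""
    if not mask.any():
        return np.inf
    A = np.column_stack([np.ones(30), X[:30, mask]])
    coef, *_ = np.linalg.lstsq(A, y[:30], rcond=None)
    B = np.column_stack([np.ones(30), X[30:, mask]])
    return float(np.sqrt(np.mean((B @ coef - y[30:]) ** 2)))

# Binary PSO (Kennedy-Eberhart style): velocities are real-valued,
# positions are bit masks sampled through a sigmoid of the velocity.
P, iters = 12, 40
pos = rng.random((P, d)) > 0.5
vel = rng.normal(scale=0.1, size=(P, d))
pbest = pos.copy()
pbest_f = np.array([holdout_rmse(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()

for _ in range(iters):
    r1, r2 = rng.random((P, d)), rng.random((P, d))
    vel = (0.7 * vel
           + 1.5 * r1 * (pbest.astype(float) - pos.astype(float))
           + 1.5 * r2 * (gbest.astype(float) - pos.astype(float)))
    pos = rng.random((P, d)) < 1.0 / (1.0 + np.exp(-vel))
    f = np.array([holdout_rmse(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[np.argmin(pbest_f)].copy()

print("RMSE, all features:", holdout_rmse(np.ones(d, dtype=bool)))
print("RMSE, PSO subset  :", pbest_f.min(), "mask:", gbest.astype(int))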