CT-SRCNN: Cascade Trained and Trimmed Deep Convolutional Neural Networks for Image Super Resolution
We propose methodologies to train highly accurate and efficient deep
convolutional neural networks (CNNs) for image super resolution (SR). A cascade
training approach to deep learning is proposed to improve the accuracy of the
neural networks while gradually increasing the number of network layers. Next,
we explore how to improve the SR efficiency by making the network slimmer. Two
methodologies, the one-shot trimming and the cascade trimming, are proposed.
With the cascade trimming, the network's size is gradually reduced layer by
layer, without significant loss on its discriminative ability. Experiments on
benchmark image datasets show that our proposed SR network achieves the
state-of-the-art super resolution accuracy while being more than 4 times
faster than existing deep super resolution networks.
Comment: Accepted to IEEE Winter Conf. on Applications of Computer Vision (WACV) 2018, Lake Tahoe, US
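The cascade-trimming idea described above — slimming the network layer by layer rather than all at once — can be sketched in plain NumPy. The layer widths, the 50% keep ratio, and the L1-norm filter-importance criterion below are illustrative assumptions, not the paper's exact procedure, and the fine-tuning between trimming stages is indicated only as a comment.

```python
import numpy as np

def trim_layer(weights, layer_idx, keep_ratio):
    """Drop the lowest-L1-norm output filters of one layer and the
    matching input channels of the next layer (illustrative sketch)."""
    w = weights[layer_idx]                      # shape: (out_ch, in_ch)
    importance = np.abs(w).sum(axis=1)          # L1 norm per output filter
    n_keep = max(1, int(len(importance) * keep_ratio))
    keep = np.sort(np.argsort(importance)[-n_keep:])
    weights[layer_idx] = w[keep, :]             # remove whole filters
    if layer_idx + 1 < len(weights):            # fix up the next layer's inputs
        weights[layer_idx + 1] = weights[layer_idx + 1][:, keep]
    return weights

def cascade_trim(weights, keep_ratio=0.5):
    """Trim the network one layer at a time, back to front; in the real
    method each trimming stage would be followed by fine-tuning."""
    for idx in reversed(range(len(weights) - 1)):  # leave the output layer intact
        weights = trim_layer(weights, idx, keep_ratio)
        # (fine-tune the slimmed network here before trimming the next layer)
    return weights

rng = np.random.default_rng(0)
net = [rng.standard_normal((64, 1)),    # hypothetical 3-layer SR network
       rng.standard_normal((32, 64)),
       rng.standard_normal((1, 32))]
slim = cascade_trim(net, keep_ratio=0.5)
print([w.shape for w in slim])          # [(32, 1), (16, 32), (1, 16)]
```

Because each stage removes only one layer's filters and immediately repairs the next layer's input channels, the network stays a valid (if smaller) model after every step, which is what lets the real method interleave fine-tuning and avoid a large accuracy drop.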
A note on maximal operators for the Schr\"{o}dinger equation on the one-dimensional torus
Motivated by the study of the maximal operator for the Schr\"{o}dinger
equation on the one-dimensional torus, it is conjectured that a certain maximal
bound holds for any complex sequence [the displayed formulas are missing from this
extraction]. In this note, we show that if we replace the sequence by an arbitrary
sequence with only some convexity properties, then the conjectured bound holds. We
further show that this bound is sharp up to a factor.
Comment: 13 pages
TinyReptile: TinyML with Federated Meta-Learning
Tiny machine learning (TinyML) is a rapidly growing field aiming to
democratize machine learning (ML) for resource-constrained microcontrollers
(MCUs). Given the pervasiveness of these tiny devices, it is natural to ask
whether TinyML applications can benefit from aggregating their knowledge.
Federated learning (FL) enables decentralized agents to jointly learn a global
model without sharing sensitive local data. However, a common global model may
not work for all devices due to the complexity of the actual deployment
environment and the heterogeneity of the data available on each device. In
addition, the deployment of TinyML hardware has significant computational and
communication constraints, which traditional ML fails to address. Considering
these challenges, we propose TinyReptile, a simple but efficient algorithm
inspired by meta-learning and online learning, to collaboratively learn a solid
initialization for a neural network (NN) across tiny devices that can be
quickly adapted to a new device with respect to its data. We demonstrate
TinyReptile on Raspberry Pi 4 and Cortex-M4 MCU with only 256-KB RAM. The
evaluations on various TinyML use cases confirm that TinyReptile reduces
resource consumption and training time by at least a factor of two compared
with baseline algorithms, while achieving comparable performance.
Comment: Accepted by the International Joint Conference on Neural Networks (IJCNN) 202
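The Reptile-style federated update that the name suggests can be sketched as follows: each device runs a few plain SGD steps on its local data, and the shared initialization is nudged toward the adapted weights. The linear model, device data, and step sizes below are illustrative assumptions, not TinyReptile's actual configuration.

```python
import numpy as np

def local_sgd(w, X, y, lr=0.05, steps=10):
    """A few plain SGD steps on one device's local data (linear model,
    squared loss) -- stands in for on-device training."""
    for _ in range(steps):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def tiny_reptile(devices, dim, rounds=200, outer_lr=0.1):
    """Reptile-style outer loop: visit one device at a time and move the
    shared initialization toward that device's adapted weights."""
    w = np.zeros(dim)
    rng = np.random.default_rng(1)
    for _ in range(rounds):
        X, y = devices[rng.integers(len(devices))]
        w_adapted = local_sgd(w.copy(), X, y)
        w = w + outer_lr * (w_adapted - w)      # Reptile meta-update
    return w

# Two hypothetical devices whose local data share a common underlying signal.
rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0])
devices = []
for _ in range(2):
    X = rng.standard_normal((20, 2))
    y = X @ true_w + 0.1 * rng.standard_normal(20)
    devices.append((X, y))

w_init = tiny_reptile(devices, dim=2)
print(np.round(w_init, 1))   # should land near the shared signal [1.0, -2.0]
```

Note that only model weights ever leave a device — raw local data stays put, which is the federated-learning property the abstract relies on; the sequential, one-device-at-a-time visits are what make the scheme cheap enough for MCU-class hardware.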
A Semi-Parametric Model Simultaneously Handling Unmeasured Confounding, Informative Cluster Size, and Truncation by Death with a Data Application in Medicare Claims
Nearly 300,000 older adults experience a hip fracture every year, the
majority of which occur following a fall. Unfortunately, recovery after
fall-related trauma such as hip fracture is poor, where older adults diagnosed
with Alzheimer's Disease and Related Dementia (ADRD) spend a particularly long
time in hospitals or rehabilitation facilities during the post-operative
recuperation period. Because older adults value functional recovery and
spending time at home versus facilities as key outcomes after hospitalization,
identifying factors that influence days spent at home after hospitalization is
imperative. While several individual-level factors have been identified, the
characteristics of the treating hospital have recently been identified as
contributors. However, few methodologically rigorous approaches are available to
help overcome potential sources of bias such as hospital-level unmeasured
confounders, informative hospital size, and loss to follow-up due to death.
This article develops a useful tool equipped with unsupervised learning to
simultaneously handle statistical complexities that are often encountered in
health services research, especially when using large administrative claims
databases. The proposed estimator has a closed form and thus requires only a
light computational load even in a large-scale study. We further derive its
asymptotic properties, which can be used for statistical inference in practice.
Extensive simulation studies demonstrate the superiority of the proposed
estimator over existing estimators.
Comment: Contact Email: [email protected]
…