PredNet and Predictive Coding: A Critical Review
PredNet, a deep predictive coding network developed by Lotter et al.,
combines a biologically inspired architecture based on the propagation of
prediction error with self-supervised representation learning in video. While
the architecture has attracted considerable attention and various extensions of the
model exist, a critical analysis is still lacking. We fill this gap by
evaluating PredNet both as an implementation of the predictive coding theory
and as a self-supervised video prediction model using a challenging video
action classification dataset. We design an extended model to test if
conditioning future frame predictions on the action class of the video improves
the model performance. We show that PredNet does not yet completely follow the
principles of predictive coding. The proposed top-down conditioning leads to a
performance gain on synthetic data, but does not scale up to the more complex
real-world action classification dataset. Our analysis is aimed at guiding
future research on similar architectures based on the predictive coding theory.
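The core mechanism the abstract refers to, propagating prediction errors rather than raw activations, can be illustrated with a short sketch. This is a schematic of the idea, not Lotter et al.'s implementation; all variable names, shapes, and the weight initialization are illustrative assumptions.

```python
import numpy as np

# Schematic sketch of the error-propagation idea behind PredNet-style
# models (NOT Lotter et al.'s actual architecture): each layer predicts
# its input from its internal state and forwards only the prediction
# error, split into positive and negative channels, to the next layer.

rng = np.random.default_rng(0)

def layer_step(x, state, W_pred):
    """One layer: predict the input, return the split prediction error."""
    prediction = np.tanh(W_pred @ state)    # top-down prediction of x
    error = x - prediction                  # prediction error
    # PredNet-style models keep positive and negative error channels
    return np.concatenate([np.maximum(error, 0), np.maximum(-error, 0)])

x = rng.standard_normal(8)                  # input (e.g. a frame patch)
state = rng.standard_normal(4)              # hypothetical recurrent state
W_pred = rng.standard_normal((8, 4)) * 0.1  # hypothetical prediction weights

err = layer_step(x, state, W_pred)
print(err.shape)                            # twice the input dimension
```

Only `err`, not `x` itself, would feed the layer above, which is what makes the representation error-driven rather than stimulus-driven.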
PredProp: Bidirectional Stochastic Optimization with Precision Weighted Predictive Coding
We present PredProp, a method for bidirectional, parallel and local
optimization of weights, activities and precision in neural networks. PredProp
jointly addresses inference and learning, scales learning rates dynamically and
weights gradients by the curvature of the loss function by optimizing
prediction error precision. PredProp optimizes network parameters with
Stochastic Gradient Descent and error forward propagation based strictly on
prediction errors and variables locally available to each layer. Neighboring
layers optimize shared activity variables so that prediction errors can
propagate forward in the network, while predictions propagate backwards. This
process minimizes the variational Free Energy, i.e. maximizes the evidence
lower bound, of the entire network. We show that networks trained with PredProp resemble gradient
based predictive coding when the number of weights between neighboring activity
variables is one. In contrast to related work, PredProp generalizes towards
backward connections of arbitrary depth and optimizes precision for any deep
network architecture. Due to the analogy between prediction error precision and
the Fisher information for each layer, PredProp implements a form of Natural
Gradient Descent. When optimizing DNN models, layer-wise PredProp renders the
model a bidirectional predictive coding network. Alternatively, DNNs can
parameterize the weights between two activity variables. We evaluate PredProp
for dense DNNs on simple inference, learning and combined tasks. We show that,
without an explicit sampling step in the network, PredProp implements a form of
variational inference that enables disentangled embeddings to be learned from
small amounts of data. We leave evaluation on more complex tasks and datasets
to future work.
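The precision-weighted, local inference step the abstract describes can be sketched in a few lines. This is a minimal illustration of gradient-based predictive coding inference in the spirit of PredProp, not the paper's implementation; the network size, precisions, learning rate, and iteration count are all illustrative assumptions.

```python
import numpy as np

# Minimal sketch of precision-weighted, gradient-based predictive coding
# inference (illustrative, not the PredProp implementation). A latent
# activity x1 is updated locally to minimize the sum of precision-weighted
# squared prediction errors, i.e. a quadratic free energy.

rng = np.random.default_rng(1)

W = rng.standard_normal((4, 4)) * 0.3   # weights predicting layer 0 from layer 1
x0 = rng.standard_normal(4)             # observed data (clamped)
x1 = np.zeros(4)                        # latent activity to be inferred
pi0 = 2.0                               # prediction-error precision (inverse variance)
pi1 = 1.0                               # prior precision on the latent layer

lr = 0.05
for _ in range(200):
    e0 = x0 - W @ x1                    # bottom-layer prediction error
    e1 = x1                             # latent error w.r.t. a zero-mean prior
    # local gradient of F = 0.5 * (pi0*e0@e0 + pi1*e1@e1) w.r.t. x1:
    # only e0, e1 and W are needed, so the update is local to the layer
    grad = -pi0 * (W.T @ e0) + pi1 * e1
    x1 -= lr * grad

final_error = float(pi0 * e0 @ e0 + pi1 * e1 @ e1)
print(round(final_error, 4))
```

Because the precisions weight each error term, a layer with high precision dominates the update, which is the mechanism the abstract relates to curvature scaling and Natural Gradient Descent.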
Effect of a Computer-Based Decision Support Intervention on Autism Spectrum Disorder Screening in Pediatric Primary Care Clinics: A Cluster Randomized Clinical Trial
Importance:
Universal early screening for autism spectrum disorder (ASD) is recommended but not routinely performed.
Objective:
To determine whether computer-automated screening and clinical decision support can improve ASD screening rates in pediatric primary care practices.
Design, Setting, and Participants:
This cluster randomized clinical trial, conducted between November 16, 2010, and November 21, 2012, compared ASD screening rates among a random sample of 274 children aged 18 to 24 months in urban pediatric clinics of an inner-city county hospital system with or without an ASD screening module built into an existing decision support software system. Statistical analyses were conducted from February 6, 2017, to June 1, 2018.
Interventions:
Four clinics were matched in pairs based on patient volume and race/ethnicity, then randomized within pairs. Decision support with the Child Health Improvement Through Computer Automation system (CHICA) was integrated with workflow and with the electronic health record in intervention clinics.
Main Outcomes and Measures:
The main outcome was screening rates among children aged 18 to 24 months. Because the intervention was discontinued among children aged 18 months at the request of the participating clinics, only results for those aged 24 months were collected and analyzed. Rates of positive screening results, clinicians' response rates to screening results in the computer system, and new cases of ASD identified were also measured. Main results were controlled for race/ethnicity and intracluster correlation.
Results:
Two clinics were randomized to receive the intervention, and 2 served as controls. Records from 274 children (101 girls, 162 boys, and 11 missing information on sex; age range, 23-30 months) were reviewed (138 in the intervention clinics and 136 in the control clinics). Of 263 children, 242 (92.0%) were enrolled in Medicaid, 138 (52.5%) were African American, and 96 (36.5%) were Hispanic. Screening rates in the intervention clinics increased from 0% (95% CI, 0%-5.5%) at baseline to 68.4% (13 of 19) (95% CI, 43.4%-87.4%) in 6 months and to 100% (18 of 18) (95% CI, 81.5%-100%) in 24 months. Control clinics had no significant increase in screening rates (baseline, 7 of 64 children [10.9%]; 6-24 months after the intervention, 11 of 72 children [15.3%]; P = .46). Screening results were positive for 265 of 980 children (27.0%) screened by CHICA during the study period. Among the 265 patients with positive screening results, physicians indicated any response in CHICA in 151 (57.0%). Two children in the intervention group received a new diagnosis of ASD within the time frame of the study.
Conclusions and Relevance:
The findings suggest that computer automation, when integrated with clinical workflow and the electronic health record, increases screening of children for ASD, but physician follow-up remains inadequate. Automation of the subsequent workup is still needed.
Towards a Process Reference Model for Information Supply Chain Management
High-quality information is a prerequisite for companies to accomplish business and strategic goals, such as global reporting, customer relationship management or compliance with legal provisions. In recent years, experts in the field of information quality have begun to realize that a paradigm shift is needed to solve information quality issues in organizations. Information should be treated as a product, and information quality is achievable only through the quality management of information supply chains. The paper at hand contributes to this new direction by proposing a process reference model for quality management of information supply chains (Information Product Supply Chain Management, IPSCM), leveraging the SCOR-Model, a widely accepted standard for supply chain management. The IPSCM-Model enables users to address, improve, and communicate information creation practices within and between all interested parties.
Strategic Business Requirements for Master Data Management Systems
Master Data Management (MDM) is of increasing importance because it is seen as a promising approach in companies to respond to a number of strategic business requirements, such as complying with an increasing number of regulations, supporting internal and external business process integration, and establishing a "360-degree-view on the customer". As a result, software vendors such as IBM, Oracle, SAP, and TIBCO are offering MDM application systems. However, the user community feels a significant mismatch between their own strategic requirements and the functionality currently offered by the software products. As the Information Systems (IS) research community has remained silent so far regarding this research problem, the research presented in this paper makes intensive use of knowledge from the practitioners' community in order to design a framework for strategic business requirements to be met by MDM systems. As an outcome of a design-oriented research process, the framework is an artifact which advances the scientific body of knowledge while at the same time providing benefit for practitioners. The framework includes seven design principles which are translated into 23 requirements. The requirements form a baseline for internal and external communication in companies and for the design of concrete MDM systems.