
    Evidence-based Kernels: Fundamental Units of Behavioral Influence

    This paper describes evidence-based kernels, fundamental units of behavioral influence that appear to underlie effective prevention and treatment for children, adults, and families. A kernel is a behavior–influence procedure shown through experimental analysis to affect a specific behavior, and one that is indivisible in the sense that removing any of its components would render it inert. Existing evidence shows that a variety of kernels can influence behavior in context, and some evidence suggests that frequent or sufficient use of some kernels may produce longer-lasting behavioral shifts. The analysis of kernels could contribute to an empirically based theory of behavioral influence, augment existing prevention or treatment efforts, facilitate the dissemination of effective prevention and treatment practices, clarify the active ingredients in existing interventions, and contribute to the efficient development of more effective interventions. Kernels involve one or more of the following mechanisms of behavioral influence: reinforcement, altering antecedents, changing verbal relational responding, or changing physiological states directly. The paper describes 52 of these kernels and details practical, theoretical, and research implications, including a call for a national database of kernels that influence human behavior.

    Estimation for Robust Control.

    In this paper we discuss this latter line of research and compare and contrast it with the former 'hard bounding' schools of thought. For this purpose we assume that the required model description is a frequency-domain one, so that we consider only linear time-invariant plants. Furthermore, we assume that the description is to be obtained from observed noise-corrupted data of short duration. We do not consider the noise-free case or long data records; we believe those problems can be solved with existing theory by fitting a high-order, high-fidelity model to the data and then carrying out some form of model-order reduction if this is deemed desirable. The interest arises with short, noise-corrupted data, where undermodelling is introduced by a bias-versus-variance tradeoff. Specifically, for given data, as the model order is increased the fitted model's infidelity due to undermodelling (bias error) decreases, but its infidelity due to noise (variance error) increases [25, 48]. The net result is that for short, noisy data records it is optimal to fit a lower-order model than the true system. The difficulty addressed by this paper is quantifying the bias error introduced by this resultant undermodelling. That is, we seek to identify a model that consists not only of a nominal plant but also of error bounds around this nominal value, constructed to account for both stochastic disturbances and undermodelling. The question of quantifying model errors has been widely addressed in the existing literature using stochastic estimation theory, but only when these errors are introduced by noise alone. For example, suppose we choose a model structure that allows us to describe the observed data via a linear regression…
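    The bias-versus-variance tradeoff described in this abstract is easy to reproduce numerically. The sketch below is not from the paper; the 10-tap "true" system, noise level, and record length are all hypothetical. It fits FIR models of increasing order to a short noisy record and reports the average frequency-response error, which is dominated by bias at low orders and by variance at high orders.

```python
# Illustration of the bias (undermodelling) vs. variance (noise) tradeoff
# in model-order selection for short, noisy data. All quantities hypothetical.
import numpy as np

rng = np.random.default_rng(0)
N = 50                                    # short data record
g_true = 0.8 ** np.arange(10)             # hypothetical true 10-tap impulse response
u = rng.standard_normal(N)                # white test input
G_true = np.fft.rfft(g_true, 128)         # true frequency response

def fit_fir(u, y, n):
    """Least-squares fit of an n-tap FIR model y[t] = sum_k g[k] * u[t-k]."""
    Phi = np.column_stack([np.concatenate([np.zeros(k), u[:len(u) - k]])
                           for k in range(n)])
    g, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return g

for n in (2, 5, 10, 20, 40):
    errs = []
    for _ in range(200):                  # Monte Carlo over noise realisations
        y = np.convolve(u, g_true)[:N] + 0.5 * rng.standard_normal(N)
        G_hat = np.fft.rfft(fit_fir(u, y, n), 128)
        errs.append(np.mean(np.abs(G_hat - G_true) ** 2))
    print(f"order {n:2d}: mean squared frequency-response error = {np.mean(errs):.4f}")
```

    Low orders underfit (large bias error) while high orders chase the noise (large variance error); with a record this short, the minimum error is typically attained at or below the true order of 10, consistent with the tradeoff the abstract describes.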

    A novel input design approach for systems with quantized output data

    In this paper, we explore the problem of input design for systems with quantized measurements. For the input design problem, we calculate and optimize a function of the Fisher Information Matrix (FIM). The calculation of the FIM is greatly simplified by using known relationships of the derivative of the likelihood function and the auxiliary function arising from the Expectation Maximization (EM) algorithm. To optimize the FIM, we design an experiment using a recently published method based on graph theory. A numerical example shows that the proposed experiment can be successfully used in quantized systems.
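    As a rough sketch of the kind of computation involved (this is not the paper's EM-based derivation or its graph-theoretic optimizer): for a Gaussian-noise FIR model with a binary output quantizer, each sample contributes Fisher information with the standard probit weight pdf(m)^2 / (Phi(m)(1 - Phi(m))), which lets candidate inputs be scored by a scalar function of the FIM such as log det. The 2-tap system, noise level, and the two candidate inputs below are all hypothetical.

```python
# Minimal sketch: FIM of FIR parameters from binary-quantized outputs
# z_t = 1{y_t >= 0}, y_t = theta[0]*u[t] + theta[1]*u[t-1] + e_t,
# e_t ~ N(0, sigma^2). Two candidate inputs are compared by D-optimality.
import numpy as np
from scipy.stats import norm

theta = np.array([1.0, 0.5])              # hypothetical 2-tap FIR parameters
sigma = 1.0                               # additive Gaussian noise std dev

def fim(u, theta, sigma):
    """Fisher Information Matrix of theta given input u and binary outputs."""
    d = len(theta)
    I = np.zeros((d, d))
    for t in range(1, len(u)):
        phi = np.array([u[t], u[t - 1]])                 # regressor
        m = phi @ theta / sigma
        p = np.clip(norm.cdf(m), 1e-12, 1 - 1e-12)       # P(z_t = 1), clipped
        w = norm.pdf(m) ** 2 / (p * (1 - p) * sigma**2)  # per-sample info weight
        I += w * np.outer(phi, phi)
    return I

rng = np.random.default_rng(1)
u_white = rng.standard_normal(200)                       # white Gaussian input
u_binary = np.sign(rng.standard_normal(200))             # PRBS-like binary input
for name, u in [("white", u_white), ("binary", u_binary)]:
    _, logdet = np.linalg.slogdet(fim(u, theta, sigma))
    print(f"{name:6s} input: log det FIM = {logdet:.2f}")
```

    Comparing the two log-determinants is a crude stand-in for the paper's input design step: under D-optimality, the input achieving the larger value is the more informative experiment.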