    Banishment in the Late Medieval Eastern Netherlands

    This open access book analyses the practice of banishment and what it can tell us about the values of late medieval society concerning morally acceptable behaviour. It focuses on the Dutch town of Kampen and considers the exclusion of offenders through banishment and the redemption of individuals after their exile. Banishment was a common punishment in late medieval Europe, especially for sexual offences. In Kampen it was also meted out as a consequence of the non-payment of fines, after which people could arrange repayment schemes that allowed them to return. The book first considers the legal context of the practice of banishment before discussing punishment in Kampen more generally. The third chapter discusses the legal practice of banishment as a punitive and coercive measure. The final chapter focuses on the redemption of exiles, either because their punishment was completed or because they arranged for the payment of outstanding fines.

    The Role of High-Fat Diets in Exacerbating Cognitive Deficits After Traumatic Brain Injury

    Traumatic brain injury (TBI) can cause chronic psychiatric-like impairments that may be driven by inflammation in the brain. In the current study, inflammation was upregulated using a high-fat diet (HFD) to assess the role of inflammation in TBI-induced deficits. Rats were randomly assigned to receive an HFD or a calorie-matched low-fat diet (LFD) for the duration of the experiment. After two weeks of free access to their respective diets, rats began behavioral training on the Rodent Gambling Task (RGT), during which they were allowed to freely choose to nosepoke in one of four holes in a standard operant chamber. Responses in each hole were associated with different probabilities and magnitudes of reinforcement (sucrose pellets) or punishment (timeout from reinforcement); thus, choices could be classified as either risky or optimal. Premature responses (i.e., nosepokes made before the trial began) were used as a measure of motor impulsivity. After behavior on the RGT stabilized, rats received either a frontal TBI or a sham procedure and continued post-injury testing for 10 weeks. TBI rats showed a substantial decrease in optimal choice and increases in risky choice and motor impulsivity. However, deficits induced or exacerbated by the HFD were inconsistent and small in magnitude. After the behavioral portion of the study, rats were transcardially perfused. The HFD and TBI interacted to increase neuroinflammation, as measured by microglia counts. Increases in microglia unaccompanied by changes in behavior indicated that inflammation may simply be a symptom of brain injury rather than a driver of psychiatric-like deficits. Thus, further evidence is required to characterize the role of inflammation in cognitive impairment both within and outside the context of brain injury.
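
    To illustrate how choices on a task like the RGT can be scored, the short Python sketch below computes a long-run reward rate for each option and labels the options accordingly; the probabilities, pellet counts, and timeout durations are hypothetical placeholders, not the contingencies used in the study.

        # Hypothetical RGT-style contingencies (placeholders, not the study's):
        # option -> (p_win, pellets, timeout_s)
        OPTIONS = {
            "P1": (0.9, 1, 5),
            "P2": (0.8, 2, 10),
            "P3": (0.5, 3, 30),
            "P4": (0.4, 4, 40),
        }
        TRIAL_S = 5.0  # assumed fixed trial duration, seconds

        def reward_rate(p_win, pellets, timeout_s):
            """Expected pellets per unit time, counting timeouts as lost time."""
            return (p_win * pellets) / (TRIAL_S + (1 - p_win) * timeout_s)

        rates = {opt: reward_rate(*params) for opt, params in OPTIONS.items()}
        best = max(rates, key=rates.get)
        for opt in sorted(OPTIONS):
            label = ("optimal" if opt == best
                     else "risky" if OPTIONS[opt][1] > OPTIONS[best][1]
                     else "suboptimal")
            print(f"{opt}: {rates[opt]:.3f} pellets/s -> {label}")

    Under these placeholder values the moderate-reward option maximizes long-run reward rate, while the large-reward, large-timeout options are risky, which mirrors the risky/optimal distinction described above.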

    A Monte Carlo Simulation of Rat Choice Behavior with Interdependent Outcomes

    Preclinical behavioral neuroscience often uses choice paradigms to capture psychiatric symptoms. In particular, the subfield of operant research produces nested datasets with many discrete choices in a session. The standard analytic practice is to aggregate choice into a continuous variable and analyze it using ANOVA or linear regression. However, choice data often have multiple interdependent outcomes of interest, violating an assumption of general linear models. The aim of the current study was to quantify the accuracy of linear mixed-effects regression (LMER) for analyzing data from a four-choice operant task called the Rodent Gambling Task (RGT), which measures decision-making in the context of various manipulations (e.g., brain injury). Prior analysis of RGT data from intact rats (Sham; n = 58) and brain-injured rats (TBI; n = 51) revealed five distinct decision-making phenotypes for this task. To generate datasets for parametric analysis, trial-level data were simulated using a Monte Carlo approach that recapitulated those phenotypes. Population parameters were defined from existing data, and repeated sampling was conducted to generate 1000 datasets for each of four sample sizes (n = 6, 10, 14, 20) and four effect sizes (f = 0.0, 0.3, 0.4, and 0.5). Two LMER models were fit to compare TBI versus Sham across datasets: a full LMER in which choice of all four outcomes was analyzed simultaneously, and a control LMER in which choice of a single outcome was analyzed. The full LMER exceeded a 75% false-positive rate across all sample sizes, and the control LMER was underpowered to detect expected effects. These results suggest that analyzing trial-level data with a mixed-effects logistic regression will be necessary to analyze RGT data accurately. More broadly, these types of errors must be remedied to improve translation to clinical research.
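
    As a concrete illustration of the data-generating step, the Python sketch below draws trial-level choices for simulated Sham and TBI rats from phenotype-specific multinomial distributions. The three phenotypes, their choice probabilities, and the group mixtures are invented placeholders; the study itself used five phenotypes with parameters estimated from real data.

        import numpy as np

        rng = np.random.default_rng(0)

        # Placeholder phenotype choice distributions over the four RGT options.
        PHENOTYPES = {
            "optimal": [0.10, 0.70, 0.10, 0.10],
            "risky":   [0.10, 0.15, 0.25, 0.50],
            "mixed":   [0.25, 0.25, 0.25, 0.25],
        }

        def simulate_rat(phenotype, n_trials=200):
            """Trial-level choices for one rat: array of option ids 0..3."""
            return rng.choice(4, size=n_trials, p=PHENOTYPES[phenotype])

        def simulate_dataset(n_per_group=10):
            """One Monte Carlo dataset: per-rat choice proportions by group."""
            rows = []
            # Placeholder phenotype mixtures per group.
            for group, weights in (("Sham", [0.7, 0.1, 0.2]),
                                   ("TBI",  [0.2, 0.6, 0.2])):
                for rat in range(n_per_group):
                    phen = rng.choice(list(PHENOTYPES), p=weights)
                    trials = simulate_rat(phen)
                    props = np.bincount(trials, minlength=4) / trials.size
                    rows.append((group, rat, phen, *props))
            return rows

        datasets = [simulate_dataset() for _ in range(1000)]  # repeated sampling

    Each simulated dataset can then be analyzed both ways described above (aggregated proportions in an LMER versus trial-level logistic regression) to tabulate false-positive and false-negative rates.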

    Integrating Surface Normal Vectors Using Fast Marching Method

    Integration of surface normal vectors is a vital component of many shape reconstruction algorithms, which must integrate surface normals to produce their final outputs, the depth values. In this paper, we introduce a fast and efficient method for computing the depth values from surface normal vectors. The method is based on solving the Eikonal equation using the Fast Marching Method. We introduce two ideas. First, while it is not possible to solve for the depths Z directly using the Fast Marching Method, we solve the Eikonal equation for a function W of the form W = Z + λf. With appropriately chosen values for λ, we can ensure that the Eikonal equation for W can be solved using the Fast Marching Method. Second, we solve for W in two stages with two different λ values: first in a small neighborhood of the given initial point with a large λ, and then for the rest of the domain with a smaller λ. This step is needed because of finite machine precision and rounding errors. The proposed method is very easy to implement, and we demonstrate experimentally that, with insignificant loss in precision, our method is considerably faster than the usual optimization method that uses conjugate gradient to minimize an error function.
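
    The construction can be made concrete as follows; this is a sketch assembled from the abstract's definitions plus the standard normal-to-gradient relation, with f denoting the freely chosen auxiliary function mentioned above.

        % Gradient of the depth Z from the unit normals n = (n_1, n_2, n_3):
        \nabla Z = (p, q), \qquad p = -n_1 / n_3, \qquad q = -n_2 / n_3 .
        % For W = Z + \lambda f with f known, the gradient magnitude of W is
        % therefore also known at every pixel:
        |\nabla W| = \sqrt{(p + \lambda f_x)^2 + (q + \lambda f_y)^2}
                   =: F_\lambda(x, y) .
        % Choosing \lambda large enough keeps F_\lambda strictly positive, so this
        % Eikonal equation can be solved by Fast Marching outward from the initial
        % point, after which the depth is recovered as Z = W - \lambda f.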

    PS-FCN: A Flexible Learning Framework for Photometric Stereo

    This paper addresses the problem of photometric stereo for non-Lambertian surfaces. Existing approaches often adopt simplified reflectance models to make the problem more tractable, but this greatly hinders their application to real-world objects. In this paper, we propose a deep fully convolutional network, called PS-FCN, that takes as input an arbitrary number of images of a static object captured under different light directions with a fixed camera, and predicts a normal map of the object in a fast feed-forward pass. Unlike recently proposed learning-based methods, PS-FCN does not require a pre-defined set of light directions during training and testing, and can handle multiple images and light directions in an order-agnostic manner. Although we train PS-FCN on synthetic data, it generalizes well to real datasets. We further show that PS-FCN can be easily extended to handle the problem of uncalibrated photometric stereo. Extensive experiments on public real datasets show that PS-FCN outperforms existing approaches in calibrated photometric stereo, and promising results are achieved in the uncalibrated scenario, clearly demonstrating its effectiveness. Comment: ECCV 2018: https://guanyingc.github.io/PS-FC
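
    To make the order-agnostic design concrete, here is a minimal PyTorch-style sketch of the general idea: a shared per-image feature extractor, element-wise max pooling across however many observations are supplied, and a convolutional regressor to a unit normal map. The layer sizes and structure are placeholders, not the published architecture.

        import torch
        import torch.nn as nn
        import torch.nn.functional as F

        class TinyPSNet(nn.Module):
            """Toy photometric-stereo net: max-pool features over N images."""
            def __init__(self, in_ch=6):  # 3 image + 3 light-direction channels
                super().__init__()
                self.extract = nn.Sequential(
                    nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
                    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
                )
                self.regress = nn.Sequential(
                    nn.Conv2d(64, 32, 3, padding=1), nn.ReLU(),
                    nn.Conv2d(32, 3, 3, padding=1),  # 3-channel normal map
                )

            def forward(self, images):  # images: list of (B, 6, H, W) tensors
                feats = [self.extract(x) for x in images]  # shared weights
                fused = torch.stack(feats, dim=0).max(dim=0).values  # order-agnostic
                return F.normalize(self.regress(fused), dim=1)  # unit normals

        # Any number of images, in any order, yields the same fused feature map.
        net = TinyPSNet()
        imgs = [torch.randn(1, 6, 32, 32) for _ in range(5)]
        normals = net(imgs)  # (1, 3, 32, 32)

    Because max pooling is commutative, the fused representation is invariant to both the number and the ordering of the input images, which is what frees the network from a pre-defined set of light directions.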

    Solving the Uncalibrated Photometric Stereo Problem using Total Variation

    In this paper we propose a new method to solve the problem of uncalibrated photometric stereo, making very weak assumptions about the properties of the scene to be reconstructed. Our goal is to resolve the generalized bas-relief (GBR) ambiguity by performing a total variation regularization of both the estimated normal field and albedo. Unlike most previous attempts to solve this ambiguity, our approach does not rely on any prior information about the shape or the albedo, apart from its piecewise smoothness. We test our method on real images and obtain results comparable to state-of-the-art algorithms.
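
    For background, here is a standard formulation of the ambiguity, sketched rather than taken verbatim from the paper: under the Lambertian, orthographic model the images constrain the scaled normals only up to a generalized bas-relief transformation, and the regularization selects the member of that family whose normals and albedo are piecewise smooth.

        % Scaled normals b = \rho n explain the images only up to
        b \;\mapsto\; G^{-\top} b, \qquad
        G = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ \mu & \nu & \lambda \end{pmatrix},
        \quad \lambda \neq 0 ,
        % so the GBR parameters (\mu, \nu, \lambda) can be chosen by minimizing a
        % total-variation energy over the implied albedo and normal field:
        \min_{\mu, \nu, \lambda} \;
        \int_\Omega \lvert \nabla \rho_{\mu,\nu,\lambda} \rvert \, dx
        \;+\; \int_\Omega \lvert \nabla n_{\mu,\nu,\lambda} \rvert \, dx .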

    Height from Photometric Ratio with Model-based Light Source Selection

    In this paper, we present a photometric stereo algorithm for estimating surface height. We follow recent work that uses photometric ratios to obtain a linear formulation relating surface gradients and image intensity. Using smoothed finite difference approximations for the surface gradient, we are able to express surface height recovery as a linear least squares problem that is large but sparse. To make the method practically useful, we combine it with a model-based approach that excludes observations which deviate from the assumptions made by the image formation model. Despite its simplicity, we show that our algorithm provides high-quality surface height estimates even for objects with highly non-Lambertian appearance. We evaluate the method on both synthetic images with ground truth and challenging real images that contain strong specular reflections and cast shadows.
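
    The photometric-ratio formulation referred to above can be sketched as follows for the Lambertian case; this is standard background, not a verbatim reproduction of the paper's derivation.

        % Image formation under light source s_j, albedo \rho, unit normal n:
        i_j = \rho \, (n \cdot s_j) .
        % Taking the ratio of two observations cancels the albedo:
        i_1 / i_2 = (n \cdot s_1) / (n \cdot s_2)
        \;\Longrightarrow\;
        i_2 \, (n \cdot s_1) - i_1 \, (n \cdot s_2) = 0 ,
        % which is linear in n. With n \propto (-z_x, -z_y, 1) and z_x, z_y
        % replaced by smoothed finite differences of the height map z, the
        % stacked constraints form a large, sparse linear least-squares system in z.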

    Solving Uncalibrated Photometric Stereo using Total Variation

    Estimating the shape and appearance of an object from one or several images is a still-open and challenging research problem called 3D reconstruction. Among the available techniques, photometric stereo (PS) produces highly accurate results when the lighting conditions are known. When these conditions are unknown, the problem becomes the so-called uncalibrated PS problem, which is ill-posed. In this paper, we show how total variation can be used to reduce the ambiguities of uncalibrated PS, and we study two methods for estimating the parameters of the generalized bas-relief ambiguity. These methods are evaluated through the 3D reconstruction of real-world objects.
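
    One simple way to realize the parameter estimation described above is a brute-force search over GBR candidates, scoring each by the total variation of the implied albedo. The numpy sketch below illustrates that idea under assumed inputs (per-pixel pseudo-normals, i.e. albedo-scaled normals); it is not the paper's algorithm, and the parameterization of the inverse transform is one convenient choice.

        import numpy as np

        def correct(bhat, mu, nu, lam):
            """Apply a candidate inverse-GBR to pseudo-normals bhat, shape (H, W, 3)."""
            b = np.empty_like(bhat)
            b[..., 0] = bhat[..., 0] + mu * bhat[..., 2]
            b[..., 1] = bhat[..., 1] + nu * bhat[..., 2]
            b[..., 2] = lam * bhat[..., 2]
            return b

        def tv(img):
            """Anisotropic total variation of a 2-D array."""
            return (np.abs(np.diff(img, axis=0)).sum()
                    + np.abs(np.diff(img, axis=1)).sum())

        def estimate_gbr(bhat,
                         grid=np.linspace(-2.0, 2.0, 21),
                         lams=np.linspace(0.2, 2.0, 10)):
            """Pick the (mu, nu, lambda) whose implied albedo has minimal TV."""
            best, best_score = None, np.inf
            for mu in grid:
                for nu in grid:
                    for lam in lams:
                        rho = np.linalg.norm(correct(bhat, mu, nu, lam), axis=-1)
                        rho = rho / (rho.mean() + 1e-12)  # remove trivial global scale
                        score = tv(rho)
                        if score < best_score:
                            best, best_score = (mu, nu, lam), score
            return best

        # Usage (hypothetical): bhat holds estimated pseudo-normals of shape (H, W, 3)
        # mu, nu, lam = estimate_gbr(bhat)

    Normalizing the albedo before scoring matters: otherwise shrinking lambda would shrink the TV score without disambiguating anything.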