
    Are V1 simple cells optimized for visual occlusions? A comparative study

    Abstract: Simple cells in primary visual cortex were famously found to respond to low-level image components such as edges. Sparse coding and independent component analysis (ICA) emerged as the standard computational models for simple cell coding because they linked their receptive fields to the statistics of visual stimuli. However, a salient feature of image statistics, occlusions of image components, is not considered by these models. Here we ask whether occlusions have an effect on the predicted shapes of simple cell receptive fields. We use a comparative approach to answer this question and investigate two models for simple cells: a standard linear model and an occlusive model. For both models we simultaneously estimate optimal receptive fields, sparsity and stimulus noise. The two models are identical except for their component superposition assumption. We find that the image encodings and receptive fields predicted by the models differ significantly. While both models predict many Gabor-like fields, the occlusive model predicts a much sparser encoding and high percentages of ‘globular’ receptive fields. This relatively new center-surround type of simple cell response has been observed in experimental studies since reverse correlation techniques came into use. While high percentages of ‘globular’ fields can be obtained with specific choices of sparsity and overcompleteness in linear sparse coding, no or only low proportions are reported in the vast majority of studies on linear models (including all ICA models). Likewise, for the linear model investigated here with optimal sparsity, only low proportions of ‘globular’ fields are observed. In comparison, the occlusive model robustly infers high proportions and can match the experimentally observed high proportions of ‘globular’ fields well. Our computational study therefore suggests that ‘globular’ fields may be evidence for an optimal encoding of visual occlusions in primary visual cortex.

    Author Summary: The statistics of our visual world are dominated by occlusions. Almost every image processed by our brain consists of mutually occluding objects, animals and plants. Our visual cortex is optimized through evolution and throughout our lifespan for such stimuli. Yet the standard computational models of primary visual processing do not consider occlusions. In this study, we ask what effects visual occlusions may have on the predicted response properties of simple cells, which are the first cortical processing units for images. Our results suggest that recently observed differences between experiments and the predictions of standard simple cell models can be attributed to occlusions. The most significant consequence of occlusions is the prediction of many cells sensitive to center-surround stimuli. Experimentally, large numbers of such cells have been observed since new techniques (reverse correlation) came into use. Without occlusions, they are obtained only for specific settings, and none of the seminal studies (sparse coding, ICA) predicted such fields. In contrast, the new type of response naturally emerges as soon as occlusions are considered. In comparison with recent in vivo experiments we find that occlusive models are consistent with the high percentages of center-surround simple cells observed in macaque monkeys, ferrets and mice.
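    The key modelling difference above is the component superposition assumption. As a minimal, hedged sketch (not the authors' code), the two generative steps can be contrasted as follows: a linear model sums the active dictionary components, whereas an occlusion-style model combines them non-linearly, for example with a point-wise maximum as in maximal-causes-type models, so that at each pixel the strongest component wins. All names, shapes and the noise model here are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    def linear_superposition(W, s, sigma=0.1):
        """Standard linear model: image = weighted sum of components + Gaussian noise.
        W: (D, H) dictionary with H components of D pixels; s: (H,) activations."""
        return W @ s + sigma * rng.standard_normal(W.shape[0])

    def occlusive_superposition(W, s, sigma=0.1):
        """Occlusion-style model: at each pixel only the strongest active component
        is visible (point-wise max), mimicking one component occluding another."""
        contributions = W * s[None, :]      # (D, H) per-component images
        return contributions.max(axis=1) + sigma * rng.standard_normal(W.shape[0])

    Fitting either generative assumption to natural image patches (e.g. with an EM-style procedure) then yields the receptive-field predictions that the study compares.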

    Economic viability of bioethanol – production and production costs in national and international comparison

    In this article the profitability of bioethanol production in Germany is analysed, taking international competition into account. To this end, the production costs of bioethanol from wheat and sugar beet in Germany, as well as from sugar cane and corn in other representative countries, are compared. Based on this, the competitiveness of imported as well as domestic bioethanol against gasoline on the German market is analysed, and the maximum payable feedstock price for sugar beet is calculated. The calculations show that, despite the mandatory blending already in place, further cost reductions in German bioethanol production are required to achieve competitiveness against imports, whereas within the targets of the mandatory blending scheme the utilisation of bioethanol is independent of the crude oil price. Keywords: bioethanol, competitiveness, sugar beet, Resource/Energy Economics and Policy.
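    The "maximum payable feedstock price" mentioned above is essentially a break-even calculation: the revenue a processor earns per tonne of beet minus all non-feedstock costs. A minimal sketch of that arithmetic is given below; the function name and all example figures are hypothetical illustrations, not values from the article.

    def max_payable_beet_price(ethanol_price_per_l, ethanol_yield_l_per_t,
                               processing_cost_per_t, byproduct_credit_per_t=0.0):
        """Break-even (maximum payable) sugar beet price in EUR per tonne:
        ethanol and by-product revenue minus all non-feedstock processing costs.
        All inputs are illustrative assumptions, not figures from the article."""
        revenue = ethanol_price_per_l * ethanol_yield_l_per_t + byproduct_credit_per_t
        return revenue - processing_cost_per_t

    # Example with made-up numbers: 0.60 EUR/l ethanol, 100 l of ethanol per tonne
    # of beet, 35 EUR/t non-feedstock costs and a 5 EUR/t by-product credit.
    print(max_payable_beet_price(0.60, 100.0, 35.0, 5.0))  # -> 30.0 EUR/t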

    Truncated Variational Sampling for "Black Box" Optimization of Generative Models

    We investigate the optimization of two probabilistic generative models with binary latent variables using a novel variational EM approach. The approach distinguishes itself from previous variational approaches by using latent states as variational parameters. Here we use efficient and general-purpose sampling procedures to vary the latent states, and investigate the "black box" applicability of the resulting optimization procedure. For general-purpose applicability, samples are drawn from approximate marginal distributions of the considered generative model as well as from the model's prior distribution. As such, variational sampling is defined in a generic form and is directly executable for a given model. As a proof of concept, we then apply the novel procedure (A) to Binary Sparse Coding (a model with continuous observables), and (B) to basic Sigmoid Belief Networks (models with binary observables). Numerical experiments verify that the investigated approach efficiently and effectively increases a variational free energy objective without requiring any additional analytical steps.
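    As a hedged illustration of the idea described above (a sketch under assumptions, not the paper's reference implementation): the variational parameters are a truncated set of latent states per data point; candidate states are proposed by sampling from the model prior and by perturbing the current states (standing in for samples from approximate marginals), and only the highest-scoring states under the log-joint are kept. The sketch uses a Binary Sparse Coding-style model; all function names and the bit-flip proposal are assumptions.

    import numpy as np

    def log_joint_bsc(x, s, W, pi, sigma):
        """log p(x, s) for a Binary Sparse Coding-style model:
        s_h ~ Bernoulli(pi) i.i.d., x ~ N(W s, sigma^2 I) (constants dropped)."""
        log_prior = np.sum(s * np.log(pi) + (1 - s) * np.log(1 - pi))
        resid = x - W @ s
        return log_prior - 0.5 * np.sum(resid ** 2) / sigma ** 2 - x.size * np.log(sigma)

    def tvs_update_states(x, K, W, pi, sigma, n_prior=20, p_flip=0.05, seed=0):
        """One truncated variational sampling step for a single data point:
        propose candidate latent states from the model prior and by randomly
        flipping bits of the current states (a stand-in for sampling from
        approximate marginals), then keep the states with the highest log-joint.
        The kept set is the variational parameter."""
        rng = np.random.default_rng(seed)
        H = W.shape[1]
        from_prior = (rng.random((n_prior, H)) < pi).astype(int)
        flipped = np.where(rng.random(K.shape) < p_flip, 1 - K, K)
        candidates = np.unique(np.vstack([K, from_prior, flipped]), axis=0)
        scores = np.array([log_joint_bsc(x, s, W, pi, sigma) for s in candidates])
        best = np.argsort(scores)[-len(K):]          # indices of the top states
        return candidates[best]

    # Tiny demo with random data (shapes only; not a meaningful model fit).
    rng = np.random.default_rng(1)
    D, H, S = 16, 8, 5
    W = rng.standard_normal((D, H))
    x = rng.standard_normal(D)
    K0 = (rng.random((S, H)) < 0.2).astype(int)
    print(tvs_update_states(x, K0, W, pi=0.2, sigma=1.0).shape)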

    Changes in the British and Irish flora - the role of genome size

    Unprecedented anthropogenic changes are causing drastic shifts in biodiversity, species ranges and the survival of plants. Understanding which attributes put plants at risk is of vital importance for safeguarding the natural world. Genome size is a fundamental plant attribute with strong links to a variety of plant traits, and its study opens novel areas of ecological research, leading to a new understanding of plant responses to environmental changes. The aim of this thesis is to consider the role that genome size plays at landscape scales. To achieve this aim, I assembled an inventory of the flora of Britain and Ireland and analysed species distribution patterns within the flora over time, together with information on land use, climate and nutrient deposition changes across the past three decades. Distinctive spatial patterns of mean genome size per hectad of Britain and Ireland were found across time, with a steady increase in mean genome size since the 1980s. A particular driver of these patterns appears to be land use, with areas especially impacted by humans containing plant communities characterised by larger mean genome sizes. Genome size, along with a set of functional traits and niche descriptors, was an informative character in a random forest algorithm predicting species trends, achieving 70% prediction accuracy. The effect of genome size was found to be indirect, mediated via its influence on functional traits, which in turn lead to differing niche requirements and temporal trends. The results suggest that the effects of genome size on plant growth, fitness and response to the abiotic environment impact landscape-scale species composition. Genome size emerges as an important meta-trait to consider when monitoring and anticipating biodiversity changes in response to environmental change, and it could be used in models that guide conservation efforts.
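    As a hedged sketch of the kind of analysis described (a random forest predicting species trend classes from genome size, functional traits and niche descriptors, evaluated by prediction accuracy): the data below are synthetic stand-ins and the column names, toy labelling rule and feature set are assumptions for illustration, not the thesis's actual inventory; scikit-learn is used.

    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # Synthetic stand-in: one row per species with genome size, a few functional
    # traits / niche descriptors and an observed trend class.
    rng = np.random.default_rng(0)
    n = 300
    species = pd.DataFrame({
        "genome_size_2C_pg": rng.lognormal(1.0, 0.8, n),
        "plant_height_m":    rng.lognormal(-0.5, 0.7, n),
        "seed_mass_mg":      rng.lognormal(0.0, 1.2, n),
        "ellenberg_N":       rng.integers(1, 10, n),
    })
    # Toy rule so the labels are learnable: large genomes + high nutrient niche decline.
    species["trend_class"] = np.where(
        species["genome_size_2C_pg"] * species["ellenberg_N"] + rng.normal(0, 5, n) > 25,
        "decreasing", "increasing")

    X = species.drop(columns="trend_class")
    y = species["trend_class"]

    # Cross-validated accuracy is the natural analogue of the ~70% reported above.
    model = RandomForestClassifier(n_estimators=500, random_state=0)
    print("mean CV accuracy:", cross_val_score(model, X, y, cv=5).mean())

    # Feature importances indicate how much genome size contributes relative to
    # the functional traits and niche descriptors.
    model.fit(X, y)
    print(dict(zip(X.columns, model.feature_importances_)))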

    PSO for supervised learning of fuzzy ARTMAP neural networks

    In this thesis, we studied the various behaviours of one particular type of neural network, the fuzzy ARTMAP (FAM) network, with the goal of developing a specialised learning strategy for this type of network. To do so, we observed the effects of several factors on this type of network: the size of the training data set, the standard learning strategies, the normalisation technique, the overlap structure, the MatchTracking polarity, and the influence of the internal parameters of the fuzzy ARTMAP network. These effects are measured in terms of the quality of the results and the resources used by the FAM network on synthetic and real data sets. We observed that the FAM network suffers a performance degradation due to an overfitting effect created by the number of training patterns and the number of training epochs, in particular on data sets with some degree of class overlap. To avoid this problem, we developed a specialised learning strategy for FAM networks. It improves generalisation performance by using Particle Swarm Optimization (PSO) to optimise the values of the four internal FAM parameters (α, β, ε and ρ). In all our simulations, on both synthetic and real data sets, this specialised strategy achieved better generalisation performance than the standard learning strategies using the standard FAM parameter values (MT+, MT-). Moreover, it eliminates most of the overfitting error due to the size of the training set and the number of training epochs. This specialised FAM strategy thus demonstrated that the values of the internal FAM parameters have a considerable impact on network performance. Furthermore, for all the data sets tested, the optimised parameter values are generally far from their standard values (MT- and MT+), which are the ones most commonly used when the FAM network is employed. However, this specialised learning strategy is not consistent with the on-line philosophy of the ART family, because the parameter values are not optimised sequentially. Nevertheless, it indicates the regions of optimal performance that the fuzzy ARTMAP network can reach. To our knowledge, this is the first time a learning strategy for FAM has optimised the values of all four internal parameters of this network.
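    As a hedged sketch of the strategy described above (a sketch under assumptions, not the thesis's implementation): a generic PSO loop searches over the four internal FAM parameters (alpha, beta, epsilon, rho) to maximise a fitness value, which in the real strategy is the validation accuracy of a trained fuzzy ARTMAP. The fitness function below is a self-contained toy stand-in, and the parameter bounds and PSO hyperparameters are illustrative assumptions.

    import numpy as np

    # Bounds for (alpha, beta, epsilon, rho); illustrative assumptions, not the thesis's values.
    LOWER = np.array([0.001, 0.0, -1.0, 0.0])
    UPPER = np.array([1.0,   1.0,  1.0, 1.0])

    def fitness(params):
        """Stand-in fitness. In the strategy described above this would train a
        fuzzy ARTMAP with (alpha, beta, epsilon, rho) and return validation
        accuracy; here a smooth toy function keeps the sketch self-contained."""
        target = np.array([0.05, 0.6, 0.2, 0.75])   # arbitrary optimum for the demo
        return -np.sum((params - target) ** 2)

    def pso_optimize(n_particles=20, n_iter=50, w=0.7, c1=1.5, c2=1.5, seed=0):
        """Generic particle swarm optimization over the four FAM parameters,
        maximising the fitness (validation accuracy in the real strategy)."""
        rng = np.random.default_rng(seed)
        pos = rng.uniform(LOWER, UPPER, size=(n_particles, 4))
        vel = np.zeros_like(pos)
        pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
        g = np.argmax(pbest_fit)
        gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]
        for _ in range(n_iter):
            r1 = rng.random((n_particles, 4))
            r2 = rng.random((n_particles, 4))
            vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, LOWER, UPPER)
            fit = np.array([fitness(p) for p in pos])
            improved = fit > pbest_fit
            pbest[improved], pbest_fit[improved] = pos[improved], fit[improved]
            g = np.argmax(pbest_fit)
            gbest, gbest_fit = pbest[g].copy(), pbest_fit[g]
        return gbest, gbest_fit

    best_params, best_fit = pso_optimize()
    print("optimised (alpha, beta, epsilon, rho):", best_params)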

    International tourists' environmental orientation and willingness to pay for conservation : implications for New Zealand's tourism and conservation policy

    Through books such as Carson’s Silent Spring (1962), environmental disasters such as Chernobyl (1986) and increased scientific evidence of climate change and its consequences (IPCC 2007, UNWTO and UNEP 2008), people have become more concerned about human impacts on the environment. This growing environmental awareness and concern could affect choices of tourism products and services, for example travel by air. Forsyth et al. (2007) write that environmentally conscious tourists may perceive aviation increasingly negatively and might consider flying less or even boycotting air travel. This attitude would have serious consequences for long-distance destinations such as New Zealand. Some airlines have already responded to the more environmentally conscious consumer by launching carbon offsetting schemes. Becken (2004) and Fairweather et al. (2005) have found that some tourists are already willing to pay a voluntary fee to reduce the carbon impacts created by their personal travel. Generally, tourism products and services are increasingly scrutinised and demand is rising for sustainable forms such as ecotourism (Fennell 2003). Ecotourism relies on quality natural environments often found in national parks. According to New Zealand’s Ministry of Tourism (2007), over 30 percent of all international tourists visited at least one park while on holiday. With increased interest in nature experiences, pressure rises for park managers to effectively administer the growing visitor numbers. Managers find themselves in the difficult position of having to protect and care for the natural environment while managing visitor numbers in an equitable, just and effective way. This research studies tourists’ environmental values, attitudes, behaviours and willingness to pay for carbon offsetting services and national park entrance fees. To meet the thesis aim, primary data was obtained using an on-site survey at four visitor centres located in the South Island of New Zealand. Overall, 385 of all 400 questionnaires were fully answered, resulting in a response rate of 95%. Data was described and analysed and the main findings were compared with previous research. There was evidence for the existence of a pro-environmentally orientated tourist in New Zealand, generally supporting the findings of Higham and Carr (2002), Luck (2003) and Fairweather et al. (2005). A strong interest in nature experiences was evident. Over 80 percent had visited at least one national park while on holiday and were also willing to pay an entrance fee of NZ$10.00 (mean). Most indicated that they engage in pro-environmental behaviours. However, only 20 percent of all 385 tourists belonged to an environmental group, indicating that a general ideological self-placement does not necessarily result in pro-environmental behaviour. German tourists showed stronger pro-environmental attitudes than respondents of other nationalities, which generally supports Luck’s (2003) findings. Furthermore, over 60 percent of tourists viewed climate change risks as being negative. Interestingly, over 50 percent were willing to pay a voluntary fee for carbon offsetting schemes. While an environmental orientation amongst international tourists has been acknowledged, New Zealand’s tourism managers should increasingly address environmental standards to meet the expectations of a ‘clean and green’ image. With regard to national park management in New Zealand it is recommended to re-open the discussion on entrance fees. It should be acknowledged that tourists are willing to pay NZ$10.00.

    Analysis of red chalk drawings from the workshop of Giovanni Battista Piranesi using fiber optics reflectance spectroscopy

    The viability of fiber optics reflectance spectroscopy (FORS) for differentiating red chalk drawing media was investigated, focusing on the group of drawings from the workshop of Giovanni Battista Piranesi (1720–1778) at the Staatliche Kunsthalle Karlsruhe, Germany. The evaluation of the spectra was supported by principal component analysis (PCA). The method was tested on mock-up drawings made with recently acquired natural and synthetic red chalks of known origin. It was possible to sort these mock-up drawings according to chalk type and application technique. The compositional differences of these reference chalks were confirmed by X-ray diffraction (XRD) and scanning electron microscopy (SEM). Subsequent FORS analysis of selected original drawings revealed several closely grouped clusters, implying, on the basis of the underlying spectral features, similarities among the historical red chalks used in Rome. These similarities distinguished the historical drawings from the red chalk mock-up drawings, except for the drawings made with red chalk samples from the area near the town of Theley, Germany, which were shown to bear close similarities to those in the cluster of historical samples.
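    As a hedged sketch of the kind of evaluation described (PCA applied to FORS reflectance spectra, followed by inspection of how measurement groups cluster in PC space): the spectra below are synthetic stand-ins, and the wavelength grid, band shapes and group labels are assumptions for illustration; scikit-learn is used.

    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA

    # Synthetic stand-in for FORS measurements: reflectance sampled at 200 points
    # across the 350-2500 nm range, two groups with slightly different band positions.
    rng = np.random.default_rng(0)
    wavelengths = np.linspace(350, 2500, 200)

    def fake_spectrum(band_center):
        """Toy red-chalk-like spectrum: a broad absorption band plus noise."""
        band = 0.3 * np.exp(-((wavelengths - band_center) / 150.0) ** 2)
        return 0.7 - band + rng.normal(0, 0.01, wavelengths.size)

    groups = ["mock-up"] * 15 + ["historical"] * 15
    spectra = np.array([fake_spectrum(860 if g == "mock-up" else 900) for g in groups])

    # Standardise per wavelength, then project onto the first two principal
    # components; closely grouped clusters suggest similar chalk composition.
    scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(spectra))
    for g in set(groups):
        mask = np.array(groups) == g
        print(g, "PC-space centroid:", scores[mask].mean(axis=0))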

    ProSper -- A Python Library for Probabilistic Sparse Coding with Non-Standard Priors and Superpositions

    ProSper is a Python library containing probabilistic algorithms to learn dictionaries. Given a set of data points, the implemented algorithms seek to learn the elementary components that have generated the data. The library widens the scope of dictionary learning approaches beyond implementations of standard approaches such as ICA, NMF or standard L1 sparse coding. The implemented algorithms are especially well suited when the data consist of components that combine non-linearly and/or when the data require flexible prior distributions. Furthermore, the implemented algorithms go beyond standard approaches by inferring the prior and noise parameters of the data, and they provide rich a-posteriori approximations for inference. The library is designed to be extendable and currently includes: Binary Sparse Coding (BSC), Ternary Sparse Coding (TSC), Discrete Sparse Coding (DSC), Maximal Causes Analysis (MCA), Maximum Magnitude Causes Analysis (MMCA), and Gaussian Sparse Coding (GSC, a recent spike-and-slab sparse coding approach). The algorithms are scalable due to a combination of variational approximations and parallelization. Implementations of all algorithms allow for parallel execution on multiple CPUs and multiple machines for medium- to large-scale applications. Typical large-scale runs of the algorithms can use hundreds of CPUs to learn hundreds of dictionary elements from data with tens of millions of floating-point numbers, such that models with several hundred thousand parameters can be optimized. The library is designed to have minimal dependencies and to be easy to use. It targets users of dictionary learning algorithms and machine learning researchers.
