
    State of the Art on Stylized Fabrication

    Digital fabrication devices are powerful tools for creating tangible reproductions of 3D digital models. Most available printing technologies aim at producing an accurate copy of a three-dimensional shape. However, fabrication technologies can also be used to create a stylistic representation of a digital shape. We refer to this class of methods as ‘stylized fabrication methods’. These methods abstract geometric and physical features of a given shape to create an unconventional representation, to produce an optical illusion, or to devise a particular interaction with the fabricated model. In this state-of-the-art report, we classify and review this broad, emerging class of approaches and propose possible directions for future research.

    Volume-aware design of composite molds

    We propose a novel technique for the automatic design of molds to cast highly complex shapes. The technique generates composite, two-piece molds. Each mold piece is made up of a hard plastic shell and a flexible silicone part. Thanks to the thin, soft, and smartly shaped silicone part, which is kept in place by a hard plastic shell, we can cast objects of unprecedented complexity. An innovative algorithm based on a volumetric analysis defines the layout of the internal cuts in the silicone mold part. Our approach can robustly handle thin protruding features and intertwined topologies that have caused previous methods to fail. We compare our results with state-of-the-art techniques, and we demonstrate the casting of shapes with extremely complex geometry.

    Pivotal estimation in high-dimensional regression via linear programming

    We propose a new method of estimation in the high-dimensional linear regression model. It allows for very weak distributional assumptions, including heteroscedasticity, and does not require knowledge of the variance of the random errors. The method is based on linear programming only, so its numerical implementation is faster than for previously known techniques using conic programs, and it allows one to deal with higher-dimensional models. We provide upper bounds for the estimation and prediction errors of the proposed estimator, showing that it achieves the same rate as in the more restrictive situation of fixed design and i.i.d. Gaussian errors with known variance. Following Gautier and Tsybakov (2011), we obtain the results under sensitivity assumptions weaker than the restricted eigenvalue condition and related conditions.
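
    The practical significance of the linear-programming reduction is easiest to see on the closely related Dantzig selector, which minimizes the ℓ1-norm of the coefficients subject to a sup-norm bound on the correlation of the residuals with the design. The sketch below is illustrative only: the paper's pivotal estimator additionally tunes the constraint level without knowledge of the error variance, which this sketch does not attempt, and the function name, the value of lam, and the synthetic data are all assumptions.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    def dantzig_selector(X, y, lam):
        """Minimize ||beta||_1 subject to ||X.T @ (y - X @ beta)||_inf <= lam,
        written as an LP over beta = beta_plus - beta_minus with both parts >= 0."""
        n, p = X.shape
        G = X.T @ X                     # p x p Gram matrix
        c = X.T @ y                     # feature-response correlations
        # Stack the two one-sided constraints +/-(c - G beta) <= lam.
        A_ub = np.block([[G, -G], [-G, G]])
        b_ub = np.concatenate([lam + c, lam - c])
        res = linprog(np.ones(2 * p), A_ub=A_ub, b_ub=b_ub,
                      bounds=(0, None), method="highs")
        z = res.x
        return z[:p] - z[p:]

    # Tiny synthetic check: a 3-sparse signal in a 50 x 100 design.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 100))
    beta_true = np.zeros(100)
    beta_true[:3] = [3.0, -2.0, 1.5]
    y = X @ beta_true + 0.1 * rng.standard_normal(50)
    beta_hat = dantzig_selector(X, y, lam=2.0)
    print(np.flatnonzero(np.abs(beta_hat) > 0.5))   # should recover indices 0, 1, 2
    ```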

    Patterson Function from Low-Energy Electron Diffraction Measured Intensities and Structural Discrimination

    Surface Patterson functions have been derived by direct inversion of experimental Low-Energy Electron Diffraction (LEED) I-V spectra measured at multiple incident angles. The direct inversion is computationally simple and can be used to discriminate between different structural models. 1x1 YSi_2 epitaxial layers grown on Si(111) have been used to illustrate the analysis. We introduce a suitable R-factor for the Patterson function to make the structural discrimination as objective as possible. Of the six competing models considered in the geometrical search, four could easily be discarded, achieving a very significant and useful reduction in the parameter space to be explored by standard dynamical LEED methods. The amount and quality of data needed for this analysis are discussed.
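
    For orientation, the Patterson function recoverable from intensities alone is the autocorrelation of the scattering density, since measured intensities are proportional to the squared structure-factor modulus; the loss of the phase of F is what restricts the inversion to interatomic vectors rather than atomic positions. A minimal sketch in standard crystallographic notation follows; the R-factor shown is a generic normalized form, not necessarily the paper's definition.

    ```latex
    % Patterson function as the Fourier transform of measured intensities
    P(\mathbf{r}) = \sum_{\mathbf{g}} I(\mathbf{g})\, e^{i\mathbf{g}\cdot\mathbf{r}}
    \propto \int \rho(\mathbf{r}')\,\rho(\mathbf{r}'+\mathbf{r})\,\mathrm{d}\mathbf{r}',
    \qquad I(\mathbf{g}) \propto \lvert F(\mathbf{g})\rvert^{2}
    % A generic normalized R-factor comparing experimental and model
    % Patterson functions (the paper's exact definition may differ):
    R_P = \frac{\sum_i \bigl(P_{\mathrm{exp}}(\mathbf{r}_i) - P_{\mathrm{model}}(\mathbf{r}_i)\bigr)^{2}}
               {\sum_i P_{\mathrm{exp}}(\mathbf{r}_i)^{2}}
    ```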

    Stuck in Time: Negative Income Shock Constricts the Temporal Window of Valuation Spanning the Future and the Past

    Insufficient resources are associated with negative consequences, including decreased valuation of future reinforcers. To determine whether these effects result from scarcity, we examined the consequences of acute, abrupt changes in resource availability on delay discounting, the subjective devaluation of rewards as the delay to receipt increases. In the current study, 599 individuals recruited from Amazon Mechanical Turk read a narrative of a sudden change (positive, neutral, or negative) to one's hypothetical future income and completed a delay discounting task examining future and past monetary gains and losses. The effect of the explicit-zero procedure, a framing manipulation, was also examined. Negative income shock significantly increased discounting rates for gains and losses occurring both in the future and the past. Positive income windfalls significantly decreased discounting, though to a lesser extent. The framing procedure significantly reduced discounting under all conditions. Negative income shocks may result in short-term choices.
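
    For readers outside this literature: delay discounting is most commonly quantified with Mazur's hyperbolic model, in which the subjective value V of an amount A delayed by D is

    ```latex
    V = \frac{A}{1 + kD}
    ```

    so steeper discounting corresponds to a larger fitted k. The abstract does not state which discounting model was fitted, so this is for orientation only.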

    The multiple ionospheric probe: Auroral ionospheric report

    A multiple impedance and resonance probe payload for observing ionospheric properties from a Nike-Apache rocket.

    Dwarf mongoose alarm calls: investigating a complex non-human animal call

    Communication plays a vital role in the social lives of many species and varies greatly in complexity. One possible way to increase communicative complexity is to combine signals into longer sequences, which has been proposed as a mechanism allowing species with a limited repertoire to increase their communicative output. In mammals, most studies on combinatoriality have focused on vocal communication in non-human primates. Here, we investigated a potential combination of alarm calls in the dwarf mongoose (Helogale parvula), a non-primate mammal. Acoustic analyses and playback experiments with a wild population suggest: i) that dwarf mongooses produce a complex call type (T3) which, at least at the surface level, appears to comprise units that are not functionally different from two meaningful alarm calls (aerial and terrestrial); and ii) that this T3 call functions as a general alarm, produced in response to a wide range of threats. Using a novel approach, we further explored multiple interpretations of the T3 call based on the information content of its apparent component calls and how they are combined. We also considered an alternative, non-combinatorial interpretation that frames T3 as the origin, rather than the product, of the individual alarm calls. This study complements previous knowledge of vocal combinatoriality in non-primate mammals and introduces an approach that could facilitate comparisons between different animal and human communication systems.

    Choice Bundling Increases Valuation of Delayed Losses More Than Gains in Cigarette Smokers

    Choice bundling, in which a single choice produces a series of repeating consequences over time, increases valuation of delayed monetary and non-monetary gains. Interventions derived from this manipulation may be an effective method for mitigating the elevated delay discounting rates observed in cigarette smokers. No prior work, however, has investigated whether the effects of choice bundling generalize to reward losses. In the present study, an online panel of cigarette smokers (N = 302), recruited using the survey firms Ipsos and InnovateMR, completed assessments for either monetary gains or losses (randomly assigned). In Step 1, participants completed a delay-discounting task to establish Effective Delay 50 (ED50), the delay required for an outcome to lose half of its value. In Step 2, participants completed three conditions of an adjusting-amount task, choosing between a smaller, sooner (SS) adjusting amount and a larger, later (LL) fixed amount. The bundle size (i.e., number of consequences) was manipulated across conditions, such that a single choice produced either 1 (control), 3, or 9 consequences over time (ascending/descending order counterbalanced). The delay to the first LL amount in each condition, as well as the intervals between all additional SS and LL amounts (where applicable), were set to individual participants' ED50 values from Step 1 to control for differences in discounting of gains and losses. Results from Step 1 showed significantly higher ED50 values (i.e., less discounting) for losses compared to gains (p < 0.001). Results from Step 2 showed that choice bundling significantly increased valuation of both LL gains and losses (p < 0.001), although effects were significantly greater for losses (p < 0.01). Sensitivity analyses replicated these conclusions. Future research should examine the potential clinical utility of choice bundling, such as the development of motivational interventions that emphasize both the bundled health gains associated with smoking cessation and the health losses associated with continued smoking.
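
    Under the standard hyperbolic discounting model V = A/(1 + kD), ED50 has a closed form: setting V = A/2 gives D = 1/k. A minimal sketch of recovering k and ED50 from indifference points follows; the data, starting value, and fitting procedure are illustrative assumptions, not taken from the study.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def hyperbolic(delay, k):
        """Mazur's hyperbolic model for the value of a unit amount."""
        return 1.0 / (1.0 + k * delay)

    # Hypothetical indifference points: value as a fraction of the amount.
    delays = np.array([1.0, 7.0, 30.0, 90.0, 365.0])   # days
    values = np.array([0.95, 0.80, 0.55, 0.35, 0.15])

    (k,), _ = curve_fit(hyperbolic, delays, values, p0=[0.01])
    ed50 = 1.0 / k   # A/(1 + k*D) = A/2  =>  D = 1/k
    print(f"k = {k:.4f} per day, ED50 = {ed50:.1f} days")
    ```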

    Validation of differential gene expression algorithms: Application comparing fold-change estimation to hypothesis testing

    Background: Sustained research on the problem of determining which genes are differentially expressed on the basis of microarray data has yielded a plethora of statistical algorithms, each justified by theory, simulation, or ad hoc validation, and yet differing in practical results from equally justified algorithms. Recently, a concordance method that measures agreement among gene lists has been introduced to assess various aspects of differential gene expression detection. This method has the advantage of basing its assessment solely on the results of real data analyses, but as it requires examining gene lists of given sizes, it may be unstable.

    Results: Two methodologies for assessing predictive error are described: a cross-validation method and a posterior predictive method. As a nonparametric method of estimating prediction error from observed expression levels, cross validation provides an empirical approach to assessing algorithms for detecting differential gene expression that is fully justified for large numbers of biological replicates. Because it leverages the knowledge that only a small portion of genes are differentially expressed, the posterior predictive method is expected to provide more reliable estimates of algorithm performance, allaying concerns about limited biological replication. In practice, the posterior predictive method can assess when its approximations are valid and when they are inaccurate. Under conditions in which its approximations are valid, it corroborates the results of cross validation. Both comparison methodologies are applicable to both single-channel and dual-channel microarrays. For the data sets considered, estimating prediction error by cross validation demonstrates that empirical Bayes methods based on hierarchical models tend to outperform algorithms based on selecting genes by their fold changes or by non-hierarchical model-selection criteria. (The latter two approaches have comparable performance.) The posterior predictive assessment corroborates these findings.

    Conclusions: Algorithms for detecting differential gene expression may be compared by estimating each algorithm's error in predicting expression ratios, whether such ratios are defined across microarray channels or between two independent groups. According to two distinct estimators of prediction error, algorithms using hierarchical models outperform the other algorithms of the study. The fact that fold-change shrinkage performed as well as conventional model selection criteria calls for investigating algorithms that combine the strengths of significance testing and fold-change estimation.
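
    As a schematic of the cross-validation methodology (not the paper's exact procedure): hold out half of the biological replicates, estimate each gene's log expression ratio from the remaining replicates, and score the estimate against the held-out replicate mean by squared error; a shrunken fold-change estimator can then be compared with the plain one. All names, the shrinkage rule, and the synthetic data below are assumptions.

    ```python
    import numpy as np

    def cv_prediction_error(log_ratios, shrink=0.0, n_splits=20, seed=0):
        """Cross-validated prediction error of a fold-change estimator.

        log_ratios: genes x replicates array of log expression ratios.
        shrink: amount of soft shrinkage toward zero (0 = plain fold change).
        """
        rng = np.random.default_rng(seed)
        genes, reps = log_ratios.shape
        errs = []
        for _ in range(n_splits):
            idx = rng.permutation(reps)
            train, test = idx[: reps // 2], idx[reps // 2 :]
            est = log_ratios[:, train].mean(axis=1)
            est = np.sign(est) * np.maximum(np.abs(est) - shrink, 0.0)  # soft shrinkage
            err = ((log_ratios[:, test].mean(axis=1) - est) ** 2).mean()
            errs.append(err)
        return float(np.mean(errs))

    # Hypothetical data: 1000 genes, 6 replicates, ~5% truly differentially expressed.
    rng = np.random.default_rng(1)
    truth = np.where(rng.random(1000) < 0.05, rng.normal(0, 2, 1000), 0.0)
    data = truth[:, None] + rng.normal(0, 1, (1000, 6))
    print(cv_prediction_error(data, shrink=0.0))   # plain fold change
    print(cv_prediction_error(data, shrink=0.4))   # shrunken fold change, often lower
    ```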

    Coherent frequentism

    By representing the range of fair betting odds according to a pair of confidence set estimators, dual probability measures on parameter space called frequentist posteriors secure the coherence of subjective inference without any prior distribution. The closure of the set of expected losses corresponding to the dual frequentist posteriors constrains decisions without arbitrarily forcing optimization under all circumstances. This decision theory reduces to those that maximize expected utility when the pair of frequentist posteriors is induced by an exact or approximate confidence set estimator or when an automatic reduction rule is applied to the pair. In such cases, the resulting frequentist posterior is coherent in the sense that, as a probability distribution of the parameter of interest, it satisfies the axioms of the decision-theoretic and logic-theoretic systems typically cited in support of the Bayesian posterior. Unlike the p-value, the confidence level of an interval hypothesis derived from such a measure is suitable as an estimator of the indicator of hypothesis truth, since it converges in sample-space probability to 1 if the hypothesis is true and to 0 otherwise under general conditions.
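
    The closing claim can be stated compactly. Writing Γ_n(H) for the confidence level that such a measure assigns to an interval hypothesis H after n observations (notation assumed here, not taken from the paper), the abstract asserts

    ```latex
    % Consistency of the confidence level as an estimator of hypothesis truth
    \Gamma_n(H) \xrightarrow{\;p\;} \mathbf{1}\{\theta \in H\}
    \qquad \text{as } n \to \infty
    ```

    i.e., the confidence level is a consistent estimator of the indicator of hypothesis truth, which the p-value is not.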