109,008 research outputs found

    Word-of-mouth recruiting: Why small businesses using this efficient practice should survive disparate impact challenges under Title VII

    Joe’s Widget Company, a sole proprietorship specializing in manufacturing and selling widgets to businesses and consumers, operates in a small community in the Midwest. Joe started the company eighteen years ago in his own garage, with a five-thousand-dollar loan from his father and his brother as his only employee. Within two decades, Joe’s had annual sales exceeding four million dollars and sold its widgets to customers in three countries and twelve states. With twenty-one employees, Joe’s had become one of the community’s fifteen largest employers.

    An evaluation of constructivism for learners with ADHD: Development of a constructivist pedagogy for special needs

    We examine whether constructivist eLearning tools can be used to help learners cope with special educational needs, such as difficulties with attention and concentration. Preliminary work is reported here, in which we seek to determine why a constructivist approach is difficult for learners with ADHD. This work is intended to lead to recommendations on how learners with ADHD could benefit from constructivist eLearning systems, e.g. through the managed use of multimedia technology. A preliminary model has been developed that illustrates the areas in which constructivist pedagogies need to address the limitations of ADHD learners. Further work will expand this model and eventually test it in a real environment (e.g. in a school with ADHD learners). The outcome will encourage a reconsideration of existing multimedia theories as they relate to learners with special needs, and provide new directions for supporting learners with ADHD.

    βCaMKII regulates bidirectional long-term plasticity in cerebellar Purkinje cells by a CaMKII/PP2B switch mechanism

    This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated. Peer reviewed.

    Three-dimensional context-aware tailoring of information

    This is the post-print version of the article; the official published version can be accessed from the link below. Copyright @ 2010 Emerald.
    Purpose – The purpose of this paper is to explore the notion of context in ubiquitous computing. Personal Information Managers exploit the ubiquitous paradigm in mobile computing to integrate services and programs for business and leisure. Because every situation is constituted by information and events, context varies depending on the situation in which users find themselves. The paper aims to show the viability of tailoring contextual information to provide users with timely and relevant information.
    Design/methodology/approach – A survey was conducted after testing with a group of real-world users. The test group used the application for approximately half a day each and performed a number of tasks.
    Findings – The results of the survey show the viability of tailoring contextual information to provide users with timely and relevant information. Among other questions, users were asked to state whether or not they would like to use this application in their daily life. Statistically significant results indicate that the users found value in using the application.
    Originality/value – This work is a new exploration and implementation of context that integrates three dimensions of context: social information, activity information, and geographical position.
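
    To make the three-dimensional context model concrete, here is a minimal Python sketch of the idea described in the abstract; the record fields and the scoring rule are our own assumptions for illustration, not the paper's implementation.

```python
from dataclasses import dataclass

# Hypothetical illustration: the paper integrates three dimensions of
# context (social, activity, geographical); all names here are assumptions.

@dataclass
class Context:
    social: set[str]                # e.g. nearby contacts or groups
    activity: str                   # e.g. "in-meeting", "commuting"
    location: tuple[float, float]   # (latitude, longitude)

def _close(a, b, eps=0.01):
    """Crude proximity test on coordinate pairs (illustrative only)."""
    return abs(a[0] - b[0]) < eps and abs(a[1] - b[1]) < eps

def tailor(items: list[dict], ctx: Context) -> list[dict]:
    """Rank information items by how many context dimensions they match."""
    def score(item: dict) -> int:
        s = 0
        if ctx.social & set(item.get("audience", [])):
            s += 1
        if item.get("activity") == ctx.activity:
            s += 1
        if item.get("near") and _close(item["near"], ctx.location):
            s += 1
        return s
    return sorted(items, key=score, reverse=True)

ctx = Context(social={"team-a"}, activity="in-meeting", location=(51.5, -0.1))
items = [{"title": "agenda", "audience": ["team-a"], "activity": "in-meeting"},
         {"title": "lunch offer", "near": (48.8, 2.3)}]
print([i["title"] for i in tailor(items, ctx)])  # agenda ranked first
```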

    Variational Principles for Stochastic Soliton Dynamics

    We develop a variational method of deriving stochastic partial differential equations whose solutions follow the flow of a stochastic vector field. As an example in one spatial dimension, we numerically simulate singular solutions (peakons) of the stochastically perturbed Camassa-Holm (CH) equation derived using this method. These numerical simulations show that peakon soliton solutions of the stochastically perturbed CH equation persist and provide an interesting laboratory for investigating the sensitivity and accuracy of adding stochasticity to finite-dimensional solutions of stochastic partial differential equations (SPDE). In particular, some choices of stochastic perturbations of the peakon dynamics by Wiener noise (canonical Hamiltonian stochastic deformations, or CH-SD) allow peakons to interpenetrate and exchange order on the real line in overtaking collisions, although this behaviour does not occur for other choices of stochastic perturbations which preserve the Euler-Poincaré structure of the CH equation (parametric stochastic deformations, or P-SD), and it also does not occur for peakon solutions of the unperturbed deterministic CH equation. The discussion raises issues about the science of stochastic deformations of finite-dimensional approximations of evolutionary PDE and the sensitivity of the resulting solutions to the choices made in stochastic modelling.
    Comment: 21 pages, 15 figures, 2nd version.
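
    As a flavour of the finite-dimensional peakon dynamics mentioned above, the Python sketch below integrates the canonical two-peakon ODEs of the deterministic CH equation and adds simple additive Wiener noise to the positions. This is an illustrative stand-in only; the paper's CH-SD and P-SD constructions perturb the dynamics in more structured ways, and all parameter values here are assumptions.

```python
import numpy as np

# Minimal sketch: peakon dynamics for the Hamiltonian
#   H = 0.5 * sum_ij p_i p_j exp(-|q_i - q_j|),
# with additive Wiener noise on the positions q as a crude stand-in for
# the paper's stochastic deformations (CH-SD / P-SD are more structured).

rng = np.random.default_rng(0)

def peakon_rhs(q, p):
    """Canonical equations: dq_i/dt = dH/dp_i, dp_i/dt = -dH/dq_i."""
    dq = np.array([np.sum(p * np.exp(-np.abs(q[i] - q))) for i in range(len(q))])
    dp = np.array([p[i] * np.sum(p * np.sign(q[i] - q) * np.exp(-np.abs(q[i] - q)))
                   for i in range(len(q))])
    return dq, dp

def simulate(q, p, T=20.0, dt=1e-3, sigma=0.1):
    """Euler-Maruyama: deterministic peakon drift plus sigma*dW on positions."""
    for _ in range(int(T / dt)):
        dq, dp = peakon_rhs(q, p)
        q = q + dq * dt + sigma * np.sqrt(dt) * rng.standard_normal(len(q))
        p = p + dp * dt
    return q, p

# A fast, tall peakon chasing a slow, short one: an overtaking-collision setup.
q0, p0 = np.array([-5.0, 0.0]), np.array([2.0, 1.0])
print(simulate(q0, p0))
```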

    The radiative forcing potential of different climate geoengineering options

    Climate geoengineering proposals seek to rectify the Earth's current and potential future radiative imbalance, either by reducing the absorption of incoming solar (shortwave) radiation, or by removing CO2 from the atmosphere and transferring it to long-lived reservoirs, thus increasing outgoing longwave radiation. A fundamental criterion for evaluating geoengineering options is their climate cooling effectiveness, which we quantify here in terms of radiative forcing potential. We use a simple analytical approach, based on energy balance considerations and pulse response functions for the decay of CO2 perturbations. This aids transparency compared to calculations with complex numerical models, but is not intended to be definitive. It allows us to compare the relative effectiveness of a range of proposals. We consider geoengineering options as additional to large reductions in CO2 emissions. By 2050, some land carbon cycle geoengineering options could be of comparable magnitude to mitigation "wedges", but only stratospheric aerosol injections, albedo enhancement of marine stratocumulus clouds, or sunshades in space have the potential to cool the climate back toward its pre-industrial state. Strong mitigation, combined with global-scale air capture and storage, afforestation, and bio-char production, i.e. enhanced CO2 sinks, might be able to bring CO2 back to its pre-industrial level by 2100, thus removing the need for other geoengineering. Alternatively, strong mitigation stabilising CO2 at 500 ppm, combined with geoengineered increases in the albedo of marine stratiform clouds, grasslands, croplands and human settlements, might achieve a patchy cancellation of radiative forcing. Ocean fertilisation options are only worthwhile if sustained on a millennial timescale, and phosphorus addition may have greater long-term potential than iron or nitrogen fertilisation. Enhancing ocean upwelling or downwelling has trivial effects on any meaningful timescale. Our approach provides a common framework for the evaluation of climate geoengineering proposals, and our results should help inform the prioritisation of further research into them.
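
    A toy version of this kind of calculation, as a hedged Python sketch: the logarithmic CO2 forcing expression with coefficient 5.35 W m^-2 is the standard simplified formula, while the pulse-response fractions and timescales below are approximate Bern-model values used purely for illustration, not the paper's exact parameters.

```python
import numpy as np

# Sketch only: forcing of a CO2 perturbation that decays by a pulse
# response function. Coefficients below are illustrative assumptions.

A = [0.217, 0.259, 0.338, 0.186]        # response fractions (sum to 1)
TAU = [np.inf, 172.9, 18.51, 1.186]     # e-folding timescales in years

def airborne_fraction(t):
    """Fraction of a CO2 pulse remaining in the atmosphere after t years."""
    return sum(a * (1.0 if np.isinf(tau) else np.exp(-t / tau))
               for a, tau in zip(A, TAU))

def forcing(c_ppm, c0_ppm=278.0):
    """Radiative forcing (W m^-2) of CO2 concentration c relative to c0."""
    return 5.35 * np.log(c_ppm / c0_ppm)

# Example (hypothetical numbers): remove 50 ppm from a 500 ppm atmosphere,
# then ask what forcing offset remains effective 40 years later, once part
# of the removal has been offset by carbon-cycle rebound.
removal_ppm = 50.0
effective = removal_ppm * airborne_fraction(40.0)
print(forcing(500.0) - forcing(500.0 - effective))  # ~0.25 W m^-2
```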

    From discretization to regularization of composite discontinuous functions

    Discontinuities between distinct regions, described by different equation sets, cause difficulties for PDE/ODE solvers. We present a new algorithm that eliminates integrator discontinuities by regularizing them. First, the algorithm determines the optimum switch point between two functions spanning adjacent or overlapping domains; the optimum switch point is found by searching for a "jump point" that minimizes the discontinuity between the adjacent/overlapping functions. Then, the discontinuity is resolved using an interpolating polynomial that joins the two discontinuous functions. This approach eliminates the need for conventional integrators either to discretize and then link discontinuities by generating interpolating polynomials based on state variables, or to reinitialize state variables when discontinuities are detected in an ODE/DAE system. In contrast to conventional approaches, which handle discontinuities at the state-variable level only, the new approach tackles discontinuity at both the state-variable and constitutive-equation levels. It thus eliminates the errors associated with interpolating polynomials generated at the state-variable level for discontinuities occurring in the constitutive equations. Memory requirements for this approach increase exponentially with the dimension of the discontinuous function, so there will be limitations for functions of relatively high dimension; however, memory availability continues to grow as prices fall, so this is not expected to be a major limitation.
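
    A one-dimensional Python sketch of the two-step procedure the abstract describes (function names, the search grid, and the bridging half-width are our assumptions): locate the switch point with the smallest jump, then replace a small neighbourhood of it with a smooth cubic Hermite bridge.

```python
import numpy as np

# Step 1: pick the switch point that minimizes the jump |f1 - f2|.
# Step 2: join f1 and f2 with a cubic Hermite polynomial over [xs-h, xs+h].

def optimum_switch(f1, f2, lo, hi, n=10001):
    """Search the overlap [lo, hi] for the point of smallest discontinuity."""
    x = np.linspace(lo, hi, n)
    return x[np.argmin(np.abs(f1(x) - f2(x)))]

def regularized(f1, f2, xs, h=0.05):
    """Composite: f1 left of xs, f2 right, cubic Hermite bridge on [xs-h, xs+h]."""
    a, b = xs - h, xs + h
    fa, fb = f1(a), f2(b)
    da = (f1(a + 1e-6) - f1(a - 1e-6)) / 2e-6   # numerical endpoint slopes
    db = (f2(b + 1e-6) - f2(b - 1e-6)) / 2e-6
    def f(x):
        x = np.asarray(x, dtype=float)
        t = np.clip((x - a) / (b - a), 0.0, 1.0)
        # Standard cubic Hermite basis functions on [a, b].
        bridge = (fa * (2*t**3 - 3*t**2 + 1) + fb * (-2*t**3 + 3*t**2)
                  + da * (b - a) * (t**3 - 2*t**2 + t)
                  + db * (b - a) * (t**3 - t**2))
        return np.where(x < a, f1(x), np.where(x > b, f2(x), bridge))
    return f

f1 = lambda x: np.sin(x)          # model valid in region 1 (hypothetical)
f2 = lambda x: 0.1 * x + 0.8      # model valid in region 2 (hypothetical)
xs = optimum_switch(f1, f2, 0.0, 3.0)
f = regularized(f1, f2, xs)
print(xs, f(np.array([xs - 0.2, xs, xs + 0.2])))  # smooth across the switch
```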

    Recent advances in understanding idiopathic pulmonary fibrosis

    Despite major research efforts leading to the recent approval of pirfenidone and nintedanib, the dismal prognosis of idiopathic pulmonary fibrosis (IPF) remains unchanged. The elaboration of international diagnostic criteria and disease stratification models based on clinical, physiological, radiological, and histopathological features has improved the accuracy of IPF diagnosis and prediction of mortality risk. Nevertheless, given the marked heterogeneity in clinical phenotype and the considerable overlap of IPF with other fibrotic interstitial lung diseases (ILDs), about 10% of cases of pulmonary fibrosis remain unclassifiable. Moreover, currently available tools fail to detect early IPF, predict the highly variable course of the disease, and assess response to antifibrotic drugs. Recent advances in understanding the multiple interrelated pathogenic pathways underlying IPF have identified various molecular phenotypes resulting from complex interactions among genetic, epigenetic, transcriptional, post-transcriptional, metabolic, and environmental factors. These different disease endotypes appear to confer variable susceptibility to the condition, differing risks of rapid progression, and, possibly, altered responses to therapy. The development and validation of diagnostic and prognostic biomarkers are necessary to enable a more precise and earlier diagnosis of IPF and to improve prediction of future disease behaviour. The availability of approved antifibrotic therapies together with potential new drugs currently under evaluation also highlights the need for biomarkers able to predict and assess treatment responsiveness, thereby allowing individualised treatment based on risk of progression and drug response. This approach of disease stratification and personalised medicine is already used in the routine management of many cancers and provides a potential road map for guiding clinical care in IPF.

    Dynamic Set Intersection

    Consider the problem of maintaining a family $F$ of dynamic sets subject to insertions, deletions, and set-intersection reporting queries: given $S, S' \in F$, report every member of $S \cap S'$ in any order. We show that in the word RAM model, where $w$ is the word size, given a cap $d$ on the maximum size of any set, we can support set intersection queries in $O(d/(w/\log^2 w))$ expected time, and updates in $O(\log w)$ expected time. Using this algorithm we can list all $t$ triangles of a graph $G=(V,E)$ in $O(m + m\alpha/(w/\log^2 w) + t)$ expected time, where $m=|E|$ and $\alpha$ is the arboricity of $G$. This improves a 30-year-old triangle enumeration algorithm of Chiba and Nishizeki running in $O(m\alpha)$ time. We provide an incremental data structure on $F$ that supports intersection witness queries, where we only need to find one $e \in S \cap S'$. Both queries and insertions take $O(\sqrt{N/(w/\log^2 w)})$ expected time, where $N = \sum_{S \in F} |S|$. Finally, we provide time/space tradeoffs for the fully dynamic set intersection reporting problem. Using $M$ words of space, each update costs $O(\sqrt{M \log N})$ expected time, each reporting query costs $O((N\sqrt{\log N}/\sqrt{M})\sqrt{op+1})$ expected time, where $op$ is the size of the output, and each witness query costs $O(N\sqrt{\log N}/\sqrt{M} + \log N)$ expected time.
    Comment: Accepted to WADS 2015.
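
    The word-packing intuition behind these bounds can be illustrated with a small Python sketch: storing each neighbour set as a bitmap lets a single machine AND intersect $w$ elements at once. This is only the intuition, not the paper's data structure, and the bit-decoding loop below is deliberately naive.

```python
# Sketch of the word-packing idea (not the paper's structure): Python ints
# act as arbitrary-width bitmaps, so one `&` intersects whole words.

def make_bitmap(elems, index):
    bm = 0
    for e in elems:
        bm |= 1 << index[e]
    return bm

def list_triangles(adj):
    """List each triangle once: for every edge (u, v), report N(u) & N(v)."""
    idx = {v: i for i, v in enumerate(adj)}            # vertex -> bit position
    bitmaps = {v: make_bitmap(adj[v], idx) for v in adj}
    order = {v: i for i, v in enumerate(adj)}          # break symmetry
    triangles = []
    for u in adj:
        for v in adj[u]:
            if order[u] < order[v]:
                common = bitmaps[u] & bitmaps[v]       # one AND per word
                for x in adj:                          # naive bit decoding
                    if common >> idx[x] & 1 and order[v] < order[x]:
                        triangles.append((u, v, x))
    return triangles

adj = {"a": {"b", "c"}, "b": {"a", "c", "d"},
       "c": {"a", "b", "d"}, "d": {"b", "c"}}
print(list_triangles(adj))  # [('a', 'b', 'c'), ('b', 'c', 'd')]
```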