
    Perceptions Measurement of Professional Certifications to Augment Buffalo State College Baccalaureate Technology Programs, as a Representative American Postsecondary Educational Institution

    The purpose of this study was to assess, measure, and analyze whether voluntary, nationally-recognized professional certification credentials were important to augment technology programs at Buffalo State College (BSC), as a representative postsecondary baccalaureate degree-granting institution offering technology curricula. Six BSC undergraduate technology programs were evaluated within the scope of this study: 1.) Computer Information Systems; 2.) Electrical Engineering, Electronics; 3.) Electrical Engineering, Smart Grid; 4.) Industrial Technology; 5.) Mechanical Engineering; and 6.) Technology Education. This study considered the following three aspects of the problem: a.) postsecondary technology program enrollment and graduation trends; b.) the value/awareness of professional certifications to employers and students; and c.) professional certification relevancy and postsecondary curricula integration. The study was conducted through surveys and interviews with four technology-related purposive sample groups: 1.) BSC program alumni; 2.) BSC and non-BSC technology program faculty; 3.) hiring managers/industry leaders; and 4.) non-BSC alumni and certification holders. In addition, this study included an analysis of relevant professional certification organizations and student enrollment data from the six technology programs within scope. Research methods included both quantitative and qualitative analytical techniques. This study concluded that undergraduate technology students benefitted from a greater awareness of relevant professional certifications and their perceived value. This study also found the academic community may be well served to acknowledge the increasing trend of professional certification integration into postsecondary technology programs.

    State-space solutions to the dynamic magnetoencephalography inverse problem using high performance computing

    Determining the magnitude and location of neural sources within the brain that are responsible for generating magnetoencephalography (MEG) signals measured on the surface of the head is a challenging problem in functional neuroimaging. The number of potential sources within the brain exceeds the number of recording sites by an order of magnitude. As a consequence, the estimates for the magnitude and location of the neural sources will be ill-conditioned because of the underdetermined nature of the problem. One well-known technique designed to address this imbalance is the minimum norm estimator (MNE). This approach imposes an L2 regularization constraint that serves to stabilize and condition the source parameter estimates. However, this class of regularizers is static in time and does not consider the temporal constraints inherent to the biophysics of the MEG experiment. In this paper we propose a dynamic state-space model that accounts for both spatial and temporal correlations within and across candidate intracortical sources. In our model, the observation model is derived from the steady-state solution to Maxwell's equations, while the latent model representing neural dynamics is given by a random walk process. Comment: Published at http://dx.doi.org/10.1214/11-AOAS483 in the Annals of Applied Statistics (http://www.imstat.org/aoas/) by the Institute of Mathematical Statistics (http://www.imstat.org).
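The minimum norm estimator described in this abstract amounts to L2-regularized (ridge-style) linear inversion of the lead-field matrix. A minimal sketch, with made-up sensor and source counts and a random stand-in for the lead field (the real matrix comes from Maxwell's equations and the head geometry):

```python
import numpy as np

# Hypothetical sizes: many candidate sources, few MEG sensors (underdetermined).
rng = np.random.default_rng(0)
n_sensors, n_sources = 50, 500
L = rng.standard_normal((n_sensors, n_sources))   # stand-in lead-field (observation) matrix
y = rng.standard_normal(n_sensors)                # measured field at one time point

# L2-regularized minimum norm estimate:
#   x_hat = L^T (L L^T + lam * I)^{-1} y
lam = 1.0
x_hat = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_sensors), y)
```

Note this estimate is computed independently at each time point, which is exactly the static behavior the paper's state-space model is designed to replace.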

    Trust and Delegation

    Due to imperfect transparency and costly auditing, trust is an essential component of financial intermediation. In this paper we study a comprehensive sample of due diligence reports from a major hedge fund due diligence firm. A routine feature of due diligence is an assessment of integrity. We find that misrepresentation about past legal and regulatory problems is frequent (21%), as are incorrect or unverifiable representations about other topics (28%). Misrepresentation, the failure to use a major auditing firm, and the use of internal pricing are significantly related to legal and regulatory problems, indices of operational risk. Due diligence (DD) reports are costly and are only performed when a fund is seriously considered for investment. It is important to control for this conditioning, which would otherwise bias cross-sectional analysis. We find that DD reports are typically issued on high-return funds three months after the historical performance has peaked. DD reports are also issued at the point of highest cash flow into the fund. This pattern is consistent with return-chasing behavior by institutional hedge fund investors.

    Universally Sloppy Parameter Sensitivities in Systems Biology

    Quantitative computational models play an increasingly important role in modern biology. Such models typically involve many free parameters, and assigning their values is often a substantial obstacle to model development. Directly measuring in vivo biochemical parameters is difficult, and collectively fitting them to other data often yields large parameter uncertainties. Nevertheless, in earlier work we showed in a growth-factor-signaling model that collective fitting could yield well-constrained predictions, even when it left individual parameters very poorly constrained. We also showed that the model had a 'sloppy' spectrum of parameter sensitivities, with eigenvalues roughly evenly distributed over many decades. Here we use a collection of models from the literature to test whether such sloppy spectra are common in systems biology. Strikingly, we find that every model we examine has a sloppy spectrum of sensitivities. We also test several consequences of this sloppiness for building predictive models. In particular, sloppiness suggests that collective fits to even large amounts of ideal time-series data will often leave many parameters poorly constrained. Tests over our model collection are consistent with this suggestion. This difficulty with collective fits may seem to argue for direct parameter measurements, but sloppiness also implies that such measurements must be formidably precise and complete to usefully constrain many model predictions. We confirm this implication in our signaling model. Our results suggest that sloppy sensitivity spectra are universal in systems biology models. The prevalence of sloppiness highlights the power of collective fits and suggests that modelers should focus on predictions rather than on parameters. Comment: Submitted to PLoS Computational Biology. Supplementary Information available in "Other Formats" bundle. Discussion slightly revised to add historical context.
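The sloppy sensitivity spectrum described above can be reproduced on a toy problem. The sketch below uses a hypothetical sum-of-exponentials model (a standard illustration, not one of the biology models surveyed in the paper): it builds the Jacobian of the model outputs with respect to the rate constants and checks that the eigenvalues of the Gauss-Newton Hessian J^T J spread over several decades.

```python
import numpy as np

# Toy model: y(t) = sum_i exp(-k_i * t), with invented rate constants.
t = np.linspace(0, 5, 40)
k = np.array([0.3, 1.0, 3.0, 10.0])

# Analytic Jacobian column for each parameter:
#   d/dk_i exp(-k_i t) = -t * exp(-k_i t)
J = np.stack([-t * np.exp(-ki * t) for ki in k], axis=1)

# Sensitivity eigenvalues come from the Gauss-Newton Hessian J^T J.
eigvals = np.linalg.eigvalsh(J.T @ J)[::-1]          # descending order
decades = np.log10(eigvals[0] / eigvals[-1])         # spread of the spectrum
```

Even with only four parameters, the spectrum spans several orders of magnitude: the slowest rate dominates the fit while the fastest is nearly invisible in the data, which is the sloppiness the paper reports across its model collection.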

    Performance of GEDI space-borne LiDAR for quantifying structural variation in the temperate forests of South-Eastern Australia

    Monitoring forest structural properties is critical for a range of applications because structure is key to understanding and quantifying forest biophysical functioning, including stand dynamics, evapotranspiration, habitat, and recovery from disturbances. Monitoring of forest structural properties at desirable frequencies and cost globally is enabled by space-borne LiDAR missions such as the global ecosystem dynamics investigation (GEDI) mission. This study assessed the accuracy of GEDI estimates for canopy height, total plant area index (PAI), vertical profile of plant area volume density (PAVD), and elevation over a gradient of canopy height and terrain slope, compared to estimates derived from airborne laser scanning (ALS) across two forest age-classes in the Central Highlands region of south-eastern Australia. ALS was used as a reference dataset for validation of the GEDI (Version 2) dataset. Canopy height and total PAI analyses were carried out at the landscape level to understand the influence of beam-type, height of the canopy, and terrain slope. An assessment of GEDI's terrain elevation accuracy was also carried out at the landscape level. The PAVD profile evaluation was carried out using footprints grouped into two forest age-classes, based on the areas of mountain ash (Eucalyptus regnans) forest burnt in the Central Highlands during the 1939 and 2009 wildfires. The results indicate that although GEDI significantly under-estimates the total PAI and slightly over-estimates the canopy height, the GEDI estimates of canopy height and the vertical PAVD profile (above 25 m) show a good level of accuracy. Both beam-types had comparable accuracies, with increasing slope having a slightly detrimental effect on accuracy. The assessment of GEDI's terrain elevation accuracy found an RMSE of 10.58 m and a bias of 1.28 m, with an R2 of 1.00. The results showed GEDI is suitable for estimating canopy density and height in the complex forests of south-eastern Australia.
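The validation metrics quoted in this abstract (RMSE, bias, R2) are straightforward to compute from paired footprint estimates. A minimal sketch on synthetic data, where the heights and the error model are invented for illustration and are not the study's data:

```python
import numpy as np

# Hypothetical paired canopy-height estimates at matched footprints:
# ALS as the reference, GEDI simulated with a small bias plus noise.
rng = np.random.default_rng(2)
als = rng.uniform(20, 80, size=200)            # ALS-derived heights (m)
gedi = als + rng.normal(1.0, 3.0, size=200)    # simulated GEDI heights (m)

err = gedi - als
bias = err.mean()                      # mean signed error (m)
rmse = np.sqrt((err ** 2).mean())      # root-mean-square error (m)
r2 = np.corrcoef(als, gedi)[0, 1] ** 2 # squared Pearson correlation
```

A positive bias with a larger RMSE, alongside a high R2, is the same qualitative pattern the study reports for terrain elevation: a small systematic offset, larger scatter, and strong overall agreement with the ALS reference.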

    The sloppy model universality class and the Vandermonde matrix

    In a variety of contexts, physicists study complex, nonlinear models with many unknown or tunable parameters to explain experimental data. We explain why such systems so often are sloppy: the system behavior depends only on a few 'stiff' combinations of the parameters and is unchanged as other 'sloppy' parameter combinations vary by orders of magnitude. We contrast examples of sloppy models (from systems biology, variational quantum Monte Carlo, and common data fitting) with systems which are not sloppy (multidimensional linear regression, random matrix ensembles). We observe that the eigenvalue spectra for the sensitivity of sloppy models have a striking, characteristic form, with a density of logarithms of eigenvalues which is roughly constant over a large range. We suggest that the common features of sloppy models indicate that they may belong to a common universality class. In particular, we motivate focusing on a Vandermonde ensemble of multiparameter nonlinear models and show in one limit that they exhibit the universal features of sloppy models. Comment: New content added.
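The Vandermonde connection is easy to see numerically: the columns of a Vandermonde matrix (successive powers of a sample grid) are nearly collinear, so the eigenvalues of its Gram matrix spread over many decades, the sloppy signature. A short sketch, where the grid and polynomial order are arbitrary choices for illustration:

```python
import numpy as np

# Vandermonde matrix with columns t^0, t^1, ..., t^(n-1) on [0, 1].
t = np.linspace(0, 1, 50)
n = 8
V = np.vander(t, n, increasing=True)

# Eigenvalues of the Gram matrix V^T V, in descending order.
eigvals = np.linalg.eigvalsh(V.T @ V)[::-1]
log_spread = np.log10(eigvals[0] / eigvals[-1])
```

Because monomials on [0, 1] overlap so strongly, even this modest 8-parameter case yields eigenvalues spanning many orders of magnitude, roughly evenly spaced in the logarithm, matching the characteristic density described in the abstract.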
