
    Application of Statistical Methods for Improving Models of Intramuscular Percentage Fat Prediction in Live Beef Animals From Real-Time Ultrasound Images

    Get PDF
    Real-time ultrasound images of the Longissimus dorsi muscle across the 11th to 13th ribs of 720 live bulls and steers were acquired over a period of four years. The actual intramuscular fat percentage (IFAT) was determined using n-hexane extraction, with a mean of 4.98%, a standard deviation of 2.12%, and a range from 1.10% to 14.68%. Image-processing techniques were used to calculate parameters quantifying the image texture patterns. The parameters that showed good correlations with the actual IFAT were used to develop a statistical linear regression model. The accuracy of prediction was very good for actual IFAT less than or equal to 8% (the low-IFAT group), with a root mean square error (RMSE) of around 1.0%. However, the model was much less accurate for IFAT values above 8% (the high-IFAT group), with RMSE above 1.5%. One reason for this could be the limited ability of the ultrasound technique to resolve differences in high-IFAT muscles in terms of image texture patterns. In addition, this group contained fewer than 10% of the images collected, which may be an inadequate sample. Overall prediction accuracy was improved by developing separate regression models for the low-IFAT and high-IFAT groups. Statistical pattern recognition and classification techniques were applied to “pre-classify” the images into the low- or high-IFAT group before they were passed to the regression prediction models. The techniques applied included cluster analysis, discriminant analysis, and classification and regression trees (CART). The classification tree gave the best results, with overall classification accuracy of around 90% for the low- and high-IFAT groups of images. In conclusion, the overall accuracy of predicting IFAT from ultrasound image parameters and regression models can be improved by first separating the high-IFAT group from the low-IFAT group using statistical classification methods.
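    The two-stage idea described in this abstract can be sketched in a few lines: pre-classify each sample into a low- or high-IFAT group, then apply a separate linear regression within each group and check the per-group RMSE. The texture parameter, the threshold, and all data values below are hypothetical illustrations, not the paper's actual measurements or models (the paper uses a classification tree, which a simple threshold stands in for here).

```python
import math

def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def rmse(pred, actual):
    """Root mean square error between predictions and actual values."""
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(pred, actual)) / len(pred))

# Hypothetical (texture_parameter, actual_IFAT_percent) pairs.
data = [(0.9, 2.1), (1.4, 3.0), (2.0, 4.4), (2.7, 5.9), (3.1, 6.8),
        (4.6, 9.5), (5.2, 10.9), (6.0, 12.4)]

# Stage 1: "pre-classify" into low/high groups with a simple threshold
# on the texture parameter (stand-in for the paper's classification tree).
low  = [(x, y) for x, y in data if x <= 4.0]
high = [(x, y) for x, y in data if x > 4.0]

# Stage 2: fit a separate regression model per group and report its RMSE.
models = {}
for name, group in (("low", low), ("high", high)):
    xs, ys = zip(*group)
    a, b = fit_linear(list(xs), list(ys))
    preds = [a + b * x for x in xs]
    models[name] = (a, b, rmse(preds, list(ys)))

for name, (a, b, err) in models.items():
    print(f"{name}-IFAT model: IFAT = {a:.2f} + {b:.2f}*texture, RMSE = {err:.3f}%")
```

    Fitting the two groups separately lets each line match its own range of the data, which is the mechanism behind the improved overall accuracy the abstract reports.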

    Impact of the selenisation temperature on the structural and optical properties of CZTSe absorbers

    Get PDF
    We present structural and optical spectroscopy studies of thin films of Cu2ZnSnSe4 (CZTSe) with strong copper deficiency, deposited on Mo/glass substrates and selenised at 450, 500 or 550 °C. Solar cells fabricated from these films demonstrated efficiencies of up to 7.4% for selenisation at 500 °C. Structural analysis based on X-ray diffraction and Raman spectroscopy revealed the presence of SnSe2 in the film selenised at 450 °C; this phase was not detected in the films selenised at higher temperatures. A progressive decrease in the Sn and Se content was observed as the selenisation temperature increased. Photoluminescence excitation was used to determine the bandgaps at 4.2 K. Detailed measurements of the temperature and excitation-intensity dependences of the photoluminescence spectra allow the recombination mechanisms of the observed emission bands to be identified as band-to-impurity and band-to-band transitions, and their evolution with selenisation temperature to be analysed. The strongest band-to-band transition is recorded in the PL spectra of the film selenised at 500 °C and can be observed from 6 K up to room temperature. The compositional and structural changes in the films and their influence on the optoelectronic properties of CZTSe and the solar cells are discussed.

    All electron and pseudopotential study of the spin polarization of the V (001) surface: LDA versus GGA

    Full text link
    The spin polarization at the V(001) surface has been studied using different local (LSDA) and semilocal (GGA) approximations to the exchange-correlation potential of DFT within two ab initio methods: the all-electron TB-LMTO-ASA and the pseudopotential LCAO code SIESTA (Spanish Initiative for Electronic Simulations with Thousands of Atoms). A comparative analysis is performed first for the bulk and then for N-layer V(001) films (7 < N < 15). The LSDA leads to a non-magnetic V(001) surface with both theoretical models, in agreement (disagreement) with magneto-optical Kerr (electron-capture spectroscopy) experiments. The GGA within the pseudopotential method needs thicker slabs than the LSDA to yield zero moment at the central layer, giving a high surface magnetization (1.70 Bohr magnetons), in contrast with the non-magnetic solution obtained with the all-electron code. Comment: 12 pages, 1 figure. LaTeX gzipped tar file

    Fully relativistic calculation of magnetic properties of Fe, Co and Ni adclusters on Ag(100)

    Full text link
    We present first-principles calculations of the magnetic moments and magnetic anisotropy energies of small Fe, Co and Ni clusters on top of a Ag(100) surface, as well as the exchange-coupling energy between two single adatoms of Fe or Co on Ag(100). The calculations are performed fully relativistically using the embedding technique within the Korringa-Kohn-Rostoker method. The magnetic anisotropy and the exchange-coupling energies are calculated by means of the force theorem. In the case of adatoms and dimers of iron and cobalt we obtain enhanced spin moments and, especially, unusually large orbital moments, while for nickel our calculations predict a complete absence of magnetism. For larger clusters, the magnitudes of the local moments of the atoms in the center of the cluster are very close to those calculated for the corresponding monolayers. Similar to the orbital moments, the contributions of the individual atoms to the magnetic anisotropy energy strongly depend on the position, hence on the local environment of a particular atom within a given cluster. We find strong ferromagnetic coupling between two neighboring Fe or Co atoms and a rapid, oscillatory decay of the exchange-coupling energy with increasing distance between these two adatoms. Comment: 8 pages, ReVTeX + 4 figures (Encapsulated Postscript), submitted to PR

    Supersymmetric black holes in 2D dilaton supergravity: baldness and extremality

    Full text link
    We present a systematic discussion of supersymmetric solutions of 2D dilaton supergravity. In particular, those solutions which retain at least half of the supersymmetries are ground states with respect to the bosonic Casimir function (essentially the ADM mass). Nevertheless, by tuning the prepotential appropriately, black hole solutions may emerge with an arbitrary number of Killing horizons. The absence of dilatino and gravitino hair is proven. Moreover, the impossibility of supersymmetric dS ground states and of nonextremal black holes is confirmed, even in the presence of a dilaton. In these derivations the knowledge of the general analytic solution of 2D dilaton supergravity plays an important role. The latter result is addressed in the more general context of gPSMs which have no supergravity interpretation. Finally it is demonstrated that the inclusion of non-minimally coupled matter, a step which is already nontrivial by itself, does not change these features in an essential way. Comment: 30 pages, LaTeX, v2: major revision (rearranged title, shortened abstract, revised introduction, inserted section from appendix to main text, added subsection with new material on non-SUGRA gPSMs, added clarifying remarks at some places, updated references); v3: corrected minor misprints, added note with a new reference

    Applying a business intelligence system in a big data context: production companies

    Get PDF
    Industry 4.0 promotes the automation of the manufacturing industry through computer systems, and its objective is the Smart Factory. Its development is considered a key factor in the strategic positioning not only of companies but also of regions, countries and continents in the short, medium and long term. Thus, it is no surprise that governments and institutions such as the United States and the European Commission already take this into consideration in the development of their industrial policies. This article presents a case study of the implementation of a BI system in an industrial food environment with Big Data characteristics, in which information from various sources is combined to provide information that improves management decision-making.

    Selecting the most suitable classification algorithm for supporting assistive technology adoption for people with dementia: A multicriteria framework

    Get PDF
    The number of people with dementia (PwD) is increasing dramatically. PwD exhibit impairments of reasoning, memory, and thought that require some form of self‐management intervention to support the completion of everyday activities while maintaining a level of independence. To address this need, efforts have been directed to the development of assistive technology solutions, which may provide an opportunity to alleviate the burden faced by PwD and their carers. Nevertheless, uptake of such solutions has been limited. It is therefore necessary to use classifiers to discriminate between adopters and nonadopters of these technologies in order to avoid cost overruns and potential negative effects on quality of life. As multiple classification algorithms have been developed, choosing the most suitable classifier has become a critical step in technology adoption. To select the most appropriate classifier, a set of criteria from various domains needs to be taken into account by decision makers. In addition, it is crucial to define the most appropriate multicriteria decision‐making approach for the modelling of technology adoption. Considering the above‐mentioned aspects, this paper presents a five‐phase methodology integrating the Fuzzy Analytic Hierarchy Process and the Technique for Order of Preference by Similarity to Ideal Solution to determine the most suitable classifier for supporting assistive technology adoption studies. The Fuzzy Analytic Hierarchy Process is used to determine the relative weights of criteria and subcriteria under uncertainty, and the Technique for Order of Preference by Similarity to Ideal Solution is applied to rank the classifier alternatives. A case study considering a mobile‐based self‐management and reminding solution for PwD is described to validate the proposed approach.
The results revealed that the best classifier was k‐nearest‐neighbour, with a closeness coefficient of 0.804, and that the most important criterion when selecting classifiers is scalability. The paper also discusses the strengths and weaknesses of each algorithm, which should be addressed in future research.
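    The TOPSIS ranking step described above can be sketched as follows: normalise and weight the decision matrix, find the ideal best and worst points, and compute each alternative's closeness coefficient CC = d⁻/(d⁺ + d⁻). The criteria, weights, and scores below are hypothetical stand-ins; the paper derives its actual weights with the Fuzzy Analytic Hierarchy Process rather than fixing them by hand.

```python
import math

alternatives = ["kNN", "SVM", "DecisionTree"]
criteria = ["accuracy", "scalability", "interpretability"]  # benefit criteria
weights = [0.3, 0.5, 0.2]        # hypothetical stand-in for Fuzzy-AHP weights
scores = [
    [0.85, 0.90, 0.70],  # kNN
    [0.88, 0.60, 0.40],  # SVM
    [0.80, 0.70, 0.95],  # DecisionTree
]

# 1. Vector-normalise each criterion column, then apply its weight.
norms = [math.sqrt(sum(row[j] ** 2 for row in scores)) for j in range(len(criteria))]
v = [[weights[j] * row[j] / norms[j] for j in range(len(criteria))] for row in scores]

# 2. Ideal best/worst per criterion (all criteria benefit-type here).
best  = [max(col) for col in zip(*v)]
worst = [min(col) for col in zip(*v)]

# 3. Euclidean distances to the ideals; closeness CC = d-/(d+ + d-).
def dist(row, ref):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(row, ref)))

cc = {alt: dist(row, worst) / (dist(row, best) + dist(row, worst))
      for alt, row in zip(alternatives, v)}

ranking = sorted(cc, key=cc.get, reverse=True)
print(ranking, {k: round(x, 3) for k, x in cc.items()})
```

    A CC close to 1 means an alternative sits near the ideal-best and far from the ideal-worst point, which is how a single figure such as the 0.804 reported for k-nearest-neighbour summarises performance across all weighted criteria.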

    Generalized Second Law of Thermodynamics on the Event Horizon for Interacting Dark Energy

    Full text link
    Here we seek the conditions for the validity of the generalized second law of thermodynamics (GSLT), assuming the first law of thermodynamics on the event horizon, when the FRW universe is filled with an interacting two-fluid system: one component in the form of cold dark matter and the other either holographic dark energy or new agegraphic dark energy. Using recent observational data, we find that the GSLT holds in both the quintessence era and the phantom era for the new agegraphic model, while for holographic dark energy the GSLT is valid only in the phantom era. Comment: 8 pages, 2 figures
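    The condition being tested can be stated compactly. In the standard setup (a sketch of the usual definitions; the paper's conventions may differ), the horizon entropy follows the area law for the event-horizon radius, and the GSLT requires that horizon plus fluid entropy never decreases:

```latex
% Horizon entropy via the area law, with R_E the event-horizon radius:
S_h = \frac{\pi R_E^2}{G},
\qquad
R_E = a(t)\int_t^{\infty}\frac{dt'}{a(t')} ,
% and the generalized second law demands
\frac{d}{dt}\left(S_h + S_{\mathrm{fluid}}\right) \ge 0 .
```

    The first law on the horizon is what ties the change of $S_h$ to the energy flux of the two interacting fluids, so the sign of the total entropy rate ends up depending on the dark-energy model and on whether the universe is in the quintessence or phantom regime.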

    K-Space at TRECVID 2008

    Get PDF
    In this paper we describe K-Space’s participation in TRECVid 2008 in the interactive search task. For 2008 the K-Space group performed one of the largest interactive video information retrieval experiments conducted in a laboratory setting. Three institutions participated in a multi-site, multi-system experiment. In total 36 users took part, 12 each from Dublin City University (DCU, Ireland), the University of Glasgow (GU, Scotland) and Centrum Wiskunde & Informatica (CWI, the Netherlands). Three user interfaces were developed: two from DCU, which had also been used in 2007, and one from GU. All interfaces leveraged the same search service. Using a Latin-squares arrangement, each user conducted 12 topics, leading in total to 6 runs per site, 18 in total. We officially submitted 3 of these runs to NIST for evaluation, with an additional expert run using a 4th system. Our submitted runs performed around the median. In this paper we present an overview of the search system used, the experimental setup and a preliminary analysis of our results.
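    The Latin-squares arrangement mentioned above balances topic order across users: with n users and n topics, row i of the square is a cyclic shift of the topic list, so every topic is seen exactly once by each user and exactly once in each session slot. The sketch below uses the per-site counts from the abstract (12 users, 12 topics); the topic labels are placeholders, not the actual TRECVid 2008 topics.

```python
# Build a cyclic 12x12 Latin square: row i shifts the topic list by i.
n = 12
topics = [f"topic{t + 1}" for t in range(n)]
square = [[topics[(i + j) % n] for j in range(n)] for i in range(n)]

# Latin-square property: no topic repeats within any row (user)
# or any column (session slot).
assert all(len(set(row)) == n for row in square)
assert all(len(set(col)) == n for col in zip(*square))

for user, row in enumerate(square[:3], start=1):
    print(f"user {user}: {row[:4]} ...")
```

    In a real experimental design the square would also be crossed with the system/interface assignment so that topic, order, and system effects are all counterbalanced; this sketch shows only the topic-rotation part.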