
    Too Early or Too Late — Master Professional in Serbia

    The harmonization of modes of study and assessment within the European educational space has long been an integral part of the policy and practice of a large number of countries, including those that have started negotiations to join the European Union, among them Serbia. The most important transformation in this respect has taken place in the system of higher education, whose main levers are faculties and higher vocational schools. Faculties, which cannot exist independently but only under the auspices of universities, were given the right to organize and conduct three cycles of studies. Higher vocational schools, by contrast, which can exist independently, were given the right to organize only two cycles of studies; thus, they were not granted the right to organize and conduct doctoral studies. As for the second cycle of higher education, faculties were given the right to organize and conduct master and specialized studies, whereas the vocational schools were given the right to conduct only specialized studies. Nonetheless, ten years after the introduction of the «Bologna mechanism», the decision was made to grant higher vocational schools the right to conduct master degree studies, albeit of a specific kind, since they would need to be closely tied to practice. It turned out, however, that the conditions for such a move were not yet in place, as a consequence of which public opinion became divided and the whole process stalled. In this paper, we explain the basic controversies, as well as the pros and cons of such a decision. We also analyze the sensitive and contentious issues surrounding the introduction of the master of vocational studies and offer suggestions as to what still needs to be precisely defined and adjusted in that regard.

    Esophagogastric Ulcer in Pigs on Commercial Farms

    Intensive swine production under modern conditions of breeding and technological processes has led to the occurrence of a wide range of breeding and technopathy diseases. One of the most common breeding diseases, described in the literature as an independent disease, is esophagogastric ulcer, characterized by erosions and ulcers mostly in the esophagogastric part and rarely in the glandular (mucous) part of the stomach. Esophagogastric ulcer is a disease of multifactorial etiology caused by genetic predisposition, diet, and the presence of certain pathogenic microorganisms (for example, Helicobacter pylori). The goal of our research was to examine the frequency of esophagogastric ulcer in pigs on commercial farms. One of the commercial farms has its own slaughterhouse for producing meat from fattened pigs. In our experiment we used 103 fattening pigs with body weights between 100 and 107 kg, aged 6 to 7 months. On the slaughterhouse processing line we established thickening of the esophageal surface, hyperkeratosis, and a nonstructural yellow surface in 37 of the 103 animals, while erosion of the esophageal part of the stomach, i.e., surface damage that does not involve the muscular layer of the mucous membrane, was present in 29 of the 103 animals. Ulcers of the esophageal part of the stomach affecting the full thickness of the mucosal membrane were present in 4 of the 103 examined animals.

    Noise-aided gradient descent bit-flipping decoders approaching maximum likelihood decoding

    In the recent literature, the study of iterative LDPC decoders implemented on faulty hardware has led to the counter-intuitive conclusion that noisy decoders can perform better than their noiseless versions. This peculiar behavior has been observed in the finite-codeword-length regime, where the noise perturbing the decoder dynamics helps it escape the attraction of fixed points such as trapping sets. In this paper, we study two recently introduced LDPC decoders derived from noisy versions of the gradient descent bit-flipping (GDBF) decoder. Although the GDBF is known to be a simple decoder with limited error correction capability compared to more powerful soft-decision decoders, it has been shown that the introduction of a random perturbation into the decoder can greatly improve its performance, approaching and even surpassing belief propagation or min-sum based decoders. For both decoders, we evaluate the probability of escaping from a trapping set and relate this probability to the parameters of the injected noise distribution, using a Markovian model of the decoder transitions in the state space of errors localized on isolated trapping sets. In the second part of the paper, we present a modified scheduling of our algorithms for the binary symmetric channel, which makes it possible to approach maximum likelihood decoding (MLD) at the cost of a very large number of iterations.
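
    The noisy GDBF idea described above is compact enough to sketch: each bit keeps a bipolar estimate, computes an "inversion" energy from its agreement with the channel output and with its parity checks, and flips whenever that energy, perturbed by injected noise, falls below a threshold. The following is a minimal sketch of a generic multi-bit noisy GDBF for the binary symmetric channel; the function name and all parameter values (w, theta, sigma) are illustrative assumptions, not the exact decoders analyzed in the paper.

        # A minimal multi-bit noisy GDBF sketch in bipolar (+1/-1) form.
        # Parameters w, theta, and sigma are illustrative assumptions.
        import numpy as np

        def noisy_gdbf(H, y, w=1.0, theta=-0.5, sigma=0.6, max_iters=300, rng=None):
            """H: (m, n) parity-check matrix over GF(2); y: received bipolar word."""
            rng = rng or np.random.default_rng()
            x = y.copy()                              # start from the channel output
            for _ in range(max_iters):
                # Bipolar check values: +1 if a check is satisfied, -1 otherwise.
                s = np.where((H @ ((1 - x) // 2)) % 2 == 0, 1, -1)
                if np.all(s == 1):                    # zero syndrome: codeword found
                    return x, True
                # Inversion energy: channel agreement plus weighted check agreement,
                # plus i.i.d. Gaussian noise that lets the decoder escape the
                # attraction of trapping sets (the "noise-aided" ingredient).
                energy = x * y + w * (H.T @ s) + rng.normal(0.0, sigma, size=x.size)
                x = np.where(energy < theta, -x, x)   # flip all low-energy bits
            return x, False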

    Effect of Consecutive Cut and Vegetation Stage on CNCPS Protein Fractions in Alfalfa (Medicago sativa L.)

    Crude protein (CP) of forages can be separated into fractions with differing abilities to provide available amino acids in the lower gut of ruminants. This knowledge is critical for developing feeding systems and predicting animal responses. The objective of this research was to assess whether CP concentrations and the relative proportions of CP fractions according to CNCPS in alfalfa (Medicago sativa L.) cv. K-28 were affected by different cuts and vegetation stages. Fraction B2, which represents true protein with an intermediate ruminal degradation rate, was the largest single fraction in all cuts except the third. Soluble fraction A was less than 400 g kg⁻¹ CP in all cuts except the third, while the unavailable fraction C ranged from 56 g kg⁻¹ CP in the first cut to 134.8 g kg⁻¹ CP in the fourth cut. The remaining fraction B3 (true protein with a very low degradation rate) represented less than 60 g kg⁻¹ of total CP. The results showed that undegraded dietary protein represented a small proportion of total CP in alfalfa from the first to the fourth cut.
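
    Since the CNCPS fractions above are all expressed per kilogram of crude protein, they must account for the whole: A, B1, B2, B3, and C sum to 1000 g kg⁻¹ CP. A minimal sketch of this bookkeeping follows; only the fraction C value (56 g kg⁻¹ CP, first cut) and the bounds on A and B3 come from the text, while the A, B1, and B2 figures are illustrative assumptions chosen to satisfy those bounds.

        # CNCPS bookkeeping: the five protein fractions, in g per kg of crude
        # protein, must sum to 1000. A, B1, and B2 below are illustrative
        # assumptions; C and the B3 bound come from the abstract (first cut).
        fractions = {
            "A":  380.0,   # soluble fraction (assumed; < 400 per the text)
            "B1": 80.0,    # rapidly degraded true protein (assumed)
            "B2": 430.0,   # intermediately degraded true protein (assumed, largest)
            "B3": 54.0,    # slowly degraded true protein (< 60 per the text)
            "C":  56.0,    # unavailable protein, first cut (from the text)
        }
        assert abs(sum(fractions.values()) - 1000.0) < 1e-9
        # Undegraded dietary protein is dominated by B3 + C, hence "a small
        # proportion of total CP" as the study concludes.
        rup = fractions["B3"] + fractions["C"]
        print(f"B3 + C = {rup:.0f} g/kg CP (~{rup / 10:.0f}% of CP)")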

    Comparative Structural and Optical Properties of Different Ceria Nanoparticles

    Herein, a comparative study of five nanocrystalline cerium oxides (CeO2−δ) synthesised by different methods and calcined at 500 °C is reported. XRPD analysis showed that the stoichiometry parameter δ, the crystallite size/strain, and the lattice constant were only slightly affected by the synthesis method. All ceria nanoparticles are nearly spherical in shape with faceted morphology, free of defects, and with a relatively uniform size distribution. The average microstrain was found to be approximately 10 times higher than that of the bulk counterpart. The absorption edge of the nanocrystalline materials was shifted towards higher wavelengths (red shift) in comparison with the bulk counterpart, and band gap values were in the range 2.7–3.24 eV (3.33 eV for the bulk counterpart).
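
    The quoted band gaps map directly onto absorption edge wavelengths through E = hc/λ, with hc ≈ 1239.84 eV·nm, which makes the reported red shift easy to sanity-check. The snippet below performs that generic conversion only; the authors' band gap values were presumably extracted from the measured spectra (e.g., Tauc analysis), which this sketch does not attempt.

        # Convert band gap energy (eV) to absorption edge wavelength (nm)
        # via E = h*c / lambda, with h*c ~= 1239.84 eV*nm. This reproduces the
        # direction of the red shift reported above; it is not the authors'
        # actual band gap extraction procedure.
        HC_EV_NM = 1239.84  # Planck constant times speed of light, in eV*nm

        def edge_wavelength_nm(band_gap_ev: float) -> float:
            return HC_EV_NM / band_gap_ev

        for label, eg in [("bulk", 3.33), ("nano, upper", 3.24), ("nano, lower", 2.70)]:
            print(f"{label}: Eg = {eg:.2f} eV -> edge ~ {edge_wavelength_nm(eg):.0f} nm")
        # bulk ~372 nm; the 2.70 eV sample absorbs out to ~459 nm (red-shifted).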

    Constant-Overhead Fault-Tolerant Quantum Computation with Reconfigurable Atom Arrays

    Quantum low-density parity-check (qLDPC) codes can achieve high encoding rates and good code distance scaling, providing a promising route to low-overhead fault-tolerant quantum computing. However, the long-range connectivity required to implement such codes makes their physical realization challenging. Here, we propose a hardware-efficient scheme to perform fault-tolerant quantum computation with high-rate qLDPC codes on reconfigurable atom arrays, directly compatible with recently demonstrated experimental capabilities. Our approach utilizes the product structure inherent in many qLDPC codes to implement the non-local syndrome extraction circuit via atom rearrangement, resulting in effectively constant overhead in practically relevant regimes. We prove the fault tolerance of these protocols, perform circuit-level simulations of memory and logical operations with these codes, and find that our qLDPC-based architecture starts to outperform the surface code with as few as several hundred physical qubits at a realistic physical error rate of 10^{-3}. We further find that fewer than 3000 physical qubits are sufficient to obtain over an order of magnitude in qubit savings compared to the surface code, and that quantum algorithms involving thousands of logical qubits can be performed using fewer than 10^{5} physical qubits. Our work paves the way for explorations of low-overhead quantum computing with qLDPC codes at a practical scale, based on current experimental technologies.
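
    To see where an order-of-magnitude saving can come from, compare the physical-qubit cost per logical qubit of a distance-d rotated surface code (about 2d² qubits per logical qubit, counting one ancilla per stabilizer) with that of a high-rate [[n, k, d]] qLDPC code (about 2n qubits shared among k logical qubits). The sketch below uses the well-known [[144, 12, 12]] bivariate bicycle code purely as an illustration; it is not necessarily one of the codes simulated in this paper.

        # Back-of-the-envelope physical-qubit cost per logical qubit. The
        # [[144, 12, 12]] code is an illustrative high-rate qLDPC example,
        # not necessarily a code used in the paper; the surface-code count
        # assumes the rotated surface code with one ancilla per stabilizer.
        def surface_code_qubits(d: int) -> int:
            # d*d data qubits + (d*d - 1) measurement ancillas per logical qubit
            return 2 * d * d - 1

        def qldpc_qubits_per_logical(n: int, k: int) -> float:
            # n data qubits + n syndrome ancillas, shared among k logical qubits
            return 2 * n / k

        d = 12                 # match code distances for a fair comparison
        n, k = 144, 12         # illustrative [[144, 12, 12]] qLDPC code
        print(f"surface code, d={d}: {surface_code_qubits(d)} qubits/logical")
        print(f"qLDPC [[{n},{k},12]]: {qldpc_qubits_per_logical(n, k):.0f} qubits/logical")
        # ~287 vs ~24 qubits per logical qubit: an order-of-magnitude saving,
        # consistent in spirit with the figures quoted above.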

    Characterization of New Structure for Silicon Carbide X-Ray Detector by the Monte Carlo Method

    This work presents a characterization of the radiation absorption properties of silicon carbide (SiC) as a semiconductor for the realization of X-ray detectors. SiC detectors can potentially reach superior performance with respect to all other semiconductors presently employed in hazardous environments in nuclear and space science and technology. The physics and numerical modeling of photon transport through a SiC detector are incorporated in a non-destructive Monte Carlo method for determining the energy deposited and the dose distribution. The Monte Carlo code has been applied in numerical simulations for different detector conditions and configurations. The X-ray characterization of new SiC structures informs improvements in the design of these detector systems.
    12th Annual YUCOMAT Conference, Sep 06-10, 2010, Herceg Novi, Montenegro
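
    To give a flavor of the sampling at the heart of such a simulation, the sketch below draws exponentially distributed photon interaction depths in a SiC slab and tallies the fraction absorbed. The attenuation coefficient and thickness are illustrative assumptions (both depend strongly on photon energy), and a real detector code also tracks scattering and secondary-electron transport, which this toy model omits.

        # Toy Monte Carlo estimate of photon absorption in a SiC slab. The
        # attenuation coefficient and thickness are illustrative assumptions;
        # real detector codes also model scattering and secondary particles.
        import random

        def absorbed_fraction(mu_cm: float, thickness_cm: float, n_photons: int) -> float:
            absorbed = 0
            for _ in range(n_photons):
                # Free path length sampled from the exponential attenuation law.
                depth = random.expovariate(mu_cm)
                if depth < thickness_cm:       # interaction inside the detector
                    absorbed += 1
            return absorbed / n_photons

        random.seed(0)
        mu = 50.0    # assumed linear attenuation coefficient of SiC, 1/cm
        t = 0.03     # assumed detector thickness, cm (300 micrometers)
        print(f"absorbed fraction ~ {absorbed_fraction(mu, t, 100_000):.3f}")
        # Analytic check: 1 - exp(-mu * t) = 1 - exp(-1.5) ~ 0.777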

    Urgent need to clarify the definition of chronic critical limb ischemia - a position paper from the European Society for Vascular Medicine

    Chronic critical lower limb ischemia (CLI) has been defined as ischemia that endangers the leg. An attempt was made to give a precise definition of CLI, based on clinical and hemodynamic data (Second European Consensus). CLI may be easily defined from a clinical point of view as rest pain of the distal foot, gangrene, or ulceration. It is probably useful to add leg ulcers of other origin which do not heal because of severe ischemia, and to consider the impact of frailty on adverse outcomes. From a hemodynamic viewpoint there is no consensus, and most of the existing classifications are not based upon evidence. We should thus propose a definition and then validate it in a prospective cohort, in order to identify the patients at major risk of amputation and to define the categories of patients whose prognosis is improved by revascularisation. From today's available data, it seems clear that patients with a systolic toe pressure (STP) below 30 mmHg must be revascularised whenever possible. However, other patients with clinically suspected CLI and STP above 30 mmHg must be evaluated and treated in specialised vascular units, and revascularisation has to be discussed on a case-by-case basis, taking into account other data such as the WIfI classification for ulcers. In conclusion, many useful but at times contradictory definitions of CLI have been suggested. Only a few have taken evidence into account, and none have been validated prospectively. This paper aims to address this and to give notice that a CLI registry within Europe will be set up to prospectively validate, or not, the previous and suggested definitions of CLI.

    GLACE: The Global Land–Atmosphere Coupling Experiment. Part I: Overview

    The Global Land–Atmosphere Coupling Experiment (GLACE) is a model intercomparison study focusing on a typically neglected yet critical element of numerical weather and climate modeling: land–atmosphere coupling strength, or the degree to which anomalies in land surface state (e.g., soil moisture) can affect rainfall generation and other atmospheric processes. The 12 AGCM groups participating in GLACE performed a series of simple numerical experiments that allow the objective quantification of this element for boreal summer. The derived coupling strengths vary widely. Some similarity, however, is found in the spatial patterns generated by the models, with enough similarity to pinpoint multimodel “hot spots” of land–atmosphere coupling. For boreal summer, such hot spots for precipitation and temperature are found over large regions of Africa, central North America, and India; a hot spot for temperature is also found over eastern China. The design of the GLACE simulations is described in full detail so that any interested modeling group can repeat them easily and thereby place their model’s coupling strength within the broad range of those documented here.
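
    One way such coupling strength is quantified in GLACE-style analyses is through a similarity diagnostic, often denoted Ω, that compares the variance of the ensemble-mean time series with the variance across all ensemble members: Ω approaches 1 when members evolve in lockstep (strong control by the prescribed forcing) and 0 when they diverge freely. The sketch below illustrates this diagnostic on synthetic data; the ensemble size and data are illustrative assumptions, and the paper should be consulted for the exact experimental design and definitions.

        # A minimal sketch of an Omega-style similarity diagnostic: close to 1
        # when ensemble members evolve identically, close to 0 when they
        # diverge freely. Synthetic data and ensemble size are illustrative.
        import numpy as np

        def omega(x: np.ndarray) -> float:
            """x: (n_members, n_times) ensemble of, e.g., precipitation totals."""
            n = x.shape[0]
            var_mean = np.var(x.mean(axis=0))   # variance of the ensemble mean
            var_all = np.var(x)                 # variance over all members and times
            return (n * var_mean - var_all) / ((n - 1) * var_all)

        rng = np.random.default_rng(0)
        signal = rng.normal(size=50)                 # shared (land-forced) component
        noise = rng.normal(size=(16, 50))            # member-specific atmospheric chaos
        coupled = 2.0 * signal + 0.5 * noise         # members track the shared signal
        uncoupled = noise                            # members diverge freely
        print(f"Omega (coupled):   {omega(coupled):.2f}")    # close to 1
        print(f"Omega (uncoupled): {omega(uncoupled):.2f}")  # close to 0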