
    Liability Insurance of Businessman with regard to Compensation

    The aim of the thesis is to evaluate the importance of liability insurance for businesses in the context of changes to the Civil Code and changes to the determination of the amount and form of compensation for damages. This aim was achieved by explaining the changes that the new Civil Code introduces to liability insurance and to the area of compensation for harm. The thesis also includes a model example used to compare the insurance terms and proposed insurance contracts of selected insurers. Through the analysis of this model example, the significance of liability insurance for business entities was evaluated. The main contribution of the thesis is a presentation of the novelties in the new Civil Code that affect businesses' liability insurance. The results of the thesis demonstrate the economic advantage of liability insurance for a medium-sized enterprise.

    File-based data flow in the CMS Filter Farm

    During the LHC Long Shutdown 1, the CMS Data Acquisition system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and prepare the ground for future upgrades of the detector front-ends. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms, and deal with output data transport and storage has also been redesigned to be completely file-based. This approach provides additional decoupling between the HLT algorithms and the input and output data flow. All the metadata needed for bookkeeping of the data flow and the HLT process lifetimes are also generated in the form of small "documents" using the JSON encoding, by either services in the flow of the HLT execution (for rates etc.) or watchdog processes. These "files" can remain memory-resident or be written to disk if they are to be used in another part of the system (e.g. for aggregation of output data). We discuss how this redesign improves the robustness and flexibility of the CMS DAQ and the performance of the system currently being commissioned for the LHC Run 2.
    Funding: National Science Foundation (U.S.); United States. Department of Energy
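    The abstract above describes small JSON "documents" that carry bookkeeping metadata and are later aggregated elsewhere in the system. A minimal sketch of that pattern follows; the file names, JSON keys, and the aggregation step are illustrative assumptions, not the actual CMS schema.

    ```python
    import glob
    import json
    import os
    import tempfile

    def write_doc(directory, name, payload):
        """Write a small JSON bookkeeping 'document' (hypothetical schema)."""
        path = os.path.join(directory, name)
        with open(path, "w") as f:
            json.dump(payload, f)
        return path

    def aggregate(directory):
        """Sum event counts across all per-process documents ('events' is an assumed key)."""
        total = 0
        for path in glob.glob(os.path.join(directory, "*.json")):
            with open(path) as f:
                total += json.load(f).get("events", 0)
        return total

    # Two HLT-process-style documents, then a watchdog-style aggregation pass.
    d = tempfile.mkdtemp()
    write_doc(d, "hlt_proc1.json", {"events": 120, "rate_hz": 40})
    write_doc(d, "hlt_proc2.json", {"events": 80, "rate_hz": 27})
    print(aggregate(d))  # 200
    ```

    The appeal of the design is that producers and consumers share nothing but the filesystem (or a RAM-backed mount), which is what decouples the HLT processes from the output data flow.
    
    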

    Online data handling and storage at the CMS experiment

    During the LHC Long Shutdown 1, the CMS Data Acquisition (DAQ) system underwent a partial redesign to replace obsolete network equipment, use more homogeneous switching technologies, and support new detector back-end electronics. The software and hardware infrastructure to provide input, execute the High Level Trigger (HLT) algorithms, and deal with output data transport and storage has also been redesigned to be completely file-based. All the metadata needed for bookkeeping are stored in files as well, in the form of small documents using the JSON encoding. The Storage and Transfer System (STS) is responsible for aggregating these files produced by the HLT, storing them temporarily, and transferring them to the T0 facility at CERN for subsequent offline processing. The STS merger service aggregates the output files from the HLT, produced by ~62 sources at an aggregate rate of ~2 GB/s. An estimated bandwidth of 7 GB/s in concurrent read/write mode is needed. Furthermore, the STS has to be able to store several days of continuous running, so an estimated 250 TB of total usable disk space is required. In this article we present the technological and implementation choices for the three components of the STS: the distributed file system, the merger service, and the transfer system.
    Funding: United States. Department of Energy; National Science Foundation (U.S.)
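    The quoted STS figures can be sanity-checked with back-of-envelope arithmetic. The I/O-pass accounting below (raw write, read-back for merging, merged write, read for transfer, with some overlap) is an illustrative assumption, not the official breakdown; note also that 250 TB lasts only ~1.4 days at the full 2 GB/s, so the "several days" of buffering presumably assumes a lower sustained average rate.

    ```python
    # Back-of-envelope check of the STS numbers quoted above.
    GB = 1e9
    TB = 1e12

    hlt_output = 2 * GB           # aggregate HLT output rate
    per_source = hlt_output / 62  # ~32 MB/s from each of the ~62 sources

    # One plausible accounting for ~7 GB/s of concurrent I/O (assumption):
    # write raw files + read them back for merging + write merged files
    # + read merged files for transfer, minus some overlap -> ~3.5 passes.
    io_passes = 3.5
    concurrent_io = hlt_output * io_passes  # 7.0 GB/s

    disk = 250 * TB
    buffer_days = disk / hlt_output / 86400  # days of buffering at full rate
    print(round(per_source / 1e6), concurrent_io / GB, round(buffer_days, 2))  # 32 7.0 1.45
    ```
    
    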

    10 Gbps TCP/IP streams from the FPGA for High Energy Physics

    The DAQ system of the CMS experiment at CERN collects data from more than 600 custom detector Front-End Drivers (FEDs). During 2013 and 2014 the CMS DAQ system will undergo a major upgrade to address the obsolescence of current hardware and the requirements posed by the upgrade of the LHC accelerator and various detector components. For loss-less data collection from the FEDs, a new FPGA-based card implementing the TCP/IP protocol suite over 10 Gbps Ethernet has been developed. To limit the complexity of the TCP hardware implementation, the DAQ group developed a simplified, unidirectional, but RFC 793 compliant version of the TCP protocol. This allows a PC with the standard Linux TCP/IP stack to be used as a receiver. We present the challenges and the protocol modifications made to TCP in order to simplify its FPGA implementation. We also describe the interaction between the simplified TCP and the Linux TCP/IP stack, including performance measurements.
    Funding: United States. Dept. of Energy; National Science Foundation (U.S.); Marie Curie International Fellowship
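    The key point of keeping the simplified protocol RFC 793 compliant on the wire is that the receiving side needs nothing special: it is an ordinary socket program on the standard Linux stack. A minimal sketch follows, with a plain TCP client standing in for the FPGA sender; the port, framing, and payload are illustrative assumptions.

    ```python
    # Receiving side of a one-way TCP stream using only the standard stack.
    import socket
    import threading

    srv = socket.create_server(("127.0.0.1", 0))  # ephemeral port, loopback demo
    port = srv.getsockname()[1]
    out = []

    def receive():
        """Accept one connection and drain the unidirectional byte stream."""
        conn, _ = srv.accept()
        data = b""
        while (buf := conn.recv(65536)):
            data += buf
        conn.close()
        out.append(data)

    t = threading.Thread(target=receive)
    t.start()

    # Stand-in for the FPGA sender: data flows strictly sender -> receiver.
    cli = socket.create_connection(("127.0.0.1", port))
    cli.sendall(b"event-fragment")
    cli.close()
    t.join()
    srv.close()
    print(out[0])  # b'event-fragment'
    ```
    
    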

    The new CMS DAQ system for LHC operation after 2014 (DAQ2)

    The Data Acquisition system of the Compact Muon Solenoid experiment at CERN assembles events at a rate of 100 kHz, transporting event data at an aggregate throughput of 100 GByte/s. We present the design of the 2nd generation DAQ system, including studies of the event builder based on advanced networking technologies such as 10 and 40 Gbit/s Ethernet and 56 Gbit/s FDR InfiniBand, and the exploitation of multicore CPU architectures. By the time the LHC restarts after the 2013/14 shutdown, the current compute nodes, networking, and storage infrastructure will have reached the end of their lifetime. In order to handle higher LHC luminosities and event pileup, a number of sub-detectors will be upgraded, increasing the number of readout channels and replacing the off-detector readout electronics with a μTCA implementation. The second generation DAQ system, foreseen for 2014, will need to accommodate the readout of both existing and new off-detector electronics and provide an increased throughput capacity. Advances in storage technology could make it feasible to write the output of the event builder to (RAM or SSD) disks and implement the HLT processing entirely file-based.
    Funding: United States. Dept. of Energy; National Science Foundation (U.S.); Marie Curie International Fellowship
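    The two headline figures above fix the third: a 100 kHz event rate carried at 100 GByte/s implies an average assembled event size of about 1 MB.

    ```python
    # Consistency check of the DAQ figures quoted above.
    rate_hz = 100e3      # event-building rate
    throughput = 100e9   # aggregate throughput, bytes/s
    event_size = throughput / rate_hz
    print(event_size / 1e6)  # 1.0 (MB per event, on average)
    ```
    
    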

    Ekologické zemědělství, učebnice pro školy i praxi. I. díl. Základy EZ, agroenvironmentální aspekty a pěstování rostlin (Organic Farming, a Textbook for Schools and Practice. Part I. Fundamentals of Organic Farming, Agri-environmental Aspects, and Crop Production)

    The first textbook on organic farming in the Czech Republic. It covers the fundamentals of the philosophy and practice of organic farming, with a focus on crop production, and a separate chapter provides information on biodynamic agriculture.

    Search for physics beyond the standard model in events with two leptons, jets, and missing transverse momentum in pp collisions at √s = 8 TeV

    A search is presented for physics beyond the standard model in final states with two opposite-sign same-flavor leptons, jets, and missing transverse momentum. The data sample corresponds to an integrated luminosity of 19.4 fb⁻¹ of proton-proton collisions at √s = 8 TeV collected with the CMS detector at the CERN LHC in 2012. The analysis focuses on searches for a kinematic edge in the invariant mass distribution of the opposite-sign same-flavor lepton pair and for final states with an on-shell Z boson. The observations are consistent with expectations from standard model processes and are interpreted in terms of upper limits on the production of supersymmetric particles.
    Funding: United States. Dept. of Energy; National Science Foundation (U.S.); Alfred P. Sloan Foundation

    Measurement of the W boson helicity in events with a single reconstructed top quark in pp collisions at √s = 8 TeV

    A measurement of the W boson helicity is presented, where the W boson originates from the decay of a top quark produced in pp collisions. The event selection, optimized for reconstructing a single top quark in the final state, requires exactly one isolated lepton (muon or electron) and exactly two jets, one of which is likely to originate from the hadronization of a bottom quark. The analysis is performed using data recorded at a center-of-mass energy of 8 TeV with the CMS detector at the CERN LHC in 2012. The data sample corresponds to an integrated luminosity of 19.7 fb⁻¹. The measured helicity fractions are F_L = 0.298 ± 0.028 (stat) ± 0.032 (syst), F_0 = 0.720 ± 0.039 (stat) ± 0.037 (syst), and F_R = −0.018 ± 0.019 (stat) ± 0.011 (syst). These results are used to set limits on the real part of the tWb anomalous couplings, g_L and g_R.
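    The three helicity fractions are constrained to sum to one, which the quoted central values satisfy exactly. The quick check below combines the statistical uncertainties in naive quadrature purely for illustration; the actual fit accounts for the correlations between the fractions.

    ```python
    # Unitarity check of the measured W helicity fractions quoted above.
    import math

    FL, F0, FR = 0.298, 0.720, -0.018
    stat = [0.028, 0.039, 0.019]

    print(round(FL + F0 + FR, 3))                       # 1.0
    print(round(math.sqrt(sum(s * s for s in stat)), 3))  # 0.052 (naive quadrature)
    ```
    
    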

    Search for a standard model Higgs boson produced in association with a top-quark pair and decaying to bottom quarks using a matrix element method

    A search for a standard model Higgs boson produced in association with a top-quark pair and decaying to bottom quarks is presented. Events with hadronic jets and one or two oppositely charged leptons are selected from a data sample corresponding to an integrated luminosity of 19.5 fb⁻¹ collected by the CMS experiment at the LHC in pp collisions at a centre-of-mass energy of 8 TeV. In order to separate the signal from the larger tt̄ + jets background, this analysis uses a matrix element method that assigns a probability density value to each reconstructed event under signal or background hypotheses. The ratio between the two values is used in a maximum likelihood fit to extract the signal yield. The results are presented in terms of the measured signal strength modifier, μ, relative to the standard model prediction for a Higgs boson mass of 125 GeV. The observed (expected) exclusion limit at a 95% confidence level is μ < 4.2 (3.3), corresponding to a best fit value μ̂ = 1.2 +1.6/−1.5.
    Funding: United States. Dept. of Energy; National Science Foundation (U.S.); Alfred P. Sloan Foundation
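    The per-event discriminant described above can be sketched generically: each event receives probability densities under the signal and background hypotheses, and their (normalized) ratio feeds the fit. The toy Gaussian densities below are purely illustrative stand-ins; in the actual matrix element method these densities come from integrating leading-order matrix elements over the unmeasured degrees of freedom.

    ```python
    # Sketch of a matrix-element-style likelihood-ratio discriminant
    # with toy Gaussian probability densities (illustrative only).
    import math

    def gauss(x, mu, sigma):
        """Normal probability density."""
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

    def me_discriminant(x,
                        p_sig=lambda x: gauss(x, 1.0, 0.5),   # assumed signal density
                        p_bkg=lambda x: gauss(x, 0.0, 1.0)):  # assumed background density
        """Normalized likelihood ratio in [0, 1]; values near 1 are signal-like."""
        s, b = p_sig(x), p_bkg(x)
        return s / (s + b)

    print(round(me_discriminant(1.0), 3))   # signal-like region -> near 1
    print(round(me_discriminant(-1.0), 3))  # background-like region -> near 0
    ```
    
    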

    Search for the production of dark matter in association with top-quark pairs in the single-lepton final state in proton-proton collisions at √s = 8 TeV

    A search is presented for particle dark matter produced in association with a pair of top quarks in pp collisions at a centre-of-mass energy of √s = 8 TeV. The data were collected with the CMS detector at the LHC and correspond to an integrated luminosity of 19.7 fb⁻¹. This search requires the presence of one lepton, multiple jets, and large missing transverse energy. No excess of events is found above the SM expectation, and upper limits are derived on the production cross section. Interpreting the findings in the context of a scalar contact interaction between fermionic dark matter particles and top quarks, lower limits on the interaction scale are set. These limits are also interpreted in terms of the dark matter-nucleon scattering cross sections for the spin-independent scalar operator, and they complement direct searches for dark matter particles in the low mass region.
    Funding: United States. Dept. of Energy; National Science Foundation (U.S.); Alfred P. Sloan Foundation