
    Learning, Social Intelligence and the Turing Test - why an "out-of-the-box" Turing Machine will not pass the Turing Test

    The Turing Test (TT) checks for human intelligence, rather than any putative general intelligence. It involves repeated interaction requiring learning in the form of adaption to the human conversation partner. It is a macro-level post-hoc test, in contrast to the definition of a Turing Machine (TM), which is a prior micro-level definition. This raises the question of whether learning is just another computational process, i.e. one that can be implemented as a TM. Here we argue that learning or adaption is fundamentally different from computation, though it does involve processes that can be seen as computations. To illustrate this difference we compare (a) designing a TM and (b) learning a TM, defining them for the purpose of the argument. We show that there is a well-defined sequence of problems which are not effectively designable but are learnable, in the form of the bounded halting problem. Some characteristics of human intelligence are reviewed, including its interactive nature, learning abilities, imitative tendencies, linguistic ability and context-dependency. A story that explains some of these is the Social Intelligence Hypothesis. If this is broadly correct, it points to the necessity of a considerable period of acculturation (social learning in context) if an artificial intelligence is to pass the TT. Whilst it is always possible to 'compile' the results of learning into a TM, this would not be a designed TM and would not be able to continually adapt (pass future TTs). We conclude three things, namely that: a purely "designed" TM will never pass the TT; that there is no such thing as a general intelligence, since it necessarily involves learning; and that learning/adaption and computation should be clearly distinguished. Comment: 10 pages, invited talk at Turing Centenary Conference CiE 2012, special session on "The Turing Test and Thinking Machines"
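    As a minimal sketch (not from the paper) of the bounded halting problem mentioned above: given an encoded machine, an input tape and a step bound n, halting within n steps is decidable by direct simulation. The dict-based machine encoding and the helper name bounded_halts below are hypothetical conventions chosen for illustration only.

    def bounded_halts(transitions, start_state, halt_states, tape, n):
        """Return True iff the machine reaches a halting state within n steps."""
        state, head = start_state, 0
        cells = dict(enumerate(tape))          # sparse tape, blank symbol '_'
        for _ in range(n):
            if state in halt_states:
                return True
            symbol = cells.get(head, '_')
            if (state, symbol) not in transitions:
                return True                    # no applicable rule: machine halts
            new_symbol, move, new_state = transitions[(state, symbol)]
            cells[head] = new_symbol
            head += 1 if move == 'R' else -1
            state = new_state
        return state in halt_states            # only counts as halting if a halt state was reached

    # Example: a one-rule machine that loops forever never halts within any bound.
    loop = {('q0', '_'): ('_', 'R', 'q0')}
    print(bounded_halts(loop, 'q0', {'qH'}, '', 100))   # False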

    Wavefunction statistics in open chaotic billiards

    We study the statistical properties of wavefunctions in a chaotic billiard that is opened up to the outside world. Upon increasing the openings, the billiard wavefunctions cross over from real to complex. Each wavefunction is characterized by a phase rigidity, which is itself a fluctuating quantity. We calculate the probability distribution of the phase rigidity and discuss how phase rigidity fluctuations cause long-range correlations of intensity and current density. We also find that phase rigidities for wavefunctions with different incoming wave boundary conditions are statistically correlated. Comment: 4 pages, RevTeX; 1 figure
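    As an illustration (not taken from the paper), the phase rigidity of a sampled wavefunction can be estimated numerically. The definition used below, rho = |sum(psi**2)|**2 / (sum(|psi|**2))**2, is one common convention and is assumed here: it gives rho = 1 for a purely real wavefunction and values near 0 for a fully complex one.

    import numpy as np

    def phase_rigidity(psi):
        """Phase rigidity of a wavefunction sampled on a discrete grid (assumed definition)."""
        psi = np.asarray(psi, dtype=complex)
        return abs(np.sum(psi**2))**2 / np.sum(np.abs(psi)**2)**2

    rng = np.random.default_rng(0)
    real_mode = rng.standard_normal(10_000)                                  # closed billiard: real mode
    open_mode = rng.standard_normal(10_000) + 1j * rng.standard_normal(10_000)  # strongly open: complex mode
    print(phase_rigidity(real_mode))   # exactly 1
    print(phase_rigidity(open_mode))   # fluctuates near 0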

    Propagation of short light pulses in microring resonators: ballistic transport versus interference in the frequency domain

    The propagation of short light pulses in waveguiding structures with optical feedback, in our case optical microresonators, has been studied theoretically and experimentally. It appears that, depending on the measurement set-up, either ballistic transport or interference of fs and ps laser pulses in the time domain can be observed. The experiments are analyzed in terms of the characteristic time scales of the source, the waveguide device and the detector arrangement, and are related to Heisenberg's uncertainty principle. Based on this analysis, a criterion is given for the maximum bit rate for error-free data transmission through optical microresonators.
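    A back-of-the-envelope sketch (not the paper's analysis) of the kind of time-scale comparison described: a pulse much shorter than the ring's round-trip time can be followed ballistically, while a longer pulse overlaps with its own delayed replicas and interferes. The ring radius, group index and finesse below are illustrative placeholders, not values from the paper.

    import math

    c = 3.0e8            # vacuum speed of light, m/s
    radius = 25e-6       # hypothetical ring radius, m
    n_group = 2.0        # hypothetical group index
    finesse = 100.0      # hypothetical finesse

    round_trip = 2 * math.pi * radius * n_group / c         # ring round-trip time, s
    photon_lifetime = finesse * round_trip / (2 * math.pi)  # cavity decay time, s

    for pulse in (100e-15, 10e-12):                         # 100 fs and 10 ps pulses
        regime = ("ballistic: individual round trips are resolved"
                  if pulse < round_trip
                  else "interference: the pulse overlaps its own replicas")
        print(f"pulse {pulse:.0e} s vs round trip {round_trip:.1e} s -> {regime}")

    # Crude criterion only: successive bits separated by less than the cavity
    # decay time would interfere inside the ring.
    print(f"rough maximum bit rate ~ {1.0 / photon_lifetime:.2e} bit/s")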

    Disease and pharmacologic risk factors for first and subsequent episodes of equine laminitis: a cohort study of free-text electronic medical records

    Electronic medical records from first-opinion equine veterinary practice may represent a unique resource for epidemiologic research. The appropriateness of this resource for risk factor analyses was explored as part of an investigation into clinical and pharmacologic risk factors for laminitis. Amalgamated medical records from seven UK practices were subjected to text mining to identify laminitis episodes, systemic or intra-synovial corticosteroid prescription, diseases known to affect laminitis risk, and clinical signs or syndromes likely to lead to corticosteroid use. Cox proportional hazards models and Prentice-Williams-Peterson models for repeated events were used to estimate associations with time to first or subsequent laminitis episodes, respectively. Over seventy percent of horses that were diagnosed with laminitis suffered at least one recurrence. Risk factors for first and subsequent laminitis episodes were found to vary. Corticosteroid use (prednisolone only) was significantly associated only with subsequent, and not initial, laminitis episodes. Electronic medical record use for such analyses is plausible and offers important advantages over more traditional data sources. It does, however, pose challenges and limitations that must be taken into account, and requires a conceptual change to disease diagnosis which should be considered carefully.
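    A minimal sketch (not the authors' code) of the two model families named above, using the Python lifelines package. The file name and column names (time_at_risk, gap_time, laminitis, prednisolone, episode_number) are hypothetical placeholders for variables derived from the text-mined records, and the Prentice-Williams-Peterson model is approximated here as a Cox model stratified on episode number so each recurrence gets its own baseline hazard.

    import pandas as pd
    from lifelines import CoxPHFitter

    records = pd.read_csv("laminitis_episodes.csv")   # hypothetical episode-level data

    # Time to *first* laminitis episode: ordinary Cox proportional hazards model.
    first = records[records.episode_number == 1]
    cph = CoxPHFitter()
    cph.fit(first[["time_at_risk", "laminitis", "prednisolone"]],
            duration_col="time_at_risk", event_col="laminitis")
    cph.print_summary()

    # Subsequent episodes: PWP-style model, stratified on episode number.
    pwp = CoxPHFitter()
    pwp.fit(records[["gap_time", "laminitis", "prednisolone", "episode_number"]],
            duration_col="gap_time", event_col="laminitis",
            strata=["episode_number"])
    pwp.print_summary()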

    Search for direct production of charginos and neutralinos in events with three leptons and missing transverse momentum in √s = 7 TeV pp collisions with the ATLAS detector

    A search for the direct production of charginos and neutralinos in final states with three electrons or muons and missing transverse momentum is presented. The analysis is based on 4.7 fb^-1 of proton–proton collision data delivered by the Large Hadron Collider and recorded with the ATLAS detector. Observations are consistent with Standard Model expectations in three signal regions that are either depleted or enriched in Z-boson decays. Upper limits at 95% confidence level are set in R-parity-conserving phenomenological minimal supersymmetric models and in simplified models, significantly extending previous results.

    Jet size dependence of single jet suppression in lead-lead collisions at sqrt(s(NN)) = 2.76 TeV with the ATLAS detector at the LHC

    Measurements of inclusive jet suppression in heavy ion collisions at the LHC provide direct sensitivity to the physics of jet quenching. In a sample of lead-lead collisions at sqrt(s_NN) = 2.76 TeV corresponding to an integrated luminosity of approximately 7 inverse microbarns, ATLAS has measured jets with a calorimeter over the pseudorapidity interval |eta| < 2.1 and over the transverse momentum range 38 < pT < 210 GeV. Jets were reconstructed using the anti-kt algorithm with values of the distance parameter, which determines the nominal jet radius, of R = 0.2, 0.3, 0.4 and 0.5. The centrality dependence of the jet yield is characterized by the jet "central-to-peripheral ratio", Rcp. Jet production is found to be suppressed by approximately a factor of two in the 10% most central collisions relative to peripheral collisions. Rcp varies smoothly with centrality as characterized by the number of participating nucleons. The observed suppression is only weakly dependent on jet radius and transverse momentum. These results provide the first direct measurement of inclusive jet suppression in heavy ion collisions and complement previous measurements of dijet transverse energy imbalance at the LHC. Comment: 15 pages plus author list (30 pages total), 8 figures, 2 tables, submitted to Physics Letters B. All figures including auxiliary figures are available at http://atlas.web.cern.ch/Atlas/GROUPS/PHYSICS/PAPERS/HION-2011-02
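    A hedged sketch (not ATLAS code) of the quantity Rcp quoted above: the per-event jet yield in a central bin divided by the yield in the peripheral reference bin, each normalised by the mean number of binary nucleon-nucleon collisions <Ncoll> for that centrality class. All numbers below are placeholders, not measured values.

    def r_cp(yield_central, n_events_central, ncoll_central,
             yield_peripheral, n_events_peripheral, ncoll_peripheral):
        """Central-to-peripheral ratio of per-collision jet yields."""
        per_coll_central = yield_central / (n_events_central * ncoll_central)
        per_coll_peripheral = yield_peripheral / (n_events_peripheral * ncoll_peripheral)
        return per_coll_central / per_coll_peripheral

    # Illustrative only: a factor-of-two suppression appears as Rcp ~ 0.5.
    print(r_cp(yield_central=1.0e5, n_events_central=1e6, ncoll_central=1500.0,
               yield_peripheral=4.0e3, n_events_peripheral=1e6, ncoll_peripheral=30.0))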

    Measurement of the polarisation of W bosons produced with large transverse momentum in pp collisions at sqrt(s) = 7 TeV with the ATLAS experiment

    This paper describes an analysis of the angular distribution of W->enu and W->munu decays, using data from pp collisions at sqrt(s) = 7 TeV recorded with the ATLAS detector at the LHC in 2010, corresponding to an integrated luminosity of about 35 pb^-1. Using the decay lepton transverse momentum and the missing transverse energy, the W decay angular distribution projected onto the transverse plane is obtained and analysed in terms of the helicity fractions f0, fL and fR over two ranges of W transverse momentum (ptw): 35 < ptw < 50 GeV and ptw > 50 GeV. Good agreement is found with theoretical predictions. For ptw > 50 GeV, the values of f0 and fL-fR, averaged over charge and lepton flavour, are measured to be f0 = 0.127 +/- 0.030 +/- 0.108 and fL-fR = 0.252 +/- 0.017 +/- 0.030, where the first uncertainties are statistical and the second include all systematic effects. Comment: 19 pages plus author list (34 pages total), 9 figures, 11 tables, revised author list, matches the European Physical Journal C version
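    A small numerical sketch (not from the paper) that unpacks the quoted central values using the standard normalisation f0 + fL + fR = 1, and then evaluates a 1D helicity decomposition of the lepton angular distribution. The sign convention in dn_dcos is one common choice and is an assumption here; the measurement itself uses a transverse-plane projection rather than this 1D form.

    import numpy as np

    f0, fL_minus_fR = 0.127, 0.252          # measured central values for ptw > 50 GeV
    fL = (1.0 - f0 + fL_minus_fR) / 2.0     # ~0.563, using f0 + fL + fR = 1
    fR = (1.0 - f0 - fL_minus_fR) / 2.0     # ~0.311
    print(f"fL = {fL:.3f}, fR = {fR:.3f}, check sum = {f0 + fL + fR:.3f}")

    def dn_dcos(cos_theta, f0, fL, fR):
        """Lepton angular distribution for one assumed helicity-frame convention."""
        return (3/8) * fL * (1 - cos_theta)**2 \
             + (3/8) * fR * (1 + cos_theta)**2 \
             + (3/4) * f0 * (1 - cos_theta**2)

    cos_theta = np.linspace(-1, 1, 5)
    print(dn_dcos(cos_theta, f0, fL, fR))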