
    Manipulating the alpha level cannot cure significance testing

    We argue that making accept/reject decisions on scientific hypotheses, including a recent call for changing the canonical alpha level from p = 0.05 to p = 0.005, is deleterious to the discovery of new findings and to the progress of science. Given that blanket and variable alpha levels are both problematic, it is sensible to dispense with significance testing altogether. There are alternatives that address study design and sample size much more directly than significance testing does, but none of these statistical tools should be taken as a new magic method giving clear-cut mechanical answers. Inference should not be based on single studies at all, but on cumulative evidence from multiple independent studies. When evaluating the strength of the evidence, we should consider, for example, auxiliary assumptions, the strength of the experimental design, and implications for applications. To boil all this down to a binary decision based on a p-value threshold of 0.05, 0.01, 0.005, or anything else is not acceptable.
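
    The abstract's contrast between single-study thresholding and cumulative evidence can be made concrete with a small simulation. The sketch below is not from the paper; the effect size, sample sizes, and the choice of Stouffer's method for pooling are illustrative assumptions. It shows ten simulated studies of the same small true effect flip-flopping between "reject" and "fail to reject" at alpha = 0.05 versus alpha = 0.005, while the pooled evidence across all ten studies is stable.

    ```python
    # Illustrative sketch only: effect size, sample sizes, and the use of
    # Stouffer's combined z are assumptions, not taken from the paper.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    true_effect = 0.3          # small but real effect (Cohen's d), assumed
    n_per_group = 50           # per-group sample size of each study, assumed

    def one_study():
        control = rng.normal(0.0, 1.0, n_per_group)
        treated = rng.normal(true_effect, 1.0, n_per_group)
        # one-sided test in the direction of the (assumed) true effect
        return stats.ttest_ind(treated, control, alternative="greater").pvalue

    p_values = np.array([one_study() for _ in range(10)])
    print("reject at alpha=0.05: ", (p_values < 0.05).astype(int))
    print("reject at alpha=0.005:", (p_values < 0.005).astype(int))

    # Cumulative evidence: Stouffer's method pools the ten studies into one z.
    z_scores = stats.norm.isf(p_values)
    z_combined = z_scores.sum() / np.sqrt(len(z_scores))
    print(f"combined z = {z_combined:.2f}, combined p = {stats.norm.sf(z_combined):.2e}")
    ```

    With these settings, individual studies are underpowered at either threshold, so the binary verdicts disagree from study to study even though the effect is real; the combined z makes the cumulative picture obvious without any single-study cutoff doing the deciding.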

    Towards Comprehensive Foundations of Computational Intelligence

    Although computational intelligence (CI) covers a vast variety of methods, it still lacks an integrative theory. Several proposals for CI foundations are discussed: computing and cognition as compression, meta-learning as search in the space of data models, (dis)similarity-based methods providing a framework for such meta-learning, and a more general approach based on chains of transformations. Many useful transformations that extract information from features are discussed. Heterogeneous adaptive systems are presented as a particular example of transformation-based systems, and the goal of learning is redefined to facilitate the creation of simpler data models. The need to understand data structures leads to techniques for logical and prototype-based rule extraction and to the generation of multiple alternative models, while the need to increase the predictive power of adaptive models leads to committees of competent models. Learning from partial observations is a natural extension towards reasoning based on perceptions, and an approach to the intuitive solving of such problems is presented. Throughout the paper, neurocognitive inspirations are frequently used and are especially important in modeling higher cognitive functions. Promising directions such as liquid and laminar computing are identified, and many open problems are presented.
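
    Two ideas in this abstract, data models built as chains of transformations and meta-learning as search over alternative models, can be sketched in a few lines. The toy below is my construction, not the paper's: the candidate chains, the scikit-learn Pipeline machinery, and the RBF random-feature transform standing in for a similarity-based mapping are all illustrative assumptions.

    ```python
    # Toy sketch, not from the paper: each candidate is a chain of feature
    # transformations ending in a simple model; "meta-learning" here is a
    # crude search that scores every chain and keeps the best data model.
    from sklearn.datasets import load_iris
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.kernel_approximation import RBFSampler
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    X, y = load_iris(return_X_y=True)

    candidate_chains = {
        "scale -> linear":        [("scale", StandardScaler())],
        "scale -> pca -> linear": [("scale", StandardScaler()),
                                   ("pca", PCA(n_components=2))],
        "scale -> rbf -> linear": [("scale", StandardScaler()),
                                   ("rbf", RBFSampler(gamma=1.0, random_state=0))],
    }

    # Search in the space of data models: cross-validate every chain.
    for name, steps in candidate_chains.items():
        model = Pipeline(steps + [("clf", LogisticRegression(max_iter=1000))])
        score = cross_val_score(model, X, y, cv=5).mean()
        print(f"{name:25s} accuracy = {score:.3f}")
    ```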

    Effects of Waveform Model on Sensitivity Values of Transducers Used in Mechanical Dynamic Measurements

    Dynamic calibration of pressure transducers and accelerometers is carried out by applying dynamic mechanical inputs to them. Determining these transducers' sensitivities, defined as the ratio of the electrical output of the transducer to the mechanical input, is an important task for calibration laboratories. Data obtained during calibration are processed to obtain the peak values of the input and output signals, which are sampled by data acquisition boards. Different approximations are made, such as fitting the data within 90% of the maximum value to a parabola or half-sine waveform. It is clear that the waveform model used, as well as the resolution and sampling rate of the data acquisition boards, affects the accuracy of the transducer's sensitivity. For this investigation, the electrical output signal of the transducer corresponding to the mechanical input is recorded and simulated with different resolutions and sampling rates. These data are processed for half-sine, parabola, and Gaussian waveforms. The effect of the waveform model of the input quantities on the dynamic sensitivity is discussed in this paper.
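
    The peak-extraction step described above lends itself to a short numerical sketch. The following is my illustration, not the paper's code: the pulse shape, sampling rate, ADC resolution, and nominal sensitivity are invented. It quantizes a simulated half-sine output, fits a parabola to the samples within 90% of the recorded maximum, and forms the sensitivity as the ratio of the fitted output peak to the known input peak.

    ```python
    # Illustrative sketch with assumed parameters (fs, p_peak, sens_true,
    # 12-bit ADC over 0-12 V); not the paper's actual setup.
    import numpy as np

    fs = 100e3                      # sampling rate in Hz, assumed
    p_peak = 500.0                  # true peak pressure in kPa, assumed
    sens_true = 0.02                # true sensitivity in V/kPa, assumed
    t = np.arange(0, 2e-3, 1 / fs)

    # Half-sine mechanical input and the transducer's electrical output,
    # quantized to a 12-bit ADC spanning 0-12 V.
    pressure = p_peak * np.sin(np.pi * t / t[-1])
    v_out = sens_true * pressure
    lsb = 12.0 / 2**12
    v_adc = np.round(v_out / lsb) * lsb

    # Parabola fit over the samples at or above 90% of the recorded maximum.
    top = v_adc >= 0.9 * v_adc.max()
    a, b, c = np.polyfit(t[top], v_adc[top], 2)
    t_peak = -b / (2 * a)                    # vertex of the fitted parabola
    v_peak_fit = np.polyval([a, b, c], t_peak)

    sensitivity = v_peak_fit / p_peak        # V/kPa, using the known input peak
    print(f"fitted peak: {v_peak_fit:.4f} V, sensitivity: {sensitivity:.5f} V/kPa")
    ```

    Swapping the parabola for a half-sine or Gaussian fit over the same 90% window, or changing lsb and fs, reproduces the kind of waveform-model and resolution effects the abstract investigates.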

    Success of goniotomy and trabeculotomy as an initial procedure in the surgical treatment of congenital glaucoma

    Purpose: To evaluate the success and safety of goniotomy and trabeculotomy procedures performed on patients with congenital glaucoma. Methods: The records of twelve eyes of six consecutive patients with primary congenital glaucoma undergoing surgery were reviewed. Goniotomy was performed on patients with a clear cornea (n=4), while trabeculotomy was performed on cases with a hazy cornea (n=8). Data collected during patient follow-up were analyzed and the efficacy of both procedures was evaluated retrospectively. Results: The study group consisted of 3 boys and 3 girls with an average age of 1.83 ± 1.12 (range, 1-4) years. The patients were examined before and after the operation under general anaesthesia. Sex, age, photophobia and tearing complaints, intraocular pressure, corneal haze, corneal diameter, and the type of operation performed were recorded. Mean intraocular pressure was 31.08 ± 6.06 mm Hg preoperatively and decreased to 18.41 ± 4.05 mm Hg postoperatively. Goniotomy was performed on four eyes, and the remaining eight eyes underwent trabeculotomy. During a mean follow-up of 7.50 ± 6.86 (range, 6-26) months, surgical success was 75.0% in the goniotomy group and 87.5% in the trabeculotomy group. Trabeculectomy combined with mitomycin C was performed on the two failed cases. Conclusion: Both goniotomy and trabeculotomy are successful surgical procedures in the management of congenital glaucoma.