
    Human Uniqueness, Cognition by Description, and Procedural Memory

    Evidence will be reviewed suggesting a fairly direct link between the human ability to think about entities one has never perceived, here called “cognition by description”, and procedural memory. Cognition by description is a uniquely hominid trait which makes religion, science, and history possible. It is hypothesized that cognition by description (in the manner of Bertrand Russell’s “knowledge by description”) requires variable binding, which in turn utilizes quantifier raising. Quantifier raising plausibly depends upon the computational core of language, specifically the element of it which Noam Chomsky calls “internal Merge”. Internal Merge produces hierarchical structures by means of a memory of derivational steps, a process plausibly involving procedural memory. The hypothesis is testable, predicting that procedural memory deficits will be accompanied by impairments in cognition by description. We also discuss neural mechanisms plausibly underlying procedural memory and, by our hypothesis, cognition by description.
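
    As a rough illustration of the mechanism invoked above, the toy sketch below (ours, not the authors') shows how repeated Merge builds hierarchical structure and how "internal Merge" re-merges an element introduced at an earlier derivational step, which is why a record of those steps is needed.

```python
# Toy illustration (not the paper's model): Merge builds binary, hierarchical
# structure; "internal Merge" re-merges an element already contained in the
# current object, which presupposes memory of earlier derivational steps.

def merge(a, b):
    """External Merge: combine two objects into a new hierarchical object."""
    return (a, b)

def contains(tree, item):
    """Check whether `item` was introduced at an earlier derivational step."""
    if tree == item:
        return True
    return isinstance(tree, tuple) and any(contains(t, item) for t in tree)

def internal_merge(tree, item):
    """Internal Merge: re-merge a sub-part of `tree` with `tree` itself,
    yielding a displaced copy at the edge (e.g. wh-movement)."""
    assert contains(tree, item), "internal Merge needs memory of prior steps"
    return (item, tree)

# Schematic "what did John buy" style derivation:
vp = merge("buy", "what")          # {buy, what}
tp = merge("John", vp)             # {John, {buy, what}}
cp = internal_merge(tp, "what")    # {what, {John, {buy, what}}}
print(cp)
```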

    Design-for-delay-testability techniques for high-speed digital circuits

    The importance of delay faults is heightened by the ever-increasing clock rates and decreasing geometry sizes of today's circuits. This thesis focuses on the development of Design-for-Delay-Testability (DfDT) techniques for high-speed circuits and embedded cores. The rising costs of IC testing, and in particular the costs of Automatic Test Equipment, are major concerns for the semiconductor industry. To reverse the trend of rising testing costs, DfDT is becoming increasingly important.

    Delay Test Quality Evaluation Using Bounded Gate Delays

    Conventionally, path delay tests are derived in a delay-independent manner, which causes most faults to be robustly untestable. Many non-robust tests are invalidated by hazards caused primarily by non-zero delays of off-path circuit elements. Thus, non-robust tests are of limited value when process variations change gate delays. We propose a bounded gate delay model for test quality evaluation and give a novel simulation algorithm that is less pessimistic than previous approaches. The key idea is that certain time-correlations among the multiple transitions at the inputs of a gate cannot cause a hazard at its output. We maintain “ambiguity lists” for gates. These are propagated with events, similar to fault lists in a traditional concurrent fault simulation, and are used to suppress erroneous unknown states. Experimental results for ISCAS benchmarks with gate delay variation of ±14% show a miscorrelation of critical path delay of as much as 20%.
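
    To make the bounded-delay idea concrete, here is a much-simplified sketch (the names and structure are ours; the paper's algorithm maintains per-gate ambiguity lists and exploits input correlations, which this sketch does not): each signal carries an interval during which it may still be changing, and a gate whose delay is bounded in [d_min, d_max] widens that interval accordingly.

```python
# Simplified bounded-delay ambiguity propagation; illustrative only.
from dataclasses import dataclass

@dataclass
class Ambiguity:
    """Interval during which a signal may be changing: it is stable at its
    initial value before `earliest` and at its final value after `latest`."""
    earliest: float
    latest: float

def propagate(gate_inputs, d_min, d_max):
    """Output of a gate with delay bounded in [d_min, d_max]: it cannot start
    changing before the first input changes plus d_min, and is guaranteed
    stable once every input is stable plus d_max."""
    earliest_out = min(a.earliest for a in gate_inputs) + d_min
    latest_out = max(a.latest for a in gate_inputs) + d_max
    return Ambiguity(earliest_out, latest_out)

# Two inputs with their own ambiguity intervals; gate delay in [0.86, 1.14],
# i.e. a nominal delay of 1.0 with the ±14% variation used in the experiments.
a = Ambiguity(earliest=2.0, latest=3.0)
b = Ambiguity(earliest=2.5, latest=2.5)   # a clean single transition
print(propagate([a, b], d_min=0.86, d_max=1.14))   # ≈ Ambiguity(2.86, 4.14)
```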

    Modelling and Design of Inverter Threshold Quantization based Current Comparator using Artificial Neural Networks

    The performance of a MOS-based circuit is strongly influenced by the transistor dimensions chosen for that circuit, so proper dimensioning of the transistors plays a key role in determining its overall performance. While choosing the dimensions is critical, it is equally difficult, primarily because of the complex mathematical formulations that come into play at the submicron level. The drain current is the most affected parameter, which in turn affects all other parameters. Thus, there is a constant quest for techniques and procedures that simplify the dimensioning process while still keeping the parameters under check. This study presents one such novel technique to estimate the transistor dimensions for a current comparator structure using an artificial neural network approach. The approach uses multilayer perceptrons as the neural network architecture. The technique involves a two-step process. In the first step, training and test data are obtained from SPICE simulations of the modelled circuit using 0.18 ”m TSMC CMOS technology parameters. In the second step, these training and test data are applied to the developed neural network architecture using MATLAB R2007b.
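
    The two-step flow above can be sketched as follows. The original work was done in MATLAB R2007b; the Python sketch below only illustrates the idea, and the data shapes, feature meanings, and network size are placeholders of ours, not the authors' dataset or architecture.

```python
# Illustrative only: fit a multilayer perceptron to map circuit specs to
# transistor dimensions, standing in for the paper's SPICE-data + MLP flow.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Placeholder training data: rows would be SPICE-simulated design points.
# X columns might be e.g. [bias current, supply voltage, required resolution];
# y columns the transistor dimensions (W, L) to be estimated.
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 3))
y = rng.uniform(low=0.18, high=5.0, size=(200, 2))   # ”m, down to 0.18 ”m

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# A multilayer perceptron regressor; the hidden-layer sizes are ours.
mlp = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
mlp.fit(X_train, y_train)

print("test R^2:", mlp.score(X_test, y_test))
print("predicted (W, L):", mlp.predict(X_test[:1]))
```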

    Scientific Evidence and Forensic Science Since Daubert: Maine Decides to Sit out on the Dance

    In 1993, the Supreme Court of the United States stated that with the federal adoption of statutory rules of evidence in 1975, the common law rule for determining admissibility of scientific testimony was superseded, and that thenceforth admissibility of scientific testimony was to be determined solely by Federal Rule of Evidence 702 (Rule 702). The Frye standard had been adopted in one form or another by most of the federal circuits and by many of the state courts during the 70 years preceding Daubert. Referred to as the “general acceptance” standard, the Frye standard--although adopted in a variety of forms--had the core requirement that proffered scientific testimony be based on something enjoying “general acceptance” among some set of scientists. It was an effort to ensure that expert testimony had some measure of reliability. The Daubert Court, in agreement with Petitioners and with the authors of six of the twenty-two amicus briefs that had been filed, held that the strictness of the Frye “general acceptance” requirement was not in keeping with the goal of the Federal Rules of Evidence to liberalize admission criteria. If the Court had stopped there, Daubert would be tantamount to the Maine scientific evidence rule, as set out in State v. Williams. But it did not stop there. Instead, engaging in what some might characterize as an exegesis, the Court asserted that since inaccurate expert testimony could not “determine a fact in issue,” it was necessary for the trial judge to exclude expert testimony not based on the scientific method. The Court thereby brought in through the back door the same reliability concern that had led to the widespread adoption of Frye in the first place. This author asserts that, contrary to popular legal and lay belief, the significance of Daubert lies not in its discarding of Frye and its emphasis on Rule 702, but rather in its exhorting of trial judges to exercise their “gatekeeper” role with respect to scientific evidence, something that many had been fairly lax about previously. It is precisely because trial judges have taken this gatekeeper role more seriously than in the past that a revolution is occurring in scientific evidence and forensic science. Adding to the pressure for reexamination and change has been the plethora of DNA-based wrongful-conviction discoveries of the past decade. Men convicted of the most heinous crimes, and often sentenced to death, have subsequently been found indisputably innocent of those crimes. Just as an autopsy provides a post-mortem check of a physician's cause-of-death finding and/or an earlier diagnosis of disease, the post-conviction DNA analysis can provide a check on the correctness of a verdict or plea. Of course, there is less symmetry in the legal selection process than there is in the medical. Although autopsies are generally sought whenever there is uncertainty in the diagnosis or cause of death, post-conviction DNA reviews are sought only to prove that the guilty verdict was mistaken, that it represented a “false positive.” No one seeks such reviews to support a verdict of innocence. Furthermore, it is unlikely that a prosecutor would ever seek such a review to support a verdict (or plea) of guilty. Once the wrongful-conviction findings began to surface, there was great interest in investigating what had gone wrong at the underlying trials.
It was realized that, in addition to answering the pressing specific question, the results of such an investigation might have significance for criminal trials in general, regardless of the crime charged, and for civil trials. Presumably, errors that were occurring in trials that could be checked with DNA analysis were also occurring in trials for which DNA analysis was not available. The most common threads running through the trials that led to wrongful convictions are a paucity of evidence and the failure of the defense to put on a forensic expert. In many of the cases, there was no physical evidence at all and the prosecution's case rested entirely on eyewitness testimony, sometimes from a single witness. The forensic science community was most dismayed by those cases where the wrongful verdict was based on specious forensic testimony. In most instances, the testimony involved exaggerating, either through implication or direct lying, the significance of those tests that had been done. A typical example would involve the claim that hairs can be “individualized” by microscopic examination, leading to the conclusion that specimens of the defendant's hair had been found at the crime scene. Although the falsity of such statements has long been recognized in professional scientific literature, it seems not to be recognized by the majority of the public. This means that, in the absence of effective opposition, a jury will probably accept the false testimony at face value and as persuasive evidence. Even if the witness only makes a literally true statement that the hair specimen found at the scene “is consistent with” being the defendant's, a jury and judge not familiar with this type of evidence, and not alert to the “is consistent with” subterfuge, can be influenced to the severe detriment of the defendant. A knowledgeable defense expert can help cure such testimony or even prevent it from being offered in the first place.

    A NASA family of minicomputer systems, Appendix A

    This investigation was undertaken to establish sufficient specifications, or standards, for minicomputer hardware and software to provide NASA with realizable economies in quantity purchases; interchangeability of minicomputers, software, storage, and peripherals; and a uniformly high quality. The standards will define minicomputer system component types, each specialized to its intended NASA application, in as many levels of capacity as required.

    Proving that a Tree Language is not First-Order Definable

    We explore from an algebraic viewpoint the properties of the tree languages definable with a first-order formula involving the ancestor predicate, using the description of these languages as those recognized by iterated block products of forest algebras defined from finite counter monoids. Proofs of nondefinability are infinite sequences of sets of forests, one for each level of the quantification hierarchy that defines the corresponding variety of languages. The forests at a given level are built recursively by inserting forests from the previous level at the ports of a suitable set of multicontexts. We show that a recursive proof exists for the syntactic algebra of every non-definable language. We also investigate certain types of uniform recursive proofs. For this purpose, we define from a forest algebra an algebra of mappings and an extended algebra, which we also use to redefine the notion of aperiodicity in a way that generalizes the existing definitions.
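
    For orientation, the sketch below shows the classical aperiodicity test on a finite monoid (the word-language setting of Schützenberger's theorem); the paper itself works with forest algebras and a generalized notion, which this sketch does not capture. The function names and examples are ours.

```python
# Classical aperiodicity check: for every element x there must be some n
# with x^n = x^(n+1). Illustrative only; not the paper's forest-algebra notion.

def is_aperiodic(elements, mult):
    """elements: finite list of monoid elements; mult(a, b): their product."""
    bound = len(elements)          # powers of x stabilize within |M| steps, if at all
    for x in elements:
        power = x                  # x^1
        stable = False
        for _ in range(bound):
            nxt = mult(power, x)   # next power of x
            if nxt == power:       # found n with x^n = x^(n+1)
                stable = True
                break
            power = nxt
        if not stable:
            return False           # x generates a nontrivial group
    return True

# Saturating addition on {0,...,3} is aperiodic; addition mod 2 is not.
print(is_aperiodic(list(range(4)), lambda a, b: min(a + b, 3)))   # True
print(is_aperiodic([0, 1], lambda a, b: (a + b) % 2))             # False
```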