25,189 research outputs found

    Economics and the new economy: the invisible hand meets creative destruction

    In the 18th century, Adam Smith offered his theory of the invisible hand and the view that perfect competition is the main spur to economic efficiency. The theory of the invisible hand, as it has evolved in modern economic thought, treats creative activity as being outside the scope of economic theory. In the 20th century, Joseph Schumpeter offered an alternative perspective: creativity is an economic activity. He argued that a capitalist market system rewards change by allowing those who create new products and processes to capture some of the benefits of their creations in the form of short-term monopoly profits, a situation that promotes what Schumpeter called "creative destruction." What should the fundamental paradigm of economics be: creative destruction or the invisible hand? In this article, Leonard Nakamura offers some possible answers to this question.
    Keywords: Economic development; Productivity; Wages

    Semantic Heterogeneity Issues on the Web

    The Semantic Web is an extension of the traditional Web in which the meaning of information is well defined, thus allowing better interaction between people and computers. To accomplish its goals, mechanisms are required to make the semantics of Web resources explicit so that they can be automatically processed by software agents (this semantics being described by means of online ontologies). Nevertheless, issues arise from the semantic heterogeneity that naturally occurs on the Web, namely redundancy and ambiguity. To tackle these issues, we present an approach to discover and represent, in a non-redundant way, the intended meaning of words in Web applications, while taking into account the (often unstructured) context in which they appear. To that end, we have developed novel ontology matching, clustering, and disambiguation techniques. Our work is intended to help bridge the gap between syntax and semantics for the construction of the Semantic Web.
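    The abstract names disambiguation among its techniques without detailing it. As a rough illustration of the general idea of context-based sense selection (a simplified Lesk-style overlap heuristic, not the authors' method; the senses and glosses below are made up for the example):

    ```python
    # Minimal Lesk-style word-sense disambiguation sketch. The sense
    # inventory and glosses are illustrative assumptions, not taken
    # from the paper or from any real ontology.

    def tokenize(text):
        return set(text.lower().split())

    # Hypothetical senses of the ambiguous word "bank", each with a short gloss.
    SENSES = {
        "bank/finance": "an institution that accepts deposits and lends money",
        "bank/river": "the sloping land alongside a river or stream",
    }

    def disambiguate(context_sentence, senses=SENSES):
        """Pick the sense whose gloss shares the most words with the context."""
        context = tokenize(context_sentence)
        return max(senses, key=lambda s: len(context & tokenize(senses[s])))
    ```

    For instance, `disambiguate("the boat drifted toward the bank of the river")` selects `"bank/river"` because its gloss shares more words with the context. Real systems replace the bag-of-words overlap with ontology-backed similarity measures.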

    Shikimate hydroxycinnamoyl transferase (HCT) activity assays in Populus nigra

    Lignin is a complex phenolic polymer deposited in secondarily-thickened plant cell walls. The polymer is mainly derived from the three primary monolignols: p-coumaryl, coniferyl and sinapyl alcohol, which give rise to p-hydroxyphenyl, guaiacyl and syringyl units (H, G and S units, respectively) when coupled into the polymer. The building blocks differ in their degree of methoxylation, and their biosynthetic pathway is catalyzed by more than 10 enzymes. HCT plays a crucial role by channeling the phenylpropanoids towards the production of coniferyl and sinapyl alcohols. Interestingly, HCT has been reported to be implicated in the pathway both upstream and downstream of the 3-hydroxylation of the aromatic ring of p-coumaroyl shikimate (Figure 1) (Hoffmann et al., 2003; Hoffmann et al., 2004; Vanholme et al., 2013b). These features highlight the importance of developing an assay to reliably measure HCT activity in planta. Here, we describe a UPLC-MS-based method for the analysis of HCT activity in xylem total protein extracts of Populus nigra, which can be adapted to other woody and herbaceous plant species. The protocol was initially described in Vanholme et al. (2013a).

    The "Artificial Mathematician" Objection: Exploring the (Im)possibility of Automating Mathematical Understanding

    Reuben Hersh confided to us that, about forty years ago, the late Paul Cohen predicted to him that at some unspecified point in the future, mathematicians would be replaced by computers. Rather than focus on computers replacing mathematicians, however, our aim is to consider the (im)possibility of human mathematicians being joined by "artificial mathematicians" in the proving practice, not just as a method of inquiry but as fellow inquirers.

    Automated reduction of submillimetre single-dish heterodyne data from the James Clerk Maxwell Telescope using ORAC-DR

    With the advent of modern multi-detector heterodyne instruments, whose observations can generate thousands of spectra per minute, it is no longer feasible to reduce these data as individual spectra. We describe the automated data reduction procedure used to generate baselined data cubes from heterodyne data obtained at the James Clerk Maxwell Telescope. The system can automatically detect baseline regions in spectra and automatically determine regridding parameters, all without input from a user. Additionally, it can detect and remove spectra suffering from transient interference effects or anomalous baselines. The pipeline is written as a set of recipes using the ORAC-DR pipeline environment, with the algorithmic code using Starlink software packages and infrastructure. The algorithms presented here can be applied to other heterodyne array instruments and have been applied to data from historical JCMT heterodyne instrumentation.
    Comment: 18 pages, 13 figures, submitted to Monthly Notices of the Royal Astronomical Society
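    ORAC-DR itself is a recipe-based pipeline built on Starlink software, so the following is only an illustrative NumPy sketch of the kind of automatic baseline detection the abstract describes, not the pipeline's actual code. The function name, polynomial order, and clipping threshold are our own assumptions: line-free channels are found by iterative sigma-clipping of the residuals, and a polynomial baseline fitted to those channels is subtracted.

    ```python
    import numpy as np

    def subtract_baseline(spectrum, order=1, clip=3.0, iters=5):
        """Illustrative sketch (not ORAC-DR code): identify line-free
        channels by iterative sigma-clipping, fit a polynomial baseline
        to them, and subtract it from the spectrum."""
        x = np.arange(spectrum.size, dtype=float)
        mask = np.ones(spectrum.size, dtype=bool)  # start with all channels
        for _ in range(iters):
            # Fit the baseline only to channels currently deemed line-free.
            coeffs = np.polyfit(x[mask], spectrum[mask], order)
            resid = spectrum - np.polyval(coeffs, x)
            sigma = resid[mask].std()
            # Reject channels with strong residuals (emission lines, spikes).
            new_mask = np.abs(resid) < clip * sigma
            if np.array_equal(new_mask, mask):
                break  # converged: the line-free channel set is stable
            mask = new_mask
        return spectrum - np.polyval(coeffs, x), mask
    ```

    A production pipeline must also handle regridding, weighting, and the rejection of whole spectra affected by transient interference, which this single-spectrum sketch omits.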

    Edaq530: a transparent, open-end and open-source measurement solution in natural science education

    We present Edaq530, a low-cost, compact and easy-to-use digital measurement solution consisting of a thumb-sized USB-to-sensor interface and measurement software. The solution is fully open-source, our aim being to provide a viable alternative to professional solutions. Our main focus in designing Edaq530 has been versatility and transparency. In this paper, we introduce the capabilities of Edaq530, complement this with a few sample experiments, and discuss the feedback we received in the course of a teacher training workshop in which the participants received personal copies of Edaq530 and later reported on how they could utilise it in their teaching.