
    Enzymatic functionalization of carbon-hydrogen bonds

    The development of new catalytic methods to functionalize carbon–hydrogen (C–H) bonds continues to progress at a rapid pace due to the significant economic and environmental benefits of these transformations over traditional synthetic methods. In nature, enzymes catalyze regio- and stereoselective C–H bond functionalization using transformations ranging from hydroxylation to hydroalkylation under ambient reaction conditions. The efficiency of these enzymes relative to analogous chemical processes has led to their increased use as biocatalysts in preparative and industrial applications. Furthermore, unlike small molecule catalysts, enzymes can be systematically optimized via directed evolution for a particular application and can be expressed in vivo to augment the biosynthetic capability of living organisms. While a variety of technical challenges must still be overcome for practical application of many enzymes for C–H bond functionalization, continued research on natural enzymes and on novel artificial metalloenzymes will lead to improved synthetic processes for efficient synthesis of complex molecules. In this critical review, we discuss the most prevalent mechanistic strategies used by enzymes to functionalize non-acidic C–H bonds, the application and evolution of these enzymes for chemical synthesis, and a number of potential biosynthetic capabilities uniquely enabled by these powerful catalysts (110 references).

    Management and architecture click: The FAD(E)E Framework.

    Enterprises are living things. They constantly need to be (re-)architected in order to achieve the necessary agility, alignment and integration. This paper gives a high-level overview of how companies can go about doing 'enterprise architecture' in the context of both the classic (isolated) enterprise and the Extended Enterprise. By discussing the goals that are pursued in an enterprise architecture effort we reveal some basic requirements that can be put on the process of architecting the enterprise. The relationship between managing and architecting the enterprise is discussed and clarified in the FAD(E)E, the Framework for the Architectural Development of the (Extended) Enterprise. Keywords: Management; Architecture; Framework.

    Information Technology, Workplace Organization and the Demand for Skilled Labor: Firm-Level Evidence

    Recently, the relative demand for skilled labor has increased dramatically. We investigate one of the causes, skill-biased technical change. Advances in information technology (IT) are among the most powerful forces bearing on the economy. Employers who use IT often make complementary innovations in their organizations and in the services they offer. Our hypothesis is that these co-inventions by IT users change the mix of skills that employers demand. Specifically, we test the hypothesis that it is a cluster of complementary changes involving IT, workplace organization and services that is the key skill-biased technical change. We examine new firm-level data linking several indicators of IT use, workplace organization, and the demand for skilled labor. In both a short-run factor demand framework and a production function framework, we find evidence for complementarity. IT use is complementary to a new workplace organization which includes broader job responsibilities for line workers, more decentralized decision-making, and more self-managing teams. In turn, both IT and that new organization are complements with worker skill, measured in a variety of ways. Further, the managers in our survey believe that IT increases skill requirements and autonomy among workers in their firms. Taken together, the results highlight the roles of both IT and IT-enabled organizational change as important components of the skill-biased technical change.
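    As an illustration of the kind of complementarity test sketched in this abstract, the following is a minimal, hypothetical example: it fits a log production function with an IT × organization interaction term on synthetic firm-level data. The column names, the interaction specification, and the data are assumptions for illustration, not the authors' actual variables, model, or estimates.

```python
# Illustrative sketch (not the paper's specification): test complementarity
# between IT and workplace organization in a log production function by
# checking the sign of their interaction term. All column names and the
# data-generating process below are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500  # hypothetical firm-level sample
df = pd.DataFrame({
    "log_it_capital": rng.normal(size=n),
    "org_index": rng.normal(size=n),        # decentralization / team-use index
    "skill_share": rng.uniform(0, 1, n),    # share of skilled workers
    "log_labor": rng.normal(size=n),
    "log_capital": rng.normal(size=n),
})
# Synthetic output with a positive IT x organization interaction built in
df["log_output"] = (0.3 * df.log_it_capital + 0.2 * df.org_index
                    + 0.25 * df.log_it_capital * df.org_index
                    + 0.5 * df.log_labor + 0.3 * df.log_capital
                    + rng.normal(scale=0.5, size=n))

# A positive, significant coefficient on the interaction term is the usual
# regression evidence that IT and the new workplace organization are
# complements in production.
model = smf.ols(
    "log_output ~ log_it_capital * org_index + skill_share"
    " + log_labor + log_capital",
    data=df,
).fit(cov_type="HC1")
print(model.summary())
```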

    Automatic Detection of Seizures with Applications

    There are an estimated two million people with epilepsy in the United States. Many of these people do not respond to anti-epileptic drug therapy. Two devices can be developed to assist in the treatment of epilepsy. The first is a microcomputer-based system designed to process massive amounts of electroencephalogram (EEG) data collected during long-term monitoring of patients for the purpose of diagnosing seizures, assessing the effectiveness of medical therapy, or selecting patients for epilepsy surgery. Such a device would select and display important EEG events. Currently many such events are missed. A second device could be implanted and would detect seizures and initiate therapy. Both of these devices require a reliable seizure detection algorithm. A new algorithm is described. It is believed to represent an improvement over existing seizure detection algorithms because better signal features were selected and better standardization methods were used.
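    A minimal sketch of a feature-threshold seizure detector in the spirit described above, assuming band-pass filtering, two illustrative epoch features (line length and mean energy), and z-score standardization against a patient baseline; these choices are stand-ins for illustration, not the specific features or standardization methods of the algorithm in the paper.

```python
# Illustrative sketch, not the paper's algorithm: band-pass filter one EEG
# channel, compute per-epoch features, standardize them against a patient
# baseline, and flag epochs whose standardized features exceed a threshold.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 256            # sampling rate in Hz (assumed)
EPOCH_SEC = 2       # analysis window length in seconds (assumed)

def bandpass(x, lo=3.0, hi=30.0, fs=FS, order=4):
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def epoch_features(epoch):
    """Line length and mean energy of one epoch (illustrative features)."""
    return np.array([np.sum(np.abs(np.diff(epoch))), np.mean(epoch ** 2)])

def detect(eeg, baseline, threshold=4.0):
    """Flag epochs whose z-scored features exceed `threshold`.

    eeg, baseline: 1-D arrays of filtered samples from one channel.
    """
    step = FS * EPOCH_SEC
    base = np.array([epoch_features(baseline[i:i + step])
                     for i in range(0, len(baseline) - step, step)])
    mu, sd = base.mean(axis=0), base.std(axis=0) + 1e-9
    flags = []
    for i in range(0, len(eeg) - step, step):
        z = (epoch_features(eeg[i:i + step]) - mu) / sd
        flags.append(bool(np.any(z > threshold)))
    return flags

# Example with synthetic data standing in for a recorded channel: 30 s of
# background noise followed by a 10 s high-amplitude 5 Hz burst.
rng = np.random.default_rng(1)
baseline = bandpass(rng.normal(size=FS * 60))
burst = 5 * np.sin(2 * np.pi * 5 * np.arange(FS * 10) / FS)
test = bandpass(np.concatenate([rng.normal(size=FS * 30),
                                burst + rng.normal(size=FS * 10)]))
print(detect(test, baseline))
```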

    Perspective review of what is needed for molecular-specific fluorescence-guided surgery

    Molecular image-guided surgery has the potential for translating the tools of molecular pathology to real-time guidance in surgery. As a whole, there are incredibly positive indicators of growth, including the first United States Food and Drug Administration clearance of an enzyme-biosynthetic-activated probe for surgery guidance, and a growing number of companies producing agents and imaging systems. The strengths and opportunities must be continued but are hampered by important weaknesses and threats within the field. A key issue to solve is the inability of macroscopic imaging tools to resolve microscopic biological disease heterogeneity and the limitations in microscopic systems matching surgery workflow. A related issue is that parsing out true molecular-specific uptake from simple enhanced permeability and retention is hard and requires extensive pathologic analysis or multiple in vivo tests, comparing fluorescence accumulation with standard histopathology and immunohistochemistry. A related concern in the field is the over-reliance on a finite number of chosen preclinical models, leading to early clinical translation when the probe might not be optimized for high intertumor variation or intratumor heterogeneity. The ultimate potential may require multiple probes, as are used in molecular pathology, and a combination with ultrahigh-resolution imaging and image recognition systems, which capture the data at a finer granularity than is possible by the surgeon. Alternatively, one might choose a more generalized approach by developing the tracer based on generic hallmarks of cancer to create a more "one-size-fits-all" concept, similar to metabolic aberrations as exploited in fluorodeoxyglucose-positron emission tomography (FDG-PET) (i.e., Warburg effect) or tumor acidity. Finally, methods to approach the problem of production cost minimization and regulatory approvals in a manner consistent with the potential revenue of the field will be important. In this area, some solid steps have been demonstrated in the use of fluorescent labeling of commercial antibodies and separately in microdosing studies with small molecules.

    Green communication via Type-I ARQ: Finite block-length analysis

    This paper studies the effect of optimal power allocation on the performance of communication systems utilizing automatic repeat request (ARQ). Considering Type-I ARQ, the problem is cast as the minimization of the outage probability subject to an average power constraint. The analysis is based on some recent results on the achievable rates of finite-length codes, and we investigate the effect of codeword length on the performance of ARQ-based systems. We show that the performance of ARQ protocols is (almost) insensitive to the length of the codewords for codewords of length ≥ 50 channel uses. Also, optimal power allocation improves the power efficiency of ARQ-based systems substantially. For instance, consider a Rayleigh fading channel, codewords of rate 1 nat per channel use, and an outage probability of 10^{-3}. Then, with a maximum of 2 and 3 transmissions, the implementation of power-adaptive ARQ reduces the average power, compared to the open-loop communication setup, by 17 and 23 dB, respectively, a result which is (almost) independent of the codeword length. Also, optimal power allocation increases the diversity gain of the ARQ protocols considerably. Comment: Accepted for publication in GLOBECOM 201
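    A minimal Monte Carlo sketch of the Type-I ARQ outage calculation described above, assuming the standard normal approximation for the block error probability of finite-length codes and Rayleigh block fading with one independent fading realization per (re)transmission; the blocklength, rate, and uniform per-round powers below are placeholders rather than the paper's optimized allocation.

```python
# Illustrative sketch, not the paper's derivation: estimate the outage
# probability of Type-I ARQ over Rayleigh block fading, using the normal
# approximation for the error probability of finite-length codes.
import numpy as np
from scipy.stats import norm

def round_error_prob(snr, n, rate_nats):
    """Normal-approximation block error probability at blocklength n."""
    cap = np.log1p(snr)                       # capacity in nats per channel use
    disp = 1.0 - 1.0 / (1.0 + snr) ** 2       # channel dispersion
    return norm.sf(np.sqrt(n / np.maximum(disp, 1e-12)) * (cap - rate_nats))

def arq_outage(powers, n=200, rate_nats=1.0, trials=200_000, seed=0):
    """Outage = every one of the M (re)transmissions fails to decode."""
    rng = np.random.default_rng(seed)
    fail_all = np.ones(trials)
    for p in powers:                          # one fresh fading block per round
        gain = rng.exponential(1.0, trials)   # Rayleigh fading power gain
        fail_all *= round_error_prob(p * gain, n, rate_nats)
    return fail_all.mean()

# Uniform power across a maximum of 2 transmissions (placeholder allocation);
# optimizing the per-round powers is the subject of the paper.
for p_db in (10, 15, 20):
    p = 10 ** (p_db / 10)
    print(p_db, "dB:", arq_outage([p, p]))
```

    Replacing the uniform powers with round-dependent values is where a power-adaptive ARQ scheme would differ from this sketch; the estimator above only illustrates how finite-blocklength error probabilities enter the outage calculation.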