188 research outputs found
Disruption of cholinergic neurotransmission, within a cognitive challenge paradigm, is indicative of Aβ-related cognitive impairment in preclinical Alzheimer’s disease after a 27-month delay interval
Background
Abnormal beta-amyloid (Aβ) is associated with deleterious changes in central cholinergic tone in the very early stages of Alzheimer’s disease (AD), which may be unmasked by a cholinergic antagonist (J Prev Alzheimers Dis 1:1–4, 2017). Previously, we established the scopolamine challenge test (SCT) as a “cognitive stress test” screening measure to identify individuals at risk for AD (Alzheimer’s & Dementia 10(2):262–7, 2014) (Neurobiol. Aging 36(10):2709-15, 2015). Here we aim to demonstrate the potential of the SCT as an indicator of cognitive change and neocortical amyloid aggregation after a 27-month follow-up interval. Methods
Older adults (N = 63, aged 55–75 years) with self-reported memory difficulties and a first-degree family history of AD completed the SCT and PET amyloid imaging at baseline and were then seen for cognitive testing at 9, 18, and 27 months post-baseline. Repeat PET amyloid imaging was completed at the time of the 27-month exam.
Results
After the 27-month follow-up period, significant differences in both cognitive performance and Aβ neocortical burden were observed between participants who failed vs. passed the SCT at baseline.
Conclusions
Cognitive response to the SCT (Alzheimer's & Dementia 10(2):262–7, 2014) at baseline is related to cognitive change and PET amyloid imaging results, over the course of 27 months, in preclinical AD. The SCT may be a clinically useful screening tool to identify individuals who are more likely both to have positive evidence of amyloidosis on PET imaging and to show measurable cognitive decline over several years.
Demographic and clinical characteristics associated with glomerular filtration rates in living kidney donors
Due to the shortage of organs, living donor acceptance criteria are becoming less stringent. An accurate determination of the glomerular filtration rate (GFR) is critical in the evaluation of living kidney donors, and a value exceeding 80 ml/min per 1.73 m2 is usually considered suitable. To improve strategies for kidney donor screening, an understanding of factors that affect GFR is needed. Here we studied the relationships between donor GFR measured by 125I-iothalamate clearances (mGFR) and age, gender, race, and decade of care in living kidney donors evaluated at the Cleveland Clinic from 1972 to 2005. We report the normal reference ranges for 1057 prospective donors (56% female, 11% African American). Females had slightly higher mGFR than males after adjustment for body surface area, but there were no differences due to race. The lower limit of normal for donors (5th percentile) was less than 80 ml/min per 1.73 m2 for females over age 45 and for males over age 40. We found a significant doubling in the rate of GFR decline in donors over age 45 as compared to younger donors. The age of the donors and body mass index increased over time, but their mGFR, adjusted for body surface area, significantly declined by 1.49±0.61 ml/min per 1.73 m2 per decade of testing. Our study shows that age and gender are important factors determining normal GFR in living kidney donors.
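The body-surface-area adjustment referred to above can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes the standard DuBois & DuBois BSA formula and the conventional normalization of a measured clearance to 1.73 m2; the function names and sample values are hypothetical.

```python
def dubois_bsa(weight_kg: float, height_cm: float) -> float:
    """Body surface area (m^2) via the DuBois & DuBois formula."""
    return 0.007184 * weight_kg**0.425 * height_cm**0.725

def normalize_gfr(mgfr_ml_min: float, bsa_m2: float) -> float:
    """Scale a measured clearance to the conventional 1.73 m^2 BSA."""
    return mgfr_ml_min * 1.73 / bsa_m2

# Hypothetical donor: 70 kg, 170 cm, measured clearance 100 ml/min
bsa = dubois_bsa(70, 170)          # ~1.81 m^2
gfr_adj = normalize_gfr(100, bsa)  # ~95.6 ml/min per 1.73 m^2
```

Normalizing to 1.73 m2 is what allows donors of different body sizes to be compared against a single threshold such as the 80 ml/min per 1.73 m2 cutoff mentioned above.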
Commercialization of the Internet: The Interaction of Public Policy and Private Actions
seminar participants for comments. I am particularly grateful to Zvi Griliches, who encouraged this research when it was at a formative stage. All remaining errors are mine alone.
Abstract
Why did commercialization of the Internet go so well? This paper examines events in the Internet access market as a window on this broad question. The study emphasizes four themes. First, commercializing Internet access did not give rise to many of the anticipated technical and operational challenges. Entrepreneurs quickly learned that the Internet access business was commercially feasible. Second, Internet access was malleable as a technology and as an economic unit. Third, privatization fostered attempts to adapt the technology to new uses, new locations, new market settings, new applications, and new combinations with other lines of business. These went beyond what anyone would have forecast by examining the uses for the technology prior to 1992. Fourth, and not trivially, the NSF was lucky in one specific sense. The Internet access industry commercialized at a propitious moment, at the same time as the growth of an enormous new technological opportunity, the World Wide Web. As it turned out, the Web thrived under market-oriented, decentralized and independent decision making. The paper draws lessons for policies governing the commercialization of other government-managed technologies and for the Internet access market moving forward.
1 Motivation
The "commercialization of the Internet" is shorthand for three nearly simultaneous events: the removal of restrictions by the National Science Foundation (NSF) over use of the Internet for commercial purposes, the browser wars initiated by the founding of Netscape, and the rapid entry of tens of thousands of firms into commercial ventures using technologies which employ the suite of TCP/IP standards.
These events culminated years of work at NSF to transfer the Internet from its exclusive use for research activity in government-funded laboratories and universities into commercial hands. Sufficient time has passed to begin to evaluate how the market performed after commercialization. Such an evaluation is worth doing. Actual events have surpassed the forecasts of the most optimistic managers at NSF. Was this due to mere good fortune, or to something systematic whose lessons illuminate the market today? Other government-managed technologies usually face vexing technical and commercial challenges that prevent the technology from diffusing quickly, if at all. Can we draw lessons from this episode for the commercialization of other government-managed technologies? In that spirit, this paper examines the Internet access market and one set of actors, Internet Service Providers (ISPs). ISPs provide Internet access for most of the households and business users in the country (NTIA, 1999), usually for a fee or, more recently, in exchange for advertising. Depending on the user's facilities, whether a business or a personal residence, access can involve dial-up to a local number or 1-800 number at different speeds, or direct access to the user's server employing one of several high-speed access technologies. The largest ISP in the United States today is America Online, to which approximately half the households in the US subscribe. There are also many national ISPs with recognizable names, such as AT&T Worldnet, MCI WorldCom/UUNet, Mindspring/Earthlink, and PSINet, as well as thousands of smaller regional ISPs. The Internet access market is a good case to examine. Facilities for similar activity existed prior to commercialization, but there was reason to expect a problematic migration into commercial use. This activity appeared to possess idiosyncratic technical features and uneconomic operational procedures which made it unsuitable in other settings.
The Internet's exclusive use by academics and researchers fostered cautious predictions that unanticipated problems would abound and commercial demand might not materialize. In sharp contrast to those cautious expectations, however, the ISP market displayed three extraordinary features. For one, this market grew rapidly, attracting thousands of entrants and many users, quickly achieving mass-market status. Second, firms offering this service became nearly geographically pervasive, a diffusion pattern rarely found in new infrastructure markets. And third, firms did not settle on a standard menu of services to offer, indicative of new commercial opportunities and also of a lack of consensus about the optimal business model for this opportunity. Aside from defying expectations, all three traits (rapid growth, geographic pervasiveness, and the absence of settlement) do not inherently go together in most markets. The restructuring implied by the absence of settlement should have interfered with rapid growth and geographic expansion. So explaining this market experience is also interesting in its own right. What happened to make commercialization go so well? This paper's examination reveals four themes. First, commercialization did not give rise to many of the anticipated technical and operational challenges. Entrepreneurs quickly learned that the Internet access business was commercially feasible. This happened for a variety of economic reasons. ISPs began offering commercial service after making only incremental changes to familiar operating procedures borrowed from the academic setting. It was technically easy to collect revenue at what used to be the gateway functions of academic modem pools. Moreover, the academic model of Internet access migrated into commercial operation without requiring any additional new equipment suppliers. Second, Internet access was malleable as a technology and as an economic unit.
This is because the foundation for Internet inter-connectivity, TCP/IP, is not a single invention, diffusing across time and space without changing form. Instead, it is embedded in equipment which uses a suite of communication technologies, protocols and standards for networking between computers. This technology obtains economic value in combination with complementary invention, investment and equipment. While commercialization did give rise to restructuring of Internet access to suit commercial users, the restructuring did not stand in the way of diffusion, nor interfere with the initial growth of demand. Third, privatizing Internet access fostered customizing Internet access technology to a wide variety of locations, circumstances and users. As it turned out, the predominant business model was feasible at small scale and, thus, at low levels of demand. This meant that the technology was commercially viable at low densities of population, whether or not it was part of a national branded service or a local, geographically focused one. Fourth, and not trivially, the NSF was lucky in a particular sense of the word. It enabled the commercialization of the Internet access industry at a propitious moment, at the same time as the growth of an enormous new technological opportunity, the World Wide Web. This invention motivated further experimentation to take advantage of the new opportunity, which, as it turned out, thrived under market-oriented and decentralized decision making. The paper first develops these themes. Then it describes recent experience. It ends by discussing how these themes continue to resonate today.
Challenges during technology transfer: an overview
Conventional approaches to technological development led most observers in 1992 to be cautious about the commercialization of the Internet. To understand how this prediction went awry, it is important to understand its foundations.
For example, military users frequently require electronic components to meet specifications that suit the component to battle conditions. Extensive technical progress is needed to tailor a product design to meet these requirements. Yet, and this is difficult to anticipate prior to commercialization, an additional amount of invention is often needed to bring such a product design and its manufacturing to a price point with features that meet more cost-conscious or less technically stringent commercial requirements. Commercial challenges arise when commercial markets require substantial adaptation of operational and business processes in order to put technologies into use. In other words, government users or users in a research environment often tolerate operational processes that do not translate profitably to commercial environments. After a technology transfers out of government sponsorship, it may not be clear how to balance costs and revenues for technologies that developed under settings with substantial subsidies underwriting losses and research goals justifying expenditures. Hence, many government-managed technologies require considerable experimentation with business models before they begin to grow, if they grow at all. For example, the supersonic transport actually met its engineering targets, but still failed to satisfy basic operational economics in most settings. Being technically sleek was insufficient to attract enough interest to generate revenue that covered operating costs on any but a small set of routes. No amount of operational innovation or marketing was able to overcome these commercial problems. New technologies are also vulnerable to structural challenges that impede pathways to commercialization. Commercial and structural challenges are not necessarily distinct, though the latter are typically more complex.
Structural challenges are those which require change to the bundle of services offered, change to the boundary of the firms offering or using the new technology, or dramatic change to the operational structure of the service organization. These challenges arise because technologies developed under government auspices may presume implementation at a particular scale or with a set of technical standards, but require a different set of organizational arrangements to support commercial applications. For example, while many organizations provided the technical advances necessary for scientific computing in academic settings during the 1950s, very few of these same firms migrated into supporting large customer bases among business users. As it turned out, the required changes were too dramatic for many companies to make. The structure of the support and sales organization was very different, and so too were the product designs. Of course, the few who successfully made the transition to commercial users, such as IBM, did quite well, but doing so required overcoming considerable obstacles. In summary, conventional analysis forecast that migrating Internet access into commercial use would engender technical, commercial and structural challenges. Why did the migration proceed so differently from what was expected?
The absence of challenge in the Internet access industry
An ISP is a commercial firm that provides Internet access, maintains it for a fee, and develops related applications as users require. While sometimes this is all they do, with business users they often do much more. Sometimes ISPs do simple things such as filtering. Sometimes it involves managing and designing email accounts, databases and web pages. Some ISPs label this activity consulting and charge for it separately; others do not consider it distinct from the normal operation of the Internet access services. On the surface, the record of achievement for ISPs is quite remarkable.
Most recent surveys show that no more than 10 percent of US households get their Internet access from university-sponsored Internet access providers, the predominant provider of such access prior to commercialization. Today almost all users go to a commercial provider. By the end of the century the ISP market had obtained a remarkable structure. One firm, America Online, provided access to close to half the households in the US market, while several score of other ISPs provided access to millions of households and businesses on a nationwide basis. Thousands of ISPs also provided access for limited geographic areas, such as one city or region. Such small ISPs accounted for roughly a quarter of household use and another fraction of business use.
Technical challenges did not get in the way
The Internet access market did suffer from some technical challenges, but not enough to prevent rapid diffusion. Commercialization induced considerable technical innovation in complementary inventive activities. Much of this innovative activity became associated with developing new applications for existing users and new users. It is often forgotten that when electronic commerce first developed based on TCP/IP standards, the technology was relatively mature in some applications, such as e-mail and file transfers, which were the most popular applications (these programs continue to be the most popular today; NTIA, 1999). To be sure, TCP/IP-based programs were weak in other areas, such as commercial database and software applications for business use, but those uses did not necessarily have to come immediately. The invention of the World Wide Web in the early 1990s further stretched the possibilities for potential applications and highlighted these weaknesses. More important for the initial diffusion, little technical invention was required for commercial vendors to put this technology into initial mainstream use.
Academic modem pools and computing centers tended to use technologies similar to their civilian counterparts, such as bulletin board operators, while buying most equipment from commercial suppliers. Moving this activity into the mainstream commercial sector did not necessitate building a whole new Internet equipment industry; it was already there, supplying goods and services to the universities and to home PC users. Similarly, much of the software continued to be useful, i.e., Unix systems, the gate-keeping software, and the basic communication protocols. Indeed, every version of Unix software had been TCP/IP compatible for many years due to Department of Defense requirements. A simple commercial operation only needed to add a billing component to the gate-keeping software to turn an academic modem pool into a rudimentary commercial operation. Technical information about these operations was easy to obtain if one had sufficient technical background; a BA in basic electrical engineering or computer science was far more than adequate. Many IS
Prior exercise and antioxidant supplementation: effect on oxidative stress and muscle injury
<p>Abstract</p> <p>Background</p> <p>Both acute bouts of prior exercise (preconditioning) and antioxidant nutrients have been used in an attempt to attenuate muscle injury or oxidative stress in response to resistance exercise. However, most studies have focused on untrained participants rather than on athletes. The purpose of this work was to determine the independent and combined effects of antioxidant supplementation (vitamin C + mixed tocopherols/tocotrienols) and prior eccentric exercise in attenuating markers of skeletal muscle injury and oxidative stress in resistance trained men.</p> <p>Methods</p> <p>Thirty-six men were randomly assigned to: no prior exercise + placebo; no prior exercise + antioxidant; prior exercise + placebo; prior exercise + antioxidant. Markers of muscle/cell injury (muscle performance, muscle soreness, C-reactive protein, and creatine kinase activity), as well as oxidative stress (blood protein carbonyls and peroxides), were measured before and through 48 hours of exercise recovery.</p> <p>Results</p> <p>No group by time interactions were noted for any variable (P > 0.05). Time main effects were noted for creatine kinase activity, muscle soreness, maximal isometric force and peak velocity (P < 0.0001). Protein carbonyls and peroxides were relatively unaffected by exercise.</p> <p>Conclusion</p> <p>There appears to be no independent or combined effect of a prior bout of eccentric exercise or antioxidant supplementation as used here on markers of muscle injury in resistance trained men. Moreover, eccentric exercise as used in the present study results in minimal blood oxidative stress in resistance trained men. Hence, antioxidant supplementation for the purpose of minimizing blood oxidative stress in relation to eccentric exercise appears unnecessary in this population.</p>
The role of equilibrium and disequilibrium in modeling regional growth and decline: a critical reassessment
While the abstract could not be reproduced here, the paper argues that regional differentials in wages and rents are overwhelmingly of an equilibrium nature, with disequilibrium forces having little systematic influence.