Supporting Innovation in Targeted Treatments: Licenses of Right to NIH-Funded Research Tools
Support for new drug development has taken some interesting turns in current patent law jurisprudence. Beginning with the severe curtailment of the scope of the common law experimental use doctrine in Madey v. Duke University, and culminating with the recent Supreme Court decision in Merck KGaA v. Integra Lifesciences I, Ltd., which broadened the scope of the statutory research exemption, the freedom to conduct experimental research using another's patented inventions now depends in part on the purpose of the research. That the patent at issue in Merck was characterized by the Federal Circuit as being directed to a research tool raised the question of the extent of protection that should be afforded to such inventions. Because new drug development necessarily involves some degree of exploratory research, research tools are often employed to facilitate the search for new drugs. Consequently, patents on such inventions present an interesting anomaly for the innovation-incentive argument for patent rights. The question of whether patents on research tools retard rather than enhance innovation has been discussed at length without any clear conclusion. Interestingly, the patent code, which in its present form was promulgated in large part through heavy lobbying by the pharmaceutical companies, now serves as a sword of Damocles hanging over these companies' research activities using patented research tools. The environment for drug development by large pharmaceutical companies is changing. Previously, chemistry dictated new development by defining new structures or active isomers of chemical compounds. The focus has now shifted to biology, as targeted treatments take the forefront in drug development. The shift can be credited in large part to the success of the Human Genome Project, which redefined medical research by fusing biological systems with advances in information technology.
Pharmacogenomics, as it is termed, describes the science behind targeted pharmaceuticals, which serves as a novel business model for the pharmaceutical industry[...] Targeted treatments could be the next generation of blockbuster pharmaceuticals the industry has been waiting for, in that they will shift the focus from traditional one-size-fits-all blockbuster medications to more selective products. However, much of the success of these drugs depends on the discovery and validation of new targets using existing and developing research tools. One noteworthy example of the importance of research tools in developing targeted treatments is the discovery of gene silencing by RNA interference (RNAi), which offers a promising possibility for treating AIDS and other diseases. The FDA has likewise criticized the disparity between advances in the understanding of diseases and the dearth of new pharmaceutical products. As a result, the clinical diagnostics business will play an important role in developing targeted treatments. Moreover, legislation facilitating overt and explicit contact between scientific discovery and product development has created changes that have led many commentators, both critics and advocates, to agree that something significant is occurring. The scientific lab is now directly linked with commercial outlets through a systematic infrastructure of the venture capital market, the surge of biotechnology start-up companies, and the convergence of university and industry. Due to the complexity of pharmacogenomics, access to many proprietary research tools is necessary to conduct research in this field; thus the significance of the availability of research tools becomes apparent. However, broad patents on upstream products arguably adversely affect innovation in this area. This article will address the limitations on the use of research tools and propose a means to ensure their continued accessibility in order to promote innovation in pharmacogenomics.
Part I will address the development of biotechnology in the United States and the legislation that has helped spur innovation in the field. Part II will discuss the arguments put forth in support of open science and the legal issues surrounding access to patented tools for research purposes. Part III will discuss the current literature on research tools, including the economics and legal policies relating to patents on research tools and alternatives to patenting. Finally, Part IV will propose a policy incorporating licenses of right to ensure continued accessibility of patented research tools developed using NIH funding.
Effects of lengthscales and attractions on the collapse of hydrophobic polymers in water
We present results from extensive molecular dynamics simulations of collapse
transitions of hydrophobic polymers in explicit water focused on understanding
effects of lengthscale of the hydrophobic surface and of attractive
interactions on folding. Hydrophobic polymers display parabolic, protein-like,
temperature-dependent free energy of unfolding. Folded states of small
attractive polymers are marginally stable at 300 K, and can be unfolded by
heating or cooling. Increasing the lengthscale or decreasing the polymer-water
attractions stabilizes folded states significantly, the former dominated by the
hydration contribution. That hydration contribution can be described by the
surface tension model, ΔG = γA, where the surface tension, γ, is
lengthscale dependent and decreases monotonically with
temperature. The resulting variation of the hydration entropy with polymer
lengthscale is consistent with theoretical predictions of Huang and Chandler
(Proc. Natl. Acad. Sci., 97, 8324-8327, 2000) that explain the blurring of
entropy convergence observed in protein folding thermodynamics. Analysis of
water structure shows that the polymer-water hydrophobic interface is soft and
weakly dewetted, and is characterized by enhanced interfacial density
fluctuations. Formation of this interface, which induces polymer folding, is
strongly opposed by enthalpy and favored by entropy, similar to the
vapor-liquid interface. (Comment: 24 pages, 5 figures)
Concurrency Lock Issues in Relational Cloud Computing
The widespread popularity of cloud computing as a preferred platform for deploying web applications has resulted in an enormous number of applications moving to the cloud, and in the huge success of cloud service providers. Because of the increasing number of web applications hosted in the cloud, and the growing scale of data these applications store, process, and serve, scalable data management systems form a critical part of cloud infrastructures. Hosting a database in the cloud raises security issues, the most challenging of which are multi-tenancy, scalability, and privacy. This paper focuses on the data-security problems faced in a relational cloud. The problems faced by different types of tenants, and the modes in which they access the database, call for a re-examination of data security based on an analysis of appropriate locking strategies for the records accessed. Data security in cloud computing depends on the users' access mode (analytical or transactional) and on the frequency of data access at the physical location (shared or non-shared disk mode). Accordingly, various data-locking strategies are studied, and an appropriate locking mechanism will be implemented for real-time applications such as e-commerce.
Keywords: Relational Cloud, multi-tenant, two-phase locking, concurrency control, data management
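The abstract names two-phase locking (2PL) as the concurrency-control technique of interest. The sketch below illustrates the strict variant in a minimal, single-process form; all class names, record keys, and the API are invented for this illustration and are not taken from the paper.

```python
import threading

class LockManager:
    """Hands out one exclusive lock per record identifier."""
    def __init__(self):
        self._locks = {}                 # record_id -> threading.Lock
        self._guard = threading.Lock()   # protects the dict itself

    def lock_for(self, record_id):
        with self._guard:
            return self._locks.setdefault(record_id, threading.Lock())

class Transaction:
    """Strict 2PL: locks are only acquired while the transaction runs
    (growing phase) and are all released together at commit (shrinking
    phase). A real manager would also track shared/exclusive modes and
    re-entrancy; this sketch uses exclusive locks only."""
    def __init__(self, manager):
        self.manager = manager
        self.held = []

    def access(self, record_id):
        lock = self.manager.lock_for(record_id)
        lock.acquire()                   # growing phase: acquisitions only
        self.held.append(lock)

    def commit(self):
        for lock in reversed(self.held): # shrinking phase: release everything
            lock.release()
        self.held.clear()

# Usage: one transaction touching two records, then committing.
mgr = LockManager()
t = Transaction(mgr)
t.access("order:42")
t.access("stock:7")
t.commit()
```

Because no lock is released before commit, any interleaving of such transactions is serializable, which is the property the locking strategies discussed in the paper aim to guarantee.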
Immunocytochemical localisation of follicle stimulating hormone (FSH) in normal, benign and malignant human prostates.
Immunocytochemical localisation of follicle stimulating hormone (FSH) was carried out in normal, benign and malignant human prostates by the indirect immunoperoxidase technique. Positive staining was observed in the epithelial cells of all three categories, while the stromal cells showed a weakly positive reaction in a few specimens. The brown reaction product was dispersed in the cytoplasm of the epithelial cells. These observations demonstrate the presence of an immunoreactive FSH-like peptide in the human prostate. The significance of FSH in the aetiopathology of prostatic disorders is discussed.
High-resolution continuum source graphite furnace molecular absorption spectrometry for the monitoring of Sr isotopes via SrF formation: a case study
High-resolution continuum source graphite furnace molecular absorption spectrometry (HR CS GFMAS) can provide isotopic information under certain conditions, thus broadening its field of application. However, to date, only elements with two major stable isotopes have been monitored via this technique. In this work, the possibilities of HR CS GFMAS to determine isotope ratios of elements with more than two stable isotopes are evaluated for the first time. For this purpose, Sr was chosen as the analyte and SrF as the target species, so four different signals corresponding to the four stable Sr isotopes (88Sr, 87Sr, 86Sr and 84Sr) should be distinguished. Nevertheless, due to the number of strontium isotopes, the shape of the peaks, and the resolution that the instrument exhibits in the spectral window, isotopic signals overlap, thus leading to potentially biased results. To circumvent this issue, a deconvolution protocol, consisting of measuring and correcting for the contribution of each isotope to the signals of the rest, was developed. These contributions were calculated as the signal ratio between the absorbance of the monoisotopic profile at the wavelengths where the maxima of other isotopes are expected and at its own maximum. Therefore, the interference can be simply subtracted from the net signal registered for the interfered isotope. The performance of this method was demonstrated for both naturally abundant and isotope-enriched Sr standards, paving the way for future applications in this field. Analysis of a real sample (tap water) spiked with an 84Sr solution is also demonstrated.
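The correction step described in the abstract can be pictured numerically. In the sketch below, the overlap coefficients (the ratio of each monoisotopic profile's absorbance at the other isotopes' peak wavelengths to its own peak absorbance) sit in a small matrix, and each measured peak is corrected by subtracting its neighbours' estimated contributions. All numeric values are invented for illustration and are not the paper's data.

```python
# k[i][j]: relative contribution of isotope j at isotope i's peak wavelength,
# as would be measured from monoisotopic standards (diagonal = 1).
# All values below are hypothetical, chosen only to show the arithmetic.
isotopes = ["88Sr", "87Sr", "86Sr", "84Sr"]
k = [
    [1.00, 0.30, 0.05, 0.00],
    [0.25, 1.00, 0.28, 0.01],
    [0.04, 0.22, 1.00, 0.06],
    [0.00, 0.01, 0.05, 1.00],
]
measured = [0.900, 0.350, 0.160, 0.020]  # mixed peak absorbances (invented)

# Single-pass correction, as the abstract describes: subtract each other
# isotope's estimated overlap from the net signal at this isotope's peak.
corrected = [
    m - sum(k[i][j] * measured[j] for j in range(4) if j != i)
    for i, m in enumerate(measured)
]

for name, c in zip(isotopes, corrected):
    print(f"{name}: corrected absorbance {c:.4f}")
```

In practice the corrected absorbances would then be converted to isotope ratios; a full treatment could also solve the 4x4 linear system exactly rather than applying one subtraction pass.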
Conventional and advanced echocardiographic assessment of systolic function in dogs sedated with dexmedetomidine or acepromazine
Version of Record: https://www.sciencedirect.com/science/article/pii/S003452882100285X
Dexmedetomidine and acepromazine, sedatives commonly used in dogs, have opposite vascular effects, resulting in afterload increase and decrease, respectively. This could variably affect systolic myocardial function. Previous echocardiographic studies assessing the cardiovascular effects of these drugs used conventional echocardiography, while advanced techniques such as speckle tracking echocardiography (STE) and tissue Doppler imaging (TDI), which are known to provide a more accurate assessment of systolic function, have rarely been used for this aim. Moreover, in the few studies using advanced techniques, the drugs were combined with opioids. Therefore, the main objective of this prospective study was to assess systolic myocardial function by conventional and advanced echocardiography (STE and TDI) in dogs sedated exclusively with dexmedetomidine or acepromazine, not combined with other drugs. Twenty healthy dogs were randomly divided into two groups, Group A (acepromazine, 20 μg/kg IM) and Group D (dexmedetomidine, 5 μg/kg IM); cardiovascular parameters were assessed before sedation (T0) and thirty minutes afterwards (T1). Systolic arterial pressure and heart rate decreased in both groups at T1 as compared to T0. Only one conventional echocardiographic raw variable (left ventricular internal dimension in systole) and three out of five advanced echocardiographic variables (radial TDI systolic velocities at the epicardial region of the left ventricular free wall, longitudinal TDI systolic velocities of the septal mitral valve annulus, and the STE-derived left ventricular global strain) were affected in Group D. A systolic impairment was observed in Group D and was better estimated by advanced echocardiography. In Group A, only the end-diastolic volume index (conventional echocardiography) was decreased. Both protocols seem to induce echocardiographic changes more likely secondary to their vascular action.
A novel approach for adapting the standard addition method to single particle-ICP-MS for the accurate determination of NP size and number concentration in complex matrices
This paper presents a novel approach, based on the standard addition method, for overcoming the matrix effects that often hamper the accurate characterization of nanoparticles (NPs) in complex samples via single particle inductively coupled plasma mass spectrometry (SP-ICP-MS). In this approach, calibration of the particle size is performed by two different methods: (i) by spiking the sample with a suspension of NP standards of known size containing the analyte, or (ii) by spiking the sample with ionic standards; either way, the measured sensitivity is used in combination with the transport efficiency (TE) for sizing the NPs. Moreover, the transport efficiency can be readily derived from the data produced by both calibration methods, so that the particle number concentration can also be determined. The addition of both ionic and NP standards can be performed on-line, using a T-piece with two inlet lines of different dimensions; the smaller of the two is used for the standards, thus ensuring a constant and minimal sample dilution. As a result of spiking the samples, mixed histograms including the signal of the sample and that of the standards are obtained. However, the use of signal deconvolution approaches permits the information to be extracted even when signal populations overlap. As a proof of concept, characterization of a 50 nm AuNP suspension prepared in three different media (i.e., deionized water, 5% ethanol, and 2.5% tetramethylammonium hydroxide, TMAH) was carried out. Accurate results were obtained in all cases, in spite of the matrix effects detected in some media. Overall, the proposed approach offers flexibility, so it can be adapted to different situations, but it may be especially indicated for samples whose matrix is not fully known and/or for which dilution is not possible or recommended.
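The standard-addition principle underlying the approach can be sketched in a few lines: the sample is spiked with increasing amounts of an ionic standard, the response is fit to a line, the slope is the matrix-matched sensitivity, and the x-intercept recovers the original analyte concentration. The numbers below are invented for illustration, and the paper's on-line spiking and transport-efficiency steps are not reproduced here.

```python
# Hypothetical standard-addition data: amounts of ionic standard added to
# aliquots of the same sample, and the detector response for each.
added = [0.0, 1.0, 2.0, 3.0]        # concentration of standard added (ug/L)
signal = [0.40, 0.65, 0.90, 1.15]   # detector response (arbitrary units)

# Ordinary least-squares fit of signal vs. added concentration.
n = len(added)
mean_x = sum(added) / n
mean_y = sum(signal) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(added, signal)) \
        / sum((x - mean_x) ** 2 for x in added)
intercept = mean_y - slope * mean_x

# Slope = sensitivity measured in the sample's own matrix;
# magnitude of the x-intercept = original analyte concentration.
sensitivity = slope
c_sample = intercept / slope

print(f"sensitivity = {sensitivity:.3f} units per ug/L")
print(f"sample concentration = {c_sample:.2f} ug/L")
```

Because the calibration happens inside the sample's own matrix, the fitted sensitivity already carries the matrix effect, which is what lets the SP-ICP-MS approach size particles accurately without knowing the matrix in advance.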
Magnetic anisotropy, first-order-like metamagnetic transitions and large negative magnetoresistance in the single crystal of Gd2PdSi3
Electrical resistivity (ρ), magnetoresistance (MR), magnetization, thermopower and Hall effect measurements on the single crystal Gd2PdSi3, crystallizing in an AlB2-derived hexagonal structure, are reported. The well-defined minimum in ρ at a temperature above the Néel temperature (T_N = 21 K) and the large negative MR below 3 T, reported earlier for the polycrystals, are reproducible even in single crystals. Such features are generally uncharacteristic of Gd alloys. In addition, we also found interesting features in other data, e.g., two-step first-order-like metamagnetic transitions for the magnetic field along the [0001] direction. The alloy exhibits anisotropy in all these properties, though Gd is an S-state ion. (Comment: RevTeX, 5 pages, 6 encapsulated postscript figures; scheduled to be published in Phys. Rev. B, 01 November 1999)