    New steroidal derivatives synthesized from 3β-hydroxyandrosten-17-one

    In this study, we synthesized some new substituted steroidal derivatives using 3β-hydroxyandrosten-17-one (dehydroepiandrosterone) as the starting material. The synthesized steroidal derivatives 1-11 were evaluated for their androgenic-anabolic activities, with testosterone as the positive control. Details of the synthesis, spectroscopic data, and toxicity (LD50) of the synthesized compounds are reported.

    Flat-Cored Dark Matter in Cuspy Clusters of Galaxies

    Sand, Treu, & Ellis (2002) have measured the central density profile of cluster MS2137-23 with gravitational lensing and velocity dispersion and removed the stellar contribution with a reasonable M/L. The resulting dark matter distribution within r<50 kpc was fitted by a density cusp of r^{-beta} with beta=0.35. This stands in apparent contradiction to the CDM prediction of beta~1, and the disagreement worsens if adiabatic compression of the dark matter by the infalling baryons is considered. Following El-Zant, Shlosman & Hoffman (2001), we argue that dynamical friction acting on galaxies moving within the dark matter background counters the effect of adiabatic compression by transferring the orbital energy of the galaxies to the dark matter, thus heating up and softening the central density cusp. Using N-body simulations of massive solid clumps moving in clusters, we show that indeed the inner dark matter distribution flattens (with beta approx 0.35 for a cluster like MS2137-23) as the galaxies spiral inward. We find as a robust result that while the dark matter distribution becomes core-like, the overall mass distribution preserves its cuspy nature, in agreement with X-ray and lensing observations of clusters. Comment: 7 pages, 3 figures, to be published in Astrophysical Journal Letters
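    For reference, the slope beta quoted above is the inner logarithmic slope of the dark matter density profile. A common parametrization consistent with this description (an assumption here, since the abstract does not state the exact fitting form used) is the generalized NFW profile:

        \rho(r) = \frac{\rho_s}{(r/r_s)^{\beta}\,\left(1 + r/r_s\right)^{3-\beta}},
        \qquad \rho(r) \propto r^{-\beta} \quad \text{for } r \ll r_s,

    so beta ~ 1 corresponds to the standard CDM (NFW) cusp, while beta ~ 0.35 corresponds to the flattened core measured for MS2137-23.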

    TB70: Physical and Chemical Changes Associated with the Development of the Lowbush Blueberry Fruit Vaccinium angustifolium Ait.

    The objective of this investigation was to determine the growth characteristics and the changes in soluble solids, pH, and titratable acidity, in order to define and describe stages in the growth of the blueberry fruit.

    B780: A Cost Analysis of Pruning Procedures in Lowbush Blueberry Production

    Burning fields with fuel oil is currently the most practical method of pruning blueberries, but it is costly and destructive to the organic material on the surface of the soil. Fuel oil is a nonrenewable resource that is rapidly increasing in cost and, in the future, may become less readily available for this use. The need to develop alternative means of pruning lowbush blueberries is evident. This bulletin compares the economics of six pruning procedures on operations of three sizes. The budgets are based on certain assumptions and costs which will change over time. The results will allow blueberry growers to compare procedures and determine which one is most economically feasible for their particular operation and its resources.

    Thermal Equilibrium as an Initial State for Quantum Computation by NMR

    We present a method of using a nuclear magnetic resonance (NMR) computer to solve the Deutsch-Jozsa problem in which: (1) the number of molecules in the NMR sample is irrelevant to the number of qubits available to an NMR quantum computer, and (2) the initial state is chosen to be the state of thermal equilibrium, thereby avoiding the preparation of pseudopure states and the resulting exponential loss of signal as the number of qubits increases. The algorithm is described along with its experimental implementation using four active qubits. As expected, the measured spectra demonstrate a clear distinction between constant and balanced functions. Comment: including 4 figures
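    To make the constant/balanced distinction concrete, here is a minimal sketch: a noiseless numpy state-vector simulation of the ideal phase-oracle Deutsch-Jozsa circuit, not the thermal-equilibrium NMR protocol described in the paper. The example functions passed to it are hypothetical placeholders.

    import numpy as np

    def hadamard_n(n):
        # n-fold tensor product of the single-qubit Hadamard gate
        H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
        Hn = np.array([[1.0]])
        for _ in range(n):
            Hn = np.kron(Hn, H)
        return Hn

    def deutsch_jozsa(f, n):
        # Phase-oracle form: start in |0...0>, apply H^n, the oracle (-1)^f(x),
        # then H^n again; the |0...0> amplitude is +-1 for constant f, 0 for balanced f.
        Hn = hadamard_n(n)
        state = np.zeros(2 ** n)
        state[0] = 1.0
        state = Hn @ state
        state = np.array([(-1.0) ** f(x) for x in range(2 ** n)]) * state
        state = Hn @ state
        return "constant" if abs(state[0]) > 0.5 else "balanced"

    # Four active qubits, as in the experiment described above.
    print(deutsch_jozsa(lambda x: 0, 4))                      # -> constant
    print(deutsch_jozsa(lambda x: bin(x).count("1") % 2, 4))  # -> balanced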

    Geometry of Discrete Quantum Computing

    Conventional quantum computing entails a geometry based on the description of an n-qubit state using 2^{n} infinite-precision complex numbers denoting a vector in a Hilbert space. Such numbers are in general uncomputable using any real-world resources, and, if we take physical law to be some kind of computational algorithm of the universe, we would be compelled to alter our descriptions of physics to be consistent with computable numbers. Our purpose here is to examine the geometric implications of using finite fields Fp and finite complexified fields Fp^2 (based on primes p congruent to 3 mod 4) as the basis for computations in a theory of discrete quantum computing, which would therefore become a computable theory. Because the states of a discrete n-qubit system are in principle enumerable, we are able to determine the proportions of entangled and unentangled states. In particular, we extend the Hopf fibration that defines the irreducible state space of conventional continuous n-qubit theories (which is the complex projective space CP^{2^{n}-1}) to an analogous discrete geometry in which the Hopf circle for any n is found to be a discrete set of p+1 points. The tally of unit-length n-qubit states is given, and reduced via the generalized Hopf fibration to DCP^{2^{n}-1}, the discrete analog of the complex projective space, which has p^{2^{n}-1} (p-1) \prod_{k=1}^{n-1} (p^{2^{k}}+1) irreducible states. Using a measure of entanglement, the purity, we explore the entanglement features of discrete quantum states and find that the n-qubit states based on the complexified field Fp^2 have p^{n} (p-1)^{n} unentangled states (the product of the tally for a single qubit) with purity 1, and p^{n+1} (p-1) (p+1)^{n-1} maximally entangled states with purity zero. Comment: 24 pages
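    As a quick illustration, the sketch below simply evaluates the counting formulas quoted above for small p and n; it is not code from the paper, and the function name state_tallies is an arbitrary choice.

    def state_tallies(p, n):
        # Counting formulas quoted in the abstract for an n-qubit discrete
        # theory over F_{p^2}, p prime with p % 4 == 3.
        irreducible = p ** (2 ** n - 1) * (p - 1)
        for k in range(1, n):
            irreducible *= p ** (2 ** k) + 1          # prod_{k=1}^{n-1} (p^{2^k} + 1)
        unentangled = p ** n * (p - 1) ** n           # purity 1
        maximally_entangled = p ** (n + 1) * (p - 1) * (p + 1) ** (n - 1)  # purity 0
        hopf_circle_points = p + 1                    # points on each discrete Hopf circle
        return irreducible, unentangled, maximally_entangled, hopf_circle_points

    print(state_tallies(p=3, n=2))   # smallest admissible prime, two qubits
    print(state_tallies(p=7, n=3))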

    Capacity Market for Distribution System Operator – with Reliability Transactions – Considering Critical Loads and Microgrids

    Conventional distribution system (DS) asset planning methods consider energy only from transmission systems (TS) and not from distributed energy resources (DER), leading to expensive plans. Newer transactive energy DS (TEDS) asset planning models, built on capacity market mechanisms, consider energy from both the TS and DERs, leading to lower-cost plans that maximize social welfare. However, in both methods the cost of the higher reliability required by some users is socialized across all users, lowering social welfare. In this paper, a novel transactive energy capacity market (TECM) model is proposed for DS asset planning. It builds on TEDS incremental capacity auction models by allowing critical loads to bid for, and receive, superior reliability as a service. The TECM model considers these reliability transactions in addition to energy-selling transactions from the TS and DERs, energy-buying transactions from loads, and asset-upgrade transactions from the network operator. The model also allows islanded microgrids and network reconfiguration to maximize social welfare. The TECM model is assessed on several case studies, demonstrating that it achieves higher social welfare and a lower plan cost.
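    Since the abstract frames the market as a set of buy, sell, and upgrade transactions cleared to maximize social welfare, a toy single-period merit-order clearing is sketched below. It is only an illustration of that general idea, not the TECM formulation of the paper; all participant names, quantities, and prices are hypothetical.

    def clear_market(offers, bids):
        # Toy single-period capacity clearing by merit order.
        # offers: (name, MW, $/MW) supply offers (TS imports, DERs, upgrades).
        # bids:   (name, MW, $/MW) demand bids (critical loads bid a premium).
        offers = sorted(offers, key=lambda o: o[2])            # cheapest supply first
        bids = sorted(bids, key=lambda b: b[2], reverse=True)  # highest-value demand first
        welfare, cleared = 0.0, 0.0
        i = j = 0
        taken_offer = taken_bid = 0.0
        while i < len(offers) and j < len(bids) and bids[j][2] >= offers[i][2]:
            q = min(offers[i][1] - taken_offer, bids[j][1] - taken_bid)
            welfare += q * (bids[j][2] - offers[i][2])         # bid value minus offer cost
            cleared += q
            taken_offer += q
            taken_bid += q
            if taken_offer >= offers[i][1]:
                i, taken_offer = i + 1, 0.0
            if taken_bid >= bids[j][1]:
                j, taken_bid = j + 1, 0.0
        return cleared, welfare

    # Hypothetical data: a critical load bids a premium for firm capacity.
    offers = [("TS import", 10.0, 40.0), ("microgrid DER", 4.0, 55.0), ("feeder upgrade", 6.0, 70.0)]
    bids = [("critical load", 3.0, 120.0), ("flexible load", 8.0, 60.0)]
    print(clear_market(offers, bids))   # cleared MW and social welfare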

    A framework for automatic semantic video annotation

    The rapidly increasing quantity of publicly available videos has driven research into developing automatic tools for indexing, rating, searching and retrieval. Textual semantic representations, such as tagging, labelling and annotation, are often important factors in the process of indexing any video, because they represent the semantics in a form that is convenient for search and retrieval. Ideally, this annotation should be inspired by the way humans perceive and describe videos. The difference between the low-level visual contents and the corresponding human perception is referred to as the ‘semantic gap’. Tackling this gap is even harder in the case of unconstrained videos, mainly due to the lack of any prior information about the analyzed video on the one hand, and the huge amount of generic knowledge required on the other. This paper introduces a framework for the automatic semantic annotation of unconstrained videos. The proposed framework utilizes two non-domain-specific layers: low-level visual similarity matching, and an annotation analysis that employs commonsense knowledge bases. A commonsense ontology is created by incorporating multiple structured semantic relationships. Experiments and black-box tests are carried out on standard video databases for action recognition and video information retrieval. White-box tests examine the performance of the individual intermediate layers of the framework, and the evaluation of the results and the statistical analysis show that integrating visual similarity matching with commonsense semantic relationships provides an effective approach to automated video annotation.
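    To illustrate the two-layer idea (visual similarity matching followed by commonsense expansion of the borrowed annotations), here is a minimal sketch under assumed inputs; the feature vectors, tags, and commonsense relations are hypothetical placeholders, not the paper's knowledge base or ontology.

    import numpy as np

    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def annotate(query_feat, reference_db, commonsense, top_k=2):
        # Layer 1: rank annotated reference clips by low-level visual similarity.
        ranked = sorted(reference_db, key=lambda r: cosine(query_feat, r["feat"]), reverse=True)
        tags = {t for r in ranked[:top_k] for t in r["tags"]}
        # Layer 2: expand the borrowed tags through commonsense relationships.
        expanded = set(tags)
        for t in tags:
            expanded.update(commonsense.get(t, []))
        return expanded

    # Hypothetical reference clips with precomputed visual features and tags.
    reference_db = [
        {"feat": np.array([0.9, 0.1, 0.0]), "tags": ["running"]},
        {"feat": np.array([0.1, 0.8, 0.2]), "tags": ["swimming"]},
        {"feat": np.array([0.8, 0.2, 0.1]), "tags": ["jogging", "park"]},
    ]
    # Toy commonsense relations standing in for a structured knowledge base.
    commonsense = {"running": ["exercise", "outdoor"], "jogging": ["exercise"], "park": ["outdoor"]}
    print(annotate(np.array([0.85, 0.15, 0.05]), reference_db, commonsense))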