
    Managing the Muses: Musical Performance and Modernity in the Public Schools of Late-Nineteenth Century Toronto

    This paper employs the lens of sensory historical analysis to examine public school music in the making of a modern middle class in late-Victorian Toronto. Its aim is to show how this subject both shaped and was shaped by the culture of modernity that increasingly pervaded large urban centres such as Toronto over the course of the nineteenth century. The paper goes beyond pedagogic and bureaucratic justifications to present the evolution of school music within a much broader acoustic framework, weaving it into the increasingly urban soundtrack of Toronto to gain some appreciation of how it would have been heard and understood at the time. It offers historians of education a sense of what actually occurred in the classrooms of Toronto during the period by listening to these experiences and to the acoustic environment in which they would have been understood.

    The Entrepreneur's Choice: Venture Capital Debt Financing with Adverse Selection

    This paper studies the consequences of using a debt contract to raise venture capital for an entrepreneurial project in an adverse-selection setting with venture capitalists of differing quality. The paper considers not only that the likelihood of success of a one-time project depends on the quality of the venture capitalist, but also the problem of a reduced ownership value of future rents from the venture if the venture capitalist takes it over following the entrepreneur's default. Expressions for the face value of debt required for pooling and separating equilibria are also derived. The existence of a separating equilibrium with bad-quality venture capitalists is used to show how less reputable venture capitalists can survive in the marketplace. Finally, the paper uses a numerical example to demonstrate why the entrepreneurs of more profitable entrepreneurial firms may prefer to do business with bad-quality venture capitalists.
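
    A minimal numerical sketch of the debt-contract bookkeeping may help (in Python; every symbol and number below is an illustrative assumption, not the paper's model): the break-even face value of debt rises as the venture capitalist's quality, and hence the project's success probability, falls, and the entrepreneur's expected payoff depends on both the face value and the continuation value forfeited on default.

```python
# Illustrative sketch only -- not the paper's model. A one-period debt contract:
# the project succeeds with probability p (depending on VC quality) and the
# financier recovers nothing on default.

def break_even_face_value(p_success: float, investment: float) -> float:
    """Face value F such that p_success * F = investment (zero risk-free rate)."""
    return investment / p_success

I = 1.0                                  # capital the entrepreneur raises
F_good = break_even_face_value(0.8, I)   # high-quality VC: F = 1.25
F_bad = break_even_face_value(0.5, I)    # low-quality VC:  F = 2.00

def entrepreneur_payoff(p: float, R: float, F: float, V: float) -> float:
    """Expected payoff when success pays R, debt F is repaid, and the
    continuation value V of future rents is kept only if default is avoided."""
    return p * (R - F + V)

print(F_good, F_bad)                          # 1.25  2.0
print(entrepreneur_payoff(0.8, 3.0, F_good, 1.0))   # 2.2 with the good VC
```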

    The Staging of Venture Equity Capital and Venture Capitalist Bargaining Power

    In this paper we examine the effects of bargaining power on the types of entrepreneurial projects chosen by venture capitalists and show that a wealth-constrained venture capitalist prefers to provide equity financing to a two-stage rather than to a similar single-stage project. While the venture capitalist has no bargaining power over the entrepreneur of a single-stage project and is thus unable to extract any surplus, the venture capitalist does have this advantage in a two-stage project and, provided the project is good, can demand a portion of the surplus as a precondition for providing follow-on capital. This suggests that venture capitalists should stage their capital investments in order to improve their bargaining power, allowing them to earn greater profits from successful entrepreneurial projects.
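
    A back-of-envelope sketch of the staging logic (all numbers and the surplus share are assumptions for illustration, not taken from the paper): committing capital in two stages lets the venture capitalist condition the follow-on investment on a share of the surplus.

```python
# Illustrative numbers only -- not the paper's model.
p = 0.6            # probability the project turns out to be good
S = 10.0           # surplus a good project ultimately generates
I1, I2 = 2.0, 2.0  # first- and second-stage capital requirements
alpha = 0.5        # surplus share the VC can demand before providing follow-on funds

# Single-stage: all capital up front, no hold-up power; the VC is repaid its
# investment only when the project succeeds and extracts none of the surplus.
vc_single = p * (I1 + I2) - (I1 + I2)    # -1.6: not worth financing

# Two-stage: invest I1, observe quality, and fund stage two only if the project
# is good -- demanding a share alpha of the surplus as the price of I2.
vc_staged = p * (alpha * S + I1) - I1    # +2.2: staging creates expected profit

print(f"single-stage expected profit: {vc_single:+.1f}")
print(f"two-stage expected profit:    {vc_staged:+.1f}")
```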

    Some aspects of the construction and implementation of error-correcting linear codes

    From the Conclusion: The study of error-correcting codes is now approximately 25 years old. The first known publication on the subject was in 1949 by M. Golay, who later did much research into the subject of perfect codes; it has recently been established that all the perfect codes are known. R.W. Hamming presented his perfect single-error-correcting codes in 1950, in an article in the Bell System Technical Journal. These codes turned out to be a special case of the powerful Bose-Chaudhuri codes, which were discovered around 1960. Various work was done on the theory of minimal redundancy of codes for a given error-correcting performance by Plotkin, Gilbert, Varshamov and others between 1950 and 1960. The binary BCH codes were found to be so close to the theoretical bounds that, to date, no better codes have been discovered. Although the BCH codes are extremely efficient in terms of the ratio of information to check digits, they are not easily decoded with a minimal amount of apparatus. Peterson in 1961 described an algorithm for decoding BCH codes, but this was cumbersome compared with the majority-logic methods of Massey and others. Thus the search began for codes which are easily decoded with comparatively simple apparatus. The finite geometry codes described by Rudolph in a 1964 thesis are examples of codes which are easily decoded by a small number of steps of majority logic. The simplicial codes of Saltzer are even better in this respect, since they can be decoded by a single step of majority logic, but they are rather inefficient.

    The applications of coding theory have changed over the years as well. The first computers were huge circuits of relays, which were unreliable and prone to errors, so error-correcting codes were required to minimise the possibility of incorrect results. As vacuum tubes and later transistorised circuits made computers more reliable, the need for sophisticated and powerful codes in the computer world diminished. Other uses presented themselves, however: for example, the control systems of unmanned spacecraft, where the difficulty of sending and receiving messages demanded very powerful codes. Other uses were found in transmission lines and telephone exchanges.

    The codes considered in this dissertation have, for the most part, been block codes for use on the binary symmetric channel. There are, however, several other applications, such as codes for use on an erasure channel, where bits are corrupted so as to be unrecognizable rather than changed. There are also codes for burst-error correction, where channel noise is not randomly distributed but occurs in "bursts" a few bits long. Certain cyclic codes are applicable in these cases. The theory of error-correcting codes has risen from virtual non-existence in 1950 to a major and sophisticated part of communication theory; judging from the articles in journals, it promises to be the subject of a great deal of research for some years to come.
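
    Since the conclusion leans on Hamming's 1950 construction, a minimal sketch of the (7,4) single-error-correcting Hamming code may be useful (an illustrative implementation in Python, not drawn from the dissertation itself). Parity bits sit at positions 1, 2 and 4, so the syndrome, read as a binary number, points directly at any single flipped bit.

```python
import numpy as np

G = np.array([[1,1,0,1],   # generator matrix: rows 1,2,4 compute the parity
              [1,0,1,1],   # bits, the remaining rows copy the 4 data bits
              [1,0,0,0],
              [0,1,1,1],
              [0,1,0,0],
              [0,0,1,0],
              [0,0,0,1]])
H = np.array([[1,0,1,0,1,0,1],   # parity-check matrix: column i is the
              [0,1,1,0,0,1,1],   # binary representation of position i+1
              [0,0,0,1,1,1,1]])

def encode(data4):
    return G @ np.asarray(data4) % 2

def decode(word7):
    word = np.asarray(word7).copy()
    syndrome = H @ word % 2
    pos = int(syndrome[0] + 2 * syndrome[1] + 4 * syndrome[2])  # 0 => no error
    if pos:
        word[pos - 1] ^= 1          # correct the single flipped bit
    return word[[2, 4, 5, 6]]       # extract the 4 data bits

msg = [1, 0, 1, 1]
codeword = encode(msg)
codeword[4] ^= 1                    # flip one bit in the channel
assert list(decode(codeword)) == msg
```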

    Entrepreneurial Stock Brokering and Switching Costs

    Stock brokers are entrepreneurs who incur switching costs when they change brokerage houses. We use Helsinki Stock Exchange data to investigate these costs by examining whether investors are loyal to their brokers when brokers move. We find that new houses with which investors have extant relationships are more likely to attract those investors from the old houses, and that savvy (knowledgeable) investors are more likely to stay with their broker.

    All Transients, All the Time: Real-Time Radio Transient Detection with Interferometric Closure Quantities

    We demonstrate a new technique for detecting radio transients based on interferometric closure quantities. The technique uses the bispectrum, the product of visibilities around a closed loop of baselines of an interferometer. The bispectrum is calibration independent, resistant to interference, and computationally efficient, so it can be built into correlators for real-time transient detection. Our technique could find celestial transients anywhere in the field of view and localize them to arcsecond precision. At the Karl G. Jansky Very Large Array (VLA), such a system would have a high survey speed and a 5-sigma sensitivity of 38 mJy on 10 ms timescales with 1 GHz of bandwidth. The ability to localize dispersed millisecond pulses to arcsecond precision in large volumes of interferometer data has several unique science applications. Localizing individual pulses from Galactic pulsars will help find X-ray counterparts that define their physical properties, while finding host galaxies of extragalactic transients will measure the electron density of the intergalactic medium with a single dispersed pulse. Exoplanets and active stars have distinct millisecond variability that can be used to identify them and probe their magnetospheres. We use millisecond time scale visibilities from the Allen Telescope Array (ATA) and VLA to show that the bispectrum can detect dispersed pulses and reject local interference. The computational and data efficiency of the bispectrum will help find transients on a range of time scales with next-generation radio interferometers. Comment: Accepted to ApJ. 8 pages, 5 figures, 2 tables. Revised to include discussion of non-Gaussian statistics of the technique.
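
    A minimal numerical sketch of the closure idea (toy data, not the paper's pipeline): the bispectrum multiplies visibilities around a closed triangle of baselines, so antenna-based gain errors cancel and a bright pulse stands out without any calibration.

```python
import numpy as np

rng = np.random.default_rng(0)
nant, nt = 27, 1000                       # antennas (VLA-like), time samples

# Visibility on baseline (i,j): gains g_i * conj(g_j) times the sky term,
# plus thermal noise. A point source at the phase centre has unit visibility.
g = np.exp(1j * rng.uniform(0, 2 * np.pi, nant))  # unknown antenna phases
source = np.zeros(nt)
source[500] = 5.0                         # a one-sample transient

def vis(i, j):
    noise = (rng.normal(size=nt) + 1j * rng.normal(size=nt)) / np.sqrt(2)
    return g[i] * np.conj(g[j]) * source + noise

# Bispectrum on the triangle (0,1,2): the gain phases cancel around the loop,
# (g0 g1*)(g1 g2*)(g2 g0*) = |g0 g1 g2|^2 = 1 for unit-modulus gains.
bispec = vis(0, 1) * vis(1, 2) * vis(2, 0)

# Averaging the real part over many triangles beats down the noise; even a
# single triangle recovers the pulse in this toy example.
print(np.argmax(bispec.real))             # expected -> 500, the transient sample
```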

    Volatility Clustering and the Bid-ask Spread: Exchange Rate Behaviour in Early Renaissance Florence

    This paper investigates the nature and behavior of the domestic (local) currency market that existed in Florence (Italy) during the late 14th and early 15th centuries (a.k.a. the Early Renaissance). We find that the extant volatility and microstructure models developed for modern asset markets are able to describe the statistical volatility properties observed for the denaro-florin exchange rate. Volatility is clustered and is related to the bid-ask spread. This supports the notion that, although there are huge social, industrial and technological differences between capitalism then and now, individuals trading financial assets in an organized venue behave in a similar manner.
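
    As an illustration of the clustering the paper documents, here is a short simulation of a GARCH(1,1) process, a standard modern volatility-clustering model (parameters are assumed; no Florentine data are used): squared returns are autocorrelated even though returns themselves are nearly uncorrelated.

```python
import numpy as np

rng = np.random.default_rng(1)
omega, alpha, beta = 0.05, 0.10, 0.85       # assumed GARCH(1,1) parameters
n = 2000
r = np.zeros(n)                             # returns
h = np.full(n, omega / (1 - alpha - beta))  # conditional variance

for t in range(1, n):
    h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
    r[t] = np.sqrt(h[t]) * rng.normal()

def acf1(x):
    """Lag-1 autocorrelation."""
    x = x - x.mean()
    return (x[1:] * x[:-1]).sum() / (x * x).sum()

print(f"lag-1 autocorrelation of r:   {acf1(r):+.3f}")     # ~ 0
print(f"lag-1 autocorrelation of r^2: {acf1(r**2):+.3f}")  # clearly positive
```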

    Acceptability of HIV Testing Sites Among Rural and Urban African Americans Who Use Cocaine

    African Americans (AAs) who use cocaine in the Southern region of the U.S. have a relatively high risk of HIV and need for HIV testing. Among this group, those residing in rural areas may have less favorable opinions about common HIV testing sites, which could inhibit HIV testing. We examined rural/urban variations in their acceptability of multiple HIV testing sites (private physician clinic, local health department, community health center, community HIV fair, hospital emergency department, blood plasma donation center, drug abuse treatment facility, and mobile van or community outreach worker). Results from partial proportional odds and logistic regression analyses indicate that rural AAs who use cocaine have lower odds of viewing local health departments (OR = 0.09, 95 % CI = 0.03–0.21), physician offices (OR = 0.19, 95 % CI = 0.09–0.42), and drug use treatment centers (OR = 0.49, 95 % CI = 0.30–0.80) as acceptable relative to their urban counterparts. The findings have implications for further targeting HIV testing toward AAs who use cocaine, particularly those residing in the rural South.
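
    As a back-of-envelope aid to reading the reported odds ratios (the urban baseline probabilities below are assumed for illustration, not taken from the study), an OR of 0.09 implies a large drop in the probability of rating a site acceptable at any plausible baseline:

```python
def apply_odds_ratio(p_baseline: float, odds_ratio: float) -> float:
    """Convert a baseline probability through an odds ratio."""
    odds = p_baseline / (1 - p_baseline) * odds_ratio
    return odds / (1 + odds)

for p_urban in (0.5, 0.7, 0.9):
    p_rural = apply_odds_ratio(p_urban, 0.09)   # local health department OR
    print(f"urban {p_urban:.0%} acceptable -> rural {p_rural:.0%}")
# urban 50% -> rural 8%; urban 70% -> rural 17%; urban 90% -> rural 45%
```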

    CSO and CARMA Observations of L1157. II. Chemical Complexity in the Shocked Outflow

    L1157, a molecular dark cloud with an embedded Class 0 protostar possessing a bipolar outflow, is an excellent source for studying shock chemistry, including grain-surface chemistry prior to shocks and post-shock, gas-phase processing. The L1157-B1 and B2 positions experienced shocks an estimated ~2000 and 4000 years ago, respectively. Prior to these shock events, temperatures were too low for most complex organic molecules to undergo thermal desorption. Thus, the shocks should have liberated these molecules from the ice grain surfaces en masse, as evidenced by prior observations of SiO and multiple grain-mantle species commonly associated with shocks. Grain species, such as OCS, CH3OH, and HNCO, all peak at different positions relative to species that are preferentially formed in higher-velocity shocks or repeatedly shocked material, such as SiO and HCN. Here, we present high spatial resolution (~3") maps of CH3OH, HNCO, HCN, and HCO+ in the southern portion of the outflow containing B1 and B2, as observed with CARMA. The HNCO maps are the first interferometric observations of this species in L1157. The maps show distinct differences in the chemistry within the various shocked regions in L1157B. This is further supported through constraints on the molecular abundances using the non-LTE code RADEX (van der Tak et al. 2007). We find the east/west chemical differentiation in C2 may be explained by the contrast of the shock's interaction with either cold, pristine material or warm, previously shocked gas, as seen in enhanced HCN abundances. In addition, the enhancement of the HNCO abundance toward the older shock, B2, suggests the importance of high-temperature O-chemistry in shocked regions. Comment: Accepted for publication in the Astrophysical Journal.
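
    For readers unfamiliar with the abundance bookkeeping, a simplified optically thin LTE column-density estimate is sketched below; the paper itself uses the non-LTE code RADEX, and every number here is a placeholder rather than an L1157 measurement.

```python
import numpy as np

k = 1.380649e-16      # Boltzmann constant, erg/K
h = 6.62607015e-27    # Planck constant, erg s
c = 2.99792458e10     # speed of light, cm/s

def column_density(nu, A_ul, g_u, E_u, W, Q, T_ex):
    """Total column density (cm^-2) from integrated intensity W (K cm/s),
    assuming optically thin LTE emission at excitation temperature T_ex (K)."""
    N_u = 8 * np.pi * k * nu**2 / (h * c**3 * A_ul) * W   # upper-level column
    return N_u * (Q / g_u) * np.exp(E_u / T_ex)           # Boltzmann correction

# Hypothetical numbers for a CH3OH-like line, purely to show the bookkeeping:
nu = 97.0e9           # line frequency, Hz
A_ul = 2.6e-5         # Einstein A coefficient, s^-1
g_u, E_u = 5, 20.0    # statistical weight; upper-level energy in K
W = 3.0 * 1.0e5       # 3 K km/s converted to K cm/s
Q, T_ex = 50.0, 15.0  # partition function and excitation temperature (assumed)
print(f"N ~ {column_density(nu, A_ul, g_u, E_u, W, Q, T_ex):.2e} cm^-2")
```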

    Differential Acute Impacts of Sleeve Gastrectomy, Roux-en-Y Gastric Bypass Surgery and Matched Caloric Restriction Diet on Insulin Secretion, Insulin Effectiveness and Non-Esterified Fatty Acid Levels Among Patients with Type 2 Diabetes.

    BACKGROUND: Bariatric surgery is an increasingly common option for control of type 2 diabetes (T2D) and obesity. The mechanisms underlying the rapid improvement of T2D after different types of bariatric surgery are not clear; caloric deprivation and altered levels of non-esterified fatty acids (NEFA) have been proposed. This study examines how sleeve gastrectomy (SG), Roux-en-Y gastric bypass (GBP) or a matched hypocaloric diet (DT) achieves improvements in T2D by characterising components of glucose metabolism and NEFA levels before and 3 days after each intervention. METHODS: Plasma samples at five time points during an oral glucose tolerance test (OGTT) from subjects with T2D undergoing GBP (N = 11) or SG (N = 12) were analysed for C-peptide, insulin and glucose before surgery and 3 days post-intervention, or after DT (N = 5). Fasting palmitic, linoleic, oleic and stearic acid were measured. C-peptide measurements were used to model the insulin secretion rate (ISR) using deconvolution. RESULTS: Subjects who underwent GBP surgery experienced the greatest improvement in glycaemia (median reduction in blood glucose from basal of 29 % [IQR -57, -18]) and the greatest reduction in all NEFA measured. SG achieved improvement in glycaemia with a lower ISR and a reduction in all but palmitoleic acid. DT subjects achieved improvement in glycaemia with an increase in ISR of 105 % [IQR 20, 220] and in stearic acid. CONCLUSIONS: GBP, SG and DT each improve glucose metabolism through different effects on pancreatic beta-cell function, insulin sensitivity and free fatty acids.
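
    To make the ISR methodology concrete, here is a minimal sketch of deconvolution-based secretion estimation (a toy one-compartment kernel with an assumed half-life, not the validated C-peptide kinetics model used in such studies): measured C-peptide is modelled as the convolution of ISR with a clearance kernel, and ISR is recovered by regularized least squares.

```python
import numpy as np

dt = 5.0                                  # sampling interval, minutes
t = np.arange(0, 120, dt)                 # a 2-hour OGTT window
n = len(t)
kernel = np.exp(-np.log(2) * t / 30.0)    # assumed 30-min C-peptide half-life

# Lower-triangular discrete convolution matrix: cpep = K @ isr
K = np.array([[kernel[i - j] if i >= j else 0.0 for j in range(n)]
              for i in range(n)]) * dt

isr_true = 1.0 + 2.0 * np.exp(-0.5 * ((t - 40) / 15) ** 2)   # secretion burst
cpep = K @ isr_true + np.random.default_rng(2).normal(0, 0.05, n)

# Tikhonov-regularized inverse: minimize ||K x - cpep||^2 + lam ||x||^2
lam = 1e-2
isr_est = np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ cpep)
print(np.round(isr_est[:6], 2))           # recovered secretion-rate samples
```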