
    The Impact Of Cultural Differences On The Effectiveness Of Advertisements On The Internet: A Comparison Among The United States, China, And Germany

    Hall and Hall (1990) classify German culture as a low-context culture, American culture as a medium/low-context culture, and Chinese culture as a high-context culture. In a low-context culture, the words carry most of the information needed, and there is little need to rely on the context of the events or message to help interpret its meaning. In contrast, in a high-context culture the context of the message is as important as, or even more important than, the words. This paper examines selected cultural differences among Germany, the United States, and China and the impact of these differences on various aspects of consumer behavior, focusing in particular on how cultural context affects the effectiveness of different styles of advertising.

    The IPAC Image Subtraction and Discovery Pipeline for the intermediate Palomar Transient Factory

    We describe the near real-time transient-source discovery engine for the intermediate Palomar Transient Factory (iPTF), currently in operation at the Infrared Processing and Analysis Center (IPAC), Caltech. We coin this system the IPAC/iPTF Discovery Engine (or IDE). We review the algorithms used for PSF matching, image subtraction, detection, photometry, and machine-learned (ML) vetting of extracted transient candidates, as well as the performance of our ML classifier. For a limiting signal-to-noise ratio of 4 in relatively unconfused regions, "bogus" candidates from processing artifacts and imperfect image subtractions outnumber real transients by ~10:1; this ratio can be considerably higher for image data with inaccurate astrometric and/or PSF-matching solutions. Despite this occasionally high contamination rate, the ML classifier identifies real transients with an efficiency (or completeness) of ~97% for a maximum tolerable false-positive rate of 1% when classifying raw candidates. All subtraction-image metrics, source features, ML probability-based real-bogus scores, contextual metadata from other surveys, and possible associations with known Solar System objects are stored in a relational database for retrieval by the various science working groups. We review our efforts to mitigate false positives and our experience optimizing the overall system in response to the multitude of science projects underway with iPTF. Comment: 66 pages, 21 figures, 7 tables, accepted by PAS
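    As an illustration of the completeness/false-positive trade-off quoted above, the sketch below picks the loosest real-bogus score threshold whose false-positive rate stays at or below 1% and reports the resulting completeness. This is not the IDE's actual code; the function name, the toy score distributions, and the 10:1 bogus-to-real mix are illustrative assumptions.

        import numpy as np

        def completeness_at_fpr(scores, labels, max_fpr=0.01):
            """Loosest real-bogus score threshold whose false-positive rate
            stays at or below max_fpr, plus the completeness it achieves.

            scores : classifier scores in [0, 1]; higher means "more likely real"
            labels : 1 for real transients, 0 for bogus candidates
            """
            scores = np.asarray(scores, dtype=float)
            labels = np.asarray(labels, dtype=int)
            best = None
            # Scan thresholds from strict to loose; the FPR only grows as we loosen.
            for t in np.unique(scores)[::-1]:
                keep = scores >= t
                fpr = keep[labels == 0].mean()           # bogus wrongly kept
                if fpr > max_fpr:
                    break
                completeness = keep[labels == 1].mean()  # real correctly kept
                best = (t, completeness, fpr)
            return best

        # Toy candidate stream with roughly the 10:1 bogus-to-real mix quoted above.
        rng = np.random.default_rng(0)
        scores = np.concatenate([rng.beta(8, 2, 1000), rng.beta(2, 8, 10000)])
        labels = np.concatenate([np.ones(1000, int), np.zeros(10000, int)])
        print(completeness_at_fpr(scores, labels, max_fpr=0.01))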

    Processing Images from the Zwicky Transient Facility

    The Zwicky Transient Facility is a new robotic observing program in which a newly engineered 600-megapixel digital camera with an exceptionally large field of view, 47 square degrees, will be installed on the 48-inch Samuel Oschin Telescope at the Palomar Observatory. The camera will generate ~1 petabyte of raw image data over three years of operations. In parallel, new hardware and software systems are being developed to process these data in real time and to build a long-term archive for the processed products. The first public release of archived products is planned for early 2019 and will include processed images and astronomical-source catalogs of the northern sky in the g and r bands. Source catalogs based on two different methods will be generated for the archive: aperture photometry and point-spread-function fitting. Comment: 6 pages, 4 figures, submitted to RTSRE Proceedings (www.rtsre.org)
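    The aperture-photometry method mentioned above can be illustrated with a minimal sketch. This is not ZTF pipeline code; the aperture radii, background annulus, and toy Gaussian star are assumptions chosen only for the example.

        import numpy as np

        def aperture_flux(image, x0, y0, r_ap=5.0, r_in=8.0, r_out=12.0):
            """Sum pixel values in a circular aperture centred on (x0, y0),
            subtracting a local sky level estimated in a surrounding annulus."""
            yy, xx = np.indices(image.shape)
            r = np.hypot(xx - x0, yy - y0)
            in_aperture = r <= r_ap
            in_annulus = (r >= r_in) & (r <= r_out)
            sky_per_pixel = np.median(image[in_annulus])   # robust sky estimate
            return image[in_aperture].sum() - sky_per_pixel * in_aperture.sum()

        # Toy frame: a Gaussian "star" of total flux ~12,600 counts on a flat sky.
        y, x = np.mgrid[0:64, 0:64]
        star = 500.0 * np.exp(-((x - 32.0) ** 2 + (y - 32.0) ** 2) / (2 * 2.0 ** 2))
        frame = star + 10.0
        print(aperture_flux(frame, 32, 32))

    PSF fitting, the second catalog method, instead fits a model of the point-spread function to each source; the aperture sum above is the simpler of the two approaches.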

    The Impossibility of a Perfectly Competitive Labor Market

    Using the institutional theory of transaction cost, I demonstrate that the assumptions of the competitive labor market model are internally contradictory and lead to the conclusion that, on purely theoretical grounds, a perfectly competitive labor market is a logical impossibility. By extension, the familiar diagram of wage determination by supply and demand is also a logical impossibility, and the neoclassical labor demand curve is not a well-defined construct. The reason is that the perfectly competitive market model presumes zero transaction cost, and with zero transaction cost all labor is hired as independent contractors, implying that multi-person firms, the employment relationship, and the labor market itself disappear. With positive transaction cost, on the other hand, employment contracts are incomplete and the labor supply curve to the firm is upward sloping, again leaving the labor demand curve ill-defined. As a result, theory suggests that wage rates are always and everywhere an amalgam of an administered and a bargained price. Working Paper 06-0
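    A stylized numerical example may help show why an upward-sloping supply curve to the firm leaves the labor demand curve ill-defined. The linear marginal-revenue-product and supply schedules below are textbook-style assumptions, not figures taken from the paper.

        def monopsony_optimum(a, b, c, d):
            """Profit-maximizing employment when the marginal revenue product is
            MRP(L) = a - b*L and the firm-level labor supply curve is w(L) = c + d*L.

            With an upward-sloping supply curve, the marginal cost of labor is
            w(L) + L*w'(L) = c + 2*d*L, so the optimum solves a - b*L = c + 2*d*L.
            """
            employment = (a - c) / (b + 2 * d)
            wage = c + d * employment
            return employment, wage

        # Same marginal revenue product, two different (hypothetical) supply curves:
        print(monopsony_optimum(a=100, b=1, c=20, d=0.5))  # (40.0, 40.0)
        print(monopsony_optimum(a=100, b=1, c=10, d=1.0))  # (30.0, 40.0)
        # Both firms end up paying a wage of 40 yet hire different amounts of labor,
        # so "quantity of labor demanded at w = 40" has no unique answer: the choice
        # depends on the whole supply schedule, not on the wage alone.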

    Chronic Hepatitis B Finite Treatment: similar and different concerns with new drug classes

    Chronic hepatitis B, a major cause of liver disease and cancer, affects over 250 million people worldwide. There is currently no cure, only suppressive therapies. Efforts to develop finite, curative HBV therapies are underway, consisting of combinations of multiple novel agents with or without nucleos(t)ide reverse transcriptase inhibitors. The HBV Forum convened a webinar in July 2021, followed by working group discussions, to address how and when to stop finite therapy in order to demonstrate sustained off-treatment efficacy and safety responses. Participants included leading experts from academia, clinical practice, pharmaceutical companies, and regulatory agencies, as well as patient representatives. This Viewpoint outlines areas of consensus within our multi-stakeholder group for stopping finite therapies in chronic hepatitis B investigational studies, including trial design, patient selection, outcomes, biomarkers, pre-defined stopping criteria, pre-defined retreatment criteria, duration of investigational therapies, and follow-up after stopping therapy. Future research on unmet needs is also discussed.

    An International Quiet Ocean Experiment

    Author Posting. © Oceanography Society, 2011. This article is posted here by permission of Oceanography Society for personal use, not for redistribution. The definitive version was published in Oceanography 24, no. 2 (2011): 174–181, doi:10.5670/oceanog.2011.37.
    The effect of noise on marine life is one of the big unknowns of current marine science. Considerable evidence exists that the human contribution to ocean noise has increased during the past few decades: human noise has become the dominant component of marine noise in some regions, and noise is directly correlated with the increasing industrialization of the ocean. Sound is an important factor in the lives of many marine organisms, and theory and increasing observations suggest that human noise could be approaching levels at which negative effects on marine life may be occurring. Certain species already show symptoms of the effects of sound. Although some of these effects are acute and rare, chronic sublethal effects may be more prevalent but are difficult to measure. We need to identify the thresholds of such effects for different species and be in a position to predict how increasing anthropogenic sound will add to the effects. To achieve such predictive capabilities, the Scientific Committee on Oceanic Research (SCOR) and the Partnership for Observation of the Global Oceans (POGO) are developing an International Quiet Ocean Experiment (IQOE), with the objective of coordinating the international research community to both quantify the ocean soundscape and examine the functional relationship between sound and the viability of key marine organisms. SCOR and POGO will convene an open science meeting to gather community input on the important research, observations, and modeling activities that should be included in IQOE.

    Catching Element Formation In The Act

    Gamma-ray astronomy explores the most energetic photons in nature to address some of the most pressing puzzles in contemporary astrophysics. It encompasses a wide range of objects and phenomena: stars, supernovae, novae, neutron stars, stellar-mass black holes, nucleosynthesis, the interstellar medium, cosmic rays and relativistic-particle acceleration, and the evolution of galaxies. MeV gamma-rays provide a unique probe of nuclear processes in astronomy, directly measuring radioactive decay, nuclear de-excitation, and positron annihilation. Gamma-ray photons carry substantial information that allows us to see deeper into these objects; the bulk of the power is often emitted at gamma-ray energies; and radioactivity provides a natural physical clock that adds unique information. New science will be driven by time-domain population studies at gamma-ray energies. This science is enabled by next-generation gamma-ray instruments with one to two orders of magnitude better sensitivity, larger sky coverage, and faster cadence than all previous gamma-ray instruments. This transformative capability permits: (a) accurate identification of the gamma-ray emitting objects and correlations with observations taken at other wavelengths and with other messengers; (b) construction of new gamma-ray maps of the Milky Way and other nearby galaxies in which extended regions are distinguished from point sources; and (c) considerable serendipitous science from scarce events, such as nearby neutron star mergers. Advances in technology push the performance of new gamma-ray instruments to address a wide set of astrophysical questions. Comment: 14 pages including 3 figure

    Factors affecting the microwave coking of coals and the implications on microwave cavity design

    The work reported in this paper assessed how processing conditions and feedstock affect the quality of the coke produced during microwave coke making. The aim was to gather information that would support the development of an optimised microwave coke-making oven. Experiments were carried out in a non-optimised 2450 MHz cylindrical cavity. The effects of treatment time (15–120 min), power input (750 W–4.5 kW), and overall energy input (1700–27,200 kWh/t) on a range of coals (semi-bituminous to anthracite) were investigated. The intrinsic reactivity, random reflectance, strength index, and dielectric properties of the produced cokes were compared with those of two commercial cokes to assess the degree of coking achieved in the microwave system. Overall energy input and coal rank were found to be the major factors determining the degree of coking following microwave treatment. The dependency on coal rank was attributed to the larger amount of volatiles that had to be removed from the lower-ranked coals, and to the increasing dielectric loss of the organic component of the coal with rank due to increased structural ordering. Longer treatment times at lower powers or shorter treatment times at higher powers are expected to produce the same degree of coking, since they deliver the same overall energy input. It was concluded that microwave coke making represents a potential step change for the coking industry by reducing treatment times by an order of magnitude, introducing flexibility, and potentially decreasing sensitivity to feedstock quality requirements. The main challenges to development are the energy requirements (which will need to be significantly reduced in an optimised process) and the penetration depth (which will require an innovative reactor design to maximise the advantage of using microwaves). Understanding and quantifying the rapidly changing dielectric properties of the coal and coke materials is vital to addressing both of these challenges.
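    A brief calculation may clarify how the quoted power and time ranges translate into overall energy input per tonne. The charge masses used below are assumed values for illustration only, since the abstract does not state the laboratory sample size.

        def specific_energy_kwh_per_tonne(power_kw, time_min, charge_mass_kg):
            """Overall energy input per tonne of coal: power x time / mass."""
            energy_kwh = power_kw * (time_min / 60.0)
            return energy_kwh / (charge_mass_kg / 1000.0)

        # Assumed charge masses (0.10 kg and 0.33 kg) are purely illustrative;
        # with them, the quoted power/time extremes bracket energy inputs of the
        # same order as the 1700-27,200 kWh/t range reported above.
        print(specific_energy_kwh_per_tonne(0.75, 15, 0.10))   # ~1,875 kWh/t
        print(specific_energy_kwh_per_tonne(4.5, 120, 0.33))   # ~27,300 kWh/t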