
    Music and dance as a coalition signaling system

    Evidence suggests that humans have neurological specializations for music processing, but a compelling adaptationist account of music and dance is lacking. The sexual selection hypothesis cannot easily account for the widespread performance of music and dance in groups (especially synchronized performances), and the social bonding hypothesis has severe theoretical difficulties. Humans are unique among the primates in their ability to form cooperative alliances between groups in the absence of consanguineal ties. We propose that this unique form of social organization is predicated on music and dance. Music and dance may have evolved as a coalition signaling system that could, among other things, credibly communicate coalition quality, thus permitting meaningful cooperative relationships between groups. This capability may have evolved from coordinated territorial defense signals that are common in many social species, including chimpanzees. We present a study in which manipulation of music synchrony significantly altered subjects’ perceptions of music quality, and in which subjects’ perceptions of music quality were correlated with their perceptions of coalition quality, supporting our hypothesis. Our hypothesis also has implications for the evolution of psychological mechanisms underlying cultural production in other domains such as food preparation, clothing and body decoration, storytelling and ritual, and tools and other artifacts.
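To make the reported analysis concrete, the sketch below shows how one might test both effects: the synchrony manipulation's influence on perceived music quality, and the correlation between perceived music quality and perceived coalition quality. The ratings, scales, and sample sizes are illustrative assumptions, not the study's data.

```python
# Hypothetical sketch of the reported analysis: correlate per-subject ratings
# of music quality with ratings of coalition quality, and compare quality
# ratings across synchrony conditions. All data here are illustrative.
from scipy import stats

# Illustrative ratings (1-7 Likert scale), one pair per subject
music_quality     = [6, 5, 6, 3, 2, 5, 4, 2, 6, 3]
coalition_quality = [6, 4, 5, 3, 2, 6, 4, 3, 5, 2]

r, p = stats.pearsonr(music_quality, coalition_quality)
print(f"music vs. coalition quality: r = {r:.2f}, p = {p:.3f}")

# Synchrony manipulation: compare quality ratings between conditions
synchronized   = [6, 5, 6, 5, 4]
desynchronized = [3, 2, 3, 4, 2]
t, p_t = stats.ttest_ind(synchronized, desynchronized)
print(f"synchrony effect on perceived quality: t = {t:.2f}, p = {p_t:.3f}")
```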

    Imaging and burst location with the EXIST high-energy telescope

    The primary instrument of the proposed EXIST mission is a coded-mask high-energy telescope (the HET), which must have a wide field of view and extremely good sensitivity. It will be crucial to minimize systematic errors so that even for very long total integration times the imaging performance is close to the statistical photon limit. There is also a requirement to be able to reconstruct images on board in near real time in order to detect and localize gamma-ray bursts. This must be done while the spacecraft is scanning the sky. The scanning provides all-sky coverage and is key to reducing systematic errors. The on-board computational problem is made even more challenging for EXIST by the very large number of detector pixels. Numerous alternative designs for the HET have been evaluated. The baseline concept adopted depends on a unique coded mask with two spatial scales. Monte Carlo simulations and analytic techniques have been used to demonstrate the capabilities of the design and of the proposed two-step burst localization procedure.
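As a concrete illustration of the decoding step such an on-board pipeline must perform, the following sketch reconstructs a point source from a coded-mask shadowgram by cross-correlation. The random mask and simulated counts are stand-ins; the HET's actual two-scale mask and reconstruction procedure are not reproduced here.

```python
# Minimal sketch of coded-mask image reconstruction by cross-correlation,
# the standard decoding step an on-board imager would run. The mask pattern
# and detector array are random illustrative stand-ins, not the HET design.
import numpy as np
from numpy.fft import fft2, ifft2

rng = np.random.default_rng(0)
mask = rng.integers(0, 2, size=(64, 64)).astype(float)   # 1 = open, 0 = opaque

# Simulate a detector image: a point source shifts the mask shadow,
# plus Poisson counting noise on source and background.
shift = (5, -9)
shadow = np.roll(mask, shift, axis=(0, 1))
detector = rng.poisson(100 * shadow + 10)

# Decode: cross-correlate detector counts with zero-mean (balanced) mask weights.
decoding = mask - mask.mean()
sky = np.real(ifft2(fft2(detector) * np.conj(fft2(decoding))))

peak = np.unravel_index(np.argmax(sky), sky.shape)
print("reconstructed source position:", peak)            # expect (5, 55), i.e. -9 mod 64
```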

    Product Launch in a Declining Environment: The Blu-ray Disc – Opportunities and Struggle

    Increasingly, ICT-based virtual products are challenging physical products and markets. Obsolescence has become a real threat for a growing number of established industries, because dematerialised products offer ease of access and consumption and permanent, immediate availability. Once again, Schumpeter's gale of creative destruction intensifies organisations' permanent struggle for survival (1950). This paper presents long-term research into the optical disc industry, whose latest innovation is the Blu-ray disc format. It is an example of how an established industry launches a new product in search of new opportunities, yet fights desperately against market resistance. The degree of innovation the Blu-ray represents may not be sufficient in the overarching battle of the physical place versus the virtual space (Kotler et al. 2002, Lam 2004, Lamont et al. 1993, Scardigli et al. 1988). As the US market research institute In-Stat highlights, the optical disc market has declined for the tenth year in succession (Kaufhold 2010, IFPI 2010). There is sufficient evidence that the optical disc replication industry may be confronted with an endgame scenario. The market climate may already be too hostile to support this industry's desire to renew consumers' acceptance of the physical product, here the Blu-ray disc, and to create new market opportunities in the struggle against the industry's potential obsolescence (Harrigan et al. 1983). Despite strong promotional efforts and powerful marketing approaches, the Blu-ray disc has not yet made the inroads into markets that would make the format sustainably successful. The evidence is that, after a short period of time, available Blu-ray manufacturing capacity exceeds consumer demand by more than 30%, consumer and replication prices are falling sharply, and many Home Entertainment content providers have little or no use for a format that is a commodity based on mass production (dvd-intelligence 2010a, Kaufhold 2010, Killer-Korff 2010). Therefore, as research among the replication industry indicates, it presently seems that the Blu-ray format may not fulfil this industry's needs and, with reference to Abernathy et al.'s research, may not lead to a renewal of industrial dynamics in a declining marketplace (1983, 1984). Further explanation can be found in theories of innovation based on Utterback's, Christensen's and Christensen et al.'s studies of disruptive and discontinuous innovation (1996, 2003, 2003, 2004). Following the paper presented at the Sixteenth Annual South Dakota International Business Conference, this paper presents research on the Blu-ray format's market problems. The Transilience Organisation Innovation Map provided a conceptual approach for an initial explanation of the underlying reasons (Oestreicher 2009). Research among European replication firms since then concludes, for Blu-ray, that innovation in technology alone is not sufficient when innovation's second stream, market linkages, is involved (Abernathy et al. 1983, 1984). The paper presents explanations of why the Blu-ray disc may not be strong enough to support the replication industry in overcoming the odds impacting its strategic opportunities in a declining and potentially disruptive environment (Lamont et al. 1993, Yoo 1992). The research methods applied are grounded theory and case study (Goulding 2002, Charmaz 2009, Eisenhardt 1989, Davies 2006).
The overall intention of this long-term research is to contribute to a theory which may also be relevant for other industries, such as the publishing industry, whose struggle against the dematerialisation of content is just beginning (Picard 2003). Key words: radical vs. marginal innovation, Ideal Final Result, endgame strategies, theories of innovation, Blu-ray

    The Effect of External Safeguards on Human-Information System Trust in an Information Warfare Environment

    This research looks at how human trust in an information system is influenced by external safeguards in an Information Warfare (IW) domain. Information systems are relied upon in command and control environments to provide fast and reliable information to decision-makers. The degree of reliance decision-makers place in these systems suggests a significant level of trust. Understanding this trust relationship and what affects it is the focus of this study. A model is proposed that predicts behavior associated with human trust in information systems. It is hypothesized that a decision-maker's belief in the effectiveness of external safeguards will positively influence the decision-maker's trusting behavior. Likewise, the presence of an Information Warfare attack will have a negative effect on a decision-maker's trusting behavior. Two experiments were conducted in which the perceived effectiveness of external safeguards and the information provided by an information system were manipulated in order to test the hypotheses. The findings from both experiments suggest that a person's trust in computers in specific situations is useful in predicting trusting behavior, that external safeguards have a negative effect on trusting behavior, and that Information Warfare attacks have no effect on trusting behavior.
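One way to picture the hypothesized model is as a simple logistic relation in which safeguard belief raises, and an IW attack lowers, the probability of trusting behavior. The sketch below illustrates that structure only; the operationalization, coefficients, and names are assumptions, not the thesis's model.

```python
# Illustrative sketch (not the authors' model) of the hypothesized relations:
# trusting behavior ~ +belief in external safeguards, -presence of IW attack.
# Trusting behavior is operationalized here as the probability of accepting
# the system's recommendation; all names and numbers are assumptions.
import numpy as np

def p_trusting_behavior(safeguard_belief: float, iw_attack: bool) -> float:
    """Logistic model: safeguard_belief in [0, 1], iw_attack is True/False."""
    b0, b_safeguard, b_attack = -0.5, 2.0, -1.5   # illustrative coefficients
    z = b0 + b_safeguard * safeguard_belief + b_attack * float(iw_attack)
    return 1.0 / (1.0 + np.exp(-z))

print(p_trusting_behavior(0.9, iw_attack=False))  # high belief, no attack
print(p_trusting_behavior(0.9, iw_attack=True))   # attack lowers predicted trust
```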

    The Impact of Ground-Based Glaciogenic Seeding on a Shallow Stratiform Cloud over the Sierra Madre in Wyoming: A Multi-Sensor Study of the 3 March 2012 Case

    A case study is presented of the impact of ground-based glaciogenic seeding on a shallow, lightly precipitating orographic storm with abundant supercooled cloud droplets but few ice particles. The storm was observed on 3 March 2012 as part of the AgI (silver iodide) Seeding Cloud Impact Investigation (ASCII) experiment in Wyoming. The cloud base temperature was about -9°C, and cloud tops were at about -16°C. The high concentration of small droplets and low ice particle concentration led to natural snow growth mainly by vapor diffusion. The question addressed here is whether the injection of ice nucleating particles (AgI) enhanced snow growth and snowfall. The treated (seeded) period is compared with the preceding untreated (unseeded) period, and natural trends (observed in an adjacent control region) are removed. The main target site, located on a mountain pass at an elevation above cloud base, was impacted by AgI seeding, according to a trace chemistry analysis of freshly fallen snow. Data from three radar systems were used: the Wyoming Cloud Radar, two Ka-band profiling Micro-Rain Radars, and an X-band scanning Doppler-on-Wheels (DOW) radar. Composite data from these radar systems and from gauges in the target area indicate an increase in low-level reflectivity and precipitation rate during seeding. This finding generally agrees with other published ASCII case studies. The increase in reflectivity during seeding in the target area appears to be due mainly to an increase in particle size (aggregation), not number concentration, as suggested by DOW differential reflectivity and by disdrometer and Cloud Particle Imager measurements on the ground.
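The seeded-versus-unseeded comparison with natural-trend removal can be summarized as a double ratio over the target and control regions, a standard statistic in seeding evaluations. The sketch below illustrates the computation with invented gauge rates; it is not the paper's data or exact procedure.

```python
# Sketch of the target/control comparison used to remove natural trends:
# a "double ratio" of mean precipitation rate, seeded over unseeded, in the
# target relative to the control region. Values are illustrative only.
import numpy as np

target_unseeded  = np.array([0.30, 0.35, 0.32])  # gauge rates, mm/h
target_seeded    = np.array([0.45, 0.50, 0.48])
control_unseeded = np.array([0.40, 0.38, 0.42])
control_seeded   = np.array([0.41, 0.43, 0.40])  # control tracks the natural trend

double_ratio = (target_seeded.mean() / target_unseeded.mean()) / \
               (control_seeded.mean() / control_unseeded.mean())
# A double ratio > 1 suggests an increase attributable to seeding.
print(f"seeding effect after trend removal: x{double_ratio:.2f}")
```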

    Defensive Cyber Battle Damage Assessment Through Attack Methodology Modeling

    Due to the growing sophistication of advanced persistent cyber threats, it is necessary to understand and accurately assess cyber attack damage to digital assets. This thesis proposes a Defensive Cyber Battle Damage Assessment (DCBDA) process which utilizes the comprehensive understanding of all possible cyber attack methodologies captured in a Cyber Attack Methodology Exhaustive List (CAMEL). This research proposes CAMEL to provide detailed knowledge of cyber attack actions, methods, capabilities, forensic evidence and evidence collection methods. This product is modeled as an attack tree called the Cyber Attack Methodology Attack Tree (CAMAT). The proposed DCBDA process uses CAMAT to analyze potential attack scenarios used by an attacker. These scenarios are used to identify the associated digital forensic methods in CAMEL to correctly collect and analyze the damage from a cyber attack. The results from experimentation with the proposed DCBDA process show that it can be successfully applied to cyber attack scenarios to correctly assess the extent, method and damage caused by a cyber attack.
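A minimal way to picture CAMAT is as a tree whose nodes pair attack actions with the forensic evidence and collection methods catalogued in CAMEL, with root-to-leaf paths enumerating candidate attack scenarios. The sketch below assumes illustrative field names and actions; it is not the thesis's schema.

```python
# Minimal sketch of an attack-tree node in the spirit of CAMAT: each node is
# an attack action annotated with the forensic evidence it would leave and
# how to collect it. Field names and actions are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class AttackNode:
    action: str                       # attacker action or method
    evidence: list[str]               # forensic artifacts this action leaves
    collection: list[str]             # how to collect that evidence
    children: list["AttackNode"] = field(default_factory=list)

def enumerate_scenarios(node, path=()):
    """Yield each root-to-leaf path as one candidate attack scenario."""
    path = path + (node.action,)
    if not node.children:
        yield path
    for child in node.children:
        yield from enumerate_scenarios(child, path)

root = AttackNode("gain access", ["auth logs"], ["collect /var/log/auth.log"],
    children=[
        AttackNode("escalate privileges", ["sudo logs"], ["audit trail review"]),
        AttackNode("exfiltrate data", ["netflow records"], ["capture flow logs"]),
    ])

for scenario in enumerate_scenarios(root):
    print(" -> ".join(scenario))
```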

    Women in Aerospace in India: Aerowoman

    The main objective of the seminar was to bring together the Indian women in the field of aerospace and identify their scientific and technical contributions. In this context, the seminar was perhaps the first of its kind held in India. It was also aptly conducted in the ‘Women’s Empowerment Year – 2001’ declared by the Govt. of India.

    A new 3-D modelling method to extract subtransect dimensions from underwater videos

    Underwater video transects have become a common tool for quantitative analysis of the seafloor. However, a major difficulty remains in the accurate determination of the area surveyed, as underwater navigation can be unreliable and image scaling does not always compensate for distortions due to perspective and topography. Depending on the camera set-up and available instruments, different methods of surface measurement are applied, which makes it difficult to compare data obtained by different vehicles. 3-D modelling of the seafloor based on 2-D video data and a reference scale can be used to compute subtransect dimensions. Focussing on the length of the subtransect, the data obtained from 3-D models created with the software PhotoModeler Scanner are compared with those determined from underwater acoustic positioning (ultra-short baseline, USBL) and bottom tracking (Doppler velocity log, DVL). 3-D model building and scaling were successfully conducted on all three tested set-ups, and the distortion of the reference scales due to substrate roughness was identified as the main source of imprecision. Acoustic positioning was generally inaccurate, and bottom tracking was unreliable on rough terrain. Subtransect lengths assessed with PhotoModeler were on average 20% longer than those derived from acoustic positioning, due to the higher spatial resolution and the inclusion of slope. On a high-relief wall, bottom tracking and 3-D modelling yielded similar results. At present, 3-D modelling is the most powerful, albeit the most time-consuming, method for accurate determination of video subtransect dimensions.
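The reported 20% difference is what one would expect when a 3-D path length over sloping terrain is compared with an essentially horizontal track length. The sketch below illustrates the geometry with invented coordinates; it is not the study's data.

```python
# Sketch of why a 3-D model yields longer subtransect lengths than 2-D
# positioning: summing segment lengths over (x, y, z) includes slope, while
# acoustic positioning effectively measures the horizontal track.
# Coordinates are illustrative, not the study's data.
import numpy as np

# Camera track over a sloping seafloor: columns are x, y, z in metres
track = np.array([[0.0, 0.0, 0.0],
                  [1.0, 0.2, 0.4],
                  [2.0, 0.3, 0.9],
                  [3.0, 0.5, 1.5]])

segments = np.diff(track, axis=0)
length_3d = np.sum(np.linalg.norm(segments, axis=1))          # with slope
length_2d = np.sum(np.linalg.norm(segments[:, :2], axis=1))   # horizontal only

print(f"3-D length: {length_3d:.2f} m, planar length: {length_2d:.2f} m")
print(f"3-D exceeds planar by {100 * (length_3d / length_2d - 1):.0f}%")
```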