
    The Architecture of Complexity Revisited: Design Primitives for Ultra-Large-Scale Systems

    As software-intensive systems continue to grow in scale and complexity, the techniques that we have used to design and analyze them in the past no longer suffice. In this paper we look at examples of existing ultra-large-scale systems: systems of enormous size and complexity. We examine instances of such systems that have arisen spontaneously in nature and those that have been human-constructed. We distill from these example systems the design primitives that underlie them. We capture these design primitives as a set of tactics, fundamental architectural building blocks, and argue that to efficiently build and analyze such systems in the future we should strongly consider employing such building blocks.

    Demystifying Big Data Adoption: Beyond IT Fashion and Relative Advantage

    There is a paradox in big data adoption: a peak of hype and, simultaneously, an unexpectedly low deployment rate. The present multiple-case-study research develops a Big Data Adoption (Big2) model that helps to explain this paradox and sheds light on the “whether”, “why”, and “how” questions regarding big data adoption. The Big2 model extends beyond the existing Relative Advantage and IT Fashion theories to include organizational, environmental, and social variables, as well as new psychological factors that are unique to big data adoption. Our analysis reveals that the outcome of big data adoption is indeterminate, which defies the implicit assumption of most simplistic “rational-calculus” models of innovation adoption: Relative Advantage is a necessary but not sufficient condition for big data adoption. Most importantly, our study uncovered a “Deployment Gap” and a “Limbo Stage” in which companies experiment continuously for a long time and do not proceed to deployment despite the intent to adopt big data. As a result, there are four big data adoption categories: Not Adopting, Experimented but Not Adopting, Not Yet Deployed, and Deployed. Our Big2 model contributes a Paradigm Shift and Complexity Tolerance perspective for understanding the “why” in each of the four adoption categories. This study further identifies nine complexity tolerance strategies to help narrow the Deployment Gap, but also shows that big data is not for everyone.

    Can Cybersecurity Be Proactive? A Big Data Approach and Challenges

    The cybersecurity community typically reacts to attacks after they occur. Being reactive is costly and can be fatal where attacks threaten lives, important data, or mission success. But can cybersecurity be done proactively? Our research capitalizes on the Germination Period, the time lag between hacker communities discussing software flaw types and those flaws actually being exploited, during which proactive measures can be taken. We argue for a novel proactive approach, utilizing big data, for (I) identifying potential attacks before they come to fruition and, based on this identification, (II) developing preventive countermeasures. The big data approach resulted in our vision of the Proactive Cybersecurity System (PCS), a layered, modular service platform that applies big data collection and processing tools to a wide variety of unstructured data sources to predict vulnerabilities and develop countermeasures. Our exploratory study is the first to show the promise of this novel proactive approach and illuminates challenges that need to be addressed.
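    The germination-period idea can be illustrated with a minimal, hypothetical sketch: watch unstructured forum text for rising mention counts of flaw types before exploits appear. The keyword list, sample posts, and `flaw_mentions` helper below are invented for illustration and are not part of the PCS described in the abstract.

```python
from collections import Counter

# Hypothetical flaw-type vocabulary; a real system would use a much
# larger taxonomy (e.g., derived from CWE categories).
FLAW_KEYWORDS = ["sql injection", "buffer overflow", "use-after-free"]

def flaw_mentions(posts):
    """Count flaw-type keyword mentions across a batch of posts."""
    counts = Counter()
    for post in posts:
        text = post.lower()
        for kw in FLAW_KEYWORDS:
            counts[kw] += text.count(kw)
    return counts

# Invented forum snapshots for two consecutive months.
last_month = ["patched the buffer overflow in the parser"]
this_month = [
    "new use-after-free in the renderer, PoC soon",
    "another use-after-free primitive, very reliable",
]

prev, curr = flaw_mentions(last_month), flaw_mentions(this_month)
# Flag flaw types whose discussion volume is rising: candidates for
# proactive countermeasures during the germination period.
rising = [kw for kw in FLAW_KEYWORDS if curr[kw] > prev[kw]]
print(rising)  # → ['use-after-free']
```

    A production pipeline would replace the keyword counts with large-scale collection and NLP over many unstructured sources, but the trend-detection core is the same shape.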

    Big Data Value Engineering for Business Model Innovation

    Big data value engineering for business model innovation requires a drastically different approach from methods for engineering value under existing business models. Taking a Design Science approach, we conducted an exploratory study to formulate the requirements for a method to aid in engineering value via innovation. We then developed a method, called Eco-ARCH (Eco-ARCHitecture), for value discovery. This method is tightly integrated with the BDD (Big Data Design) method for value realization, forming a big data value engineering methodology that addresses these requirements. The Eco-ARCH approach is most suitable for the big data context, where system boundaries are fluid, requirements are ill-defined, many stakeholders are unknown, design goals are not given, no central architecture pre-exists, system behavior is non-deterministic and continuously evolving, and co-creation with consumers and prosumers is essential to achieving innovation goals. The method was empirically validated in collaboration with an IT service company in the electric power industry.

    Needle-Free Electroacupuncture for Postoperative Pain Management

    This study examined the effects of needle-free electroacupuncture at acupoint ST36 on postoperative pain following hysterectomy. Using a double-blind, sham- and intervention-controlled clinical experimental design, 47 women were randomly allocated to four groups. Except for those in the control group (Group 1, n = 13), a course of treatment was given of either sham stimulation (Group 2, n = 12), high-frequency stimulation (Group 3, n = 12), or low-frequency stimulation (Group 4, n = 10). All groups were assessed during the postoperative period for 24 hours. The Visual Analogue Scale was used to determine the amount of pain perceived by each subject. Differences were found between the group means postoperatively at 3, 4, 8, 16, and 24 hours. Post hoc comparison tests indicated that Group 4 differed significantly from Groups 1, 2, and 3 at 24 hours. A one-way ANOVA for total patient-controlled analgesia demands and doses indicated significant differences between the groups, F(3, 42) = 3.59, P < .05. Post hoc analysis confirmed the difference between Groups 1 (M = 84.54) and 4 (M = 41.60). Treatment outcomes showed a positive effect of this therapy for the management of postoperative pain.
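    For readers unfamiliar with the analysis, a one-way ANOVA of this form can be sketched numerically. The group values below are randomly generated placeholders, not the study's data: only the sample sizes and the reported means for Groups 1 (84.54) and 4 (41.60) come from the abstract, so the resulting F statistic will not reproduce the reported F(3, 42) = 3.59.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical total patient-controlled analgesia demands per group.
group1 = rng.normal(84.54, 20.0, size=13)  # control
group2 = rng.normal(75.00, 20.0, size=12)  # sham (mean assumed)
group3 = rng.normal(70.00, 20.0, size=12)  # high-frequency (mean assumed)
group4 = rng.normal(41.60, 20.0, size=10)  # low-frequency

# One-way ANOVA: does at least one group mean differ from the others?
f_stat, p_value = stats.f_oneway(group1, group2, group3, group4)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```

    A significant omnibus F would then be followed, as in the study, by post hoc pairwise comparisons to locate which groups differ.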

    Antarctic Mapping Mission Planning Aids

    On November 4, 1995, the Canadian RADARSAT satellite was carried aloft by a NASA rocket launched from Vandenberg Air Force Base. RADARSAT is equipped with a C-band Synthetic Aperture Radar (SAR) capable of acquiring high-resolution (25 m) images of Earth's surface day or night and under all weather conditions. Along with the attributes familiar to researchers working with SAR data from the European Space Agency's Earth Remote Sensing Satellite and the Japanese Earth Resources Satellite, RADARSAT has enhanced flexibility to collect data using a variety of swath widths, incidence angles, and resolutions. Most importantly for scientists interested in Antarctica, the agreement for a U.S. launch of RADARSAT includes a provision for rotating the normally right-looking SAR in orbit to a left-looking mode. This 'Antarctic Mode' will provide for the first time a nearly instantaneous, high-resolution view of the entirety of Antarctica on each of two proposed mappings separated by two years. This is an unprecedented opportunity to finish mapping one of the few remaining uncharted regions of the Earth. The completed maps will also provide two important benchmarks for gauging changes in Antarctica's ice cover. The preparation of a digital mosaic of Antarctica is being conducted under a NASA Pathfinder Project awarded to the Byrd Polar Research Center of The Ohio State University. The primary goal of this proposal is to compile digital SAR mosaics of the entire Antarctic continent using a combination of standard and extended beams during the 'Antarctic Mode' of the RADARSAT mission. Agreements with the Canadian Space Agency call for the first Antarctic Mapping Maneuver to occur in September 1997. A mission plan to coordinate the complex acquisition and downlinking of Antarctic data has been developed by NASA's Jet Propulsion Laboratory.
The Alaska SAR Facility (ASF) will be used as the primary data collection site, supported by collections at the Canadian Gatineau and Prince Albert ground stations. ASF will process the data into images, which will be sent to OSU for compositing into map products using state-of-the-art equipment to be designed by Vexcel Corporation of Boulder, Colorado. Imaging geometry will be constrained over the Antarctic using active radar transponders constructed by the Environmental Research Institute of Michigan and corner reflectors deployed by the British Antarctic Survey. Additional ground control is being supplied by the National Imagery and Mapping Agency. Final products will be distributed through the ASF and the National Snow and Ice Data Center, both of which are NASA data archive centers. The mosaics and ancillary information will be prepared on CD-ROM and made available to the science community through NASA DAACs. Science opportunities envisioned for the program are summarized in the accompanying table. These include studying the dynamics and variability of the Antarctic Ice Sheet, including studies of regions like the Wordie Ice Shelf and the Larsen Ice Shelf, which have recently experienced unexplained and nearly catastrophic retreat. Geologic applications include large-scale mapping of faults, volcanic features, and mountain-building processes (particularly in the Transantarctic Mountains). Finally, there is simply the unprecedented opportunity to use these digital maps in studies of many previously unexplored areas of the Southern Continent.

    The Deacetylase HDAC6 Mediates Endogenous Neuritic Tau Pathology

    The initiating events that promote tau mislocalization and pathology in Alzheimer's disease (AD) are not well defined, partly because of the lack of endogenous models that recapitulate tau dysfunction. We exposed wild-type neurons to a neuroinflammatory trigger and examined the effect on endogenous tau. We found that tau re-localized and accumulated within pathological neuritic foci, or beads, composed mostly of hypo-phosphorylated, acetylated, and oligomeric tau. These structures were detected in aged wild-type mice and were enhanced in response to neuroinflammation in vivo, highlighting a previously undescribed endogenous age-related tau pathology. Strikingly, deletion or inhibition of the cytoplasmic shuttling factor HDAC6 suppressed neuritic tau bead formation in neurons and mice. Using mass-spectrometry-based profiling, we identified a single neuroinflammatory factor, the metalloproteinase MMP-9, as a mediator of neuritic tau beading. Thus, our study uncovers a link between neuroinflammation and neuritic tau beading as a potential early-stage pathogenic mechanism in AD.

    Electric Field Conjugation with the Project 1640 coronagraph

    The Project 1640 instrument on the 200-inch Hale telescope at Palomar Observatory is a coronagraphic instrument with an integral field spectrograph at the back end, designed to find young, self-luminous planets around nearby stars. To reach the necessary contrast, the PALM-3000 adaptive optics system corrects for fast atmospheric speckles, while CAL, a phase-shifting interferometer in a Mach-Zehnder configuration, measures the quasistatic components of the complex electric field in the pupil plane following the coronagraphic stop. Two additional sensors measure and control low-order modes. These field measurements may then be combined with a system model and data taken separately with a white-light source internal to the AO system to correct for both phase and amplitude aberrations. Here, we discuss and demonstrate the procedure for maintaining a half-plane dark hole in the image plane while the spectrograph is taking data, including initial on-sky performance.
    Comment: 9 pages, 7 figures, in Proceedings of SPIE, 8864-19 (2013)
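    The correction step described above, combining a measured complex field with a system model, is in spirit an electric-field-conjugation solve: find the deformable-mirror command whose model-predicted field change cancels the measured quasistatic field over the dark-hole pixels. The sketch below is purely illustrative; a random complex matrix stands in for the real instrument Jacobian, and all sizes, values, and the regularization strength are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n_pix, n_act = 50, 20  # dark-hole pixels, DM actuators (toy sizes)

# Hypothetical linear model: field change = G @ actuator command.
G = rng.normal(size=(n_pix, n_act)) + 1j * rng.normal(size=(n_pix, n_act))
E = rng.normal(size=n_pix) + 1j * rng.normal(size=n_pix)  # measured field

# Stack real and imaginary parts so the solve is real-valued, then use
# Tikhonov-regularized least squares for the conjugating DM command.
G_r = np.vstack([G.real, G.imag])
E_r = np.concatenate([E.real, E.imag])
alpha = 1e-3  # regularization strength (tuning knob)
cmd = -np.linalg.solve(G_r.T @ G_r + alpha * np.eye(n_act), G_r.T @ E_r)

# Predicted residual field after applying the command.
E_after = E + G @ cmd
print(f"|E| before: {np.linalg.norm(E):.3f}, after: {np.linalg.norm(E_after):.3f}")
```

    Because the command minimizes the regularized residual energy, the predicted field norm can only shrink; in a real closed loop this measure-model-solve cycle is repeated as the quasistatic speckles evolve.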