    Snake-Oil Security Claims the Systematic Misrepresentation of Product Security in the E-Commerce Arena

    The modern commercial systems and software industry in the United States has grown up in a snake-oil salesman's paradise. The largest sector of this industry by far is composed of standard commercial systems that are marketed to provide specified functionality (e.g., Internet web server, firewall, router, etc.). Such products are generally provided with a blanket disclaimer stating that the purchaser must evaluate the suitability of the product for use, and that the user assumes all liability for product behavior. In general, users cannot evaluate, and cannot be expected to evaluate, the security claims of a product. The ability to analyze security claims is important because a consumer may place unwarranted trust in the security abilities of a web server (or other computer device) to perform its stated purpose, thereby putting his own organization at risk, as well as third parties (consumers, business partners, etc.). All but the largest and most capable organizations lack the resources or expertise to evaluate the security claims of a product. More importantly, no reasonable and knowledgeable person would expect them to be able to do so. The normal legal presumptions of approximate equality of bargaining power and comparable sophistication in evaluating benefits and risks are grievously unjust in the context of software security. In these transactions, it is far wiser to view the general purchaser, even if that purchaser is a sizable corporation, as an ignorant consumer. Hence, purchasers often accept what appear to be either implied merchantability claims of the vendor or claims made by salespersons outside of the context of a written document. These claims frequently have little, if any, basis in fact. These standard commercial systems form the bulk of the critical infrastructure of existing Internet functionality and e-commerce systems. Often, these systems are not trustworthy, yet their use by misinformed purchasers has created massive vulnerability for both purchasers and third parties (including a substantial fraction of both U.S. and international citizens). The frequent disclosure of individual credit card information from supposedly secure commercial systems illustrates an aspect of this vulnerability and raises serious questions concerning the merchantability of these systems. While it is impossible to avoid all risks, they can be reduced to a very small fraction of their current level. Vendors have willfully taken approaches and used processes that do not allow assurance of appropriate security properties, while simultaneously and recklessly misrepresenting the security properties of their products to their customers.

    One place doesn't fit all: improving the effectiveness of sustainability standards by accounting for place

    The growing interest in incentivizing sustainable agricultural practices is supported by a large network of voluntary production standards, which aim to offer farmers and ranchers increased value for their product in support of reduced environmental impact. To be effective with producers and consumers alike, these standards must be both credible and broadly recognizable, and thus are typically highly generalizable. However, the environmental impact of agriculture is strongly place-based and varies considerably due to complex biophysical, socio-cultural, and management-based factors, even within a given sector in a particular region. We suggest that this contradiction between the placeless generality of standards and the placed-ness of agriculture renders many sustainability standards ineffective. In this policy and practice review, we examine this contradiction through the lens of beef production, with a focus on an ongoing regional food purchasing effort in Denver, Colorado, USA. We review the idea of place in the context of agricultural sustainability, drawing on life cycle analysis and diverse literature to find that recognition of place-specific circumstances is essential to understanding environmental impact and improving outcomes. We then examine the case of the Good Food Purchasing Program (GFPP), a broad set of food-purchasing standards currently being implemented for institutional purchasing in Denver. The GFPP was created through a lengthy stakeholder-inclusive process for use in Los Angeles, California, USA, and has since been applied to many cities across the country. The difference between Los Angeles' process and that of applying the result of Los Angeles' process to Denver is instructive, and emblematic of the flaws of generalizable sustainability standards themselves. We then describe the essential elements of a place-based approach to agricultural sustainability standards, pointing toward a democratic, process-based, and outcome-oriented strategy that results in standards that enable rather than hinder the creativity of both producers and consumers. Though prescription is anathema to our approach, we close by offering a starting point for the development of standards for beef production in Colorado that respect the work of people in place.

    Proximate and fatty acid composition of 40 southeastern U.S. finfish species

    This report describes the proximate compositions (protein, moisture, fat, and ash) and major fatty acid profiles for raw and cooked samples of 40 southeastern finfish species. All samples (fillets) were cooked by a standard procedure in laminated plastic bags to an internal temperature of 70°C (158°F). Both summarized compositional data, with means and ranges for each species, and individual sample data, including harvest dates and average lengths and weights, are presented. When compared with raw samples, cooked samples exhibited an increase in protein content with an accompanying decrease in moisture content. Fat content either remained approximately the same or increased due to moisture loss during cooking. Our results are discussed in reference to compositional data previously published by others on some of the same species. Although additional data are needed to adequately describe the seasonal and geographic variations in the chemical compositions of many of these fish species, the results presented here should be useful to nutritionists, seafood marketers, and consumers.
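
    The apparent rise in protein (and sometimes fat) on cooking is a concentration effect: water leaves, solids stay. A minimal sketch of that arithmetic follows, using invented composition values rather than any figures from the report.

```python
# Hypothetical raw fillet: 20 g protein, 2 g fat per 100 g, 78% moisture.
# (Invented numbers for illustration; not taken from the report.)
raw_protein, raw_fat, raw_moisture = 20.0, 2.0, 78.0   # g per 100 g

# Suppose cooking lowers moisture to 70%. Solids are conserved, so the
# non-water constituents are rescaled by the ratio of solids fractions.
solids_raw = 100.0 - raw_moisture        # 22 g solids per 100 g raw
solids_cooked = 100.0 - 70.0             # 30 g solids per 100 g cooked
factor = solids_cooked / solids_raw      # ~1.36x concentration

print(raw_protein * factor)  # ~27.3 g protein per 100 g cooked
print(raw_fat * factor)      # ~2.7 g fat per 100 g cooked
```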

    Recovery of surface reflectance spectra and evaluation of the optical depth of aerosols in the near-IR using a Monte-Carlo approach: Application to the OMEGA observations of high latitude regions of Mars

    We present a model of radiative transfer through atmospheric particles based on Monte Carlo methods. This model can be used to analyze and remove the contribution of aerosols in remote sensing observations. We have developed a method to quantify the contribution of atmospheric dust in near-IR spectra of the Martian surface obtained by the OMEGA imaging spectrometer on board Mars Express. Using observations in the nadir pointing mode with significant differences in solar incidence angles, we can infer the optical depth of atmospheric dust, and we can retrieve the surface reflectance spectra free of aerosol contribution. Martian airborne dust properties are discussed and constrained from previous studies and OMEGA data. We have tested our method on a region at 90°E and 77°N extensively covered by OMEGA, where significant variations of the albedo of ice patches in the visible have been reported. The consistency between reflectance spectra of ice-covered and ice-free regions recovered at different incidence angles validates our approach. The optical depth of aerosols varies by a factor of 3 in this region during the summer of Martian year 27. The observed brightening of ice patches does not result from frost deposition but from a decrease in the dust contamination of surface ice and (to a lesser extent) from a decrease in the optical thickness of atmospheric dust. Our Monte Carlo-based model can be applied to recover the spectral reflectance characteristics of the surface from OMEGA spectral imaging data when the optical thickness of aerosols can be evaluated. It could prove useful for processing image cubes from the Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) on board the Mars Reconnaissance Orbiter (MRO).
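
    As a rough illustration of the technique described above, the sketch below runs a toy Monte Carlo photon transport through a single plane-parallel aerosol layer and shows how the reflected and transmitted fractions depend on both optical depth and incidence angle, which is what makes a two-angle retrieval of optical depth possible. The single-scattering albedo, asymmetry parameter, and optical depth values are placeholders, not the OMEGA-constrained dust properties used in the paper.

```python
import numpy as np

def hg_cos(g, rng):
    """Sample a scattering-angle cosine from the Henyey-Greenstein phase function."""
    u = rng.random()
    if abs(g) < 1e-6:
        return 2.0 * u - 1.0
    s = (1.0 - g * g) / (1.0 - g + 2.0 * g * u)
    return (1.0 + g * g - s * s) / (2.0 * g)

def layer_mc(tau, omega=0.95, g=0.7, mu0=0.8, n_photons=20000, seed=0):
    """Monte Carlo photon transport through a plane-parallel aerosol layer.

    tau   -- vertical optical depth of the layer (placeholder value)
    omega -- single-scattering albedo (placeholder)
    g     -- Henyey-Greenstein asymmetry parameter (placeholder)
    mu0   -- cosine of the solar incidence angle
    Returns the (reflected, transmitted, absorbed) photon fractions.
    """
    rng = np.random.default_rng(seed)
    counts = {"reflected": 0, "transmitted": 0, "absorbed": 0}
    for _ in range(n_photons):
        t = 0.0        # optical-depth coordinate, 0 at the top of the layer
        mu = -mu0      # direction cosine; negative means travelling downward
        while True:
            t -= mu * (-np.log(rng.random()))  # free path to the next event
            if t < 0.0:
                counts["reflected"] += 1       # escaped back to space
                break
            if t > tau:
                counts["transmitted"] += 1     # reached the surface
                break
            if rng.random() > omega:
                counts["absorbed"] += 1        # absorbed by a dust grain
                break
            # Scatter into a new direction with a random azimuth.
            c = hg_cos(g, rng)
            phi = 2.0 * np.pi * rng.random()
            mu = (mu * c + np.sqrt(max(0.0, 1.0 - mu * mu))
                  * np.sqrt(max(0.0, 1.0 - c * c)) * np.cos(phi))
    return tuple(counts[k] / n_photons for k in ("reflected", "transmitted", "absorbed"))

# Observing the same spot at two incidence angles gives two equations in the
# unknowns (surface reflectance, tau), which is the basis of the retrieval.
for mu0 in (0.9, 0.4):
    print(mu0, layer_mc(tau=0.5, mu0=mu0))
```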

    A rapid in vivo screen for pancreatic ductal adenocarcinoma therapeutics

    Pancreatic ductal adenocarcinoma (PDA) is the fourth leading cause of cancer-related deaths in the United States, and is projected to be the second by 2025. It has the worst survival rate among all major cancers. Two pressing needs for extending the life expectancy of affected individuals are the development of new approaches to identify improved therapeutics, addressed herein, and the identification of early markers. PDA advances through a complex series of intercellular and physiological interactions that drive cancer progression in response to organ stress, organ failure, malnutrition, and infiltrating immune and stromal cells. Candidate drugs identified in organ culture or cell-based screens must be validated in preclinical models such as KIC (p48Cre;LSL-KrasG12D;Cdkn2af/f) mice, a genetically engineered model of PDA in which large aggressive tumors develop by 4 weeks of age. We report a rapid, systematic and robust in vivo screen for effective drug combinations to treat Kras-dependent PDA. Kras mutations occur early in tumor progression in over 90% of human PDA cases. Protein kinase and G-protein coupled receptor (GPCR) signaling activates Kras. Regulators of G-protein signaling (RGS) proteins are coincidence detectors that can be induced by multiple inputs to feedback-regulate GPCR signaling. We crossed Rgs16::GFP bacterial artificial chromosome (BAC) transgenic mice with KIC mice and show that the Rgs16::GFP transgene is a KrasG12D-dependent marker of all stages of PDA, and increases proportionally to tumor burden in KIC mice. RNA sequencing (RNA-Seq) analysis of cultured primary PDA cells reveals characteristics of embryonic progenitors of pancreatic ducts and endocrine cells, and extraordinarily high expression of the receptor tyrosine kinase Axl, an emerging cancer drug target. In proof-of-principle drug screens, we find that weanling KIC mice with PDA treated for 2 weeks with gemcitabine (with or without Abraxane) plus inhibitors of Axl signaling (warfarin and BGB324) have fewer tumor initiation sites and reduced tumor size compared with the standard-of-care treatment. Rgs16::GFP is therefore an in vivo reporter of PDA progression and sensitivity to new chemotherapeutic drug regimens such as Axl-targeted agents. This screening strategy can potentially be applied to identify improved therapeutics for other cancers.

    Implementing Provider‐based Sampling for the National Children's Study: Opportunities and Challenges

    Background: The National Children's Study (NCS) was established as a national probability sample of births to prospectively study children's health starting from in utero to age 21. The primary sampling unit was 105 study locations (typically a county). The secondary sampling unit was the geographic unit (segment), but this was subsequently perceived to be an inefficient strategy. Methods and Results: This paper proposes that second-stage sampling using prenatal care providers is an efficient and cost-effective method for deriving a national probability sample of births in the US. It offers a rationale for provider-based sampling and discusses a number of strategies for assembling a sampling frame of providers. Also presented are special challenges to provider-based sampling of pregnancies, including optimising key sample parameters, retaining geographic diversity, determining the types of providers to include in the sample frame, recruiting women who do not receive prenatal care, and using community engagement to enrol women. There will also be substantial operational challenges to sampling provider groups. Conclusion: We argue that probability sampling is mandatory to capture the full variation in exposure and outcomes expected in a national cohort study, to provide valid and generalisable risk estimates, and to accurately estimate policy (such as screening) benefits from associations reported in the NCS.
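
    To make the provider-based second stage concrete, here is a minimal sketch of one common design choice: systematic probability-proportional-to-size (PPS) selection of providers followed by a fixed take of births per provider, which yields a self-weighting sample. The provider frame and delivery counts are invented for illustration; the actual NCS design involved many additional considerations noted above.

```python
import numpy as np

def pps_systematic(sizes, n, rng):
    """Systematic probability-proportional-to-size selection of n units.

    A unit whose size exceeds the sampling interval can be hit more than
    once; real designs treat such large providers as certainty selections.
    """
    cum = np.cumsum(np.asarray(sizes, dtype=float))
    step = cum[-1] / n                       # sampling interval
    points = rng.random() * step + step * np.arange(n)
    return np.searchsorted(cum, points)      # indices of selected units

rng = np.random.default_rng(42)

# Hypothetical frame: annual delivery counts for prenatal care providers
# in one study location (invented numbers, for illustration only).
deliveries = np.array([1200, 150, 900, 60, 2500, 400, 700, 300])
providers = pps_systematic(deliveries, n=3, rng=rng)

# Second stage: a fixed take of m births per sampled provider. The overall
# inclusion probability for any birth is then
#   (n * size_i / total) * (m / size_i) = n * m / total,
# i.e. the design is self-weighting (equal probability for every birth).
m = 50
print("sampled providers:", providers)
print("per-birth inclusion probability:", 3 * m / deliveries.sum())
```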

    Templates for Convex Cone Problems with Applications to Sparse Signal Recovery

    This paper develops a general framework for solving a variety of convex cone problems that frequently arise in signal processing, machine learning, statistics, and other fields. The approach works as follows: first, determine a conic formulation of the problem; second, determine its dual; third, apply smoothing; and fourth, solve using an optimal first-order method. A merit of this approach is its flexibility: for example, all compressed sensing problems can be solved via this approach. These include models with objective functionals such as the total-variation norm, ||Wx||_1 where W is arbitrary, or a combination thereof. In addition, the paper also introduces a number of technical contributions such as a novel continuation scheme, a novel approach for controlling the step size, and some new results showing that the smoothed and unsmoothed problems are sometimes formally equivalent. Combined with our framework, these lead to novel, stable and computationally efficient algorithms. For instance, our general implementation is competitive with state-of-the-art methods for solving intensively studied problems such as the LASSO. Further, numerical experiments show that one can solve the Dantzig selector problem, for which no efficient large-scale solvers exist, in a few hundred iterations. Finally, the paper is accompanied by a software release. This software is not a single, monolithic solver; rather, it is a suite of programs and routines designed to serve as building blocks for constructing complete algorithms. The TFOCS software is available at http://tfocs.stanford.edu.
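
    TFOCS itself solves the smoothed dual of a conic formulation; as a self-contained illustration of the "optimal first-order method" building block alone, the sketch below applies FISTA (accelerated proximal gradient) directly to the LASSO. The problem sizes and regularization weight are arbitrary choices, not defaults from the paper's software.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista_lasso(A, b, lam, n_iter=300):
    """Solve min_x 0.5*||Ax - b||^2 + lam*||x||_1 with FISTA,
    an optimal (accelerated) first-order method."""
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(n_iter):
        # Gradient step on the smooth part, then prox of the l1 term.
        x_new = soft_threshold(y - A.T @ (A @ y - b) / L, lam / L)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum step
        x, t = x_new, t_new
    return x

# Small synthetic compressed-sensing instance (invented sizes).
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 200)) / np.sqrt(80)
x_true = np.zeros(200)
x_true[rng.choice(200, 8, replace=False)] = rng.standard_normal(8)
b = A @ x_true + 0.01 * rng.standard_normal(80)

x_hat = fista_lasso(A, b, lam=0.02)
print("relative recovery error:",
      np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```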

    The National Children's Study: Recruitment Outcomes Using the Provider-Based Recruitment Approach

    In 2009, the National Children’s Study (NCS) Vanguard Study tested the feasibility of household-based recruitment and participant enrollment using a birth-rate probability sample. In 2010, the NCS Program Office launched 3 additional recruitment approaches. We tested whether provider-based recruitment could improve recruitment outcomes compared with household-based recruitment.

    Qualitative Analysis of String Cosmologies

    A qualitative analysis is presented for spatially flat, isotropic and homogeneous cosmologies derived from the string effective action when the combined effects of a dilaton, modulus, two-form potential and central charge deficit are included. The latter has significant effects on the qualitative dynamics. The analysis is also directly applicable to the anisotropic Bianchi type I cosmology.
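
    The abstract does not reproduce the action itself; for orientation, the four-dimensional string effective action usually studied in this setting has the schematic form below, where Φ is the dilaton, β the modulus, H = dB the field strength of the two-form potential, and Λ the central charge deficit. The precise normalizations here are an assumption on our part, not taken from the paper.

```latex
S = \int d^{4}x \, \sqrt{-g}\, e^{-\Phi} \left[ R + (\nabla\Phi)^{2}
      - \tfrac{1}{2}(\nabla\beta)^{2}
      - \tfrac{1}{12} H_{\mu\nu\lambda} H^{\mu\nu\lambda} - 2\Lambda \right]
```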