
    Technology requirements for post-1985 communications satellites

    The technical and functional requirements for commercial communication satellites are discussed, with emphasis on the need to provide quality service at an acceptable cost. Specialized services are postulated in a needs model that forecasts future demand; this model is based upon 322 separately identified needs for long-distance communication. It is shown that the 1985 demand for satellite communication service for a domestic region such as the United States, including the surrounding sea and air lanes, may require on the order of 100,000 MHz of bandwidth. This level of demand can be met within the presently allocated bandwidths by developing several key technologies. Suggested improvements include: (1) improved antennas that make high-speed switching possible; (2) development of solid-state transponders for 12 GHz and possibly higher frequencies; (3) development of switched or steered-beam antennas with 10 dB or higher gain for aircraft; and (4) continued development of improved video channel compression techniques and hardware.

    Technology requirements for communication satellites in the 1980's

    The key technology requirements are defined for meeting the forecast demand for communication satellite services in the 1985 to 1995 time frame. Needs for services are evaluated, together with the technical and functional requirements for providing them. The future growth capabilities of the terrestrial telephone network, cable television, and satellite networks are forecast. The impact of spacecraft technology and of booster performance and cost upon communication satellite costs is analyzed. Systems analysis techniques are used to determine functional requirements and the sensitivity of the cost of meeting those requirements to technology improvements. Recommended development plans and funding levels are presented, as well as the possible cost savings for communication satellites in the post-1985 era.

    Breaking Cosmological Degeneracies in Galaxy Cluster Surveys with a Physical Model of Cluster Structure

    Forthcoming large galaxy cluster surveys will yield tight constraints on cosmological models. It has been shown that in an idealized survey containing > 10,000 clusters, statistical errors on dark energy and other cosmological parameters will be at the percent level. It has also been shown that through "self-calibration", parameters describing the mass-observable relation and cosmology can be determined simultaneously, though at a loss in accuracy of about an order of magnitude. Here we examine the utility of an alternative approach to self-calibration, in which a parametrized ab initio physical model is used to compute cluster structure and the resulting mass-observable relations. As an example, we use a modified-entropy ("pre-heating") model of the intracluster medium, with the history and magnitude of entropy injection as unknown input parameters. Using a Fisher matrix approach, we evaluate the expected simultaneous statistical errors on cosmological and cluster model parameters. We study two types of surveys, in which a comparable number of clusters are identified either through their X-ray emission or through their integrated Sunyaev-Zel'dovich (SZ) effect. We find that compared to a phenomenological parametrization of the mass-observable relation, using our physical model yields significantly tighter constraints in both surveys and offers substantially improved synergy when the two surveys are combined. These results suggest that parametrized physical models of cluster structure will be useful when extracting cosmological constraints from SZ and X-ray cluster surveys. (abridged)
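    As a minimal sketch of the Fisher-matrix forecasting step mentioned above (not the authors' actual pipeline), the snippet below estimates marginalized parameter errors from Poisson-limited cluster counts in redshift bins; the toy count model, the two illustrative parameters, and the derivative step sizes are all assumptions made for illustration.

```python
import numpy as np

# For Poisson-limited counts N_b(theta) in bins b, the Fisher matrix is
#   F_ij = sum_b (dN_b/dtheta_i)(dN_b/dtheta_j) / N_b,
# and marginalized 1-sigma forecasts are sqrt(diag(F^-1)).

def fisher_from_counts(counts_model, theta0, steps):
    """counts_model(theta) -> expected counts per bin (hypothetical function)."""
    theta0 = np.asarray(theta0, dtype=float)
    n_par = theta0.size
    N0 = counts_model(theta0)
    # Central-difference derivatives of the binned counts w.r.t. each parameter.
    dN = np.empty((n_par, N0.size))
    for i in range(n_par):
        tp = theta0.copy(); tp[i] += steps[i]
        tm = theta0.copy(); tm[i] -= steps[i]
        dN[i] = (counts_model(tp) - counts_model(tm)) / (2.0 * steps[i])
    return np.einsum('ib,jb->ij', dN, dN / N0)

# Toy model: counts in redshift bins depend on an amplitude-like and a
# tilt-like parameter; purely schematic, not a real cluster mass function.
z = np.linspace(0.1, 1.0, 10)
def toy_counts(theta):
    amp, tilt = theta
    return 1e4 * amp * np.exp(-tilt * z)

F = fisher_from_counts(toy_counts, theta0=[1.0, 1.0], steps=[1e-3, 1e-3])
errors = np.sqrt(np.diag(np.linalg.inv(F)))  # marginalized 1-sigma errors
print(errors)
```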

    Cosmological Simulations of the Preheating Scenario for Galaxy Cluster Formation: Comparison to Analytic Models and Observations

    We perform a set of non-radiative cosmological simulations of a preheated intracluster medium in which the entropy of the gas was uniformly boosted at high redshift. The results of these simulations are used first to test current analytic techniques for preheating via entropy input in the smooth accretion limit. When the unmodified profile is taken directly from the simulations, we find that this model is in excellent agreement with our simulation results. This suggests that preheating efficiently smooths the accreted gas, and therefore a shift of the unmodified profile is a good approximation even with a realistic accretion history. When we examine the simulation results in detail, we do not find strong evidence for entropy amplification, at least for the high-redshift preheating model adopted here. In the second part of the paper, we compare the results of the preheating simulations to recent observations. We show, in agreement with previous work, that for a reasonable amount of preheating a satisfactory match can be found to the mass-temperature and luminosity-temperature relations. However, as noted by previous authors, we find that the entropy profiles of the simulated groups are much too flat compared to observations. In particular, while rich clusters converge on the adiabatic self-similar scaling at large radius, no single value of the entropy input during preheating can simultaneously reproduce both the core and outer entropy levels. As a result, we confirm that the simple preheating scenario for galaxy cluster formation, in which entropy is injected universally at high redshift, is inconsistent with observations.
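    As a hedged illustration of the preheating modification described above (a schematic of the scenario, not the simulation code used in the paper), the sketch below applies a uniform entropy boost to a toy baseline entropy profile; the power-law shape, its normalization, and the boost value are assumptions chosen only to show the qualitative effect.

```python
import numpy as np

# Preheating scenario in schematic form: the gas entropy K = kT / n_e^(2/3)
# of an unmodified (toy) profile is uniformly boosted by a constant K0,
# mimicking entropy injection at high redshift before cluster collapse.
# The baseline profile and the boost level below are illustrative assumptions.

r = np.logspace(-2, 0, 50)           # radius in units of the virial radius
K_unmodified = 1000.0 * r**1.1       # keV cm^2, roughly the self-similar slope
K0 = 300.0                           # keV cm^2, assumed uniform entropy injection

K_preheated = K_unmodified + K0      # uniform boost, flattening the core

# The core is raised toward K0 while the outer profile converges back to the
# unmodified self-similar scaling, which illustrates why a single value of K0
# cannot match both the observed core and outer entropy levels at once.
print(K_preheated[:5], K_preheated[-1] / K_unmodified[-1])
```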