
    Neural network identification of keystream generators

    Applications such as stream ciphers and spread-spectrum systems require the generation of binary keystreams to implement, and the simulation of such keystreams to break. Most cryptanalytic attacks are of the known-generator type; that is, they assume knowledge of the method used to generate the keystream. We show that a neural network can be used to identify the generator and, in some cases, to simulate the keystream. http://archive.org/details/neuralnetworkide00lead. Approved for public release; distribution is unlimited.
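
    As a concrete illustration of the known-generator setting, a minimal sketch of one of the simplest keystream generators a classifier might be trained to recognize: a linear-feedback shift register (LFSR). The register length and tap positions below are illustrative assumptions, not parameters from this work.

```python
# Minimal LFSR keystream generator (illustrative sketch; the register
# length and tap positions are assumptions, not taken from the paper).
def lfsr_keystream(seed_bits, taps, n):
    """Generate n keystream bits from an LFSR.

    seed_bits: initial register state as a list of 0/1 bits
    taps: stage indices XORed together to form the feedback bit
    """
    state = list(seed_bits)
    out = []
    for _ in range(n):
        out.append(state[-1])      # output the last stage
        fb = 0
        for t in taps:
            fb ^= state[t]         # feedback = XOR of tapped stages
        state = [fb] + state[:-1]  # shift right, insert feedback at front
    return out

# Example: a 4-bit LFSR whose taps give a maximal-length period of
# 2^4 - 1 = 15 bits before the state repeats.
ks = lfsr_keystream([1, 0, 0, 0], taps=[0, 3], n=30)
```

    In a known-generator attack, sequences like `ks` (produced by candidate generators) would form the labelled training data for the identifying network.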

    Center of mass and relative motion in time dependent density functional theory

    It is shown that the exchange-correlation part of the action functional $A_{xc}[\rho(\vec r,t)]$ in time-dependent density functional theory, where $\rho(\vec r,t)$ is the time-dependent density, is invariant under the transformation to an accelerated frame of reference $\rho(\vec r,t) \to \rho'(\vec r,t) = \rho(\vec r + \vec x(t),t)$, where $\vec x(t)$ is an arbitrary function of time. This invariance implies that the exchange-correlation potential in the Kohn-Sham equation transforms in the following manner: $V_{xc}[\rho'; \vec r,t] = V_{xc}[\rho; \vec r + \vec x(t),t]$. Some of the approximate formulas that have been proposed for $V_{xc}$ satisfy this exact transformation property; others do not. Those which transform in the correct manner automatically satisfy the "harmonic potential theorem", i.e. the separation of the center-of-mass motion for a system of interacting particles in the presence of a harmonic external potential. A general method to generate functionals which possess the correct symmetry is proposed.
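
    The step from the invariance of the action to the transformation of the potential follows by functional differentiation; a one-line restatement of the relations in the abstract:

```latex
% Accelerated-frame transformation of the density:
\rho'(\vec r, t) = \rho(\vec r + \vec x(t),\, t)
% Invariance of the exchange-correlation action:
A_{xc}[\rho'] = A_{xc}[\rho]
% Functional differentiation then yields the potential's transformation law:
V_{xc}[\rho';\vec r,t] \equiv \frac{\delta A_{xc}[\rho']}{\delta \rho'(\vec r,t)}
  = V_{xc}[\rho;\, \vec r + \vec x(t),\, t]
```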

    Experimental philosophy leading to a small scale digital data base of the conterminous United States for designing experiments with remotely sensed data

    Research using satellite remotely sensed data, even within any single scientific discipline, often lacked a unifying principle or strategy with which to plan or integrate studies conducted over an area so large that exhaustive examination is infeasible, e.g., the U.S.A. However, such a series of studies would seem to be at the heart of what makes satellite remote sensing unique, that is, the ability to select for study from among remotely sensed data sets distributed widely over the U.S., over time, where the resources do not exist to examine all of them. Using this philosophical underpinning and the concept of a unifying principle, an operational procedure for developing a sampling strategy and formal testable hypotheses was constructed. The procedure is applicable across disciplines, when the investigator restates the research question in symbolic form, i.e., quantifies it. The procedure is set within the statistical framework of general linear models. The dependent variable is any arbitrary function of remotely sensed data, and the independent variables are values or levels of factors which represent regional climatic conditions and/or properties of the Earth's surface. These factors are operationally defined as maps from the U.S. National Atlas (U.S.G.S., 1970). Eighty-five maps from the National Atlas, representing climatic and surface attributes, were automated by point counting at an effective resolution of one observation every 17.6 km (11 miles), yielding 22,505 observations per map. The maps were registered to one another in a two-step procedure producing a coarse, then fine-scale registration. After registration, the maps were iteratively checked for errors using manual and automated procedures. The error-free maps were annotated with identification and legend information and then stored as card images, one map to a file.
    A sampling design will be accomplished through a regionalization analysis of the National Atlas data base (presently being conducted). From this analysis a map of homogeneous regions of the U.S.A. will be created and samples (LANDSAT scenes) assigned by region.
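
    The general-linear-model framing described above can be sketched as follows. The factor names, number of levels, and data are hypothetical placeholders, not values from the study; the point is only the structure: a dependent variable modeled against dummy-coded levels of map-derived factors.

```python
import numpy as np

# Hypothetical setup: y stands for some function of remotely sensed data
# at each sample point; the design matrix encodes levels of two
# National-Atlas-style factors (e.g. a climate class and a surface class).
rng = np.random.default_rng(0)

n = 200
climate = rng.integers(0, 3, size=n)  # 3 hypothetical climate classes
surface = rng.integers(0, 2, size=n)  # 2 hypothetical surface classes

# Dummy (one-hot) coding with an intercept, dropping one level of each
# factor to avoid collinearity with the intercept column.
X = np.column_stack([
    np.ones(n),
    (climate == 1), (climate == 2),
    (surface == 1),
]).astype(float)

# Simulated dependent variable: a linear function of the factors + noise.
beta_true = np.array([1.0, 0.5, -0.3, 0.8])
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# Ordinary least-squares fit of the general linear model.
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
```

    Hypothesis tests on the factor effects then reduce to standard linear-model inference on the fitted coefficients.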

    Long-term results of NOPHO ALL-92 and ALL-2000 studies of childhood acute lymphoblastic leukemia

    Analysis of 2668 children with acute lymphoblastic leukemia (ALL) treated in two successive Nordic clinical trials (Nordic Society of Paediatric Haematology and Oncology (NOPHO) ALL-92 and ALL-2000) showed that 75% of all patients are cured by first-line therapy, and 83% are long-term survivors. Improvements in systemic and intrathecal chemotherapy have reduced the use of central nervous system (CNS) irradiation to <10% of the patients and provided a 5-year risk of isolated CNS relapse of 2.6%. Improved risk stratification and chemotherapy have eliminated the previous independent prognostic significance of gender, CNS leukemia and translocation t(1;19)(q23;p13), whereas the post-induction level of minimal residual disease (MRD) has emerged as a new risk grouping feature. Infant leukemia, high leukocyte count, T-lineage immunophenotype, translocation t(4;11)(q21;q23) and hypodiploidy remain associated with lower cure rates. To reduce the overall toxicity of the treatment, including the risk of therapy-related second malignant neoplasms, the current NOPHO ALL-2008 protocol does not include CNS irradiation in first remission, the dose of 6-mercaptopurine is reduced for patients with low thiopurine methyltransferase activity, and the protocol restricts the use of hematopoietic stem cell transplantation in first remission to patients without morphological remission after induction therapy or with high levels of MRD after 3 months of therapy.

    Making electromagnetic wavelets

    Electromagnetic wavelets are constructed using scalar wavelets as superpotentials, together with an appropriate polarization. It is shown that oblate spheroidal antennas, which are ideal for their production and reception, can be made by deforming and merging two branch cuts. This determines a unique field on the interior of the spheroid which gives the boundary conditions for the surface charge-current density necessary to radiate the wavelets. These sources are computed, including the impulse response of the antenna.
    Comment: 29 pages, 4 figures; minor corrections and additions

    Quantum Computation with Quantum Dots and Terahertz Cavity Quantum Electrodynamics

    A quantum computer is proposed in which information is stored in the two lowest electronic states of doped quantum dots (QDs). Many QDs are located in a microcavity. A pair of gates controls the energy levels in each QD. A Controlled-NOT (CNOT) operation involving any pair of QDs can be effected by a sequence of gate-voltage pulses which tune the QD energy levels into resonance with frequencies of the cavity or a laser. The duration of a CNOT operation is estimated to be much shorter than the time for an electron to decohere by emitting an acoustic phonon.
    Comment: RevTeX, 6 pages, 3 PostScript figures; minor typos corrected
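
    For reference, the CNOT gate mentioned above acts on two qubits as follows; this is the standard textbook definition, not anything specific to this proposal:

```latex
% CNOT in the computational basis |00>, |01>, |10>, |11>
% (first qubit = control, second qubit = target):
\mathrm{CNOT} =
\begin{pmatrix}
1 & 0 & 0 & 0\\
0 & 1 & 0 & 0\\
0 & 0 & 0 & 1\\
0 & 0 & 1 & 0
\end{pmatrix},
\qquad
\mathrm{CNOT}\,|c\rangle|t\rangle = |c\rangle\,|t \oplus c\rangle .
```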

    Complex-Distance Potential Theory and Hyperbolic Equations

    An extension of potential theory in R^n is obtained by continuing the Euclidean distance function holomorphically to C^n. The resulting Newtonian potential is generated by an extended source distribution D(z) in C^n whose restriction to R^n is the delta function. This provides a natural model for extended particles in physics. In C^n, interpreted as complex spacetime, D(z) acts as a propagator generating solutions of the wave equation from their initial values. This gives a new connection between elliptic and hyperbolic equations that does not assume analyticity of the Cauchy data. Generalized to Clifford analysis, it induces a similar connection between solutions of elliptic and hyperbolic Dirac equations. There is a natural application to the time-dependent, inhomogeneous Dirac and Maxwell equations, and the 'electromagnetic wavelets' introduced previously are an example.
    Comment: 25 pages, submitted to Proceedings of 5th Intern. Conf. on Clifford Algebras, Ixtapa, June 24 - July 4, 199
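
    The holomorphic continuation described above can be indicated schematically; the notation below is a common convention for the complex distance and is an assumption about this paper's usage rather than a quotation from it:

```latex
% Euclidean distance r(\vec x) = |\vec x| = \sqrt{\vec x \cdot \vec x}
% continued holomorphically to \vec z = \vec x + i\vec y \in \mathbb{C}^n:
r(\vec z) = \sqrt{\vec z \cdot \vec z}
          = \sqrt{|\vec x|^2 - |\vec y|^2 + 2\,i\,\vec x \cdot \vec y}
% The branch of the square root makes r(\vec z) single-valued off a
% branch cut, and the extended source D(\vec z) reduces to the delta
% function on restriction to \vec y = 0.
```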

    Fish Spawning Aggregations: Where Well-Placed Management Actions Can Yield Big Benefits for Fisheries and Conservation

    Marine ecosystem management has traditionally been divided between fisheries management and biodiversity conservation approaches, and the merging of these disparate agendas has proven difficult. Here, we offer a pathway that can unite fishers, scientists, resource managers and conservationists towards a single vision for some areas of the ocean where small investments in management can offer disproportionately large benefits to fisheries and biodiversity conservation. Specifically, we provide a series of evidence-based arguments that support an urgent need to recognize fish spawning aggregations (FSAs) as a focal point for fisheries management and conservation on a global scale, with particular emphasis placed on the protection of multispecies FSA sites. We illustrate that these sites serve as productivity hotspots: small areas of the ocean, shaped by the interactions between physical forces and geomorphology, that attract multiple species to reproduce in large numbers and support food-web dynamics, ecosystem health and robust fisheries. FSAs are comparable in vulnerability, importance and magnificence to breeding aggregations of seabirds, sea turtles and whales, yet they receive insufficient attention and are declining worldwide. Numerous case studies confirm that protected aggregations do recover and benefit fisheries through increases in fish biomass, catch rates and larval recruitment at fished sites. The small size and spatio-temporal predictability of FSAs allow monitoring, assessment and enforcement to be scaled down, while the benefits of protection scale up to entire populations. Fishers intuitively understand the linkages between protecting FSAs and healthy fisheries and thus tend to support their protection.