
    Book Review


    Something to Remember, Something to Celebrate: Women at Columbia Law School

    In this issue the Columbia Law Review joins in the celebration of the 75th anniversary of the admission of women to Columbia Law School. I am grateful to the editors of the Review for inviting me to contribute, and for the open-endedness of the invitation (or, in other words, what follows is my fault, not theirs). This has been an opportunity for me to do some research, some recalling and some reflection (and to tell a few stories). My research is incomplete, one might say sketchy, but, I trust, reliable as far as it goes. My recollections may well not match those of others who were on the scene at Columbia in the times of which I write, but that is in the nature of recollection. My reflections in some instances have surprised me; we shall have to see what you make of them. (And I hope you enjoy the stories.)

    The Decline and Fall of the Stock Certificate in America


    Identifying wave packet fractional revivals by means of information entropy

    Wave packet fractional revival is a relevant feature in the long-time-scale evolution of a wide range of physical systems, including atoms, molecules and nonlinear systems. We show that the sum of information entropies in the conjugate position and momentum spaces is an indicator of fractional revivals by analyzing three different model systems: (i) the infinite square well, (ii) a particle bouncing vertically against a wall in a gravitational field, and (iii) the vibrational dynamics of hydrogen iodide molecules. This description in terms of information entropies complements the usual one in terms of the autocorrelation function.
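The entropy-sum indicator can be sketched numerically. The fragment below is an illustrative example (not the paper's code, and with an assumed discretization): it computes the discrete Shannon entropies of a Gaussian wave packet's position and momentum probability distributions and sums them; for the model systems in the abstract, dips in this sum over time flag (fractional) revivals.

```python
import numpy as np

# Illustrative sketch: Shannon entropy sum for a Gaussian wave packet.
# Grid parameters and the ground-state Gaussian are assumptions for the demo.
N = 1024
x = np.linspace(-20.0, 20.0, N)
dx = x[1] - x[0]

psi = np.exp(-x**2 / 2) / np.pi**0.25      # normalized Gaussian wave packet
rho_x = np.abs(psi)**2 * dx                # position-space probabilities

phi = np.fft.fftshift(np.fft.fft(psi))     # momentum-space amplitude
rho_p = np.abs(phi)**2
rho_p /= rho_p.sum()                       # momentum-space probabilities

# Sum of information entropies in the two conjugate spaces; tracking this
# quantity over the packet's time evolution would reveal revivals as minima.
S = -np.sum(rho_x * np.log(rho_x + 1e-300)) - np.sum(rho_p * np.log(rho_p + 1e-300))
```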

    How Varroa Parasitism affects the immunological and nutritional status of the honey bee, Apis mellifera

    We investigated the effect of the parasitic mite Varroa destructor on the immunological and nutritional condition of honey bees, Apis mellifera, from the perspective of the individual bee and the colony. Pupae, newly emerged adults and foraging adults were sampled from honey bee colonies at one site in south Texas, USA. Varroa-infested bees displayed elevated titers of Deformed Wing Virus (DWV), suggestive of a depressed capacity to limit viral replication. Expression of genes encoding three antimicrobial peptides (defensin1, abaecin, hymenoptaecin) was either not significantly different between Varroa-infested and uninfested bees or was significantly elevated in Varroa-infested bees, varying with sampling date and bee developmental age. The effect of Varroa on nutritional indices of the bees was complex, with protein, triglyceride, glycogen and sugar levels strongly influenced by the life stage of the bee and the individual colony. Protein content was depressed and free amino acid content elevated in Varroa-infested pupae, suggesting that protein synthesis, and consequently growth, may be limited in these insects. No simple relationship between the values of nutritional and immune-related indices was observed, and colony-scale effects were indicated by the reduced weight of pupae in colonies with high Varroa abundance, irrespective of whether the individual pupa bore Varroa.

    Vectorized Rebinning Algorithm for Fast Data Down-Sampling

    A vectorized rebinning (down-sampling) algorithm, applicable to N-dimensional data sets, has been developed that offers a significant reduction in computer run time when compared to conventional rebinning algorithms. For clarity, a two-dimensional version of the algorithm is discussed to illustrate some of its specific details; using the language of image processing, 2D data will be referred to as "images," and each value in an image as a "pixel." The new approach is fully vectorized, i.e., the down-sampling procedure is done as a single step over all image rows, and then as a single step over all image columns. Data rebinning (or down-sampling) is a procedure that uses a discretely sampled N-dimensional data set to create a representation of the same data, but with fewer discrete samples. Such data down-sampling is fundamental to digital signal processing, e.g., for data compression applications.
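A minimal sketch of vectorized 2D rebinning (not the implementation described above, whose internals are not given here) can be written with NumPy: a reshape exposes each block of pixels along its own axes, and a single mean over those axes down-samples all rows and all columns at once, with no per-pixel loop.

```python
import numpy as np

def rebin(image, new_shape):
    """Down-sample a 2D image by block averaging (assumes the new shape
    divides the old shape evenly). Fully vectorized: the reshape groups
    each block of pixels, and one mean over the block axes rebins the
    whole image in a single step."""
    rows, cols = image.shape
    r, c = new_shape
    return image.reshape(r, rows // r, c, cols // c).mean(axis=(1, 3))
```

For example, rebinning a 4x4 image to 2x2 averages each 2x2 block of pixels into a single output pixel.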

    JWST Wavefront Control Toolbox

    A MATLAB-based toolbox has been developed for the wavefront control and optimization of segmented optical surfaces, using influence functions to correct for possible misalignments of the James Webb Space Telescope (JWST). The toolbox employs both iterative and non-iterative methods to converge to an optimal solution by minimizing a cost function, and it can be used for either constrained or unconstrained optimization. The control process involves 1 to 7 degrees of freedom of perturbation per segment of the primary mirror, in addition to the 5 degrees of freedom of the secondary mirror. The toolbox consists of a series of MATLAB/Simulink functions and modules, developed with a "wrapper" approach, that handle the interface and data flow between existing commercial optical modeling software packages such as Zemax and Code V. The limitations of the algorithm are dictated by the constraints on the moving parts of the mirrors.
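The core idea of influence-function wavefront control can be sketched in a few lines. This is a hypothetical linear-regime illustration, not the toolbox itself: an influence matrix (each column being one degree of freedom's effect on the wavefront) maps segment motions to wavefront changes, and a least-squares solve gives the commands that minimize the residual-wavefront cost function.

```python
import numpy as np

# Hypothetical illustration of influence-function control (linear regime).
# A maps the actuated degrees of freedom (segment perturbations) to the
# sampled wavefront; sizes here are arbitrary stand-ins for the real system.
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 12))        # influence functions, one per column
x_true = rng.normal(size=12)          # unknown misalignment state
w = A @ x_true                        # measured wavefront error

# Unconstrained least-squares: the command minimizing ||w - A x||^2,
# i.e. the quadratic cost function on the residual wavefront.
x_cmd, *_ = np.linalg.lstsq(A, w, rcond=None)
residual = w - A @ x_cmd              # corrected wavefront
```

A constrained variant would bound each element of `x_cmd` to the mirrors' allowed travel, which is where the limitations mentioned above enter.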

    Recursive Branching Simulated Annealing Algorithm

    This innovation is a variation of a simulated-annealing optimization algorithm that uses a recursive-branching structure to parallelize the search of a parameter space for the globally optimal solution to an objective. The algorithm has been demonstrated to be more effective at searching a parameter space than traditional simulated-annealing methods for a particular problem of interest, and it can readily be applied to a wide variety of optimization problems, including those with a parameter space having both discrete-value parameters (combinatorial) and continuous-variable parameters. It can take the place of a conventional simulated-annealing, Monte-Carlo, or random-walk algorithm. In a conventional simulated-annealing (SA) algorithm, a starting configuration is randomly selected within the parameter space. The algorithm randomly selects another configuration from the parameter space and evaluates the objective function for that configuration. If the objective function value is better than the previous value, the new configuration is adopted as the new point of interest in the parameter space. If the objective function value is worse than the previous value, the new configuration may be adopted, with a probability determined by a temperature parameter, used in analogy to annealing in metals. As the optimization continues, the region of the parameter space from which new configurations can be selected shrinks, and in conjunction with lowering the annealing temperature (and thus lowering the probability for adopting configurations in parameter space with worse objective functions), the algorithm can converge on the globally optimal configuration.
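The conventional SA loop described above can be sketched as follows. This is a generic one-dimensional illustration under assumed linear cooling and shrinking schedules, not the specific implementation being reported: a candidate is drawn from a shrinking region around the current point, better candidates are always accepted, and worse ones are accepted with a temperature-dependent (Boltzmann) probability.

```python
import math
import random

def simulated_annealing(objective, lo, hi, steps=5000, t0=1.0):
    """Minimize `objective` on [lo, hi] with conventional SA.
    Linear cooling and a linearly shrinking search region are
    assumed schedules for this sketch."""
    random.seed(42)                      # fixed seed for reproducibility
    x = random.uniform(lo, hi)           # random starting configuration
    fx = objective(x)
    for k in range(steps):
        frac = k / steps
        t = t0 * (1.0 - frac)            # annealing temperature lowers
        radius = (hi - lo) * (1.0 - frac)  # selection region shrinks
        cand = min(hi, max(lo, x + random.uniform(-radius, radius)))
        fc = objective(cand)
        # Accept better configurations always; worse ones with a
        # probability set by the temperature (annealing analogy).
        if fc < fx or random.random() < math.exp(-(fc - fx) / max(t, 1e-12)):
            x, fx = cand, fc
    return x, fx
```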
The Recursive Branching Simulated Annealing (RBSA) algorithm shares some features with the SA algorithm, notably including the basic principles that a starting configuration is randomly selected from within the parameter space, the algorithm tests other configurations with the goal of finding the globally optimal solution, and the region from which new configurations can be selected shrinks as the search continues. The key difference between these algorithms is that in the SA algorithm, a single path, or trajectory, is taken in parameter space, from the starting point to the globally optimal solution, while in the RBSA algorithm, many trajectories are taken; by exploring multiple regions of the parameter space simultaneously, the algorithm has been shown to converge on the globally optimal solution about an order of magnitude faster than when using conventional algorithms. Novel features of the RBSA algorithm include: 1. More efficient searching of the parameter space due to the branching structure, in which multiple random configurations are generated and multiple promising regions of the parameter space are explored; 2. The implementation of a trust region for each parameter in the parameter space, which provides a natural way of enforcing upper- and lower-bound constraints on the parameters; and 3. The optional use of a constrained gradient-search optimization, performed on the continuous variables around each branch's configuration in parameter space to improve search efficiency by allowing for fast fine-tuning of the continuous variables within the trust region at that configuration point.
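The multi-trajectory idea can be illustrated with a simplified branching search. This is a hedged sketch of the general technique, not the RBSA implementation: several branches survive each level, candidates are sampled within a per-branch trust region that also enforces the bound constraints, and the trust region shrinks as the levels recurse (the optional gradient-search refinement step is omitted here).

```python
import random

def branching_search(objective, lo, hi, branches=4, samples=16, depth=8):
    """Simplified multi-trajectory search: keep the `branches` best
    configurations each level, sample new candidates inside a shrinking
    trust region around each survivor, and recurse `depth` times.
    Schedules and counts are illustrative assumptions."""
    random.seed(1)                        # fixed seed for reproducibility
    points = [random.uniform(lo, hi) for _ in range(branches)]
    radius = (hi - lo) / 2.0              # initial trust-region radius
    for _ in range(depth):
        cands = points[:]
        for p in points:
            for _ in range(samples):
                # The trust region naturally enforces the [lo, hi] bounds.
                c = min(hi, max(lo, p + random.uniform(-radius, radius)))
                cands.append(c)
        cands.sort(key=objective)
        points = cands[:branches]         # keep the most promising branches
        radius *= 0.5                     # shrink the trust region
    return points[0], objective(points[0])
```

Because several regions are explored at once, a branch trapped near a poor local optimum can be out-competed by branches elsewhere in the parameter space, which is the source of the speedup claimed above.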