5 research outputs found
Automating planetary mission operations
In this paper, we describe the elements of a semi-autonomous system designed to provide instrument health, pointing, and data in a cost-effective fashion.
Factors influencing adverse skin responses in rats receiving repeated subcutaneous injections and potential impact on neurobehavior.
Repeated subcutaneous (s.c.) injection is a common route of administration in chronic studies of neuroactive compounds. However, in a pilot study we noted a significant incidence of skin abnormalities in adult male Long-Evans rats receiving daily s.c. injections of peanut oil (1.0 ml/kg) in the subscapular region for 21 d. Histopathological analyses of the lesions were consistent with a foreign body reaction. Subsequent studies were conducted to determine factors that influenced the incidence or severity of skin abnormalities, and whether these adverse skin reactions influenced a specific neurobehavioral outcome. Rats injected daily for 21 d with food grade peanut oil had an earlier onset and greater incidence of skin abnormalities relative to rats receiving an equal volume (1.0 ml/kg/d) of reagent grade peanut oil or triglyceride of coconut oil. Skin abnormalities in animals injected daily with peanut oil were increased in animals housed on corncob versus paper bedding. Comparison of animals obtained from different barrier facilities exposed to the same injection paradigm (reagent grade peanut oil, 1.0 ml/kg/d s.c.) revealed significant differences in the severity of skin abnormalities. However, animals from different barrier facilities did not perform differently in a Pavlovian fear conditioning task. Collectively, these data suggest that environmental factors influence the incidence and severity of skin abnormalities following repeated s.c. injections, but that these adverse skin responses do not significantly influence performance in at least one test of learning and memory.
Visualizing the Operations of the Phoenix Mars Lander
With the successful landing of the Phoenix Mars Lander comes the task of visualizing the spacecraft, its operations, and the surrounding environment. The JPL Solar System Visualization team has brought together a wide range of talents and software to provide a suite of visualizations that shed light on the operations of this visitor to another world. The core set of tools ranges from web-based production tracking (Image Products Release Website), to custom 3D transformation software, through to studio-quality 2D and 3D video production. We will demonstrate several of the key technologies that bring together these visualizations. Putting the scientific results of Phoenix in context requires managing the classic powers-of-10 problem. Everything from the location of polar dust storms down to the Atomic Force Microscope must be brought together in a context that communicates to both scientific and public audiences. We used Lightwave to blend 2D and 3D visualizations into a continuous series of zooms using both simulations and actual data. Beyond the high-powered industrial-strength solutions, we have striven to bring as much power as possible to the average computer user's standard view of the computer: the web browser. The Zooming and Interactive Mosaics (ZIM) tool is a JavaScript web tool for displaying high-resolution panoramas in a spacecraft-centric view. This tool allows the user to pan and zoom through the mosaic, identifying feature and target names, all the while maintaining a contextual frame of reference. Google Earth presents the possibility of taking hyperlinked web browser interaction into the 3D geo-browser modality. Until Google releases a Mars mode for Google Earth, we are forced to wrap the Earth in a Mars texture. However, this can still provide a suitable background for exploring interactive visualizations. These models range over both regional and local scales, with the lander positioned on Mars and the local environment mapped into pseudo-Street View modes.
Many visualizations succeed by altering the interaction metaphor. Therefore, we have attempted to completely overload the Google Earth interface, turning it from a traditional planetary globe into a mosaic viewer by mapping the Phoenix mosaics onto the sphere and using geographic latitude and longitude coordinates as the camera pointing coordinates of a Phoenix mosaic. This presentation focuses on the data management and visualization aspects of the mission. For scientific results, please see the special section U13, The Phoenix Mission.
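The coordinate overload described above can be sketched in a few lines. This is a hypothetical illustration, not the authors' actual code: the function name and the exact mapping convention (camera elevation above the lander's horizon to latitude, azimuth clockwise from north to longitude in [-180, 180]) are assumptions for the sake of the example.

```javascript
// Illustrative sketch: map a Phoenix mosaic's camera pointing
// (azimuth/elevation in degrees) onto Google Earth's geographic
// coordinates, so panning the globe pans the mosaic.
function mosaicToGeo(azimuthDeg, elevationDeg) {
  // Azimuth 0-360 (clockwise from north) becomes longitude in [-180, 180].
  let lon = azimuthDeg % 360;
  if (lon < 0) lon += 360;
  if (lon > 180) lon -= 360;
  // Elevation above the horizon becomes latitude, clamped to [-90, 90].
  const lat = Math.max(-90, Math.min(90, elevationDeg));
  return { latitude: lat, longitude: lon };
}
```

With such a mapping, a mosaic tile centered 10 degrees above the horizon at azimuth 270 would be draped at latitude 10, longitude -90, and Google Earth's native pan/zoom camera then doubles as the mosaic's pan/zoom control.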