Flexible control techniques for a lunar base
The fundamental elements found in every terrestrial control system can be employed in all lunar applications. These elements include sensors which measure physical properties, controllers which acquire sensor data and calculate a control response, and actuators which apply the control output to the process. The unique characteristics of the lunar environment will certainly require the development of new control system technology. However, weightlessness, harsh atmospheric conditions, temperature extremes, and radiation hazards will most significantly impact the design of sensors and actuators. The controller and associated control algorithms, which are the most complex element of any control system, can be derived in their entirety from existing technology. Lunar process control applications -- ranging from small-scale research projects to full-scale processing plants -- will benefit greatly from the controller advances being developed today. In particular, new software technology aimed at commercial process monitoring and control applications will almost completely eliminate the need for custom programs and the lengthy development and testing cycle they require. The applicability of existing industrial software to lunar applications has other significant advantages in addition to cost and quality. This software is designed to run on standard hardware platforms and takes advantage of existing LAN and telecommunications technology. Further, in order to exploit the existing commercial market, the software is being designed to be implemented by users of all skill levels -- typically users who are familiar with their process, but not necessarily with software or control theory. This means that specialized technical support personnel will not need to be on-hand, and the associated costs are eliminated. 
Finally, the latest industrial software designed for the commercial market is extremely flexible, in order to fit the requirements of many types of processing applications with little or no customization. This means that lunar process control projects will not be delayed by unforeseen problems or last-minute process modifications. The software will include all of the tools needed to adapt to virtually any changes. In contrast to other space programs, which required the development of tremendous amounts of custom software, lunar-based processing facilities will benefit from the use of existing software technology that is being proven in commercial applications on Earth.
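The sensor-controller-actuator loop described above can be sketched in a few lines. This is a minimal illustrative simulation only (a proportional controller driving a toy first-order thermal process); the function name, gains, and process constants are all assumptions, not flight-ready control code:

```python
# Minimal sketch of the sensor -> controller -> actuator loop: a
# proportional controller regulating a simulated first-order thermal
# process. All names and constants are illustrative.

def run_control_loop(setpoint, steps=100, kp=0.5, dt=1.0):
    temperature = 0.0              # sensor reading (simulated process state)
    history = []
    for _ in range(steps):
        error = setpoint - temperature        # controller: compare to setpoint
        heater_power = max(0.0, kp * error)   # actuator command (clamped)
        # simulated process: heating minus passive losses
        temperature += dt * (heater_power - 0.1 * temperature)
        history.append(temperature)
    return history

history = run_control_loop(setpoint=20.0)
# the loop settles where heating balances losses, below the setpoint
```

Note that a purely proportional controller settles below the setpoint (steady-state offset), which is one reason practical process controllers add integral action.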
Opaque Service Virtualisation: A Practical Tool for Emulating Endpoint Systems
Large enterprise software systems make many complex interactions with other services in their environment. Developing and testing for production-like conditions is therefore a very challenging task. Current approaches include emulation of dependent services using either explicit modelling or record-and-replay approaches. Models require deep knowledge of the target services, while record-and-replay is limited in accuracy. Both face developmental and scaling issues. We present a new technique that improves the accuracy of record-and-replay approaches without requiring prior knowledge of the service protocols. The approach uses Multiple Sequence Alignment to derive message prototypes from recorded system interactions, and a scheme to match incoming request messages against prototypes to generate response messages. We use a modified Needleman-Wunsch algorithm for distance calculation during message matching. Our approach has shown greater than 99% accuracy for the four evaluated enterprise system messaging protocols, and has been successfully integrated into the CA Service Virtualization commercial product to complement its existing techniques.
Published in Proceedings of the 38th International Conference on Software Engineering Companion (pp. 202-211).
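The matching step can be illustrated with a small sketch. This is not the paper's implementation: the scoring weights below are assumptions that reduce the alignment score to plain edit distance, and the prototype strings are invented:

```python
# Hedged sketch of Needleman-Wunsch global alignment used as a distance
# measure for matching an incoming request against recorded message
# prototypes. Scoring values are illustrative (they yield edit distance).

def nw_distance(a, b, match=0, mismatch=1, gap=1):
    """Lower score = more similar (edit-distance-style weights)."""
    n, m = len(a), len(b)
    # dp[i][j] = best score aligning a[:i] with b[:j]
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        dp[i][0] = i * gap
    for j in range(1, m + 1):
        dp[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            sub = dp[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            dp[i][j] = min(sub, dp[i - 1][j] + gap, dp[i][j - 1] + gap)
    return dp[n][m]

def best_prototype(request, prototypes):
    """Pick the prototype with the smallest alignment distance."""
    return min(prototypes, key=lambda p: nw_distance(request, p))

prototypes = ["GET /user/{id}", "DELETE /user/{id}", "GET /orders"]
print(best_prototype("GET /user/42", prototypes))  # matches "GET /user/{id}"
```

The matched prototype then determines which recorded response template is used to synthesize a reply.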
Software Graphics Processing Unit (sGPU) for Deep Space Applications
A graphics processing capability will be required for deep space missions and must support a range of applications, from safety-critical vehicle health status to telemedicine for crew health. However, preliminary radiation testing of commercial graphics processing cards suggests they cannot operate in the deep space radiation environment. Investigation into a Software Graphics Processing Unit (sGPU), comprised of commercial-equivalent radiation-hardened/tolerant single-board computers, field-programmable gate arrays, and safety-critical display software, shows promising results. Preliminary performance of approximately 30 frames per second (FPS) has been achieved. Use of multi-core processors may provide a significant increase in performance.
Design of Simulator for Alarm Modules
For easy, comprehensive testing and evaluation of error conditions in a real-time environment, a software-based simulator is now commonly used to generate those conditions and test them before the design is implemented on a commercial scale. The inputs are assimilated, conditioned, and processed in a testing environment according to a set of parameters specified by the designer or the user of the application. The objective of the simulator is to enable an uncomplicated migration from a hardware-based system to a software-based one: concurrent processes and evaluations are carried out beforehand to understand and improve the design, and the resulting implementations are then transferred to the hardware where the core resides and performs the key task management. To the testing unit, it is indistinguishable whether an actual system is performing the operations and implementing the tasks, or a simulator is mimicking the hardware and conducting the same tasks in software. This can be understood by analogy with a mobile phone simulated on a desktop: the end user operates the phone on the computer with all the features that would be available on the actual device. The simulator thus assists in testing, in checking the efficiency of the system, and in the study of complex hardware systems.
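The idea of a software stand-in for the hardware can be sketched as follows; the class name, signal names, and alarm thresholds are assumptions, not taken from the paper:

```python
# Illustrative sketch: a software stand-in for an alarm module, so that
# error conditions can be generated and tested before any hardware
# exists. Signal names and limits are assumptions.

class AlarmModuleSimulator:
    """Mimics a hardware alarm module: raises alarms when inputs
    leave their configured ranges."""

    def __init__(self, limits):
        self.limits = limits          # {signal: (low, high)}
        self.active_alarms = set()

    def process(self, readings):
        """Condition the inputs and evaluate the alarm conditions."""
        for signal, value in readings.items():
            low, high = self.limits[signal]
            if value < low or value > high:
                self.active_alarms.add(signal)
            else:
                self.active_alarms.discard(signal)
        return sorted(self.active_alarms)

sim = AlarmModuleSimulator({"temperature": (0, 80), "pressure": (1, 5)})
print(sim.process({"temperature": 95, "pressure": 3}))  # ['temperature']
```

A test harness can drive this class exactly as it would drive the eventual hardware interface, which is what makes the later migration straightforward.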
Software platform virtualization in chemistry research and university teaching
<p>Abstract</p> <p>Background</p> <p>Modern chemistry laboratories operate with a wide range of software applications under different operating systems, such as Windows, Linux or Mac OS X. Instead of installing software on different computers, it is possible to install those applications on a single computer using virtual machine software. Software platform virtualization allows a single host operating system to execute multiple guest operating systems on the same computer. We apply and discuss the use of virtual machines in chemistry research and teaching laboratories.</p> <p>Results</p> <p>Virtual machines are commonly used for cheminformatics software development and testing. By benchmarking multiple chemistry software packages, we have confirmed that the computational speed penalty for using virtual machines is low, at around 5% to 10%. Software virtualization in a teaching environment allows faster deployment and easy use of commercial and open-source software in hands-on computer teaching labs.</p> <p>Conclusion</p> <p>Software virtualization in chemistry, mass spectrometry and cheminformatics is needed for software testing and for developing software for different operating systems. To obtain maximum performance, the virtualization software should be multi-core enabled and allow the use of multiprocessor configurations in the virtual machine environment. Server consolidation, by running multiple tasks and operating systems on a single physical machine, can lead to lower maintenance and hardware costs, especially in small research labs. Virtual machines can also prevent software virus infections and security breaches when used as a sandbox system for internet access and software testing. Complex software setups can be created with virtual machines and easily deployed later to multiple computers for hands-on teaching classes. We discuss the popularity of bioinformatics compared to cheminformatics, as well as the missing cheminformatics education at universities worldwide.</p>
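The reported 5-10% penalty is simply the relative slowdown of the virtualized run. A minimal harness for measuring it might look like this; the workload and the example timings are placeholders, not the paper's benchmarks:

```python
# Sketch of measuring the virtualization speed penalty: run the same
# workload natively and inside the VM, then compare best wall-clock
# times. Workload and timings here are placeholders.

import time

def benchmark(workload, repeats=3):
    """Return the best wall-clock time over several runs."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        workload()
        best = min(best, time.perf_counter() - start)
    return best

def speed_penalty(native_seconds, vm_seconds):
    """Penalty as a percentage relative to native execution."""
    return 100.0 * (vm_seconds - native_seconds) / native_seconds

# Example: 120 s natively vs 129 s in the VM -> 7.5% penalty,
# inside the 5-10% band reported above.
print(speed_penalty(120.0, 129.0))
```

Taking the best of several runs reduces noise from caching and background activity, which matters when the effect being measured is only a few percent.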
Test case prioritization approaches in regression testing: A systematic literature review
Context: Software quality can be assured through the software testing process. However, the testing phase is expensive, as it consumes considerable time. By scheduling the execution order of test cases through a prioritization approach, software testing efficiency can be improved, especially during regression testing. Objective: Test case prioritization (TCP) is a notable step in constructing an effective software testing environment and can increase a system's commercial value. The main aim of this review is to examine and classify current test case prioritization approaches based on the articulated research questions. Method: A set of search keywords, applied to appropriate repositories, was used to extract the studies that fulfilled all the defined criteria, classified under journal, conference, symposium and workshop categories. 69 primary studies were selected by the review strategy. Results: The primary studies comprised 40 journal articles, 21 conference papers, three workshop articles, and five symposium articles. TCP approaches remain broadly open for improvement: each approach has specific potential value, advantages, and limitations. Additionally, we found that variations in the starting point of the TCP process among the approaches yield different timelines and benefits, allowing a project manager to choose the approach that suits the project schedule and available resources. Conclusion: Test case prioritization has already been considerably discussed in the software testing domain. However, a number of existing prioritization techniques can still be improved, especially in the data used and in the execution process of each approach.
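As a concrete example of one family of approaches covered by such reviews, greedy "additional coverage" prioritization repeatedly selects the test that covers the most not-yet-covered elements. The test names and coverage sets below are invented for illustration:

```python
# Sketch of greedy "additional coverage" test case prioritization:
# repeatedly pick the test adding the most uncovered elements.
# Test names and coverage data are illustrative.

def prioritize(coverage):
    """coverage: {test_name: set of covered code elements}.
    Returns tests ordered by additional coverage gained."""
    remaining = dict(coverage)
    covered = set()
    order = []
    while remaining:
        # pick the test adding the most uncovered elements
        # (ties broken alphabetically for determinism)
        best = max(sorted(remaining),
                   key=lambda t: len(remaining[t] - covered))
        order.append(best)
        covered |= remaining.pop(best)
    return order

cov = {
    "t1": {"a", "b"},
    "t2": {"b", "c", "d"},
    "t3": {"a", "e"},
}
print(prioritize(cov))  # t2 first: it covers the most elements
```

Running the highest-value tests first means that, if the regression run is cut short, the executed prefix has already exercised as much of the system as possible.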
Model-based Testing: Next Generation Functional Software Testing
The idea of model-based testing is to use an explicit abstract model of a SUT and its environment to automatically derive tests for the SUT: the behavior of the model of the SUT is interpreted as the intended behavior of the SUT. The technology of automated model-based test case generation has matured to the point where large-scale deployments of this technology are becoming commonplace. The prerequisites for success, such as qualification of the test team, integrated tool chain availability and methods, are now identified, and a wide range of commercial and open-source tools are available.
Although MBT will not solve all testing problems, it is an important and useful technique, which brings significant progress over the state of the practice for functional software testing effectiveness, and can increase productivity and improve functional coverage.
In this talk, we will address the current trend of deploying MBT in industry, particularly in the TCoE (Test Center of Excellence) managed by the big system integrators, as a vector for software testing "industrialization".
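The core MBT idea, deriving tests automatically from an explicit model of the SUT, can be sketched with a toy finite-state model. Real MBT tools use far richer models and selection criteria; the login workflow and transition names here are assumptions:

```python
# Sketch of model-based test generation: an explicit finite-state model
# of the SUT, from which abstract test sequences are enumerated. The
# toy login-workflow model is an assumption for illustration.

def derive_tests(model, start, depth):
    """Enumerate all action sequences of up to `depth` transitions;
    each sequence is one abstract test case."""
    tests = []

    def walk(state, path):
        if path:
            tests.append(path)
        if len(path) == depth:
            return
        for action, nxt in model.get(state, {}).items():
            walk(nxt, path + [action])

    walk(start, [])
    return tests

model = {
    "logged_out": {"login_ok": "logged_in", "login_bad": "logged_out"},
    "logged_in": {"logout": "logged_out"},
}
tests = derive_tests(model, "logged_out", depth=2)
# each derived sequence, e.g. ["login_ok", "logout"], becomes a concrete
# test whose expected behavior is read off the model
```

Because the model's behavior is interpreted as the intended behavior of the SUT, the same model that generates the input sequences also supplies the test oracles.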
Development of a COTS-Based Propulsion System Controller for NASA’s Lunar Flashlight CubeSat Mission
The Lunar Flashlight mission is designed to send a 6U CubeSat into lunar orbit with the aim of finding water-ice deposits at the lunar south pole. The Glenn Lightsey Research Group (GLRG) within Georgia Tech’s Space Systems Design Laboratory (SSDL) is developing a low-cost propulsion system controller for this satellite using commercial-off-the-shelf (COTS) parts, with an emphasis on overcoming the harsh environment of lunar orbit through careful architecture and testing. This paper provides in-depth coverage of the Lunar Flashlight Propulsion System (LFPS) controller development and testing processes, showing how an embedded system based on COTS parts can be designed for the intense environment of space. From the high-level requirements and architecture to the selection of specific hardware components and software design choices, followed by rigorous environmental testing of the design, radiation and other environmental hardening can be achieved with high confidence.
Investigation into an improved modular rule-based testing framework for business rules
Rule testing in scheduling applications is a complex and potentially costly business problem. This thesis reports the outcome of research undertaken to develop a system to describe and test scheduling rules against a set of scheduling data. The overall intention of the research was to reduce commercial scheduling costs by minimizing human domain expert interaction within the scheduling process.
This thesis reports the outcome of research initiated following a consultancy project to develop a system to test driver schedules against the legal driving rules in force in the UK and the EU. One of the greatest challenges faced was interpreting the driving rules and translating them into the chosen programming language. This part of the project took considerable effort to complete the programming, testing and debugging processes. A potential problem then arises if the Department of Transport or the European Union alter or change the driving rules. Considerable software development is likely to be required to support the new rule set.
The approach considered takes into account the need for a modular software component that can be used in not just transport scheduling systems which look at legal driving rules but may also be integrated into other systems that have the need to test temporal rules. The integration of the rule testing component into existing systems is key to making the proposed solution reusable.
The research outcome proposes an alternative approach to rule definition, similar to that of RuleML, but with the addition of rule metadata to provide the ability of describing rules of a temporal nature. The rules can be serialised and deserialised between XML (eXtensible Markup Language) and objects within an object oriented environment (in this case .NET with C#), to provide a means of transmission of the rules over a communication infrastructure. The rule objects can then be compiled into an executable software library, allowing the rules to be tested more rapidly than traditional interpreted rules. Additional support functionality is also defined to provide a means of effectively integrating the rule testing engine into existing applications.
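The thesis implements this approach in .NET with C#; the same round-trip, a rule with temporal metadata serialized to XML, deserialized back to an object, and then evaluated against schedule data, can be sketched in Python. The XML schema, element names, and the 56-hour rolling limit used below are illustrative, not the thesis's actual schema:

```python
# Sketch of the rule round-trip: temporal rule -> XML -> object ->
# evaluation against schedule data. Schema and names are assumptions.

import xml.etree.ElementTree as ET

def rule_to_xml(rule):
    el = ET.Element("rule", id=rule["id"], kind="temporal")
    ET.SubElement(el, "description").text = rule["description"]
    limit = ET.SubElement(el, "maxHours", window=str(rule["window_days"]))
    limit.text = str(rule["max_hours"])
    return ET.tostring(el, encoding="unicode")

def xml_to_rule(xml_text):
    el = ET.fromstring(xml_text)
    limit = el.find("maxHours")
    return {"id": el.get("id"),
            "description": el.findtext("description"),
            "window_days": int(limit.get("window")),
            "max_hours": float(limit.text)}

def check(rule, daily_hours):
    """Pass only if every rolling window respects the rule's limit."""
    w = rule["window_days"]
    return all(sum(daily_hours[i:i + w]) <= rule["max_hours"]
               for i in range(len(daily_hours) - w + 1))

xml_text = rule_to_xml({"id": "weekly-driving", "window_days": 7,
                        "max_hours": 56,
                        "description": "Rolling 7-day driving limit"})
rule = xml_to_rule(xml_text)                 # round-trip through XML
print(check(rule, [9, 9, 9, 9, 9, 9, 4]))   # 58 h in 7 days -> False
```

Because the rule travels as XML, a regulator-driven change to the limit becomes a data update rather than the software redevelopment described above; the thesis additionally compiles deserialized rules into an executable library for speed.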
Following the construction of a rule testing engine designed to meet the given requirements, a series of tests was undertaken to determine the effectiveness of the proposed approach. This led to the implementation of improvements in the caching of constructed work plans to further improve performance. Tests were also carried out on the application of the proposed solution within alternative scheduling domains, and on the differences in computational performance and memory usage across system architectures, software frameworks and operating systems, with the support of Mono.
Future work following on from this thesis is expected to include the development of graphical design tools for creating rules, improvements to the work plan construction algorithm, parallelisation of elements of the process to take better advantage of multi-core processors, and off-loading of the rule testing process onto dedicated or generic computational processors.