Improved Spectrum Mobility using Virtual Reservation in Collaborative Cognitive Radio Networks
Cognitive radio technology enables a set of secondary users (SUs) to opportunistically use spectrum licensed to a primary user (PU). When a PU appears on a specific frequency band, any SU occupying that band must vacate it for the PU. Typically, SUs may collaborate both to reduce the impact of cognitive users on the primary network and to improve their own performance. In this paper, we propose and analyze the performance of virtual reservation in collaborative cognitive networks. Virtual reservation is a novel link-maintenance strategy that aims to maximize the throughput of the cognitive network through full spectrum utilization. Our performance evaluation shows significant improvements not only in the SUs' blocking and forced-termination probabilities but also in the throughput of cognitive users.
Comment: 7 pages, 10 figures, IEEE ISCC 201
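The trade-off the abstract describes, between blocking new SUs and force-terminating preempted ones, can be illustrated with a toy Monte Carlo sketch. The channel model, event probabilities, and reservation mechanics below are invented for illustration; they are not the paper's analytical model.

```python
import random

def simulate(channels=5, reserve=1, n_events=10_000, pu_prob=0.3, seed=42):
    """Toy Monte Carlo of SU admission in a licensed band.

    Each event is either a PU arrival (probability pu_prob), which
    preempts one SU-occupied channel, or an SU arrival, which is
    admitted only if a channel is free.  'reserve' channels are held
    back as virtual backups: a preempted SU hands off there instead of
    being force-terminated.  All parameters are illustrative.
    """
    rng = random.Random(seed)
    su_active = 0                     # SUs currently transmitting
    blocked = forced = admitted = 0
    usable = channels - reserve       # channels SUs may occupy directly
    backup_busy = 0
    for _ in range(n_events):
        if rng.random() < pu_prob:            # PU appears on a busy band
            if su_active > 0:
                su_active -= 1
                if backup_busy < reserve:     # hand off to reserved channel
                    backup_busy += 1
                else:
                    forced += 1               # no backup left: dropped
        else:                                 # new SU request
            if su_active < usable:
                su_active += 1
                admitted += 1
            else:
                blocked += 1
        # reserved channels free up as handed-off SUs finish
        if backup_busy > 0 and rng.random() < 0.5:
            backup_busy -= 1
    return blocked, forced, admitted
```

Comparing `reserve=0` against `reserve=1` in this sketch shows the forced-termination count dropping when a backup channel is held, which is the qualitative effect the paper evaluates.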
Software Engineering Process Simulation (SEPS) model
The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among software life-cycle development activities and management decision-making processes. The model is designed as a planning tool to examine trade-offs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS enables software managers to gain a better understanding of the dynamics of software project development and to perform postmortem assessments.
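The feedback structure such system-dynamics models capture can be sketched with a minimal stock-and-flow loop. This is not the SEPS model itself: the single productivity/communication feedback and all constants below are invented to illustrate the modelling style.

```python
def simulate_project(tasks=100.0, staff=5.0, months=24, dt=0.25,
                     base_rate=1.0, comm_penalty=0.02):
    """Minimal system-dynamics sketch of a software project (not the
    actual SEPS model).  Completed work accumulates at a rate that
    falls as staff grows (communication overhead), illustrating one
    feedback loop such models capture.  All constants are invented.
    """
    done, t = 0.0, 0.0
    history = []
    while t < months and done < tasks:
        # productivity per person drops with pairwise communication links
        links = staff * (staff - 1) / 2
        rate = staff * base_rate * max(0.0, 1.0 - comm_penalty * links)
        done = min(tasks, done + rate * dt)   # Euler integration of the stock
        t += dt
        history.append((t, done))
    return t, done, history
```

With these invented constants a 5-person team makes steady progress while a 12-person team is swamped by coordination overhead, a Brooks's-law-style outcome of the feedback loop rather than a prediction of the real model.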
The dynamics of software development project management: An integrative systems dynamic perspective
Rather than continuing to focus on software development projects per se, the system dynamics modeling approach outlined is extended to investigate a broader set of issues pertaining to the software development organization. Rather than tracing the life cycle(s) of one or more software projects, the focus is on the operations of a software development department as a continuous stream of software products is developed, placed into operation, and maintained. A number of research questions are "ripe" for investigation, including: (1) the efficacy of different organizational structures in different software development environments; (2) personnel turnover; (3) the impact of management approaches such as management by objectives; and (4) the organizational/environmental determinants of productivity.
Modelling of cardiac hemodynamics: A case study
Observations made on patients under cardiac catheterization are used to develop validated models for the heart cavities and the main blood vessels, treating them as compartments of the cardiac system. The algorithm utilizes realistic nonlinear formulations and least-squares techniques for optimal parameter estimation. A comprehensive investigation into modelling the hemodynamics of one of the compartments is reported as a case study. The modelling procedure is broad-based in character and may be used advantageously as an aid in the diagnosis of heart diseases.
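The least-squares compartment-fitting idea can be illustrated on the simplest possible case: a single-compartment (Windkessel-style) pressure decay fitted in closed form. The model form and data below are illustrative assumptions; the paper fits richer nonlinear formulations.

```python
import math

def fit_decay(times, pressures):
    """Least-squares fit of a one-compartment pressure decay
    P(t) = P0 * exp(-t / tau), tau = R*C, obtained by taking logs
    and solving the linear normal equations in closed form.
    Illustrative only; the study uses nonlinear formulations.
    """
    ys = [math.log(p) for p in pressures]   # log-linearise the model
    n = len(times)
    sx, sy = sum(times), sum(ys)
    sxx = sum(x * x for x in times)
    sxy = sum(x * y for x, y in zip(times, ys))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return math.exp(intercept), -1.0 / slope   # estimated P0 and tau
```

On noiseless synthetic data the fit recovers the generating parameters exactly, which is a useful sanity check before moving to catheterization measurements.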
Using Dynamic Modeling and Simulation to Improve the COTS Software Process
In the last several years, the software industry has undergone a significant transition to the use of existing component products in building systems. Nowadays, more and more solutions are built by integrating Commercial-Off-The-Shelf (COTS) products rather than by building from scratch. This approach to software development has specific features that introduce new factors which must be taken into account for development to succeed. In this paper, we present the first results of a dynamic simulation model of the COTS-based software development process, built to help understand the specific features of this kind of development and to design and evaluate software process improvements. An example shows how these dynamic simulation models can be used to study how the starting point of system integration affects the main project variables.
CICYT TIC2001-1143-C03-0
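The kind of experiment the example describes, varying when system integration starts and observing the effect on project outcomes, can be caricatured in a few lines. The defect counts and fix costs below are invented; the paper's simulation model is far richer.

```python
def project_duration(integration_start, dev_months=12.0,
                     defects=30, fix_early=0.5, fix_late=1.5):
    """Toy model of the integration-start experiment: COTS glue-code
    defects found before the planned development end cost fix_early
    person-months each; those discovered afterwards cost fix_late.
    integration_start is the month integration testing begins.
    All numbers are invented for illustration.
    """
    # defects surface uniformly over the in-schedule testing window
    months_testing_before_end = max(0.0, dev_months - integration_start)
    frac_early = min(1.0, months_testing_before_end / dev_months)
    early = defects * frac_early
    late = defects - early
    return dev_months + early * fix_early + late * fix_late
```

Even this caricature reproduces the qualitative question the model studies: starting integration in month 2 yields a shorter total duration than deferring it to month 10.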
Histological, immunohistochemical and ultrastructural study of secondary compressed spinal cord injury in a rat model
Introduction. Spinal cord injury (SCI) is a life-disrupting condition in which the first few days are the most critical. Secondary conditions remain the main causes of death for people with SCI. The responses of different cell types to SCI, and their roles at different times in the progression of secondary degeneration, are not well understood. The aim of this study was to examine the histopathological changes of compressed spinal cord injury (CSCI) in a rat model.
Material and methods. Forty adult male Sprague-Dawley rats were divided into four groups. In group I, the rats were left without any surgical intervention (control). In group II, the rats were subjected to laminectomy without spinal cord compression (sham-operated). In group III, the rats were sacrificed one day after CSCI. In group IV, the rats were sacrificed seven days after CSCI. Light microscopy was employed to study the morphology using H&E and osmic acid staining, and immunohistochemistry was used to detect glial fibrillary acidic protein (GFAP). Electron microscopy was applied for the ultrastructural study.
Results. Histopathological examination of the posterior funiculus of the white matter revealed minute hemorrhages and localized necrotic areas on day 1, which transformed into areas of cavitation and fibrinoid necrosis surrounded by a demarcating rim of numerous astrocytes by day 7. The mean percentage area of GFAP expression increased significantly by day 7. Osmic acid staining revealed swollen nerve fibers after one day, while numerous fibers had been lost by day 7. The ultrastructural study revealed swollen, redundant, thinned myelin and myelin splitting, as well as degeneration of the axoplasm, on day 1. On day 7, layers of the myelin sheath were folded and wrinkled, with areas of partial or complete demyelination. The myelin lamellae were disorganized and loose. The G-ratio was significantly greater on day 1 than on day 7 after CSCI.
Conclusions. In the rat model of CSCI, details of the progressive spinal cord injury can be analyzed by morphological methods, which may be helpful in identifying the onset and type of clinical intervention.
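The G-ratio reported in the results is conventionally the inner (axon) diameter divided by the outer fibre diameter including the myelin sheath; the study's exact measurement protocol is not given, so the helper below is only the standard textbook definition.

```python
def g_ratio(axon_diameter_um, fiber_diameter_um):
    """G-ratio of a myelinated fibre: inner axon diameter divided by
    the outer diameter of the fibre including its myelin sheath.
    A higher ratio indicates a relatively thinner myelin sheath,
    consistent with the demyelination reported in the results.
    """
    if fiber_diameter_um <= 0 or axon_diameter_um > fiber_diameter_um:
        raise ValueError("fibre diameter must be positive and "
                         "at least the axon diameter")
    return axon_diameter_um / fiber_diameter_um
```

For example, a 6 µm axon inside a 10 µm fibre gives a G-ratio of 0.6, within the range usually cited for healthy myelination.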
A Survey on IP Watermarking Techniques
Intellectual property (IP) block reuse is essential for facilitating the design process of systems-on-a-chip. Sharing IP designs, however, poses significant security risks. Recently, digital watermarking has emerged as a candidate solution for copyright protection of IP blocks. In this paper, we survey and classify the different techniques used for watermarking IP designs. To this end, we define several evaluation criteria, which can also be used as a benchmark for new IP watermarking developments. Furthermore, we establish a comprehensive set of requirements for future IP watermarking techniques.
Verifying a synthesized implementation of IEEE-754 floating-point exponential function using HOL
Deep datapaths and algorithmic complexity have made the verification of floating-point units a very hard task. Most simulation and reachability-analysis verification tools fail on circuits with datapaths as deep as those of most industrial floating-point units. Theorem proving, however, offers a better way to handle such verification. In this paper, we hierarchically formalize and verify a hardware implementation of the IEEE-754 table-driven floating-point exponential function algorithm using the higher-order logic (HOL) theorem prover. The high level of abstraction in the HOL verification system allows it to be used across the whole design path of the circuit, from the gate-level implementation up to a high-level mathematical specification.
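The table-driven exponential scheme being verified can be sketched in software: reduce the argument against multiples of ln 2/32, look up a precomputed power of two, and approximate the small residual with a short polynomial. This is a double-precision illustration of the algorithm family (after Tang's table-driven method), not the verified RTL, and the polynomial degree and table size here are choices made for the sketch.

```python
import math

LN2_32 = math.log(2.0) / 32.0
TABLE = [2.0 ** (j / 32.0) for j in range(32)]   # precomputed 2**(j/32)

def exp_table_driven(x):
    """Sketch of a table-driven exponential: write
    x = (32*m + j) * ln2/32 + r with |r| <= ln2/64, so that
    e**x = 2**m * 2**(j/32) * e**r, then approximate e**r by a short
    Taylor polynomial.  Double precision only, for illustration.
    """
    n = round(x / LN2_32)          # nearest multiple of ln2/32
    m, j = divmod(int(n), 32)      # exponent part and table index
    r = x - n * LN2_32             # small residual, |r| <= ln2/64
    # degree-4 Taylor polynomial is ample for |r| ~ 0.011
    p = 1.0 + r * (1.0 + r * (0.5 + r * (1/6 + r * (1/24))))
    return math.ldexp(TABLE[j], m) * p
```

The splitting into range reduction, table lookup, and polynomial evaluation is exactly the kind of hierarchical structure that lends itself to the layered formalization the paper performs in HOL.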
Effort estimation of FLOSS projects: A study of the Linux kernel
This is the post-print version of the article; copyright © 2011 Springer.
Empirical research on Free/Libre/Open Source Software (FLOSS) has shown that developers tend to cluster around two main roles: "core" contributors differ from "peripheral" developers in having a larger number of responsibilities and a higher productivity pattern. A further, cross-cutting characterization of developers can be achieved by associating developers with "time slots", and different patterns of activity and effort can be associated with such slots. Such analysis, if replicated, could be used not only to compare different FLOSS communities and to evaluate their stability and maturity, but also to determine how effort is distributed within projects over a given period, and to estimate future needs with respect to key points in the software life cycle (e.g., major releases). This study analyses the activity patterns within the Linux kernel project, first focusing on the overall distribution of effort and activity within weeks and days, then dividing each day into three 8-hour time slots and focusing on effort and activity around major releases. These analyses aim to evaluate effort, productivity, and types of activity both globally and around major releases. They enable a comparison of these releases and patterns of effort and activity with traditional software products and processes, and in turn the identification of company-driven projects (i.e., those working mainly during office hours) among FLOSS endeavors. The results of this research show that, overall, effort within the Linux kernel community is constant (albeit at different levels) throughout the week, signalling the need for updated estimation models, different from those used in traditional 9am–5pm, Monday-to-Friday commercial companies. It also becomes evident that activity before a release differs vastly from that after a release, and that the changes show an increase in code complexity in specific time slots (notably the late-night hours), which will later require additional maintenance effort.
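The core bookkeeping of such a study, bucketing commit timestamps into three 8-hour slots and counting activity per slot, is straightforward. The slot boundaries below (office 08-16, evening 16-24, night 00-08) are an assumption; the paper's exact cut-offs may differ.

```python
from datetime import datetime

def slot_of(ts):
    """Assign a commit timestamp to one of three 8-hour slots:
    'office' (08:00-16:00), 'evening' (16:00-24:00), 'night'
    (00:00-08:00).  Boundaries are assumed, not taken from the paper.
    """
    hour = ts.hour
    if 8 <= hour < 16:
        return "office"
    if 16 <= hour < 24:
        return "evening"
    return "night"

def effort_by_slot(timestamps):
    """Count commits per slot as a crude proxy for activity/effort."""
    counts = {"office": 0, "evening": 0, "night": 0}
    for ts in timestamps:
        counts[slot_of(ts)] += 1
    return counts
```

Run over a project's commit log, a strongly office-dominated histogram would flag a company-driven project, while the flatter distribution the study reports for the Linux kernel indicates a genuinely distributed community.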