110 research outputs found

    Regolith behavior under asteroid-level gravity conditions: low-velocity impact experiments

    The dusty regolith covering the surfaces of asteroids and planetary satellites differs in size, shape, and composition from terrestrial soil particles and is subject to very different environmental conditions. Experimental studies of the response of planetary regolith under the relevant environmental conditions are therefore necessary to support future Solar System exploration activities. We combined the results of, and provide new data analysis for, a series of impact experiments into simulated planetary regolith in low-gravity conditions using two experimental setups: the Physics of Regolith Impacts in Microgravity Experiment (PRIME) and the COLLisions Into Dust Experiment (COLLIDE). These experimental campaigns found a significant change in regolith behavior with the gravity environment. In a 10^-2 g environment (lunar gravity levels), only embedding of the impactor was observed, and ejecta was produced for most impacts at speeds > 20 cm/s. At microgravity levels (< 10^-4 g), the lowest impact energies also produced impactor rebound, and ejecta began to be produced for impacts at > 10 cm/s. The measured ejecta speeds were lower than those measured at reduced-gravity levels, but the ejected masses were higher. The mean ejecta velocity shows a power-law dependence on the impact energy with an index of ~0.7. When projectile rebound occurred, we observed that its coefficient of restitution on the bed of regolith simulant decreased by a factor of 10 as impact speed increased from ~5 cm/s to 100 cm/s. We also observed increased cohesion between the JSC-1 grains compared to the quartz sand targets.
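    A power-law dependence such as the reported mean-ejecta-velocity index of ~0.7 is typically recovered from measurements by a linear fit in log-log space. The sketch below shows that procedure with NumPy on synthetic, noiseless data (the data and variable names are illustrative, not the experimental measurements):

```python
import numpy as np

def fit_power_law(energy, velocity):
    """Fit v = A * E**k by linear regression in log-log space.

    Returns (A, k): the prefactor and the power-law index.
    """
    log_e = np.log(np.asarray(energy, dtype=float))
    log_v = np.log(np.asarray(velocity, dtype=float))
    k, log_a = np.polyfit(log_e, log_v, 1)  # slope = index, intercept = log A
    return np.exp(log_a), k

# Illustrative synthetic data generated with index 0.7
E = np.array([1.0, 2.0, 5.0, 10.0, 20.0])
v = 3.0 * E**0.7
A, k = fit_power_law(E, v)
```

    On real, noisy measurements the same fit returns the least-squares estimate of the index rather than the exact generating value.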

    Deployment of Heterogeneous Swarm Robotic Agents Using a Task-Oriented Utility-Based Algorithm

    In a swarm robotic system, the desired collective behavior emerges from local decisions made by the robots themselves according to their environment. Swarm robotics is an emerging area that has attracted many researchers over the last few years. It has been shown that a single robot with multiple capabilities cannot complete an intended job within the same time frame as multiple robotic agents. A swarm of robots, each with its own capabilities, is more flexible, robust, and cost-effective than an individual robot. Based on a comprehensive investigation of the current state of swarm robotic research, this dissertation demonstrates that current swarm deployment systems lack the ability to coordinate heterogeneous robotic agents. Moreover, this dissertation defines the starting point for algorithms that lead to the development of a new software environment interface. This interface assigns a set of collaborative tasks to the swarm system without concern for the underlying hardware of the heterogeneous robotic agents. The ultimate goal of this research is to develop a task-oriented software application that facilitates the rapid deployment of multiple robotic agents. The task solutions are created at run-time and executed by the agents in a centralized or decentralized fashion. Tasks are divided into smaller sub-tasks, which are then assigned to the optimal number of robots using a novel Robot Utility Based Task Assignment (RUTA) algorithm. The system deploys these robots using its application programming interfaces (APIs) and uploads programs that are integrated with a small routine code. The embedded routine allows robots to configure solutions when the decentralized approach is adopted. In addition, the proposed application offers customization of robotic platforms by simply defining the available sensing and actuation devices.
    Another objective of the system is to improve code and component reusability, reducing the effort of deploying tasks to swarm robotic agents. Use of the proposed framework avoids the need to redesign or rewrite programs should any changes take place in the robot's platform.
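    The core idea of utility-based task assignment for heterogeneous agents can be sketched as follows. This is a minimal illustrative stand-in, not the RUTA algorithm itself: here the utility is simply the fraction of a sub-task's required capabilities a robot provides, and each sub-task goes greedily to the idle robot with the highest utility.

```python
def assign_tasks(subtasks, robots):
    """Greedy utility-based assignment (illustrative sketch only;
    the actual RUTA scoring is defined in the dissertation).

    subtasks: {task_name: set of required capabilities}
    robots:   {robot_name: set of available capabilities}
    """
    assignment = {}
    free = set(robots)
    for task, needs in subtasks.items():
        best, best_utility = None, -1.0
        for robot in sorted(free):  # deterministic tie-breaking by name
            utility = len(needs & robots[robot]) / len(needs)
            if utility > best_utility:
                best, best_utility = robot, utility
        assignment[task] = best
        free.discard(best)
    return assignment

# Hypothetical capability sets for two robots and two sub-tasks
team = assign_tasks(
    {"map": {"lidar"}, "paint": {"arm", "sprayer"}},
    {"r1": {"lidar", "camera"}, "r2": {"arm", "sprayer"}},
)
```

    A real deployment would also weigh cost, battery, and travel distance in the utility, and would allow several robots per sub-task.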

    UB Robot Swarm System: Design and Implementation

    In this poster we describe the hardware architecture of an inexpensive, heterogeneous, mobile robot swarm, designed and developed at the RISC lab, University of Bridgeport. Each robot in the UB swarm is equipped with sensors, actuators, control and communication units, a power supply, and an interconnection mechanism. Robot swarms have become a new research paradigm in the last ten years, offering novel approaches such as self-reconfigurability, self-assembly, self-replication, and self-learning. Developing a multi-agent robot system with heterogeneity and a larger behavioral repertoire is a great challenge. This robot swarm is capable of performing user-defined tasks such as wall painting, mapping, human rescue operations, task allocation, obstacle avoidance, and object transportation.

    UBSwarm: Design of a Software Environment to Deploy Multiple Decentralized Robots

    This article presents a high-level configuration and task assignment software package that distributes algorithms across a swarm of robots. The software allows the robots to operate in a swarm fashion. When the swarm robotic system adopts a decentralized approach, the desired collective behaviors emerge from local decisions made by the robots themselves according to their environment. Using its GUI, the proposed system lets the operator select among several available robot agents and assign the swarm a particular task from a set of available tasks.

    Hardware Architecture Review of Swarm Robotics System: Self-Reconfigurability, Self-Reassembly, and Self-Replication

    Swarm robotics is one of the most fascinating new research areas of recent decades, and one of the grand challenges of robotics is the design of swarm robots that are self-sufficient. This can be crucial for robots exposed to environments that are unstructured or not easily accessible to a human operator, such as the inside of a blood vessel, a collapsed building, the deep sea, or the surface of another planet. In this paper, we present a comprehensive study of the hardware architecture and several other important aspects of modular swarm robots, such as self-reconfigurability, self-replication, and self-assembly. The key factors in designing and building a group of swarm robots are cost and miniaturization, together with robustness, flexibility, and scalability. In robotic intelligence, self-assembly and self-reconfigurability are among the most important characteristics, as they can add capabilities and functionality to swarm robots. Simulation and model design for swarm robotics is highly complex and expensive, especially when attempting to model the behavior of large swarm robot groups.
    http://dx.doi.org/10.5402/2013/84960

    Soluble TWEAK and Cardiovascular Morbidity and Mortality in Chronic Kidney Disease Patients

    Introduction: Cardiovascular disease (CVD) is a major cause of morbidity and mortality in chronic kidney disease (CKD) patients. The aim of this study was to demonstrate the role of soluble tumor necrosis factor (TNF)-like weak inducer of apoptosis (sTWEAK) as a marker of cardiovascular morbidity and mortality in CKD patients. Methods: The study included 75 CKD patients classified according to eGFR into three groups: group 1 included 15 patients with stage-1 CKD, group 2 included 30 patients with stage-2 and stage-3 CKD, and group 3 included 30 patients with stage-4 and stage-5 CKD. The three groups were compared to 20 matched controls. Interleukin-6 (IL-6) and sTWEAK were measured using ELISA and chemiluminescent techniques, respectively. Carotid intima-media thickness (IMT) was also measured. Results: IL-6 showed a significant difference between patient groups and controls, being highest in stage-4 and stage-5 CKD patients and lowest in controls. Soluble TWEAK showed a significant difference between patient groups and controls, being lowest in stage-4 and stage-5 CKD patients and highest in controls. Soluble TWEAK level showed a significant negative correlation with IL-6 (r = -0.68; P < 0.01) and carotid IMT (r = -0.95; P < 0.01). After two years of follow-up, nine of the 75 CKD patients developed ischemic heart disease (IHD), two developed cerebrovascular stroke, and one developed peripheral arterial disease. These patients had significantly lower baseline levels of sTWEAK compared to the other patients (160.5 ± 60.2 versus 274.8 ± 90 pg/mL; P < 0.05). Conclusion: Soluble TWEAK can serve as a novel biomarker of atherosclerosis and endothelial dysfunction, as well as of cardiovascular outcome, in CKD patients.

    Multi-modal palm-print and hand-vein biometric recognition at sensor level fusion

    When it is important to authenticate a person based on his or her biometric traits, most systems use a single modality (e.g., fingerprint or palm print) and perform further analysis at higher levels. Rather than fusing at those higher levels, this research recommends fusing two biometric features at the sensor level. The Log-Gabor filter is used to extract features and thereby recognize the pattern, because the data acquired from the images is sampled at various spacings. Using the two fused modalities, the proposed system attained greater accuracy. Principal component analysis (PCA) was performed to reduce the dimensionality of the data. To obtain the optimum performance, sensor-level fusion was evaluated with different classifiers, including K-nearest neighbors (K-NN) and support vector machines (SVMs). The system collects palm prints and veins from sensors and combines them into consolidated images that take up less disk space; the amount of memory needed to store such images, which is determined by the number of modalities fused, is thereby reduced.
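    Sensor-level fusion of two registered images can be illustrated with a simple pixel-wise weighted combination, so that a single consolidated image is stored instead of two. This is a minimal stand-in sketch; the paper's actual fusion scheme, registration, and Log-Gabor feature extraction are not reproduced here.

```python
import numpy as np

def fuse_at_sensor_level(palm, vein, w=0.5):
    """Pixel-wise weighted fusion of two registered grayscale images.

    Illustrative sketch only: stores one fused image in place of two,
    roughly halving storage for the two-modality case.
    """
    palm = np.asarray(palm, dtype=float)
    vein = np.asarray(vein, dtype=float)
    if palm.shape != vein.shape:
        raise ValueError("images must be registered to the same shape")
    return w * palm + (1.0 - w) * vein

# Hypothetical 2x2 grayscale patches standing in for palm and vein images
fused = fuse_at_sensor_level(np.full((2, 2), 100.0), np.full((2, 2), 200.0))
```

    Feature extraction (e.g., Log-Gabor filtering) and PCA would then operate on the single fused image rather than on each modality separately.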

    Cross asset resource allocation framework for pavement and bridges in Iowa

    With all the challenges facing U.S. departments of transportation (DOTs), especially the scarcity of resources and deteriorating infrastructure systems, DOTs have started to shift from separate per-asset decision-making strategies to a more comprehensive resource allocation approach. This shift also reflects the fact that the optimal allocation for each asset type separately is not the optimal allocation for all assets in the network. In Iowa specifically, about one quarter of the primary roadways fail to meet a sufficiency rating considered minimally acceptable, and the rural Interstate system was ranked 38th in the nation in 2010. The situation for bridges is no better: one of every five bridges in Iowa is rated structurally deficient, giving Iowa the third-worst record among the states. This research therefore proposes a new, simple, and applicable cross-asset resource allocation framework for pavements and bridges in Iowa, utilizing Pavement Management Information System (PMIS) and National Bridge Inventory (NBI) data. The objective function of this framework is to maximize the network monetary value by varying the proportions of the total budget allocated to each asset type, while the resulting budgets are allocated in a need-based approach across importance groups and on a worst-first basis within each importance group. The final output of this research is a simple MATLAB tool that allocates five years of funding across Interstate, U.S., and state pavements and bridges. The tool also provides a list of the pavement mileage and bridge deck area to be treated by each maintenance action at each budget level, and it compares the impact of different pavement and bridge valuation definitions on the solution that maximizes the network monetary value. The results show that the proposed framework is not sensitive to the valuation approach.
    They also show that at low budget levels most of the budget is allocated to pavements; this condition is reversed at moderate budget levels, and equal allocation is reached only at a very high total budget level, i.e., $1 billion.
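    The within-group "worst-first" rule described above can be sketched as a simple greedy pass over assets sorted by condition. This is an illustrative fragment only (the thesis tool is in MATLAB and also splits the budget across asset types and importance groups first); the asset tuples here are hypothetical.

```python
def worst_first(assets, budget):
    """Fund treatments in worst-first order within one importance group.

    assets: list of (name, condition, treatment_cost);
            a lower condition score means a worse asset.
    Returns the names of the assets funded within the budget.
    """
    funded = []
    for name, condition, cost in sorted(assets, key=lambda a: a[1]):
        if cost <= budget:  # skip assets the remaining budget cannot cover
            budget -= cost
            funded.append(name)
    return funded

# Hypothetical group: condition scores and treatment costs
funded = worst_first([("A", 80, 10), ("B", 30, 25), ("C", 55, 40)], budget=50)
```

    Note that the greedy pass skips an unaffordable asset and continues down the list, so a cheap, better-conditioned asset can still be funded from the remainder.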

    Perspectives of Social Barriers to Accessing Health Care Insurance among the Homeless Population

    Access to healthcare services and healthcare insurance has been considered problematic for many populations in the United States. Despite many efforts to solve these issues, there is a significant gap in research related to the perceptions of the population experiencing homelessness regarding the social barriers they face in accessing health insurance and health care. The behavioral-ecological framework best suited this study. In this qualitative descriptive study, the perceptions of a sample of 10 individuals experiencing homelessness, who agreed to 60-minute interviews, were investigated. Purposeful sampling was used to identify these individuals, aged 18 to 60 years, who met the U.S. Department of Housing and Urban Development's definition of homeless and had tried to access healthcare insurance within the last 12 months. Data were collected using semistructured interviews in a face-to-face setting. Using Braun and Clarke's six steps for thematic analysis, the transcripts from the interviews were coded and analyzed, extracting six themes that assisted in answering the posed research questions. These findings indicated that individuals experiencing homelessness (a) are unable to prioritize their health-related decisions, (b) struggle to interact successfully with healthcare providers, (c) have difficulty with healthcare follow-up and the scheduling of appointments, (d) are unable to acquire accessible assistance and resources, (e) hold negative attitudes and behaviors about their health, and (f) are not reached by the resources meant to assist them. The potential impact for positive social change involves increasing the availability of healthcare information and resources through public organizations for easy access.

    Smart job searching system based on information retrieval techniques and similarity of fuzzy parameterized sets

    Job searching for the right vacancy among several choices is one of the most important decision-making problems. The need to deal with uncertainty in such real-world problems has been a long-standing research challenge addressed by different methodologies and theories. The main contribution of this work is to match an applicant's curriculum vitae (CV) with the best available job opportunities based on certain criteria. The proposed job searching system (JSS) implements a series of approaches, which can be broken down into segmentation, tokenization, part-of-speech tagging, gazetteer lookup, and fuzzy inference, to extract and arrange the required data from the job announcements and the CV. Moreover, this study designs a fuzzy parameterized structure to store such data, as well as a measure to calculate the degree of similarity between the job requirements and the applicant's CV. The system then analyzes the computed similarity scores to rank the job opportunities for the job seeker in descending order. The performance evaluation of the proposed system shows high recall and precision percentages for the matching process. The results also confirm the viability of the JSS approach in handling the fuzziness associated with the problem of job searching.
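    Matching a CV against job requirements via fuzzy set similarity can be sketched as follows. Each side is represented as a mapping from criterion to a membership degree in [0, 1]; the similarity here is the fuzzy Jaccard ratio sum(min)/sum(max), an illustrative choice standing in for the paper's own fuzzy parameterized measure, and the criteria and scores are hypothetical.

```python
def fuzzy_similarity(job, cv):
    """Similarity of two fuzzy parameterized sets given as
    {criterion: membership degree in [0, 1]}.

    Uses the fuzzy Jaccard ratio sum(min)/sum(max) over the union
    of criteria (illustrative; the JSS paper defines its own measure).
    """
    keys = set(job) | set(cv)
    num = sum(min(job.get(k, 0.0), cv.get(k, 0.0)) for k in keys)
    den = sum(max(job.get(k, 0.0), cv.get(k, 0.0)) for k in keys)
    return num / den if den else 1.0

def rank_jobs(cv, jobs):
    """Rank job ids by descending similarity to the CV."""
    return sorted(jobs, key=lambda j: fuzzy_similarity(jobs[j], cv),
                  reverse=True)

# Hypothetical extracted membership degrees for one CV and two vacancies
cv = {"python": 0.9, "sql": 0.5}
jobs = {"j1": {"python": 0.8, "sql": 0.6}, "j2": {"java": 1.0}}
ranking = rank_jobs(cv, jobs)
```

    A vacancy sharing no criteria with the CV scores 0, so it naturally falls to the bottom of the descending ranking.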