
    Estimation of the rain signal in the presence of large surface clutter

    The principal limitation for the use of a spaceborne imaging SAR as a rain radar is the surface-clutter problem. Signals may be estimated in the presence of noise by averaging large numbers of independent samples. This method was applied to obtain an estimate of the rain echo by averaging a set of N_c samples of the clutter in a separate measurement and subtracting the clutter estimate from the combined estimate. The number of samples required for successful estimation (within 10-20%) at off-vertical angles of incidence appears to be prohibitively large. However, by appropriately degrading the resolution in both range and azimuth, the required number of samples can be obtained. For vertical incidence, the number of samples required for successful estimation is reasonable. In estimating the clutter, it was assumed that the surface echo is the same outside the rain volume as it is within it. This may be true for the forest echo, but for convective storms over the ocean the surface echo outside the rain volume is very different from that within. It is suggested that the experiment be performed at vertical incidence over forest to overcome this limitation.
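    The averaging-and-subtraction scheme described above can be sketched as follows. This is a minimal simulation with assumed mean echo powers and sample counts (the values are illustrative only, not figures from the study):

```python
import numpy as np

rng = np.random.default_rng(0)

P_RAIN, P_CLUTTER = 1.0, 20.0   # assumed mean echo powers (clutter >> rain)
N = 100_000                      # independent samples averaged per estimate

# Combined measurement inside the rain volume: rain echo plus surface clutter.
combined = rng.exponential(P_RAIN + P_CLUTTER, N)
# Separate measurement of the clutter alone, outside the rain volume
# (this assumes the surface echo is the same inside and outside the rain).
clutter = rng.exponential(P_CLUTTER, N)

# Estimate the rain echo by subtracting the averaged clutter estimate
# from the averaged combined estimate.
rain_hat = combined.mean() - clutter.mean()
print(f"true rain power: {P_RAIN}, estimate: {rain_hat:.3f}")
```

Because the per-sample fluctuations scale with the clutter power, a weak rain signal buried in strong clutter needs a very large N before the difference of the two means becomes reliable, which is the sampling problem the abstract describes.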

    Precipitation measurement using SIR-C: A feasibility study. Investigation at nadir

    The most significant limitation of the imaging SAR in rain measurements is the ground return coupled to the rain cell. Here we report a study of the possibility of using the X-SAR and the C-band channel of SIR-C for rain measurement. Earlier signal-to-clutter calculations rule out the use of X-SAR at steeper off-vertical angles of incidence (i.e., 20° < θ < 50°). Only rain rates greater than 30 mm/hr at angles of incidence greater than 60° showed a good signal-to-clutter ratio (SCR). This study involved calculations at vertical incidence. There is adequate signal-to-noise ratio (SNR) at vertical incidence, but the presence of high range-sidelobe levels leads to a small SCR for measurement over oceans at both X and C bands. For larger rain thickness (greater than 2 km), the SCR improves and smaller rain rates (greater than 10 mm/hr) can be measured. However, rain measurement over forests appears feasible at nadir even for smaller rain thickness (less than 2 km). We conclude that X band may be usable over forest at vertical incidence to measure rain rates greater than 5 mm/hr even for shallow rain thickness, and over ocean for large rain thickness.
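    As a quick illustration of the signal-to-clutter ratio that drives these feasibility thresholds, SCR is simply the ratio of rain-echo power to surface-clutter power, usually quoted in decibels (the power values below are hypothetical, not taken from the study):

```python
import math

def scr_db(rain_echo_power: float, clutter_power: float) -> float:
    """Signal-to-clutter ratio in decibels."""
    return 10 * math.log10(rain_echo_power / clutter_power)

# Hypothetical case: rain echo four times weaker than the surface clutter.
print(f"SCR = {scr_db(1.0, 4.0):.1f} dB")  # about -6.0 dB
```

A negative SCR in dB means the clutter dominates the rain echo, which is why the abstracts above resort to heavy sample averaging or restricted geometries (nadir, forest backgrounds) to recover the rain signal.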

    Precipitation measurement using SIR-C: A feasibility study

    A precipitation detection and measurement experiment is planned for the SIR-C/X-SAR mission. This study was conducted to determine under what conditions an off-nadir experiment is feasible. The signal-to-clutter ratio, the signal-to-noise ratio, and the minimum detectable rain rate were investigated. Models used in previous studies were adopted for the surface clutter and the rain echo. The study also considers the attenuation of the returns at X band. It was concluded that an off-nadir rain-measurement experiment is feasible only for rain rates greater than 10 mm/hr at look angles greater than 60°. For the range of look angles 5° < θ_1 < 50°, the rain rate required for an adequate signal-to-clutter ratio is very high, which limits the feasibility of the experiment.
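    For intuition on the rain-rate thresholds quoted in these studies, radar rain measurement commonly relies on a reflectivity-rain-rate (Z-R) power law such as the Marshall-Palmer relation Z = 200 R^1.6. The abstract does not state which rain model it uses, so the following is only a generic illustration:

```python
def rain_rate_from_reflectivity(z_dbz: float, a: float = 200.0, b: float = 1.6) -> float:
    """Invert the Z-R power law Z = a * R**b (Z in mm^6/m^3, R in mm/hr).

    Defaults are the classic Marshall-Palmer coefficients, used here
    purely as an illustrative assumption.
    """
    z_linear = 10 ** (z_dbz / 10)   # dBZ -> linear reflectivity
    return (z_linear / a) ** (1 / b)

# Under Marshall-Palmer, about 39 dBZ corresponds to roughly 10 mm/hr.
print(f"{rain_rate_from_reflectivity(39.0):.1f} mm/hr")
```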

    A survey, review, and future trends of skin lesion segmentation and classification

    The Computer-aided Diagnosis or Detection (CAD) approach for skin lesion analysis is an emerging field of research that has the potential to alleviate the burden and cost of skin cancer screening. Researchers have recently shown increasing interest in developing such CAD systems, with the intention of providing dermatologists with a user-friendly tool that reduces the challenges associated with manual inspection. This article provides a comprehensive literature survey and review of a total of 594 publications (356 on skin lesion segmentation and 238 on skin lesion classification) published between 2011 and 2022. These articles are analyzed and summarized along several dimensions that contribute vital information for the development of CAD systems: relevant and essential definitions and theories; input data (dataset utilization, preprocessing, augmentation, and class-imbalance handling); method configuration (techniques, architectures, module frameworks, and losses); training tactics (hyperparameter settings); and evaluation criteria. We also investigate a variety of performance-enhancing approaches, including ensembling and post-processing, and discuss these dimensions to reveal current trends based on utilization frequencies. In addition, we highlight the primary difficulties of evaluating skin lesion segmentation and classification systems on minimal datasets, as well as potential solutions to these difficulties. Findings, recommendations, and trends are disclosed to inform future research on developing an automated and robust CAD system for skin lesion analysis.
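    Among the evaluation criteria surveyed, segmentation quality is most often scored with overlap measures such as the Dice similarity coefficient. A minimal sketch (the masks below are toy examples, not drawn from any surveyed dataset):

```python
import numpy as np

def dice(pred: np.ndarray, target: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks."""
    intersection = np.logical_and(pred, target).sum()
    return 2 * intersection / (pred.sum() + target.sum())

# Toy 2x3 binary lesion masks: prediction vs. ground truth.
pred = np.array([[1, 1, 0], [0, 1, 0]])
target = np.array([[1, 0, 0], [0, 1, 1]])
print(f"Dice = {dice(pred, target):.3f}")  # 2*2 / (3+3) = 0.667
```

On very small datasets such overlap scores have high variance per image, which is one reason the survey flags evaluation on minimal datasets as a key difficulty.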

    Capstone Case Study Guide MAPFRE

    MAPFRE is a top-tier, competitive, and fast-evolving insurance company. The Clark team will help MAPFRE organize to secure system availability and resilience in support of its business processes. The team will also assist the IT team with further analysis, identify data, trends, and patterns, and make recommendations to improve the service.

    Implementation of MHMIP and Comparing the Performance With MIP and DHMIP in Mobile Networks

    Efficiently managing mobility in wireless networks is a critical issue in supporting mobile users. The Mobile Internet Protocol (MIP) has been proposed to support global mobility in IP networks, and the Hierarchical MIP (HMIP) and Dynamic HMIP (DHMIP) strategies have been proposed to address its high signaling delay. Our proposed approach, the "Multicast HMIP" (MHMIP) strategy, limits the registration processes in the GFAs. For high-mobility MTs, MHMIP provides the lowest mobility signaling delay compared with the HMIP and DHMIP approaches. However, it is a resource-consuming strategy except under frequent MT mobility. Hence, we propose an analytic model to evaluate the mean signaling delay and the mean bandwidth per call according to the type of MT mobility. In our analysis, MHMIP gives the best performance among the MIP, DHMIP, and MHMIP strategies in almost all the studied cases. The main contribution of this paper is to implement MHMIP and to provide the analytic model that allows the comparison of the MIP, DHMIP, and MHMIP mobility management approaches.
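    The shape of such an analytic comparison can be sketched generically as a mobility-weighted mean of per-event signaling delays. The delay values, event types, and rates below are placeholders, not figures from the paper's model:

```python
def mean_signaling_delay(delays_ms: dict[str, float],
                         event_rates: dict[str, float]) -> float:
    """Rate-weighted mean signaling delay over mobility event types."""
    total_rate = sum(event_rates.values())
    return sum(delays_ms[e] * event_rates[e] for e in event_rates) / total_rate

# Placeholder per-event delays (ms) for three strategies; hierarchical
# schemes cut intra-domain delay at some inter-domain cost.
strategies = {
    "MIP":   {"intra_domain": 40.0, "inter_domain": 40.0},
    "DHMIP": {"intra_domain": 15.0, "inter_domain": 45.0},
    "MHMIP": {"intra_domain": 10.0, "inter_domain": 30.0},
}
# High-mobility MT: mostly intra-domain handoffs (events per unit time).
rates = {"intra_domain": 8.0, "inter_domain": 2.0}

for name, delays in strategies.items():
    print(f"{name}: {mean_signaling_delay(delays, rates):.1f} ms")
```

With these placeholder inputs the hierarchical schemes come out ahead for high-mobility terminals, mirroring the qualitative ordering the abstract reports.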

    Fisheye Consistency: Keeping Data in Synch in a Georeplicated World

    Over the last thirty years, numerous consistency conditions for replicated data have been proposed and implemented. Popular examples of such conditions include linearizability (or atomicity), sequential consistency, causal consistency, and eventual consistency. These consistency conditions are usually defined independently of the computing entities (nodes) that manipulate the replicated data; i.e., they do not take into account how computing entities might be linked to one another or geographically distributed. To address this lack, as a first contribution, this paper introduces the notion of a proximity graph between computing nodes. If two nodes are connected in this graph, their operations must satisfy a strong consistency condition, while the operations invoked by other nodes are allowed to satisfy a weaker condition. The second contribution is the use of such a graph to provide a generic approach to hybridizing data consistency conditions within the same system. We illustrate this approach on sequential consistency and causal consistency, and present a model in which all data operations are causally consistent, while operations by neighboring processes in the proximity graph are sequentially consistent. The third contribution of the paper is the design and proof of a distributed algorithm based on this proximity graph, which combines sequential consistency and causal consistency (the resulting condition is called fisheye consistency). In doing so, the paper not only extends the domain of consistency conditions, but also provides a generic, provably correct solution of direct relevance to modern georeplicated systems.
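    The proximity-graph idea can be sketched as follows: nodes adjacent in the graph must see each other's operations under the strong (sequential) condition, while all other pairs only need the weak (causal) one. The toy graph and helper below are hypothetical illustrations, not the paper's algorithm:

```python
# Toy proximity graph: adjacency decides which consistency condition
# applies between a pair of replicas (undirected, stored symmetrically).
proximity = {
    "eu-1": {"eu-2"},          # eu-1 and eu-2 are geographically close
    "eu-2": {"eu-1"},
    "us-1": set(),             # us-1 is far from both European replicas
}

def required_consistency(node_a: str, node_b: str) -> str:
    """Strong condition between neighbors, weak condition otherwise."""
    if node_b in proximity.get(node_a, set()):
        return "sequential"
    return "causal"

print(required_consistency("eu-1", "eu-2"))  # sequential
print(required_consistency("eu-1", "us-1"))  # causal
```

The appeal of this hybrid is that the expensive coordination needed for sequential consistency is paid only between nearby (low-latency) replicas, while distant replicas fall back to cheaper causal guarantees.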