
    Optimal Embedding of Functions for In-Network Computation: Complexity Analysis and Algorithms

    We consider optimal distributed computation of a given function of distributed data. The input (data) nodes and the sink node that receives the function form a connected network described by an undirected weighted network graph. The algorithm that computes the function is described by a weighted directed acyclic graph, called the computation graph. An embedding defines the computation-communication sequence that obtains the function at the sink. Two kinds of optimal embeddings are sought: (1) the embedding that minimizes the delay in obtaining the function at the sink, and (2) the embedding that minimizes the cost of one instance of the computation. This abstraction is motivated by three applications: in-network computation over sensor networks, operator placement in distributed databases, and module placement in distributed computing. We first show that obtaining minimum-delay and minimum-cost embeddings are both NP-complete problems, and that cost minimization is in fact MAX SNP-hard. Next, we consider specific forms of the computation graph for which polynomial-time solutions are possible. When the computation graph is a tree, we describe a polynomial-time algorithm that obtains the minimum-delay embedding. For the case when the function is described by a layered graph, we describe an algorithm that obtains the minimum-cost embedding in polynomial time; this algorithm can also be used to obtain an approximation for delay minimization. Finally, we consider bounded-treewidth computation graphs and give an algorithm to obtain the minimum-cost embedding in polynomial time.
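
    To make the cost model concrete, here is a small illustrative sketch (ours, not the paper's algorithm): the cost of an embedding is taken as the sum over computation-graph edges of the edge's data volume times the shortest-path distance between the network nodes hosting its endpoints, and a tiny instance is brute-forced. The graphs, weights, and the pinning of the data nodes and sink are assumptions made for the example.

        from itertools import product

        # Undirected weighted network graph as an adjacency dict.
        net = {0: {1: 1.0, 2: 2.0}, 1: {0: 1.0, 2: 1.0}, 2: {0: 2.0, 1: 1.0}}

        def all_pairs_shortest(net):
            # Floyd-Warshall over the small example network.
            nodes, INF = list(net), float("inf")
            d = {u: {v: 0.0 if u == v else net[u].get(v, INF) for v in nodes} for u in nodes}
            for k in nodes:
                for i in nodes:
                    for j in nodes:
                        if d[i][k] + d[k][j] < d[i][j]:
                            d[i][j] = d[i][k] + d[k][j]
            return d

        dist = all_pairs_shortest(net)

        # Computation DAG edges (u, v, data volume): 'x' and 'y' are data
        # nodes, 'f' an intermediate operator, 't' the sink.
        comp_edges = [("x", "f", 3.0), ("y", "f", 2.0), ("f", "t", 1.0)]
        pinned = {"x": 0, "y": 1, "t": 2}   # data nodes and sink are fixed
        free = ["f"]                        # operators may be placed anywhere

        def cost(embedding):
            return sum(w * dist[embedding[u]][embedding[v]] for u, v, w in comp_edges)

        best = min((dict(pinned, **dict(zip(free, p))) for p in product(net, repeat=len(free))), key=cost)
        print(best, cost(best))   # optimal placement of 'f' and its cost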

    Study of an Image Indexing Technique in the JPEG Compressed Domain

    Most images stored on our computers are in JPEG compressed format, and images downloaded from the internet are also JPEG compressed, so it is very useful for content-based image indexing and retrieval to be conducted directly in the compressed domain. In this paper we use a partial decoding algorithm for JPEG compressed images to index them directly in the JPEG compressed domain. We also compare the performance of approaches in the DCT domain against approaches that operate on the original images in the pixel domain. This technology will prove valuable in applications where fast image key generation is required. Image and audio indexing techniques are very important in multimedia applications. In this paper, we include an analytical review of compressed-domain indexing techniques, covering transform-domain techniques such as the Fourier transform, the Karhunen-Loève transform, the cosine transform, and subbands, as well as spatial-domain techniques that use vector quantization and fractals. After comparing other research papers, we conclude that to compress the original image, the image should be divided into 8×8 pixel blocks, each of which is then converted into DCT form, and so on. Extending this concept, we can divide the image into 4×4×4 blocks of pixels and compress the original image using the subsequent steps.
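
    As a concrete illustration of indexing from DCT blocks (a minimal sketch of the general idea, not the paper's exact algorithm), the following splits a grayscale image into 8×8 blocks, applies the 2-D DCT to each, and concatenates a few low-frequency coefficients per block into an index key; with true partial decoding these coefficients would be read straight from the JPEG bitstream rather than recomputed from pixels. The block size and the number of retained coefficients are assumptions.

        import numpy as np
        from scipy.fftpack import dct

        def dct2(block):
            # Separable 2-D type-II DCT with orthonormal scaling, as in JPEG.
            return dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")

        def index_key(gray, block=8, keep=4):
            h, w = gray.shape
            feats = []
            for y in range(0, h - h % block, block):
                for x in range(0, w - w % block, block):
                    c = dct2(gray[y:y + block, x:x + block].astype(float))
                    # Keep the top-left (lowest-frequency) keep x keep coefficients.
                    feats.append(c[:keep, :keep].ravel())
            return np.concatenate(feats)

        img = np.random.default_rng(0).integers(0, 256, (64, 64)).astype(float)
        print(index_key(img).shape)   # 64 blocks x 16 coefficients = (1024,)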

    Improving Mix-CLAHE with ACO for Clearer Oceanic Images

    Oceanic pictures have poor visibility due to various factors: weather disturbances, particles in the water, light frames, and water movement, all of which result in degraded, low-contrast underwater images. Visibility restoration refers to the various methods that aim to reduce and remove the degradation that occurred while the digital image was being acquired. The probabilistic Ant Colony Optimization (ACO) approach is presented to solve the problem of designing an optimal route for hard combinatorial problems. It is found that most existing researchers have neglected several issues; in particular, no single technique is suitable for all kinds of circumstances. The existing methods have neglected the use of ant colony optimization to reduce noise and the uneven-illumination problem. The main objective of this paper is to evaluate the performance of ACO-based haze removal against the available MIX-CLAHE (Contrast Limited Adaptive Histogram Equalization) technique. The experiments clearly show the effectiveness of the proposed technique over the available methods.
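
    For reference, here is a minimal sketch of the CLAHE baseline being compared against, using OpenCV's implementation; the clip limit, tile grid, and the filename 'underwater.png' are placeholders for this example, not values from the paper.

        import cv2

        img = cv2.imread("underwater.png")              # BGR underwater image
        lab = cv2.cvtColor(img, cv2.COLOR_BGR2LAB)      # equalize luminance only
        l, a, b = cv2.split(lab)
        clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
        enhanced = cv2.merge((clahe.apply(l), a, b))
        cv2.imwrite("underwater_clahe.png", cv2.cvtColor(enhanced, cv2.COLOR_LAB2BGR))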

    Robotic Resistance Treadmill Training Improves Locomotor Function in Children With Cerebral Palsy: A Randomized Controlled Pilot Study

    Objective: To determine whether applying controlled resistance forces to the legs during the swing phase of gait improves the efficacy of treadmill training, as compared with applying controlled assistance forces, in children with cerebral palsy (CP). Design: Randomized controlled study. Setting: Research unit of a rehabilitation hospital. Participants: Children with spastic CP (N=23; mean age, 10.6y; range, 6–14y; Gross Motor Function Classification System levels I–IV). Interventions: Participants were randomly assigned to receive controlled assistance (n=11) or resistance (n=12) loads applied to the legs at the ankle. Participants underwent robotic treadmill training 3 times a week for 6 weeks (18 sessions). A controlled swing assistance/resistance load was applied to both legs from toe-off to the mid-swing phase of gait during training. Main Outcome Measures: Outcome measures consisted of overground walking speed, 6-minute walk distance, and Gross Motor Function Measure scores, assessed before and after the 6 weeks of training and again 8 weeks after the end of training. Results: After 6 weeks of treadmill training, fast walking speed and 6-minute walk distance significantly improved in the resistance training group (18% and 30% increases, respectively), and the 6-minute walk distance was still significantly greater than baseline (35% increase) 8 weeks after the end of training. In contrast, overground gait speed and 6-minute walk distance showed no significant change after robotic assistance training. Conclusions: The results of the present study indicate that robotic resistance treadmill training is more effective than assistance training for improving locomotor function in children with CP.

    Exploring protein structural dissimilarity to facilitate structure classification

    Background: Classification of newly resolved protein structures is important in understanding their architectural, evolutionary, and functional relatedness to known protein structures. Among various efforts to improve the database of Structural Classification of Proteins (SCOP), automation has received particular attention. Herein, we predict the deepest SCOP structural level that an unclassified protein shares with classified proteins having an equal number of secondary structure elements (SSEs). Results: We compute a coefficient of dissimilarity (omega) between proteins, based on structural and sequence-based descriptors characterising the respective constituent SSEs. For a set of 1,661 pairs of proteins with sequence identity up to 35%, the performance of omega in predicting shared Class, Fold, and Superfamily levels is comparable to that of the DaliLite Z score, and it shows a greater than four-fold increase in the true positive rate (TPR) for proteins sharing the Family level. On a larger set of 600 domains representing 200 families, the performance of the Z score improves in predicting a shared Family, but it still achieves only about half the TPR of omega. The TPR for structures sharing a Superfamily is lower than in the first dataset, but omega performs slightly better than the Z score. Overall, the sensitivity of omega in predicting a common Fold level is higher than that of the DaliLite Z score. Conclusion: Classification to a deeper level in the hierarchy is specific and difficult, so the efficiency of omega may be attractive to the curators and end-users of SCOP. We suggest that omega may be a better measure for structure classification than the DaliLite Z score, with the caveat that we are currently restricted to comparing structures with an equal number of SSEs.
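
    Since the abstract does not reproduce the definition of omega, the following is only an illustration of the shape of such a coefficient: a dissimilarity between two proteins represented as equal-length sequences of per-SSE descriptor vectors. The positional pairing of SSEs and the Euclidean form are assumptions, which is why the function is named omega_like rather than omega.

        import numpy as np

        def omega_like(p, q):
            # p, q: arrays of shape (n_sse, n_descriptors); the comparison is
            # restricted to equal SSE counts, mirroring the paper's setting.
            p, q = np.asarray(p, float), np.asarray(q, float)
            assert p.shape == q.shape, "equal numbers of SSEs required"
            return float(np.linalg.norm(p - q, axis=1).mean())

        # Two toy proteins: 3 SSEs each, 4 descriptors per SSE (e.g. length, angles).
        rng = np.random.default_rng(1)
        a, b = rng.normal(size=(3, 4)), rng.normal(size=(3, 4))
        print(omega_like(a, b))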

    Attribute Based Secure Data Retrieval System for Decentralized Disruption Tolerant Military Networks

    Military environments such as a battlefield or a hostile region are likely to suffer from intermittent network connectivity and frequent partitions. Disruption-tolerant network (DTN) technologies are a proven and practical solution: they allow wireless devices carried by soldiers to communicate with each other and to access confidential information or commands reliably by exploiting external storage nodes. DTN has been very successful in these networking environments. When there is no end-to-end connection between a source and a destination device, information from the source node may need to wait at intermediate nodes for a substantial amount of time until the connection is eventually established. One promising approach is attribute-based encryption (ABE), which fulfills the requirements for secure data retrieval in DTNs. A related concept is ciphertext-policy ABE (CP-ABE), which provides an appropriate way of encrypting data: the ciphertext embeds the attribute set that the decryptor needs to possess in order to decrypt it. Hence, many users can be allowed to decrypt different parts of the data according to the security policy.
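
    The access-control logic at the heart of CP-ABE can be sketched as follows: the ciphertext carries a policy tree over attributes, and a key holder can decrypt only if their attribute set satisfies it. This models the policy check only; the pairing-based cryptography is omitted, and the attribute names are invented for the example.

        def satisfies(policy, attrs):
            # policy: an attribute string, or a tuple ('and'|'or', sub, sub, ...).
            if isinstance(policy, str):
                return policy in attrs
            op, *subs = policy
            results = (satisfies(s, attrs) for s in subs)
            return all(results) if op == "and" else any(results)

        policy = ("and", "battalion1", ("or", "commander", "intelligence officer"))
        print(satisfies(policy, {"battalion1", "commander"}))   # True: may decrypt
        print(satisfies(policy, {"battalion2", "commander"}))   # False: may not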

    Management of Coagulopathy in Patients with Decompensated Liver Cirrhosis

    Patients with decompensated liver cirrhosis have significantly impaired synthetic function. Many proteins involved in the coagulation process are synthesized in the liver, and routinely performed coagulation tests are abnormal in patients with decompensated liver cirrhosis. This has led to the widespread belief that decompensated liver cirrhosis is the prototype of an acquired hemorrhagic coagulopathy. If the prothrombin time was prolonged more than 3 seconds over control, invasive procedures such as liver biopsy, splenoportography, percutaneous cholangiography, or surgery were considered to carry an increased risk of bleeding, and the coagulopathy was corrected with infusion of fresh frozen plasma. These practices had no scientific evidence behind them and were associated with the significant hazards of fresh frozen plasma transfusion. It is now realized that coagulation is a complex process involving the interaction of procoagulant and anticoagulant factors and the fibrinolytic system. Because there is a reduction in both anticoagulant and procoagulant factors, global tests of coagulation are normal in patients with acute and chronic liver disease, indicating that coagulopathy in liver disease is more of a myth than a reality. In the last few years, surgical techniques have substantially improved, and complex procedures like liver transplantation can be done without the use of blood or blood products. Patients with liver cirrhosis may also be at increased risk of thrombosis. In this paper, we discuss coagulopathy, the increased risk of thrombosis, and their management in decompensated liver cirrhosis.

    Sponsored data with ISP competition

    We analyze the effect of sponsored data platforms when Internet service providers (ISPs) compete for subscribers and content providers (CPs) compete for a share of the bandwidth used by the customers. Our analytical model is a full-information, leader-follower game. The ISPs lead and set prices for sponsorship. The CPs then make the binary decision of sponsoring or not sponsoring their content on each ISP. Finally, based on both of these, users make a two-part decision: which ISP to subscribe to, and how much data to consume from each of the CPs through the chosen ISP. User consumption is determined by a utility maximization framework, the sponsorship decision is determined by a non-cooperative game between the CPs, and each ISP sets its prices to maximize its profit in response to the prices set by the competing ISP. We analyze the dynamics of the prices set by the ISPs, the sponsorship decisions that the CPs make and the resulting market structure, and the surpluses of the ISPs, CPs, and users. This is the first analysis of the effect of sponsored data platforms in the presence of ISP competition. We show that inter-ISP competition does not prevent ISPs from extracting a significant fraction of the CP surplus. Moreover, the ISPs often have an incentive to significantly skew the CP marketplace in favor of the most profitable CP.
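
    To illustrate the user stage of the game numerically (the functional form and all parameter values below are assumptions made for this sketch, not the paper's specification), suppose each user's consumption of a CP's content maximizes alpha*log(1+x) - p*x, where p is the per-unit price the user effectively faces; sponsorship shifts that price toward zero and thus raises consumption of the sponsored CP's content.

        def demand(alpha, p):
            # argmax over x >= 0 of alpha*log(1+x) - p*x  =>  x* = max(0, alpha/p - 1)
            return max(0.0, alpha / p - 1.0)

        p_user = 0.5   # per-unit data price faced by the user when unsponsored
        for cp, (alpha, sponsored) in {"CP1": (2.0, True), "CP2": (1.0, False)}.items():
            # A small residual price keeps sponsored demand finite in this toy model.
            x = demand(alpha, 0.05 if sponsored else p_user)
            print(cp, round(x, 2))   # CP1 39.0, CP2 1.0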