A procedure for testing the quality of LANDSAT atmospheric correction algorithms
There are two basic methods for testing the quality of an algorithm that minimizes atmospheric effects on LANDSAT imagery: (1) test the results a posteriori, using ground truth or control points; (2) use a method based on image data plus estimation of additional ground and/or atmospheric parameters. A procedure based on the second method is described. To select the parameters, the image contrast is first examined for a series of parameter combinations; the contrast improves for better corrections. In addition, the correlation coefficient between two subimages of the same scene, taken at different times, is used for parameter selection. The regions to be correlated should not have changed considerably over time. A few examples using the proposed procedure are presented.
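The two selection criteria described above (contrast that rises with better correction, plus inter-date correlation over stable regions) can be sketched in a few lines. The additive scoring rule, the toy additive-haze model, and the function names below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def contrast(img):
    """RMS contrast: the standard deviation of pixel values. Better
    atmospheric correction is expected to increase scene contrast."""
    return float(np.std(img))

def subimage_correlation(a, b):
    """Pearson correlation between two coregistered subimages of a region
    assumed not to have changed between the two acquisition dates."""
    return float(np.corrcoef(a.ravel(), b.ravel())[0, 1])

def select_parameters(raw, correct, candidates, reference):
    """Score each candidate parameter set by the contrast of the corrected
    image plus its correlation with a second-date reference subimage."""
    best, best_score = None, -np.inf
    for params in candidates:
        corrected = correct(raw, params)
        score = contrast(corrected) + subimage_correlation(corrected, reference)
        if score > best_score:
            best, best_score = params, score
    return best

# Toy haze model: additive path radiance of 50 counts. Over-correcting
# clips pixels to zero, destroying both contrast and correlation, so the
# true value scores highest.
rng = np.random.default_rng(0)
scene = rng.uniform(0, 40, (32, 32))
raw = scene + 50.0
correct = lambda img, path_radiance: np.clip(img - path_radiance, 0, None)
best = select_parameters(raw, correct, [70.0, 60.0, 50.0], scene)
print(best)  # 50.0 -- the true path radiance
```

Under this toy model, under-correction leaves contrast and correlation unchanged (a pure offset), which is why the procedure pairs contrast with a second, independent criterion.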
Noise correction on LANDSAT images using a spline-like algorithm
Many applications using LANDSAT images face a dilemma: the user needs a certain scene (for example, a flooded region), but that particular image may present interference or noise in the form of horizontal stripes. During automatic analysis, this noise may cause false readings in the region of interest. Many solutions are used to minimize it, for instance taking the average (simple or weighted) of the neighboring vertical points. In the case of heavy interference (more than one adjacent line lost), the method of averages may not serve the desired purpose. The solution proposed is to use a spline-like algorithm (weighted splines). This type of interpolation is simple to implement on a computer, is fast, uses only four points in each interval, and eliminates the need to solve a linear equation system. In the normal mode of operation, the first and second derivatives of the solution function are continuous and determined by the data points, as in cubic splines. It is possible, however, to impose the values of the first derivatives, in order to account for sharp boundaries, without increasing the computational effort. Some examples using the proposed method are also shown.
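One four-points-per-interval scheme with the stated properties (no linear system, derivatives set by the data, imposable first derivatives) is a Catmull-Rom-style cubic Hermite fill. The sketch below is a plausible reading of the approach, not the authors' exact weighted-spline algorithm:

```python
import numpy as np

def hermite(y0, y1, m0, m1, t):
    """Cubic Hermite basis on [0, 1]: endpoint values y0, y1 and endpoint
    slopes m0, m1 (slopes expressed in units of the [0, 1] parameter)."""
    h00 = 2*t**3 - 3*t**2 + 1
    h10 = t**3 - 2*t**2 + t
    h01 = -2*t**3 + 3*t**2
    h11 = t**3 - t**2
    return h00*y0 + h10*m0 + h01*y1 + h11*m1

def fill_missing_lines(img, bad_rows):
    """Rebuild dropped scan lines column by column from the two good lines
    on each side of the gap (four points per interval), with endpoint
    slopes set by finite differences -- no linear system to solve."""
    out = img.astype(float).copy()
    bad = set(bad_rows)
    good = np.array([r for r in range(img.shape[0]) if r not in bad])
    for r in sorted(bad):
        below, above = good[good < r], good[good > r]
        r0, r1 = below[-1], above[0]               # nearest good lines
        rm = below[-2] if below.size > 1 else r0   # one line further out
        r2 = above[1] if above.size > 1 else r1
        m0 = (out[r1] - out[rm]) / (r1 - rm)       # finite-difference slopes
        m1 = (out[r2] - out[r0]) / (r2 - r0)
        t = (r - r0) / (r1 - r0)
        h = r1 - r0
        out[r] = hermite(out[r0], out[r1], m0 * h, m1 * h, t)
    return out

# Two adjacent lines lost in a linear ramp image; the cubic restores
# them exactly because its endpoint slopes match the ramp.
img = np.add.outer(2.0 * np.arange(10), np.arange(8))
corrupted = img.copy()
corrupted[[4, 5]] = 0
restored = fill_missing_lines(corrupted, [4, 5])
```

Imposing different values for `m0` and `m1` at a known sharp boundary, as the abstract describes, changes nothing in the cost of the evaluation.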
Classification-driven search for effective SM partitioning in multitasking GPUs
Graphics processing units (GPUs) feature an increasing number of streaming multiprocessors (SMs) with each successive generation. At the same time, GPUs are increasingly widely adopted in cloud services and data centers to accelerate general-purpose workloads. Running multiple applications on a GPU in such environments requires effective multitasking support. Spatial multitasking in which independent applications co-execute on different sets of SMs is a promising solution to share GPU resources. Unfortunately, how to effectively partition SMs is an open problem.
In this paper, we observe that compared to widely-used even partitioning, dynamic SM partitioning based on the characteristics of the co-executing applications can significantly improve performance and power efficiency. Unfortunately, finding an effective SM partition is challenging because the number of possible combinations increases exponentially with the number of SMs and co-executing applications. Through offline analysis, we find that first classifying workloads, and then searching an effective SM partition based on the workload characteristics can significantly reduce the search space, making dynamic SM partitioning tractable.
Based on these insights, we propose Classification-Driven search (CD-search) for low-overhead dynamic SM partitioning in multitasking GPUs. CD-search first classifies workloads using a novel off-SM bandwidth model, after which it enters the performance mode or power mode depending on the workload's characteristics. Both modes follow a specific search strategy to quickly determine the optimum SM partition. Our evaluation shows that CD-search improves system throughput by 10.4% on average (and up to 62.9%) over even partitioning for workloads that are classified for the performance mode. For workloads classified for the power mode, CD-search reduces power consumption by 25% on average (and up to 41.2%). CD-search incurs limited runtime overhead.
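The shape of the idea, classify first, then run a cheap guided search instead of enumerating all partitions, can be illustrated as follows. The bandwidth threshold classifier, the throughput model, and the hill-climbing rule here are hypothetical stand-ins, not CD-search's actual models:

```python
def classify(demand_a, demand_b, offchip_bw):
    """Stand-in classifier: a two-app workload whose combined off-SM
    bandwidth demand exceeds the available off-chip bandwidth is treated
    as bandwidth-bound (power mode); otherwise it runs in performance
    mode. The real CD-search uses a more detailed off-SM bandwidth model."""
    return "power" if demand_a + demand_b > offchip_bw else "performance"

def cd_style_search(num_sms, tput):
    """Hill-climb over two-app partitions (k, num_sms - k), starting from
    the even split and moving one SM at a time while the modeled system
    throughput tput(k) improves. Each step looks at only two neighbors,
    so the search visits O(num_sms) partitions instead of all of them."""
    k = num_sms // 2
    improved = True
    while improved:
        improved = False
        for cand in (k - 1, k + 1):
            if 1 <= cand <= num_sms - 1 and tput(cand) > tput(k):
                k, improved = cand, True
                break
    return k

# Hypothetical scaling curves on a 16-SM GPU: app A saturates at 10 SMs,
# app B gains half an SM's worth of throughput per extra SM.
tput = lambda k: min(k, 10) + 0.5 * (16 - k)
print(cd_style_search(16, tput))   # 10 -- give A only what it can use
print(classify(300, 200, 400))     # power
```

A single-peaked throughput curve like this is what makes local search sufficient; the paper's classification step is what narrows the workload space to regimes where such a simple strategy applies.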
Finfish constituents of trawl low value by-catch off Mangalore
Trawling remains a controversial method of fishing due to the poor selectivity of trawl nets and the catching of a huge quantity and diversity of non-target fishes. By-catch is recognised as an unavoidable portion of the fish catch, but its quantity varies according to the type of gear operated. FAO has considered the resource damage due to discarding of by-catch a serious issue. Mangalore fisheries harbour is one among the major fishing harbours of Karnataka state, with a significant contribution to the trawl landings. The study was conducted to assess the resource damage due to indiscriminate fishing and the landing of juveniles of commercially important species in low value by-catch owing to their smaller size and improper handling. The quantity of low value by-catch was estimated from single-day and multi-day trawler landings for the period August 2012 to May 2013. The quantity of low value by-catch (trash fish) generated by trawlers of Mangalore was estimated at 32,426 t (19% of the total trawl catch). A total of 121 species of finfishes belonging to 55 families were identified in the low value by-catch. The order Perciformes contributed 61.16 per cent (74 species) of the total number of species, followed by Clupeiformes and other groups, whereas the family Carangidae contributed 11.57 per cent (14 species), followed by Engraulidae and other families. The presence of a wide diversity of finfishes in low value by-catch is an alarming signal of fisheries resource damage.
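The species-composition percentages quoted above follow directly from the species counts; a quick arithmetic check (using only the counts given in the abstract):

```python
total_species = 121   # finfish species identified in the low value by-catch
perciformes = 74      # species in order Perciformes
carangidae = 14       # species in family Carangidae

pct = lambda part, whole: round(part / whole * 100, 2)

print(pct(perciformes, total_species))  # 61.16, as reported
print(pct(carangidae, total_species))   # 11.57, as reported
```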
Detecting Baryon Acoustic Oscillations with third generation gravitational wave observatories
We explore the possibility of detecting Baryon Acoustic Oscillations (BAO) solely from gravitational wave observations of binary neutron star mergers with third generation (3G) gravitational wave (GW) detectors like Cosmic Explorer and the Einstein Telescope. These measurements would provide a new independent probe of cosmology. The detection of the BAO peak with current generation GW detectors (solely from GW observations) is not possible because i) unlike galaxies, the GW mergers are poorly localized and ii) there are not enough merger events to probe the BAO length scale. With the 3G GW detector network, it is possible to observe binary neutron star mergers per year localized well within one square degree in the sky for redshift . We show that 3G observatories will enable precision measurements of the BAO feature in the large-scale two-point correlation function; the effect of BAO can be independently detected at different redshifts, with a log-evidence ratio of 23, 17, or 3 favouring a model with a BAO peak at redshift of 0.2, 0.25, or 0.3, respectively, using a redshift bin corresponding to a shell of thickness Mpc.
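The large-scale two-point correlation function in which the BAO peak would be measured is conventionally estimated with the Landy-Szalay estimator. A minimal sketch, assuming 3-D comoving source positions and a matching random catalogue (a toy that ignores the localization uncertainties the abstract highlights; the function names are ours):

```python
import numpy as np

def pair_counts(a, b, edges):
    """Histogram of all pairwise separations between point sets a and b."""
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return np.histogram(d, bins=edges)[0]

def landy_szalay(data, randoms, edges):
    """Landy-Szalay estimator xi(r) = (DD - 2*DR + RR) / RR, with each
    pair count normalized by its number of ordered pairs. Zero-separation
    self-pairs fall below the first bin edge and are dropped."""
    nd, nr = len(data), len(randoms)
    dd = pair_counts(data, data, edges) / (nd * (nd - 1))
    rr = pair_counts(randoms, randoms, edges) / (nr * (nr - 1))
    dr = pair_counts(data, randoms, edges) / (nd * nr)
    return (dd - 2 * dr + rr) / rr

# Toy check: "merger" positions in a unit box. In real data the BAO
# signature would appear as a bump in xi(r) near the acoustic scale.
rng = np.random.default_rng(1)
pts = rng.random((200, 3))
edges = np.linspace(0.2, 1.0, 5)
xi = landy_szalay(pts, pts.copy(), edges)
# With data identical to the randoms, xi is the finite-pair offset 2/N
# in every populated bin.
```

Poor sky localization smears each position over its error volume, which is exactly why the abstract argues that sub-square-degree 3G localizations are the enabling ingredient for this measurement.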
SAMPLE-A RAW DATA FOR FUTURE INFORMATION OF LIFE
Gene amplification requires a biological sample, collected according to the demands and type of the intended analysis. The sample plays a vital role, serving as the source of the core material, deoxyribonucleic acid (DNA). Human blood has been an essential source of DNA since the commencement of DNA extraction in the 19th century. Since then, several protocols have been developed according to the requirements of samples of different forms and the various methods of their retrieval. A sample collected for gene analysis has various characteristics and prerequisites, some of which are significant but not widely considered. This work analyses various general sample collection protocols and highlights some of the major characteristics of and prerequisites for the sample, aiming to reach the core of genomics by bringing out the importance of the basic raw material and its consequences for amplification, which is highly sensitive.
Rate constants for proteins binding to substrates with multiple binding sites using a generalized forward flux sampling expression
- …
