
    Statistical Design of Experiment: A Tool for Mineral Engineers

    The classical method of one-factor-at-a-time experiment (keeping all other factors constant) requires a large number of trials involving time, energy, and money, yet the effects of interactions between the various factors are not brought out clearly. In an effort to find optimum conditions with fewer experiments and to secure sufficient quantitative information about the system, statistical or factorial design of experiments has been devised and developed.
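    The contrast with one-factor-at-a-time testing can be sketched with a minimal two-level, two-factor (2^2) factorial design. The factor names and response values below are hypothetical, not from the abstract; they only illustrate how a factorial layout yields both main effects and the interaction effect from the same four runs.

```python
import itertools

# Hypothetical 2^2 factorial experiment: two factors (A, B), each at a
# low (-1) and high (+1) level, one measured response per run.
runs = list(itertools.product([-1, 1], repeat=2))   # (A, B) settings
responses = [52.0, 60.0, 54.0, 68.0]                # illustrative yields

n = len(runs)

def effect(column):
    """Main effect = mean response at +1 minus mean response at -1."""
    return sum(r * c[column] for c, r in zip(runs, responses)) / (n / 2)

effect_A = effect(0)
effect_B = effect(1)
# The interaction effect uses the product of the two factor columns --
# exactly the information a one-factor-at-a-time plan cannot provide.
effect_AB = sum(r * a * b for (a, b), r in zip(runs, responses)) / (n / 2)

print(effect_A, effect_B, effect_AB)  # 5.0 11.0 3.0
```

    Four runs estimate three effects; a one-factor-at-a-time plan of the same size would estimate only the two main effects and miss the interaction entirely.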

    Integrated force method versus displacement method for finite element analysis

    A novel formulation termed the integrated force method (IFM) has been developed in recent years for analyzing structures. In this method all the internal forces are taken as independent variables, and the system equilibrium equations (EE's) are integrated with the global compatibility conditions (CC's) to form the governing set of equations. In IFM the CC's are obtained from the strain formulation of St. Venant, and no choices of redundant load systems have to be made, in contrast to the standard force method (SFM). This property of IFM allows the generation of the governing equations to be automated straightforwardly, as it is in the popular stiffness method (SM). In this report IFM and SM are compared with respect to the structure of their respective equations, their conditioning, required solution methods, overall computational requirements, and convergence properties, as these factors influence the accuracy of the results. Overall, this new version of the force method produces more accurate results than the stiffness method for comparable computational cost.
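    The idea of stacking equilibrium equations with compatibility conditions into one governing system can be illustrated on a toy problem. The example below (three parallel springs carrying a load, with assumed stiffnesses) is not from the report; it only shows the structure of a force-method system in which member forces are the unknowns.

```python
# Toy force-method setup: three parallel springs of stiffness k[i]
# share a total load P. Unknowns are the member forces F[i].
# Governing rows: one equilibrium equation plus two compatibility
# conditions (equal elongation e = F[i]/k[i] for all members).
k = [2.0, 3.0, 5.0]   # assumed spring stiffnesses
P = 10.0              # assumed applied load

A = [
    [1.0, 1.0, 1.0],                    # F1 + F2 + F3 = P   (equilibrium)
    [1.0 / k[0], -1.0 / k[1], 0.0],     # F1/k1 - F2/k2 = 0  (compatibility)
    [0.0, 1.0 / k[1], -1.0 / k[2]],     # F2/k2 - F3/k3 = 0  (compatibility)
]
b = [P, 0.0, 0.0]

def solve3(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(b)
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

F = solve3(A, b)
print(F)  # forces come out proportional to stiffness: [2.0, 3.0, 5.0]
```

    Because the forces are solved for directly, no redundant-force selection is needed; the compatibility rows are assembled mechanically alongside the equilibrium row, which is the automation property the abstract attributes to IFM.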

    DRSP: Dimension Reduction for Similarity Matching and Pruning of Time Series Data Streams

    Similarity matching and join of time series data streams have gained considerable relevance in today's world of large streaming data. This process finds wide-scale application in areas such as location tracking, sensor networks, and object positioning and monitoring. However, as the size of the data stream increases, so does the cost of retaining all the data needed for similarity matching. We develop a novel framework to address the following objectives. First, dimension reduction is performed in the preprocessing stage, where large stream data is segmented and reduced into a compact representation that retains all the crucial information, using a technique called Multi-level Segment Means (MSM). This reduces the space complexity associated with the storage of large time-series data streams. Second, the framework incorporates an effective similarity-matching technique to analyze whether new data objects are similar to the existing data stream. Finally, a pruning technique filters out the pseudo data object pairs and joins only the relevant pairs. The computational cost for MSM is O(l*ni) and the cost for pruning is O(DRF*wsize*d), where DRF is the Dimension Reduction Factor. We have performed exhaustive experimental trials to show that the proposed framework is both efficient and competent in comparison with earlier works. Comment: 20 pages, 8 figures, 6 tables
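    The segment-means idea behind MSM can be sketched as follows. The abstract does not specify the exact MSM scheme, so this is a hedged, single-idea illustration: each window of the stream is replaced by its mean, and a hypothetical `multilevel` helper shows how successively coarser reductions of the same series might form the multi-level representation.

```python
def segment_means(series, num_segments):
    """Reduce a series to num_segments window means (a PAA-style sketch)."""
    n = len(series)
    out = []
    for i in range(num_segments):
        lo = i * n // num_segments          # window boundaries cover the
        hi = (i + 1) * n // num_segments    # whole series without overlap
        chunk = series[lo:hi]
        out.append(sum(chunk) / len(chunk))
    return out

def multilevel(series, levels):
    """Assumed multi-level variant: keep successively coarser reductions."""
    reps = []
    m = len(series)
    for _ in range(levels):
        m //= 2
        reps.append(segment_means(series, m))
    return reps

stream = [1, 2, 3, 4, 5, 6, 7, 8]
print(segment_means(stream, 4))  # [1.5, 3.5, 5.5, 7.5]
print(multilevel(stream, 2))     # [[1.5, 3.5, 5.5, 7.5], [2.5, 6.5]]
```

    Storing only the segment means shrinks the stream by the reduction factor while preserving coarse shape, which is what makes cheap similarity matching and pruning on the reduced representation possible.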

    A deep, high resolution survey of the low frequency radio sky

    We report on the first wide-field, very long baseline interferometry (VLBI) survey at 90 cm. The survey area consists of two overlapping 28 deg^2 fields centred on the quasar J0226+3421 and the gravitational lens B0218+357. A total of 618 sources were targeted in these fields, based on identifications from Westerbork Northern Sky Survey (WENSS) data. Of these sources, 272 had flux densities that, if unresolved, would fall above the sensitivity limit of the VLBI observations. A total of 27 sources were detected as far as 2 arcdegrees from the phase centre. The results of the survey suggest that at least 10% of moderately faint (S~100 mJy) sources found at 90 cm contain compact components smaller than ~0.1 to 0.3 arcsec and stronger than 10% of their total flux densities. A ~90 mJy source was detected in the VLBI data that was not seen in the WENSS and NRAO VLA Sky Survey (NVSS) data and may be a transient or highly variable source that has been serendipitously detected. This survey is the first systematic (and non-biased), deep, high-resolution survey of the low-frequency radio sky. It is also the widest-field VLBI survey with a single pointing to date, exceeding the total survey area of previous higher-frequency surveys by two orders of magnitude. These initial results suggest that new low-frequency telescopes, such as LOFAR, should detect many compact radio sources and that plans to extend these arrays to baselines of several thousand kilometres are warranted. Comment: Accepted by The Astrophysical Journal. 39 pages, 4 figures

    Handling and analysis of ices in cryostats and glove boxes in view of cometary samples

    Comet nucleus sample return missions and other return missions from planets and satellites need equipment for handling and analysis of icy samples at low temperatures under vacuum or protective gas. Two methods are reported which were developed for analysis of small icy samples and which are modified for larger samples in cometary matter simulation experiments (KOSI). A conventional optical cryostat system was modified to allow for transport of samples at 5 K, ion beam irradiation, and measurement in an off-line optical spectrophotometer. The new system consists of a removable window plug containing nozzles for condensation of water and volatiles onto a cold finger. This plug can be removed in a vacuum system and exchanged for another plug (e.g., with other windows (IR, VIS, VUV) or other nozzles). While open, the samples can be treated under vacuum with cooling by manipulators (cutting, removal, sample taking, irradiation with light, photons, or ions). After the plug is brought back, the samples can be moved to another site of analysis. For handling the 30 cm diameter mineral-ice samples from the KOSI experiments, an 80x80x80 cm glove box made of plexiglass was used. The samples were kept in a liquid nitrogen bath, which was filled from the outside. A stream of dry N2 and evaporating gas from the bath purged the glove box of impurity gases, in particular H2O, which would otherwise condense onto the samples.

    Histopathology of gill, liver, muscle and brain of Cyprinus carpio communis L. exposed to sublethal concentration of lead and cadmium

    Histological studies of organs such as gill, liver, muscle and brain of Cyprinus carpio communis were made to assess tissue damage due to sublethal concentrations of the heavy metals lead and cadmium after 28 days of exposure. In lead-treated gill, disintegration and fusion of primary lamellae and extensive vacuolization with disruption of the epithelial lining were observed, whereas on sublethal exposure to cadmium, hyperplasia of the branchial arch, vacuolization and congestion of blood vessels were well marked. Metal accumulation was clearly visible in treated liver, with degeneration and severe necrosis. Both lead- and cadmium-treated fish showed marked thickening and separation of muscle bundles with severe intramuscular oedema, more pronounced under sublethal cadmium treatment. Neuronal cell degeneration, swelling of pyramidal cells, vacuolization and dystrophic changes were characteristic features observed in treated brain. Keywords: lead, cadmium, histopathology, Cyprinus carpio communis