3,192 research outputs found

    Y2K Interruption: Can the Doomsday Scenario Be Averted?

    The management philosophy until recent years has been to replace workers with computers, which are available 24 hours a day, need no benefits or insurance, and never complain. But as the year 2000 approached, along with it came the fear of the millennium bug, generally known as Y2K, and the computers threatened to strike. Y2K, though an abbreviation of year 2000, generally refers to the computer glitches associated with the year 2000. In the early days of computing, computer companies, in order to save memory and money, adopted a voluntary standard under which computers automatically converted any year designated by two digits, such as 99, into 1999 by prefixing the digits 19. This saved an enormous amount of memory, and thus money, because large databases containing birth dates or other dates only needed to store the last two digits, such as 65 or 86. But it also created a built-in flaw that could make computers inoperable from January 2000. The problem is that most of these old computers are programmed to convert 00 (for the year 2000) into 1900, not 2000. Trouble could therefore arise whenever a system had to deal with dates outside the 1900s. In 2000, for example, a programme that calculates the age of a person born in 1965 will subtract 65 from 00 and get -65. The problem is most acute in mainframe systems, but that does not mean PCs, UNIX and other computing environments are trouble free. Any computer system that relies on date calculations must be tested, because the Y2K or millennium bug arises from a potential for “date discontinuity”, which occurs when the time expressed by a system, or any of its components, does not move in consonance with real time. Though attention has focused on the potential problems linked with the change from 1999 to 2000, date discontinuity may occur at other times in and around this period.
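    The two-digit-year arithmetic described in the abstract can be sketched in a few lines. The function names and the windowing rule below are illustrative assumptions, not part of the original text; the sketch simply contrasts the flawed subtraction with a Y2K-safe variant.

```python
# Illustration (hypothetical helpers): the two-digit year flaw described above.
# Storing only the last two digits of a year makes an age calculation go
# wrong once the calendar rolls over to 2000.

def age_two_digit(birth_yy: int, current_yy: int) -> int:
    """Age computed the pre-Y2K way: both years are two-digit values."""
    return current_yy - birth_yy

def age_four_digit(birth_yy: int, current_year: int) -> int:
    """A Y2K-safe variant: expand the stored two-digit year with an assumed
    windowing rule (00-29 -> 20xx, 30-99 -> 19xx) before subtracting."""
    century = 2000 if birth_yy < 30 else 1900
    return current_year - (century + birth_yy)

# A person born in 1965, with the age computed in the year 2000:
print(age_two_digit(65, 0))      # -65, the flaw described in the abstract
print(age_four_digit(65, 2000))  # 35, the intended age
```

    The windowing rule is one common remediation; real systems chose window boundaries to suit their own data.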

    Using semantic indexing to improve searching performance in web archives

    The sheer volume of electronic documents being published on the Web can be overwhelming for users if the searching aspect is not properly addressed. This problem is particularly acute inside archives and repositories containing large collections of web resources or, more precisely, web pages and other web objects. Using the existing search capabilities in web archives, results can be compromised by the size of the data, content heterogeneity and changes in scientific terminologies and meanings. During the course of this research, we will explore whether semantic web technologies, particularly ontology-based annotation and retrieval, could improve the precision of search results in multi-disciplinary web archives.

    Velocity correlations in dense granular flows observed with internal imaging

    We show that the velocity correlations in uniform dense granular flows inside a silo are similar to the hydrodynamic response of an elastic hard-sphere liquid. The measurements are made using a fluorescent refractive index matched interstitial fluid in a regime where the flow is dominated by grains in enduring contact and fluctuations scale with the distance traveled, independent of flow rate. The velocity autocorrelation function of the grains in the bulk shows a negative correlation at short time and slow oscillatory decay to zero similar to simple liquids. Weak spatial velocity correlations are observed over several grain diameters. The mean square displacements show an inflection point indicative of caging dynamics. The observed correlations are qualitatively different at the boundaries.

    Comment: 11 pages, 4 figures

    Rotor Eddy-Current loss in permanent-magnet brushless AC machines

    This paper analyzes rotor eddy-current loss in permanent-magnet brushless AC machines. It is shown that analytical or finite-element techniques published in the literature for predicting rotor eddy-current loss using space-harmonic-based approaches may not yield correct results in each magnet segment when one magnet pole is circumferentially segmented into more than two pieces. It is also shown that the eddy-current loss in each equally segmented piece may differ by a large margin, which implies that the temperature distribution in the magnets will be uneven and that the risk of demagnetization has to be carefully assessed. The theoretical derivation is validated by time-stepped transient finite-element analysis.

    The “broken escalator” phenomenon: Vestibular dizziness interferes with locomotor adaptation

    BACKGROUND: Although vestibular lesions degrade postural control, we do not know the relative contributions of the magnitude of the vestibular loss and subjective vestibular symptoms to locomotor adaptation. OBJECTIVE: To study how dizzy symptoms interfere with adaptive locomotor learning. METHODS: We examined patients with contrasting peripheral vestibular deficits, vestibular neuritis in the chronic stable phase (n = 20) and strongly symptomatic unilateral Meniere’s disease (n = 15), compared to age-matched healthy controls (n = 15). We measured locomotor adaptive learning using the “broken escalator” aftereffect, simulated on a motorised moving sled. RESULTS: Patients with Meniere’s disease had an enhanced “broken escalator” postural aftereffect. More generally, the size of the locomotor aftereffect was related to how symptomatic patients were across both groups. In contrast, the degree of peripheral vestibular loss was not correlated with symptom load or locomotor aftereffect size. During the MOVING trials, both patient groups had larger levels of instability (trunk sway) and reduced adaptation compared with normal controls. CONCLUSION: Dizziness symptoms influence locomotor adaptation and its subsequent expression through motor aftereffects. Given that the unsteadiness experienced during the “broken escalator” paradigm is internally driven, the enhanced aftereffect found represents a new type of self-generated postural challenge for vestibular/unsteady patients.

    Modeling Probability of Path Loss for DSDV, OLSR and DYMO above 802.11 and 802.11p

    This paper presents a path loss model, along with a framework for its probability distribution function, for VANETs. Furthermore, we simulate three routing protocols, Destination-Sequenced Distance Vector (DSDV), Optimized Link State Routing (OLSR) and Dynamic MANET On-demand (DYMO), in NS-2 to evaluate and compare their performance using two MAC-layer protocols, 802.11 and 802.11p. A novel aspect of this work is the modification of existing parameters to achieve high efficiency. After extensive simulations, we observe that DSDV performs best with 802.11p, while DYMO gives the best performance with 802.11.

    Comment: IEEE 8th International Conference on Broadband and Wireless Computing, Communication and Applications (BWCCA'13), Compiegne, France

    Parametric and Reliability Estimation of the Kumaraswamy Generalized Distribution Based on Record Values

    A general family of distributions, namely the Kumaraswamy generalized (Kw-G) family, is considered for estimation of the unknown parameters and the reliability function based on record data from a Kw-G distribution. The maximum likelihood estimators (MLEs) are derived for the unknown parameters and the reliability function, along with their confidence intervals. A Bayesian study is carried out under symmetric and asymmetric loss functions in order to find the Bayes estimators of the unknown parameters and the reliability function. Future record values are predicted using Bayesian and non-Bayesian approaches, illustrated through numerical examples and a Monte Carlo simulation.
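    As a minimal sketch of the family the abstract refers to: the Kw-G construction wraps any baseline cdf G(x) with two shape parameters a, b > 0 via F(x) = 1 - (1 - G(x)^a)^b. The choice of an exponential baseline and the function names below are illustrative assumptions, not taken from the paper.

```python
import math

# Sketch (assumed exponential baseline, not from the paper): the Kw-G family
# transforms a baseline cdf G as  F(x) = 1 - (1 - G(x)**a)**b  for a, b > 0.

def kw_g_cdf(x: float, a: float, b: float, lam: float = 1.0) -> float:
    g = 1.0 - math.exp(-lam * x)       # baseline exponential cdf G(x)
    return 1.0 - (1.0 - g ** a) ** b   # Kw-G transformation

def kw_g_reliability(x: float, a: float, b: float, lam: float = 1.0) -> float:
    """Reliability (survival) function R(x) = 1 - F(x)."""
    return 1.0 - kw_g_cdf(x, a, b, lam)

# With a = b = 1 the family reduces to the baseline distribution itself:
print(kw_g_cdf(1.0, 1.0, 1.0))  # equals 1 - exp(-1) for the exponential baseline
```

    The reliability function R(x) above is the quantity whose estimators the abstract derives.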

    Automated Reconstruction of 3D Open Surfaces from Sparse Point Clouds

    Real-world 3D data may contain intricate details defined by salient surface gaps. Automated reconstruction of these open surfaces (e.g., non-watertight meshes) is a challenging problem for environment synthesis in mixed reality applications. Current learning-based implicit techniques can achieve high fidelity on closed-surface reconstruction. However, their dependence on the distinction between the inside and outside of a surface makes them incapable of reconstructing open surfaces. Recently, a new class of implicit functions has shown promise in reconstructing open surfaces by regressing an unsigned distance field. Yet, these methods rely on a discretized representation of the raw data, which loses important surface details and can lead to outliers in the reconstruction. We propose IPVNet, a learning-based implicit model that predicts the unsigned distance between a surface and a query point in 3D space by leveraging both raw point cloud data and its discretized voxel counterpart. Experiments on synthetic and real-world public datasets demonstrate that IPVNet outperforms the state of the art while producing far fewer outliers in the reconstruction.

    Comment: To be presented at the 2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR) Workshop on Photorealistic Image and Environment Synthesis for Mixed Reality (PIES-MR)
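    The unsigned-distance idea the abstract builds on can be illustrated without any learned model: the ground-truth unsigned distance from a query point to a sampled surface is simply the distance to the nearest surface sample. This sketch is not IPVNet (which regresses this quantity with a network); the helper name and the toy surface are assumptions for illustration.

```python
import math

# Illustrative sketch (not IPVNet): the exact unsigned distance field of a
# point cloud. Unlike a signed distance, it is well defined for open
# (non-watertight) surfaces, since no inside/outside distinction is needed.

def unsigned_distance(query, points):
    """Distance from a 3D query point to its nearest point in the cloud."""
    return min(math.dist(query, p) for p in points)

# A few samples from an open surface: a flat patch in the z = 0 plane.
surface = [(x, y, 0.0) for x in (0.0, 0.5, 1.0) for y in (0.0, 0.5, 1.0)]
print(unsigned_distance((0.5, 0.5, 2.0), surface))  # 2.0
```

    A learned model replaces the nearest-neighbour search with a network evaluation, trading exactness for speed and the ability to interpolate between sparse samples.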