
    Forecasts Marine Weather on Java Sea Using Hybrid Methods: TS-ANFIS

    Indonesia is an archipelago. Consequently, the majority of its people work around the sea, for example as fishermen. As the number of activities at sea increases, the number of accidents is also rising. This research presents a marine weather prediction system using the hybrid TS-ANFIS method (Adaptive Neuro-Fuzzy Inference System – Time Series) in order to anticipate bad weather and reduce risk. The method uses both ocean current and wave height data from the Java Sea, particularly off Gresik, to forecast ocean current velocity and wave height. The input variables are the data at time (t), one hour before (t-1), and two hours before (t-2); the outputs are predictions for the next hour, next 6 hours, next 12 hours, and next day. The results indicate that predicted ocean current speed reaches 16.97327 cm/s, 13.22302 cm/s, 10.21107 cm/s, and 14.09871 cm/s with mean errors of about 0.12993, 1.5758, 1.3182, and 0.82613, while predicted wave height reaches 0.45554 m, 0.48286 m, 0.46395 m, and 0.54571 m with mean errors of about 0.0012247, 0.018619, 0.046584, and 0.060206. Therefore, it was safe to sail on 1st January 2016.
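    As a rough illustration of the input scheme described above, the sketch below builds feature vectors from readings at (t), (t-1) and (t-2) paired with a target several hours ahead, the form of dataset a TS-ANFIS model would be trained on. The function name and the synthetic series are stand-ins, not from the paper.

```python
import numpy as np

def make_lagged_dataset(series, horizon):
    """Build (X, y) pairs from a 1-D series using the values at
    t, t-1 and t-2 as inputs and the value `horizon` steps ahead
    as the target, mirroring the paper's input scheme."""
    X, y = [], []
    for t in range(2, len(series) - horizon):
        X.append([series[t], series[t - 1], series[t - 2]])
        y.append(series[t + horizon])
    return np.array(X), np.array(y)

# Example: hourly current-speed readings, predicting 6 hours ahead.
speeds = np.sin(np.linspace(0, 20, 200)) * 10 + 12  # synthetic stand-in data
X, y = make_lagged_dataset(speeds, horizon=6)
print(X.shape, y.shape)  # (192, 3) (192,)
```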

    Data Curation Format Profile: netCDF

    A primer for the netCDF (*.nc) file format for the purpose of data curation. It covers various tools for viewing netCDF files and questions to consider when curating a netCDF dataset or file.
    https://deepblue.lib.umich.edu/bitstream/2027.42/145724/1/Public-Data-CurationFormat Profile-netCDF.pdf (original version)
    https://deepblue.lib.umich.edu/bitstream/2027.42/145724/5/Public-Data-CurationFormat Profile-netCDFv2.pdf (version 2.0, with Integrated Data Viewer information from Sophie Ho)
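    For a sense of what a curator might inspect in such a file, here is a minimal sketch using the netCDF4 Python library. The file name "example.nc" and the "temperature" variable are hypothetical.

```python
from netCDF4 import Dataset  # pip install netCDF4

# Open a netCDF file read-only and inspect its structure.
ds = Dataset("example.nc", mode="r")  # hypothetical file name

print(ds.dimensions.keys())  # names of the dimensions, e.g. time/lat/lon
for name, var in ds.variables.items():
    # Each variable carries its own metadata attributes.
    print(name, var.dimensions, getattr(var, "units", "no units"))

temp = ds.variables["temperature"][:]  # assumes a 'temperature' variable
ds.close()
```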

    Metocean Big Data Processing Using Hadoop

    This report discusses MapReduce and how it handles big data. Metocean (Meteorology and Oceanography) data is used as the case study because it consists of very large datasets. As the number and type of data acquisition devices grows annually, the sheer size and rate of data being collected is rapidly expanding. These big data sets can contain gigabytes or terabytes of data, and can grow on the order of megabytes or gigabytes per day. While the collection of this information presents opportunities for insight, it also presents many challenges. Most algorithms are not designed to process big data sets in a reasonable amount of time or with a reasonable amount of memory. MapReduce allows us to meet many of these challenges and gain important insights from large data sets. The objective of this project is to use MapReduce to handle big data. MapReduce is a programming technique for analysing data sets that do not fit in memory. The problem statement chapter discusses how MapReduce provides an advantage when dealing with large data. The literature review explains the definitions of NoSQL and RDBMS, Hadoop MapReduce and big data, considerations when selecting a database, NoSQL database deployments, scenarios for using Hadoop, and a Hadoop real-world example. The methodology chapter explains the waterfall method used in the project's development. The results and discussion chapter presents the project's results in detail. The final chapter of the report gives the conclusion and recommendations.
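    To illustrate the programming model the report describes, here is a minimal pure-Python simulation of the map, shuffle and reduce phases on a word count. This is a sketch of the paradigm, not the project's Hadoop code.

```python
from collections import defaultdict
from itertools import chain

# Map phase: each record is turned into (key, value) pairs independently,
# so records can be processed in parallel across many machines.
def mapper(record):
    for word in record.split():
        yield (word.lower(), 1)

# Shuffle phase: group all values emitted for the same key.
def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

# Reduce phase: collapse each key's values into a single result.
def reducer(key, values):
    return key, sum(values)

records = ["wave height sensor", "wave period sensor"]
pairs = chain.from_iterable(mapper(r) for r in records)
result = dict(reducer(k, v) for k, v in shuffle(pairs).items())
print(result)  # {'wave': 2, 'height': 1, 'sensor': 2, 'period': 1}
```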

    On the role of pre and post-processing in environmental data mining

    The quality of discovered knowledge depends heavily on data quality. Unfortunately, real data often contain noise, uncertainty, errors, redundancies or even irrelevant information. The more complex the reality to be analyzed, the higher the risk of getting low quality data. Knowledge Discovery from Databases (KDD) offers a global framework for preparing data in the right form to perform correct analyses. On the other hand, the quality of decisions taken upon KDD results depends not only on the quality of the results themselves, but on the capacity of the system to communicate those results in an understandable form. Environmental systems are particularly complex, and environmental users particularly require clarity in their results. This paper provides some details about how this can be achieved, and discusses the role of pre- and post-processing in the whole process of Knowledge Discovery in environmental systems.
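    As a small illustration of the kind of pre-processing the paper discusses, the sketch below removes duplicate records, masks a physically impossible reading, and fills gaps. The table, column names and valid range are invented for the example.

```python
import numpy as np
import pandas as pd

# Hypothetical sensor table with typical problems: a duplicated
# record, a missing reading, and a physically impossible value.
df = pd.DataFrame({
    "temp_c": [12.1, np.nan, 12.4, 99.0, 12.3, 12.3],
    "ph":     [7.2,  7.1,    7.0,  7.3,  7.2,  7.2],
})

df = df.drop_duplicates()                    # drop redundant rows
df["temp_c"] = df["temp_c"].mask(            # flag out-of-range values
    ~df["temp_c"].between(-5, 45))           # as missing (sensor fault)
df = df.interpolate(limit_direction="both")  # fill gaps from neighbours
print(df)
```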

    Automatic differentiation in machine learning: a survey

    Derivatives, mostly in the form of gradients and Hessians, are ubiquitous in machine learning. Automatic differentiation (AD), also called algorithmic differentiation or simply "autodiff", is a family of techniques similar to but more general than backpropagation for efficiently and accurately evaluating derivatives of numeric functions expressed as computer programs. AD is a small but established field with applications in areas including computational fluid dynamics, atmospheric sciences, and engineering design optimization. Until very recently, the fields of machine learning and AD have largely been unaware of each other and, in some cases, have independently discovered each other's results. Despite its relevance, general-purpose AD has been missing from the machine learning toolbox, a situation slowly changing with its ongoing adoption under the names "dynamic computational graphs" and "differentiable programming". We survey the intersection of AD and machine learning, cover applications where AD has direct relevance, and address the main implementation techniques. By precisely defining the main differentiation techniques and their interrelationships, we aim to bring clarity to the usage of the terms "autodiff", "automatic differentiation", and "symbolic differentiation" as these are encountered more and more in machine learning settings.
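    A minimal sketch of forward-mode AD with dual numbers, one of the techniques such surveys define, may help fix ideas. The `Dual` class below is illustrative, not from the paper.

```python
from dataclasses import dataclass
import math

@dataclass
class Dual:
    """Dual number: `val` carries the function value and `dot` the
    derivative, propagated exactly by the chain rule at every
    operation (forward-mode automatic differentiation)."""
    val: float
    dot: float

    def __add__(self, other):
        return Dual(self.val + other.val, self.dot + other.dot)

    def __mul__(self, other):
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)

def sin(x: Dual) -> Dual:
    # Chain rule through the primitive: d/dt sin(x) = cos(x) * x'
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

# f(x) = x * sin(x), so f'(x) = sin(x) + x * cos(x)
x = Dual(2.0, 1.0)   # seed derivative dx/dx = 1
y = sin(x) * x
print(y.val, y.dot)  # ~1.8186, ~0.0770 (= sin 2 + 2 cos 2)
```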

    Dermatological diagnosis by mobile application

    A health care mobile application delivers the right information at the right time and place to help patients, clinicians and managers make correct and accurate decisions, leading to safer care with less waste and fewer errors, delays and duplicated efforts. Most people experience a skin illness at some point in their lives, because the skin is the body's largest organ and is highly exposed, significantly increasing its risk of becoming diseased or damaged. This paper aims to detect skin disease through a mobile app on the Android platform, providing valid, trustworthy and useful dermatological information on four skin diseases, such as acne, psoriasis, skin rash and melanoma, with content for each skin condition. The content includes name, image, description, symptoms, treatment and prevention, with support for multiple languages: English, Bahasa and Mandarin. The application can take and send video, as well as normal and magnified photos, to your dermatologist as an email attachment with comments over a safe, secure network; it also has built-in privacy features protecting dermatologists' access to your photos and videos. The mobile application helps dermatologists diagnose and treat their patients without an office visit; teledermatology is recognized by all major insurance companies.

    Analysis of the evolution of sovereign bond yields using wavelet techniques

    The term "wavelets" covers a set of resources from mathematical analysis that have proven their efficiency in system identification in areas such as hydrology, geology, glaciology, climatology and energy resources optimization. The methodology used in systems engineering can be extrapolated to anything conceptualized as a "complex system", whatever its nature. Wavelet techniques provide a description of non-stationary components and of the evolution of macroeconomic variables in the frequency domain. The identification of predominant frequency scales and transient effects in time series highlights the multiresolution analysis, which would be more difficult to treat with traditional methods of econometrics. A review of the literature shows the potential problems that can be solved with these techniques, including prediction of benefits calculated on the evolution of the risk premium of a country, the extraction of symmetric macroeconomic shocks in country clusters, or detection of transient effects on the mutual influence of sovereign bonds between pairs of countries, among others. The dissertation culminates in specific applications that show the power of wavelet techniques in identifying possible determinants and correlations of the evolution of sovereign bond yields in the euro area countries.
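    As a toy illustration of multiresolution analysis on a yield-like series, the sketch below uses the PyWavelets library. The synthetic series, the 'db4' wavelet and the 3-level decomposition are assumptions for the example, not the dissertation's setup.

```python
import numpy as np
import pywt  # PyWavelets: pip install PyWavelets

# Synthetic stand-in for a daily bond-yield series: a slow trend,
# a periodic component, and a short-lived shock, the kind of
# structure wavelet analysis separates by frequency scale.
t = np.linspace(0, 1, 512)
yields = 0.02 * t + 0.001 * np.sin(40 * np.pi * t)
yields[250:260] += 0.005  # transient shock

# Multiresolution decomposition: cA3 holds the coarse trend,
# cD3..cD1 hold progressively finer detail scales.
cA3, cD3, cD2, cD1 = pywt.wavedec(yields, "db4", level=3)
print(len(cA3), len(cD3), len(cD2), len(cD1))
```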

    Automatic processing, quality assurance and serving of real-time weather data

    Recent advances in technology have produced a significant increase in the availability of free sensor data over the Internet. With affordable weather monitoring stations now available to individual meteorology enthusiasts, a reservoir of real-time data such as temperature, rainfall and wind speed can now be obtained for most of the United States and Europe. Despite the abundance of available data, obtaining usable information about the weather in your local neighbourhood requires complex processing that poses several challenges. This paper discusses a collection of technologies and applications that harvest, refine and process this data, culminating in information that has been tailored toward the user. In this case we are particularly interested in allowing a user to make direct queries about the weather at any location, even when this is not directly instrumented, using interpolation methods. We also consider how the uncertainty that the interpolation introduces can then be communicated to the user of the system, using UncertML, a developing standard for uncertainty representation.
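    One simple interpolator that could answer such point queries is inverse distance weighting; the sketch below is illustrative and not necessarily the method used in the paper. Station coordinates and readings are invented.

```python
import numpy as np

def idw(stations, values, query, power=2.0):
    """Inverse-distance-weighted estimate at `query` from nearby
    station readings: closer stations get larger weights."""
    d = np.linalg.norm(stations - query, axis=1)
    if np.any(d == 0):  # query coincides with a station
        return float(values[np.argmin(d)])
    w = 1.0 / d**power
    return float(np.sum(w * values) / np.sum(w))

# Hypothetical stations (lat, lon) and their temperature readings.
stations = np.array([[52.0, -1.0], [52.2, -1.3], [51.9, -0.8]])
temps = np.array([14.2, 13.6, 14.8])

# Estimate the temperature at an uninstrumented location.
print(idw(stations, temps, np.array([52.1, -1.1])))
```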

    Application for photogrammetry of organisms

    Single-camera photogrammetry is a well-established procedure to retrieve quantitative information from objects using photography. In biological sciences, photogrammetry is often applied to aid in morphometry studies, focusing on the comparative study of shapes and organisms. Two types of photogrammetry are used in morphometric studies: 2D photogrammetry, where distance and angle measurements are used to quantitatively describe attributes of an object, and 3D photogrammetry, where data on landmark coordinates are used to reconstruct an object's true shape. Although there are excellent software tools for 3D photogrammetry available, software specifically designed to aid in the somewhat simpler 2D photogrammetry is lacking. Therefore, most studies applying 2D photogrammetry still rely on manual acquisition of measurements from pictures, which must then be scaled to an appropriate measuring system. This is often a laborious multistep process, in most cases utilizing diverse software to complete different tasks. In addition to being time-consuming, it is also error-prone, since measurement recording is often done manually. The present work aimed to tackle those issues by implementing a new cross-platform software tool able to integrate and streamline the workflow usually applied in 2D photogrammetry studies. Results from a preliminary study show a decrease of 45% in processing time when using the software developed in the scope of this work compared with a competing methodology. Existing limitations and future work towards improved versions of the software are discussed.
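    A typical 2D photogrammetry scaling step, of the kind such software automates, can be sketched in a few lines. The scale-bar length and pixel coordinates below are invented for the example.

```python
import math

def scale_factor(ref_px, ref_len_mm):
    """Millimetres per pixel, from a reference object of known length."""
    return ref_len_mm / ref_px

# A 10 mm scale bar spans 250 px in the photograph (assumed values).
mm_per_px = scale_factor(250.0, 10.0)

# Two landmarks clicked on the organism, in pixel coordinates.
a, b = (102.0, 344.0), (415.0, 360.0)
length_mm = math.dist(a, b) * mm_per_px
print(f"{length_mm:.2f} mm")  # ~12.54 mm
```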