Unsupervised Domain Adaptation with Similarity Learning
The objective of unsupervised domain adaptation is to leverage features from
a labeled source domain and learn a classifier for an unlabeled target domain,
with a similar but different data distribution. Most deep learning approaches
to domain adaptation consist of two steps: (i) learn features that preserve a
low risk on labeled samples (source domain) and (ii) make the features from
both domains as indistinguishable as possible, so that a classifier
trained on the source domain can also be applied to the target domain. In general, the
classifiers in step (i) consist of fully-connected layers applied directly on
the indistinguishable features learned in (ii). In this paper, we propose a
different way to do the classification, using similarity learning. The proposed
method learns a pairwise similarity function in which classification can be
performed by computing similarity between prototype representations of each
category. The domain-invariant features and the categorical prototype
representations are learned jointly in an end-to-end fashion. At inference
time, images from the target domain are compared to the prototypes, and the
label associated with the prototype that best matches the image is output. The
approach is simple, scalable, and effective. We show that our model achieves
state-of-the-art performance in different unsupervised domain adaptation
scenarios.
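As a rough illustration of the prototype-based inference step, the following sketch assigns the label of the most similar prototype using cosine similarity; the similarity measure and all names here are illustrative assumptions, not the paper's learned pairwise similarity function.

```python
import math

def classify_by_prototype(embedding, prototypes):
    """Assign the index of the most similar prototype (cosine similarity).

    embedding:  feature vector for one target-domain image
    prototypes: list of per-category prototype vectors
    """
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb)

    sims = [cosine(embedding, p) for p in prototypes]
    return sims.index(max(sims))  # index of best-matching category
```

At inference time the target image's embedding is compared to every category prototype and the best match wins.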
From Image-level to Pixel-level Labeling with Convolutional Networks
We are interested in inferring object segmentation by leveraging only object
class information, and by considering only minimal priors on the object
segmentation task. This problem could be viewed as a kind of weakly supervised
segmentation task, and naturally fits the Multiple Instance Learning (MIL)
framework: every training image is known to have (or not) at least one pixel
corresponding to the image class label, and the segmentation task can be
rewritten as inferring the pixels belonging to the class of the object (given
one image, and its object class). We propose a Convolutional Neural
Network-based model, which is constrained during training to put more weight on
pixels that are important for classifying the image. We show that at test
time, the model has learned to discriminate the right pixels well enough
that it performs very well on an existing segmentation benchmark by adding
only a few smoothing priors. Our system is trained using a subset of the Imagenet
dataset and the segmentation experiments are performed on the challenging
Pascal VOC dataset (with no fine-tuning of the model on Pascal VOC). Our model
beats the state-of-the-art results on the weakly supervised object segmentation
task by a large margin. We also compare the performance of our model with
state-of-the-art fully-supervised segmentation approaches.
Comment: CVPR201
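One common way to implement such a constraint in the MIL setting is a smooth-max (log-sum-exp) pooling of per-pixel class scores, which makes the image-level loss concentrate on the most discriminative pixels; the sketch below is a generic illustration under that assumption, not necessarily the paper's exact aggregation.

```python
import math

def mil_image_score(pixel_scores, r=5.0):
    """Smooth max (log-sum-exp) pooling of per-pixel class scores.

    As r grows, the pooled value approaches the max over pixels, so
    gradients of an image-level loss flow mostly to high-scoring pixels.
    Computed in a numerically stable form around the maximum score.
    """
    m = max(pixel_scores)
    mean_exp = sum(math.exp(r * (s - m)) for s in pixel_scores) / len(pixel_scores)
    return m + math.log(mean_exp) / r
```

For a constant score map the pooled value equals that constant; for a mixed map it lies between the mean and the max, closer to the max for larger r.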
Recurrent Convolutional Neural Networks for Scene Parsing
Scene parsing consists of assigning a label to every pixel in
an image according to the class it belongs to. To ensure good visual
coherence and a high class accuracy, it is essential for a scene parser to
capture image long range dependencies. In a feed-forward architecture, this can
be simply achieved by considering a sufficiently large input context patch
around each pixel to be labeled. We propose an approach consisting of a
recurrent convolutional neural network which allows us to consider a large
input context, while limiting the capacity of the model. Contrary to most
standard approaches, our method does not rely on any segmentation methods, nor
any task-specific features. The system is trained in an end-to-end manner over
raw pixels, and models complex spatial dependencies with low inference cost. As
the context size increases with the built-in recurrence, the system identifies
and corrects its own errors. Our approach yields state-of-the-art performance
on both the Stanford Background Dataset and the SIFT Flow Dataset, while
remaining very fast at test time.
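A back-of-the-envelope way to see why recurrence enlarges context without adding parameters: composing the same stride-1 convolution n times grows the 1-D receptive field linearly. The formula below is a simplification that ignores any pooling or striding the actual architecture may use.

```python
def receptive_field(kernel_size, iterations):
    """Effective 1-D receptive field after applying the same stride-1
    convolution `iterations` times: each pass adds (kernel_size - 1)
    pixels of context around the centre pixel."""
    return iterations * (kernel_size - 1) + 1
```

One pass of a 3-wide kernel sees 3 pixels; ten recurrent passes of the same kernel see 21, with no new parameters introduced.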
Phrase-based Image Captioning
Generating a novel textual description of an image is an interesting problem
that connects computer vision and natural language processing. In this paper,
we present a simple model that is able to generate descriptive sentences given
a sample image. This model has a strong focus on the syntax of the
descriptions. We train a purely bilinear model that learns a metric between an
image representation (generated from a previously trained Convolutional Neural
Network) and the phrases used to describe it. The system is then able
to infer phrases from a given image sample. Based on caption syntax statistics,
we propose a simple language model that can produce relevant descriptions for a
given test image using the phrases inferred. Our approach, which is
considerably simpler than state-of-the-art models, achieves comparable results
on two popular datasets for the task: Flickr30k and the recently proposed
Microsoft COCO.
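The bilinear metric described above can be sketched as a compatibility score s_k = xᵀ W p_k between an image feature x and each candidate phrase embedding p_k; the names and shapes here are illustrative assumptions, not the paper's exact parameterisation.

```python
def phrase_scores(image_feat, phrase_embs, W):
    """Bilinear compatibility s_k = x^T W p_k.

    image_feat:  image feature vector of length d_i
    phrase_embs: list of phrase embeddings, each of length d_p
    W:           d_i x d_p matrix of learned weights
    """
    d_p = len(W[0])
    # x^T W, computed once and reused for every phrase
    xW = [sum(image_feat[i] * W[i][j] for i in range(len(image_feat)))
          for j in range(d_p)]
    return [sum(xW[j] * p[j] for j in range(d_p)) for p in phrase_embs]
```

The highest-scoring phrases are the ones the language model then assembles into a caption.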
Ligação diatômica: Uma abordagem clássica e quântica (Diatomic bonding: a classical and quantum approach)
This article aims to show the difference between the classical and the quantum analysis of a diatomic molecule with respect to vibrational transitions, presenting the results and predictions of each method, describing the solutions, comparing them with the real behavior, and making clear the limitations and validity of each method.
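The contrast the abstract describes can be summarised with the standard vibrational-energy expressions, here in the harmonic approximation (a sketch; the article's own treatment may include anharmonic corrections such as a Morse potential):

```latex
% Classical oscillator: energy is continuous, set by the amplitude A
E_{\text{cl}} = \tfrac{1}{2} k A^{2}, \qquad A \in [0,\infty)

% Quantum harmonic oscillator: energy is quantized
E_v = \hbar\omega\left(v + \tfrac{1}{2}\right), \qquad v = 0, 1, 2, \dots

% Allowed vibrational transitions (harmonic, electric-dipole case)
\Delta v = \pm 1 \;\Rightarrow\; \Delta E = \hbar\omega
```

The classical picture permits any transition energy, while the quantum picture predicts a discrete ladder with equal spacing ħω, which is what vibrational spectra approximately show.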
Multi-Agents System Approach to Industry 4.0: Enabling Collaboration Considering a Blockchain
Master's Dissertation in Informatics Engineering
The evolution of existing technologies and the creation of new ones paved the way for a new revolution
in the industrial sector. With the introduction of the existing and new technologies in the manufacturing
environment, the industry is moving towards the fourth industrial revolution, called Industry 4.0. The
fourth industrial revolution introduces many new components like 3D printing, Internet of things, artificial
intelligence, and augmented reality. The automation of the traditional manufacturing processes and the
use of smart technology are transforming industries in a more interconnected environment, where there
is more transparent information and decentralised decisions.
The arrival of Industry 4.0 introduces industries to a new environment, where their manufacturing processes
are more evolved, more agile, and with more efficiency. The principles of Industry 4.0 rely on
the interconnection of machines, devices, sensors, and people so that they can communicate. The transparency
of information guarantees that decision makers are provided with clear and correct information
to make informed decisions, and the decentralisation of decisions will give machines and
systems the ability to make decisions on their own and to perform tasks autonomously.
Industry 4.0 is making manufacturing processes more agile and efficient, but due to the fast pace of
trends and the shift from the traditional mass production philosophy towards the mass customisation,
following the Industry 4.0 guidelines might not be enough. The mass customisation paradigm
arose from customers' desire to own custom-made products and services tailored
to their needs. The idea is to make small tweaks to a product to meet the needs of a consumer group,
keeping production costs close to those of mass production without losing efficiency in
production. This paradigm poses great challenges to industries, since they must always
have the capability to answer the demands that may arise from the preparation and production of personalised
products and services. In the meantime, organisations will increasingly try to mark their position
in the market, with competition becoming less relevant and with organisations worrying less about
their performance on an individual level and more about their role in a supply chain. The need
for an improved collaboration with Industry 4.0 is the motivation for the model proposed in this work.
This model, which perceives a set of organisations as entities in a network that want to interact with each
other, is divided into two parts: the knowledge representation, and the reasoning and interactions. The first part relies on Blockchain technology to securely store and manage all the organisations' transactions
and data, guaranteeing the decentralisation of information and the transparency of the transactions.
Each organisation has a public and a private profile where the data is stored, allowing each organisation
to evaluate the others and to be evaluated by the remaining organisations
in the network. Furthermore, this part of the model works as a ledger of the transactions made
between the organisations, since every time two organisations negotiate or interact in any way, the
interaction is recorded. The ledger is public, meaning that every organisation in the network
can view the stored data. Nevertheless, in some situations an organisation will have the possibility to
keep transactions private to the organisations involved. Although the idea behind the model is to promote
transparency and collaboration, on selected occasions organisations might want to keep transactions
private from the other participants to gain some form of competitive advantage. The knowledge
representation part also aims to give organisations the assurance that their data will be safe
and tamper-proof.
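As a hypothetical illustration of the knowledge-representation part, the sketch below hash-chains inter-organisation transactions and hides the payload of entries marked private from non-participants; it is a toy data structure, not a real blockchain implementation, and all names are assumptions.

```python
import hashlib
import json

class TransactionLedger:
    """Toy hash-chained ledger: each entry records a transaction between
    two organisations; a 'private' flag hides the payload from everyone
    except the two parties involved."""

    def __init__(self):
        self.chain = []

    def record(self, org_a, org_b, payload, private=False):
        prev_hash = self.chain[-1]["hash"] if self.chain else "0" * 64
        entry = {
            "parties": [org_a, org_b],
            "payload": payload,
            "private": private,
            "prev": prev_hash,   # link to previous entry: tamper-evidence
        }
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True, default=str).encode()
        ).hexdigest()
        self.chain.append(entry)
        return entry["hash"]

    def view(self, viewer):
        """Network-wide view: private payloads are masked unless the
        viewer is one of the two transacting organisations."""
        out = []
        for e in self.chain:
            visible = (not e["private"]) or viewer in e["parties"]
            out.append({**e, "payload": e["payload"] if visible else "<hidden>"})
        return out
```

Every organisation can audit the chain's integrity through the hash links, while selectively private payloads stay visible only to the parties involved.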
The second part, reasoning and interactions, uses a Multi-Agent System and aims to
improve decision-making. Imagine that one organisation needs a service that can be provided by two
other organisations also present in the network; this part of the model works towards helping
the organisation choose the best option, given the scenario and the data available. This part of the
model is also responsible for representing every organisation present in the network, and when organisations
negotiate or interact, this component also handles the transaction and communicates the data
to the first part of the model.
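The provider-selection step of the reasoning part could, for instance, rank candidate organisations by the evaluations stored in the network; the scoring rule below is a hypothetical placeholder, not the dissertation's actual decision logic.

```python
def choose_provider(candidates, ratings):
    """Pick the candidate organisation with the highest average rating
    recorded by the network (hypothetical scoring rule).

    candidates: list of organisation names offering the needed service
    ratings:    dict mapping organisation name -> list of past scores
    """
    def avg(org):
        scores = ratings.get(org, [])
        return sum(scores) / len(scores) if scores else 0.0
    return max(candidates, key=avg)
```

An agent representing the requesting organisation would run a rule of this kind over the ledger's evaluation data before negotiating.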
Modelação numérica da interação maré astronómica-maré meteorológica e a sua influência nos padrões de inundação estuarina (Numerical modelling of the astronomical tide-storm surge interaction and its influence on estuarine inundation patterns)
Master's in Marine and Atmospheric Sciences
Several problems related to coastal flooding depend on tidal properties and
on sea level rise induced by meteorological effects (storm surge). The water
levels during storms are consequently determined by tidal and storm surge
levels as well as by their non-linear interaction. Herein, shallow coastal systems
are particularly important, since the local bathymetry is a crucial mechanism
controlling the interaction between tide and storm surge. Thus, the main
concern of this dissertation is to give new insights about the effect of tide-surge
interaction on the inundation patterns of a flood prone shallow estuarine system
located in the northwest Portuguese coast, the Ria de Aveiro. The methodology
adopted comprised two fundamental steps: 1) statistical analysis of sea level
data collected at the lagoon inlet between 1979 and 2013, in order to infer
about the frequency distributions of positive residual events and their relation
with tidal phase; 2) implementation, calibration, validation and exploitation of a
numerical application based on Delft3D hydrodynamic model, in order to
describe the local estuarine circulation (including the marginal areas potentially
flooded during adverse conditions), and explore inundation conditions and tide-surge
interactions under the scenarios arising from the previous task.
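Step 1 of the methodology (flagging positive residual events in the sea level record) can be sketched as follows; the residual is the observed level minus the predicted tide, and the threshold and minimum-duration values here are illustrative assumptions, not the dissertation's actual criteria.

```python
def surge_events(residual, threshold=0.3, min_hours=6):
    """Flag positive storm-surge events: contiguous runs of hourly
    residuals (observed level minus predicted tide, in metres) that
    stay above `threshold` for at least `min_hours` hours.
    Returns a list of (start_index, end_index) pairs."""
    events, start = [], None
    for i, r in enumerate(residual):
        if r > threshold and start is None:
            start = i                      # event begins
        elif r <= threshold and start is not None:
            if i - start >= min_hours:     # long enough to count
                events.append((start, i))
            start = None
    if start is not None and len(residual) - start >= min_hours:
        events.append((start, len(residual)))
    return events
```

Each flagged event would then be tagged with its tidal phase (hours from high water) to build the frequency distributions described above.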
The statistical analysis revealed that the frequency distributions depend neither
on the threshold used nor on the minimum number of hours required for an event
to be assigned. The results highlighted a tendency for storm surge events to
occur most frequently on the falling tide, with a principal mode of occurrence 3
hours after high water. In addition, the highest storm surge peaks
occurred at the beginning of the rising tide and at the end of the falling tide. The
numerical model results show that the model reproduces tidal and
storm surge propagation more accurately than previous numerical applications
developed for the Ria de Aveiro. Once the model was calibrated, a set of numerical
experiments (hypothetical scenarios) was designed to assess the effect of tide
on storm surge propagation along Ria de Aveiro and its influence in local
inundation patterns. Thus, it was found that storm surge characteristics are
modulated by the state of the tide, where the effect of nonlinear terms is to
reduce the storm tide peak. However, at the upper reaches of the main
channels the storm tide peak is controlled by the storm surge, since it is higher
than the tide. Concerning the lagoon flooded area, the different scenarios
showed that the most threatened areas are the upper reaches of the São Jacinto
channel and the vicinity of Laranjo bay, this resulting from the increase of the storm
tide peak in these regions. The adjacent margins of Baixo Vouga Lagunar and
the margins of Mira/Ílhavo channel are also threatened when the tidal range is
increased. This results from an increase in the nonlinear tide-surge interaction and
also in the maximal residual levels.
The private equity dilemma of holding a distressed firm : a valuation analysis
In 2010, ABC Capital, a private equity firm specialized in investing in distressed
companies, acquired PTO, a Portuguese producer of cork floating floors and walls. Two years
after the initial investment, despite the upturn registered in the cork and
construction industries, PTO is not showing signs of recovery.
The aim of this thesis is to help ABC Capital solve the following dilemma: should they
keep investing in the company and attempt a turnaround, or disinvest by liquidating the
company?
In order to do so, this thesis combines a valuation approach with an extensive analysis
of the company, its competitors and the industry. Moreover, the relevant literature was
reviewed and applied, supporting all the analyses conducted.
We conclude that ABC Capital should maintain PTO in its portfolio and perform a
turnaround strategy, since this choice delivers more value to the shareholders of the firm.
More precisely, we recommend that PTO's management follow a Mix Strategy, in which
the company simultaneously increases sales and reduces its costs. Finally, we also
recommend that PTO, after stabilising and becoming profitable again, pursue a more
aggressive sales strategy, as sales are the major source of value generation.
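The turnaround-versus-liquidation comparison ultimately reduces to a discounted-cash-flow test; the sketch below uses a plain NPV rule with hypothetical figures, not the thesis's full valuation model.

```python
def npv(cash_flows, rate):
    """Discount a list of yearly cash flows (years 1..n) at `rate`."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

def decide(turnaround_cfs, liquidation_value, rate):
    """Keep the firm if the discounted turnaround cash flows exceed the
    immediate liquidation proceeds (all figures hypothetical)."""
    return "turnaround" if npv(turnaround_cfs, rate) > liquidation_value else "liquidate"
```

In practice the turnaround side would be a full DCF of the recovery plan and the liquidation side the net proceeds of selling the assets, but the decision rule has this shape.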
Nonoperative treatment of slipped capital femoral epiphysis: a scientific study
Abstract
Background: Treatment of slipped capital femoral epiphysis remains a cause of concern because its true etiopathogeny is unknown, as is that of one of its major complications, chondrolysis. The conservative treatment remains controversial; it has been overlooked in the literature and subjected to intense criticism. The purpose of this study is to investigate the results of treating the hips of patients displaying slipped capital femoral epiphysis using the plaster cast immobilization method, and its link to chondrolysis.
Methods: The research was based on the study of the following variables: symptomatology and the degree of slipping. A hip spica cast and bilateral short/long leg casts in abduction and internal rotation with anti-rotational bars were used to immobilize the patient's hip for twelve weeks. Statistical analysis was performed with the Wilcoxon signed-rank test and Fisher's exact test at the 5% level.
Results: A satisfactory result was obtained in 70.5% of the acute group and 94% of the chronic group (chronic + acute on chronic). Regarding the degree of slipping, a satisfactory result was obtained in 90.5% of hips with a mild slip, 76% with a moderate slip, and 73% with a severe slip. The statistical analysis revealed a significant improvement in flexion (p = 0.0001), abduction (p = 0.0001), internal rotation (p = 0.0001) and external rotation (p = 0.02). Chondrolysis was present in 11.3% of the hips. One case of pseudoarthrosis with aseptic capital necrosis occurred. There was no significant association between age and chondrolysis (p = 1.00). Significant associations of chondrolysis with gender (p = 0.031) and with non-white patients (p = 0.037) were verified. No causal association between the plaster cast and chondrolysis was observed (p = 0.60). Regarding the symptomatology group and the slip degree versus chondrolysis, the p value was not statistically significant in either analysis (p = 0.61 and p = 0.085, respectively).
Conclusions: After analyzing the nonoperative treatment of slipped capital femoral epiphysis and chondrolysis, we conclude that the method was functional, efficient, valid, and reproducible; it can also be used as an alternative therapeutic procedure for this specific disease.