Accessibility and Development in Peripheral Regions. The Case for Beira Interior
Beira Interior is a Portuguese region located in the centre of Portugal, close to the Spanish border, and traditionally seen as a strongly peripheral region. In recent years, population decline and the weaknesses of its industrial base have been attributed to the shortage and/or poor quality of transport infrastructure. To evaluate whether there is in fact a case of infrastructure shortage, we developed a methodology to identify the accessibility gains of the recent past, those foreseeable within the usually adopted planning periods, and those possible in an asymptotic scenario of strong generalised accessibility, thereby making explicit both the gains already achieved and those still attainable. The evolution of the values for the studied region was compared with the corresponding values for the Litoral Centro region, which was also used as the benchmark in a previous consultation of key industrial informants operating in Beira Interior. This topic is extremely important for the region and for the country, since the conclusions obtained will enable a better-supported discussion of additional investment in transport infrastructure.
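The abstract does not specify the accessibility measure used. As an illustration only, a gravity-type potential-accessibility index (destination populations weighted by the inverse of travel time) is one common way to quantify the kind of gains described; the regions, populations and travel times below are entirely hypothetical, not data from the study:

```python
def potential_accessibility(pop, travel_time, origin):
    """Gravity-type potential accessibility of `origin`: populations of the
    other regions weighted by the inverse of travel time to them."""
    return sum(pop[j] / travel_time[origin][j] for j in pop if j != origin)

# Entirely hypothetical regions, populations (thousands) and travel times (h).
pop = {"BI": 200, "LC": 800, "LX": 2800}
t_past = {"BI": {"LC": 2.0, "LX": 3.0}}   # before infrastructure upgrades
t_plan = {"BI": {"LC": 1.5, "LX": 2.0}}   # after the planned upgrades

a_past = potential_accessibility(pop, t_past, "BI")
a_plan = potential_accessibility(pop, t_plan, "BI")
gain = a_plan / a_past - 1.0              # relative accessibility gain
```

Computing the index under past, planned and asymptotic travel-time scenarios makes the "gains already achieved" versus "gains still possible" comparison explicit as simple ratios.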
The changing energy paradigm, challenges, and new developments
Editorial of the Special Issue of the International Journal of Energy Research,
“Recent Advances in Energy Research”
The worldwide scarcity of fossil fuels relative to primary energy demand, together with growing environmental concerns, has raised new challenges for the world economy and led to changes in the energy paradigm. Industry, services, researchers, and academia are challenged to envisage new solutions by setting up new conversion processes, designing new power systems, and investigating and developing new energy sources and vectors.
A Perspective on the Revival of Structural (In) Stability With Novel Opportunities for Function: From Buckliphobia to Buckliphilia
Buckling of slender structures is traditionally regarded as a first route toward failure. Here, we provide an alternative perspective on a burgeoning movement in which mechanical instabilities are exploited to devise new classes of functional mechanisms that make use of the geometrically nonlinear behavior of their postbuckling regimes. Selected examples are highlighted across length scales to illustrate some of the exciting opportunities that lie ahead.
National Science Foundation (U.S.) (CMMI-1129894); National Science Foundation (U.S.) (Faculty Early Career Development (CAREER) Program Award CMMI-1351449)
Overpaints with cultural significance. How to define authenticity? The case of Afonso de Albuquerque’s portrait
Whether or not to remove repaints and overpaints is always a complex decision, and the justification for it is not always objective. This type of operation can sometimes result in a worn surface with several paint losses, which are normally expected to be restored and integrated. Justified by the search for, or the restoration of, authenticity, both these operations (restoration and integration) will change the values formerly attributed to the object. When dealing with works of art generally recognized as culturally significant, the different values should be interpreted and discussed by stakeholders and specialists from different areas of expertise on a multidisciplinary platform before any intervention is carried out. This is not always easy to achieve, and the conservator usually has the difficult task of transforming subjective concepts into an objective solution. In this paper we present a case study, the panel portrait of Afonso de Albuquerque, currently exhibited in the Museu Nacional de Arte Antiga (Lisboa), which the current investigation has proven to be the portrait of another governor, repainted to resemble the viceroy of Portuguese India during a restoration process in the 1960s regarded as “technically difficult”. Several episodes throughout the history of the Viceroys and Governors portrait gallery, and of this specific panel, gave the repaint historic, documental and iconographic value. Investigation is still ongoing, and new facts may alter the definition of the values attributed to this portrait and its authentic state, a process conservators should be ever more concerned with during restoration interventions, namely chromatic integration.
Blockchain-Enabled DPKI Framework
Public Key Infrastructures (PKIs), which rely on digital signature technology and establishment
of trust and security association parameters between entities, allow entities
to interoperate with authentication proofs, using standardized digital certificates (with
X.509v3 as the current reference). Despite PKI technology being used by many applications
for their security foundations (e.g. WEB/HTTPS/TLS, Cloud-Enabled Services,
LANs/WLANs Security, VPNs, IP-Security), there are several concerns regarding their
inherent design assumptions based on a centralized trust model.
To avoid some of the problems and drawbacks that emerge from these centralization assumptions, a Decentralized Public Key Infrastructure (DPKI) is an alternative approach. The main idea of DPKIs is the ability to establish trust relations between all parties in a web-of-trust model, avoiding centralized authorities and the related root-of-trust certificates. As an enabling solution for DPKI frameworks, Blockchain technology can help overcome some of the identified PKI problems and security drawbacks.
Blockchain-enabled DPKIs can be designed to provide a fully decentralized ledger of managed certificates, offering data replication with strong consistency guarantees and fairly distributed trust-management properties founded on a P2P trust model. In this approach, typical PKI functions are supported cooperatively, with validity agreement based on consistency criteria for the issuing, verification and revocation of X.509v3 certificates.
It is also possible to provide mechanisms for the rapid reaction of principals in verifying the traceable, shared and immutable history logs of state changes related to the life cycle of certificates, with certificate-validation rules established consistently by programmable Smart Contracts executed by peers.
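As a purely conceptual sketch of such consensus-gated, traceable history logs (the class, quorum rule and field names are assumptions for illustration, not the dissertation's Hyperledger Fabric design):

```python
import hashlib
import json

class CertificateLedger:
    """Append-only, hash-chained log of certificate life-cycle events.
    An event is accepted only when a quorum of peer endorsements backs it,
    mimicking validity agreement by consensus; this is a conceptual sketch,
    not the Hyperledger Fabric data model."""

    def __init__(self, quorum):
        self.quorum = quorum
        self.entries = []                    # list of (event, chained digest)

    def append(self, serial, action, endorsers):
        if len(set(endorsers)) < self.quorum:
            raise ValueError("insufficient peer endorsements")
        prev = self.entries[-1][1] if self.entries else "genesis"
        event = {"serial": serial, "action": action,
                 "endorsers": sorted(set(endorsers))}
        digest = hashlib.sha256(
            (prev + json.dumps(event, sort_keys=True)).encode()).hexdigest()
        self.entries.append((event, digest))   # tamper-evident: each digest
        return digest                          # covers all history before it

    def status(self, serial):
        """Most recent life-cycle action recorded for a certificate."""
        for event, _ in reversed(self.entries):
            if event["serial"] == serial:
                return event["action"]
        return "unknown"
```

Because each digest chains over its predecessor, rewriting any past issuance or revocation invalidates every later digest, which is what makes the log's history verifiable by any principal.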
In this dissertation we designed, implemented and evaluated a Blockchain-enabled Decentralized Public Key Infrastructure (DPKI) framework, providing a prototype implementation that can be used to support experimental research. The proposal instantiates a permissioned collaborative consortium model, using the service planes supported by an extended Blockchain platform leveraging the Hyperledger Fabric (HLF) solution. In our proposed DPKI framework model, X.509v3 certificates are issued and managed following security invariants, processing rules, trust assumptions and consistency metrics, defined and executed in a decentralized way by the Blockchain nodes using Smart Contracts. Certificates are issued cooperatively and can carry group-oriented, threshold-based Byzantine fault-tolerant (BFT) signatures as group-oriented authentication proofs. The Smart Contracts dictate how Blockchain peers participate consistently in the issuing, signing, attestation, validation and revocation processes. Any peer can validate certificates by obtaining their consistent states, consolidated in closed blocks in a Merkle tree structure maintained in the Blockchain. State-transition operations are managed with serializability guarantees, provided by BFT consensus primitives.
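A minimal sketch of the Merkle-tree inclusion check that lets any peer validate a certificate's consolidated state against a block's root hash; the odd-level duplication convention and function names below are common blockchain practice assumed for illustration, not the exact HLF structure:

```python
import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Merkle root over leaf data, duplicating the last node on odd levels
    (a common blockchain convention; HLF's exact layout may differ)."""
    level = [_h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [_h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Sibling hashes needed to recompute the root for leaves[index]."""
    level, proof = [_h(leaf) for leaf in leaves], []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        proof.append((level[index ^ 1], index % 2 == 0))  # (sibling, on right?)
        level = [_h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf, proof, root):
    """A peer checks inclusion using only the proof and the block's root."""
    node = _h(leaf)
    for sibling, sibling_on_right in proof:
        node = _h(node + sibling) if sibling_on_right else _h(sibling + node)
    return node == root
```

The proof is logarithmic in the number of certificates in the block, so validation does not require downloading the full certificate set.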
Prediction of high-performance concrete compressive strength through a comparison of machine learning techniques
Dissertation presented as the partial requirement for obtaining a Master's degree in Data Science and Advanced Analytics, specialization in Data Science.
High-performance concrete (HPC) is a highly complex composite material whose characteristics are extremely difficult to model. One of those characteristics is the concrete compressive strength, a nonlinear function of the same ingredients that compose HPC: cement, fly ash, blast furnace slag, water, superplasticizer, age, and coarse and fine aggregates. Research has shown time and again that concrete strength is not determined solely by the water-to-cement ratio, which was for years the go-to metric. In addition, traditional methods that attempt to model HPC, such as regression analysis, do not provide sufficient predictive power due to the nonlinear properties of the mixture. Therefore, this study attempts to optimize the prediction and modeling of the compressive strength of HPC by analyzing seven different machine learning (ML) algorithms: three regularization algorithms (Lasso, Ridge and Elastic Net), three ensemble algorithms (Random Forest, Gradient Boost and AdaBoost), and Artificial Neural Networks. All techniques were built and tested with a dataset composed of data from 17 different concrete strength test laboratories, under the same experimental conditions, which enabled a fair comparison among them and with previous studies in the field. Feature importance analysis and outlier analysis were also performed, and all models were subjected to a Wilcoxon Signed-Ranks Test to ensure statistically significant results. The final results show that the more complex ML algorithms provided greater accuracy than the regularization techniques, with Gradient Boost being the superior model among them, providing more accurate predictions than the state-of-the-art. Better results were achieved using all variables and without removing outlier observations.
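The Wilcoxon Signed-Ranks Test used for pairwise model comparison can be sketched in a few lines of pure Python; the per-fold RMSE values below are invented for illustration and are not results from the study:

```python
def wilcoxon_signed_rank(errors_a, errors_b):
    """Wilcoxon signed-rank statistic W for paired samples (e.g. two models'
    per-fold errors).  A small W means one model beats the other consistently;
    zero differences are dropped and tied magnitudes receive average ranks."""
    diffs = [a - b for a, b in zip(errors_a, errors_b) if a != b]
    magnitudes = sorted(abs(d) for d in diffs)

    def avg_rank(value):
        ranks = [i + 1 for i, m in enumerate(magnitudes) if m == value]
        return sum(ranks) / len(ranks)

    w_plus = sum(avg_rank(abs(d)) for d in diffs if d > 0)
    w_minus = sum(avg_rank(abs(d)) for d in diffs if d < 0)
    return min(w_plus, w_minus)

# Hypothetical per-fold RMSE values for two models (illustrative only).
rmse_lasso = [6.8, 7.1, 6.9, 7.3, 7.0, 6.7, 7.2, 6.9]
rmse_gboost = [4.2, 4.5, 4.1, 4.8, 4.3, 4.0, 4.6, 4.4]
W = wilcoxon_signed_rank(rmse_lasso, rmse_gboost)
```

When one model wins on every fold, W is 0; the statistic is then compared with tabled critical values for the sample size to decide whether the accuracy difference is significant.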
American options under stochastic volatility via a transformation procedure
Master's thesis in Financial Mathematics, presented to the Universidade de Lisboa through the Faculdade de Ciências, 2017.
This thesis explores the pricing of American options through a transformation procedure, based on the Heston stochastic volatility model. Since empirical evidence shows that stock prices exhibit variations in their volatility, mainly due to the so-called leverage effect, this thesis incorporates a stochastic process for the volatility in addition to the process for the underlying asset, with the two processes correlated, whereas in simpler models the volatility is typically deterministic. To solve the partial differential equation associated with the Heston model, a finite-difference method is used, complemented by boundary conditions appropriate for a put option. The finite-difference method is instrumental because its time partitions later allow the price to be obtained as the solution for an American option, subject to an exercise boundary derived through a transformation procedure based on the derivative of the option with respect to its price, operating over several iterations; the thesis includes a proof that the procedure works and an illustration of it. The thesis also explores the conditions for numerical stability according to the relation between the parameters and the partitions, as well as the accuracy of the method for different partitions of the variables, by comparing it with Heston's solution for European options. Finally, it explores the sensitivity of option prices to different variables, the effect on prices of introducing stochastic volatility relative to the deterministic model, and the efficiency-accuracy trade-off as different parameters are changed.
Empirical data shows that the volatility of asset prices is not constant, although basic derivative-pricing settings do not take this into account, so stochastic volatility models are more capable of providing reliable asset prices. Pricing under stochastic volatility in the American option setting poses a bigger challenge than in the European option setting. This thesis provides prices for American options under stochastic volatility by first constructing an optimal exercise boundary and then obtaining the asset price through a transformation procedure. First, the baseline European pricing model is constructed and tested for accuracy and numerical stability. Then the procedure is described, its convergence guarantees are elaborated, and the method is dissected through an illustration. Lastly, the method is explored to give insights into how the option behaves when its parameters are changed, and its speed is tested in different computational settings.
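As a simplified illustration of the early-exercise treatment underlying such finite-difference procedures, the sketch below prices an American put with constant volatility by projecting onto the payoff at every time step. This one-factor scheme is a stand-in assumption for exposition; the thesis's actual method is the two-factor Heston scheme with a transformation-based exercise boundary:

```python
def american_put_fd(S0, K, r, sigma, T, M=200, N=2000):
    """Explicit finite differences for an American put with constant
    volatility -- a one-factor simplification, not the Heston scheme.
    N must be large enough that dt < 1 / (sigma**2 * M**2 + r), the
    stability bound of the explicit scheme."""
    S_max = 3.0 * K                                  # truncated price domain
    dS, dt = S_max / M, T / N
    grid = [max(K - i * dS, 0.0) for i in range(M + 1)]   # payoff at maturity
    for _ in range(N):                               # march back from maturity
        new = grid[:]
        for i in range(1, M):
            a = 0.5 * dt * (sigma**2 * i**2 - r * i)
            b = 1.0 - dt * (sigma**2 * i**2 + r)
            c = 0.5 * dt * (sigma**2 * i**2 + r * i)
            cont = a * grid[i - 1] + b * grid[i] + c * grid[i + 1]
            new[i] = max(cont, K - i * dS)           # early-exercise projection
        new[0], new[M] = K, 0.0                      # put boundary conditions
        grid = new
    i = int(S0 / dS)                                 # interpolate at S0
    w = (S0 - i * dS) / dS
    return (1.0 - w) * grid[i] + w * grid[i + 1]
```

Below the exercise boundary the projection makes the value coincide with the intrinsic payoff, which is exactly the region the transformation procedure characterises iteration by iteration.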