Meso-scale FDM material layout design strategies under manufacturability constraints and fracture conditions
In the manufacturability-driven design (MDD) perspective, manufacturability of the product or system is the most important of the design requirements. In addition to being able to ensure that complex designs (e.g., topology optimization) are manufacturable with a given process or process family, MDD also helps mechanical designers to take advantage of unique process-material effects generated during manufacturing. One of the most recognizable examples of this comes from the scanning-type family of additive manufacturing (AM) processes; the most notable and familiar member of this family is the fused deposition modeling (FDM) or fused filament fabrication (FFF) process. This process works by selectively depositing uniform, approximately isotropic beads or elements of molten thermoplastic material (typically structural engineering plastics) in a series of pre-specified traces to build each layer of the part. There are many interesting 2-D and 3-D mechanical design problems that can be explored by designing the layout of these elements. The resulting structured, hierarchical material (which is both manufacturable and customized layer-by-layer within the limits of the process and material) can be defined as a manufacturing process-driven structured material (MPDSM). This dissertation explores several practical methods for designing these element layouts for 2-D and 3-D meso-scale mechanical problems, focusing ultimately on design-for-fracture. Three different fracture conditions are explored: (1) cases where a crack must be prevented or stopped, (2) cases where the crack must be encouraged or accelerated, and (3) cases where cracks must grow in a simple pre-determined pattern. 
Several new design tools were developed and refined to support the design of MPDSMs under fracture conditions: a mapping method for the FDM manufacturability constraints, three major literature reviews, the collection, organization, and analysis of several large (qualitative and quantitative) multi-scale datasets on the fracture behavior of FDM-processed materials, new experimental equipment, and a fast and simple g-code generator based on commercially-available software. The refined design method and rules were experimentally validated using a series of case studies (involving both design and physical testing of the designs) at the end of the dissertation. Finally, a simple design guide for practicing engineers who are experts in neither advanced solid mechanics nor process-tailored materials was developed from the results of this project.
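The abstract mentions a fast and simple g-code generator for element layouts. As a purely illustrative sketch (not the author's tool; extrusion and retraction commands are omitted, and all parameter names are invented here), a toy emitter might raster one rectangular FDM layer as alternating G1 moves:

```python
def raster_layer(width, height, bead_width, feed=1800, z=0.2):
    """Emit toy g-code for one rectangular layer as a back-and-forth raster."""
    lines = [f"G1 Z{z:.2f} F{feed}"]          # move to the layer height
    y, direction = 0.0, 1
    while y <= height + 1e-9:
        x0, x1 = (0.0, width) if direction > 0 else (width, 0.0)
        lines.append(f"G1 X{x0:.2f} Y{y:.2f} F{feed}")   # travel to trace start
        lines.append(f"G1 X{x1:.2f} Y{y:.2f} F{feed}")   # deposit one bead
        y += bead_width                        # step over by one bead width
        direction = -direction                 # alternate raster direction
    return "\n".join(lines)

print(raster_layer(width=10.0, height=2.0, bead_width=0.5))
```

Designing the element layout, in this picture, amounts to choosing the traces each such routine emits, subject to the process constraints the dissertation maps.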
Rank-based linkage I: triplet comparisons and oriented simplicial complexes
Rank-based linkage is a new tool for summarizing a collection of objects according to their relationships. These objects are not mapped to vectors, and "similarity" between objects need be neither numerical nor symmetrical. All an object needs to do is rank nearby objects by similarity to itself, using a comparator which is transitive but need not be consistent with any metric on the whole set; call this a ranking system. Rank-based linkage is applied to the K-nearest neighbor digraph derived from a ranking system. Computations occur on a 2-dimensional abstract oriented simplicial complex whose faces are among the points, edges, and triangles of the line graph of the undirected K-nearest neighbor graph. The procedure builds an edge-weighted linkage graph in which the weight of a link is called the in-sway between its two objects. Take the links whose in-sway is at least a given threshold, and partition the objects into the components of the resulting graph, for a varying threshold. Rank-based linkage is a functor from a category of out-ordered digraphs to a category of partitioned sets, with the practical consequence that augmenting the set of objects in a rank-respectful way gives a fresh clustering which does not "rip apart" the previous one. The same holds for single linkage clustering in the metric-space context, but not for typical optimization-based methods. Open combinatorial problems are presented in the last section.
Comment: 37 pages, 12 figures
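The thresholding step can be illustrated with a toy sketch. This is hedged heavily: the paper's in-sway weight is defined via the oriented simplicial complex, which is not reproduced here; this sketch substitutes a simple mutual-endorsement score purely to show how varying the threshold on link weights yields nested partitions from a K-nearest-neighbor digraph.

```python
def knn_digraph(objects, rank, k):
    """For each object, keep its k top-ranked neighbours (smaller rank = more similar)."""
    return {x: sorted((y for y in objects if y != x), key=lambda y: rank(x, y))[:k]
            for x in objects}

def link_weights(digraph):
    """Placeholder weight (NOT the paper's in-sway): 2 if a link is mutual, 1 if one-sided."""
    w = {}
    for x, nbrs in digraph.items():
        for y in nbrs:
            e = frozenset((x, y))
            w[e] = w.get(e, 0) + 1
    return w

def components(objects, links):
    """Union-find over the links that survive the threshold."""
    parent = {x: x for x in objects}
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    for a, b in links:
        parent[find(a)] = find(b)
    return {x: find(x) for x in objects}

objects = [0, 1, 2, 3, 6, 10, 11, 12]
rank = lambda x, y: abs(x - y)          # comparator from 1-D positions
w = link_weights(knn_digraph(objects, rank, k=2))
for t in (1, 2):                        # raising the threshold refines the partition
    links = [tuple(e) for e, wt in w.items() if wt >= t]
    labels = components(objects, links)
    clusters = {}
    for x, root in labels.items():
        clusters.setdefault(root, []).append(x)
    print(t, sorted(sorted(c) for c in clusters.values()))
```

At the looser threshold the bridging object 6 attaches to a cluster; at the stricter threshold it is split off, while the two tight groups survive intact.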
Adaptive Testing for Alphas in High-dimensional Factor Pricing Models
This paper proposes a new procedure to validate the multi-factor pricing
theory by testing the presence of alpha in linear factor pricing models with a
large number of assets. Because the market's inefficient pricing is likely to
occur only in a small fraction of exceptional assets, we develop a testing procedure
that is particularly powerful against sparse signals. Based on the
high-dimensional Gaussian approximation theory, we propose a simulation-based
approach to approximate the limiting null distribution of the test. Our
numerical studies show that the new procedure can deliver a reasonable size and
achieve substantial power improvement compared to the existing tests under
sparse alternatives, especially for weak signals.
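A minimal numerical sketch of the general recipe, not the paper's exact procedure: regress each asset's returns on the factor, form a max-type statistic over the alpha t-statistics (powerful against sparse alternatives), and approximate its null distribution by simulation. All sizes and the independent-errors simulation are simplifying assumptions; the paper's Gaussian approximation handles cross-sectional dependence.

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 500, 200                        # time periods, number of assets
f = rng.normal(size=(T, 1))            # one observed factor
beta = rng.normal(size=(1, N))         # factor loadings
r = f @ beta + rng.normal(size=(T, N)) # returns generated under the null (alpha = 0)

X = np.column_stack([np.ones(T), f])   # regressors: intercept + factor
coef, *_ = np.linalg.lstsq(X, r, rcond=None)
resid = r - X @ coef
se = resid.std(axis=0, ddof=2) * np.sqrt(np.linalg.inv(X.T @ X)[0, 0])
t_stat = np.abs(coef[0]) / se          # |t| statistic for each asset's alpha
T_obs = t_stat.max()                   # max-type statistic

# Simulate the null distribution of the max statistic (independent-errors case)
sims = np.abs(rng.normal(size=(2000, N))).max(axis=1)
p_value = (sims >= T_obs).mean()
print(f"max |t| = {T_obs:.2f}, p = {p_value:.3f}")
```

The max statistic concentrates power on a few large alphas, which is why such tests outperform sum-of-squares statistics under sparse alternatives.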
Heterogeneity in mode choice behavior: A spatial latent class approach based on accessibility measures
We propose a method to estimate mode choice models, where preference parameters are sensitive to the spatial context of the trip origin, challenging traditional assumptions of spatial homogeneity in the relationship between travel modes and the built environment. The framework, called Spatial Latent Classes (SLC), is based on the integrated choice and latent class approach, although instead of defining classes for the decision maker, it estimates the probability of a location belonging to a class, as a function of spatial attributes. For each Spatial Latent Class, a different mode choice model is specified, and the resulting behavioral model for each location is a weighted average of all class-specific models, which is estimated to maximize the likelihood of reproducing observed travel behavior. We test our models with data from Portland, Oregon, specifying spatial class membership models as a function of local and regional accessibility measures. Results show the SLC increases model fit when compared with traditional methods and, more importantly, allows segmenting urban space into meaningful zones, where predominant travel behavior patterns can be easily identified. We believe this is a very intuitive way to spatially analyze travel behavior trends, allowing policymakers to identify target areas of the city and the accessibility levels required to attain desired modal splits.
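The weighted-average structure can be sketched numerically. This is an assumed minimal form, not the authors' estimated model: each location gets class-membership probabilities from spatial attributes via a logit, and its mode-choice probabilities are the membership-weighted average of class-specific logit models. All parameter values and attribute names (gamma, class_utils, accessibility, dist_to_center) are made up for illustration.

```python
import numpy as np

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

# Two spatial classes, three modes (car, transit, walk); made-up parameters.
gamma = np.array([[0.0, 0.0], [1.5, -2.0]])   # class-membership coefficients
class_utils = np.array([                      # class-specific mode utilities
    [2.0, 0.2, -1.0],                         # class 0: car-oriented
    [0.2, 1.5,  1.0],                         # class 1: transit/walk-oriented
])

def mode_probs(accessibility, dist_to_center):
    z = np.array([accessibility, dist_to_center])
    membership = softmax(gamma @ z)           # P(class | location)
    per_class = np.apply_along_axis(softmax, 1, class_utils)  # P(mode | class)
    return membership @ per_class             # weighted average of logit models

print(mode_probs(accessibility=2.0, dist_to_center=0.5))  # central, accessible
print(mode_probs(accessibility=0.2, dist_to_center=3.0))  # peripheral
```

Because membership varies smoothly over space, nearby locations share behavioral models, which is what lets the method segment urban space into zones with distinct modal patterns.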
Shellfish Stocks and Fisheries Review 2022: an assessment of selected stocks
This review presents information on the status of selected shellfish stocks in Ireland. In
addition, data on the fleet and landings of shellfish species (excluding Nephrops and mussels)
are presented. The intention of this annual review is to present stock assessment and
management advice for shellfisheries that may be subject to new management proposals or
where scientific advice is required in relation to assessing the environmental impact of
shellfish fisheries especially in areas designated under European Directives. The review
reflects the recent work of the Marine Institute (MI) in the biological assessment of shellfish
fisheries and their interaction with the environment.
The information and advice presented here for shellfish is complementary to that presented
in the MI Stock Book on demersal and pelagic fisheries. Separate treatment of shellfish is
warranted as their biology and distribution, the assessment methods that can be
applied to them, and the system under which they are managed all differ
substantially from those of demersal and pelagic stocks.
Shellfish stocks are not generally assessed by the International Council for the
Exploration of the Sea (ICES) and, although they come under the competency of the
Common Fisheries Policy, they are generally not regulated by EU TACs and, other
than crab and scallop, are in the main distributed inside the national 12 nm
fisheries limit. Management of these fisheries is within the competency of the
Department of Agriculture, Food and the Marine (DAFM).
A co-operative management framework, introduced by the Governing Department and
BIM in 2005 (Anon 2005) and under which a number of fishery management plans were
developed, was replaced in 2014 by the National and Regional Inshore Fisheries
Forums (NIFF, RIFFs).
These bodies are consultative forums, the members of which are representative of the inshore
fisheries sector and other stakeholder groups. The National forum (NIFF) provides a structure
with which each of the regional forums can interact with each other and with the Marine
Agencies, DAFM and the Minister.
Management of oyster fisheries is the responsibility of The Department of Environment,
Climate and Communications, implemented through Inland Fisheries Ireland (IFI). In many
cases, however, management responsibility for oysters is devolved through Fishery Orders or
Aquaculture licences to local co-operatives.
The main customers for this review are DAFM, RIFFs, NIFF and other Departments and
Authorities listed above.
Funding: EMFAF; Government of Ireland
Accelerated Sparse Recovery via Gradient Descent with Nonlinear Conjugate Gradient Momentum
This paper applies an idea of adaptive momentum for the nonlinear conjugate
gradient to accelerate optimization problems in sparse recovery. Specifically,
we consider two types of minimization problems: a (single) differentiable
function and the sum of a non-smooth function and a differentiable function. In
the first case, we adopt a fixed step size to avoid the traditional line search
and establish the convergence analysis of the proposed algorithm for a
quadratic problem. This acceleration is further incorporated with an operator
splitting technique to deal with the non-smooth function in the second case. We
use a convex functional and a nonconvex functional as two case studies to
demonstrate the efficiency of the proposed approaches over traditional methods.
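The first case (a differentiable quadratic with adaptive momentum) can be sketched as follows. This is a hedged illustration, not the paper's algorithm: the paper adopts a fixed step size precisely to avoid line search, whereas for a self-contained correctness check this sketch uses the exact line-search step available for quadratics, under which Fletcher-Reeves momentum reduces to classical conjugate gradient. The momentum update is the element being illustrated.

```python
import numpy as np

def ncg_quadratic(A, b, steps=100, tol=1e-10):
    """Minimize 0.5 x'Ax - b'x by gradient descent with Fletcher-Reeves momentum."""
    x = np.zeros_like(b)
    g = A @ x - b                         # gradient of the quadratic
    d = -g                                # initial search direction
    for _ in range(steps):
        if np.linalg.norm(g) < tol:
            break
        Ad = A @ d
        alpha = -(g @ d) / (d @ Ad)       # exact step along d (quadratic case)
        x = x + alpha * d
        g_new = A @ x - b
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves momentum coefficient
        d = -g_new + beta * d             # momentum-corrected direction
        g = g_new
    return x

rng = np.random.default_rng(1)
M = rng.normal(size=(30, 20))
A = M.T @ M + 0.1 * np.eye(20)            # symmetric positive definite
b = rng.normal(size=20)
x = ncg_quadratic(A, b)
print("residual:", np.linalg.norm(A @ x - b))
```

In the second, non-smooth case the paper couples this momentum with an operator-splitting (proximal) step; that coupling is not shown here.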
Anuário científico da Escola Superior de Tecnologia da Saúde de Lisboa - 2021
It is with great pleasure that we present the most recent (11th) edition of the Scientific Yearbook of the Escola Superior de Tecnologia da Saúde de Lisboa. As a higher-education institution, we are committed to promoting and encouraging scientific research in all areas of knowledge encompassed by our mission. This publication aims to disseminate all the scientific output produced by the professors, researchers, students, and non-teaching staff of ESTeSL during 2021. The yearbook thus reflects the hard and dedicated work of our community, which committed itself to producing high-quality scientific content shared with society in the form of books, book chapters, articles published in national and international journals, abstracts of oral communications and posters, as well as the results of first- and second-cycle degree work. The content of this publication therefore covers a wide variety of topics, from more fundamental themes to studies of practical application in specific health contexts, reflecting the plurality and diversity of the areas that define, and make unique, ESTeSL. We believe that scientific research is a fundamental axis for the development of society, which is why we encourage our students to engage in research activities and evidence-based practice from the beginning of their studies at ESTeSL. This publication, the largest ever, is an example of the success of those efforts, and we are very proud to share the results and discoveries of our researchers with the scientific community and the general public. We hope this yearbook inspires and motivates other students, health professionals, professors, and other collaborators to continue exploring new ideas and contributing to the advancement of science and technology within the body of knowledge of the areas that make up ESTeSL.
We thank everyone involved in producing this yearbook and wish you an inspiring and enjoyable read.
Model Diagnostics meets Forecast Evaluation: Goodness-of-Fit, Calibration, and Related Topics
Principled forecast evaluation and model diagnostics are vital in fitting probabilistic models and forecasting outcomes of interest. A common principle is that fitted or predicted distributions ought to be calibrated, ideally in the sense that the outcome is indistinguishable from a random draw from the posited distribution. Much of this thesis is centered on calibration properties of various types of forecasts.
In the first part of the thesis, a simple algorithm for exact multinomial goodness-of-fit tests is proposed. The algorithm computes exact p-values based on various test statistics, such as the log-likelihood ratio and Pearson's chi-square. A thorough analysis shows improvement on extant methods. However, the runtime of the algorithm grows exponentially in the number of categories and hence its use is limited.
In the second part, a framework rooted in probability theory is developed, which gives rise to hierarchies of calibration, and applies to both predictive distributions and stand-alone point forecasts. Based on a general notion of conditional T-calibration, the thesis introduces population versions of T-reliability diagrams and revisits a score decomposition into measures of miscalibration, discrimination, and uncertainty. Stable and efficient estimators of T-reliability diagrams and score components arise via nonparametric isotonic regression and the pool-adjacent-violators algorithm. For in-sample model diagnostics, a universal coefficient of determination is introduced that nests and reinterprets the classical R^2 of least squares regression.
In the third part, probabilistic top lists are proposed as a novel type of prediction in classification, which bridges the gap between single-class predictions and predictive distributions. The probabilistic top list functional is elicited by strictly consistent evaluation metrics, based on symmetric proper scoring rules, which admit comparison of various types of predictions.
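The brute-force idea behind an exact multinomial goodness-of-fit test can be sketched directly: enumerate every possible count vector and sum the null probabilities of outcomes whose statistic is at least as extreme as the observed one. The thesis proposes a faster algorithm; this naive enumeration is only a hedged illustration, and it makes the exponential growth in the number of categories k obvious.

```python
from itertools import product
from math import comb, log, prod

def multinomial_pmf(counts, probs):
    """Exact multinomial probability of a count vector."""
    coef, rem = 1, sum(counts)
    for c in counts:
        coef *= comb(rem, c)
        rem -= c
    return coef * prod(p ** c for p, c in zip(probs, counts))

def llr(counts, probs):
    """Log-likelihood-ratio statistic G = 2 * sum o_i * log(o_i / e_i)."""
    n = sum(counts)
    return 2 * sum(c * log(c / (n * p)) for c, p in zip(counts, probs) if c > 0)

def exact_pvalue(observed, probs):
    n, k = sum(observed), len(observed)
    g_obs = llr(observed, probs)
    p = 0.0
    for counts in product(range(n + 1), repeat=k):   # (n+1)^k vectors: exponential in k
        if sum(counts) == n and llr(counts, probs) >= g_obs - 1e-12:
            p += multinomial_pmf(counts, probs)
    return p

print(exact_pvalue((8, 1, 1), (1/3, 1/3, 1/3)))
```

Swapping `llr` for Pearson's chi-square statistic changes only the ordering of outcomes, which is exactly why the choice of test statistic matters for exact tests.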
Deep Transfer Learning Applications in Intrusion Detection Systems: A Comprehensive Review
Globally, the external Internet is increasingly being connected to the
contemporary industrial control system. As a result, there is an immediate need
to protect the network from several threats. The key infrastructure of
industrial activity may be protected from harm by using an intrusion detection
system (IDS), a preventive measure mechanism, to recognize new kinds of
dangerous threats and hostile activities. The most recent artificial
intelligence (AI) techniques used to create IDS in many kinds of industrial
control networks are examined in this study, with a particular emphasis on
IDS-based deep transfer learning (DTL). The latter can be seen as a type of
information fusion that merges and/or adapts knowledge from multiple domains to
enhance performance on the target task, particularly when labeled data in the
target domain are scarce. Publications issued after 2015 were taken into
account. These selected publications were divided into three categories:
DTL-only and IDS-only publications inform the introduction and background,
while DTL-based IDS papers form the core of this review.
Researchers will be able to have a better grasp of the current state of DTL
approaches used in IDS in many different types of networks by reading this
review paper. Other useful information, such as the datasets used, the type of
DTL employed, the pre-trained network, the IDS techniques, the evaluation
metrics (including accuracy/F-score and false alarm rate (FAR)), and the
improvement gained, is also covered. The algorithms and methods used in several
studies, which illustrate deeply and clearly the principle of each DTL-based
IDS subcategory, are presented to the reader.
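The review compares models on accuracy, F-score, and FAR. These metrics follow directly from a binary confusion matrix (attack vs. normal traffic); the counts below are made up purely for illustration.

```python
def ids_metrics(tp, fp, tn, fn):
    """Accuracy, F-score, and false alarm rate from binary confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)                      # a.k.a. detection rate
    f_score = 2 * precision * recall / (precision + recall)
    far = fp / (fp + tn)                         # false alarms among normal traffic
    return accuracy, f_score, far

acc, f1, far = ids_metrics(tp=90, fp=8, tn=92, fn=10)
print(f"accuracy={acc:.3f}  F-score={f1:.3f}  FAR={far:.3f}")
```

Reporting FAR alongside the F-score matters for IDS work, since even a small false-alarm rate can swamp operators when normal traffic vastly outnumbers attacks.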