218 research outputs found
Design, construction and commissioning of the Thermal Screen Control System for the CMS Tracker detector at CERN
The CERN (European Organization for Nuclear Research) laboratory is currently building the Large Hadron Collider (LHC). Four international collaborations have designed (and are now constructing) detectors able to exploit the physics potential of this collider. Among them is the Compact Muon Solenoid (CMS), a general-purpose detector optimized for the search for the Higgs boson and for physics beyond the Standard Model of fundamental interactions between elementary particles. This thesis presents, in particular, the design, construction, commissioning and testing of the control system for a screen that provides a thermal separation between the Tracker and the ECAL (Electromagnetic CALorimeter) detector of CMS. Chapter 1 introduces the new challenges posed by these installations and deals, in more detail, with the Tracker detector of CMS. The size of current experiments for high energy physics is comparable to that of a small industrial plant: therefore, the techniques used for controls and regulation, although highly customized, must adopt Commercial Off The Shelf (COTS) hardware and software. The "slow control" systems for the experiments at CERN make extensive use of PLCs (Programmable Logic Controllers) and SCADA (Supervisory Control and Data Acquisition) to provide safety levels (namely interlocks), regulation, remote control of high- and low-voltage distribution, as well as archiving and trending facilities. The system described in this thesis must follow the same philosophy and, at the same time, comply with international engineering standards. While the interlock applications belong straightforwardly to the category of DES (Discrete Event Systems), and are therefore treated with a Finite State Machine approach, other controls are more strictly related to the regulation problem.
Chapter 2 will focus on various aspects of modern process control and on the tools used to design the control system for the thermal screen: the principles upon which the controller is designed and tuned, and the model validated, including the Multiple Input-Multiple Output (MIMO) problems involved, are explained. The thermal screen itself, its constraints and the basis of its functioning are described in Chapter 3, where the thermodynamic design is discussed as well. For the LHC experiments, the aim of a control system is also to provide a well defined SIL (Safety Integrity Level) to keep the system in a safe condition; yet, in this case, it is also necessary to regulate the temperature of the system within certain values and respect the constraints arising from the specific needs of the above mentioned subsystems. The most natural choice for a PLC-based controller is a PID (Proportional Integral Derivative) controller. This kind of controller is widely used in many industrial processes, from batch production in the pharmaceutical or automotive field to chemical plants, distillation columns and, in general, wherever reliable and robust control is needed. Many techniques are in use to design and tune PID controllers; the approach followed in this thesis is that of black-box modeling: the system is modeled in the time domain, a transfer function is inferred and a controller is designed. Then, a system identification procedure allows for a more thorough study and validation of the model, and for the controller tuning. The design of the thermal screen control system, including system modeling, controller design and MIMO implementation issues, is covered entirely in Chapter 4. A systems engineering methodology has been followed throughout to adequately manage and document every phase of the project, complying with time and budget constraints.
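The PID regulation loop described above can be illustrated with a minimal sketch. This is not the thesis implementation: the plant here is a hypothetical first-order thermal lag (the time constant, gain, ambient temperature and PID gains are all assumed values chosen for illustration), standing in for a black-box model of one thermal screen panel.

```python
# Minimal discrete PID loop against an assumed first-order thermal plant.
# All numeric values (gains, time constant, ambient) are illustrative only.

def simulate(kp=2.0, ki=0.5, kd=0.1, setpoint=18.0, dt=1.0, steps=200):
    """Run a PID temperature loop; return the final plant temperature."""
    tau, gain = 20.0, 1.0            # assumed plant time constant and gain
    temp = 25.0                      # arbitrary initial temperature
    integral, prev_err = 0.0, setpoint - temp
    for _ in range(steps):
        err = setpoint - temp
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv   # controller output
        prev_err = err
        # first-order lag: the plant relaxes toward ambient (22 C) plus
        # the contribution of the control input
        temp += dt / tau * (gain * u - (temp - 22.0))
    return temp

print(round(simulate(), 2))   # settles near the 18.0 setpoint
```

The integral term is what removes the steady-state offset against the 22 C ambient load; with proportional action alone the loop would settle a few tenths of a degree away from the setpoint.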
A risk analysis has been performed, using Layer of Protection Analysis (LOPA) and Hazard and Operability Studies (HAZOP), to understand the level of protection assured by the thermal screen and its control components. The tests planned and performed to validate the model and for quality assurance purposes are described in Chapter 5. A climatic chamber has been designed and built at CERN, where the real operating conditions of the thermal screen are simulated. Detailed test procedures have been defined, following IEEE standards, in order to completely check every single thermal screen panel. This installation allows for a comparison of different controller tuning approaches, including IAE minimization, Skogestad tuning rules, Internal Model Control (IMC), and a technique based upon the MATLAB Optimization Toolbox. The installation is also used for system identification purposes and for the acceptance tests of every thermal screen panel (allowing for both electrical and hydraulic checks). Tests have also been performed in the CERN West Hall experimental area, where a full control system has been set up to interlock high- and low-voltage lines. The interlock system's operating procedures and behaviour have been validated under real operating conditions of the detector exposed to a particle beam. The satisfactory results of these tests bring the project to full completion, allowing the plan to reach the "exit" stage, when the thermal screen is ready to be installed in the Tracker and ready to be operational.
Evolutionary computation for trading systems
2007/2008
Evolutionary computations, also called evolutionary algorithms, consist of
several heuristics, which are able to solve optimization tasks by imitating
some aspects of natural evolution. They may use different levels of abstraction, but they are always working on populations of possible solutions for a
given task. The basic idea is that if only those individuals of a population
which meet a certain selection criteria reproduce, while the remaining individuals die, the population will converge to those individuals that best meet
the selection criteria. If imperfect reproduction is added the population can
begin to explore the search space and will move to individuals that have an
increased selection probability and that hand down this property to their
descendants. These population dynamics follow the basic rule of the Darwinian evolution theory, which can be described in short as the “survival of the fittest”.
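The mechanism just described can be sketched in a few lines. This is a toy example (the fitness function, population size and mutation scale are assumptions, not from the thesis): the fittest half of the population survives, and "imperfect reproduction" is modeled as Gaussian mutation of the survivors.

```python
# Toy evolutionary loop: survival of the fittest plus imperfect
# reproduction, maximising an assumed fitness f(x) = -(x - 3)^2.
import random

def evolve(pop_size=40, generations=60, seed=1):
    random.seed(seed)
    fitness = lambda x: -(x - 3.0) ** 2
    pop = [random.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)          # rank by fitness
        parents = pop[: pop_size // 2]               # fittest half survives
        # imperfect reproduction: each parent spawns a mutated child
        children = [p + random.gauss(0, 0.3) for p in parents]
        pop = parents + children                     # next generation
    return max(pop, key=fitness)

best = evolve()
print(round(best, 2))   # best individual lies close to the optimum x = 3
```

Because the unmutated parents are carried over, the best individual never worsens between generations, which is the elitist variant of the selection scheme described above.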
Although evolutionary computations belong to a relatively new research area,
from a computational perspective they have already shown some promising
features such as:
• evolutionary methods strike a remarkable balance between efficiency
and efficacy;
• evolutionary computations are well suited for parameter optimisation;
• these algorithms allow a wide variety of extensions and constraints that cannot be accommodated by traditional methods;
• evolutionary methods are easily combined with other optimization
techniques and can also be extended to multi-objective optimization.
From an economic perspective, these methods appear to be particularly well
suited to a wide range of possible financial applications. In this
thesis I study evolutionary algorithms
• for time series prediction;
• to generate trading rules;
• for portfolio selection.
It is commonly believed that asset prices are not random, but are permeated by complex interrelations that often translate into asset mispricing and
may give rise to potentially profitable opportunities. Classical financial approaches, such as dividend discount models or even capital asset pricing theories, are not able to capture these market complexities. Thus, in the
last decades, researchers have employed intensive econometric and statistical
modeling that examine the effects of a multitude of variables, such as price-
earnings ratios, dividend yields, interest rate spreads and changes in foreign
exchange rates, on a broad and variegated range of stocks at the same time.
However, these models often result in complex functional forms difficult to
manage or interpret and, in the worst case, are solely able to fit a given time
series but are useless for predicting it. In parallel with these quantitative approaches,
other researchers have focused on the impact of investor psychology (in particular, herding and overreaction) and on the consequences of considering
informed signals from management and analysts, such as share repurchases
and analyst recommendations. These theories are guided by intuition and
experience, and are thus difficult to translate into a mathematical framework.
Hence the need to combine these points of view in order to
develop models that simultaneously examine hundreds of variables, including qualitative information, and that have user-friendly representations. To this end, the thesis focuses on the study of methodologies that
satisfy these requirements by integrating economic insights, derived from
academic and professional knowledge, and evolutionary computations.
The main task of this work is to provide efficient algorithms based on the
evolutionary paradigm of biological systems in order to compute optimal
trading strategies for various profit objectives under economic and statistical constraints. The motivations for constructing such optimal strategies
are:
i) the necessity to overcome data-snooping and survivorship bias in
order to learn to predict good trading opportunities by using market
and/or technical indicators as features on which to base the forecasting;
ii) the feasibility of using these rules as benchmarks for real trading
systems;
iii) the capability of quantitatively ranking various markets with respect
to their profitability according to a given criterion, thus making portfolio allocation possible.
More precisely, I present two algorithms that use artificial expert trading
systems to predict financial time series, and a procedure to generate integrated neutral strategies for active portfolio management.
The first algorithm is an automated procedure that simultaneously selects
variables and detects outliers in a dynamic linear model using information
criteria as objective functions and diagnostic tests as constraints for the
distributional properties of errors. The novelties are the automatic implementation of econometric conditions in the model selection step, which allows
a better exploration of the solution space on the one hand, and the use
of evolutionary computations to efficiently reduce a very large number of independent variables on the other hand.
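The flavor of this first algorithm — evolutionary search over variable subsets with an information criterion as the objective — can be sketched as follows. This is a hedged illustration, not the thesis procedure: the data are synthetic, the search is a bare-bones genetic loop over inclusion bitmasks, and the fitness is plain AIC without the diagnostic-test constraints the thesis adds.

```python
# Sketch: genetic search over variable-inclusion bitmasks, minimising
# AIC = n*log(RSS/n) + 2k for an OLS fit on the selected columns.
# Synthetic data; the thesis procedure additionally constrains the
# error distribution with diagnostic tests.
import random
import numpy as np

def aic(X, y, mask):
    cols = [j for j, m in enumerate(mask) if m]
    if not cols:
        return float("inf")
    Xs = X[:, cols]
    beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
    rss = float(np.sum((y - Xs @ beta) ** 2))
    n = len(y)
    return n * np.log(rss / n + 1e-12) + 2 * len(cols)

def select(X, y, pop=30, gens=40, seed=0):
    rng = random.Random(seed)
    p = X.shape[1]
    population = [[rng.randint(0, 1) for _ in range(p)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda m: aic(X, y, m))   # fittest first
        parents = population[: pop // 2]
        children = []
        for m in parents:
            child = m[:]
            child[rng.randrange(p)] ^= 1              # bit-flip mutation
            children.append(child)
        population = parents + children
    return min(population, key=lambda m: aic(X, y, m))

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
y = 2.0 * X[:, 1] - 1.5 * X[:, 4] + rng.normal(scale=0.1, size=200)
mask = select(X, y)
print(mask)   # the search should keep the informative columns 1 and 4
```

The penalty term 2k is what drives the reduction: adding an uninformative regressor barely lowers the residual sum of squares, so its AIC cost usually outweighs its benefit.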
In the second algorithm, the novelty is given by the definition of evolutionary
learning in financial terms and its use in a multi-objective genetic algorithm
in order to generate technical trading systems.
The last tool is based on a trading strategy on six assets, where future
movements of each variable are obtained by an evolutionary procedure that
integrates various types of financial variables. The contribution is given
by the introduction of a genetic algorithm to optimize trading signal parameters and by the way in which different pieces of information are represented and
collected.
In order to compare the contribution of this work to “classical” techniques
and theories, the thesis is divided into three parts. The first part, titled
Background, collects Chapters 2 and 3. Its purpose is to provide an introduction to search/optimization evolutionary techniques on one hand, and to
the theories that relate the predictability in financial markets with the concept of efficiency proposed over time by scholars on the other hand. More
precisely, Chapter 2 introduces the basic concepts and major areas of evolutionary computation. It presents a brief history of three major types of evolutionary algorithms, i.e. evolution strategies, evolutionary programming
and genetic algorithms, and points out similarities and differences among
them. Moreover it gives an overview of genetic algorithms and describes
classical and genetic multi-objective optimization techniques. Chapter 3
first presents an overview of the literature on the predictability of financial
time series. In particular, the extent to which the efficiency paradigm is
affected by the introduction of new theories, such as behavioral finance, is
described in order to justify the market forecasting methodologies developed
by practitioners and academics in the last decades. Then, a description of
the econometric and financial techniques that will be used in conjunction
with evolutionary algorithms in the successive chapters is provided. Special
attention is paid to economic implications, in order to highlight merits and
shortcomings from a practitioner perspective.
The second part of the thesis, titled Trading Systems, is devoted to the description of two procedures I have developed in order to generate artificial
trading strategies on the basis of evolutionary algorithms, and it groups
Chapters 4 and 5. In particular, Chapter 4 presents a genetic algorithm for
variable selection by minimizing the error in a multiple regression model.
Measures of errors such as ME, RMSE, MAE, Theil’s inequality coefficient
and CDC are analyzed choosing models based on AIC, BIC, ICOMP and
similar criteria. Two penalty-function components are considered: the
significance level and the Durbin–Watson statistic. Asymptotic properties of
functions are tested on several financial variables including stocks, bonds,
returns, composite prices indices from the US and the EU economies. Variables with outliers that distort the efficiency and consistency of estimators
are removed to solve masking and smearing problems that they may cause in
estimations. Two examples complete the chapter. In both cases, models are
designed to produce short-term forecasts for the excess returns of the MSCI
Europe Energy sector on the MSCI Europe index and a recursive estimation-
window is used to shed light on their predictability performances. In the first
application the data-set is obtained by a reduction procedure from a very
large number of leading macro indicators and financial variables stacked
at various lags, while in the second the complete set of 1-month lagged
variables is considered. Results show a promising capability to predict excess sector returns through the selection, using the proposed methodology,
of most valuable predictors. In Chapter 5 the paradigm of evolutionary
learning is defined and applied in the context of technical trading rules for
stock timing. A new genetic algorithm is developed by integrating statistical
learning methods and the bootstrap into a multi-objective non-dominated sorting
algorithm with variable string length, making it possible to evaluate statistical
and economic criteria at the same time. Subsequently, the chapter discusses
a practical case, represented by a simple trading strategy where total funds
are invested in either the S&P 500 Composite Index or in 3-month Treasury
Bills. In this application, the most informative technical indicators are selected by the algorithm from a set of almost 5000 signals. These
signals are combined into a unique trading signal by a learning method. I
test the expert weighting solution obtained by the plurality voting committee, the Bayesian model averaging and Boosting procedures with data from
the S&P 500 Composite Index, in three market phases, up-trend, down-
trend and sideways-movements, covering the period 2000–2006.
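The simplest of the combination schemes mentioned above, the plurality voting committee, can be sketched directly. The data here are assumed for illustration (three hypothetical expert signals over five days, with 1 meaning "hold the index" and 0 meaning "hold Treasury Bills"); the thesis applies the idea to the signals selected by the algorithm.

```python
# Plurality-voting combination of binary trading signals.
# Each row holds one day's signals from all experts; the majority wins.
def plurality_vote(signal_matrix):
    combined = []
    for day in signal_matrix:
        # strict majority of 1-votes => go long the index, else T-bills
        combined.append(1 if sum(day) * 2 > len(day) else 0)
    return combined

# assumed example: three experts over five days
signals = [
    [1, 1, 0],
    [0, 1, 0],
    [1, 1, 1],
    [0, 0, 1],
    [1, 0, 0],
]
print(plurality_vote(signals))  # → [1, 0, 1, 0, 0]
```

Bayesian model averaging and boosting replace the equal votes with weights learned from each expert's past performance, but the combination step has the same shape: many signals in, one trading signal out.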
In the third part, titled Portfolio Selection, I explain how portfolio optimization models may be constructed on the basis of evolutionary algorithms and
on the signals produced by artificial trading systems. First, market neutral
strategies from an economic point of view are introduced, highlighting their
risks and benefits and focusing on their quantitative formulation. Then, a
description of the GA-Integrated Neutral tool, a MATLAB set of functions
based on genetic algorithms for active portfolio management, is given. The
algorithm specializes in the parameter optimization of trading signals for
an integrated market neutral strategy. The chapter concludes showing an
application of the tool as a support to decisions in the Absolute Return
Interest Rate Strategies sub-fund of Generali Investments.
Working Notes from the 1992 AAAI Workshop on Automating Software Design. Theme: Domain Specific Software Design
The goal of this workshop is to identify different architectural approaches to building domain-specific software design systems and to explore issues unique to domain-specific (vs. general-purpose) software design. Some general issues that cut across the particular software design domain include: (1) knowledge representation, acquisition, and maintenance; (2) specialized software design techniques; and (3) user interaction and user interface
New Approaches in Automation and Robotics
The book New Approaches in Automation and Robotics offers in 22 chapters a collection of recent developments in automation, robotics as well as control theory. It is dedicated to researchers in science and industry, students, and practicing engineers, who wish to update and enhance their knowledge on modern methods and innovative applications. The authors and editor of this book wish to motivate people, especially under-graduate students, to get involved with the interesting field of robotics and mechatronics. We hope that the ideas and concepts presented in this book are useful for your own work and could contribute to problem solving in similar applications as well. It is clear, however, that the wide area of automation and robotics can only be highlighted at several spots but not completely covered by a single book
F-8C digital CCV flight control laws
A set of digital flight control laws was designed for the NASA F-8C digital fly-by-wire aircraft. The control laws emphasize Control Configured Vehicle (CCV) benefits. Specific pitch-axis objectives were improved handling qualities, angle-of-attack limiting, gust alleviation, drag reduction in steady and maneuvering flight, and a capability to fly with reduced static stability. The lateral-directional design objectives were improved Dutch roll damping and turn coordination over a wide range of angle-of-attack. An overall program objective was to explore the use of modern control design methodology to achieve these specific CCV benefits. Tests for verifying system integrity, an experimental design for handling qualities evaluation, and recommended flight test investigations were specified
The 1991 3rd NASA Symposium on VLSI Design
Papers from the symposium are presented from the following sessions: (1) featured presentations 1; (2) very large scale integration (VLSI) circuit design; (3) VLSI architecture 1; (4) featured presentations 2; (5) neural networks; (6) VLSI architectures 2; (7) featured presentations 3; (8) verification 1; (9) analog design; (10) verification 2; (11) design innovations 1; (12) asynchronous design; and (13) design innovations 2
The 1990 Goddard Conference on Space Applications of Artificial Intelligence
The papers presented at the 1990 Goddard Conference on Space Applications of Artificial Intelligence are given. The purpose of this annual conference is to provide a forum in which current research and development directed at space applications of artificial intelligence can be presented and discussed. The proceedings fall into the following areas: Planning and Scheduling, Fault Monitoring/Diagnosis, Image Processing and Machine Vision, Robotics/Intelligent Control, Development Methodologies, Information Management, and Knowledge Acquisition
Machine learning techniques for classification problems related to therapies in diabetes patients
The growing use of technology and cyber tools has been embraced by the healthcare sector in many ways. An interesting and currently not completely exploited field of application is "patient engagement". This thesis tackles the problem of classifying diabetes patients with machine learning, based on the therapy they are following, into patients who are following the correct therapy and patients who are not following the therapy, or for whom the therapy is not correct
Applications of MATLAB in Science and Engineering
The book consists of 24 chapters illustrating a wide range of areas where MATLAB tools are applied. These areas include mathematics, physics, chemistry and chemical engineering, mechanical engineering, biological (molecular biology) and medical sciences, communication and control systems, digital signal, image and video processing, system modeling and simulation. Many interesting problems have been included throughout the book, and its contents will be beneficial for students and professionals in wide areas of interest
COBE's search for structure in the Big Bang
The launch of the Cosmic Background Explorer (COBE) and the definition of the Earth Observing System (EOS) are two of the major events at NASA-Goddard. The three experiments contained in COBE (the Differential Microwave Radiometer (DMR), the Far Infrared Absolute Spectrophotometer (FIRAS), and the Diffuse Infrared Background Experiment (DIRBE)) are very important in measuring the Big Bang. DMR measures the isotropy of the cosmic background (direction of the radiation). FIRAS looks at the spectrum over the whole sky, searching for deviations, and DIRBE operates in the infrared part of the spectrum, gathering evidence of the earliest galaxy formation. By special techniques, the radiation coming from the solar system will be distinguished from that of extragalactic origin. Unique graphics will be used to represent the temperature of the emitting material. A cosmic event will be modeled of such importance that it will affect cosmological theory for generations to come. EOS will monitor changes in the Earth's geophysics during a whole solar cycle