5 research outputs found

    Simulation-based fitness landscape analysis and optimisation of complex problems

    Widespread hard optimisation problems in economics and logistics are characterised by large dimensions, uncertainty and nonlinearity, and require more powerful methods of stochastic optimisation than traditional ones. Simulation optimisation is a powerful tool for solving these problems. Moreover, fitness landscape analysis techniques provide an efficient approach to better selection of a suitable optimisation algorithm. The concept and techniques of fitness landscape analysis are described. A formalised scheme for simulation optimisation enhanced with fitness landscape analysis is given. Benchmark fitness landscape analysis is performed to find relations between the efficiency of an optimisation algorithm and the structural features of a fitness landscape. A case study in simulation optimisation of vehicle routing and scheduling is described. Various optimisation scenarios with application of fitness landscape analysis are discussed and investigated.
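One standard fitness landscape analysis measure is ruggedness, commonly estimated from the autocorrelation of fitness values along a random walk: high autocorrelation suggests a smooth landscape, low suggests a rugged one. A minimal sketch in Python, assuming a simple OneMax-style benchmark as a stand-in for the simulation model (the function and parameters below are illustrative, not the abstract's actual benchmarks):

```python
import random
import statistics

def fitness(bits):
    # Hypothetical stand-in benchmark: OneMax (count of ones).
    return sum(bits)

def random_walk_autocorrelation(n_bits=64, steps=2000, lag=1, seed=0):
    """Estimate the lag-k autocorrelation of fitness along a one-bit-flip
    random walk over bit strings."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n_bits)]
    series = []
    for _ in range(steps):
        series.append(fitness(x))
        x[rng.randrange(n_bits)] ^= 1  # move to a random neighbour
    mean = statistics.fmean(series)
    var = statistics.pvariance(series)
    cov = statistics.fmean(
        (series[t] - mean) * (series[t + lag] - mean)
        for t in range(len(series) - lag)
    )
    return cov / var if var else 0.0
```

For a smooth function like OneMax the estimate is close to 1; replacing `fitness` with a rugged benchmark (e.g. an NK landscape) drives it towards 0, which is the kind of structural feature the abstract relates to algorithm choice.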

    Statistical parsing of noun phrase structure

    Noun phrases (NPs) are a crucial part of natural language, exhibiting in many cases an extremely complex structure. However, NP structure is largely ignored by the statistical parsing field, as the most widely-used corpus is not annotated with it. This lack of gold-standard data has restricted all previous efforts to parse NPs, making it impossible to perform the supervised experiments that have achieved high performance in so many Natural Language Processing (NLP) tasks. We comprehensively solve this problem by manually annotating NP structure for the entire Wall Street Journal section of the Penn Treebank. The inter-annotator agreement scores that we attain refute the belief that the task is too difficult, and demonstrate that consistent NP annotation is possible. Our gold-standard NP data is now available and will be useful for all parsers. We present three statistical methods for parsing NP structure. Firstly, we apply the Collins (2003) model, and find that its recovery of NP structure is significantly worse than its overall performance. Through much experimentation, we determine that this is not a result of the special base-NP model used by the parser, but primarily caused by a lack of lexical information. Secondly, we construct a wide-coverage, large-scale NP Bracketing system, applying a supervised model to achieve excellent results. Our Penn Treebank data set, which is orders of magnitude larger than those used previously, makes this possible for the first time. We then implement and experiment with a wide variety of features in order to determine an optimal model. Having achieved this, we use the NP Bracketing system to reanalyse NPs output by the Collins (2003) parser. Our post-processor outperforms this state-of-the-art parser. For our third model, we convert the NP data to CCGbank (Hockenmaier and Steedman, 2007), a corpus that uses the Combinatory Categorial Grammar (CCG) formalism.
We experiment with a CCG parser and, again, implement features that improve performance. We also evaluate the CCG parser against the Briscoe and Carroll (2006) reannotation of DepBank (King et al., 2003), another corpus that annotates NP structure. This supplies further evidence that parser performance is increased by improving the representation of NP structure. Finally, the error analysis we carry out on the CCG data shows that, again, a lack of lexicalisation causes difficulties for the parser. We find that NPs are particularly reliant on this lexical information, due to their exceptional productivity and the reduced explicitness present in modifier sequences. Our results show that NP parsing is a significantly harder task than parsing in general. This thesis comprehensively analyses the NP parsing task. Our contributions allow wide-coverage, large-scale NP parsers to be constructed for the first time, and motivate further NP parsing research for the future. The results of our work can provide significant benefits for many NLP tasks, as the crucial information contained in NP structure is now available for all downstream systems.
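The core decision in NP bracketing can be illustrated with the classic adjacency baseline for three-word noun compounds: compare the association strength of the two candidate sub-bracketings and choose the stronger. A toy sketch in Python with invented counts (the thesis itself uses a supervised model over far richer features and corpus statistics):

```python
from collections import Counter

# Hypothetical association counts; real systems derive these from
# large corpora or annotated data.
bigram_counts = Counter({
    ("world", "oil"): 12,
    ("oil", "prices"): 85,
    ("crude", "oil"): 40,
    ("oil", "futures"): 5,
})

def bracket_np(w1, w2, w3, counts):
    """Adjacency baseline: prefer left bracketing [[w1 w2] w3] when
    (w1, w2) is more strongly associated than (w2, w3), else right
    bracketing [w1 [w2 w3]]."""
    return "left" if counts[(w1, w2)] >= counts[(w2, w3)] else "right"
```

With these counts, "world oil prices" comes out right-branching ("world [oil prices]") while "crude oil futures" comes out left-branching ("[crude oil] futures"), matching the usual analyses of those phrases.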

    School bus selection, routing and scheduling.

    The aim of this thesis is to develop formulations and exact algorithms for the school bus routing and scheduling problem, and to develop an integrated software implementation using Xpress-MP/CPLEX and ArcGIS of ESRI, a geographical information system software package. In this thesis, bus flow, single commodity flow, two-commodity flow, multi-commodity flow, and time window formulations have been developed. They capture all of the important elements of the School Bus Routing and Scheduling Problem (SBRSP), including homogeneous or heterogeneous bus fleets, the identification of bus stops from a large set of potential bus stops, and the assignment of students to stops and stops to routes. They allow for the one stop-one bus and one stop-multi bus scenarios. Each formulation of the SBRSP has a linear programming relaxation, and we present the relationships among them. We present a Branch-and-Cut exact algorithm which makes use of new linearization techniques, new valid inequalities, and the first valid equalities. We develop an integrated software package that is based on a Geographical Information System (GIS) map-based interface, linking to an Xpress-MP/CPLEX solver. The interface between GIS and Xpress-MP is written in VBA and VC++. Dept. of Mathematics and Statistics. Paper copy at Leddy Library: Theses & Major Papers - Basement, West Bldg. / Call Number: Thesis2005 .K4. Source: Dissertation Abstracts International, Volume: 66-11, Section: B, page: 6250. Thesis (Ph.D.)--University of Windsor (Canada), 2005.
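For intuition, the three coupled decisions of the SBRSP — selecting stops from a candidate set, covering every student within walking distance, and routing a capacity-limited bus — can be sketched as a tiny brute-force model. The thesis solves this with MIP formulations and Branch-and-Cut in Xpress-MP/CPLEX; the instance data, walk limit, and capacity below are invented for illustration:

```python
import itertools
import math

# Tiny hypothetical instance (coordinates are illustrative only).
school = (0.0, 0.0)
candidate_stops = [(1, 0), (2, 1), (3, 0), (1, 2)]
students = [(1.2, 0.1), (2.1, 0.8), (2.9, 0.2)]
WALK_LIMIT = 0.6   # max student-to-stop walking distance
BUS_CAPACITY = 3   # single bus must seat all riders

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def covered(stops):
    """Every student has a chosen stop within walking distance."""
    return all(any(dist(s, p) <= WALK_LIMIT for p in stops)
               for s in students)

def route_length(stops):
    """Shortest bus tour school -> stops -> school, by enumeration."""
    best = math.inf
    for perm in itertools.permutations(stops):
        tour = [school, *perm, school]
        best = min(best, sum(dist(a, b) for a, b in zip(tour, tour[1:])))
    return best

def solve():
    """Enumerate stop subsets; keep the cheapest feasible routing."""
    best = (math.inf, None)
    for r in range(1, len(candidate_stops) + 1):
        for stops in itertools.combinations(candidate_stops, r):
            if len(students) <= BUS_CAPACITY and covered(stops):
                best = min(best, (route_length(stops), stops))
    return best
```

On this instance the unique minimal cover is the three stops `(1, 0)`, `(2, 1)`, `(3, 0)`; the exact formulations in the thesis encode the same coverage, assignment, and routing constraints as linear inequalities so that much larger instances become tractable.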

    Depth Modulation in Radiotherapy

    Intensity Modulated Radiotherapy (IMRT) has been a major field of research over the last thirty years and is today the standard in radiotherapy treatment of cancer. The introduction of IMRT into the clinical environment has greatly improved the ability of the treatment team to conform the radiation dose to the tumour volume. Alongside improvements in image guidance, IMRT has led to a reduction in side effects for patients and opened up the possibilities of dose escalation and hypofractionation. IMRT is, however, by no means perfect. IMRT and derivatives such as Volumetric Modulated Arc Therapy (VMAT) are limited by the exit dose from the X-ray beams and deliver a significant amount of radiation dose to normal tissues. The much publicised alternative to IMRT is proton therapy. Proton therapy beams deposit dose over a narrow range, resulting in minimal exit dose. The future of radiotherapy certainly involves a significant contribution from proton therapy, but the availability to patients is likely to remain limited for a long time to come. The research in this thesis considers the possibility of further improving IMRT by modulating radiotherapy beams along their direction of travel as well as across their intensity, i.e. the so-called ‘Depth Modulation’ of the thesis title. Although there are numerous possible ways to achieve depth modulation, this work proposes a combination of X-ray beams with electron beams of different energies, with both modalities delivered by a conventional medical linear accelerator. The research in this thesis is concerned with developing a proof of principle for this method. It is to some extent a theoretical study; however, at each step the possibility of practical implementation has been considered, with the view that the method is only a viable proposition if it can be effectively implemented in clinical practice. The technique proposed in this work is to use electron beams delivered through the X-ray multileaf collimator (MLC) with a standard patient set-up.
To reduce scatter and photon contamination it is proposed to remove the scattering foils from the beamline and to employ optimisation of the electron and photon components to compensate for any remaining penumbra broadening. The research has shown that improvements to dosimetry through removal of the scattering foil would allow delivery without reducing the source to surface distance, making a single isocentre synergistic delivery for both the electron and photon components practical. Electron dose segments have been calculated using Monte Carlo radiation transport, and a procedure to optimise dose for the combined photon and electron IMRT technique has been developed. Through development of the optimisation procedure, the characteristics of the mixed modality technique have been examined. A number of findings are demonstrated, such as the benefit of gaps between electron segments, the benefits of optimising for energy in three dimensions, and the dependence of the cost function minimum on the electron to photon ratio. Through clinical examples it has been shown that for tumours close to the surface the mixed modality technique has the potential to reduce the dose to normal tissues, particularly in the low-dose wash. Calculations of relative malignant induction probability demonstrate that this reduction in dose has the potential to reduce the incidence of secondary cancer induction. Possible treatment sites for application of the technique include breast, head and neck, brain and sarcomas.
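The segment-weight optimisation described above can be viewed, at its simplest, as fitting non-negative weights for precomputed dose distributions to a prescription. A minimal sketch, assuming hand-made 1-D depth-dose columns in place of Monte Carlo data and a plain least-squares objective with projected gradient descent in place of the thesis's cost function (all numbers are illustrative):

```python
# Rows = voxels along depth; columns = segments (electron, photon).
segments = [
    [1.0, 0.3],   # shallow voxel: electron segment dominates
    [0.8, 0.6],
    [0.3, 0.9],
    [0.05, 1.0],  # deep voxel: photon segment dominates
]
prescription = [1.0, 1.0, 1.0, 1.0]  # uniform target dose

def optimise_weights(A, d, iters=5000, lr=0.05):
    """Projected gradient descent on ||A w - d||^2 with w >= 0."""
    n = len(A[0])
    w = [0.0] * n
    for _ in range(iters):
        resid = [sum(A[i][j] * w[j] for j in range(n)) - d[i]
                 for i in range(len(A))]
        grad = [2 * sum(A[i][j] * resid[i] for i in range(len(A)))
                for j in range(n)]
        w = [max(0.0, w[j] - lr * grad[j]) for j in range(n)]
    return w
```

Mixing the two columns lets the combined beam approximate the flat prescription better than either modality alone, which is the basic mechanism behind the electron-to-photon ratio dependence noted in the abstract.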

    Proceedings of the 9th international conference on disability, virtual reality and associated technologies (ICDVRAT 2012)

    The proceedings of the conference.