
    Vehicle routing problems and an industrial application for reducing the ecological footprint

    In this thesis, we focused on the development of heuristic approaches for solving vehicle routing problems. We exploited research conducted on interval graphs and dominance properties of saturated tours to deal more efficiently with selective vehicle routing problems. An adaptation of a particle swarm optimization algorithm and a memetic algorithm are proposed. The metaheuristics that we developed are based on effective techniques such as optimal split, genetic crossover operators and local searches. We also addressed classical vehicle routing problems with time windows. Various pre-processing methods are introduced to obtain lower bounds on the number of vehicles. These methods draw on approaches using graph models, scheduling problems and bin packing problems with conflicts. Finally, we showed the effectiveness of the developed methods in an industrial context by implementing a portal of mobility services.
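The "optimal split" technique mentioned above has a compact classical form (a Prins-style split for capacitated routing): a giant tour visiting all customers is optimally cut into capacity-feasible vehicle trips via a shortest path in an auxiliary graph. A minimal sketch, with invented instance data, assuming a symmetric distance function:

```python
def split(tour, demand, dist, depot, capacity):
    """Cut `tour` (a customer sequence) into vehicle trips of minimum total cost.

    dist: dict {(a, b): cost}, assumed symmetric; depot is a node label.
    Returns (total_cost, list_of_trips).
    """
    n = len(tour)
    best = [float("inf")] * (n + 1)  # best[j] = cheapest cost to serve tour[:j]
    pred = [0] * (n + 1)
    best[0] = 0.0
    for i in range(n):                 # candidate trip starts at tour[i]
        load, cost = 0.0, 0.0
        for j in range(i, n):          # and ends at tour[j]
            load += demand[tour[j]]
            if load > capacity:        # trip no longer capacity-feasible
                break
            if j == i:
                cost = dist[depot, tour[i]] + dist[tour[i], depot]
            else:                      # extend the trip by one customer
                cost += (dist[tour[j - 1], tour[j]]
                         + dist[tour[j], depot] - dist[tour[j - 1], depot])
            if best[i] + cost < best[j + 1]:
                best[j + 1] = best[i] + cost
                pred[j + 1] = i
    trips, j = [], n                   # backtrack the trip boundaries
    while j > 0:
        i = pred[j]
        trips.append(tour[i:j])
        j = i
    return best[n], trips[::-1]
```

The dynamic program is a shortest path on a DAG whose arc (i, j) is the cost of serving customers i+1..j in a single trip, so the split is optimal for the given giant tour.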

    Design of a collaborative architectural model for pervasive computing at the edge of mobile networks

    Advances in peer-to-peer and wireless communication technologies have increasingly enabled the integration of mobile and pervasive devices into distributed systems and computing architectures in the Internet of Things paradigm. These devices are subject to continuous technological development: they tend to be further miniaturized with each generation, to the point of being regarded as de facto devices. The fruit of this progress is the emergence of collaborative mobile and pervasive computing, notably integrated into architectural models of the Internet of Things. The most important benefit of this form of computing is the ease of connecting a large number of pervasive and portable devices on the move across the different networks available. Despite continual advances, mobile and pervasive intelligent systems (networks, devices, software and connection technologies) still suffer from various limitations at several levels, such as maintaining connectivity, computing power, data storage capacity, communication speeds, the lifetime of power sources, and the efficiency of processing large tasks in terms of partitioning, scheduling and load balancing. The accelerated technological development of the equipment and devices of these mobile models is always accompanied by their intensive use. Given this reality, more effort is required both in their structural design, in hardware and software alike, and in the way they are managed. This involves improving, on the one hand, the architecture of these models and their communication technologies and, on the other hand, the scheduling and load balancing algorithms that allow their devices to carry out work efficiently. Our goal is to make these pervasive models more autonomous, intelligent and collaborative by strengthening the capabilities of their devices, their connectivity technologies and the applications that perform their tasks. We therefore established an autonomous, pervasive and collaborative architectural model deployed at the edge of networks. This model relies on various modern connection technologies such as wireless, peer-to-peer radio communication, and the technologies offered by Pycom's LoPy4 such as LoRa, BLE, Wi-Fi, Radio Wi-Fi and Bluetooth. The integration of these technologies makes it possible to maintain continuity of communication in various environments, even the most severe ones. Within this model, we also designed and evaluated a load balancing and scheduling algorithm to strengthen and improve its efficiency and quality of service (QoS) in different environments. The evaluation of this architectural model shows benefits such as improved connectivity and more efficient task execution.
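As one illustration of the kind of load balancing such a model evaluates, here is a generic sketch (not the thesis's actual algorithm) of a least-loaded dispatcher for heterogeneous collaborating devices, using the classical longest-processing-time-first heuristic; device names, speed factors and task costs are invented:

```python
import heapq

def schedule(tasks, devices):
    """Assign each (name, cost) task to the currently least-loaded device.

    devices: dict {device_name: speed_factor}; effective load = work / speed.
    Returns ({device_name: [task_names]}, makespan).
    """
    heap = [(0.0, name) for name in sorted(devices)]  # (current load, device)
    heapq.heapify(heap)
    assignment = {name: [] for name in devices}
    for task, cost in sorted(tasks, key=lambda t: -t[1]):  # largest task first
        load, name = heapq.heappop(heap)       # least-loaded device
        assignment[name].append(task)
        heapq.heappush(heap, (load + cost / devices[name], name))
    makespan = max(load for load, _ in heap)
    return assignment, makespan
```

Dispatching the largest tasks first keeps the final loads balanced, which matters when devices differ in computing power and battery budget.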

    Static and dynamic overproduction and selection of classifier ensembles with genetic algorithms

    The overproduce-and-choose strategy is a static classifier ensemble selection approach, which is divided into overproduction and selection phases. This thesis focuses on the selection phase, which is the main challenge in the overproduce-and-choose strategy. When this phase is implemented as an optimization process, the search criterion and the search algorithm are the two major topics involved. In this thesis, we concentrate on optimization processes conducted using genetic algorithms guided by both single- and multi-objective functions. We first focus on finding the best search criterion. Various search criteria are investigated, such as diversity, the error rate and ensemble size. Error rate and diversity measures are directly compared in the single-objective optimization approach. Diversity measures are combined with the error rate and with ensemble size, in pairs of objective functions, to guide the multi-objective optimization approach. Experimental results are presented and discussed. Thereafter, we show that besides focusing on the characteristics of the decision profiles of ensemble members, the control of overfitting at the selection phase of the overproduce-and-choose strategy must also be taken into account. We show how overfitting can be detected at the selection phase and present three strategies to control it. These strategies are tailored to the classifier ensemble selection problem and compared. This comparison allows us to show that a global validation strategy should be applied to control overfitting in optimization processes involving a classifier ensemble selection task. Furthermore, this study helped us establish that the global validation strategy can be used as a tool to measure the relationship between diversity and classification performance when diversity measures are employed as single-objective functions. Finally, the main contribution of this thesis is a proposed dynamic overproduce-and-choose strategy. While the static overproduce-and-choose strategy has traditionally focused on finding the most accurate subset of classifiers during the selection phase and using it to predict the class of all test samples, our dynamic overproduce-and-choose strategy selects the most confident subset of classifiers to label each test sample individually. Our method combines optimization and dynamic selection in a two-level selection phase. The optimization level is intended to generate a population of highly accurate classifier ensembles, while the dynamic selection level applies measures of confidence in order to select the ensemble with the highest degree of confidence in the current decision. Three different confidence measures are presented and compared. Our method outperforms classical static and dynamic selection strategies.
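The GA-driven selection phase with global validation can be sketched roughly as follows. This is a toy reconstruction, not the thesis's implementation: classifiers are represented by their binary predictions, bitmasks are evolved to minimize majority-vote error on an optimization set, and an archive of the best masks is re-scored on a held-out validation set (the "global validation" step) to pick the final ensemble:

```python
import random

def vote_error(mask, preds, labels):
    """Error rate of majority voting over the classifiers selected by mask."""
    chosen = [p for p, m in zip(preds, mask) if m]
    if not chosen:
        return 1.0
    wrong = 0
    for i, y in enumerate(labels):
        vote = sum(p[i] for p in chosen)
        wrong += (1 if 2 * vote > len(chosen) else 0) != y
    return wrong / len(labels)

def ga_select(preds_opt, y_opt, preds_val, y_val, pop_size=20, gens=30, seed=0):
    """Evolve 0/1 masks minimizing voting error on the optimization set,
    then return the archived mask with the lowest validation error."""
    rng = random.Random(seed)
    n = len(preds_opt)
    fit = lambda m: vote_error(m, preds_opt, y_opt)
    pop = [tuple(rng.randint(0, 1) for _ in range(n)) for _ in range(pop_size)]
    archive = {}
    for _ in range(gens):
        for m in pop:
            archive[m] = fit(m)
        # keep only the best few candidate ensembles
        archive = dict(sorted(archive.items(), key=lambda kv: kv[1])[:5])
        nxt = []
        while len(nxt) < pop_size:
            a = min(rng.sample(pop, 2), key=fit)    # tournament selection
            b = min(rng.sample(pop, 2), key=fit)
            cut = rng.randrange(1, n)               # one-point crossover
            child = a[:cut] + b[cut:]
            child = tuple(bit ^ (rng.random() < 0.1) for bit in child)  # mutation
            nxt.append(child)
        pop = nxt
    # global validation: re-score the archive on the held-out set
    return min(archive, key=lambda m: vote_error(m, preds_val, y_val))
```

Scoring the archived solutions on data unseen by the optimizer is what detects and controls overfitting of the selection phase itself.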

    Equifinality, uncertainty and multi-model procedures in hydrological forecasting at ungauged sites

    The objective of this research project is to analyze various hydrological forecasting methods for ungauged watersheds and to better understand their limitations in order to improve them. The project is divided into three distinct parts, each of which contributed to at least one of the seven scientific papers contained in this thesis. The first part consists in reducing parametric uncertainty by improving the calibration of hydrological models. A comparison and analysis of 10 automatic optimization methods identified the algorithms best suited to the calibration problems associated with this project. This yielded parameter sets that are more efficient and more robust, which turns out to be a strong assumption in some regionalization methods. In the same vein, parameter reduction methods were tested, and an approach that fixes parameters through global sensitivity analysis was selected. It was shown that the HSAMI hydrological model could, with between 8 and 11 of its 23 parameters fixed, maintain its performance in validation or in regionalization. However, despite the reduction of equifinality and the improved parameter identification, fixing parameters does not increase the performance of regionalization methods. This conclusion runs counter to the concept of parsimony generally accepted in the scientific community. The second part of the project concerns multi-model simulation, where it was shown that weighting the simulated hydrographs of several models significantly improves performance compared to the best individual model. A comparison of classical multi-model weighting approaches, along with the development of a new method, allowed the best tool for the needs of the project to be selected. Two multi-model experiments were carried out. First, the multi-model approach was applied in regionalization with three hydrological models. However, it was shown that the robustness of two of the models was insufficient to improve the performance of the third. Then, a novel multi-model approach, in which a single model is run with several different data sources, showed its ability to reduce errors related to meteorological data in simulation. The last part of the project relates directly to regionalization. It was shown that parametric equifinality has very little influence on the quality of forecasts at ungauged sites and that model performance matters more than parameter identifiability, at least for the hydrological model used here. A new hybrid regionalization approach was also proposed and showed better performance than the classical approaches. Finally, the regionalization approaches were put to the test in a numerical laboratory derived from a regional climate model. This made it possible to analyze them in an environment free of measurement errors on the meteorological and physiographic data, while preserving the physical coherence between the meteorology and the hydrology of this virtual world. It was shown that physical descriptors are not sufficient to predict the ability of a regionalization method to perform well, and that the uncertainties related to these measurements are less important than what is reported in the literature.
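One classical way to weight the simulated hydrographs of several models is inverse-MSE weighting, sketched below for illustration; this is a standard scheme, not necessarily the new method developed in the thesis, and the flow series are invented:

```python
def combine_hydrographs(simulations, observed):
    """Weight each model by 1/MSE on `observed`, then average the hydrographs.

    simulations: dict {model_name: [flow_t]}; observed: [flow_t].
    Returns (weights, combined_series). Assumes no model has zero error.
    """
    inv_mse = {}
    for name, sim in simulations.items():
        mse = sum((s - o) ** 2 for s, o in zip(sim, observed)) / len(observed)
        inv_mse[name] = 1.0 / mse
    total = sum(inv_mse.values())
    weights = {name: v / total for name, v in inv_mse.items()}
    combined = [sum(weights[n] * simulations[n][t] for n in simulations)
                for t in range(len(observed))]
    return weights, combined
```

Models with smaller calibration-period error receive larger weights, so the combined hydrograph leans toward the better-performing members while still damping their individual errors.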

    Scheduling of international projects with material and resource constraints

    With the globalization of markets, new business models have emerged and international companies have started using project management techniques. However, the management of international projects remains complex, as it involves several subcontractors and stakeholders and requires the transportation, supply and delivery of large amounts of construction equipment and materials. In that context, long material delivery times and storage capacity constraints may lead to project delays and budget overruns. Indeed, in spite of all the research efforts devoted to developing strong project management tools, project plans that take into account space and equipment availability for the execution of tasks are mostly developed manually on the basis of the planner's intuition and experience. Unfortunately, this task is nearly infeasible to perform in the case of large projects, due to the combinatorial nature of the resource allocation problem. In practice, methods and algorithms are only used for "traditional" project management, with a single small project and without any logistic constraints. This thesis addresses this issue by formulating the problem as a scheduling problem with limited resources, where resources can be either employees or storage areas, and by defining material delivery constraints. To that purpose, a random logistic constraint generator was developed in order to create the problem data and to tune the relative weight of the logistic constraints. The formulated problem is solved by a genetic algorithm, which determines a feasible project plan. This algorithm applies operators such as selection, crossover and mutation to a population of admissible solutions, improving the overall quality of the population over the iterations and gradually converging towards a local optimum.
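Genetic algorithms for project scheduling of this kind typically decode each chromosome, a precedence-feasible activity list, into a schedule with a serial schedule generation scheme. A minimal sketch under a single renewable resource, with an invented instance (the thesis additionally handles storage and delivery constraints, which are omitted here):

```python
def serial_sgs(order, dur, res, preds, capacity):
    """Schedule the activities in `order` at the earliest precedence- and
    resource-feasible start time. Returns ({activity: start}, makespan)."""
    start, usage = {}, {}   # usage[t] = resource units busy in period t
    for a in order:
        # earliest start allowed by precedence
        t = max((start[p] + dur[p] for p in preds[a]), default=0)
        # shift right until the resource profile can absorb the activity
        while any(usage.get(t + k, 0) + res[a] > capacity
                  for k in range(dur[a])):
            t += 1
        start[a] = t
        for k in range(dur[a]):
            usage[t + k] = usage.get(t + k, 0) + res[a]
    makespan = max(start[a] + dur[a] for a in order)
    return start, makespan
```

The GA then only has to evolve good activity orderings; selection, crossover and mutation operate on the lists, while the decoder guarantees every evaluated schedule is feasible.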

    Study of the impact of overlapping on the performance of complex engineering projects

    RÉSUMÉ : Le chevauchement d’activitĂ©s au sein d’un projet est une des techniques les plus rĂ©pandues pour accĂ©lĂ©rer l’exĂ©cution d’un projet. Le chevauchement d’activitĂ©s consiste Ă  autoriser des activitĂ©s qui traditionnellement s’exĂ©cutent de façon sĂ©quentielle Ă  se chevaucher de sorte que les activitĂ©s en aval dĂ©butent avant la fin des activitĂ©s en amont en se basant sur des informations partielles. Plusieurs stratĂ©gies d’exĂ©cution de projets appliquĂ©es dans la pratique, telles que l’ingĂ©nierie simultanĂ©e dans les projets de dĂ©veloppement de produits et la construction en « fast-tracking », reposent sur ce principe. Cette technique a montrĂ© ses preuves dans sa capacitĂ© Ă  rĂ©duire la durĂ©e de projets, avec cependant plusieurs inconvĂ©nients. Le chevauchement peut causer des retouches du travail exĂ©cutĂ© Ă  partir d’information prĂ©liminaire et mener Ă  des itĂ©rations. Ces retouches sont difficilement quantifiables et reprĂ©sentent une charge de travail et des coĂ»ts supplĂ©mentaires qui peuvent rĂ©duire ou annuler les bĂ©nĂ©fices du chevauchement. Cela soulĂšve la question de quand et de combien les activitĂ©s devraient ĂȘtre chevauchĂ©es dans les projets industriels. Dans la pratique, les gestionnaires de projets ne possĂšdent pas d’outils d’aide Ă  la dĂ©cision pour rĂ©pondre Ă  cette question. Cette thĂšse s’intĂ©resse ainsi au problĂšme d’ordonnancement de projet avec chevauchement d’activitĂ©s dans un contexte dĂ©terministe. Ce problĂšme cherche Ă  dĂ©terminer conjointement le meilleur calendrier en termes de coĂ»t ou de durĂ©e de projet et les dĂ©cisions de chevauchement, c’est-Ă -dire quelles activitĂ©s chevaucher et dans quelle mesure. Nous nous intĂ©ressons aux projets complexes caractĂ©risĂ©s par des contraintes de disponibilitĂ© de ressources, des rĂ©seaux complexes d’activitĂ©s, un nombre important d’activitĂ©s et de couples d’activitĂ©s qui peuvent se chevaucher. 
Les objectifs de cette thĂšse sont de modĂ©liser, quantifier et analyser, dans le cas de projets complexes d’ingĂ©nierie, l’impact des dĂ©cisions de chevauchement activitĂ©s sur les performances de projet (durĂ©e et coĂ»t), en considĂ©rant un modĂšle rĂ©aliste de chevauchement. Cette thĂšse vise aussi Ă  apporter une meilleure comprĂ©hension de ces choix dans les projets complexes et proposer des stratĂ©gies gĂ©nĂ©rales applicables en pratique. Les travaux rĂ©alisĂ©s dans le cadre de cette thĂšse sont articulĂ©s autour de trois articles publiĂ©s ou soumis Ă  des revues scientifiques. Le premier article intitulĂ© « Time-cost trade-offs in resource-constrained project scheduling problems with overlapping modes » (publiĂ© en 2014 dans International Journal of Project Organisation and Management) introduit un modĂšle de chevauchement d’activitĂ©s basĂ© sur des modes de chevauchement reliĂ©s aux jalons internes des activitĂ©s et permet de modĂ©liser de façon rĂ©aliste et flexible la relation entre durĂ©e de chevauchement et durĂ©e de retouche. Ce modĂšle est insĂ©rĂ© dans une modĂ©lisation du problĂšme de compromis durĂ©e-coĂ»t de l’ordonnancement de projets complexes avec contraintes de ressource et chevauchement d’activitĂ©s sous la forme d’un programme linĂ©aire en nombres entiers. Les durĂ©es de communication/coordination et de retouches sont considĂ©rĂ©es. Le problĂšme est rĂ©solu avec une mĂ©thode exacte pour un exemple virtuel de projet. Les rĂ©sultats illustrent les interactions entre le coĂ»t de projet, la durĂ©e du projet et les contraintes de ressource ainsi que leur influence sur le temps de rĂ©solution. 
Le second article intitulĂ© « A Path Relinking-based Scatter Search for the Resource-Constrained Project Scheduling Problem » (soumis dans European Journal of Operational Research) introduit une mĂ©taheuristique dans la famille des recherches dispersĂ©es (« scatter search ») pour rĂ©soudre le problĂšme standard RCPSP (« Resource-Constrained Project Scheduling Problem ») sans chevauchement. Cet algorithme utilise la mĂ©thode FBI (« Forward-Backward Improvement »), inverse la direction d’ordonnancement Ă  chaque itĂ©ration et est basĂ© sur deux mĂ©canismes novateurs. PremiĂšrement, un PR (« Path Relinking ») bidirectionnel avec un nouveau mouvement opĂ©rant sur les distances entre activitĂ©s est utilisĂ© comme mĂ©thode de combinaison des solutions. DeuxiĂšmement, une mĂ©thode d’amĂ©lioration est utilisĂ©e pour amĂ©liorer la qualitĂ© et la diversitĂ© des solutions de l’ensemble de rĂ©fĂ©rence. Une mĂ©thode avancĂ©e de paramĂ©trage de l’algorithme utilisant une mĂ©thode de recherche locale a Ă©tĂ© dĂ©veloppĂ©e pour dĂ©terminer les meilleures valeurs de ses paramĂštres. L’article montre que cette mĂ©taheuristique est capable de fournir des solutions de grande qualitĂ© avec des temps de calcul acceptables et appartient aux meilleures mĂ©thodes approchĂ©es existantes dans la littĂ©rature pour la rĂ©solution des instances virtuelles de projet de PSPLIB. Enfin, le troisiĂšme article intitulĂ© « Influence of the project characteristics on the efficiency of activity overlapping » (soumis dans Computers & Operations Research) a pour principales contributions de quantifier et d’analyser l’influence de huit caractĂ©ristiques de projets sur l’efficacitĂ© du chevauchement pour diminuer la durĂ©e de projet. La rĂ©duction de la durĂ©e de projet est obtenue en rĂ©solvant le problĂšme RCPSP avec et sans chevauchement. Deux mĂ©thodes de rĂ©solution sont dĂ©veloppĂ©es pour rĂ©soudre le problĂšme avec chevauchement. 
Une nouvelle mĂ©thode exacte basĂ©e sur un programme linĂ©aire en nombres entiers avec modes de chevauchement et des techniques de propagation de contraintes est dĂ©veloppĂ©e. La seconde mĂ©thode est une mĂ©taheuristique dĂ©rivĂ©e de la mĂ©taheuristique proposĂ©e dans le second article. Ces mĂ©thodes sont appliquĂ©es Ă  un bassin de 3888 instances virtuelles de projets de 30 Ă  120 activitĂ©s avec chevauchement. La premiĂšre observation est que le chevauchement n’apporte aucune rĂ©duction dans prĂšs de 25% des cas. Une analyse statistique permet de distinguer l’influence de caractĂ©ristiques de projets sur l’efficacitĂ© du chevauchement et de montrer que la proportion de couples d’activitĂ©s chevauchables qui sont sur le chemin critique et la sĂ©vĂ©ritĂ© des contraintes de ressources ont le plus d’influence sur la rĂ©duction de la durĂ©e de projet. Également, les rĂ©sultats indiquent que certains principes gĂ©nĂ©raux se dĂ©gagent pour les dĂ©cisions de chevauchement. La meilleure stratĂ©gie devrait consister Ă  chevaucher peu de couples d’activitĂ©s chevauchables et de les chevaucher beaucoup. De plus, mĂȘme si les activitĂ©s sur le chemin critique sont plus susceptibles d’ĂȘtre chevauchĂ©es, les dĂ©cisions de chevauchement dans un contexte de contraintes de ressource ne doivent pas uniquement ĂȘtre basĂ©es sur la criticalitĂ© des activitĂ©s. Enfin, ces observations sont confrontĂ©es aux stratĂ©gies pratiques de chevauchement proposĂ©es dans la littĂ©rature. Ces travaux visent Ă  contribuer au dĂ©veloppement d’outils pour assister les gestionnaires de projet dans leurs dĂ©cisions relatives au chevauchement d’activitĂ©s. Les principales contributions scientifiques de ces travaux sont les suivantes. PremiĂšrement, nous proposons une modĂ©lisation plus rĂ©aliste du problĂšme d’ordonnancement de projet avec chevauchement d’activitĂ©s. 
DeuxiĂšmement, une nouvelle mĂ©taheuristique de type recherche dispersĂ©e (« scatter search ») performante pour le problĂšme classique RCPSP est dĂ©veloppĂ©e. TroisiĂšmement, nous introduisons des mĂ©thodes de rĂ©solution exacte et approchĂ©e performantes pour le problĂšme d’ordonnancement de projet avec chevauchement d’activitĂ©s. La capacitĂ© des mĂ©thodes exactes Ă  rĂ©soudre des problĂšmes d’ordonnancement pour des projets de grande taille avec contraintes de ressource Ă©tant limitĂ©e, cette thĂšse prĂ©sente en effet Ă  notre connaissance la premiĂšre mĂ©thode approchĂ©e de type mĂ©taheuristique pour ce problĂšme. QuatriĂšmement, ces travaux quantifient et analysent l’effet de huit caractĂ©ristiques de projet sur l’efficacitĂ© du chevauchement d’activitĂ©s pour diminuer la durĂ©e d’un projet. Enfin, nous proposons des principes gĂ©nĂ©raux pour aide les praticiens Ă  prendre les meilleures dĂ©cisions de chevauchement d’activitĂ©s.----------ABSTRACT : Activity overlapping is one of the most employed strategies used to accelerate project execution. It consists in relaxing the sequential execution of dependent activities by allowing downstream activities to begin before receiving all the final information required from upstream activities. Several practical strategies, such as concurrent engineering and fast-tracking construction, are based on the concept of overlapping. Overlapping has been demonstrated to be powerful for reducing project makespan, but it has some drawbacks. Overlapping often causes additional reworks in downstream activities, as well as iterations of interdependent activities, that are difficult to quantify and represent additional workloads and costs. Such reworks may outweigh the benefices of overlapping in terms of cost and time. This raises the question of when and to which extent overlapping should be applied. 
In practice, project teams determine overlapping strategies on an ad hoc basis without always considering rework and interaction between activities. This thesis considers the project scheduling problem with activity overlapping in a deterministic context. This problem aims to jointly determine the best schedule in terms of cost and duration and the best overlapping decisions, namely which activities should be overlapped and to which extent. We focus on the complex projects characterized by constraints on resource availability, a complex network of activities, a large number of activities and a large number of couples of overlappable activities. The main objectives of this thesis are to model, quantify and analyze the impact of overlapping decisions on the project performances (cost and duration) in the case of complex industrial projects, by considering a realistic model of the overlapping process. This thesis also aims at providing a better understanding of these decisions in complex projects and at guiding planners in improving existing practices. The research undertaken in this thesis is divided into three papers published or submitted to international peer-reviewed scientific journals. The first paper titled « Time-cost trade-offs in resource-constrained project scheduling problems with overlapping modes » (published in 2014 in International Journal of Project Organisation and Management) proposes an overlapping process model based on overlapping modes related to activities’ internal milestones that is a realistic and flexible model of the relation between the amount of overlap and the amount of rework. This overlapping model is then enclosed in a model for the time-cost trade-offs in resource-constrained project scheduling problem with activity overlapping. The model is formulated as a linear integer programming model. The times and costs for communication/coordination and reworks are considered. 
The problem is solved with an exact method for an illustrative project instance. The results highlight the interactions between the project total cost, its makespan and the severity of the resource constraints and also show their influence on the computational time. The second paper titled « A Path Relinking-based Scatter Search for the Resource-Constrained Project Scheduling Problem » (submitted to European Journal of Operational Research) introduces a metaheuristic based on scatter search for solving the standard RCPSP (Resource-Constrained Project Scheduling Problem) without overlapping. This algorithm involves FBI (Forward-Backward Improvement), reversing the project network at each iteration and two new mechanisms. First, a bidirectional PR (path relinking method) with a new move is used as method for combining solutions. Second, a new improvement procedure is proposed in the reference set update method for enhancing the quality and the diversity of the reference set. An advanced parameter tuning method based on local search is employed. The paper shows that the proposed scatter search produces high-quality solutions in reasonable computational time and is among the best performing heuristic procedures in the literature for solving the instance of the PSPLIB benchmark.Finally, the main contributions of the third paper titled « Influence of the project characteristics on the efficiency of activity overlapping » (submitted to Computers & Operations Research) are to quantify and analyze the influence of eight project characteristics on the efficiency of activity overlapping for reducing project makespan. The reduction of the project makespan is obtained by solving the project scheduling problem with and without overlapping. Two methods have been developed for solving the problem with overlapping. First, we introduce a 0-1 integer linear programming model with overlapping modes and constraint propagation techniques as preprocessing. 
Second, we propose a metaheuristic based on the scatter search algorithm described in the second paper. These methods are applied to a set of 3888 project instances with overlapping, composed of 30 to 120 activities. The first finding is that no reduction of the makespan is observed in about 25% of the projects of the benchmark. A statistical analysis is conducted to measure the effect of eight project parameters on the makespan gain. It reveals that the proportion of pairs of overlappable activities on the critical path and the scarcity of resources have the highest influence on the makespan gain. In addition, general rules of thumb are derived from the analysis of the results. The best overlapping decisions consist in overlapping only a few pairs of overlappable activities, but with a large degree of overlap. Even though activities on the critical path are more likely to be overlapped, overlapping decisions should not rely solely on the criticality of the activities. The results are also compared to practical overlapping strategies proposed in the literature. The main scientific contributions of this work, both with respect to the literature and from the perspective of helping project managers choose the most appropriate overlapping decisions, can be summarized as follows. First, we propose a more realistic model of the project scheduling problem with activity overlapping. Second, a new competitive metaheuristic based on scatter search has been developed. Third, we propose competitive exact and heuristic methods for solving the project scheduling problem with activity overlapping. Indeed, since the capacity of exact methods to solve project scheduling problems for large-scale projects with resource constraints is limited, the metaheuristic developed in this thesis is, to our knowledge, the first in the literature for this kind of problem.
Fourth, this thesis quantifies and analyzes the influence of eight project characteristics on the efficiency of activity overlapping for reducing project makespan. Finally, the findings of this work provide a better understanding of overlapping decisions and should guide planners in making activity-overlapping decisions.
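As an illustration of the combination step at the heart of the scatter search, the following is a minimal, hypothetical Python sketch, not the thesis' exact operator: it relinks two activity lists by repositioning activities one at a time toward the guiding solution, and evaluates an order with a precedence-only serial schedule (resource constraints and feasibility repair are deliberately omitted).

```python
def path_relink(initiating, guiding):
    """Generate intermediate solutions by repositioning activities of the
    initiating activity list toward their positions in the guiding list."""
    current = list(initiating)
    path = []
    for i, target in enumerate(guiding):
        if current[i] != target:
            j = current.index(target)
            current.insert(i, current.pop(j))  # repositioning move
            path.append(list(current))        # record intermediate solution
    return path

def makespan(order, durations, preds):
    """Toy precedence-only serial schedule: each activity starts when its
    predecessors finish (a stand-in for a full RCPSP decoder)."""
    finish = {}
    for a in order:
        start = max((finish[p] for p in preds.get(a, [])), default=0)
        finish[a] = start + durations[a]
    return max(finish.values())
```

In a full scatter search, each intermediate list would be decoded under resource constraints, and the best intermediate would feed the reference-set update.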

    Metaheuristic and matheuristic approaches for multi-objective optimization problems in process engineering : application to the hydrogen supply chain design

    Get PDF
Complex optimization problems are ubiquitous in Process Systems Engineering (PSE) and are generally solved by deterministic approaches. The treatment of real case studies usually involves mixed-integer variables, nonlinear functions, a large number of constraints, and several conflicting criteria to be optimized simultaneously, which challenges classical methods. The main motivation of this research is therefore to explore alternative solution methods for these complex multiobjective optimization problems in PSE, focusing on recent advances in Evolutionary Computation. Although multiobjective evolutionary algorithms (MOEAs) have proven robust for solving multiobjective problems, their performance still depends strongly on the constraint-handling technique when problems are highly constrained. The core innovation of this research is the adaptation of metaheuristic-based tools to this class of PSE problems. For this purpose, a two-stage strategy was developed. First, an empirical study was performed to compare different algorithmic configurations and select the best one for providing a high-quality approximation of the Pareto front. This study, covering both academic test problems and several PSE applications, demonstrated that a method using a gradient-based mechanism to repair infeasible solutions consistently obtains the best results, in particular for handling equality constraints. Capitalizing on this preliminary numerical investigation, a novel matheuristic solution strategy was then developed and adapted to the Hydrogen Supply Chain (HSC) design problem, which exhibits the aforementioned numerical difficulties and considers both economic and environmental criteria. A MOEA based on decomposition, combined with the gradient-based repair, was first explored as a solution technique.
However, due to the large number of mass balances (equality constraints), this approach showed poor convergence to the optimal Pareto front. A novel matheuristic was therefore developed and adapted to this problem, following a bilevel decomposition: the upper level (discrete) addresses the HSC structure design problem (facility sizing and location), whereas the lower level (a Linear Programming problem) solves the corresponding operation subproblem (production and transportation). This strategy allows the development of an ad hoc matheuristic solution technique through the hybridization of a MOEA (upper level) with an LP solver (lower level), using a scalarizing function to deal with the two objectives considered. The numerical results obtained for the Occitanie region case study show that the hybrid approach produces an accurate approximation of the optimal Pareto front, more efficiently than exact solution methods. Finally, the matheuristic made it possible to study the HSC design problem under more realistic assumptions regarding the technologies used for hydrogen synthesis, the learning rates capturing the increasing maturity of these technologies over time, and nonlinear relationships for the computation of Capital and Operational Expenditures (CAPEX and OPEX) of the hydrogen production facilities. The resulting model, a non-convex, bi-objective mixed-integer nonlinear programming (MINLP) formulation, can be solved efficiently through minor modifications of the hybrid algorithm, whose main purpose is to determine the timewise deployment of sustainable hydrogen supply chains.
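To make the bi-objective hybridization concrete, here is a minimal Python sketch, under invented toy objectives, of two ingredients the abstract mentions: a weighted-sum scalarizing function that turns a (cost, GWP) pair into a single lower-level objective, and a filter that keeps only the non-dominated points collected by the upper level. Function names and the normalization scheme are illustrative, not the thesis' implementation.

```python
def scalarize(cost, gwp, w, norm=(1.0, 1.0)):
    """Weighted-sum scalarizing function over normalized objectives;
    w in [0, 1] trades total cost against global warming potential."""
    return w * cost / norm[0] + (1.0 - w) * gwp / norm[1]

def nondominated(points):
    """Keep the (cost, gwp) points that no other point dominates
    (both objectives are minimized)."""
    return [p for p in points
            if not any(q[0] <= p[0] and q[1] <= p[1] and q != p
                       for q in points)]
```

Sweeping w over a grid and solving the lower-level LP for each weight yields candidate designs; the non-dominated filter then extracts the approximate Pareto front from them.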

    The significance of silence. Long gaps attenuate the preference for ‘yes’ responses in conversation.

    Get PDF
In conversation, negative responses to invitations, requests, offers and the like more often occur with a delay; conversation analysts describe them as dispreferred. Here we examine the contrasting cognitive load that ‘yes’ and ‘no’ responses impose when given relatively fast (300 ms) or delayed (1000 ms). Participants heard mini-dialogues, with turns extracted from a spoken corpus, while their EEG was recorded. We find that a fast ‘no’ evokes an N400 effect relative to a fast ‘yes’, whereas this contrast is absent for delayed responses. This shows that an immediate response is expected to be positive, but that this expectation disappears as the response time lengthens, because in ordinary conversation the probability of a ‘no’ has by then increased. Additionally, ‘no’ responses elicit a late frontal positivity both when they are fast and when they are delayed. Thus, regardless of response latency, a ‘no’ is associated with a late positivity, since a negative response is always dispreferred and may require an account. Together these results show that negative responses to social actions exact a higher cognitive load, especially when least expected, as an immediate response.

    Multi-channel Communication in Wireless Networks

    Get PDF
Multi-channel communication has been developed to overcome limitations in throughput and delivery rate, which matter for applications that require sufficient bandwidth to transmit large amounts of data in Wireless Networks (WNs), such as multimedia communication. However, the frequent negotiation required by the channel assignment process incurs large communication overhead and collisions, which reduces both communication quality and network lifetime. This effect can play an important role in the performance deterioration of certain WN types, especially Wireless Sensor Networks (WSNs), which are characterized by their limited resources. This work addresses the improvement of communication in multi-channel WSNs; four protocols are proposed. The first is the Multi-Channel Scheduling Protocol (MCSP) for IEEE 802.15.4 wireless personal networks, which tackles the collision problem through a multi-channel scheduling scheme. The second is the Energy-efficient Reinforcement Learning (RL) Multi-channel MAC (ERL MMAC) for WSNs, which reduces energy consumption by limiting collisions and balancing the remaining energy among nodes using single-agent RL. The third is a new heuristically accelerated RL protocol, named Heuristically Accelerated Reinforcement Learning approach for Channel Assignment (HARL CA), which reduces the number of learning iterations in an energy-efficient way while taking the bandwidth aspect into account in the scheduling process. Finally, the fourth contribution is a new cooperative multi-agent RL approach for Channel Assignment (CRLCA) in WSNs, which improves cooperative learning using an accelerated learning model and overcomes the extra communication overhead of cooperative RL with a new method for self-scheduling and energy balancing.
The proposed approach is implemented through two algorithms, SCRLCA and DCRLCA, for static and dynamic settings respectively. The proposed protocols and techniques have been evaluated through several experiments and outperform existing approaches in different cases.
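As an illustration of the single-agent RL idea behind protocols such as ERL MMAC, the following is a minimal, hypothetical Python sketch, not the protocol itself: a stateless Q-learning agent learns to prefer collision-free channels. The channel occupancy, reward definition, and hyperparameters are all invented for the example.

```python
import random

def q_learn_channels(n_channels, busy, episodes=2000, alpha=0.3,
                     gamma=0.0, eps=0.1, seed=1):
    """Toy single-agent Q-learning for channel selection: reward 1 when
    transmitting on a free channel, 0 on a busy one. One-state (bandit)
    formulation, epsilon-greedy action selection."""
    rng = random.Random(seed)
    q = [0.0] * n_channels
    for _ in range(episodes):
        # epsilon-greedy: explore a random channel or exploit the best one
        a = rng.randrange(n_channels) if rng.random() < eps else q.index(max(q))
        r = 0.0 if a in busy else 1.0
        q[a] += alpha * (r + gamma * max(q) - q[a])  # Q-learning update
    return q
```

In the thesis' protocols this basic loop is extended with energy balancing, heuristic acceleration (HARL CA), and multi-agent cooperation (CRLCA).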

    Optimisation de la configuration d'un instrument superspectral aéroporté pour la classification : application au milieu urbain

    Get PDF
This work was performed in the context of a possible enrichment of land cover databases. The description of land cover makes it possible to produce environmental indicators for the management of ecosystems and territories, in response to various societal and scientific needs. Thus, different land cover databases already exist at various levels (global, European, national, regional or local) or are currently being produced. However, it appeared that knowledge about land cover should be more detailed in urban areas, since it is required by several city modeling applications (micro-meteorological, hydrological, or pollution-monitoring simulators) and by public regulation monitoring (e.g. concerning ground perviousness). Such material maps would be finer, both semantically and spatially, than what existing land cover databases contain. They could therefore form an additional layer, both in land cover databases (such as the IGN High Resolution land cover database) and in 3D city models. No existing database contains such information about urban materials, so remote sensing is the only way to produce it. However, due to the high heterogeneity of urban materials, their variability, and the strong similarities between different material classes, usual optical multispectral sensors (with only the four red, green, blue and near-infrared bands) are not sufficient to discriminate materials well. A superspectral sensor, that is to say a spectrally richer one, could provide a solution to this limit. This work was therefore performed with the design of such a sensor in mind. It aimed at identifying the best spectral configuration for the classification of urban materials, or at least at proposing near-optimal solutions. In other words, a spectral optimization was carried out to optimize both the position of the bands in the spectrum and their width. Automatic feature selection methods were used.
This work was performed in two steps. A first task aimed at defining the spectral optimization methods and at validating them on reference data sets from the literature. Two state-of-the-art optimization heuristics (Sequential Forward Floating Search and genetic algorithms) were chosen for their genericity and flexibility, and therefore their ability to optimize different feature selection criteria. A benchmark of different scores measuring the relevance of a set of features was performed to decide which score to optimize during the band selection process. Band width optimization was then studied: the proposed method consists in building a hierarchy of bands merged according to their similarities, band selection then being processed within this hierarchy. The second part of the work consisted in applying these spectral optimization algorithms to the case study of urban materials. A collection of urban material spectra was first gathered from various spectral libraries (ASTER, MEMOIRES, ...). Spectral optimization was then performed on this dataset. A limited number (about 10) of well-chosen bands appeared to be sufficient to classify nine common materials (slate, asphalt, cement, gravel, metal, cobblestones, shingle, earth, tiles). Bands from the short-wave infrared domain (1400-2500 nm) were confirmed to be very useful for discriminating urban materials. However, the quantitative results assessing confusions between materials must be considered carefully, since some materials are very uncommon in the library of collected spectra, so their possible variability is not completely covered.
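The Sequential Forward Floating Search used for band selection can be sketched as follows: a minimal Python version assuming an arbitrary subset-scoring function, with the usual safeguard that the band just added is not immediately removed. The exact relevance scores benchmarked in the thesis are not reproduced here.

```python
def sffs(candidates, score, k):
    """Sequential Forward Floating Search: greedily add the best band,
    then conditionally remove bands while removal improves the score."""
    selected = []
    while len(selected) < k:
        # forward step: add the band that maximizes the subset score
        best = max((b for b in candidates if b not in selected),
                   key=lambda b: score(selected + [b]))
        selected.append(best)
        # floating (backward) step: drop any band other than the one
        # just added if the reduced subset scores strictly higher
        improved = True
        while improved and len(selected) > 2:
            improved = False
            for b in list(selected):
                if b == best:
                    continue
                reduced = [x for x in selected if x != b]
                if score(reduced) > score(selected):
                    selected = reduced
                    improved = True
    return selected
```

Run within the hierarchy of merged bands described above, the same routine can optimize band positions and widths jointly.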