58 research outputs found

    La nouvelle Commission québécoise des relations du travail (1988)

    Get PDF
    After summarizing the reasons invoked by the Quebec Minister of Labour in support of Bill 30 and the objectives it pursued, the authors analyse the institutional changes introduced by this law, specify the role and mandate of the Commission, and examine in detail its functioning and powers.

    In this article the authors first summarize the reasons put forward by the Quebec Minister of Labour to justify Bill 30 and what the amendments to the Labour Code are intended to accomplish. They then analyse the changes brought about by the new law in the decision-making bodies: they discuss those bodies which will be abolished and the transfer of jurisdiction to other tribunals, including the new Labour Relations Board which will be created. The authors next evaluate the role and mandate of the new Board and draw a distinction between those areas where the Board alone will have jurisdiction and those where its role will be concurrent with another body. There is particular emphasis on unfair labour practice complaints, since the Board will have exclusive jurisdiction over a whole new range of such complaints.

    The major part of the article consists of a detailed study of the functions and powers of the new Board. Here the authors begin by reviewing how the Board becomes seized of a matter: who has standing to file applications and be heard, what the time limits are, and what procedural requirements must be met. Once seized of the matter but prior to the hearing, the Board is called upon to perform two distinct functions: investigation and mediation. In this context, the authors attempt to answer these questions: Who will act on the Board's behalf to investigate the facts, and what are the precise powers of the investigator? Who will act on the Board's behalf to mediate between the parties, and at what point is such mediation likely to occur? The hearing itself is then considered, as well as the particular problem of who can act as a decision maker. Also discussed is the legality of the provision of the new law whereby a hearing does not have to be held and can be dispensed with by the expedient of allowing each party the opportunity to make representations without the possibility of cross-examining the opposite party. The new law allows the Board to examine matters «according to the mode of proof it deems appropriate», and the precise meaning of this phrase is discussed. The Board's remedial powers come under close scrutiny in the article, especially the power to order injunctive-type relief in the form of cease and desist orders. It is pointed out that the Board may refuse to intervene in certain cases and that it may take cognizance of an undertaking to conform to the Code instead of rendering a decision. Since the Board must give written reasons for any decision which terminates a matter, consideration is given to how the expressions «reasons» and «terminates a matter» are likely to be interpreted. The Board has the power to revoke and review its own decisions, and the circumstances giving rise to such a procedure are examined. Finally, the authors look at the manner in which the Board's decisions become enforceable.

    By way of conclusion, the authors list the additional areas in which the Board will have the power to intervene and consider the consequences of such increased intervention by the Board. They also indicate how they believe employers and unions will have to adjust and change some long-standing practices. The article ends with a warning of the problems that could arise if Bill 30 is not implemented and applied with good judgment and insight.

    Process intensification education contributes to sustainable development goals: Part 2

    Get PDF
    Achieving the United Nations sustainable development goals requires industry and society to develop tools and processes that work at all scales, enabling goods delivery, services, and technology to reach both large conglomerates and remote regions. Process Intensification (PI) is a technological advance that promises to deliver means to reach these goals, but higher education has yet to fully embrace the programme. Here, we present practical examples of how to better teach the principles of PI in the context of Bloom's taxonomy and summarise the current industrial use of and future demands for PI, as a continuation of the topics discussed in Part 1. In the appendices, we provide details on the existing PI courses around the world, as well as teaching activities showcased during these courses to aid students' lifelong learning. The increasing number of successful commercial cases of PI highlights the importance of PI education for both students in academia and industrial staff.

    We acknowledge the sponsors of the Lorentz workshop on "Educating in PI": the MESA+ Institute of the University of Twente, Sonics and Materials (USA) and the PIN-NL Dutch Process Intensification Network. DFR acknowledges support by The Netherlands Centre for Multiscale Catalytic Energy Conversion (MCEC), an NWO Gravitation programme funded by the Ministry of Education, Culture and Science of the government of The Netherlands. NA acknowledges the Deutsche Forschungsgemeinschaft (DFG) - TRR 63 "Integrierte Chemische Prozesse in flüssigen Mehrphasensystemen" (Teilprojekt A10) - 56091768. The participation by Robert Weber in the workshop and this report was supported by Laboratory Directed Research and Development funding at Pacific Northwest National Laboratory (PNNL). PNNL is a multiprogram national laboratory operated for the US Department of Energy by Battelle under contract DE-AC05-76RL0183

    Efficient Production of HIV-1 Virus-Like Particles from a Mammalian Expression Vector Requires the N-Terminal Capsid Domain

    Get PDF
    It is now well accepted that the structural protein Pr55Gag is sufficient by itself to produce HIV-1 virus-like particles (VLPs). This polyprotein precursor contains several domains: matrix, capsid, SP1, nucleocapsid, SP2 and p6. In the present study, we sought to determine by mutagenesis which region(s) is essential to the production of VLPs when Pr55Gag is inserted in a mammalian expression vector, which allows the protein of interest to be studied in the absence of other viral proteins. To do so, we first studied a minimal Pr55Gag sequence called Gag min, used in previous studies. We found that Gag min fails to produce VLPs when expressed from an expression vector instead of within a molecular clone. This failure occurs early, at the stage of viral protein assembly in the cell. We then generated a series of deletion and substitution mutants, and examined their ability to produce VLPs by combining biochemical and microscopic approaches. We demonstrate that the matrix region is not necessary, but that the efficiency of VLP production depends strongly on the presence of its basic region. Moreover, the presence of the N-terminal domain of capsid is required for VLP production when Gag is expressed alone. These findings, combined with previous observations indicating that HIV-1 Pr55Gag-derived VLPs act as potent stimulators of innate and acquired immunity, make this strategy worth considering for vaccine development.

    In-Datacenter Performance Analysis of a Tensor Processing Unit

    Full text link
    Many architects believe that major improvements in cost-energy-performance must now come from domain-specific hardware. This paper evaluates a custom ASIC---called a Tensor Processing Unit (TPU)---deployed in datacenters since 2015 that accelerates the inference phase of neural networks (NN). The heart of the TPU is a 65,536 8-bit MAC matrix multiply unit that offers a peak throughput of 92 TeraOps/second (TOPS) and a large (28 MiB) software-managed on-chip memory. The TPU's deterministic execution model is a better match to the 99th-percentile response-time requirement of our NN applications than are the time-varying optimizations of CPUs and GPUs (caches, out-of-order execution, multithreading, multiprocessing, prefetching, ...) that help average throughput more than guaranteed latency. The lack of such features helps explain why, despite having myriad MACs and a big memory, the TPU is relatively small and low power. We compare the TPU to a server-class Intel Haswell CPU and an Nvidia K80 GPU, which are contemporaries deployed in the same datacenters. Our workload, written in the high-level TensorFlow framework, uses production NN applications (MLPs, CNNs, and LSTMs) that represent 95% of our datacenters' NN inference demand. Despite low utilization for some applications, the TPU is on average about 15X - 30X faster than its contemporary GPU or CPU, with TOPS/Watt about 30X - 80X higher. Moreover, using the GPU's GDDR5 memory in the TPU would triple achieved TOPS and raise TOPS/Watt to nearly 70X the GPU and 200X the CPU.

    Comment: 17 pages, 11 figures, 8 tables. To appear at the 44th International Symposium on Computer Architecture (ISCA), Toronto, Canada, June 24-28, 201
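The quoted 92 TOPS peak follows directly from the dimensions of the matrix unit. A back-of-envelope check (assuming the 700 MHz clock reported in the paper, and counting each MAC as two operations, one multiply plus one add):

```python
# Back-of-envelope check of the TPU's quoted 92 TOPS peak throughput.
# Assumption: 700 MHz clock, as reported in the paper; each 8-bit MAC
# counts as two operations (multiply + accumulate).
MACS = 256 * 256      # 65,536 MACs in the systolic matrix multiply unit
OPS_PER_MAC = 2       # multiply + add
CLOCK_HZ = 700e6      # assumed clock frequency

peak_ops_per_s = MACS * OPS_PER_MAC * CLOCK_HZ
print(f"peak throughput: {peak_ops_per_s / 1e12:.1f} TOPS")  # ~91.8, i.e. the quoted 92 TOPS
```

The same style of calculation explains the GDDR5 claim: the TPU's achieved throughput was memory-bandwidth-bound for several workloads, so a faster memory would raise achieved (not peak) TOPS.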

    Process intensification education contributes to sustainable development goals : part 1

    No full text
    In 2015 all the United Nations (UN) member states adopted 17 sustainable development goals (UN-SDG) as part of the 2030 Agenda, which is a 15-year plan to meet ambitious targets to eradicate poverty, protect the environment, and improve the quality of life around the world. Although the global community has progressed, the pace of implementation must accelerate to reach the UN-SDG time-line. For this to happen, professionals, institutions, companies, governments and the general public must become cognizant of the challenges that our world faces and the potential technological solutions at hand, including those provided by chemical engineering. Process intensification (PI) is a recent engineering approach with demonstrated potential to significantly improve process efficiency and safety while reducing cost. It offers opportunities for attaining the UN-SDG goals in a cost-effective and timely manner. However, the pedagogical tools to educate undergraduate, graduate students, and professionals active in the field of PI lack clarity and focus. This paper sets out the state-of-the-art, main discussion points and guidelines for enhanced PI teaching, deliberated by experts in PI with either an academic or industrial background, as well as representatives from government and specialists in pedagogy gathered at the Lorentz Center (Leiden, The Netherlands) in June 2019 with the aim of uniting the efforts on education in PI and produce guidelines. In this Part 1, we discuss the societal and industrial needs for an educational strategy in the framework of PI. The terminology and background information on PI, related to educational implementation in industry and academia, are provided as a preamble to Part 2, which presents practical examples that will help educating on Process Intensification

    Compared fixation and survival of 280 lateralised vs 527 standard cementless stems after two years (1-7)

    No full text
    Restoring the native hip anatomy increases hip prosthesis survival, whereas increased femoral lateralisation creates high torque stresses that may alter prosthesis fixation. After finding lucent lines around cementless lateralised stems (Corail™, DePuy Synthes, St Priest, France) in several patients, we evaluated the effects of lateralisation in a large case series. The objective of our study was to compare lateralised vs standard stems of identical design in terms of radiological osteo-integration and survival. HYPOTHESIS: Lateralised stems, despite being used only when indicated by the anatomical parameters, carry a higher risk of impaired osteo-integration. MATERIALS AND METHODS: A retrospective study was conducted in 807 primary total hip arthroplasties (THAs) performed between 2006 and 2010 in 798 patients with a mean age of 65 ± 14.2 years. Lateralised stems were used in 280 cases (Corail High Offset KHO, n = 169; and Corail coxa vara KLA, n = 111) and standard stems in 527 cases (Corail KA). Mean follow-up was 2.3 years (range, 1-7 years). The clinical evaluation included determination of the Postel-Merle d'Aubigné (PMA) score. Bone fixation and stability of the implants were assessed by determining the Engh and Massin score and the ARA score on the radiographs at last follow-up. Femoral, acetabular and global offset values were determined before and after THA. Noble's Canal Flare Index was computed. Survival was estimated using the Kaplan-Meier method with surgical revision for aseptic loosening as the endpoint. RESULTS: The PMA score improved from 12 (10-15) pre-operatively to 17.7 (14-18) (P < 0.05). After THA, in the lateralised stem group, femoral offset was restored in 217 (77%) hips and the mean change vs the pre-operative offset value was -2 mm; in the standard stem group, femoral offset was restored in 440 (83.5%) hips and the mean change was +1 mm. The Engh and Massin score values were similar in the standard stem and lateralised stem groups (24.4 ± 2.2 and 22.6 ± 2.4, respectively, NS). Revision for aseptic loosening was required in 5 patients with lateralised stems (3 KHO and 2 KLA) versus none of the patients with standard stems. There were no cases of excessive femoral offset, and the mean change in offset was -2.3 mm (-5.3 to -1.1). Noble's index was increased (4.27 ± 0.5 for the loosened lateralised stems, 3.65 ± 0.8 for the well-fixed lateralised stems and 3.82 ± 0.6 for the standard stems), with no significant difference across groups. Overall survival after 3.5 years of follow-up was 94.6% (95% confidence interval, 88.4-100%) with lateralised stems and 100% with standard stems (P < 0.05). DISCUSSION: The risk of aseptic loosening was significantly higher with the lateralised stem (5/280, 1.8%) than with the standard stem (n = 0). Our findings indicate a need for careful preparation to obtain primary fixation of lateralised stems. LEVEL OF EVIDENCE: III, retrospective case-control study.
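The survival figures above come from the Kaplan-Meier method the authors cite, which handles the study's unequal follow-up times (1-7 years) by treating unrevised implants as censored. A minimal sketch of the estimator on hypothetical follow-up data (the abstract does not give individual-level data):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimator.

    times:  follow-up time for each implant (any comparable unit)
    events: 1 if the endpoint (revision for aseptic loosening) occurred
            at that time, 0 if the implant was censored (still in place).
    Returns a list of (event_time, survival_probability) steps.
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    survival = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = removed = 0
        # group all subjects with the same follow-up time
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            removed += 1
            i += 1
        if deaths:
            # multiply by the conditional survival at this event time
            survival *= (at_risk - deaths) / at_risk
            curve.append((t, survival))
        at_risk -= removed
    return curve

# Hypothetical illustration only, not the study's data: three revisions
# among ten implants with staggered censoring.
times  = [1, 2, 2, 3, 3, 4, 5, 5, 6, 7]
events = [0, 1, 0, 1, 0, 0, 1, 0, 0, 0]
print(kaplan_meier(times, events))
```

Comparing the two groups' curves then calls for a log-rank test, which the abstract's P < 0.05 for survival presumably reflects.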

    Nosespace PTR-MS analysis with simultaneous TDS or TCATA sensory evaluation: Release and perception of the aroma of dark chocolates differing in sensory properties

    No full text
    Perception of flavor is a dynamic process during which the concentration of aroma molecules at the olfactory epithelium varies with time, as they are released progressively from the food in the mouth during consumption. However, how the various components combine to produce a sensory impression is still not completely understood. Real-time mass spectrometry (MS) techniques that measure aroma compounds directly in the nose (nosespace) aim at obtaining data patterns that reflect the way aromas are released in real time during food consumption. These patterns are assumed to be representative of the retronasal stimuli perceived. Real-time sensory methods, such as time-intensity or the more recent Temporal Dominance of Sensations (TDS) and Temporal Check All That Apply (TCATA) procedures, are used to account for the dynamic, time-related aspects of flavor perception. Combined, preferably simultaneously in a fully real-time in vivo approach, the chemical and sensory methods should provide fundamental results for better understanding the link between aroma release and aroma perception. This lecture presents an overview of the advances made in combining real-time nosespace analysis with simultaneous temporal sensory evaluation. A statistical procedure for jointly analysing the two sets of data is presented and discussed. These advances are illustrated through the study of the flavor of three dark chocolates differing in sensory properties, analyzed by a panel of 16 assessors in nosespace with a PTR-ToF-MS instrument and simultaneous TDS or TCATA sensory evaluation. Results obtained with TDS and TCATA are compared.