71 research outputs found

    Assessment of dosimetric errors induced by deformable image registration methods in 4D pencil beam scanned proton treatment planning for liver tumours

    PURPOSE: Respiratory impacts in pencil beam scanned proton therapy (PBS-PT) are accounted for by extensive 4D dose calculations, where deformable image registration (DIR) is necessary for estimating deformation vector fields (DVFs). We aim here to evaluate the dosimetric errors induced by different DIR algorithms in their resulting 4D dose calculations, using ground-truth (GT) DVFs from 4DMRI. MATERIALS AND METHODS: Six DIR methods (ANACONDA, Morfeus, B-splines, Demons, CT Deformable, and Total Variation) were applied to nine 4DCT-MRI liver data sets. The derived DVFs were then used as input for 4D dose calculation. The DIR-induced dosimetric error was assessed by individually comparing the resultant 4D dose distributions to those obtained with GT-DVFs. Both single- and three-field plans and single and rescanned strategies were investigated. RESULTS: Differences in 4D dose distributions among the DIR algorithms, and relative to the results using GT-DVFs, were pronounced. Up to 40% of clinically relevant dose calculation points showed dose differences of 10% or more compared with the GT. Differences in V95(CTV) reached up to 11.34 ± 12.57%. The dosimetric errors generally became less substantial when applying multiple-field plans or using rescanning. CONCLUSION: Intrinsic geometric errors from DIR can influence the clinical evaluation of liver 4D PBS-PT plans. We recommend the use of an error bar for correctly interpreting individual 4D dose distributions.
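    The V95(CTV) metric the abstract compares is the percentage of target voxels receiving at least 95% of the prescribed dose. The following is a hypothetical illustration with made-up dose arrays, not study data, showing how a difference in V95 between a GT and a DIR-based dose distribution can be computed:

```python
# Illustrative sketch (invented numbers): the V95 metric is the percent
# of voxels whose dose is at least 95% of the prescription dose.
import numpy as np

def v95(dose, prescription):
    """Percent of voxels with dose >= 95% of the prescription dose."""
    return 100.0 * np.mean(dose >= 0.95 * prescription)

rng = np.random.default_rng(0)
prescription = 2.0                                   # Gy, illustrative
dose_gt = rng.normal(2.0, 0.05, size=10_000)         # "ground-truth" 4D dose
dose_dir = dose_gt + rng.normal(0.0, 0.05, 10_000)   # DIR-perturbed dose

# DIR-induced error on the V95 metric, in percentage points
delta = abs(v95(dose_dir, prescription) - v95(dose_gt, prescription))
```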

    Levels and enantiomeric signatures of methyl sulfonyl PCB and DDE metabolites in livers of harbor porpoises (Phocoena phocoena) from the southern North Sea

    The concentrations of 26 methyl sulfonyl metabolites of polychlorinated biphenyls (MeSO2-PCBs) and of p,p'-DDE (MeSO2-DDE) were determined in 19 liver samples from harbor porpoises (Phocoena phocoena) stranded between 1997 and 2000 on the Belgian and French North Sea coasts. The total concentration of MeSO2-PCBs ranged from 39 to 4221 ng/g lipid weight (lw) and was generally higher in adults (age >2 yr, range 969-4221 ng/g lw) than in juveniles (age <2 yr). A strong deviation from racemic proportions (enantiomeric fraction EF > 0.73 or EF < 0.23) for the measured chiral MeSO2-PCB congeners was found in all samples. This result may suggest that one atropisomer is preferentially formed in harbor porpoises or that the atropisomers are retained in a highly selective manner.

    Studies on organochlorine environmental contaminants with emphasis on analytical methods and occurrence in humans

    Methods for multicomponent analysis of lipophilic organochlorine contaminants (OCs) in blood plasma and lipid-rich tissues are presented. By using the described methods, polychlorinated biphenyls (PCB), polychlorinated naphthalenes (PCN), hexachlorobenzene (HCB), 1,1-dichloro-2,2-bis(4-chlorophenyl)ethene (p,p'-DDE) and methylsulphonyl metabolites of PCB (MeSO2-PCB) and p,p'-DDE (MeSO2-DDE) were determined in human blood plasma, adipose tissue and liver. In addition, polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/F) were determined in selected samples of blood plasma. The profiles and levels of PCB, PCN, HCB and p,p'-DDE were similar in adipose tissue and liver, while differences were found for MeSO2-PCB and MeSO2-DDE. The concentration (on a lipid weight basis) of the metabolites was about 10 times higher in liver than in adipose tissue. Furthermore, meta-substituted isomers of MeSO2-PCB were selectively retained in the liver. The most abundant methylsulphonyl metabolite in the liver was 3-MeSO2-2,2',3',4',5,6-hexaCB, followed by 3-MeSO2-DDE. In adipose tissue, 3-MeSO2-DDE was the most abundant metabolite. PCB, HCB, p,p'-DDE, MeSO2-PCB and MeSO2-DDE were determined in whole plasma and in the fractions obtained after separation of the plasma into very low density lipoprotein (VLDL), low density lipoprotein (LDL), high density lipoprotein (HDL) and lipoprotein-depleted (LPDP) fractions. The OCs were present in all fractions, but predominantly in the LPDP fraction. A tendency toward greater association of meta-substituted MeSO2-PCB compounds with the LPDP fraction was noted. Among the lipoprotein fractions, LDL was the main carrier of PCB, HCB and p,p'-DDE. 3-MeSO2-DDE was predominantly found in the HDL fraction, and MeSO2-PCB was distributed about equally between the LDL and HDL fractions. A local source of contamination with PCB and PCN was traced to old electronic equipment that was temporarily stored next to the laboratory. It was shown that PCB and PCN may be emitted from electronics and pollute the indoor air. However, occupational exposure to PCB, PCN, PCDD/F and/or HCB could not be confirmed in a pilot study of plasma from persons who had worked with cable incineration or electronic equipment. ISBN 91-628-2659-

    Global Shape Description of Digital Objects

    New methods for global shape description of three-dimensional digital objects are presented. The shape of an object is first represented by a digital surface where the faces are either triangles or quadrilaterals. Techniques for computing a high-quality parameterization of the surface are developed, and this parameterization is used to approximate the shape of the object. Spherical harmonics are used as basis functions for approximations of the coordinate functions. Information about the global shape is then captured by the coefficients in the spherical harmonics expansions. For a star-shaped object it is shown how a parameterization can be computed by a projection from its surface onto the unit sphere. An algorithm for computing the position at which the centre of the sphere should be placed is presented. This algorithm is suited for digital voxel objects. Most of the work is concerned with digital objects whose surfaces are homeomorphic to the sphere. The standard method for computing parameterizations of such surfaces is shown to fail on many objects. This is due to the large distortions of the geometric properties of the surface that often occur with this method. Algorithms to handle this problem are suggested. Non-linear optimization methods are used to find a mapping between a surface and the sphere that minimizes geometric distortion and is useful as a parameterization of the surface. The methods can be applied, for example, in medical imaging for shape recognition, detection of shape deformations and shape comparisons of three-dimensional objects.

    Implementation of a Distributed Solution in the Message Broker LavinMQ

    Today, applications are often designed with a modular approach, dividing functionality into microservices rather than relying on a monolithic structure. This requires solutions for decoupled message exchange throughout the distributed system, which can be achieved by implementing a message broker. In some cases it is interesting to make the message broker itself a distributed system, which has been a development path for systems such as RabbitMQ, Apache Kafka, and others. This thesis seeks to understand the alternative approaches to implementing a distributed solution for the message broker LavinMQ. To find out which algorithm would be the most suitable for this purpose, a comparative analysis of the most common alternatives was performed, based on LavinMQ's requirements and a literature review of related comparisons. The results showed that Raft would be the best choice due to its simple but effective nature. To further investigate the consensus approach in LavinMQ, a Raft prototype was developed in the programming language Crystal. The prototype was then evaluated for correctness and for performance in terms of mean replication time and mean election time. The prototype passed the correctness tests, showing that it achieves correctness according to the LavinMQ standard. The election-time results show that the prototype recovers from a leader failure in 216 ms on average and that the most effective range for the heartbeat timeout is 150 to 300 ms. The replication-time results show that the mean replication time is 84.45 ms and that the most efficient interval for message replication is 0.13 ms. The performance results are consistent with those of related work but indicate overall slower performance, suggesting that additional features and optimizations need to be implemented before the prototype can be considered for practical use.
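    The election-timeout mechanism measured above is the heart of Raft's leader failover: a follower that hears no heartbeat within a randomized timeout starts an election. The sketch below is a hypothetical illustration in Python (the thesis prototype is written in Crystal), with the timeout drawn from the 150-300 ms range the abstract reports as most effective; class and method names are invented, not LavinMQ code.

```python
# Hypothetical sketch of Raft's election-timeout logic: a follower turns
# candidate when no heartbeat arrives before its randomized deadline.
import random
import time

class RaftNode:
    def __init__(self, timeout_range=(0.150, 0.300)):  # seconds
        self.term = 0
        self.state = "follower"
        self.timeout_range = timeout_range
        self.reset_election_timer()

    def reset_election_timer(self):
        # Randomization makes simultaneous candidacies (split votes) unlikely.
        self.deadline = time.monotonic() + random.uniform(*self.timeout_range)

    def on_heartbeat(self, leader_term):
        # A heartbeat from a current or newer leader keeps us a follower.
        if leader_term >= self.term:
            self.term = leader_term
            self.state = "follower"
            self.reset_election_timer()

    def tick(self):
        # Called periodically; start an election once the timer expires.
        if self.state != "leader" and time.monotonic() >= self.deadline:
            self.state = "candidate"
            self.term += 1          # new term, vote for self
            self.reset_election_timer()

node = RaftNode(timeout_range=(0.01, 0.02))  # shortened for the demo
time.sleep(0.03)                             # simulate a silent leader
node.tick()                                  # follower -> candidate
```

A real node would now request votes from its peers; this sketch only shows the timer discipline that the thesis's mean-election-time measurement exercises.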


    Tradeoffs Between Security and Performance in Browser Implementations of HTTPS

    Different browsers manage security in different ways when communicating with web servers. Many of these differences are due to browsers making security-performance tradeoffs in the battle to be the most popular browser. This thesis characterizes and analyzes how browsers manage security in their implementations of HTTPS. This is important because most of us use HTTPS regularly and trust it with our passwords, bank accounts and everything else we communicate over the Internet. Our analysis includes which TLS version is used for the connections, which cipher suites the browsers prefer, why they are preferred, and which cipher suites the web servers select based on this. We also compare the differences in the number of secure connections and certificates between the browsers in their communication with the web servers. The analysis shows that Firefox and Chrome have the latest security updates regarding TLS version 1.3. By default, they place three TLS 1.3 cipher suites at the top of their lists of offered cipher suites, with the safest first. In contrast, Safari ranks the second-safest suite first, possibly due to some lag in development. When it comes to cipher suites, the browsers appear to choose security over performance. As for the number of secure connections and certificates, we observed a difference between Safari and the other two browsers, and these differences indicate that Safari stops more third-party tracking than Firefox and Chrome.
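    The cipher-suite preference lists compared above can be inspected locally for any TLS client stack. As a hedged illustration, the sketch below uses Python's ssl module (a stand-in for a browser's TLS stack, not what the thesis measured) to list the suites a default client context would offer; in OpenSSL naming, TLS 1.3 suites are the ones prefixed "TLS_".

```python
# Sketch: list the cipher suites a TLS client context offers, in
# preference order, using Python's ssl module as an example client.
import ssl

ctx = ssl.create_default_context()
offered = [c["name"] for c in ctx.get_ciphers()]  # preference order

# OpenSSL names TLS 1.3 suites with a "TLS_" prefix (e.g.
# TLS_AES_256_GCM_SHA384); TLS 1.2 suites use names like
# ECDHE-RSA-AES256-GCM-SHA384. Typically the 1.3 suites come first,
# mirroring the Firefox/Chrome ordering described above.
tls13 = [name for name in offered if name.startswith("TLS_")]
```

Running this against different client stacks (or capturing browsers' ClientHello messages) is how preference orderings like the ones in the thesis can be compared.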
