Investigating viscous damping using a webcam
We describe an experiment involving a mass oscillating in a viscous fluid and
analyze viscous damping of harmonic motion. The mechanical oscillator is
tracked using a simple webcam and an image processing algorithm records the
position of the geometrical center as a function of time. Interesting
information can be extracted from the displacement-time graphs, in particular
for the underdamped case. For example, we use these oscillations to determine
the viscosity of the fluid. Our mean value of 1.08 ± 0.07 mPa·s for distilled
water is in good agreement with the accepted value at 20 °C. This experiment
has been successfully employed in the freshman lab setting. Comment: 13 pages, 5 figures
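For readers who want to see how a damping coefficient falls out of such displacement–time data, here is a minimal sketch in Python. It is not the authors' analysis code: the synthetic track, the peak-picking, and the log-decrement fit are all illustrative assumptions, and converting the damping coefficient to a viscosity additionally requires the oscillator's mass and geometry.

```python
import numpy as np

def estimate_damping(t, x):
    """Estimate the decay rate gamma of an underdamped oscillation
    x(t) ~ A*exp(-gamma*t)*cos(w*t) from the exponential decay of
    successive positive peaks (the logarithmic decrement)."""
    # crude local-maximum detection; real webcam data would need smoothing
    peaks = [i for i in range(1, len(x) - 1)
             if x[i] > x[i - 1] and x[i] > x[i + 1]]
    tp, xp = t[peaks], x[peaks]
    # slope of log(amplitude) vs time is -gamma
    slope, _ = np.polyfit(tp, np.log(xp), 1)
    return -slope

# synthetic webcam-like track (illustrative values, not the paper's data)
gamma_true, w = 0.5, 2 * np.pi * 2.0
t = np.linspace(0.0, 4.0, 2000)
x = np.exp(-gamma_true * t) * np.cos(w * t)
print(round(estimate_damping(t, x), 2))  # ≈ 0.5, the true decay rate
```

With the damping coefficient in hand, the fluid's viscosity follows from the drag law appropriate to the oscillator's geometry.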
Growth, Policy and Institutions: lessons from the Indian experience (Dr Montek Singh Ahluwalia)
Sohaib Athar offers an overview of Dr Montek Singh Ahluwalia’s talk at the LSE last Thursday
A study of individual consumer level culture in B2C e-commerce through a multi-perspective iTrust model
University of Technology Sydney. Faculty of Engineering and Information Technology. Building trust, and understanding its relationship with consumers' online purchasing decisions, is important to business-to-consumer (B2C) e-commerce firms seeking to extend their reach to consumers globally. This study addresses the gap in knowledge about this relationship by studying the cognitive and affective responses of consumers towards a B2C e-commerce website.
Based on the Stimulus–Organism–Response (S–O–R) model, this study examines the moderating role of individual consumer culture on the relationships between web design (website accessibility, visual appearance (colour and images) and social networking services), consumer behaviour (religiosity), privacy, security, emotions (fear and joy), interpersonal trust (iTrust, both cognitive and affect-based) and online purchasing intentions. The motivation of this study includes testing and comparing how individual consumer cultural values (individualism and uncertainty avoidance) moderate the proposed multi-perspective model of online interpersonal trust (iTrust) across two different societies (Australia and Pakistan).
This research applied a quantitative methodology with a cross-sectional survey design. To empirically test the research model, surveys were conducted in Pakistan and Australia; 270 participants from Pakistan and 255 from Australia responded. The survey data were analysed with the partial least squares structural equation modelling (PLS-SEM) approach using SmartPLS 3.0.
The results of the analysis generated mixed findings. The stimulus (S) to which a consumer reacts provides a signal shaping cognitive and affect-based trust (Organism) in the B2C e-commerce website, which in turn influences consumers' purchase intentions (Response) at the individual level across cultures.
The results of this study highlight the need to consider individual consumer-level cultural differences when identifying the mix of e-commerce strategies to employ in B2C websites, not only at the country level but also within a culturally diverse country such as Australia.
“Cardiac resynchronisation therapy”: Does the haemodynamic improvement of biventricular pacing truly arise from cardiac resynchronisation?
In this thesis I have explored some of the fundamental concepts which underpin biventricular pacing (commonly called cardiac resynchronisation therapy, CRT). As a therapy, its impact on survival, and symptoms is impressive. By adopting the name cardiac resynchronisation therapy, a common assumption is that these benefits come from ensuring resynchronisation of the ventricles in the failing heart. In this thesis I explore how biventricular pacing delivers its benefit, and whether there are other dimensions beyond resynchronisation which deserve more attention.
I first performed a meta-analysis to quantify the actual symptomatic benefit of biventricular pacing in the randomised controlled trials. A non-response rate of one-third is often quoted for biventricular pacing, but my analysis demonstrated that once the effect seen in the control arms is deducted, the incremental symptomatic response rate is closer to 15%.
I explored more acute markers of response and how they are used for optimisation of biventricular pacing. I composed a review of the different technologies available for optimisation, and developed a stepwise approach towards the ideal optimisation scheme. Left ventricular outflow tract (LVOT) Doppler is one commonly used measure for optimisation, and my analysis concluded that a much larger number of beats is required for precise optimisation. I evaluated a novel method to acquire and trace around large numbers of LVOT Doppler velocities, and assessed whether breath holding is required. I discovered that breath holding did not have a significant impact on the magnitude or variability of measurements, and that quiet breathing may be the easier way to acquire a larger number of beats for precise measurements.
An algorithm using multiple alternations of systolic blood pressure between reference and tested pacing settings had been developed by my supervisors for reproducible AV optimisation. I used this technology to explore current techniques and concepts in biventricular pacing.
I evaluated the different methods for manufacturer-specific electrogram-based AV optimisation. I found that agreement between the different methods is poor, and that none agrees with the haemodynamic optimum.
I explored the apparent discrepancy studies have reported on the effect of VV optimisation. By performing VV optimisation using four different methods for holding the AV delay constant (A-LV constant, A-RV constant, time to first ventricular lead constant, and time to second ventricular lead constant), I discovered that the acute haemodynamic effect was predominantly determined by the time to the first paced ventricle.
To explore the influence of pure AV optimisation in heart failure, I examined a group of patients with PR prolongation and demonstrated a significant improvement in acute haemodynamic response with AV-optimised pacing of the His bundle. Temporary pacing of the His bundle maintains the same narrow QRS morphology and thus isolates the pure effect of AV optimisation: a mean increment of 4 mmHg in systolic pressure is seen, approximately 60% of that seen in heart failure with LBBB. This also demonstrates the pure effect of AV shortening without the associated adverse haemodynamic effect of right ventricular pacing.
I explored the role of lead position and whether the AV optimum varies between different LV lead positions. The AV optimum did not significantly differ, suggesting that high-precision measurements at one AV delay could be used to determine the best lead position, which in my study was occasionally a position usually considered non-conventional (anterior basal wall).
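The alternation idea behind this haemodynamic optimisation can be sketched as follows. This is a hypothetical illustration, not trial code or data: the numbers are invented, and the real protocol involves beat-by-beat averaging and many more replicates. Repeated pairing of reference and tested runs is what cancels slow background drift in blood pressure.

```python
import statistics

def relative_sbp_change(reference_runs, tested_runs):
    """Mean systolic BP difference (tested - reference) across
    repeated alternations; pairing each tested run with an adjacent
    reference run averages out slow background drift."""
    diffs = [t - r for r, t in zip(reference_runs, tested_runs)]
    return statistics.mean(diffs)

# hypothetical alternation data (mmHg), six reference/tested pairs
ref =    [118.2, 117.9, 118.5, 118.1, 117.8, 118.3]
tested = [122.1, 121.8, 122.6, 122.0, 121.9, 122.2]
print(round(relative_sbp_change(ref, tested), 1))  # ~ +4.0 mmHg
```

The setting with the largest mean relative increment over the reference would be taken as the haemodynamic optimum.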
I finally explored the role of biventricular pacing in non-LBBB morphologies by looking at outcome studies. My analysis showed that biventricular pacing has a harmful effect in narrow-QRS patients, and that the effect increases over time, indicating that it is due to a physiologically adverse effect of pacing. As this is the case, one can make a case for switching off biventricular pacing in such patients.
Load Balancing in Cloud Computing Empowered with Dynamic Divisible Load Scheduling Method
The need to process and deal with vast amounts of data is increasing with developing technology. Cloud Computing is one of the most promising technologies for accomplishing these goals and enhancing performance, and it enters the debate over growing requirements for data capabilities and storage capacities. Not every organization has the financial resources, infrastructure and human capital, but Cloud Computing offers an affordable infrastructure based on availability, scalability, and cost-efficiency. The Cloud can provide services to clients on demand, making it the most widely adopted system for virtual storage, but it still has some issues that are not adequately addressed and resolved. One of these is load balancing, a primary challenge: traffic must be balanced adequately across every peer rather than overloading an individual node. This paper provides an intelligent workload management algorithm which systematically balances traffic, homogeneously allocates the load on every node, prevents overloading, and improves response time for maximum performance enhancement.
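The paper's algorithm itself is not reproduced here, but the classic divisible-load result it builds on — split the work so that every node finishes at the same time — can be sketched as follows. The node speeds and the zero-communication-delay assumption are illustrative.

```python
def divisible_load_shares(total_load, speeds):
    """Split a divisible workload so every node finishes together:
    under negligible communication delay, each node's share is
    proportional to its processing speed."""
    s = sum(speeds)
    return [total_load * v / s for v in speeds]

# one fast node and two slower ones (hypothetical speeds)
shares = divisible_load_shares(100.0, [4.0, 2.0, 2.0])
print(shares)  # [50.0, 25.0, 25.0]; finish times all equal 12.5
```

Dynamic variants re-compute the shares as node speeds or queue lengths change, which is what keeps any single node from becoming overloaded.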
Non-parametric Methods for Automatic Exposure Control, Radiometric Calibration and Dynamic Range Compression
Imaging systems are essential to a wide range of modern-day
applications. With their continuous advancement, there is an
ongoing need to adapt and improve the imaging pipeline running
inside them.
In this thesis, methods are presented to improve the imaging
pipeline of digital cameras. We present three methods to improve
important phases of the imaging process: (i) "automatic exposure
adjustment", (ii) "radiometric calibration", and (iii) "high
dynamic range compression". These contributions touch the
initial, intermediate and final stages of the imaging pipeline
of digital cameras.
For exposure control, we propose two methods. The first makes use
of CCD-based equations to formulate the exposure control problem.
To estimate the exposure time, an initial image is acquired for
each wavelength channel, to which contrast adjustment techniques
are applied. This recovers a reference cumulative distribution
function of image brightness for each channel.
second method proposed for automatic exposure control is an
iterative method applicable for a broad range of imaging systems.
It uses spectral sensitivity functions such as the photopic
response functions for the generation of a spectral power image
of the captured scene. A target image is then generated using the
spectral power image by applying histogram equalization. The
exposure time is hence calculated iteratively by minimizing the
squared difference between target and the current spectral power
image. Here we further analyze the method by performing its
stability and controllability analysis using a state space
representation used in control theory. The applicability of the
proposed method for exposure-time calculation is demonstrated on
real-world scenes using cameras with varying architectures.
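A toy version of such an iterative exposure loop can be sketched as follows. This is not the thesis's method: the clipped linear camera model, the scalar mean-brightness target standing in for the histogram-equalised target image, and the multiplicative update rule are all simplifying assumptions made for illustration.

```python
import numpy as np

def iterate_exposure(scene_radiance, target_mean, t0=1.0,
                     iters=30, step=0.5):
    """Toy iterative exposure control: the captured image is
    modelled as clip(radiance * t, 0, 1), and the exposure time t
    is nudged each iteration to shrink the difference between the
    captured mean brightness and a target mean."""
    t = t0
    for _ in range(iters):
        captured = np.clip(scene_radiance * t, 0.0, 1.0)
        err = captured.mean() - target_mean
        # multiplicative update: lengthen t if too dark, shorten if too bright
        t *= 1.0 - step * err
    return t

rng = np.random.default_rng(0)
scene = rng.uniform(0.0, 0.5, size=10_000)  # synthetic radiance map
t = iterate_exposure(scene, target_mean=0.5)
# t converges near 2.0, where the captured mean brightness hits 0.5
```

The thesis's formulation instead minimises the squared difference between target and current spectral power images and comes with a stability analysis; the loop above only conveys the fixed-point structure of such a scheme.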
Radiometric calibration is the estimation of the non-linear
mapping from the input radiance map to the output brightness
values. The radiometric mapping is represented by the camera
response function, with which the radiance map of the scene is
estimated. Our radiometric calibration method employs an L1 cost
function, taking advantage of the Weiszfeld optimization scheme.
The proposed calibration works with multiple input images of the
scene with varying exposure. It can also perform calibration from
a single input image under a few constraints. The proposed method
outperforms, quantitatively and qualitatively, various
alternative methods found in the literature on radiometric
calibration.
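The Weiszfeld scheme underlying the L1 cost is easiest to see in its original setting, the geometric median. The sketch below is not the thesis's calibration code; it only illustrates the iteratively re-weighted update that makes an L1-type cost tractable.

```python
import numpy as np

def weiszfeld(points, iters=100, eps=1e-9):
    """Weiszfeld iteration for the geometric median: the point
    minimising the sum of distances to the samples. Each step is a
    weighted mean with weights 1/distance, i.e. an iteratively
    re-weighted least-squares solve of an L1-type cost."""
    y = points.mean(axis=0)
    for _ in range(iters):
        d = np.linalg.norm(points - y, axis=1)
        w = 1.0 / np.maximum(d, eps)   # eps guards division by zero
        y = (points * w[:, None]).sum(axis=0) / w.sum()
    return y

# three coincident samples plus one gross outlier (illustrative data)
pts = np.array([[0.0, 0.0], [0.0, 0.0], [0.0, 0.0], [10.0, 0.0]])
print(weiszfeld(pts))  # near [0, 0]: the L1 cost shrugs off the outlier
```

This robustness to outliers is precisely why an L1 cost is attractive for fitting a camera response function to noisy brightness observations.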
Finally, to realistically represent the estimated radiance maps
on low dynamic range (LDR) display devices, we propose a method
for dynamic range compression. Radiance maps generally have a
higher dynamic range (HDR) than the widely used display devices.
Thus, for display purposes, dynamic range compression is required
on HDR images. Our proposed method generates a few LDR images
from the HDR radiance map by clipping its values at different
exposures. Using the contrast information of each generated LDR
image, the method uses an energy minimization approach to
estimate the probability map of each LDR image. These probability
maps are then used as a label set to form the final compressed
dynamic range image for the display device. The results of our
method were compared qualitatively and quantitatively with those
produced by widely cited and professionally used methods.
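The clip-and-fuse idea can be sketched as follows. This is a toy stand-in, not the thesis's method: a Gaussian well-exposedness weight replaces the energy-minimised probability maps, and a per-pixel blend replaces label selection.

```python
import numpy as np

def compress_dynamic_range(hdr, exposures):
    """Toy clip-and-fuse tone mapping: clip the HDR radiance map at
    several exposures, weight each LDR pixel by how close it sits to
    mid-grey (well-exposedness), and blend the weighted LDR images."""
    ldrs = [np.clip(hdr * e, 0.0, 1.0) for e in exposures]
    # weight peaks at 0.5 and decays towards the clipped extremes 0 and 1
    ws = [np.exp(-((l - 0.5) ** 2) / 0.08) for l in ldrs]
    wsum = np.sum(ws, axis=0) + 1e-12
    return sum(l * w for l, w in zip(ldrs, ws)) / wsum

# radiance values spanning three orders of magnitude (illustrative)
hdr = np.array([0.01, 0.1, 1.0, 10.0])
out = compress_dynamic_range(hdr, [0.1, 1.0, 10.0])
# out lies in [0, 1] and preserves the brightness ordering of hdr
```

The thesis replaces the naive per-pixel weights with probability maps obtained by energy minimisation, which enforces spatial smoothness that a pixel-wise blend like this cannot.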