Prevalence of Bitcoin in New Zealand
Global warming is a reality. Organisations recognise their corporate responsibility to conduct their business with the future in mind. Sustainability means having a green conscience and ensuring that the steps taken today do not have a negative impact on the future. Green Human Resource Management promotes the sustainable use of resources within business organisations. The aim of this research is to provide organisations with a Green Human Resource Management (GHRM) strategy. A qualitative approach was followed, and five participants were interviewed. The researcher followed this approach to gain an in-depth understanding of business eco-friendly practices and to ascertain whether organisations utilise HR to drive 'green' initiatives and engage employees. The study found that most organisations have implemented some eco-friendly practices and know the value of becoming a 'green' employer. However, the researcher identified a significant gap: organisations are not aware of, or lack the knowledge of, how to utilise HR practices to get staff engaged in green policies and procedures. The researcher will strive to offer various ideas and recommendations to businesses on how they can utilise their HR practices to go green and engage their staff.
Towards Optimized Traffic Provisioning and Adaptive Cache Management for Content Delivery
Content delivery networks (CDNs) deploy hundreds of thousands of servers around the world to cache and serve trillions of user requests every day for a diverse set of content such as web pages, videos, software downloads and images. In this dissertation, we propose algorithms to provision traffic across cache servers and manage the content they host to achieve performance objectives such as maximizing the cache hit rate, minimizing the bandwidth cost of the network and minimizing the energy consumption of the servers.
Traffic provisioning is the process of determining the set of content domains hosted on the servers. We propose footprint descriptors that effectively capture the popularity characteristics and caching performance of different content classes. We also propose a footprint descriptor calculus that can be used to decide how content should be mixed or partitioned to efficiently provision traffic. To automate traffic provisioning, we propose optimization models to provision traffic such that the cache miss traffic from the network is minimized without overloading the servers. We find that such optimization models produce significant reductions in the cache miss traffic when compared with traffic provisioning algorithms in use today.
Cache management is the process of deciding how content is cached in the servers of a CDN. We propose TTL-based caching algorithms that provably achieve performance targets specified by a CDN operator. We show that the proposed algorithms converge to the target hit rate and target cache size with low error. Finally, we propose cache management algorithms to make the servers energy-efficient using disk shutdown. We find that disk shutdown is well suited for CDN servers and provides energy savings without significantly impacting cache hit rates.
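The TTL-based caching idea above can be sketched minimally: each cached object carries a timer, a request is a hit only if it arrives before the timer expires, and the timer is reset on every access. This is an illustrative sketch of generic TTL caching, not the dissertation's algorithms; the class and method names are our own.

```python
import time

class TTLCache:
    """Minimal TTL cache sketch: an object is a hit if it is requested
    before its timer expires. Illustrative only; not the dissertation's
    algorithm."""

    def __init__(self, ttl):
        self.ttl = ttl          # time-to-live in seconds
        self.expiry = {}        # object id -> expiration timestamp

    def request(self, obj, now=None):
        """Return True on a cache hit; every request (re)caches the object."""
        now = time.monotonic() if now is None else now
        hit = self.expiry.get(obj, float("-inf")) > now
        self.expiry[obj] = now + self.ttl   # reset the timer on access
        return hit
```

A longer TTL keeps objects cached longer, raising the hit rate at the cost of cache size; that trade-off is precisely the knob a TTL-based policy tunes to meet an operator's performance targets.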
A Virtual World for Troubleshooting Distributed Systems
An observerse is proposed to provide users of cloud-based distributed systems with tools for streamlined debugging and root-cause analysis. The observerse serves as an observability platform, incorporating aspects of virtual reality technology to help users efficiently navigate the system and resolve operational issues.
Hydration and vocal loading on voice measures
Vocal loading adversely affects the healthy larynx. The negative effects of vocal loading are thought to be exacerbated in dry environments, in noisy environments, and when using non-habitual speaking patterns and voice qualities. Advancing age is also thought to be a risk factor for the negative effects of loading. To systematically tease out the effects of these factors on the healthy larynx, three different experiments were conducted. In each experiment, healthy participants produced 45 minutes of child-directed speech. In experiment 1, older, healthy adults produced loud child-directed speech in the presence of background noise, in both low and moderate humidities, and voice was assessed. In experiment 2, young, healthy adults produced loud child-directed speech in the presence of background noise, in both low and moderate humidities, and voice was assessed. In experiment 3, young, healthy adults produced child-directed speech using a low-effort whisper quality, and voice was assessed. In each experiment, voice measures included Phonation Threshold Pressure, Cepstral Peak Prominence, self-perceived phonatory effort, and self-perceived vocal tiredness. These voice measures were collected at set points of the frequency range. Our data suggest that the aging larynx is negatively affected by 45 minutes of loud child-directed speech and that humidification is beneficial in reducing these negative effects. Younger adults are also negatively affected by 45 minutes of loud child-directed speech, but not by whispered speech; increasing ambient humidity does not minimize these effects. The adverse effects of loud speech are much greater than those of whispered speech. Overall, these data increase our understanding of factors that load the larynx and lay the foundation for developing clinical tests to identify speakers who are susceptible to voice problems.
Adaptive TTL-Based Caching for Content Delivery
Content Delivery Networks (CDNs) deliver a majority of the user-requested
content on the Internet, including web pages, videos, and software downloads. A
CDN server caches and serves the content requested by users. Designing caching
algorithms that automatically adapt to the heterogeneity, burstiness, and
non-stationary nature of real-world content requests is a major challenge and
is the focus of our work. While there is much work on caching algorithms for
stationary request traffic, the work on non-stationary request traffic is very
limited. Consequently, most prior models are inaccurate for production CDN
traffic that is non-stationary.
We propose two TTL-based caching algorithms and provide provable guarantees
for content request traffic that is bursty and non-stationary. The first
algorithm called d-TTL dynamically adapts a TTL parameter using a stochastic
approximation approach. Given a feasible target hit rate, we show that the hit
rate of d-TTL converges to its target value for a general class of bursty
traffic that allows Markov dependence over time and non-stationary arrivals.
The second algorithm called f-TTL uses two caches, each with its own TTL. The
first-level cache adaptively filters out non-stationary traffic, while the
second-level cache stores frequently-accessed stationary traffic. Given
feasible targets for both the hit rate and the expected cache size, f-TTL
asymptotically achieves both targets. We implement d-TTL and f-TTL and evaluate
both algorithms using an extensive nine-day trace consisting of 500 million
requests from a production CDN server. We show that both d-TTL and f-TTL
converge to their hit rate targets with an error of about 1.3%. However,
f-TTL requires a significantly smaller cache size than d-TTL to achieve the
same hit rate, since it effectively filters out the non-stationary traffic
for rarely-accessed objects.
Novel Regulators of Feeding and Cardiovascular Physiology in Fish
Nesfatin-1, an 82-amino-acid anorexigen, is encoded in a secreted precursor, nucleobindin-2 (NUCB2). NUCB2 was so named due to its high sequence similarity with nucleobindin-1 (NUCB1). It was recently reported that NUCB1 encodes an insulinotropic nesfatin-1-like peptide (NLP) in mice. Irisin, a muscle protein, is encoded in its precursor, fibronectin type III domain containing 5 (FNDC5), and is released into blood from skeletal muscle. Here we aimed to characterize NLP and irisin in fish, and to study whether these are novel regulators of feeding and cardiovascular functions in zebrafish and goldfish. Western blot analysis and immunohistochemical studies determined the expression of NUCB1/NLP in central and peripheral tissues of goldfish. Administration of rat and goldfish NLP at doses of 10 and 100 ng/g body weight caused potent inhibition of food intake in goldfish. NLP also downregulated the expression of preproghrelin and orexin-A mRNA, and upregulated cocaine- and amphetamine-regulated transcript (CART) mRNA in goldfish brain. Intraperitoneal (I.P.) administration of NLP reduced cardiac functions in zebrafish and goldfish, and downregulated irisin and RyR1b mRNA expression in zebrafish. Irisin was detected in zebrafish heart and skeletal muscle. A single I.P. injection of irisin did not affect feeding, but its knockdown using siRNA caused a significant reduction in food intake. Knockdown of irisin reduced ghrelin and orexin-A mRNA expression, and increased CART mRNA expression in zebrafish brain and gut. Meanwhile, injection of irisin (0.1 and 1 ng/g B.W.) increased cardiac functions, while knockdown of irisin had the reverse effects on cardiovascular physiology. Administration of propranolol attenuated the effects of irisin on cardiac physiology. Collectively, my research discovered that NLP and irisin modulate food intake and cardiac physiology in fish.
Future studies should focus on the mechanisms of action of NLP and irisin in regulating metabolism and cardiovascular biology in fish.
Direct ultrafast laser written C-band waveguide amplifier in Er-doped chalcogenide glass
This paper reports the fabrication and characterization of an ultrafast laser written Er-doped chalcogenide glass buried waveguide amplifier. Er-doped GeGaS glass was synthesized by the vacuum-sealed melt-quenching technique, and waveguides were fabricated inside the 4 mm long sample by direct ultrafast laser writing. The total passive fiber-to-fiber insertion loss is 2.58 +/- 0.02 dB at 1600 nm, including a propagation loss of 1.6 +/- 0.3 dB. Active characterization shows a relative gain of 2.524 +/- 0.002 dB/cm and 1.359 +/- 0.005 dB/cm at 1541 nm and 1550 nm, respectively, for a pump power of 500 mW at a wavelength of 980 nm. © 2012 Optical Society of America.
Using High-fidelity Time-Domain Simulation Data to Construct Multi-fidelity State Derivative Function Surrogate Models for use in Control and Optimization
Models that balance accuracy against computational costs are advantageous
when designing dynamic systems with optimization studies, as several hundred
predictive function evaluations might be necessary to identify the optimal
solution. The efficacy and use of derivative function surrogate models (DFSMs),
or approximate models of the state derivative function, have been
well-established in the literature. However, previous studies have assumed an a
priori state dynamic model is available that can be directly evaluated to
construct the DFSM. In this article, we propose an approach to extract the
state derivative information from system simulations using piecewise polynomial
approximations. Once the required information is available, we propose a
multi-fidelity DFSM approach as a predictive model for the system's dynamic
response. This multi-fidelity model is the sum of a linear-fit
lower-fidelity model and a nonlinear corrective function that
compensates for the error between the high-fidelity simulations and the
low-fidelity model. We validate the model by comparing the simulation results
from the DFSM to the high-fidelity tools. The DFSM model is, on average, five
times faster than the high-fidelity tools while capturing the key time domain
and power spectral density (PSD) trends. Then, an optimal control study using
the DFSM is conducted with outcomes showing that the DFSM approach can be used
for complex systems like floating offshore wind turbines (FOWTs) and help
identify control trends and trade-offs.
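The two-part surrogate described above (a linear low-fidelity fit plus a nonlinear corrective term trained on the residual) can be illustrated on a toy one-dimensional state derivative. The analytic derivative below is our stand-in; the article builds the DFSM from high-fidelity simulation data, and its corrective function is not necessarily a cubic polynomial.

```python
import numpy as np

# Toy multi-fidelity DFSM: linear least-squares fit of the state derivative
# plus a cubic polynomial fitted to the residual error.
x = np.linspace(-2.0, 2.0, 200)
dxdt = -x + 0.3 * np.sin(3 * x)          # stand-in "high-fidelity" derivative

# low-fidelity model: ordinary least-squares line a*x + b
A = np.vstack([x, np.ones_like(x)]).T
coef, *_ = np.linalg.lstsq(A, dxdt, rcond=None)
linear = A @ coef

# corrective function: cubic polynomial fitted to the remaining error
corr = np.polynomial.Polynomial.fit(x, dxdt - linear, deg=3)
dfsm = linear + corr(x)                  # multi-fidelity prediction

mse_lin = float(np.mean((dxdt - linear) ** 2))
mse_mf = float(np.mean((dxdt - dfsm) ** 2))
print(mse_mf <= mse_lin)                 # prints True
```

Because the corrective polynomial is itself a least-squares fit of the residual (and the zero polynomial is in its search space), the combined model's squared error can never exceed the linear fit's, which is the essential point of the error-correction layer.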
Optimization of Process Parameters of Al-Si Alloy by Centrifugal Casting Technique Using Taguchi Design of Experiments
In this paper, the influence of process parameters on the mechanical properties during centrifugal casting of aluminum alloy (4600) is studied. The Taguchi method of design of experiments was employed to optimize the process parameters and to improve the mechanical properties. The investigation indicated that an increase in pouring temperature reduces mechanical properties, while an increase in die speed increases mechanical properties and density. Results were analyzed using the ANOVA technique to determine the percentage contribution of each casting process parameter. Microstructures were studied under optical microscope and SEM and were correlated with the process parameters and the mechanical properties of the as-cast structures.
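Taguchi analyses of this kind rank parameter settings by a signal-to-noise ratio; for responses where larger is better (e.g. tensile strength), the standard form is S/N = -10·log10(mean(1/y²)). The sketch below computes it for hypothetical replicate measurements; the numbers are illustrative, not from the paper.

```python
import math

def sn_larger_is_better(values):
    """Taguchi larger-the-better signal-to-noise ratio in dB:
    S/N = -10 * log10( mean(1 / y^2) ). Higher S/N indicates a
    better (and more robust) parameter setting."""
    return -10 * math.log10(sum(1 / y**2 for y in values) / len(values))

# hypothetical replicate strengths (MPa) for two parameter settings;
# values are made up for illustration, not taken from the paper
run_a = [180.0, 185.0, 178.0]
run_b = [200.0, 204.0, 198.0]
print(sn_larger_is_better(run_b) > sn_larger_is_better(run_a))  # prints True
```

The setting with the higher S/N across replicates is preferred, and ANOVA on these ratios yields the percentage contribution of each factor, as in the abstract.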