1,770 research outputs found
Maximising Technology Efficiencies for SMEs Using Computer Intelligence
This paper is part of a series which presents elements of our research into technology adoption and usability problems faced by small businesses (SMEs) in Australia. We discuss our approach to research in the small business area, the importance of aggregating small businesses into vertical technology sectors, and the usability improvements that can be expected from this collaborative approach. In this paper we focus on the information retrieval requirements and related usability problems which confront small businesses. We discuss adoption and usability issues, requirements and risks using a business case study, then introduce experiments which, it is hoped, will deliver efficiency improvements to information retrieval applications for small businesses.
CBSE: an implementation case study
Over the last couple of years, the shift towards component based software engineering (CBSE) methods has become a cost-effective way to get an application to the implementation stage much earlier. Adoption of component based development methods acknowledges the use of third-party components wherever possible to reduce the cost of software development, shorten the development phase and provide a richer set of processing options for the end user. The use of these tools is particularly relevant in Web-based applications, where commercial off-the-shelf (COTS) products are so prevalent. However, there are a number of risks associated with the use of component based development methods. This thesis investigates these risks within the context of a software engineering project and attempts to provide a means to minimise, or at least manage, the risk potential when using component based development methods.
ACADEMIC SUPPORT FOR FIRST-YEAR SOCIAL WORK STUDENTS IN SOUTH AFRICA
Students in their first year face the great challenge of transition from school to university, where independent and self-directed learning is called for. Students must navigate multifaceted life adaptations – physical ones such as moving away from home, and psychological ones such as moving into young adulthood, from the familiarity of the homogeneous school environment to the heterogeneous culture of the university, often from rural to urban, to a different language, to mixing with diverse race groups. Moving from the control, protection and predictability of school life, learners are free for the first time to test their autonomy and experiment with choices. Equally, the challenge of providing the best teaching and learning to this group rests with the educators in first year.
Academic support for first-year social work students in South Africa
This article sets out the context for first-year social work students in South Africa, explaining the particular needs of the typical student and the facilities available for support. The expectation of deep learning required of university students raises many questions, and a proposal is suggested for a research project on teaching and learning in social work, to be carried out collaboratively by South African universities. A literature survey, with extrapolation and application of relevant principles to serve as a foundation for the project, is presented.
Bioaccessibility of PBDEs present in indoor dust: a novel dialysis membrane method with a Tenax TA® absorption sink
Human uptake of flame retardants (FRs) such as polybrominated diphenyl ethers (PBDEs) via indoor dust ingestion is commonly considered as 100% bioaccessible, leading to potential risk overestimation. Here, we present a novel in vitro colon-extended physiologically-based extraction test (CE-PBET) with Tenax TA® as an absorptive "sink" capable of enhancing PBDE gut bioaccessibility. A cellulose-based dialysis membrane (MW cut-off 3.5 kDa) with high pH and temperature tolerance was used to encapsulate Tenax TA®, facilitating efficient physical separation between the absorbent and the dust, while minimizing re-absorption of the ingested PBDEs to the dust particles. As a proof of concept, PBDE-spiked indoor dust samples (n=3) were tested under four different conditions: without any Tenax TA® addition (control) and with three different Tenax TA® loadings (i.e. 0.25, 0.5 or 0.75 g). Our results show that in order to maintain a constant sorptive gradient for the low MW PBDEs, 0.5 g of Tenax TA® is required in CE-PBET. Tenax TA® inclusion (0.5 g) resulted in 40% gut bioaccessibility for BDE153 and BDE183, whereas greater bioaccessibility values were seen for less hydrophobic PBDEs such as BDE28 and BDE47 (~60%). When tested using SRM 2585 (n=3), our new Tenax TA® method did not present any statistically significant difference (p>0.05) between non-spiked and PBDE-spiked SRM 2585 treatments. Our study describes an efficient method in which, owing to the sophisticated design, Tenax TA® recovery and subsequent bioaccessibility determination can be simply and reliably achieved.
Singular terms of helicity amplitudes at one-loop in QCD and the soft limit of the cross sections of multi-parton processes
We describe a general method that enables us to obtain all the singular terms of helicity amplitudes of n-parton processes at one loop. The algorithm uses helicity amplitudes at tree level and simple color algebra. We illustrate the method by calculating the singular part of the one-loop helicity amplitudes of all parton subprocesses. The results are used to derive the soft gluon limit of the cross sections of all parton scattering subprocesses, which provides a useful initial condition for the angular ordering approximation to coherent multiple soft gluon emission incorporated in existing Monte Carlo simulation programs.
Comment: LaTeX, 13 pages, ETH-TH/94-
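For orientation, the soft gluon limit referred to above is conventionally expressed through the standard eikonal factorization of QCD; the schematic form below uses the common color-operator notation and is not reproduced from the paper itself:

\[
\lim_{q \to 0} \left| \mathcal{M}_{n+1}(\{p_i\}, q) \right|^2
\;=\; -\, g_s^2 \sum_{i \neq j} \frac{p_i \cdot p_j}{(p_i \cdot q)\,(p_j \cdot q)}\,
\mathbf{T}_i \cdot \mathbf{T}_j \, \left| \mathcal{M}_n(\{p_i\}) \right|^2 ,
\]

where \(q\) is the soft gluon momentum, \(p_i\) are the hard parton momenta, and \(\mathbf{T}_i\) are the color charge operators of the emitting partons. The kinematic eikonal factors in this sum are what encode the angular ordering property exploited by coherent-emission Monte Carlo programs.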
Designing a regional e-logistics portal
A variety of optimization and negotiation technologies hold the promise of delivering value to the logistics processes of businesses both small and large, yet they tend to remain inaccessible to SMEs (largely due to price and complexity concerns). This paper describes the early-phase steps in a project to develop a regional e-logistics portal. The project seeks to make constraint-based optimization and automated negotiation technologies accessible to SMEs within a portal that also serves their information needs. The paper highlights several novel aspects of the design of the portal, as well as a novel requirements-gathering process involving community consultation.
Significantly reducing the processing times of high-speed photometry data sets using a distributed computing model
The scientific community is in the midst of a data analysis crisis. The increasing capacity of scientific CCD instrumentation and their falling costs are contributing to an explosive generation of raw photometric data. This data must go through a process of cleaning and reduction before it can be used for high-precision photometric analysis. Many existing data processing pipelines either assume a relatively small dataset or are batch processed by a High Performance Computing centre. A radical overhaul of these processing pipelines is required to allow reduction and cleaning of terabyte-sized datasets at near capture rates using an elastic processing architecture. The ability to access computing resources and to allow them to grow and shrink as demand fluctuates is essential, as is exploiting the parallel nature of the datasets. A distributed data processing pipeline is required. It should incorporate lossless data compression, allow for data segmentation and support processing of data segments in parallel. Academic institutes can collaborate and provide an elastic computing model without the requirement for large centralized high performance computing data centres. This paper demonstrates how a base-10 order of magnitude improvement in overall processing time has been achieved using the ACN pipeline, a distributed pipeline spanning multiple academic institutes.
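The segment-and-parallelize idea described in this abstract can be sketched in a few lines. This is a minimal illustration only, assuming a dataset that splits into independent segments; the names (`segment`, `reduce_segment`, `SEGMENT_SIZE`) are hypothetical and do not reflect the actual ACN pipeline API:

```python
# Minimal sketch: split a photometric dataset into independent segments
# and reduce them concurrently. All names here are illustrative, not the
# actual ACN pipeline interface.
from multiprocessing import Pool

SEGMENT_SIZE = 4  # frames per segment (illustrative choice)

def segment(dataset, size):
    """Split the dataset into fixed-size segments that can be processed independently."""
    return [dataset[i:i + size] for i in range(0, len(dataset), size)]

def reduce_segment(frames):
    """Stand-in for per-segment calibration/cleaning: subtract the segment minimum."""
    baseline = min(frames)
    return [frame - baseline for frame in frames]

if __name__ == "__main__":
    raw = list(range(16))  # stand-in for raw CCD frame values
    with Pool() as pool:
        # Each segment is reduced in a separate worker process.
        reduced = pool.map(reduce_segment, segment(raw, SEGMENT_SIZE))
    print(len(reduced))  # 16 frames in segments of 4 -> 4 reduced segments
```

Because the segments share no state, the same map step scales from a laptop `Pool` to workers distributed across collaborating institutes, which is the elastic model the abstract argues for.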
Use of sediment source fingerprinting to assess the role of subsurface erosion in the supply of fine sediment in a degraded catchment in the Eastern Cape, South Africa
Sediment source fingerprinting has been successfully deployed to provide information on the surface and subsurface sources of sediment in many catchments around the world. However, there is still scope to re-examine some of the major assumptions of the technique with reference to the number of fingerprint properties used in the model, the number of model iterations and the potential uncertainties of using more than one sediment core collected from the same floodplain sink. We investigated the role of subsurface erosion in the supply of fine sediment to two sediment cores collected from a floodplain in a small degraded catchment in the Eastern Cape, South Africa. The results showed that increasing the number of individual fingerprint properties in the composite signature did not improve the model goodness-of-fit. This is still a much-debated issue in sediment source fingerprinting. To test the goodness-of-fit further, the number of model repeat iterations was increased from 5000 to 30,000. However, this did not reduce uncertainty ranges in modelled source proportions nor improve the model goodness-of-fit. The estimated sediment source contributions were not consistent with the available published data on erosion processes in the study catchment. The temporal pattern of sediment source contributions predicted for the two sediment cores was very different despite the cores being collected in close proximity from the same floodplain. This highlights some of the potential limitations associated with using floodplain cores to reconstruct catchment erosion processes and associated sediment source contributions. For the source tracing approach in general, the findings here suggest the need for further investigations into uncertainties related to the number of fingerprint properties included in un-mixing models. The findings support the current widespread use of <5000 model repeat iterations for estimating the key sources of sediment samples.
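The un-mixing model and repeat-iteration scheme discussed above can be illustrated with a toy two-source example. This is a simplified sketch with invented tracer values, not the authors' model: it searches candidate source proportions at random (the "repeat iterations") and keeps the proportion whose modelled tracer mixture best matches the measured sample:

```python
import random

# Hypothetical tracer concentrations for two sources and one sediment sample.
# Values are invented for illustration; real signatures use many properties.
SOURCES = {"surface": [10.0, 5.0], "subsurface": [2.0, 9.0]}
MIXTURE = [6.0, 7.0]  # measured tracer values in the floodplain sample

def mixture_error(p_surface):
    """Sum of squared relative errors between measured and modelled tracers
    for a candidate surface-source proportion p_surface (subsurface = 1 - p)."""
    err = 0.0
    for t, measured in enumerate(MIXTURE):
        modelled = (p_surface * SOURCES["surface"][t]
                    + (1.0 - p_surface) * SOURCES["subsurface"][t])
        err += ((measured - modelled) / measured) ** 2
    return err

def unmix(iterations=5000, seed=42):
    """Random-search un-mixing: keep the best-fitting proportion over
    `iterations` repeats, mirroring the repeat-iteration idea in the text."""
    rng = random.Random(seed)
    best_p, best_err = 0.0, float("inf")
    for _ in range(iterations):
        p = rng.random()
        e = mixture_error(p)
        if e < best_err:
            best_p, best_err = p, e
    return best_p

print(round(unmix(), 2))  # for these invented tracers the best fit is ~0.5
```

With these invented values the exact answer is a 50:50 surface/subsurface mix, and raising `iterations` well beyond 5000 only refines digits that are already far below tracer measurement uncertainty, which is consistent with the abstract's finding that more iterations did not improve the fit.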