15 research outputs found

    Exploring Sources of Competitive Advantages in E-business Application in Mainland Chinese Real Estate Industry

    One of the key issues in e-business research is how established companies can gain competitive advantage by exploring e-business. Despite the interest in e-business applications by traditional firms, few empirical studies have been carried out to examine how ‘clicks-and-mortar’ approaches offer competitive advantages, especially from a specific industry perspective. This study investigates the key sources of competitive advantage gained from e-business applications by Chinese real estate developers and whether the value chain theory and its related theories can explain this phenomenon. Using a qualitative case study approach, the study shows that the value chain framework is useful for identifying and categorizing possible e-business application areas. Moreover, this categorization makes the identification of key sources of competitive advantage explicit. However, the framework cannot fully explain either the success of e-business applications or the realization of their intended motivations. Hence, further research is needed to make the value chain model an easy-to-use, practical guideline for e-business implementation.

    Managing E-Operations for Competitive Advantage

    This paper reports the initial stages of a research project investigating how UK-based organisations undertaking electronic commerce are seeking competitive advantage through the management of their e-operations. Success in e-business depends on the extent to which the dramatic increase in connectivity offered by the Internet can be harnessed to improve efficiency and effectiveness in managing business processes that produce and deliver goods and services. This requires the integration of operations management and information systems both within the organisation and with supply chain partners. Results from a cross-case analysis of seven companies (three manufacturers and four financial service companies) that have converted from bricks-and-mortar to clicks-and-mortar are reported. These indicate that: (1) e-commerce investments are mainly driven by a fear of being left behind by competitors rather than a desire to improve business process performance; (2) e-commerce investments tend to automate rather than re-design existing processes; (3) e-operations are run as a discrete set of processes, with little or no integration between e-operations information systems and those of the bricks-and-mortar operations; (4) there is a lack of formal performance measures for e-commerce investments; (5) legacy systems and a lack of industry standards are major encumbrances to information systems integration.

    Local and global Fokker-Planck neoclassical calculations showing flow and bootstrap current modification in a pedestal

    In transport barriers, particularly H-mode edge pedestals, radial scale lengths can become comparable to the ion orbit width, causing neoclassical physics to become radially nonlocal. In this work, the resulting changes to neoclassical flow and current are examined both analytically and numerically. Steep density gradients are considered, with scale lengths comparable to the poloidal ion gyroradius, together with strong radial electric fields sufficient to electrostatically confine the ions. Attention is restricted to relatively weak ion temperature gradients (but permitting arbitrary electron temperature gradients), since in this limit a delta-f (small departures from a Maxwellian distribution) rather than full-f approach is justified. This assumption is in fact consistent with measured inter-ELM H-mode edge pedestal density and ion temperature profiles in many present experiments, and is expected to be increasingly valid in future lower-collisionality experiments. In the numerical analysis, the distribution function and Rosenbluth potentials are solved for simultaneously, allowing use of the exact field term in the linearized Fokker-Planck collision operator. In the pedestal, the parallel and poloidal flows are found to deviate strongly from the best available conventional neoclassical prediction, with large poloidal variation of a different form than in the local theory. These predicted effects may be observable experimentally. In the local limit, the Sauter bootstrap current formulae appear accurate at low collisionality, but they can overestimate the bootstrap current near the plateau regime. In the pedestal ordering, ion contributions to the bootstrap and Pfirsch-Schlüter currents are also modified.
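    For readers unfamiliar with the terms used above, a brief sketch of the standard delta-f and Rosenbluth-potential relations is given below; these are textbook forms, not equations quoted from this paper.

```latex
% delta-f ordering: small departure from a Maxwellian, justified here by the
% assumption of relatively weak ion temperature gradients.
\[
  f_a = f_{Ma} + f_{1a}, \qquad \frac{f_{1a}}{f_{Ma}} \ll 1 .
\]
% Linearized Fokker--Planck collision operator: test-particle term plus the
% "field" term, which couples back through the perturbed distribution f_{1b}.
\[
  C_{ab}[f_a, f_b] \;\approx\; C_{ab}\!\left[f_{1a}, f_{Mb}\right]
    + C_{ab}\!\left[f_{Ma}, f_{1b}\right].
\]
% The field term is built from the Rosenbluth potentials of species b,
\[
  H_b(\mathbf v) = \int \frac{f_b(\mathbf v')}{|\mathbf v - \mathbf v'|}\, d^3v',
  \qquad
  G_b(\mathbf v) = \int f_b(\mathbf v')\,|\mathbf v - \mathbf v'|\, d^3v',
\]
% which satisfy \nabla_v^2 H_b = -4\pi f_b and \nabla_v^2 G_b = 2 H_b; solving
% for the distribution function and these potentials simultaneously is what
% allows the exact field term to be retained.
```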

    One hundred years of forgetting: A quantitative description of retention

    A sample of 210 published data sets was assembled that (a) plotted amount remembered versus time, (b) had 5 or more points, and (c) were smooth enough to fit at least 1 of the functions tested with a correlation coefficient of .90 or greater. Each was fit to 105 different 2-parameter functions. The best fits were to the logarithmic function, the power function, the exponential in the square root of time, and the hyperbola in the square root of time. It is difficult to distinguish among these 4 functions with the available data, but the same set of 4 functions fit most data sets, with autobiographical memory being the exception. Theoretical motivations for the best fitting functions are offered. The methodological problems of evaluating functions and the advantages of searching existing data for regularities before formulating theories are considered. At the simplest level, this article is a search for regularities. We ask whether there is one retention function that can describe all of memory, or perhaps a different function for each of a small number of different kinds of memory. At a more abstract level, it is about the role of theory and data in psychological research. Can we most rapidly advance psychology as a science by developing theories at the level that commonly fills psychological journals such as this one, or should we first try to describe phenomena that could constrain theories by establishing robust, preferably quantitative, regularities (Rubin, 1985, 1989, 1995)? A balance between these alternatives is needed, and here we argue that to obtain such a balance more description is needed. Retention offers the ideal topic to make this abstract, philo
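    To make the four best-fitting families concrete, the sketch below fits each 2-parameter form to a retention curve with SciPy. This is not the authors' code: the sample data points and the exact parameterisations (e.g. y = 1/(a + b√t) for the hyperbola in the square root of time) are illustrative assumptions.

```python
# Minimal sketch (assumed parameterisations, synthetic data) of fitting the
# four 2-parameter retention functions named above and reporting the
# correlation used in the r >= .90 screening criterion.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import pearsonr

# Hypothetical retention data: proportion recalled vs. delay in hours.
t = np.array([1.0, 6.0, 24.0, 48.0, 168.0, 720.0])
y = np.array([0.84, 0.71, 0.60, 0.55, 0.44, 0.33])

candidates = {
    "logarithmic":       lambda t, a, b: a - b * np.log(t),
    "power":             lambda t, a, b: a * t ** (-b),
    "exp(sqrt t)":       lambda t, a, b: a * np.exp(-b * np.sqrt(t)),
    "hyperbola(sqrt t)": lambda t, a, b: 1.0 / (a + b * np.sqrt(t)),
}

for name, model in candidates.items():
    params, _ = curve_fit(model, t, y, p0=(1.0, 0.1), maxfev=10000)
    r, _ = pearsonr(y, model(t, *params))  # correlation between data and fit
    print(f"{name:18s} a, b = {params.round(3)}  r = {r:.3f}")
```

    With real data sets the same loop would simply be repeated per study, and data sets whose best fit falls below r = .90 would be excluded, mirroring criterion (c) above.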

    The Taipan galaxy survey: Scientific goals and observing strategy

    The Taipan galaxy survey (hereafter simply 'Taipan') is a multi-object spectroscopic survey starting in 2017 that will cover 2π steradians over the southern sky (δ ≲ 10°, |b| ≥ 10°), and obtain optical spectra for about two million galaxies out to z < 0.4. Taipan will use the newly refurbished 1.2-m UK Schmidt Telescope at Siding Spring Observatory with the new TAIPAN instrument, which includes an innovative 'Starbugs' positioning system capable of rapidly and simultaneously deploying up to 150 spectroscopic fibres (and up to 300 with a proposed upgrade) over the 6° diameter focal plane, and a purpose-built spectrograph operating in the range from 370 to 870 nm with resolving power R ≳ 2000. The main scientific goals of Taipan are (i) to measure the distance scale of the Universe (primarily governed by the local expansion rate, H₀) to 1% precision, and the growth rate of structure to 5%; (ii) to make the most extensive map yet constructed of the total mass distribution and motions in the local Universe, using peculiar velocities based on improved Fundamental Plane distances, which will enable sensitive tests of gravitational physics; and (iii) to deliver a legacy sample of low-redshift galaxies as a unique laboratory for studying galaxy evolution as a function of dark matter halo and stellar mass and environment. The final survey, which will be completed within 5 yrs, will consist of a complete magnitude-limited sample (i ≲ 17) of about 1.2 × 10⁶ galaxies supplemented by an extension to higher redshifts and fainter magnitudes (i ≲ 18.1) of a luminous red galaxy sample of about 0.8 × 10⁶ galaxies. Observations and data processing will be carried out remotely and in a fully automated way, using purpose-built automated 'virtual observer' software and an automated data reduction pipeline. The Taipan survey is deliberately designed to maximise its legacy value by complementing and enhancing current and planned surveys of the southern sky at wavelengths from the optical to the radio; it will become the primary redshift and optical spectroscopic reference catalogue for the local extragalactic Universe in the southern sky for the coming decade.
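    As background to goal (ii): at the low redshifts Taipan targets, the peculiar velocity inferred for a galaxy with observed redshift z_obs and a redshift-independent (Fundamental Plane) distance D follows, to first order, the standard relation sketched below (textbook cosmology, not an equation quoted from the paper).

```latex
% Low-redshift decomposition of the observed redshift into Hubble flow and
% peculiar motion; D is the Fundamental-Plane distance and H_0 the local
% expansion rate targeted at 1% precision.
\[
  c\, z_{\mathrm{obs}} \simeq H_0 D + v_{\mathrm{pec}}
  \quad\Longrightarrow\quad
  v_{\mathrm{pec}} \simeq c\, z_{\mathrm{obs}} - H_0 D ,
\]
% so fractional errors in the distances propagate directly into the
% peculiar-velocity field used to test gravitational physics and measure
% the growth rate of structure.
```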

    Progressive Punitivism: Notes on the Use of Punitive Social Control to Advance Social Justice Ends


    Content Complexity, Similarity, and Consistency in Social Media: A Deep Learning Approach
