
    Planning prefabricated homes using the faster, better, cheaper concept

    Prefabrication, a technology that has long been used in Europe, has great potential to help reduce the housing backlog in Indonesia. Project time and cost efficiency, along with multiple other benefits, have shown that the prefabrication system is a valid alternative to conventional construction. One of the biggest problems delaying its adoption is the perception of construction practitioners and home buyers about structural and visual quality, and how the satisfaction of prefabricated-home owners compares with that of owners of conventionally built homes. The Faster, Better, Cheaper concept may serve as a guide for planning a house-building project that can be finished faster and more cost-efficiently while achieving higher user satisfaction.

    User-Based Web Recommendation System: A Case Study of the National Museum of History

    With the explosion and rapidly growing market of the Internet, it is imperative that managers rethink how to use technology, especially the Internet, to deliver services faster, cheaper, and with better quality than their competitors do. A web site provides a communication channel that yields real-time access data and a wealth of information about customers, so the call to provide customers with personalized web pages has grown loud. To achieve personalized web pages, this study proposes a user-behavior-oriented recommendation algorithm that uses the web log files of the National Museum of History.
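    The abstract does not spell out the recommendation algorithm itself; the sketch below is only a hedged illustration of one common user-based approach: parse (visitor, page) pairs out of the web log and recommend pages viewed by visitors with similar browsing histories. The function name, data layout, and cosine similarity measure are assumptions for illustration, not details taken from the study.

# Illustrative user-based recommendation from web-log visits (not the study's
# exact algorithm): score pages the target visitor has not seen by the cosine
# similarity between that visitor's page set and every other visitor's page set.
from collections import defaultdict
import math

def recommend_pages(visits, target_user, top_n=5):
    """visits: iterable of (user_id, page) pairs parsed from the web log."""
    user_pages = defaultdict(set)
    for user, page in visits:
        user_pages[user].add(page)
    target = user_pages.get(target_user, set())
    if not target:
        return []  # no browsing history for this visitor

    scores = defaultdict(float)
    for user, pages in user_pages.items():
        if user == target_user or not pages:
            continue
        # cosine similarity between the two binary visit vectors
        sim = len(target & pages) / math.sqrt(len(target) * len(pages))
        for page in pages - target:      # only pages the target has not yet seen
            scores[page] += sim
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

    For example, recommend_pages(log_pairs, "visitor_42") (with log_pairs a hypothetical parsed log) would return up to five pages weighted toward visitors whose histories overlap with that visitor's.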

    Femtosecond Pulsed Laser Direct Writing System for Photomask Fabrication

    Photomasks are the backbone of the microfabrication industry. Currently they are fabricated by a lithographic process, which is expensive and time consuming since it involves several steps. These issues can be addressed by fabricating photomasks by direct femtosecond laser writing, a single-step process that is comparatively cheaper and faster than lithography. In this paper we discuss our investigations of two laser writing techniques, front-side and rear-side writing, with regard to feature size and edge quality. It is shown conclusively that for patterning a mask, front-side laser writing is the better technique, producing smaller features with better edge quality. Moreover, the energy required for front-side laser writing is considerably lower than that for rear-side laser writing.
    Singapore-MIT Alliance (SMA)

    Paper Session I-A - Benchmarking: A Tool For Sharing and Cooperation

    Kennedy Space Center, America's gateway to the universe, leads the world in preparing and launching missions from Earth to the frontiers of space. NASA's mission is to advance and communicate scientific knowledge and understanding; explore, use, and develop space; and research, develop, and transfer aerospace technologies. Like any private or public organization, NASA and its contractors face the challenge of performing work better, faster, and cheaper, while maintaining world-class levels of safety and quality. To meet this challenge, the KSC team is using benchmarking as one tool to improve the bottom line. Benchmarking is a disciplined approach for comparing and measuring work processes against best-in-class organizations. One outcome of a benchmarking study is the identification of best practices that enable superior performance. These best practices can then be adapted and incorporated to achieve improvements in cost, quality, schedule, and cycle time.

    Faster k-Medoids Clustering: Improving the PAM, CLARA, and CLARANS Algorithms

    Clustering non-Euclidean data is difficult, and one of the most widely used algorithms besides hierarchical clustering is Partitioning Around Medoids (PAM), also simply referred to as k-medoids. In Euclidean geometry the mean, as used in k-means, is a good estimator of the cluster center, but this does not hold for arbitrary dissimilarities. PAM uses the medoid instead: the object with the smallest total dissimilarity to all others in the cluster. This notion of centrality can be used with any (dis-)similarity and is therefore highly relevant to many domains, such as biology, that require Jaccard, Gower, or more complex distances. A key issue with PAM is its high run-time cost. We propose modifications to the PAM algorithm that achieve an O(k)-fold speedup in the second SWAP phase while still finding the same results as the original PAM algorithm. If we slightly relax the choice of swaps performed (at comparable quality), we can further accelerate the algorithm by performing up to k swaps per iteration. With the substantially faster SWAP, we can also explore alternative strategies for choosing the initial medoids. We further show how the CLARA and CLARANS algorithms benefit from these modifications. Our approach can easily be combined with earlier approaches for using PAM and CLARA on big data (some of which use PAM as a subroutine and hence benefit immediately from these improvements), where performance at high k becomes increasingly important. In experiments on real data with k=100, we observed a 200-fold speedup compared to the original PAM SWAP algorithm, making PAM applicable to larger data sets as long as we can afford to compute a distance matrix, and in particular to higher k (at k=2, the new SWAP was only 1.5 times faster, since the speedup is expected to increase with k).
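    As a point of reference for the SWAP phase that the paper accelerates, the sketch below is a naive, unoptimized PAM-style k-medoids loop over a precomputed distance matrix. It illustrates the medoid notion and the greedy swap search only; it is not the authors' FasterPAM, and the random initialization and function name are illustrative assumptions (the original PAM uses a BUILD step for initialization).

# Naive PAM-style k-medoids over a precomputed distance matrix D (n x n).
# Illustration of the medoid concept and the SWAP phase; not the accelerated
# FasterPAM algorithm described in the paper.
import numpy as np

def pam_swap(D, k, max_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    n = D.shape[0]
    medoids = list(rng.choice(n, size=k, replace=False))  # random init (real PAM uses BUILD)

    def total_cost(meds):
        # assign every point to its nearest medoid; cost = sum of those dissimilarities
        return D[:, meds].min(axis=1).sum()

    cost = total_cost(medoids)
    for _ in range(max_iter):
        best = None
        # try every (medoid, non-medoid) swap and keep the most improving one
        for i in range(k):
            for c in range(n):
                if c in medoids:
                    continue
                cand = medoids.copy()
                cand[i] = c
                c_cost = total_cost(cand)
                if best is None or c_cost < best[0]:
                    best = (c_cost, i, c)
        if best is None or best[0] >= cost:
            break  # no improving swap: local optimum reached
        cost, i, c = best
        medoids[i] = c
    labels = D[:, medoids].argmin(axis=1)
    return medoids, labels, cost

    Note that this naive search recomputes the full assignment cost for every candidate swap in every iteration; that per-swap recomputation is exactly the bottleneck the paper's SWAP improvements target.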

    Feedback and time are essential for the optimal control of computing systems

    The performance, reliability, cost, size, and energy usage of computing systems can be improved by one or more orders of magnitude by the systematic use of modern control and optimization methods. Computing systems rely on feedback algorithms to schedule tasks, data, and resources, but the models used to design these algorithms are validated using open-loop metrics. By using closed-loop metrics instead, such as the gap metric developed in the control community, it should be possible to develop improved scheduling algorithms and computing systems that are not over-engineered. Furthermore, scheduling problems are most naturally formulated as constraint satisfaction or mathematical optimization problems, yet these are seldom implemented using state-of-the-art numerical methods, nor do they explicitly take into account the fact that solving the scheduling problem itself takes time. This paper makes the case that recent results in real-time model predictive control, where optimization problems are solved in order to control a process that evolves in time, are likely to form the basis of the scheduling algorithms of the future. We therefore outline some of the research problems and opportunities that could arise by explicitly considering feedback and time when designing optimal scheduling algorithms for computing systems.
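    To make the real-time model predictive control idea concrete, here is a minimal receding-horizon sketch for a scalar linear system, assuming the cvxpy library is available; the dynamics, horizon, and weights are invented for illustration and are not taken from the paper. At each step a finite-horizon optimization is solved, only the first input is applied, and the problem is re-solved as the state evolves, which is the feedback-plus-time structure the paper argues future scheduling algorithms should adopt.

# Minimal receding-horizon MPC sketch (illustrative assumptions throughout):
# a scalar system x_{t+1} = a*x_t + b*u_t with quadratic cost and a bounded input.
import cvxpy as cp

a, b, horizon = 0.9, 0.5, 10
x = 5.0                                   # current state, e.g. a queue backlog
for step in range(20):
    u = cp.Variable(horizon)
    xs = cp.Variable(horizon + 1)
    constraints = [xs[0] == x, cp.abs(u) <= 1.0]
    constraints += [xs[t + 1] == a * xs[t] + b * u[t] for t in range(horizon)]
    cost = cp.sum_squares(xs) + 0.1 * cp.sum_squares(u)
    cp.Problem(cp.Minimize(cost), constraints).solve()
    x = a * x + b * float(u.value[0])     # apply only the first input, then re-plan

    In a scheduling setting, the time spent solving this inner optimization eats into the control interval itself, which is precisely the "scheduling takes time to solve" issue the paper highlights.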