
    Towards Exascale Scientific Metadata Management

    Advances in technology and computing hardware are enabling scientists from all areas of science to produce massive amounts of data using large-scale simulations or observational facilities. In this era of data deluge, effective coordination between the data production and the analysis phases hinges on the availability of metadata that describe the scientific datasets. Existing workflow engines capture only a limited form of metadata to provide provenance information about the identity and lineage of the data. However, much of the data produced by simulations, experiments, and analyses still needs to be annotated manually in an ad hoc manner by domain scientists. Systematic and transparent acquisition of rich metadata becomes a crucial prerequisite to sustain and accelerate the pace of scientific innovation. Yet, ubiquitous and domain-agnostic metadata management infrastructure that can meet the demands of extreme-scale science is conspicuously absent. To address this gap in scientific data management research and practice, we present our vision for an integrated approach that (1) automatically captures and manipulates information-rich metadata while the data is being produced or analyzed and (2) stores metadata within each dataset to permeate metadata-oblivious processes and to query metadata through established and standardized data access interfaces. We motivate the need for the proposed integrated approach using applications from plasma physics, climate modeling, and neuroscience, and then discuss research challenges and possible solutions.
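
    The abstract does not commit to a particular file format or API; the sketch below merely illustrates idea (2), embedding metadata inside the dataset itself so it travels with the data, using HDF5 attributes via h5py. The attribute names and helper functions are illustrative assumptions, not the paper's design.

```python
# Minimal sketch: store provenance metadata as HDF5 attributes on the dataset,
# so it is queryable through the standard HDF5 access interface. The attribute
# names and the h5py-based approach are assumptions, not the paper's design.
import h5py
import numpy as np
from datetime import datetime, timezone

def write_with_metadata(path, name, array, provenance):
    """Write an array and embed provenance metadata as HDF5 attributes."""
    with h5py.File(path, "w") as f:
        dset = f.create_dataset(name, data=array)
        # Metadata travels with the data: metadata-oblivious tools still read
        # the array, while metadata-aware tools can query the attributes.
        dset.attrs["created_utc"] = datetime.now(timezone.utc).isoformat()
        for key, value in provenance.items():
            dset.attrs[key] = value

def read_metadata(path, name):
    """Query embedded metadata through the standard HDF5 interface."""
    with h5py.File(path, "r") as f:
        return dict(f[name].attrs)

if __name__ == "__main__":
    write_with_metadata(
        "run042.h5", "temperature", np.random.rand(64, 64),
        {"code": "plasma-sim", "version": "1.3", "input_deck": "deck042.toml"},
    )
    print(read_metadata("run042.h5", "temperature"))
```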

    Feasibility Study of a Campus-Based Bikesharing Program at UNLV

    Bikesharing systems have been deployed worldwide as a transportation demand management strategy to encourage active modes and reduce single-occupant vehicle travel. These systems have been deployed at universities, either as part of a city program or as stand-alone systems, to serve both commute trips and trips on campus. The Regional Transportation Commission of Southern Nevada (RTCSNV) has built a public bikesharing system in downtown Las Vegas, approximately five miles from the University of Nevada, Las Vegas (UNLV). This study analyzes the feasibility of a campus-based bikesharing program at UNLV. Through a review of the literature, a survey of UNLV students and staff, and field observations and analysis of potential bikeshare station locations, the authors determined that a bikesharing program is feasible at UNLV.

    Evolution of Supply Chain Collaboration: Implications for the Role of Knowledge

    Increasingly, research across many disciplines has recognized the shortcomings of the traditional “integration prescription” for inter-organizational knowledge management. This research conducts several simulation experiments to study the effects of different rates of product change, different demand environments, and different economies of scale on the level of integration between firms at different levels in the supply chain. The underlying paradigm shifts from a static, steady-state view to a dynamic, complex adaptive systems and knowledge-based view of supply chain networks. Several research propositions are presented that use the role of knowledge in the supply chain to provide predictive power for how supply chain collaboration or integration should evolve. Implications are drawn for both managerial practice and future research.
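
    The abstract does not detail the simulation mechanics; the toy sketch below only illustrates the factorial experiment design it describes, sweeping product-change rate, demand environment, and economies of scale and recording a resulting integration level. The factor values and the response function are invented for illustration.

```python
# Hypothetical skeleton of the kind of factorial simulation experiment the
# abstract describes. All factor values and the toy response function are
# illustrative assumptions, not the paper's model.
import itertools
import random

CHANGE_RATES = [0.05, 0.20, 0.50]            # product redesigns per period
DEMAND_ENVS = ["stable", "seasonal", "volatile"]
SCALE_ECONOMIES = [1.0, 2.0, 4.0]            # cost advantage of consolidation

def simulate_integration(change_rate, demand_env, scale, periods=200, seed=0):
    """Toy model: integration pays off under stable demand and large scale
    economies, while rapid product change erodes its value."""
    rng = random.Random(seed)
    volatility = {"stable": 0.1, "seasonal": 0.3, "volatile": 0.6}[demand_env]
    level = 0.5
    for _ in range(periods):
        benefit = scale * (1 - volatility) - change_rate
        level = min(1.0, max(0.0, level + 0.01 * benefit + rng.gauss(0, 0.01)))
    return level

for rate, env, scale in itertools.product(CHANGE_RATES, DEMAND_ENVS,
                                          SCALE_ECONOMIES):
    print(f"rate={rate:.2f} env={env:9s} scale={scale:.1f} "
          f"-> integration={simulate_integration(rate, env, scale):.2f}")
```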

    A Development of Computer Aided Program for Aluminium Die-Casting Mold Design

    At each stage of design, aluminium die-casting mold design must take many factors and conditions into account. The design has traditionally relied on experience and trial-and-error, which can cause defects such as misruns, cold shuts, cold shots, penetration, or instability during or after the molding process. This research proposes the development of a Computer Aided Program that supports aluminium die-casting mold design by selecting and estimating initial design values under standard condition requirements. The program is implemented in C# as an accessible, easy-to-use platform built on a content database of reference theory, equations, and design principles, including mold parameters. After the proper input conditions of the mold design are identified, a die-casting simulation in MAGMASOFT is performed to verify the material-flow conditions under the suggested parameters. The simulated results serve as a guideline for mold designers, providing the essential mold dimensions and cold-chamber injection conditions as easy-to-access graphical images and numerical values. Applying the developed program can reduce the time spent on the mold design stage and the number of defects in the resulting cast parts.
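
    The paper's equation database is not reproduced in the abstract; as one example of the kind of initial-estimate calculation such a program automates, the sketch below applies the standard die-casting clamping-force rule (cavity pressure times projected area times a safety factor). The function name and the numerical inputs are illustrative, not taken from the paper.

```python
# Worked example: sizing the locking force of a cold-chamber machine from the
# standard rule F = p * A * k. The numbers below are illustrative assumptions.
def required_clamping_force_kn(projected_area_cm2: float,
                               cavity_pressure_mpa: float,
                               safety_factor: float = 1.2) -> float:
    """Estimate the machine locking force needed to keep the die closed."""
    # 1 MPa * 1 cm^2 = 100 N = 0.1 kN, hence the division by 10
    return cavity_pressure_mpa * projected_area_cm2 * safety_factor / 10.0

# Example: 250 cm^2 projected area cast at 70 MPa cavity pressure
force_kn = required_clamping_force_kn(250.0, 70.0)
print(f"Select a cold-chamber machine with >= {force_kn:.0f} kN locking force")
```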

    Analyzing the Performance of Data Replication and Data Partitioning in the Cloud: the Beowulf Approach

    Applications deployed in the Cloud usually come with dedicated performance and availability requirements. These can be met by replicating data across several sites and/or by partitioning the data. Data replication makes it possible to parallelize read requests and thus to decrease data access latency, but it induces significant overhead for synchronizing updates. Partitioning, in contrast, is highly beneficial if all the data accessed by an application is located at the same site, but it again necessitates coordination if distributed transactions are needed to serve applications. In this paper, we analyze three protocols for distributed data management in the Cloud, namely Read-One Write-All-Available (ROWAA), Majority Quorum (MQ), and Data Partitioning (DP), all in a configuration that guarantees strong consistency. We introduce Beowulf, a meta protocol based on a comprehensive cost model that integrates the three protocols and dynamically selects the protocol with the lowest latency for a given workload. In the evaluation, we compare the predictions of the Beowulf cost model against a baseline evaluation. The results show the effectiveness of the analytical model and its precision in selecting the best-suited protocol for a given workload.
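
    Beowulf's actual cost model is not given in the abstract; the toy sketch below only illustrates the meta-protocol idea of scoring each strongly consistent protocol under the current workload and routing requests to the cheapest one. The latency formulas, constants, and workload parameters are placeholder assumptions.

```python
# Toy meta-protocol: estimate each protocol's latency for the current workload
# and pick the minimum. The cost formulas are illustrative placeholders, not
# the paper's cost model.
from dataclasses import dataclass

@dataclass
class Workload:
    read_fraction: float            # share of read requests
    distributed_tx_fraction: float  # share of transactions spanning partitions
    sites: int

def rowaa_latency(w: Workload) -> float:
    # Reads hit one replica; writes must synchronize all available replicas.
    return w.read_fraction * 1.0 + (1 - w.read_fraction) * w.sites * 1.0

def majority_quorum_latency(w: Workload) -> float:
    # Both reads and writes contact a majority of the sites.
    return (w.sites // 2 + 1) * 0.8

def partitioning_latency(w: Workload) -> float:
    # Local transactions are cheap; distributed ones pay coordination cost.
    return 1.0 + w.distributed_tx_fraction * w.sites * 2.0

PROTOCOLS = {"ROWAA": rowaa_latency, "MQ": majority_quorum_latency,
             "DP": partitioning_latency}

def select_protocol(w: Workload) -> str:
    """Pick the protocol with the lowest predicted latency for workload w."""
    return min(PROTOCOLS, key=lambda name: PROTOCOLS[name](w))

print(select_protocol(Workload(read_fraction=0.9,
                               distributed_tx_fraction=0.05, sites=5)))
```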

    Third Semester and Master’s Thesis Ideas 2018: M.Sc. in Civil and Structural Engineering


    Leveraging risk management in the sales and operations planning process

    Thesis (M.Eng. in Logistics)--Massachusetts Institute of Technology, Engineering Systems Division, 2008. Includes bibliographical references (leaves 71-72). The objective of this thesis project is to analyze how companies can utilize risk management techniques in their sales and operations planning (S&OP) process. S&OP is a strategy used to integrate planning and processes across functional groups within a company, such as sales, operations, and finance. A large body of academic and industry literature already exists showing that S&OP can integrate people, processes, and technology, leading to improved operational performance for a business. However, little research has been done on applying risk management techniques to the S&OP process. When companies use S&OP to align their demand, supply, capacity, and production based on factors such as history, pricing, promotions, competition, and technology, they rarely factor uncertainty and risk into the process. Furthermore, for those companies that do implement risk management in the S&OP process, there is no consensus in the business community about how to do so accurately and effectively. Our approach to understanding risk management and its place in the S&OP process was four-fold. First, we conducted a literature review to gain a basic understanding of the S&OP process and current risk management strategies. Next, we conducted thirteen hour-long phone interviews with practitioners and thought leaders in the field of sales and operations planning to gain insight into how companies currently discuss, assess, and act upon uncertainty within the S&OP process. Third, we conducted an online survey of companies and consultants working in the field of S&OP to see how they currently discuss and incorporate uncertainty into their S&OP work. Lastly, we visited SemiCo, a leading global supplier of high-performance semiconductor products, to gain first-hand insight into the S&OP process of a large multinational company and to complete a brief case study of how risk management is currently utilized within this company's S&OP process. Finally, we synthesized these four sources of information to develop a common framework and recommendations that companies can use to understand best practices for incorporating risk management into the S&OP process. by Yanika Daniels and Timothy Kenny. M.Eng. in Logistics