
    Fuzzy Interval Matrices, Neutrosophic Interval Matrices and their Applications

    The new concept of fuzzy interval matrices is introduced in this book for the first time. The authors not only introduce the notions of fuzzy interval matrices, interval neutrosophic matrices and fuzzy neutrosophic interval matrices, but also demonstrate some of their applications when the data under study is unsupervised and when several experts analyze the problem. Further, the authors introduce multi-expert models built from these three new types of interval matrices. The multi-expert models dealt with in this book are FCIMs, FRIMs, FCInMs, FRInMs, IBAMs, IBBAMs, nIBAMs, FAIMs, FAnIMs, etc. Illustrative examples are given so that the reader can follow these concepts easily. The book has three chapters. The first chapter is introductory in nature and makes the book self-contained. Chapter two introduces the concept of fuzzy interval matrices; the notions of fuzzy interval matrices, neutrosophic interval matrices and fuzzy neutrosophic interval matrices also find applications to Markov chains and Leontief economic models. Chapter three applies fuzzy interval matrices and neutrosophic interval matrices to real-world problems by constructing the models already mentioned. These models are mainly useful when the data is unsupervised and a multi-expert model is needed. The new concepts of fuzzy interval matrices and neutrosophic interval matrices will find applications in engineering, medical, industrial, social and psychological problems. A long list of references is provided to help the interested reader. Comment: 304 pages.
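
    To make the notion concrete, the sketch below stores a fuzzy interval matrix as a pair of lower- and upper-bound matrices with entries in [0, 1] and applies the usual max-min composition bound-wise. The class name, the NumPy representation and the composition rule are illustrative assumptions, not the book's own notation or model definitions.

    # Minimal sketch: a fuzzy interval matrix stored as lower/upper bound
    # matrices with entries in [0, 1]. Names and representation are
    # illustrative assumptions, not the book's notation.
    import numpy as np

    class FuzzyIntervalMatrix:
        def __init__(self, lower, upper):
            self.lower = np.clip(np.asarray(lower, dtype=float), 0.0, 1.0)
            self.upper = np.clip(np.asarray(upper, dtype=float), 0.0, 1.0)
            assert (self.lower <= self.upper).all(), "each entry must be an interval [a-, a+]"

        def compose(self, other):
            """Max-min composition applied bound-wise."""
            def maxmin(a, b):
                # (a o b)[i, j] = max_k min(a[i, k], b[k, j])
                return np.maximum.reduce(np.minimum(a[:, :, None], b[None, :, :]), axis=1)
            return FuzzyIntervalMatrix(maxmin(self.lower, other.lower),
                                       maxmin(self.upper, other.upper))

    # Example: two 2x2 fuzzy interval matrices built from expert estimates.
    A = FuzzyIntervalMatrix([[0.2, 0.5], [0.1, 0.7]], [[0.4, 0.6], [0.3, 0.9]])
    B = FuzzyIntervalMatrix([[0.3, 0.2], [0.6, 0.4]], [[0.5, 0.4], [0.8, 0.6]])
    C = A.compose(B)
    print(C.lower, C.upper, sep="\n")

    Because max and min are monotone, composing the lower bounds and the upper bounds separately yields valid bounds for the interval-valued composition, which is the property such a representation relies on.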

    AdaNET research project

    The components necessary for the successful commercialization of an Ada Technology Transition Network are reported in detail. The Organizational Plan presents the planned structure for services development and the technical transition of AdaNET services to potential user communities. The Business Plan is the operational plan for the AdaNET service as a commercial venture. The Technical Plan is the plan from which AdaNET can be designed, including a detailed requirements analysis. Also contained are an analysis of user fees and charges and a proposed user fee schedule.

    The dark side of artificial intelligence in retail services innovation

    Many academic scholars argue that the goal of using artificial intelligence (hereafter, AI) in business has been to serve humans in performing their jobs. Yet some scholars refute such arguments and warn against potential future threats of AI to humankind. AI, or machine intelligence, comprises three main aspects, i.e., learning, reasoning, and self-correction, which together make up the artificial mind. In retailing, AI is progressively becoming a major theme of innovation, and retailers are rapidly increasing their use of machine intelligence to simulate human intelligence efficiently and become more competitive by cutting costs and improving customer journeys. However, such benefits can prove catastrophic in the long run. This chapter therefore attempts to synthesize current research on the use of AI in retailing and to identify the possible benefits and ramifications for the human pillars of the retail process (i.e., employers, employees, and customers). Finally, the chapter reflects on the relevant literature to draw implications for future research and industry.

    AdaNET executive summary

    The goal of AdaNET is to transfer existing and emerging software engineering technology from the Federal government to the private sector. Presented here are the views and perspectives of the current project participants on long- and short-term goals for AdaNET; the organizational structure; resources and returns; a summary of the identified AdaNET services; and a summary of the organizational model currently under discussion.

    U.S. agricultural policy and the demand for imported beef

    An econometric model of the U.S. livestock-feed grain subsector was constructed to investigate the impact of agricultural policies and other exogenous variables upon the U.S. livestock-feed subsector and on the demand for imported beef. The model was estimated with quarterly data to allow the assumption of predetermined beef import levels, and total beef consumption was disaggregated into table and processing quality classes to avoid the possibility of bias from mis-classification of some low quality beef under the fed/nonfed classification procedure. The basic specifications for the model equations were derived from economic theory, with modification for the capital-good nature of livestock inventories and policy intervention in the feed grain market. The model was stable, with cyclical patterns corresponding to both the cattle and the hog cycle. The estimated impacts of changes in the level of beef imports were generally in line with the estimates obtained from earlier studies. Comparison of the factors influencing the excess demand for imports and the beef import quota revealed some divergent trends. An increase in corn prices, for instance, increased the long-run excess demand but reduced the import quota, while an increase in consumer incomes increased the quota but had virtually no effect on the level of excess demand for beef. Comparison of estimates obtained using the fed/nonfed and table/processing approaches to disaggregation suggested that the bias resulting from use of the fed/nonfed approach may tend to underestimate the effect of imports on U.S. beef prices, rather than to overestimate it as has previously been suggested.
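
    To show the flavour of such a specification, the sketch below fits a toy reduced-form excess-demand equation for imported beef to simulated quarterly data by ordinary least squares. The variable names, the linear functional form, the seasonal dummies and the randomly generated data are illustrative assumptions only; they are not the dissertation's equations or estimates.

    # Toy reduced-form excess-demand equation on simulated quarterly data.
    # Variables, functional form and data are illustrative only; they are
    # not the dissertation's model or results.
    import numpy as np

    rng = np.random.default_rng(0)
    T = 80  # 20 years of quarterly observations

    us_beef_price = rng.normal(100, 10, T)   # hypothetical U.S. table-beef price index
    corn_price    = rng.normal(50, 5, T)     # hypothetical corn price index
    income        = rng.normal(200, 20, T)   # hypothetical consumer income index
    quarter       = np.tile(np.eye(4), (T // 4, 1))[:, 1:]  # seasonal dummies (Q2-Q4)

    # Simulated "observed" import demand, just so the regression has a target.
    imports = (5.0 + 0.8 * us_beef_price + 0.3 * corn_price
               + 0.01 * income + rng.normal(0, 5, T))

    X = np.column_stack([np.ones(T), us_beef_price, corn_price, income, quarter])
    beta, *_ = np.linalg.lstsq(X, imports, rcond=None)
    print("OLS coefficients (const, beef price, corn price, income, Q2-Q4):")
    print(np.round(beta, 3))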

    The Federal Conference on Intelligent Processing Equipment

    Research and development projects involving intelligent processing equipment within the following U.S. agencies are addressed: Department of Agriculture, Department of Commerce, Department of Energy, Department of Defense, Environmental Protection Agency, Federal Emergency Management Agency, NASA, National Institutes of Health, and the National Science Foundation.

    Effectiveness of Preflighting Software in a Design Workflow

    The variables that can potentially impact the print quality of a digital file have necessitated the additional workflow step of preflighting. Preflighting is a process by which all elements of a digital file are checked to ensure that they will work properly in a production workflow. This enables problems to be fixed as early as possible in the workflow so that they do not hold up the printing process. Preflighting was originally a manual process, but it can now be handled by software. The effectiveness and accuracy of preflighting software were tested by creating files that include common errors, such as fonts not embedded or missing, wrong color space, image resolution too low, wrong file formats, and improperly set bleeds. These files were run through preflighting software, and a record was kept of whether or not the preflight software identified these common errors. Printed output from these files was then compared to the list of flagged errors from the reports generated by the preflight software. In turn, the output was verified to determine whether the errors affected the final output. Adobe InDesign CS Preflight, Markzware FlightCheck 5.5, and PitStop Professional 6.1 were selected to determine their effectiveness in detecting and reporting errors that most commonly impact print reproduction quality. The tests conducted showed that none of the three software packages tested was completely effective in detecting and reporting errors. FlightCheck was the most effective software in detecting errors in the native and PDF files. PitStop flagged more errors that affected output, but all programs flagged too many errors that did not affect output. InDesign Preflight was only effective at flagging RGB errors, while FlightCheck was the most effective at catching common errors. Both FlightCheck and PitStop had problems detecting image color and file format problems in PDF files. This leads to the conclusion that it is best not to rely completely on preflighting software: use proper file creation techniques and use preflight as a secondary method to find errors.
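
    As a concrete illustration of one automated preflight rule, the sketch below walks a PDF's page font resources with the pypdf library and flags fonts whose descriptors carry no embedded font program. The file name is a placeholder, and the check covers only font embedding; the packages tested here also examine color spaces, image resolution, file formats and bleeds, among other conditions.

    # Minimal font-embedding preflight check. Assumes the pypdf library is
    # installed; "sample.pdf" is a placeholder path. Real preflight software
    # checks many more conditions than this single rule.
    from pypdf import PdfReader

    EMBEDDED_KEYS = ("/FontFile", "/FontFile2", "/FontFile3")

    def unembedded_fonts(path):
        """Return (page number, font name, issue) tuples for suspect fonts."""
        problems = []
        reader = PdfReader(path)
        for page_no, page in enumerate(reader.pages, start=1):
            if "/Resources" not in page:
                continue
            resources = page["/Resources"]
            if "/Font" not in resources:
                continue
            fonts = resources["/Font"]
            for name, ref in fonts.items():
                font = ref.get_object()
                descriptor = font.get("/FontDescriptor")
                # Composite (Type0) fonts keep the descriptor on a descendant font.
                if descriptor is None and "/DescendantFonts" in font:
                    descendant = font["/DescendantFonts"][0].get_object()
                    descriptor = descendant.get("/FontDescriptor")
                if descriptor is None:
                    problems.append((page_no, str(name), "no font descriptor (font not embedded)"))
                    continue
                descriptor = descriptor.get_object()
                if not any(key in descriptor for key in EMBEDDED_KEYS):
                    problems.append((page_no, str(name), "font not embedded"))
        return problems

    for page_no, font_name, issue in unembedded_fonts("sample.pdf"):
        print(f"page {page_no}: {font_name}: {issue}")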

    One-pass procedures of unequal probability sampling.

    by Kwok-fai Lee. Thesis (M.Phil.)--Chinese University of Hong Kong, 1993. Includes bibliographical references (leaves 85-86).
    CHAPTER 1 --- INTRODUCTION --- p.1
    §1.1 --- Unequal probabilities sampling schemes without replacement --- p.1
    §1.2 --- Estimation problems in unequal probabilities sampling scheme without replacement --- p.3
    §1.3 --- Classification of unequal probabilities sampling schemes without replacement --- p.5
    CHAPTER 2 --- ONE-PASS ALGORITHMS --- p.9
    §2.1 --- Characteristics of one-pass algorithms --- p.9
    §2.2 --- Existing one-pass algorithms --- p.10
    §2.2.1 --- Chao's one-pass algorithm
    §2.2.2 --- Other algorithms
    §2.3 --- Second order inclusion probabilities --- p.14
    CHAPTER 3 --- A NEW ONE-PASS ALGORITHM --- p.17
    §3.1 --- Introduction --- p.17
    §3.2 --- Examination of all possible cases --- p.20
    §3.3 --- Initialization --- p.57
    §3.4 --- Final step --- p.61
    §3.5 --- Theorems --- p.64
    §3.6 --- Worked example --- p.76
    CHAPTER 4 --- CONCLUSION --- p.83
    References --- p.8
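
    The flavour of such one-pass procedures can be seen in a simplified sketch of the basic step of Chao's algorithm, reviewed in Chapter 2: each incoming unit is accepted with probability n·w_k / W_k and, if accepted, replaces a uniformly chosen unit in the current sample. The sketch ignores the initialization and over-weight adjustments (units whose size measure forces certain inclusion) that a complete procedure, including the thesis's new algorithm, must handle.

    # Simplified one-pass unequal-probability sampling without replacement
    # (the basic step of Chao's procedure). Assumes no unit's weight is
    # large enough to force certain inclusion; a full implementation must
    # handle that case, and the initial fill, separately.
    import random

    def chao_sample(stream, n, rng=random):
        """One pass over (unit, weight) pairs, keeping a sample of size n."""
        sample = []
        total_weight = 0.0
        for unit, weight in stream:
            total_weight += weight
            if len(sample) < n:
                sample.append(unit)              # fill the initial sample
                continue
            p = n * weight / total_weight        # first-order inclusion probability
            if rng.random() < p:
                sample[rng.randrange(n)] = unit  # replace a uniformly chosen unit
        return sample

    # Example: size measures proportional to 1..10, sample of size 3.
    units = [(f"u{i}", float(i)) for i in range(1, 11)]
    print(chao_sample(units, 3))

    With that caveat, each unit's first-order inclusion probability after the single pass is approximately proportional to its weight.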