Information Overload: An Overview
For almost as long as there has been recorded information, there has been a perception that humanity is overloaded by it. Concerns about 'too much to read' have been expressed for many centuries, and have become more urgent since the arrival of ubiquitous digital information in the late twentieth century. The historical perspective is a necessary corrective to the common but mistaken view that overload is associated solely with the modern digital information environment, and with social media in particular. However, as society fully experiences Floridi's Fourth Revolution and moves into hyper-history (with society dependent on, and defined by, information and communication technologies) and the infosphere (an information environment distinguished by a seamless blend of online and offline information activity), individuals and societies depend on, and are formed by, information in an unprecedented way, and information overload needs to be taken more seriously than ever. Overload has been claimed to be both the major issue of our time and a complete non-issue. It has been cited as an important factor in, inter alia, science, medicine, education, politics, governance, business and marketing, planning for smart cities, access to news, personal data tracking, home life, use of social media, and online shopping, and has even influenced literature. The information overload phenomenon has been known by many different names, including: information overabundance, infobesity, infoglut, data smog, information pollution, information fatigue, social media fatigue, social media overload, information anxiety, library anxiety, infostress, infoxication, reading overload, communication overload, cognitive overload, information violence, and information assault. There is no single generally accepted definition, but it can best be understood as the situation which arises when there is so much relevant and potentially useful information available that it becomes a hindrance rather than a help. Its essential nature has not changed with changing technology, though its causes and proposed solutions have changed much. The best ways of avoiding overload, individually and socially, appear to lie in a variety of coping strategies, such as filtering, withdrawing, queuing, and 'satisficing'. Better design of information systems, effective personal information management, and the promotion of digital and media literacies also have a part to play. Overload may perhaps best be overcome by seeking a mindful balance in consuming information, and in finding understanding.
Consequentialist Options
According to traditional forms of act-consequentialism, an action is right if and only if no other action in the given circumstances would have better consequences. It has been argued that this view does not leave us enough freedom to choose between actions which we intuitively think are morally permissible but not required. In the first half of this article, I will explain why the previous consequentialist responses to this objection are less than satisfactory. I will then attempt to show that agents have more options on consequentialist grounds than the traditional forms of act-consequentialism acknowledged. This is because having a choice between many permissible options can itself have value.
Toward Better Defining the Field of Agribusiness Management
Keywords: agribusiness management, authority, bounded rationality, diversified growth, resources. Subjects: agribusiness; resource/energy economics and policy. JEL: Q1.
Is there an optimization in bounded rationality? The ratio of aspiration levels
Simon’s (1955) famous paper was one of the first to cast doubt on the validity of rational choice theory; it has been supplemented by many more papers in the last three and a half decades. Nevertheless, rational choice theory plays a crucial role in classical and neoclassical economic theory, which presumes a completely rational agent. The central points characterizing such an agent are: (1) the agent uses all the information that is given to him; (2) the agent has clear preferences with respect to the results of different actions; (3) the agent has adequate competences to optimize his decisions. As an alternative to this conception, Simon (1955) himself suggests the concept of “bounded rationality”. In this context, Simon (1956) discusses a principle which he names the “satisficing principle” (for explanations of this notion cf. Gigerenzer & Todd 1999, p. 13). It assumes that, instead of searching for an optimal action, the search terminates as soon as an alternative has been found that satisfies a given “aspiration level”. It will be demonstrated that although the satisficing principle is nothing but a heuristic, a mathematical optimization is at work when aspiration levels are used in this kind of problem, so the question of the optimal aspiration level can be posed. Optimization within the framework of bounded rationality is thus possible, and the way in which it can be achieved is very simple: optimal thresholds in binary sequential decisions lie at the median.
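A minimal sketch of the satisficing rule the abstract describes, in Python; the function name, the uniform payoff distribution, and the search cap are illustrative assumptions, not the paper's model:

```python
import random

def satisfice(draw, aspiration, max_tries=1000):
    """Satisficing search (Simon 1955, 1956): inspect alternatives one at
    a time and accept the first whose value meets the aspiration level,
    instead of searching for the optimum."""
    for n in range(1, max_tries + 1):
        value = draw()
        if value >= aspiration:
            return value, n          # accepted alternative, search length
    return value, max_tries          # give up with the last alternative seen

# Illustrative run: uniform(0, 1) payoffs, with the aspiration level set
# at the median (0.5), the threshold the abstract identifies as optimal
# for binary sequential decisions.
random.seed(0)
payoff, steps = satisfice(random.random, aspiration=0.5)
print(f"accepted payoff {payoff:.3f} after {steps} draws")
```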
Linear Superiorization for Infeasible Linear Programming
Linear superiorization (abbreviated: LinSup) considers linear programming (LP) problems in which the constraints as well as the objective function are linear. It steers the iterates of a feasibility-seeking iterative process toward feasible points that have lower (not necessarily minimal) values of the objective function than the points that would have been reached by the same feasibility-seeking iterative process without superiorization. Using a feasibility-seeking iterative process that converges even if the linear feasible set is empty, LinSup generates an iterative sequence that converges to a point minimizing a proximity function which measures the violation of the linear constraints. In addition, due to LinSup's repeated objective function reduction steps, such a point will most probably have a reduced objective function value. We present an exploratory experimental result that illustrates the behavior of LinSup on an infeasible LP problem.
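A minimal sketch of the superiorization pattern described above, not the authors' actual algorithm: a simultaneous-projection feasibility step for the constraints A x <= b (which converges toward a proximity-function minimizer even when the constraints are inconsistent) is interleaved with summable perturbations that reduce the linear objective. The function name, the geometric step sizes alpha**k, and the toy problem are assumptions for illustration:

```python
import numpy as np

def linsup(A, b, c, x0, n_iter=500, alpha=0.99):
    """Interleave objective-reduction perturbations with a
    simultaneous-projection feasibility step for A @ x <= b."""
    x = x0.astype(float)
    d = -c / np.linalg.norm(c)            # descent direction for c @ x
    for k in range(n_iter):
        x = x + alpha**k * d              # summable superiorization step
        residual = A @ x - b
        viol = residual > 0
        if viol.any():                    # average the projections onto
            rows = A[viol]                # the violated half-spaces
            steps = (residual[viol] / (rows**2).sum(axis=1))[:, None] * rows
            x = x - steps.mean(axis=0)
    return x

# Infeasible toy LP: x >= 1 and x <= 0 cannot both hold; minimize x.
A = np.array([[-1.0], [1.0]])
b = np.array([-1.0, 0.0])
c = np.array([1.0])
print(linsup(A, b, c, x0=np.array([5.0])))   # near the proximity minimizer 0.5
```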
Constraint-based scheduling
The GERRY scheduling system, developed by NASA Ames with assistance from the Lockheed Space Operations Company and the Lockheed Artificial Intelligence Center, uses a method called constraint-based iterative repair. Using this technique, one encodes both hard rules and preference criteria into data structures called constraints. GERRY repeatedly attempts to improve schedules by seeking repairs for violated constraints. The system provides a general scheduling framework which is being tested on two NASA applications. The larger of the two is the Space Shuttle Ground Processing problem, which entails the scheduling of all the inspection, repair, and maintenance tasks required to prepare the orbiter for flight. The other application involves power allocation for the NASA Ames wind tunnels. Here the system will be used to schedule wind tunnel tests with the goal of minimizing power costs. In this paper, we describe the GERRY system and its application to the Space Shuttle problem. We also speculate as to how the system would be used for manufacturing, transportation, and military problems.
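The repair loop described above can be sketched generically; this is a toy illustration of constraint-based iterative repair, not GERRY's actual data structures. The constraint representation (a violation test paired with a local repair) and the task names are assumptions:

```python
import random

def before(a, b, dur_a):
    """Hard precedence constraint: task a (duration dur_a) must end
    before task b starts; the repair pushes b later."""
    def violated(s):
        return s[a] + dur_a > s[b]
    def repair(s):
        s[b] = s[a] + dur_a
    return violated, repair

def iterative_repair(schedule, constraints, max_steps=100):
    """Repeatedly pick a violated constraint and apply its repair."""
    for _ in range(max_steps):
        broken = [c for c in constraints if c[0](schedule)]
        if not broken:
            return schedule                    # all constraints satisfied
        random.choice(broken)[1](schedule)     # repair one violation
    return schedule                            # best effort within budget

# Toy shuttle-style flow: inspect, then repair, then test.
s = {"inspect": 0, "repair": 0, "test": 1}
cs = [before("inspect", "repair", 2), before("repair", "test", 3)]
print(iterative_repair(s, cs))   # e.g. {'inspect': 0, 'repair': 2, 'test': 5}
```

A full repair system would also score preference constraints (such as power costs) and accept repairs that lower total violation, rather than applying them blindly.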
Optimal modularity: A demonstration of the evolutionary advantage of modular architectures
Modularity is an important concept in evolutionary theorizing, but the lack of a consistent definition renders study difficult. Using the generalised NK-model of fitness landscapes, we differentiate modularity from decomposability. Modular and decomposable systems are both composed of subsystems, but in the former the subsystems are connected via interface standards, while in the latter the subsystems are completely isolated. We derive the optimal level of modularity, which minimises the time required to globally optimise a system, both for the case of two-layered systems and for the general case of multi-layered hierarchical systems containing modules within modules. This derivation supports the hypothesis of modularity as a mechanism to increase the speed of evolution. Our formal definition clarifies the concept of modularity and provides a framework and an analytical baseline for further research.
Keywords: modularity, decomposability, near-decomposability, complexity, NK-model, search, hierarchy.
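For readers unfamiliar with the underlying machinery, here is a minimal sketch of a basic (not the generalised) NK fitness landscape; the lazy table construction and the circular neighbourhood are implementation assumptions:

```python
import itertools, random

def nk_fitness(N, K, seed=0):
    """Basic NK landscape (Kauffman): each of N binary loci contributes a
    random value depending on its own state and K neighbouring states;
    genome fitness is the mean contribution."""
    rng = random.Random(seed)
    table = {}                 # contribution values, fixed on first use
    def fitness(genome):
        total = 0.0
        for i in range(N):
            # locus i plus its K circular neighbours
            states = tuple(genome[(i + j) % N] for j in range(K + 1))
            key = (i, states)
            if key not in table:
                table[key] = rng.random()
            total += table[key]
        return total / N
    return fitness

# Exhaustive global optimum on a small landscape (2^8 genomes).
f = nk_fitness(N=8, K=2)
best = max(itertools.product([0, 1], repeat=8), key=f)
print(best, round(f(best), 3))
```

Increasing K raises the interdependence between loci and makes the landscape more rugged, which is why module boundaries that limit cross-subsystem interaction matter for the speed of search.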
The 2014 International Planning Competition: Progress and Trends
We review the 2014 International Planning Competition (IPC-2014), the eighth in a series of competitions starting in 1998. IPC-2014 was held in three separate parts to assess the state of the art in three prominent areas of planning research: the deterministic (classical) part (IPCD), the learning part (IPCL), and the probabilistic part (IPPC). Each part evaluated planning systems in ways that pushed the edge of existing planner performance by introducing new challenges, novel tasks, or both. The competition again surpassed its predecessor in the number of competitors, highlighting its central role in shaping the landscape of ongoing developments in evaluating planning systems.
