Government Contracts Under Argentine Law: A Comparative Law Overview
This Article will summarize Argentine law on government contracts as it exists today, with special reference to the contracts of the Federal Government. Due to the French origin of the theory and to the fact that this Article is addressed to an American readership, a tentative comparison of the main legal rules of these two countries on the subject will be offered. A discussion of the practical consequences of the application of the administrative contract doctrine, and some possible solutions to the problems created thereby, will then be put forward. But first, the basic issues that this doctrine gives rise to will be defined, and the French origin of the concept of contrat administratif and its reception in Argentina will be explained. The analysis offered will be limited to the general substantive legal regime of Government contracts, leaving aside the issues arising from the contracting procedure, i.e., the rules on competitive bidding. To the extent that this substantive regime results from laws and regulations, only those directly applicable to Government contracts shall be considered. Thus, the analysis will only deal tangentially with the impact on these contracts of the exercise of public powers granted by statutes that may indirectly affect the performance of the private contractor. Since such statutes may reach all Government contracts and not only those defined as “administrative” (unless a tautological definition is used, i.e., one that characterizes as “administrative” only those Government contracts that can be reached by laws granting regulatory or police powers to the Government), it may be argued that the issues raised by those statutes lie outside the scope of the doctrine of the administrative contract. Therefore, the issue of the conflict between the legislative powers of the State and the principle of the sanctity of the contract shall not be treated.
A Survey of Digital Systems Curriculum and Pedagogy in Electrical and Computer Engineering Programs
Digital Systems is one of the basic foundational courses in Electrical and Computer Engineering. One of the challenges in designing and modifying the curriculum for the course is the fast pace of technological change in the area. TTL chips, once in vogue with students building physical circuits, have given way to new paradigms such as FPGA-based synthesis with hardware description languages like VHDL. However, updating a course is not as simple as changing the textbook and the syllabus. A large amount of work is required to select a textbook that fits the course, choose the device to be used, design the laboratory content, and decide how much time to dedicate to each topic. All these issues, and many more, make the decision to update a course a difficult one. For that reason, this paper surveys the pedagogy and methodology used to teach the digital systems curriculum at different universities. The goal is for it to serve as a resource for faculty looking to update or revamp their digital systems curricula. Within the document they will find a comparative study across electrical and computer engineering programs, a list of textbooks, and the devices most commonly used.
Measuring Galactic Extinction: A Test
We test the recently published all-sky reddening map of Schlegel, Finkbeiner
& Davis (1998 [SFD]) using the extinction study of a region in the Taurus dark
cloud complex by Arce & Goodman (1999 [AG]). In their study, AG use four
different techniques to measure the amount and structure of the extinction
toward Taurus, and all four techniques agree very well. Thus we believe that
the AG results are a truthful representation of the extinction in the region
and can be used to test the reliability of the SFD reddening map. The results
of our test show that the SFD all-sky reddening map, which is based on data
from COBE/DIRBE and IRAS/ISSA, overestimates the reddening by a factor of 1.3
to 1.5 in regions of smooth extinction with A_V > 0.5 mag. In some regions of
steep extinction gradients the SFD map underestimates the reddening value,
probably due to its low spatial resolution. We expect that the astronomical
community will use the SFD reddening map extensively, so we offer this Letter
as a cautionary note about using the SFD map in regions of high extinction (A_V
> 0.5 mag), where it may not give accurate reddening values.
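As a rough illustration of how one might fold this result into a reddening pipeline, the Python sketch below rescales SFD values in the smooth, high-extinction regime. The nominal factor of 1.4 (the midpoint of the reported 1.3-1.5 range), the standard R_V = 3.1 conversion between E(B-V) and A_V, and the function itself are assumptions made here for illustration, not a correction prescribed by the Letter.

```python
# Hedged sketch: de-biasing SFD reddening where AG find it overestimated.
R_V = 3.1            # standard Galactic ratio of total to selective extinction
CORRECTION = 1.4     # assumed midpoint of the reported 1.3-1.5 overestimate
AV_THRESHOLD = 0.5   # mag; regime of smooth extinction probed by AG

def corrected_ebv(ebv_sfd: float) -> float:
    """Approximate corrected E(B-V) from an SFD map value (illustrative only)."""
    a_v = R_V * ebv_sfd               # convert reddening to visual extinction
    if a_v > AV_THRESHOLD:            # smooth, high-extinction regime
        return ebv_sfd / CORRECTION   # scale down by the assumed factor
    return ebv_sfd                    # leave low-extinction sightlines unchanged

print(corrected_ebv(0.3))  # A_V ~ 0.93 mag, so the value is scaled down
```

A real pipeline would treat the factor as direction dependent and leave regions of steep extinction gradients, where SFD instead underestimates the reddening, to higher-resolution data.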
Some Computational Aspects of Essential Properties of Evolution and Life
While evolution has inspired algorithmic methods of heuristic optimisation, little has been done in the way of using concepts of computation to advance our understanding of salient aspects of biological evolution. We argue that under reasonable assumptions, interesting conclusions can be drawn that are of relevance to behavioural evolution. We will focus on two important features of life--robustness and fitness optimisation--which, we will argue, are related to algorithmic probability and to the thermodynamics of computation, subjects that may be capable of explaining and modelling key features of living organisms and that can be used to understand and formulate algorithms of evolutionary computation.
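To make the appeal to algorithmic probability concrete, here is a minimal Python sketch, assuming zlib compression length as a crude stand-in for algorithmic (Kolmogorov) complexity; by Levin's coding theorem, lower complexity corresponds to higher algorithmic probability. The proxy and the example sequences are ours, and compression is only a coarse approximation, especially for short objects.

```python
import random
import zlib

def complexity_proxy(s: bytes) -> int:
    """Compressed length as a rough upper bound on algorithmic complexity."""
    return len(zlib.compress(s, 9))

# A recursively generated (highly structured) sequence vs. a pseudo-random one.
structured = b"01" * 500
random.seed(0)
noisy = bytes(random.getrandbits(8) for _ in range(1000))

# The structured sequence compresses far better, i.e. it has much higher
# algorithmic probability under a universal distribution, which is the kind of
# bias toward simple, robust structures the abstract points to.
print(complexity_proxy(structured), complexity_proxy(noisy))
```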
The Thermodynamics of Network Coding, and an Algorithmic Refinement of the Principle of Maximum Entropy
The principle of maximum entropy (Maxent) is often used to obtain prior probability distributions, yielding a Gibbs measure under some restriction that gives the probability that a system will be in a certain state relative to the rest of the elements in the distribution. Classical entropy-based Maxent, however, conflates all distinct degrees of randomness and pseudo-randomness. Here we take into consideration the generative mechanism of the systems in the ensemble in order to separate objects that comply with the principle under some restriction and whose entropy is maximal, yet can be generated recursively, from those that are actually algorithmically random, thereby offering a refinement of classical Maxent. We take advantage of a causal algorithmic calculus to derive a thermodynamic-like result based on how difficult it is to reprogram a computer code. Using the distinction between computable and algorithmic randomness, we quantify the cost in information loss associated with reprogramming. To illustrate this we apply the algorithmic refinement of Maxent to graphs and introduce a Maximal Algorithmic Randomness Preferential Attachment (MARPA) Algorithm, a generalisation of previous approaches. We discuss practical implications for the evaluation of network randomness. Our analysis suggests that the reprogrammability asymmetry originates from a non-monotonic relationship to algorithmic probability, and it motivates further study of the origin and consequences of this asymmetry, of reprogrammability, and of computation.
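As a sketch of the selection principle behind a MARPA-style growth rule, the Python fragment below grows a graph by attaching each new node wherever a compression-based complexity proxy of the adjacency matrix is maximal. The networkx/zlib machinery and the proxy itself are assumptions made here for illustration; the paper's MARPA is defined in terms of estimates of algorithmic randomness, not of compressibility.

```python
import zlib
import networkx as nx

def complexity_proxy(g: nx.Graph) -> int:
    """Compressed length of the flattened adjacency matrix (crude proxy)."""
    bits = "".join(nx.to_numpy_array(g, dtype=int).astype(str).flatten())
    return len(zlib.compress(bits.encode()))

def marpa_like_growth(n_nodes: int) -> nx.Graph:
    """Attach each new node where the complexity proxy of the graph is maximal."""
    g = nx.Graph()
    g.add_edge(0, 1)                          # seed graph
    for new in range(2, n_nodes):
        best_target, best_score = None, -1
        for target in list(g.nodes):          # try every possible attachment
            trial = g.copy()
            trial.add_edge(new, target)
            score = complexity_proxy(trial)
            if score > best_score:
                best_target, best_score = target, score
        g.add_edge(new, best_target)          # keep the most "random-looking" choice
    return g

print(nx.degree_histogram(marpa_like_growth(20)))
```

Preferential-attachment baselines instead pick targets in proportion to degree; the point of the refinement is that the attachment choice is driven by an algorithmic, rather than purely statistical, notion of randomness.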