60 research outputs found

    SCU Faculty & Staff Housing Development

    Due to high housing costs in the Bay Area, Santa Clara University (SCU) faculty and staff often have to live farther from campus, where housing is more affordable, which lengthens their commutes and increases the environmental impact of transportation. SCU has therefore expressed the need to provide affordable housing for faculty and staff who do not earn enough to live in the City of San Jose or the County of Santa Clara. The project proposed in this report represents the efforts of SCU civil engineering students to address the social, sustainability, and economic concerns held by the Civil, Environmental and Sustainable Engineering (CESE) Department in the design and construction of a proposed housing development for SCU faculty and staff. The team of civil engineering students, RADS Construction, LLC, provided design recommendations for the 1200 Campbell Avenue development. The team drew initial inspiration from the Planned Development Zoning Submittal received from the City of San Jose, which contained architectural drawings provided by Studio TSquare, and also received a map of the water facilities at the proposed project site from San Jose Water. Using the architectural drawings and the water facilities map, RADS Construction designed the structural and stormwater management plans for the development; designed potable water and wastewater piping layouts; and created a construction schedule, a waste management plan, and a Building Information Modeling (BIM) model. The team decided to change the incubator space originally proposed in the architectural drawings into a commercial space so that shops and other small businesses can use the new building. This change addressed the concerns of stakeholders near the project site, who wanted to benefit from the new building to offset the additional traffic it brings into the neighborhood. Through these deliverables, RADS Construction met the social and economic needs of SCU’s faculty and staff while fulfilling the CESE Department and School of Engineering standards for socially, economically, and environmentally sustainable engineering.

    Markov Chain Monte Carlo in a Dynamical System of Information Theoretic Particles

    In Bayesian learning, the posterior probability density of a model parameter is estimated from the likelihood function and the prior probability of the parameter, and the estimate is refined as more evidence becomes available. However, any non-trivial Bayesian model requires the computation of an intractable integral to obtain the probability density function (PDF) of the evidence. Markov Chain Monte Carlo (MCMC) is a well-known algorithm that solves this problem by generating samples of the posterior distribution directly, without computing this intractable integral. We present a novel perspective on the MCMC algorithm that views the samples of a probability distribution as a dynamical system of Information Theoretic particles in an Information Theoretic field. As the algorithm probes this field with a test particle, the test particle is subjected to Information Forces from the other Information Theoretic particles in the field. We use Information Theoretic Learning (ITL) techniques based on Rényi’s α-entropy function to derive an equation for the gradient of the Information Potential energy of the dynamical system of Information Theoretic particles. Using this equation, we compute the Hamiltonian of the dynamical system from the Information Potential energy and the kinetic energy, and use the Hamiltonian to generate the Markovian state trajectories of the system.
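
    The abstract does not spell out the derivation, but a minimal sketch of the idea, assuming the quadratic (α = 2) Rényi case in which the information potential reduces to a mean of Gaussian kernel interactions, might look like the following. The function names, the choice of potential energy U(x) = -V(x), the kernel width, and the leapfrog step size are illustrative assumptions, not the authors' formulation.

```python
# Illustrative sketch (not the authors' exact formulation): the ITL information
# potential between a test particle x and the field particles {x_j}, its
# gradient (the "Information Force"), and one leapfrog step of the resulting
# Hamiltonian dynamics. Kernel width, step size and U(x) = -V(x) are assumptions.
import numpy as np

def gaussian_kernel(d, sigma):
    """Gaussian kernel on pairwise differences d, shape (N, dim)."""
    return np.exp(-np.sum(d * d, axis=1) / (2.0 * sigma ** 2))

def information_potential(x, field, sigma):
    """V(x): mean kernel interaction between the test particle and the field."""
    return np.mean(gaussian_kernel(x - field, sigma))

def information_force(x, field, sigma):
    """Force on the test particle: +grad V(x), i.e. -grad U(x) with U = -V."""
    d = x - field                                  # (N, dim)
    k = gaussian_kernel(d, sigma)                  # (N,)
    return -np.mean(k[:, None] * d, axis=0) / sigma ** 2

def hamiltonian(x, p, field, sigma):
    """H = U(x) + kinetic energy, with potential energy U(x) = -V(x)."""
    return -information_potential(x, field, sigma) + 0.5 * np.dot(p, p)

def leapfrog_step(x, p, field, sigma=0.5, step=0.05):
    """One leapfrog update of the test particle's position and momentum."""
    p = p + 0.5 * step * information_force(x, field, sigma)
    x = x + step * p
    p = p + 0.5 * step * information_force(x, field, sigma)
    return x, p

# Toy usage: field particles drawn from a 2-D Gaussian, one probing test particle.
rng = np.random.default_rng(0)
field = rng.normal(size=(200, 2))
x, p = np.array([2.0, -1.0]), rng.normal(size=2)
for _ in range(50):
    x, p = leapfrog_step(x, p, field)
```

    A full sampler would wrap each leapfrog trajectory in a Metropolis accept/reject step based on the change in the Hamiltonian, which is what makes the resulting state sequence a Markov chain.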

    Multi-instance contingent fusion for the verification of infant fingerprints

    It is imperative to establish an automated system for the identification of neonates (1–28 days old) and infants (29 days–12 months old) using the readily accessible 500 ppi fingerprint reader. Such a system is crucial for addressing newborn swapping, identifying missing children, monitoring immunisation records, maintaining comprehensive medical histories, and other related purposes. The objective of this study is to demonstrate the potential for future identification of infants using fingerprints obtained from a 500 ppi fingerprint reader, by employing a fusion technique that combines multiple fingerprint instances, specifically the left thumb and the right index finger. The fingerprints were acquired from babies who were between one day and six months old at the enrolment session, and the sum-score fusion algorithm was implemented. This approach yielded verification accuracies of 73.8%, 69.05%, and 57.14% for intervals of 1 month, 3 months, and 6 months, respectively, between the enrolment and query fingerprints.
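
    Sum-score fusion itself is straightforward: each finger instance is matched against its enrolment template, the resulting scores are typically normalised onto a common range, and the normalised scores are summed before thresholding. A minimal sketch, with assumed normalisation bounds and threshold (the study's actual matcher and settings are not given in the abstract), is below.

```python
# Illustrative sketch of sum-score fusion over two finger instances (left thumb
# and right index). The single-finger matcher, the min-max normalisation bounds
# and the acceptance threshold are assumptions made for the sketch.
def min_max_normalise(score, lo=0.0, hi=500.0):
    """Map a raw matcher score into [0, 1] using assumed score bounds."""
    return (score - lo) / (hi - lo)

def sum_score_fusion(thumb_score, index_score, threshold=1.0):
    """Fuse the two normalised single-finger match scores by summation and
    compare the fused score against a verification threshold."""
    fused = min_max_normalise(thumb_score) + min_max_normalise(index_score)
    return fused, fused >= threshold          # (fused score, accept decision)

# Example: scores from matching query fingerprints against enrolment templates.
fused, accepted = sum_score_fusion(thumb_score=310.0, index_score=275.0)
```

    In practice the threshold would be chosen on a development set to balance false accepts against false rejects.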

    Adaptive Nonlinear System Identification: The Volterra and Wiener Model Approaches

    No full text

    Adaptive Signal Processing and Machine Learning Using Entropy and Information Theory

    No full text
    This Special Issue on “Adaptive Signal Processing and Machine Learning Using Entropy and Information Theory” grew out of observations of a recent trend in the literature [...]

    FPSoC-Based Architecture for a Fast Motion Estimation Algorithm in H.264/AVC

    No full text
    There is an increasing need for high-quality video on low-power, portable devices, with target applications ranging from entertainment and personal communications to security and health care. While H.264/AVC answers the need for high-quality video at lower bit rates, it is significantly more complex than previous coding standards and thus consumes more power in practical implementations. In particular, motion estimation (ME) consumes the most power in an H.264/AVC encoder. It is therefore critical to speed up integer ME in H.264/AVC via fast motion estimation (FME) algorithms and hardware acceleration. In this paper, we present our hardware-oriented modifications to a hybrid FME algorithm, an architecture based on the modified algorithm, and our implementation and prototype on a PowerPC-based Field Programmable System on Chip (FPSoC). Our results show that the modified hybrid FME algorithm on average outperforms previous state-of-the-art FME algorithms, while its losses compared with full-search ME (FSME), in terms of PSNR performance and computation time, are insignificant. We show that although our implementation platform is FPGA-based, our implementation results compare favourably with previous architectures implemented on ASICs. Finally, we also show an improvement over some existing architectures implemented on FPGAs.
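
    The paper's hybrid FME algorithm is not detailed in the abstract, but the basic operation it accelerates, integer-pel block matching driven by a sum-of-absolute-differences (SAD) cost, can be sketched generically. The block size, the small-diamond refinement pattern, and the bounds handling below are illustrative assumptions, not the authors' algorithm.

```python
# Generic illustration of integer-pel motion estimation: a SAD block-matching
# cost and a small-diamond refinement search. This is NOT the paper's hybrid
# FME algorithm; block size, search pattern and bounds handling are assumptions.
import numpy as np

def sad(cur, ref, bx, by, mvx, mvy, bs=16):
    """Sum of absolute differences between the current 16x16 block at (bx, by)
    and the reference block displaced by the candidate motion vector."""
    c = cur[by:by + bs, bx:bx + bs].astype(np.int32)
    r = ref[by + mvy:by + mvy + bs, bx + mvx:bx + mvx + bs].astype(np.int32)
    return int(np.abs(c - r).sum())

def small_diamond_search(cur, ref, bx, by, bs=16, max_steps=16):
    """Greedily move the motion vector to the lowest-SAD neighbour of a small
    diamond pattern until the centre point is already the minimum."""
    mv, best = (0, 0), sad(cur, ref, bx, by, 0, 0, bs)
    for _ in range(max_steps):
        improved = False
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            cx, cy = mv[0] + dx, mv[1] + dy
            # Skip candidates whose reference block falls outside the frame.
            if not (0 <= bx + cx <= ref.shape[1] - bs and
                    0 <= by + cy <= ref.shape[0] - bs):
                continue
            cost = sad(cur, ref, bx, by, cx, cy, bs)
            if cost < best:
                best, mv, improved = cost, (cx, cy), True
        if not improved:
            break
    return mv, best
```

    Pattern-based searches of this kind evaluate far fewer candidate vectors than full-search ME, which is why FME algorithms, and dedicated hardware when even the reduced search must run in real time, matter for power-constrained encoders.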

    Achieving Maximum Possible Speed on Constrained Block Transmission Systems

    No full text
    We develop a theoretical framework for achieving the maximum possible speed on constrained digital channels with a finite alphabet. A common inaccuracy when computing the capacity of digital channels is to assume that the inputs and outputs of the channel are analog Gaussian random variables and then, based on that assumption, invoke the Shannon capacity bound for an additive white Gaussian noise (AWGN) channel. In a channel utilizing a finite set of inputs and outputs, the inputs are clearly not Gaussian distributed and the Shannon bound is not exact. We study the capacity of a block transmission AWGN channel with quantized inputs and outputs under the simultaneous constraints that the channel is frequency selective, there is an average power constraint at the transmitter, and the inputs of the channel are quantized. The channel is assumed known at the transmitter. We obtain the capacity of the channel numerically, using a constrained Blahut-Arimoto algorithm that incorporates the average power constraint at the transmitter. Our simulations show that under certain conditions the capacity approaches the Shannon bound very closely. We also show the maximizing input distributions. The theoretical framework developed in this paper is applied to a practical example: the downlink channel of a dial-up PCM modem connection, where the inputs to the channel are quantized and the outputs are real. We test how accurate the 53.3 kbps bound is for this channel, and our results show that this bound can be improved upon.
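
    The constrained Blahut-Arimoto step can be illustrated with a short sketch. Nothing below is taken from the paper: the transition matrix, the input levels, and the way the power constraint is folded in through a Lagrange multiplier are generic choices made for illustration.

```python
# Generic sketch of Blahut-Arimoto with an average power (cost) constraint
# handled through a Lagrange multiplier s, for a quantised-input discrete
# channel with transition matrix W[y, x]. Sweeping s traces the capacity-cost
# curve; s would be tuned (e.g. by bisection) until the returned average power
# meets the transmitter's budget. Illustration only, not the paper's code.
import numpy as np

def blahut_arimoto_power(W, x_levels, s, iters=1000, tol=1e-10):
    """Return (rate_nats, p_x, avg_power) for the input pmf maximising
    I(X;Y) - s * E[x^2] over the quantised input levels x_levels."""
    cost = np.asarray(x_levels, dtype=float) ** 2   # per-symbol power x^2
    n_x = W.shape[1]
    p = np.full(n_x, 1.0 / n_x)                     # start from a uniform pmf
    for _ in range(iters):
        q = W @ p                                   # output pmf q(y)
        with np.errstate(divide="ignore", invalid="ignore"):
            log_ratio = np.where(W > 0, np.log(W / q[:, None]), 0.0)
        D = np.sum(W * log_ratio, axis=0)           # D(x) = sum_y W log(W/q)
        new_p = p * np.exp(D - s * cost)
        new_p /= new_p.sum()
        if np.max(np.abs(new_p - p)) < tol:
            p = new_p
            break
        p = new_p
    q = W @ p
    with np.errstate(divide="ignore", invalid="ignore"):
        log_ratio = np.where(W > 0, np.log(W / q[:, None]), 0.0)
    rate = float(np.sum(p * np.sum(W * log_ratio, axis=0)))   # mutual info, nats
    return rate, p, float(cost @ p)

# Toy usage: four quantised input levels over a noisy 4-output channel.
x_levels = [-3.0, -1.0, 1.0, 3.0]
W = np.array([[0.80, 0.10, 0.05, 0.05],
              [0.10, 0.80, 0.05, 0.05],
              [0.05, 0.05, 0.80, 0.10],
              [0.05, 0.05, 0.10, 0.80]])            # columns sum to one: W[y, x]
rate_nats, p_x, avg_power = blahut_arimoto_power(W, x_levels, s=0.05)
```

    The unconstrained algorithm simply drops the s * cost term; including it biases the input distribution toward low-power symbols, which is the standard way to impose an average power constraint on a discrete-input capacity computation.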