16 research outputs found

    P2_10 Toasty Candles

    The majority of houses are heated by gas boilers. This paper investigates an alternative for heating a single room: how many lit tea light candles it would take to raise the temperature of an empty room with a volume of 63.2 m^3 by 5 °C. We found that in a 100% insulated room, 5 lit tea light candles would achieve this in 15 minutes.
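The result above can be sanity-checked with a back-of-envelope heat-capacity calculation. A minimal sketch, where the air density and specific heat are assumptions (typical values for dry air near 20 °C, not taken from the paper):

```python
# Back-of-envelope check of the heating requirement for the 63.2 m^3 room.
RHO_AIR = 1.204        # kg/m^3, assumed density of dry air at ~20 C
CP_AIR = 1005.0        # J/(kg K), assumed specific heat of air

volume_m3 = 63.2       # room volume from the paper
delta_t_k = 5.0        # temperature rise from the paper
duration_s = 15 * 60   # 15 minutes, from the paper
n_candles = 5          # the paper's result

energy_j = RHO_AIR * volume_m3 * CP_AIR * delta_t_k   # heat needed, ~3.8e5 J
power_w = energy_j / duration_s                        # ~425 W of total heating power
per_candle_w = power_w / n_candles                     # output implied per candle

print(f"energy ~ {energy_j/1e3:.0f} kJ, total power ~ {power_w:.0f} W, "
      f"per candle ~ {per_candle_w:.0f} W")
```

Under these assumed air properties, the paper's count of 5 candles implies roughly 85 W per tea light in a perfectly insulated room.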

    P2_6 Solar Leaf

    This paper explores the feasibility of a gigantic photosynthetic surface with the goal of fulfilling the human species' energy requirements. We determined that an area of approximately 3000 km^2 would suffice to produce enough energy to power the planet for 1 year. Via a comparison to solar panels, we suggest that this area, or perhaps an even smaller one, covered with modern solar panels could truly power the planet. However, further research into this is certainly required.

    P2_7 Outrunning Climate Change

    In this paper we investigate the rate at which you would have to move the Earth away from the Sun to combat the global temperature increase due to greenhouse gas emissions. By assuming the average global temperature increases linearly by between 0.2 °C and 4.8 °C by the year 2100, we found that the Earth would need to be moved between 5.44 × 10^6 and 1.31 × 10^8 meters per year.

    P2_9 Feasibility of Outrunning Climate Change

    In this paper we investigate the feasibility of moving the Earth away from the Sun to combat climate change. We found that the energy needed is 8.71 × 10^30 J per year to move the Earth 5.44 × 10^6 m (lower bound) and 5.29 × 10^33 J per year to move it 1.31 × 10^8 m (upper bound). For the lower bound, the number of rockets required to generate this energy is 1.21 × 10^30, at a cost of 4.97 × 10^39 USD; for the upper bound, 1.77 × 10^34 rockets would be needed, costing 7.26 × 10^43 USD. We conclude that this is not a feasible method for combating climate change, as using renewable energy sources to power the entire planet would be much more cost effective.
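The quoted rocket counts and total costs can be cross-checked against each other: both bounds imply nearly the same per-rocket price, about 4.1 × 10^9 USD. That per-rocket figure is inferred here from the abstract's numbers, not stated in it:

```python
# Cross-check: per-rocket cost implied by each bound quoted in the abstract.
lower_rockets, lower_cost_usd = 1.21e30, 4.97e39
upper_rockets, upper_cost_usd = 1.77e34, 7.26e43

per_rocket_lower = lower_cost_usd / lower_rockets   # ~4.1e9 USD per rocket
per_rocket_upper = upper_cost_usd / upper_rockets   # ~4.1e9 USD per rocket

print(f"{per_rocket_lower:.3g} USD vs {per_rocket_upper:.3g} USD per rocket")
```

The two bounds agree to within a fraction of a percent, which suggests a single assumed launch cost underlies both estimates.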

    P2_2 Big Ben 2: Enormous Benjamin

    The time-keeping ability of an analogue clock in the modern age relies heavily on the functioning of the motor. In this paper we ask: in the absence of a motor as the limiter, assuming some kind of ideal time-keeping implement with the power to provide an arbitrarily large amount of torque, what would be the limits of the clock’s time-keeping abilities? We find the optimal material for such a construction to be beryllium, and with calculations using the shear wave speed in this material we find a 4.4 km clock hand to be a reasonable upper limit on this clock’s abilities.

    P2_5 Kinetic Impact Weapon Jabba

    Within the Star Wars fictional universe, the Death Star was a battle station used by a government called the Galactic Empire and was designed to be able to destroy planets with a “super laser”. For this paper, we considered a method to break apart the Death Star. Jabba the Hutt was a powerful gangster in the Star Wars universe, and this paper looks into how the Death Star could have been broken apart by colliding a perfect sphere composed of multiple Jabba the Hutt characters with it. We found that the total number of Jabba the Hutt characters required was 4.018 × 10^25.

    P2_3 Balloon Ovens

    In this paper we investigate the amount of energy which can be extracted from a balloon that has built up charge due to contact electrification. We then investigate the possibility of using charged balloons to generate enough electricity to cook a whole chicken. We found that 3.0 × 10^10 balloons would be needed to cook the chicken so that it would be safe for consumption. We conclude that the electricity generated from electrically charged balloons likely has no practical application.

    A Symbiotic Brain-Machine Interface through Value-Based Decision Making

    BACKGROUND: In the development of Brain Machine Interfaces (BMIs), there is a great need to enable users to interact with changing environments during the activities of daily life. It is expected that the number and scope of the learning tasks encountered during interaction with the environment as well as the pattern of brain activity will vary over time. These conditions, in addition to neural reorganization, pose a challenge to decoding neural commands for BMIs. We have developed a new BMI framework in which a computational agent symbiotically decoded users' intended actions by utilizing both motor commands and goal information directly from the brain through a continuous Perception-Action-Reward Cycle (PARC). METHODOLOGY: The control architecture designed was based on Actor-Critic learning, which is a PARC-based reinforcement learning method. Our neurophysiology studies in rat models suggested that the Nucleus Accumbens (NAcc) contained a rich representation of goal information in terms of predicting the probability of earning reward, and that it could be translated into evaluative feedback for adaptation of the decoder with high precision. Simulated neural control experiments showed that the system was able to maintain high performance in decoding neural motor commands during novel tasks or in the presence of reorganization in the neural input. We then implanted a dual micro-wire array in the primary motor cortex (M1) and the NAcc of rat brain and implemented a full closed-loop system in which robot actions were decoded from the single unit activity in M1 based on an evaluative feedback that was estimated from NAcc. CONCLUSIONS: Our results suggest that adapting the BMI decoder with an evaluative feedback that is directly extracted from the brain is a possible solution to the problem of operating BMIs in changing environments with dynamic neural signals. During closed-loop control, the agent was able to solve a reaching task by capturing the action and reward interdependency in the brain.
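The arrangement described above, in which an externally decoded evaluative signal takes the place of a hand-crafted reward in an Actor-Critic loop, can be illustrated with a toy policy-gradient sketch. Everything here (feature count, learning rate, the synthetic "NAcc" feedback function) is an illustrative assumption and not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
n_features, n_actions = 8, 4
W = np.zeros((n_actions, n_features))   # actor weights (policy parameters)
lr = 0.05


def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()


def actor_step(state, evaluative_feedback):
    """One update: pick an action, then adjust the policy using a scalar
    evaluative signal (standing in for the reward estimate decoded from NAcc)."""
    global W
    p = softmax(W @ state)
    a = rng.choice(n_actions, p=p)
    grad = -p.copy()
    grad[a] += 1.0                      # d log pi(a|s) / d logits
    W += lr * evaluative_feedback(a) * np.outer(grad, state)
    return a


# Synthetic closed loop: the assumed evaluative signal rates action 0 as rewarding.
state = np.ones(n_features) / np.sqrt(n_features)
feedback = lambda a: 1.0 if a == 0 else -1.0
for _ in range(2000):
    actor_step(state, feedback)

probs = softmax(W @ state)
print(f"P(rewarded action) = {probs[0]:.2f}")   # rises well above the 0.25 chance level
```

The point of the sketch is only the information flow: the actor never sees a task-defined reward, just a scalar evaluation supplied from outside, which is what allows the decoder to keep adapting when the task or the neural input changes.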

    Evaluation of individual and ensemble probabilistic forecasts of COVID-19 mortality in the United States

    Short-term probabilistic forecasts of the trajectory of the COVID-19 pandemic in the United States have served as a visible and important communication channel between the scientific modeling community and both the general public and decision-makers. Forecasting models provide specific, quantitative, and evaluable predictions that inform short-term decisions such as healthcare staffing needs, school closures, and allocation of medical supplies. Starting in April 2020, the US COVID-19 Forecast Hub (https://covid19forecasthub.org/) collected, disseminated, and synthesized tens of millions of specific predictions from more than 90 different academic, industry, and independent research groups. A multimodel ensemble forecast that combined predictions from dozens of groups every week provided the most consistently accurate probabilistic forecasts of incident deaths due to COVID-19 at the state and national level from April 2020 through October 2021. The performance of 27 individual models that submitted complete forecasts of COVID-19 deaths consistently throughout this year showed high variability in forecast skill across time, geospatial units, and forecast horizons. Two-thirds of the models evaluated showed better accuracy than a naïve baseline model. Forecast accuracy degraded as models made predictions further into the future, with probabilistic error at a 20-wk horizon three to five times larger than when predicting at a 1-wk horizon. This project underscores the role that collaboration and active coordination between governmental public-health agencies, academic modeling teams, and industry partners can play in developing modern modeling capabilities to support local, state, and federal response to outbreaks.
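The probabilistic error referred to above is commonly measured in Forecast Hub evaluations with the weighted interval score (WIS), which scores a forecast expressed as a median plus central prediction intervals. A minimal sketch of the textbook formulation (not the Hub's exact evaluation code; the example numbers are invented):

```python
def interval_score(lower, upper, alpha, y):
    """Score of a central (1 - alpha) prediction interval for observation y:
    the interval width plus a penalty proportional to how far y falls outside."""
    score = upper - lower
    if y < lower:
        score += (2.0 / alpha) * (lower - y)
    elif y > upper:
        score += (2.0 / alpha) * (y - upper)
    return score


def weighted_interval_score(median, intervals, y):
    """intervals: list of (lower, upper, alpha) central prediction intervals.
    Lower is better; sharp, well-calibrated forecasts score close to 0."""
    total = 0.5 * abs(y - median)                    # median term, weight 1/2
    for lower, upper, alpha in intervals:
        total += (alpha / 2.0) * interval_score(lower, upper, alpha, y)
    return total / (len(intervals) + 0.5)


# Invented example: median 100, 50% interval [80, 120], 90% interval [60, 150],
# observed value 110.
wis = weighted_interval_score(100, [(80, 120, 0.5), (60, 150, 0.1)], y=110)
print(f"WIS = {wis:.2f}")   # 7.80
```

Because the score decomposes into a calibration penalty and a sharpness (interval-width) term, statements like "three to five times larger at a 20-wk horizon" compare average WIS across horizons.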

    The United States COVID-19 Forecast Hub dataset

    Academic researchers, government agencies, industry groups, and individuals have produced forecasts at an unprecedented scale during the COVID-19 pandemic. To leverage these forecasts, the United States Centers for Disease Control and Prevention (CDC) partnered with an academic research lab at the University of Massachusetts Amherst to create the US COVID-19 Forecast Hub. Launched in April 2020, the Forecast Hub is a dataset with point and probabilistic forecasts of incident cases, incident hospitalizations, incident deaths, and cumulative deaths due to COVID-19 at the county, state, and national levels in the United States. Included forecasts represent a variety of modeling approaches, data sources, and assumptions regarding the spread of COVID-19. The goal of this dataset is to establish a standardized and comparable set of short-term forecasts from modeling teams. These data can be used to develop ensemble models, communicate forecasts to the public, create visualizations, compare models, and inform policies regarding COVID-19 mitigation. These open-source data are available via download from GitHub, through an online API, and through R packages.
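As an illustration of how the dataset's long quantile format supports ensemble building, the sketch below takes the median across models at each quantile level. The model names and values are invented; the column layout (one row per model, location, and quantile) follows the Hub's documented format:

```python
import pandas as pd

# Hypothetical rows mimicking the Hub's long format: one value per
# (model, target_end_date, location, quantile) combination.
rows = [
    ("modelA", "2021-06-05", "US", 0.5,   100.0),
    ("modelB", "2021-06-05", "US", 0.5,   120.0),
    ("modelC", "2021-06-05", "US", 0.5,   110.0),
    ("modelA", "2021-06-05", "US", 0.975, 180.0),
    ("modelB", "2021-06-05", "US", 0.975, 220.0),
    ("modelC", "2021-06-05", "US", 0.975, 200.0),
]
df = pd.DataFrame(rows, columns=["model", "target_end_date", "location",
                                 "quantile", "value"])

# Quantile-median ensemble: median across models at each quantile level.
ensemble = (df.groupby(["target_end_date", "location", "quantile"],
                       as_index=False)["value"].median())
print(ensemble)
```

Taking the median quantile-by-quantile yields a new, well-defined predictive distribution, which is one simple way the Hub-style data can be combined into an ensemble forecast.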