
    Periodic relativity: the theory of gravity in flat space time

    In periodic relativity (PR), the curved space-time of general relativity is eliminated by using an alternative flat metric without a weak-field approximation. PR satisfies Einstein's field equations. The theory allows every two-body system to deviate differently from the flat Minkowski metric. PR differs from general relativity (GR) in its predictions of the proper time intervals of distant objects. PR proposes a definite connection between the proper time interval of an object and the gravitational frequency shift of its constituent particles as the object travels through the gravitational field, because time is fundamentally periodic in nature. Coordinate and proper time in GR are linear time. The periodic time of PR is the key parameter in the development of a quantum gravity theory in which the universe begins with a quantum fluctuation in the fundamental substance of the universe, which is infinite, motionless and indivisible. PR is based on the dynamic weak equivalence principle, which equates the gravitational mass with the relativistic mass. PR provides accurate solutions for the rotation curves of galaxies and the energy levels of the hydrogen spectrum, including the Lamb shift. The flat space-time with Lorentz-invariant acceleration presented here makes it possible to unite PR with quantum mechanics. PR satisfies Einstein's field equations with respect to the three major GR tests within the solar system and with respect to the derivation of the Friedmann equation in cosmology. PR predicts the limiting radius of the event horizon of the M87 black hole to be 3R_g and the range of prograde and retrograde spin a_* to lie between ±0.385 and ±0.73.
    Comment: 53 pages, 3 figures, 4 tables, v10: perfected version with M87 black hole analysis

    Parents and Children Together: Design and Implementation of Two Healthy Marriage Programs

    The Parents and Children Together (PACT) evaluation, conducted by Mathematica Policy Research for the Office of Planning, Research, and Evaluation, Administration for Children and Families (ACF), U.S. Department of Health and Human Services, is examining a set of Healthy Marriage (HM) and Responsible Fatherhood (RF) grantees funded by ACF's Office of Family Assistance (OFA) in 2011. Recognizing that grantees' programs continue to grow and develop, the PACT evaluation aims to provide foundational information to guide ongoing and future program design and evaluation efforts, and to build the evidence base for programming.

    Track My Hoist

    Construction sites all over the world rely on hoist lifts for transporting material and workers at the job site. Despite being such an integral part of construction projects, there is currently no tool for tracking the use of this critical piece of equipment at the construction site. The purpose of this project is to design a system that would enable the managers and employees of construction companies to efficiently track and manage the hoist lifts at their sites. Efficient tracking and management of the lifts would result in savings of cost, time, and energy for the construction companies. Such improvements would also help boost worker morale, which would consequently boost productivity. Over the course of this project the team has designed an interactive yet intuitive software application that helps effectively track and manage hoist lifts on construction sites.

    Linked Data Quality Assessment and its Application to Societal Progress Measurement

    In recent years, the Linked Data (LD) paradigm has emerged as a simple mechanism for employing the Web as a medium for data and knowledge integration in which both documents and data are linked. Moreover, the semantics and structure of the underlying data are kept intact, making this the Semantic Web. LD essentially entails a set of best practices for publishing and connecting structured data on the Web, which allows publishing and exchanging information in an interoperable and reusable fashion. Many different communities on the Internet, such as geographic, media, life sciences and government, have already adopted these LD principles. This is confirmed by the dramatically growing Linked Data Web, where currently more than 50 billion facts are represented. With the emergence of the Web of Linked Data, several use cases become possible due to the rich and disparate data integrated into one global information space. Linked Data, in these cases, not only assists in building mashups by interlinking heterogeneous and dispersed data from multiple sources but also empowers the uncovering of meaningful and impactful relationships. These discoveries have paved the way for scientists to explore the existing data and uncover meaningful outcomes that they might not have been aware of previously. In all these use cases utilizing LD, one crippling problem is the underlying data quality. Incomplete, inconsistent or inaccurate data affects the end results gravely, making them unreliable. Data quality is commonly conceived as fitness for use, be it for a certain application or use case. There are cases in which datasets that contain quality problems are still useful for certain applications; quality thus depends on the use case at hand. LD consumption therefore has to deal with the problem of getting the data into a state in which it can be exploited for real use cases.
The insufficient data quality can be caused either by the LD publication process or can be intrinsic to the data source itself. A key challenge is to assess the quality of datasets published on the Web and make this quality information explicit. Assessing data quality is particularly challenging in LD, as the underlying data stems from a set of multiple, autonomous and evolving data sources. Moreover, the dynamic nature of LD makes quality assessment crucial for measuring how accurately the data represents the real world. On the document Web, data quality can only be indirectly or vaguely defined, but there is a requirement for more concrete and measurable data quality metrics for LD. Such data quality metrics include correctness of facts with respect to the real world, adequacy of semantic representation, quality of interlinks, interoperability, timeliness, or consistency with regard to implicit information. Even though data quality is an important concept in LD, few methodologies have been proposed to assess the quality of these datasets. Thus, in this thesis, we first unify 18 data quality dimensions and provide a total of 69 metrics for the assessment of LD. The first methodology involves the employment of LD experts for the assessment. This assessment is performed with the help of the TripleCheckMate tool, which was developed specifically to assist LD experts in assessing the quality of a dataset, in this case DBpedia. The second methodology is a semi-automatic process, in which the first phase involves the detection of common quality problems by the automatic creation of an extended schema for DBpedia. The second phase involves the manual verification of the generated schema axioms. Thereafter, we employ the wisdom of the crowd, i.e., workers on online crowdsourcing platforms such as Amazon Mechanical Turk (MTurk), to assess the quality of DBpedia.
We then compare the two approaches (previous assessment by LD experts and assessment by MTurk workers in this study) in order to measure the feasibility of each type of user-driven data quality assessment methodology. Additionally, we evaluate another semi-automated methodology for LD quality assessment, which also involves human judgement. In this semi-automated methodology, selected metrics are formally defined and implemented as part of a tool, namely R2RLint. The user is provided not only the results of the assessment but also the specific entities that cause the errors, which helps users understand the quality issues and fix them. Finally, we take into account a domain-specific use case that consumes LD and depends on data quality. In particular, we identify four LD sources, assess their quality using the R2RLint tool and then utilize them in building the Health Economic Research (HER) Observatory. We show the advantages of this semi-automated assessment over the other types of quality assessment methodologies discussed earlier. The Observatory aims at evaluating the impact of research development on the economic and healthcare performance of each country per year. We illustrate the usefulness of LD in this use case and the importance of quality assessment for any data analysis.
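    The "fitness for use" metrics described in this abstract can be computed directly over a set of triples. As a minimal illustrative sketch (the triples, property names, and choice of a completeness metric here are hypothetical examples, not taken from the thesis or from R2RLint), a simple completeness metric measures what fraction of entities carry a required property:

    ```python
    # Minimal sketch of a Linked Data completeness metric.
    # Triples are (subject, predicate, object) tuples; the data and the
    # property names below are hypothetical, chosen only for illustration.

    def completeness(triples, required_predicate):
        """Fraction of distinct subjects that have the required predicate."""
        subjects = {s for s, _, _ in triples}
        covered = {s for s, p, _ in triples if p == required_predicate}
        return len(covered) / len(subjects) if subjects else 1.0

    triples = [
        ("db:Berlin",  "rdfs:label", "Berlin"),
        ("db:Berlin",  "geo:lat",    "52.52"),
        ("db:Leipzig", "rdfs:label", "Leipzig"),
        ("db:Leipzig", "geo:lat",    "51.34"),
        ("db:Dresden", "rdfs:label", "Dresden"),  # missing geo:lat -> incomplete
    ]

    score = completeness(triples, "geo:lat")
    print(score)  # 2 of 3 subjects carry geo:lat
    ```

    Real assessment frameworks evaluate many such metrics (accuracy, consistency, interlinking, timeliness) and aggregate them per dimension; the point of the sketch is only that each metric reduces to a computable function over the triple set.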

    Cricket Lollipops and Mealworm Chocolate: Investigating Receptivity to Radically Creative Products

    This study aims to develop a deeper understanding of the factors at play behind consumers' receptivity to a radically creative product, and whether their personalities, specifically their openness to experience, moderate their intentions to try such a product. Creativity research has already established the link between radically creative ideas and the novelty and familiarity dichotomy. I have used products in the entomophagy (edible insects for human consumption) industry as suitably novel products to gauge consumers' intentions to try. For this study, a survey of 77 participants generated observations on intention to try edible insect products of varying familiarities, along with information about the participants' own resistance to change and openness to experience. A regression analysing the interaction between familiarity and intention to try found significant interactions between the two variables for unprimed populations, and also found a significant effect of openness on the relationship between familiarity and intention to try. Finally, a theory was posited to explain the results, and suggestions were given for further exploration of the nexus between radical creativity, novelty, intention to try, and personality factors.
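    The moderation analysis this abstract describes (openness moderating the familiarity-to-intention link) is conventionally fit as an ordinary least squares regression with an interaction term. A small sketch with synthetic data — the variable names, effect sizes, and noise level are invented for illustration and are not the study's actual results:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 77  # matches the survey size; the data itself is synthetic

    familiarity = rng.uniform(0, 1, n)
    openness = rng.uniform(0, 1, n)
    # Assumed generating model: intention rises with familiarity,
    # and more steeply for participants higher in openness (moderation).
    intention = (0.5 + 1.0 * familiarity + 0.3 * openness
                 + 0.8 * familiarity * openness + rng.normal(0, 0.02, n))

    # Design matrix: intercept, main effects, and the interaction term
    # whose coefficient captures the moderation effect.
    X = np.column_stack([np.ones(n), familiarity, openness,
                         familiarity * openness])
    beta, *_ = np.linalg.lstsq(X, intention, rcond=None)
    print(beta)  # coefficients near the assumed [0.5, 1.0, 0.3, 0.8]
    ```

    In practice the interaction coefficient would be tested for significance (and the predictors often mean-centered to ease interpretation); the sketch shows only the structural form of the model.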

    ASIC Design of an RF Energy Harvester Using 0.13 µm CMOS Technology

    Recent advances in wireless sensor nodes, data acquisition devices, and wearable and implantable medical devices have paved the way for low-power (sub-50 µW) devices. These devices generally use small solid-state or thin-film batteries for power supply, which need replacement or must be removed for charging. RF energy harvesting technology can be used to charge these batteries without removing them from the device, thus providing a sustainable power supply. In other cases, a battery can become unnecessary altogether. This enables us to deploy wireless network nodes in places where regular physical access to the nodes is difficult or cumbersome. This thesis proposes the design of an RF energy harvesting device able to charge commercially available thin-film or solid-state batteries. The energy harvesting amplifier circuit is designed in GlobalFoundries 0.13 µm CMOS technology using Cadence integrated circuit design tools. This Application Specific Integrated Circuit (ASIC) is intended to have as small a footprint as possible so that it can be easily integrated with the above-mentioned devices. While a dedicated RF power source is a direct solution for providing sustainable power to the harvesting circuit, harvesting ambient RF power from TV and UHF cellular frequencies broadens where the harvesting device can be placed. The biggest challenge for RF energy harvesting technology is the availability of an adequate amount of RF power. This thesis also presents a survey of available RF power at various ultra-high frequencies in San Luis Obispo, CA. The idea is to determine the frequency band which can provide maximum RF power for harvesting and to design a harvester for that frequency band.
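    The site-survey question underlying this abstract — how much ambient RF power actually reaches a harvester — can be roughed out with the free-space Friis link budget. A sketch in which the transmitter power, antenna gains, frequency, and distance are assumed round numbers, not figures from the thesis:

    ```python
    import math

    def friis_received_dbm(pt_dbm, gt_dbi, gr_dbi, freq_hz, dist_m):
        """Free-space received power: P_r = P_t + G_t + G_r - FSPL,
        with FSPL = 20*log10(4*pi*d / wavelength) in dB."""
        wavelength = 3.0e8 / freq_hz
        fspl_db = 20.0 * math.log10(4.0 * math.pi * dist_m / wavelength)
        return pt_dbm + gt_dbi + gr_dbi - fspl_db

    # Assumed scenario: a 1 W (30 dBm) UHF TV transmitter at 600 MHz,
    # 1 km away, with unity-gain antennas on both ends.
    pr_dbm = friis_received_dbm(30.0, 0.0, 0.0, 600e6, 1000.0)
    pr_watts = 10.0 ** (pr_dbm / 10.0) * 1e-3  # dBm -> W
    print(round(pr_dbm, 1), "dBm")
    ```

    The dBm-to-watt conversion makes the design pressure concrete: at roughly -58 dBm the antenna sees only nanowatts, so the rectifier must be highly efficient and the harvested energy accumulated in a storage element before a sub-50 µW load can be serviced.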