781 research outputs found

    Spectator 1988-03-02

    FIELD, Issue 37, Fall 1987

    The BG News March 2, 1977

    The BGSU campus student newspaper March 2, 1977. Volume 61 - Issue 74.

    Garth Lunt v. Harold Lance and Diane Lance : Reply Brief

    APPEAL AND CROSS APPEAL FROM JUDGMENT OF THE FOURTH DISTRICT COURT, WASATCH COUNTY, THE HONORABLE DEREK PULLA

    The BG News March 7, 1984

    The BGSU campus student newspaper March 7, 1984. Volume 66 - Issue 87.

    Spartan Daily, March 1, 1982

    Volume 78, Issue 17.

    Law and Conscience

    You Might be a Robot
    As robots and artificial intelligence (AI) increase their influence over society, policymakers are increasingly regulating them. But to regulate these technologies, we first need to know what they are. And here we come to a problem. No one has been able to offer a decent definition of robots and AI, not even experts. What's more, technological advances make it harder and harder each day to tell people from robots and robots from dumb machines. We have already seen disastrous legal definitions written with one target in mind inadvertently affecting others. In fact, if you are reading this, you are (probably) not a robot, but certain laws might already treat you as one.

    Definitional challenges like these aren't exclusive to robots and AI. But today, all signs indicate we are approaching an inflection point. Whether it is citywide bans of robot sex brothels or nationwide efforts to crack down on ticket-scalping bots, we are witnessing an explosion of interest in regulating robots, human enhancement technologies, and all things in between. And that, in turn, means that typological quandaries once confined to philosophy seminars can no longer be dismissed as academic. Want, for example, to crack down on foreign influence campaigns by regulating social media bots? Be careful not to define "bot" too broadly (as the California legislature recently did), or the supercomputer nestled in your pocket might just make you one. Want, instead, to promote traffic safety by regulating drivers? Be careful not to presume that only humans can drive (as our Federal Motor Vehicle Safety Standards do), or you may soon exclude the best drivers on the road.

    In this Article, we suggest that the problem isn't simply that we haven't hit upon the right definition. Instead, there may not be a right definition for the multifaceted, rapidly evolving technologies we call robots or AI. As we will demonstrate, even the most thoughtful of definitions risk being overbroad, underinclusive, or simply irrelevant in short order. Rather than trying in vain to find the perfect definition, we instead argue that policymakers should do as the great computer scientist Alan Turing did when confronted with the challenge of defining robots: embrace their ineffable nature.

    We offer several strategies to do so. First, whenever possible, laws should regulate behavior, not things (or, as we put it, regulate verbs, not nouns). Second, where we must distinguish robots from other entities, the law should apply what we call Turing's Razor, identifying robots on a case-by-case basis. Third, we offer six functional criteria for making these kinds of "I know it when I see it" determinations and argue that courts are generally better positioned than legislators to apply such standards. Finally, we argue that if we must have definitions rather than apply standards, they should be as short-term and contingent as possible. That, in turn, suggests that regulators, not legislators, should play the defining role.

    Kenyon Collegian - October 31, 2002