16,247 research outputs found

    The Advocate

    Headlines Include: Vicinanzo, Toner, Take Wormser; One Night At a Stein; Special Examination Intro

    Exploring the State of the Art in Legal QA Systems

    Answering questions in the legal domain is a complex task, primarily due to the intricate nature and diverse range of legal document systems. Providing an accurate answer to a legal query typically requires specialized domain knowledge, which makes the task challenging even for human experts. Question-answering (QA) systems are designed to generate answers to questions posed in human languages; they use natural language processing to understand questions and search through information to find relevant answers. QA has various practical applications, including customer service, education, research, and cross-lingual communication, but such systems face challenges such as improving natural language understanding and handling complex and ambiguous questions. At present, there is a lack of surveys that discuss legal question answering. To address this gap, we provide a comprehensive survey that reviews 14 benchmark datasets for question answering in the legal field and presents a comprehensive review of state-of-the-art legal question-answering deep learning models. We cover the architectures and techniques used in these studies as well as the performance and limitations of these models. Moreover, we have established a public GitHub repository where we regularly upload the most recent articles, open data, and source code. The repository is available at: \url{https://github.com/abdoelsayed2016/Legal-Question-Answering-Review}
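    The surveyed systems use deep learning, but the retrieval step they build on can be illustrated with a much simpler sketch: rank candidate legal passages by lexical overlap with the question. The passages and question below are invented for illustration and are not drawn from the survey or its datasets; real legal QA systems would use learned dense representations rather than bag-of-words cosine similarity.

    ```python
    # Minimal sketch of a retrieval-based QA step: score each candidate
    # passage against the question using bag-of-words cosine similarity.
    import math
    import re
    from collections import Counter

    def tokenize(text):
        """Lowercase word tokens; a stand-in for real NLP preprocessing."""
        return re.findall(r"[a-z]+", text.lower())

    def cosine(a, b):
        """Cosine similarity between two token-count vectors."""
        dot = sum(a[t] * b[t] for t in set(a) & set(b))
        if dot == 0:
            return 0.0
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb)

    def rank_passages(question, passages):
        """Return passages sorted from most to least similar to the question."""
        q = Counter(tokenize(question))
        scored = [(cosine(q, Counter(tokenize(p))), p) for p in passages]
        return [p for _, p in sorted(scored, reverse=True)]

    # Illustrative corpus (not from the survey's benchmark datasets).
    passages = [
        "A contract requires offer, acceptance, and consideration.",
        "The statute of limitations for tort claims is two years.",
        "Trademarks protect brand names and logos used on goods.",
    ]
    best = rank_passages("What are the elements of a valid contract?", passages)[0]
    print(best)  # the contract passage shares the most words with the question
    ```

    A production system would replace the lexical scorer with a neural retriever and add a reader model to extract or generate the final answer, which is where the architectures reviewed in the survey come in.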

    Teaching Remedial Problem-Solving Skills to a Law School's Underperforming Students

    This article describes a course called the Art of Lawyering developed by the Texas A&M University School of Law to help the bottom quarter of the 2L class develop the critical-thinking and problem-solving skills they should have learned in their first year of law school. Students in the bottom quarter of the class at the beginning of their 2L year are most at risk for failing the bar exam after graduation. The Art of Lawyering gives these students the structural framework necessary to solve problems like a lawyer, improve their performance in law school, and pass the bar exam. The course, in its current iteration, is remarkably effective, producing a significant increase in students' grade-point averages. This article describes the theory, methods, and resources behind the course, and it includes a detailed lesson plan so that other schools can replicate the course and realize similar success.

    Putting the Bar Exam on Constitutional Notice: Cut Scores, Race & Ethnicity, and the Public Good

    Nothing to see here. Season in and season out, bar examiners, experts, supreme courts, and bar associations seem nonplussed, trapped by what they see as the facts, namely, that the bar exam has no possible weaknesses, at least when compared to alternative licensure mechanisms, that the bar exam is not to blame for disparate racial impacts that spring from administration of this ritualistic process, and that there are no viable alternatives in the harsh cold world of determining minimal competency for the noble purpose of protecting the public from legal harms. All a lie, of course. But rather than challenging our assumptions, state bar associations and bar examiners keep going about business as usual. We might even say that it's just the cost of doing business. Yes, some bar applicants will pay the price, they admit, by not passing bar exams, but protecting the public good demands that we be demanding, that we not yield to the temptation to soften our approach. We can never be too cautious when it comes to protecting the public. After all, the public good is at risk. Or is it? This Article challenges conventional stories told about the bar exam. Part I describes the background of the bar exam as currently used by most jurisdictions and includes a hypothetical "Socratic" conversation as a prelude to understanding the bar exam and its impact on demography and the public good. Part II catalogues stories we tell to justify our recurrent resort to bar exams as the ultimate source of wisdom in making licensure decisions. Part III exposes fallacies behind many of these justifications. Part IV analyzes whether we might look to common law tort principles as a tool for exposing whether the bar exam, by producing recurrent well-known racial disparate impacts, might suffer from constitutional infirmity. Part V concludes with an exploration of some common-sense alternatives to the behemoth of the bar exam to better protect the public.

    How to Build a Better Bar Exam

    As a licensing exam, the purpose of the bar exam is consumer protection: ensuring that new lawyers have the minimum competencies required to practice law effectively. As critics point out, however, the exam, and particularly its multiple-choice portion, has significant flaws because it assesses legal knowledge and analysis in an artificial and unrealistic context, and the closed-book format rewards the ability to memorize thousands of legal rules, a skill unrelated to law practice. This essay discusses how to improve the exam by changing its multiple-choice content and format. We use two law licensing exams to illustrate how bar examiners could utilize an open-book format and develop multiple-choice questions that assess a candidate's ability to engage in legal reasoning and analysis without demanding unproductive memorization of so many detailed rules of law. The first example, the case file approach, is drawn from a 1983 California "Performance Test" in which test-takers received a case file and a series of multiple-choice questions testing the candidates' ability to read, understand, and use cases to support their legal positions. The second example discusses the current licensing exam administered by the Law Society of Upper Canada (LSUC), an open-book multiple-choice exam that tests the use of doctrinal knowledge in the context of law practice. These two licensing exams demonstrate how we could restructure the bar exam's multiple-choice questions to measure legal analysis and reasoning skills as lawyers use those skills to represent clients. They also demonstrate that we can do a better job of testing some aspects of minimum competence, while still using a multiple-choice exam format.

    The Advocate

    Crowley Named Winner of 1977 Keefe Award; Alumni Secretary Blake Named to Head Placement; McLaughlin Announced Faculty Appointments; Graduation Planned for Damrosch Park

    Battling Biases: How Can Diverse Students Overcome Test Bias on the Multistate Bar Examination

    Drafters of standardized tests, such as the Law School Admission Test (LSAT) and Multistate Bar Examination (MBE), strive to eliminate biases in multiple-choice questions by assembling representatives of diverse backgrounds to screen and discard prejudicial questions. But in reality, intelligence tests will always contain some aspect of bias because a committee of test administrators can never represent the views of every person. Nevertheless, the bar exam incorrectly assumes that all applicants learned the same information throughout their academic careers and possess similar cultural experiences and opinions. The bar exam has not fully recognized that questions can be interpreted differently. Scholars advocate abandoning intelligence tests as a measure of a person's future success, but this is unlikely to happen anytime soon because intelligence tests have been used since the early 1900s. Thus, in the meantime, professors must teach students how to identify and eliminate personal biases to increase the students' chances of selecting the best answer. We must acknowledge that biases will never fully disappear and figure out how to properly support students who experience biases. This article does not promote conforming to social norms, changing our students' core beliefs, or decreasing diversity. This article addresses the reality of the bar exam and provides students with a chameleon-like skill that they can use to ensure they are triumphant on the MBE. Part I provides background information on the components of the bar exam and the disparity in performance results between Whites and people of color. It defines test validity and explores test biases as a possible reason for the lower passage rates among minorities, such as language barriers, the equal experience assumption, promotion of dominant values, and bias in item selection. Part II discusses test biases on the MBE portion of the bar by exploring the National Conference of Bar Examiners' (NCBE) five myths and breaking down specific multiple-choice questions from NCBE's Online Practice Exam #4. Part III shares how academics can (a) reframe stereotype threat to help students overcome test anxiety and (b) reframe the speediness and memorization requirements of the bar exam as requirements of grit and determination to join the profession. Finally, Part IV acknowledges that test biases are unlikely to disappear and provides a step-by-step solution to help students be successful on the MBE. The step-by-step approach is supported by statistics from the Logic for Lawyers class at the University of San Francisco School of Law, a multiple-choice skills-based test that employs the step-by-step method.

    The Justinian


    The Advocate

    Headlines include: Dean Reilly And The Alumni Affairs Office
