
    Oxidation Kinetics of Polycrystalline LaCrO3

    The oxidation kinetics of polycrystalline LaCrO3 were determined by measuring the time and temperature dependence of the weight and conductivity changes of reduced samples. A region of fast diffusion followed by a smaller, slower diffusion tail was observed in the thermogravimetric measurements. This observation can be interpreted as rapid diffusion along the grain boundaries followed by slower diffusion into the body of the grains. The absence of the tail in the conductivity measurements is attributed to the high hole mobility along the boundaries. Copyright © 1987, Wiley Blackwell. All rights reserved.
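    The two-stage uptake described above (a fast grain-boundary stage plus a slow bulk tail) is commonly modeled as a sum of two first-order relaxations. The sketch below is illustrative only; the amplitudes and time constants are hypothetical and not taken from the paper.

    ```python
    import math

    def weight_gain(t, a_gb=0.8, tau_gb=10.0, a_bulk=0.2, tau_bulk=500.0):
        """Fractional weight gain at time t (arbitrary units).

        a_gb + a_bulk = 1, and tau_gb << tau_bulk reproduces the fast rise
        (grain-boundary diffusion) plus the slow 'tail' (diffusion into the
        grain interior) seen in the thermogravimetric measurements.
        """
        return (a_gb * (1 - math.exp(-t / tau_gb))
                + a_bulk * (1 - math.exp(-t / tau_bulk)))
    ```

    Fitting such a two-term form to measured weight-gain curves is one way to separate the grain-boundary and bulk contributions.
    
    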

    High-temperature Defect Structure of Nb-doped LaCrO3

    Electrical conductivity and Seebeck measurements on La(Cr0.98Nb0.02)O3 show that the defect structure of the material is mainly controlled by the extrinsic electrons formed by the Nb donors through the electronic compensation process. The experimental results also indicate that this material conducts electricity via a small-polaron mechanism, with an electron mobility of around 0.004-0.01 cm2/(V s) between 1100 and 1300°C. © 1989
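    For small-polaron hopping, the mobility follows mu = (A/T) exp(-Ea/(kB T)), so a hopping activation energy can be estimated from two (mobility, temperature) points. The sketch below assumes, purely for illustration, that the quoted endpoint mobilities of 0.004 and 0.01 cm2/(V s) correspond to 1100 °C and 1300 °C; the paper's actual fitted activation energy is not reproduced here.

    ```python
    import math

    KB_EV = 8.617333262e-5  # Boltzmann constant in eV/K

    def polaron_activation_energy(mu1, T1, mu2, T2):
        """Ea in eV from two (mobility, temperature) points, assuming
        mu * T ~ exp(-Ea / (kB * T)) (adiabatic small-polaron form)."""
        return (-KB_EV * math.log((mu1 * T1) / (mu2 * T2))
                / (1.0 / T1 - 1.0 / T2))

    # Hypothetical pairing of the quoted mobility range with the quoted
    # temperature range (1100 degC = 1373 K, 1300 degC = 1573 K):
    Ea = polaron_activation_energy(0.004, 1373.0, 0.01, 1573.0)
    ```

    With these assumed endpoints the estimate comes out near 1 eV; pairing the mobilities with different temperatures within the range would change the number.
    
    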

    Government Data and the Invisible Hand

    If President Barack Obama's new administration really wants to embrace the potential of Internet-enabled government transparency, it should follow a counter-intuitive but ultimately compelling strategy: reduce the federal role in presenting important government information to citizens. Today, government bodies consider their own Web sites to be a higher priority than technical infrastructures that open up their data for others to use. We argue that this understanding is a mistake. It would be preferable for government to understand providing reusable data, rather than providing Web sites, as the core of its online publishing responsibility. During the presidential campaign, all three major candidates indicated that they thought the federal government could make better use of the Internet. Barack Obama's platform went the furthest and explicitly endorsed making government data available online in universally accessible formats. Hillary Clinton, meanwhile, remarked that she wanted to see much more government information online. John McCain's platform called for a new Office of Electronic Government. But the situation to which these candidates were responding (the wide gap between the exciting uses of Internet technology by private parties, on the one hand, and the government's lagging technical infrastructure, on the other) is not new. A minefield of federal rules and a range of other factors prevent government webmasters from keeping pace with the ever-growing potential of the Internet.

    Simulations of inner magnetosphere dynamics with an expanded RAM-SCB model and comparisons with Van Allen Probes observations

    Simulations from our newly expanded ring current-atmosphere interactions model with self-consistent magnetic field (RAM-SCB), now valid out to 9 RE, are compared for the first time with Van Allen Probes observations. The expanded model reproduces the storm time ring current buildup due to the increased convection and inflow of plasma from the magnetotail. It matches Magnetic Electron Ion Spectrometer (MagEIS) observations of the trapped high-energy (>50 keV) ion flux; however, it underestimates the low-energy (<10 keV) Helium, Oxygen, Proton, and Electron (HOPE) observations. The dispersed injections of ring current ions observed with the Energetic particle, Composition, and Thermal plasma (ECT) suite at high (>20 keV) energy are better reproduced using a high-resolution convection model. In agreement with Electric and Magnetic Field Instrument Suite and Integrated Science (EMFISIS) observations, RAM-SCB indicates that the large-scale magnetic field is depressed as close as ∼4.5 RE during even a moderate storm. Regions of electromagnetic ion cyclotron instability are predicted on the duskside from ∼6 to ∼9 RE, indicating that previous studies confined to geosynchronous orbit may have underestimated their scattering effect on the energetic particles.
    Key Points:
    - The expanded RAM-SCB model reproduces high-energy (>50 keV) MagEIS observations well
    - The magnetic field is depressed as close as ∼4.5 RE during even a moderate storm
    - EMIC wave growth extends on the duskside from ∼6 to ∼9 RE during the storm main phase

    Application and testing of the L* neural network with the self-consistent magnetic field model of RAM-SCB

    We expanded our previous work on L* neural networks that used empirical magnetic field models as the underlying models by applying and extending our technique to drift shells calculated from a physics-based magnetic field model. While empirical magnetic field models represent an average, statistical magnetospheric state, the RAM-SCB model, a first-principles magnetically self-consistent code, computes magnetic fields based on fundamental equations of plasma physics. Unlike the previous L* neural networks, which include McIlwain L and the mirror point magnetic field as part of the inputs, the new L* neural network requires only solar wind conditions and the Dst index, allowing for easier preparation of input parameters. This new neural network is compared against the previously trained networks and validated by the tracing method in the International Radiation Belt Environment Modeling (IRBEM) library. The accuracy of all L* neural networks with different underlying magnetic field models is evaluated by applying the electron phase space density (PSD)-matching technique, derived from Liouville's theorem, to the Van Allen Probes observations. Results indicate that the uncertainty in the predicted L* is statistically (75%) below 0.7, with a median value mostly below 0.2 and a median absolute deviation around 0.15, regardless of the underlying magnetic field model. We found that such an uncertainty in the calculated L* can shift the peak location of the electron PSD profile by 0.2 RE radially but with its shape nearly preserved.
    Key Points:
    - An L* neural network based on the RAM-SCB model is developed
    - L* calculation accuracy is estimated by PSD matching using RBSP data
    - L* uncertainty causes a radial shift in the electron phase space density profile
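    The abstract describes a network that maps solar wind conditions plus the Dst index directly to L*. The sketch below shows only the general shape of such a surrogate; the layer sizes, input set, and (random, untrained) weights are hypothetical, since the paper's actual architecture and training data are not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def init_mlp(n_in=5, n_hidden=20, n_out=1):
        """Random (untrained) weights for a one-hidden-layer network."""
        return {
            "W1": rng.normal(0.0, 0.1, (n_in, n_hidden)),
            "b1": np.zeros(n_hidden),
            "W2": rng.normal(0.0, 0.1, (n_hidden, n_out)),
            "b2": np.zeros(n_out),
        }

    def forward(params, x):
        """Forward pass. x is a normalized input vector; here we assume,
        for illustration, [solar wind speed, density, IMF Bz, Pdyn, Dst]."""
        h = np.tanh(x @ params["W1"] + params["b1"])
        return h @ params["W2"] + params["b2"]

    params = init_mlp()
    lstar = forward(params, np.array([1.0, 0.5, -0.3, 0.2, -0.8]))
    ```

    A trained version of such a surrogate avoids the expensive drift-shell tracing at evaluation time, which is the practical appeal described in the abstract.
    
    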

    Accountable Algorithms

    Many important decisions historically made by people are now made by computers. Algorithms count votes, approve loan and credit card applications, target citizens or neighborhoods for police scrutiny, select taxpayers for IRS audit, grant or deny immigration visas, and more. The accountability mechanisms and legal standards that govern such decision processes have not kept pace with technology. The tools currently available to policymakers, legislators, and courts were developed to oversee human decisionmakers and often fail when applied to computers instead. For example, how do you judge the intent of a piece of software? Because automated decision systems can return potentially incorrect, unjustified, or unfair results, additional approaches are needed to make such systems accountable and governable. This Article reveals a new technological toolkit to verify that automated decisions comply with key standards of legal fairness. We challenge the dominant position in the legal literature that transparency will solve these problems. Disclosure of source code is often neither necessary (because of alternative techniques from computer science) nor sufficient (because of the issues analyzing code) to demonstrate the fairness of a process. Furthermore, transparency may be undesirable, such as when it discloses private information or permits tax cheats or terrorists to game the systems determining audits or security screening. The central issue is how to assure the interests of citizens, and society as a whole, in making these processes more accountable. This Article argues that technology is creating new opportunities—subtler and more flexible than total transparency—to design decisionmaking algorithms so that they better align with legal and policy objectives. Doing so will improve not only the current governance of automated decisions, but also—in certain cases—the governance of decisionmaking in general. 
The implicit (or explicit) biases of human decisionmakers can be difficult to find and root out, but we can peer into the “brain” of an algorithm: computational processes and purpose specifications can be declared prior to use and verified afterward. The technological tools introduced in this Article apply widely. They can be used in designing decisionmaking processes from both the private and public sectors, and they can be tailored to verify different characteristics as desired by decisionmakers, regulators, or the public. By forcing a more careful consideration of the effects of decision rules, they also engender policy discussions and closer looks at legal standards. As such, these tools have far-reaching implications throughout law and society. Part I of this Article provides an accessible and concise introduction to foundational computer science techniques that can be used to verify and demonstrate compliance with key standards of legal fairness for automated decisions without revealing key attributes of the decisions or the processes by which the decisions were reached. Part II then describes how these techniques can assure that decisions are made with the key governance attribute of procedural regularity, meaning that decisions are made under an announced set of rules consistently applied in each case. We demonstrate how this approach could be used to redesign and resolve issues with the State Department’s diversity visa lottery. In Part III, we go further and explore how other computational techniques can assure that automated decisions preserve fidelity to substantive legal and policy choices. We show how these tools may be used to assure that certain kinds of unjust discrimination are avoided and that automated decision processes behave in ways that comport with the social or legal standards that govern the decision. 
We also show how automated decisionmaking may even complicate existing doctrines of disparate treatment and disparate impact, and we discuss some recent computer science work on detecting and removing discrimination in algorithms, especially in the context of big data and machine learning. And lastly, in Part IV, we propose an agenda to further synergistic collaboration between computer science, law, and policy to advance the design of automated decision processes for accountability.
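    One foundational building block of the kind described in Part I is the cryptographic commitment: a decision maker can commit to its decision rule before any decisions are made, then later reveal the rule so anyone can check that it was not changed after the fact, without disclosing the rule in advance. The hash-based sketch below is a minimal illustration (the example rule string is hypothetical, and the zero-knowledge techniques the toolkit also relies on are not shown).

    ```python
    import hashlib
    import secrets

    def commit(rule: bytes) -> tuple[bytes, bytes]:
        """Return (commitment, nonce). The commitment is published up front;
        the nonce is kept secret until the reveal."""
        nonce = secrets.token_bytes(32)
        return hashlib.sha256(nonce + rule).digest(), nonce

    def verify(commitment: bytes, nonce: bytes, rule: bytes) -> bool:
        """Check that the revealed rule matches the earlier commitment."""
        return hashlib.sha256(nonce + rule).digest() == commitment

    # Hypothetical audit-selection rule, committed before the tax season:
    rule = b"audit if income > 100000 and deductions > 30000"
    c, n = commit(rule)

    # At reveal time, anyone can confirm the rule was fixed in advance:
    assert verify(c, n, rule)
    assert not verify(c, n, b"a different rule")
    ```

    This is one way to obtain the "procedural regularity" of Part II: the same announced rule, verifiably applied in every case, without making the rule public before use.
    
    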

    Model-Constrained Reconstruction Accelerated With Fourier-Based Undersampling for Hyperpolarized [1-13C] Pyruvate Imaging

    PURPOSE: Model-constrained reconstruction with Fourier-based undersampling (MoReFUn) is introduced to accelerate the acquisition of dynamic MRI using hyperpolarized [1-13C] pyruvate.
    METHODS: The MoReFUn method resolves spatial aliasing using constraints introduced by a pharmacokinetic model that describes the signal evolution of both pyruvate and lactate. Acceleration was evaluated on three single-channel data sets: a numerical digital phantom used to validate the accuracy of reconstruction and model parameter restoration under various SNR levels and undersampling ratios, prospectively and retrospectively sampled data from an in vitro dynamic multispectral phantom, and retrospectively undersampled imaging data from a prostate cancer patient used to test the fidelity of the reconstructed metabolite time series.
    RESULTS: All three data sets showed successful reconstruction using MoReFUn. In the simulation and retrospective phantom data, the restored time series of pyruvate and lactate maintained the image details, and the mean square residual error of the accelerated reconstruction increased only slightly (< 10%) at a reduction factor of up to 8. In the prostate data, quantitative estimation of the pyruvate-to-lactate conversion-rate constant was achieved with an error of less than 10% at a reduction factor of 2, compared with the conversion rate derived from unaccelerated data.
    CONCLUSION: The MoReFUn technique can be used as an effective and reliable acceleration method for metabolic imaging using hyperpolarized [1-13C] pyruvate.
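    The pharmacokinetic constraint referred to above is typically a two-site exchange model for pyruvate (P) converting to lactate (L). The sketch below integrates a standard form of that model; the rate constants and decay rates are illustrative values, not the parameters used in the paper.

    ```python
    def simulate(kpl=0.05, r1p=1/30, r1l=1/25, p0=1.0, dt=0.1, t_end=60.0):
        """Forward-Euler integration of the two-site exchange model:
            dP/dt = -(kpl + r1p) * P
            dL/dt =  kpl * P - r1l * L
        kpl: pyruvate-to-lactate conversion rate (1/s, illustrative)
        r1p, r1l: effective signal decay rates of pyruvate and lactate (1/s)
        Returns a list of (t, P, L) samples."""
        P, L = p0, 0.0
        series = [(0.0, P, L)]
        t = 0.0
        while t < t_end:
            dP = -(kpl + r1p) * P
            dL = kpl * P - r1l * L
            P += dP * dt
            L += dL * dt
            t += dt
            series.append((t, P, L))
        return series

    ts = simulate()
    ```

    In a model-constrained reconstruction, curves of this family replace a free per-frame time series, which is what removes enough degrees of freedom to resolve the undersampling aliasing.
    
    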

    Preparation of transparent oxyapatite ceramics by combined use of freeze-drying and spark-plasma sintering

    Lanthanum silicate oxyapatites, ion-conducting materials that are notoriously resistant to densification, have been obtained in the form of dense, transparent ceramics by combining the beneficial effects of freeze-drying and spark plasma sintering.

    High-density molecular characterization and association mapping in Ethiopian durum wheat landraces reveals high diversity and potential for wheat breeding

    Durum wheat (Triticum turgidum subsp. durum) is a key crop worldwide, yet its improvement and adaptation to emerging environmental threats are made difficult by the limited allelic variation included in its elite pool. New allelic diversity may provide novel loci to international crop breeding through quantitative trait loci (QTL) mapping in unexplored material. Here we report the extensive molecular and phenotypic characterization of hundreds of Ethiopian durum wheat landraces and several Ethiopian improved lines. We test 81,587 markers scoring 30,155 single nucleotide polymorphisms and use them to survey the diversity, structure, and genome-specific variation in the panel. We show the uniqueness of the Ethiopian germplasm using a side collection of Mediterranean durum wheat accessions. We phenotype the Ethiopian panel for ten agronomic traits in two highly diversified Ethiopian environments for two consecutive years, and use this information to conduct a genome-wide association study. We identify several loci underpinning agronomic traits of interest, both confirming loci already reported and describing new promising genomic regions. These loci may be efficiently targeted with molecular markers already available to conduct marker-assisted selection in Ethiopian and international wheat. We show that Ethiopian durum wheat represents an important and mostly unexplored source of durum wheat diversity. The panel analyzed in this study allows the accumulation of QTL mapping experiments, providing the initial step for a quantitative, methodical exploitation of untapped diversity toward producing a better wheat.
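    At its core, the genome-wide association study described above tests each SNP marker for association with a quantitative trait. The toy example below shows the per-marker regression of a trait on genotype dosage (0/1/2 copies of the minor allele) with synthetic data; real GWAS additionally corrects for population structure and kinship, which this sketch omits.

    ```python
    import numpy as np

    def snp_association(genotypes, trait):
        """Return (effect size, t statistic) for trait ~ intercept + genotype."""
        X = np.column_stack([np.ones_like(genotypes, dtype=float), genotypes])
        beta, _res, *_ = np.linalg.lstsq(X, trait, rcond=None)
        resid = trait - X @ beta
        dof = len(trait) - 2
        sigma2 = resid @ resid / dof                       # residual variance
        se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])  # SE of the slope
        return beta[1], beta[1] / se

    # Synthetic example: 10 plants, a SNP with a true additive effect of 0.5
    # on the trait, plus small noise.
    g = np.array([0, 0, 1, 1, 2, 2, 0, 1, 2, 1], dtype=float)
    noise = np.array([0.05, -0.02, 0.01, 0.03, -0.04,
                      0.02, -0.01, 0.0, 0.04, -0.03])
    y = 1.0 + 0.5 * g + noise
    effect, tstat = snp_association(g, y)
    ```

    Repeating this test across tens of thousands of markers (with multiple-testing correction) yields the genome-wide scan from which the reported loci are identified.
    
    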