Probing local nonlinear viscoelastic properties in soft materials
Minimally invasive experimental methods that can measure local
rate-dependent mechanical properties are essential to understanding the
behaviour of soft and biological materials in a wide range of
applications. Needle-based measurement techniques such as Cavitation
Rheology and Volume Controlled Cavity Expansion (VCCE) allow for
minimally invasive local mechanical testing, but have been limited to
measuring elastic material properties. Here, we propose several
enhancements to the VCCE technique to adapt it for characterization of
the viscoelastic response at low to medium stretch rates. The proposed
technique performs several cycles of expansion-relaxation at controlled
stretch rates in a cavity expansion setting and then employs a
large-deformation viscoelastic model to capture the measured material
response. Application of the technique to soft PDMS rubber reveals a
significant rate-dependent material response with high precision and
repeatability, while isolating equilibrated states that are used to
directly infer the quasistatic elastic modulus. The technique is further
established by demonstrating its ability to capture changes in the
rate-dependent response of a tuneable PDMS system. The measured
viscoelastic properties are used to explain earlier reports of
rate-insensitive material response by needle-based methods: it is
demonstrated that the conventional use of constant volumetric rate
cavity expansion can induce high stretch rates that lead to viscoelastic
stiffening and an illusion of rate-insensitive material response. We
thus conclude with a cautionary note on possible overestimation of the
quasistatic elastic modulus in previous studies and suggest that the
stretch-rate-controlled expansion protocol proposed in this work is
essential for accurate estimation of both quasistatic and dynamic
material parameters.
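As a kinematic aside (with illustrative numbers, not parameters from the paper): for a spherical cavity of reference radius a0, the cavity stretch is lambda = a/a0 and the cavity volume is V = V0*lambda^3, so holding the stretch rate fixed requires a volumetric rate Q = dV/dt = 3*V0*lambda^2*(dlambda/dt) that grows with cavity size, while a constant Q concentrates the highest stretch rates at the start of expansion, when the cavity is smallest. A minimal Python sketch of this comparison:

```python
import numpy as np

# Illustrative values only (not from the paper): reference cavity radius
# a0 = 0.1 mm, expanded to a stretch of lambda = 3.
a0 = 1e-4                        # reference cavity radius [m]
V0 = 4.0 / 3.0 * np.pi * a0**3   # reference cavity volume [m^3]
lam = np.linspace(1.0, 3.0, 200)

# Stretch-rate-controlled protocol: fix lam_dot and compute the pump
# rate Q = dV/dt = 3*V0*lam^2*lam_dot needed to sustain it.
lam_dot = 0.1                    # target stretch rate [1/s], made up
Q_needed = 3.0 * V0 * lam**2 * lam_dot

# Conventional protocol: fix Q and compute the stretch rate it induces,
# lam_dot = Q / (3*V0*lam^2), which is largest when the cavity is small.
Q_const = Q_needed[-1]           # same pump rate as the end state above
lam_dot_induced = Q_const / (3.0 * V0 * lam**2)

print(f"constant-Q stretch rate at lam=1: {lam_dot_induced[0]:.3f} 1/s")
print(f"constant-Q stretch rate at lam=3: {lam_dot_induced[-1]:.3f} 1/s")
# The ratio is lam_max^2 = 9: early in the expansion the material is
# loaded nearly an order of magnitude faster than at the end, which can
# push a viscoelastic solid into its rate-stiffened regime.
```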
Temporal fluctuations in excimer-like interactions between π-conjugated chromophores
Inter- or intramolecular coupling processes between chromophores such as
excimer formation or H- and J-aggregation are crucial to describing the
photophysics of closely packed films of conjugated polymers. Such coupling is
highly distance-dependent and should be sensitive both to fluctuations in
the spacing between chromophores and to the actual position on the chromophore
where the exciton localizes. Single-molecule spectroscopy reveals these
intrinsic fluctuations in well-defined bi-chromophoric model systems of
cofacial oligomers. Signatures of interchromophoric interactions in the excited
state - spectral red-shifting and broadening, and a slowing of
photoluminescence decay - correlate with each other but scatter strongly
between single molecules, implying an extraordinary distribution in coupling
strengths. Furthermore, these excimer-like spectral fingerprints vary with
time, revealing intrinsic dynamics in the coupling strength within one single
dimer molecule, which constitutes the starting point for describing a molecular
solid. Such spectral sensitivity to sub-Angstrom molecular dynamics could prove
complementary to conventional FRET-based molecular rulers.
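The distance sensitivity behind the closing sentence can be made concrete with the standard Förster relation E = 1/(1 + (r/R0)^6); in the Python sketch below, the Förster radius is an assumed illustrative value, not one measured in this work:

```python
# Standard Foerster relation for energy-transfer efficiency,
# E = 1 / (1 + (r/R0)^6), the basis of FRET "molecular rulers".
# R0 = 5 nm is an assumed illustrative Foerster radius.
def fret_efficiency(r_nm, R0_nm=5.0):
    return 1.0 / (1.0 + (r_nm / R0_nm) ** 6)

for r in (0.5, 1.0, 2.5, 5.0, 7.5):
    print(f"r = {r:4.1f} nm -> E = {fret_efficiency(r):.4f}")
# At chromophore-contact separations (~0.5-1 nm, where excimer-like
# coupling lives), E has saturated at ~1, so FRET alone cannot resolve
# the sub-Angstrom spacing fluctuations discussed above; a spectral
# observable that still varies in that range would be complementary.
```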
Biofilms as self-shaping growing nematics
Active nematics are the nonequilibrium analog of passive liquid crystals in
which anisotropic units consume free energy to drive emergent behavior. Similar
to liquid crystal (LC) molecules in displays, ordering and dynamics in active
nematics are sensitive to boundary conditions; however, unlike passive liquid
crystals, active nematics, such as those composed of living matter, have the
potential to regulate their boundaries through self-generated stresses. Here,
using bacterial biofilms confined by a hydrogel as a model system, we show how
a three-dimensional, living nematic can actively shape itself and its boundary
in order to regulate its internal architecture through growth-induced stresses.
We show that biofilms exhibit a sharp transition in shape from domes to lenses
upon changing environmental stiffness or cell-substrate friction, which is
explained by a theoretical model considering the competition between
confinement and interfacial forces. The growth mode defines the progression of
the boundary, which in turn determines the trajectories and spatial
distribution of cell lineages. We further demonstrate that the evolving
boundary defines the orientational ordering of cells and the emergence of
topological defects in the interior of the biofilm. Our findings reveal novel
self-organization phenomena in confined active matter and provide strategies
for guiding the development of programmed microbial consortia with emergent
material properties.
Observation of Hadronic W Decays in t-tbar Events with the Collider Detector at Fermilab
We observe hadronic W decays in t-tbar -> W (-> l nu) + >= 4 jet events using
a 109 pb-1 data sample of p-pbar collisions at sqrt{s} = 1.8 TeV collected with
the Collider Detector at Fermilab (CDF). A peak in the dijet invariant mass
distribution is obtained that is consistent with W decay and inconsistent with
the background prediction by 3.3 standard deviations. From this peak we measure
the W mass to be 77.2 +- 4.6 (stat+syst) GeV/c^2. This result demonstrates the
presence of two W bosons in t-tbar candidates in the W (-> l nu) + >= 4 jet
channel.
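The peak referred to above is found in the dijet invariant mass, m_jj = sqrt((E1+E2)^2 - |p1+p2|^2), computed from pairs of jet four-momenta. A minimal Python sketch with made-up jet kinematics, chosen so the pair mass lands near the W mass (these are not CDF data):

```python
import math

def dijet_mass(jet1, jet2):
    """Invariant mass m = sqrt((E1+E2)^2 - |p1+p2|^2) of a jet pair,
    each jet given as (E, px, py, pz) in GeV (natural units, c = 1)."""
    E = jet1[0] + jet2[0]
    px = jet1[1] + jet2[1]
    py = jet1[2] + jet2[2]
    pz = jet1[3] + jet2[3]
    return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

# Made-up, (nearly) massless jets whose opening angle puts the pair
# mass close to 80 GeV/c^2, as expected for W -> q qbar.
jet1 = (55.0, 55.0, 0.0, 0.0)
jet2 = (45.0, -13.2, 43.0, 0.0)
print(f"m_jj = {dijet_mass(jet1, jet2):.1f} GeV/c^2")  # ~80.0
```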
Comparative proteomics using 2-D gel electrophoresis and mass spectrometry as tools to dissect stimulons and regulons in bacteria with sequenced or partially sequenced genomes
We propose two-dimensional gel electrophoresis (2-DE) and mass spectrometry to define the protein components of regulons and stimulons in bacteria, including those organisms where genome sequencing is still in progress. The basic 2-DE protocol allows high resolution and reproducibility and enables the direct comparison of hundreds or even thousands of proteins simultaneously. To identify the proteins that comprise stimulons and regulons, peptide mass fingerprinting (PMF) with matrix-assisted laser desorption ionization/time-of-flight mass spectrometry (MALDI-TOF-MS) analysis is the first option and, if results from this tool are insufficient, complementary data obtained with electrospray ionization tandem MS (ESI-MS/MS) may permit successful protein identification. ESI-MS/MS and MALDI-TOF-MS provide complementary data sets, so a more comprehensive coverage of a proteome can be obtained by using both techniques on the same sample, especially when few sequenced proteins of a particular organism exist or genome sequencing is still in progress.
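The core of the PMF step can be sketched in a few lines: digest a candidate protein in silico with trypsin, compute the theoretical peptide masses, and count how many observed MALDI-TOF peaks they explain within a mass tolerance. In the Python sketch below, the protein sequence, peak list, and tolerance are illustrative placeholders, not data from any experiment:

```python
import re

# Monoisotopic residue masses (Da); peptide mass = sum(residues) + H2O.
RESIDUE = {
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
    "V": 99.06841, "T": 101.04768, "C": 103.00919, "L": 113.08406,
    "I": 113.08406, "N": 114.04293, "D": 115.02694, "Q": 128.05858,
    "K": 128.09496, "E": 129.04259, "M": 131.04049, "H": 137.05891,
    "F": 147.06841, "R": 156.10111, "Y": 163.06333, "W": 186.07931,
}
H2O = 18.01056

def tryptic_peptides(seq):
    # Trypsin cleaves C-terminal to K or R, except when followed by P.
    return [p for p in re.split(r"(?<=[KR])(?!P)", seq) if p]

def peptide_mass(pep):
    return sum(RESIDUE[a] for a in pep) + H2O

def match_count(seq, observed, tol=0.2):  # tolerance in Da, illustrative
    theoretical = [peptide_mass(p) for p in tryptic_peptides(seq)]
    return sum(any(abs(m - o) <= tol for m in theoretical) for o in observed)

# Made-up protein sequence and "observed" peaks (here generated from the
# same sequence, so all of them match by construction).
protein = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEK"
peaks = [peptide_mass(p) for p in tryptic_peptides(protein)][:3]
print(f"{match_count(protein, peaks)} of {len(peaks)} peaks matched")
```

In practice the same matching is run against every protein predicted from the (possibly partial) genome, and candidates are ranked by how many peaks they explain.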
The eck fistula in animals and humans
In all species so far studied, including man, portacaval shunt causes the same changes in liver morphology, including hepatocyte atrophy, fatty infiltration, deglycogenation, depletion and disorganization of the rough endoplasmic reticulum (RER) and its lining polyribosomes, and variable but less specific damage to other organelles. Many, perhaps all, biosynthetic processes are quickly depressed, largely secondary to the selective damage to the RER, which is the "factory" of the cell. These structural and metabolic changes in the liver after portal diversion are caused by the diversion around the liver of the hepatotrophic substances in portal venous blood, of which endogenous insulin is the most important. In experimental animals, the injury of Eck's fistula can be prevented by infusing insulin into the tied-off hilar portal vein. The subtle but far-reaching changes in hepatic function after portal diversion have made it possible to use this procedure in palliating three inborn errors of metabolism: glycogen storage disease, familial hypercholesterolemia, and α1-antitrypsin deficiency. In these three diseases, the abnormalities caused by portal diversion have counteracted abnormalities in the patients that were caused by the inborn errors, and amelioration of the inborn errors depends on the completeness of the portal diversion. In contrast, total portal diversion to treat complications of portal hypertension is undesirable and will always degrade hepatic function if a significant amount of hepatopetal portal venous blood is diverted away from the liver. When total portal diversion is achieved (and this is to be expected after all conventional shunts), the incidence of hepatic failure and hepatic encephalopathy is increased. If portal diversion must be done for the control of variceal hemorrhage, a selective procedure such as the Warren procedure is theoretically superior to the completely diverting shunt. In practice, better patient survival has not been achieved after selective shunts than after conventional shunts, but the incidence of hepatic encephalopathy has been less.
5. Systematic Item Writing and Test Construction
Standardized objective testing remains the most popular mode of licensure testing. Even where other types of tests are incorporated, it is often the case that they are provided as complementary to standardized, multiple-choice (MC) tests. Moreover, scoring theories and standard-setting procedures have been developed over the years in the context of standardized MC testing. At the same time, critics have pointed to limitations of contemporary MC testing practices, including lack of fidelity to real-life challenges and emphasis on recall of factual minutiae. In our view, testing professionals should make conscientious attempts to modify test development procedures so as to address valid criticisms. In this chapter we offer several suggestions for improving licensure test development, although it may not be feasible to adopt the entire array of recommendations we make. We are providing an intentionally wide selection in the hope that testing professionals will find something of use in their field of practice. Our discussion emphasizes careful design and systematic item-writing methods. We describe types of test items and make suggestions for the development and maintenance of an item pool. Later we discuss test-construction procedures.
OPERATIONAL ASSUMPTIONS
We assume that the testing program is intended for use in licensing persons who are entering an occupation or profession in a U.S. jurisdiction. Our discussion assumes further that the program is new; however, the implications for already established licensure programs may be clear to the reader. The testing programs we consider are those that rely on paper-and-pencil techniques generally associated with standardized testing. These imply having examinees fill in spaces on answer sheets that are optically scanned at a later time. We are also assuming that the standards for passing the licensure test will be established using one or more of the content-based approaches that are presently available. Such standards are fixed and maintained through equating procedures using the appropriate statistical methods. Details of these procedures are provided elsewhere in this volume. In this chapter we assume that systematic pretesting of newly written multiple-choice questions (MCQs) will be implemented as part of the testing program.
Much of our experience has been in the context of licensing and certifying physicians and our examples are largely restricted to medical applications. We believe that the features we outline will be effective with nonmedical professions as well.
IMPORTANCE OF TEST DESIGN
Test development comprises the full array of activities associated with bringing a standardized assessment into operation. The particulars of what we designate as design are of special significance in development of licensure tests for two reasons. First, the imperative to assemble evidence in support of the content validity of the examination is heightened in the licensure context. Second, the logical and procedural linkages between the design and the test items must withstand close scrutiny.
Job Analysis, Job Relevance and Content Validity
Content validity retains a somewhat controversial character among measurement specialists. Much contemporary commentary relegates content validity to an inferior status because it is described as emerging from the apparent fit between the test content and the persons (i.e., experts) involved in the development of the test. This version of content validity places it outside the preferred paradigm of interpretations of examinee scores. In our view this disparagement of content validity is unwarranted in licensure testing. Validation of licensure tests may rely heavily on evidence of unimpeachable job relevance of test content, but there is no reason to exclude empirical processes from content validation, including interpretations of scores. More to the point, the imperative to establish the unimpeachable job relevance of the licensure test enhances the importance of design because it is at the level of test design that the issue of relevance is first addressed.
The job relevance perspective implies that the test items in the licensure examination must be linked through systematic means to a well-defined representation of the demands of the occupation or profession. The Standards for Educational and Psychological Testing (American Educational Research Association, American Psychological Association & National Council on Measurement in Education, 1985) call for a job analysis in licensure test development (Fine, 1986), and this has come to be a well-accepted element of the process (see chapter 4). Although we prefer an alternative method to conventional job analyses, the more significant point is the imperative to start with a representation of the target occupation or profession. The purpose of such a representation is to establish a definition of knowledge and skill that is essential to competent practice. It is the candidate's possession of this knowledge and skill that the licensing examination is intended to establish or confirm, and the presumption is that the public is protected by such an assessment.
Torsion-induced stick-slip phenomena in the delamination of soft adhesives
Acknowledgements: The authors acknowledge the support from the National Science Foundation under award number CMMI-1942016. T. K. V. acknowledges the Barry Goldwater Scholarship and the support of the MIT Undergraduate Research Opportunities Program.

Soft adhesive contacts are ubiquitous in nature and are increasingly used in synthetic systems, such as flexible electronics and soft robots, due to their advantages over traditional joining techniques. While methods to study the failure of adhesives typically apply tensile loads to the adhesive joint, less is known about the performance of soft adhesives under shear and torsion, which may become important in engineering applications. A major challenge that has hindered the characterization of shear/torsion-induced delamination is imposed by the fact that, even after delamination, contact with the substrate is maintained, thus allowing for frictional sliding and re-adhesion. In this work, we address this gap by studying the controlled delamination of soft cylinders under combined compression and torsion. Our experimental observations expose the nucleation of delamination at an imperfection and its propagation along the circumference of the cylinder. The observed sequence of 'stick-slip' events and the sensitivity of the delamination process to material parameters are explained by a theoretical model that captures axisymmetric delamination patterns, along with the subsequent frictional sliding and re-adhesion. By opening up an avenue for improved characterization of adhesive failure, our experimental approach and theoretical framework can guide the design of adhesives in future applications.
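The paper's model is an axisymmetric delamination model coupled to frictional sliding and re-adhesion; as a much simpler illustration of where a stick-slip sequence comes from, the Python sketch below drives a single torsional degree of freedom at constant rate and lets the interface alternate between elastic loading ("stick") and slip to a lower residual torque. All parameter values are made up, and this is not the authors' model:

```python
# Generic 1-DOF stick-slip illustration (not the paper's model): a stage
# twists a soft cylinder at constant rate; the interface holds while the
# elastic torque is below a static threshold, then slips and re-adheres
# at a lower kinetic torque.
k = 2.0          # torsional stiffness of the cylinder [N*m/rad], made up
omega = 0.05     # imposed twist rate [rad/s], made up
T_static = 1.0   # adhesive (static) torque threshold [N*m], made up
T_kinetic = 0.4  # residual torque after slip [N*m], made up
dt, steps = 0.01, 20000

theta_stage, theta_base = 0.0, 0.0   # stage angle, interfacial slip angle
torque_trace = []
for _ in range(steps):
    theta_stage += omega * dt
    torque = k * (theta_stage - theta_base)
    if torque >= T_static:
        # Slip event: the interface slides until the stored torque
        # relaxes to the kinetic level, then re-adheres ("stick").
        theta_base = theta_stage - T_kinetic / k
        torque = T_kinetic
    torque_trace.append(torque)

drops = sum(torque_trace[i + 1] < torque_trace[i] for i in range(steps - 1))
print(f"slip events: {drops}, peak torque: {max(torque_trace):.2f} N*m")
```

The resulting torque trace is the familiar sawtooth of stick-slip: linear loading segments terminated by abrupt drops each time the adhesive threshold is reached.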