14 research outputs found

    Alien Registration- Beyea, James A. (Easton, Aroostook County)

    Get PDF
    https://digitalmaine.com/alien_docs/26591/thumbnail.jp


    Convergent Validity of a Wearable Sensor System for Measuring Sub-Task Performance during the Timed Up-and-Go Test

    No full text
    Background: The timed up-and-go test (TUG) is one of the most commonly used tests of physical function in clinical practice and for research outcomes. Inertial sensors have been used to parse the TUG test into its composite phases (rising, walking, turning, etc.), but this approach has not been validated against an optoelectronic gold standard, and to our knowledge no studies have published the minimal detectable change of these measurements.
    Methods: Eleven adults performed the TUG three times each under normal and slow walking conditions, and at 3 m and 5 m walking distances, in a 12-camera motion analysis laboratory. An inertial measurement unit (IMU) with tri-axial accelerometers and gyroscopes was worn on the upper torso. Motion analysis marker data and IMU signals were analyzed separately to identify the six main TUG phases: sit-to-stand, 1st walk, 1st turn, 2nd walk, 2nd turn, and stand-to-sit. The absolute agreement between the two systems was analyzed using intra-class correlation (ICC, model 2) analysis, and the minimal detectable change (MDC) within subjects was calculated for each TUG phase.
    Results: The overall difference between TUG sub-task times determined using 3D motion capture data and the IMU sensor data was <0.5 s. For all TUG distances and speeds, the absolute agreement was high for total TUG time and walk times (ICC > 0.90), lower for chair activity (ICC range 0.5–0.9), and typically poor for the turn time (ICC < 0.4). MDC values for total TUG time ranged between 2–4 s, or 12–22% of the TUG time measurement. MDCs of the sub-task times were proportionally higher, at 20–60% of the sub-task duration.
    Conclusions: We conclude that a commercial IMU can be used for quantifying the TUG phases with accuracy sufficient for clinical applications; however, the MDC when using inertial sensors is not necessarily improved over less sophisticated measurement tools.
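    The abstract reports both ICC agreement and within-subject MDC. A common way to derive MDC from a reliability coefficient and the between-trial standard deviation is MDC95 = 1.96 · √2 · SEM with SEM = SD · √(1 − ICC); this is a generic sketch of that formula, not necessarily the exact computation used in the study, and the numbers below are illustrative only.

```python
import math

def mdc95(sd: float, icc: float) -> float:
    """Minimal detectable change at 95% confidence.

    SEM = SD * sqrt(1 - ICC); MDC95 = 1.96 * sqrt(2) * SEM.
    """
    sem = sd * math.sqrt(1.0 - icc)
    return 1.96 * math.sqrt(2.0) * sem

# Illustrative numbers only (not from the study): a total TUG time
# with SD = 1.2 s across repeated trials and ICC = 0.90.
total_tug_mdc = mdc95(1.2, 0.90)
```

    Note that even with high ICC, a large trial-to-trial SD yields a large MDC, which is consistent with the abstract's point that good agreement does not automatically mean improved sensitivity to change.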

    Quantification of superparamagnetic iron oxide with large dynamic range using TurboSPI

    No full text
    This work proposes the use of TurboSPI, a multi-echo single point imaging sequence, for the quantification of labeled cells containing moderate to high concentrations of iron oxide contrast agent. At each k-space location, TurboSPI acquires several hundred time points during a spin echo, permitting reliable relaxation rate mapping of large-R*₂ materials. An automatic calibration routine optimizes image quality by promoting coherent alignment of spin and stimulated echoes throughout the multi-echo train, and this calibration is sufficiently robust for in vivo applications. In vitro relaxation rate measurements of SPIO-loaded cervical cancer cells exhibit behavior consistent with theoretical predictions of the static dephasing regime in the spin echo case; the relaxivity measured with TurboSPI was 10.47 ± 2.3 s⁻¹/mG, comparable to the theoretical value of 10.78 s⁻¹/mG. Similar measurements of micron-sized iron oxide particles (0.96 µm and 1.63 µm diameter) show reduced relaxivities of 8.06 ± 0.68 s⁻¹/mG and 7.13 ± 0.31 s⁻¹/mG respectively, indicating that the static dephasing criterion was not met. Nonetheless, accurate quantification of such particles is demonstrated up to R*₂ = 900 s⁻¹, with a potentially higher upper limit for loaded cells having a more favorable R′₂:R₂ ratio. Based on the cells used in this study, reliable quantification of cells loaded with 10 pg of iron per cell should be possible up to a density of 27 million cells/mL. Such quantification will be of crucial importance to the development of longitudinal monitoring for cellular therapy and other procedures using iron-labeled cells. Peer reviewed: Yes. NRC publication: Yes.
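    TurboSPI's hundreds of time points per spin echo enable relaxation rate mapping. As a generic illustration (not the paper's actual fitting pipeline), a mono-exponential decay S(t) = S₀·exp(−R₂*·t) can be fitted log-linearly per voxel; the example uses the R₂* = 900 s⁻¹ upper quantification limit quoted in the abstract as a synthetic test value.

```python
import numpy as np

def fit_r2star(t, s):
    """Log-linear fit of a mono-exponential decay S(t) = S0 * exp(-R2* t).

    Returns (R2*, S0). Assumes high-SNR magnitude data; real mapping
    pipelines typically use nonlinear least squares with a noise-floor
    term instead.
    """
    slope, intercept = np.polyfit(t, np.log(s), 1)
    return -slope, np.exp(intercept)

# Synthetic decay: 300 samples over 6 ms at R2* = 900 1/s, roughly the
# "several hundred time points" regime the abstract describes.
t = np.linspace(1e-5, 6e-3, 300)
s = 1.0 * np.exp(-900.0 * t)
r2star, s0 = fit_r2star(t, s)
```

    The log-linear form is convenient because it turns the exponential fit into a single least-squares line fit, though it weights late, low-signal samples heavily when noise is present.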

    Signal displacement in spiral-in acquisitions: simulations and implications for imaging in SFG regions

    No full text
    Susceptibility field gradients (SFGs) cause problems for functional magnetic resonance imaging (fMRI) in regions like the orbital frontal lobes, leading to signal loss and image artifacts (signal displacement and "pile-up"). Pulse sequences with spiral-in k-space trajectories are often used when acquiring fMRI in SFG regions such as inferior/medial temporal cortex because it is believed that they have improved signal recovery and decreased signal displacement properties. Previously postulated theories offer differing explanations for why spiral-in appears to perform better than spiral-out; however, it is clear that multiple mechanisms operate in parallel. This study explores differences in spiral-in and spiral-out images using human and phantom empirical data, as well as simulations consistent with the phantom model. Using image simulations, the displacement of signal was characterized using point spread functions (PSFs) and target maps, the latter of which are conceptually inverse PSFs describing which spatial locations contribute signal to a particular voxel. The magnitude of both PSFs and target maps was found to be identical for spiral-out and spiral-in acquisitions, with signal in target maps being displaced from distant regions in both cases. However, differences in the phase of the signal displacement patterns, which consequently lead to changes in the intervoxel phase coherence, were found to be a significant mechanism explaining differences between the spiral sequences. The results demonstrate that spiral-in trajectories do preserve more total signal in SFG regions than spiral-out; however, spiral-in does not in fact exhibit decreased signal displacement. Given that this signal can be displaced by significant distances, its recovery may not be preferable for all fMRI applications. Peer reviewed: Yes. NRC publication: Yes.
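    The central finding, that PSF magnitudes are identical for the two trajectories while their phases differ, can be seen in a toy 1D model (my simplification, not the paper's simulation): an off-resonant point source accrues phase Δω·t(k) at each k-space sample, and switching spiral-out for spiral-in amounts to reversing the time map t(k), which conjugates the phase modulation without changing the PSF magnitude.

```python
import numpy as np

# Toy 1D model: an off-resonant voxel multiplies each k-space sample by
# exp(i * d_omega * t(k)). "Spiral-out" reads the k-space center first
# (t grows with |k|); "spiral-in" reads it last (t shrinks with |k|).
N = 65                                   # odd so |k| is symmetric
k = np.arange(-(N // 2), N // 2 + 1)
dt = 1e-3                                # sample spacing in seconds
d_omega = 2 * np.pi * 40                 # 40 Hz off-resonance
t_out = np.abs(k) * dt                   # center-out time map
t_in = t_out.max() - t_out               # time-reversed (edge-in) map

psf_out = np.fft.ifft(np.fft.ifftshift(np.exp(1j * d_omega * t_out)))
psf_in = np.fft.ifft(np.fft.ifftshift(np.exp(1j * d_omega * t_in)))

# Magnitudes match, phases do not -- consistent with the abstract's
# claim that displacement magnitude is identical while intervoxel
# phase coherence changes between the two spiral directions.
same_mag = np.allclose(np.abs(psf_out), np.abs(psf_in))
phase_differs = not np.allclose(np.angle(psf_out), np.angle(psf_in))
```

    The symmetry follows because the spiral-in modulation is a global phase times the complex conjugate of the spiral-out modulation, and conjugation preserves the magnitude of the inverse Fourier transform.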

    The predictive power of geographic health care utilization for unintentional fatal fall rates

    No full text
    Background: Falls are the leading cause of fatal and nonfatal injuries among adults over 65 years old. The increase in fall mortality rates is likely multifactorial. With a lack of key drivers identified to explain rising rates of death from falls, accurate predictive modelling can be challenging, hindering evidence-based health resource and policy efforts. The objective of this work is to examine the predictive power of geographic health care utilization and longitudinal trends in mortality from unintentional falls amongst different demographic and geographic strata.
    Methods: This is a nationwide, retrospective cohort study using the United States Centers for Disease Control and Prevention (CDC) Web-based Injury Statistics Query and Reporting System (WISQARS) database. The exposure was death from an unintentional fall as determined by the CDC. Outcomes included aggregate and trend crude and age-adjusted death rates. Health care utilization, reimbursement, and cost metrics were also compared.
    Results: From 2001 to 2018, 465,486 total deaths due to unintentional falls were recorded, with crude and age-adjusted rates of 8.42 and 7.76 per 100,000 population respectively. Males had a significantly higher age-adjusted death rate than females (9.89 vs. 6.17; p < 0.00001), and both male and female annual age-adjusted mortality rates are expected to rise (male: +0.25 rate/year, R² = 0.98; female: +0.22 rate/year, R² = 0.99). Death rates increased significantly with age, with adults aged 85 years or older having the highest aggregate rate (201.1 per 100,000) and the steepest trend (+8.75 deaths per 100,000/year, R² = 0.99). Machine learning algorithms using health care utilization data were accurate in predicting geographic age-adjusted death rates.
    Conclusions: Machine learning models have high accuracy in predicting geographic age-adjusted mortality rates from health care utilization data. In the United States from 2001 through 2018, adults aged 85+ years carried the highest death rate from unintentional falls, and this rate is forecasted to accelerate.
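    The annual trends above are reported as linear slopes with R² values near 0.99. A minimal sketch of such a trend fit is shown below with ordinary least squares; the rate series is fabricated for illustration (it mimics the abstract's +0.25/year male slope but is not the CDC WISQARS data).

```python
import numpy as np

def linear_trend(years, rates):
    """Ordinary least-squares trend: returns (slope per year, R^2)."""
    slope, intercept = np.polyfit(years, rates, 1)
    pred = slope * years + intercept
    ss_res = np.sum((rates - pred) ** 2)
    ss_tot = np.sum((rates - np.mean(rates)) ** 2)
    return slope, 1.0 - ss_res / ss_tot

# Illustrative series only (not the CDC WISQARS numbers): a rate rising
# ~0.25 per 100,000 each year with small year-to-year perturbations.
years = np.arange(2001, 2019, dtype=float)
noise = np.array([0.02, -0.03, 0.01, 0.0, -0.02, 0.03, 0.01, -0.01,
                  0.02, -0.02, 0.0, 0.01, -0.03, 0.02, 0.0, -0.01,
                  0.03, -0.02])
rates = 5.0 + 0.25 * (years - 2001) + noise
slope, r2 = linear_trend(years, rates)
```

    Extrapolating such a fit forward is what lets the authors forecast continued rises, with the usual caveat that a linear trend assumes the underlying drivers stay constant.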