
    A Penny for your Thoughts

    Get PDF
    A Penny for your thoughts. What could I answer, when, in reality, I had been thinking of absolutely nothing? I was just looking at nature; nature who has always done something to me. I cannot put my finger on it, but I can feel it vaguely, with a certain weakness that causes a disengagement from the powers of description. As mysterious as night itself; like the flame glowing in the fireplace, or the tobacco smoke, lazily drifting along to nowhere. It is in that way that I watch the always vanishing 'something' that is forever there in nature.

    Alien Registration- Mangino, Alfonse (Portland, Cumberland County)

    Get PDF
    https://digitalmaine.com/alien_docs/25815/thumbnail.jp

    Simulation of an electrophotographic halftone reproduction

    Get PDF
    The robustness of three digital halftoning techniques is simulated for a hypothetical electrophotographic laser printer subjected to dynamic environmental conditions over a copy run of one thousand images. Mathematical electrophotographic models have primarily concentrated on solid-area reproductions under time-invariant conditions. The models used in this study predict the behavior of complex image distributions at various stages in the electrophotographic process. The system model is divided into seven subsystems: Halftoning, Laser Exposure, Photoconductor Discharge, Toner Development, Transfer, Fusing, and Image Display. Spread functions associated with laser spot intensity, charge migration, and toner transfer and fusing are used to predict the electrophotographic system response for continuous-tone and halftone reproduction. Many digital halftoning techniques have been developed for converting continuous-tone images to binary (halftone) images. The general objective of halftoning is to approximate the intermediate gray levels of continuous-tone images with a binary (black-and-white) imaging system. Three major halftoning techniques currently in use are Ordered-Dither, Clustered-Dot, and Error Diffusion; these halftoning algorithms are included in the simulation model. Simulation in electrophotography can be used to better understand the relationship between electrophotographic parameters and image quality, and to observe the effects of time-variant degradation on electrophotographic parameters and materials. Simulation programs, written in FORTRAN and SLAM (Simulation Language Alternative Modeling), have been developed to investigate the effects of system degradation on halftone image quality. The programs have been designed for continuous simulation to characterize the behavior or condition of the electrophotographic system. The simulation language provides the necessary algorithms for obtaining values for the variables described by the time-variant equations, maintaining a history of values during the simulation run, and reporting statistical information on time-dependent variables. Electrophotographic variables associated with laser intensity, initial photoconductor surface voltage, and residual voltage are degraded over a simulated run of one thousand copies. These results are employed to predict the degraded electrophotographic system response and to investigate the behavior of the various halftone techniques under dynamic system conditions. Two techniques have been applied to characterize halftone image quality. First, Tone Reproduction Curves are used to characterize and record the tone reproduction capability of the electrophotographic system over a simulated copy run, with density measurements collected and statistical inferences drawn using SLAM. Second, image sharpness is typically characterized by a system modulation transfer function (MTF); however, the mathematical models describing the subsystem transforms of an electrophotographic system involve non-linear functions. One means of predicting this non-linear system response is to use a Chirp function as the input to the model and then compare the reproduced modulation to that of the original. Since the imaging system is non-linear, the system response cannot be described by an MTF, but rather by an Input Response Function. This function was used to characterize the robustness of halftone patterns at various frequencies.
    Simulated images were also generated throughout the simulation run and used to evaluate image sharpness and resolution. The data generated from each of the electrophotographic simulation models clearly indicate that image stability and image sharpness are influenced not by dot orientation but by the type of halftoning operation used. Error Diffusion is significantly more variable than Clustered-Dot and Dispersed-Dot at low to mid densities; however, it is significantly less variable than the ordered dither patterns at high densities. Also, images generated with Error Diffusion are sharper than those generated using the Clustered-Dot and Dispersed-Dot techniques, but the resolution capability of the three techniques remained the same and degraded equally over each simulation run.
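    A minimal Python sketch of two of the halftoning families named above, ordered (dispersed-dot) dither and error diffusion. The dissertation's FORTRAN/SLAM models are not reproduced here; the 4x4 Bayer matrix, the Floyd-Steinberg error weights, and the test ramp are illustrative assumptions only.

import numpy as np

# 4x4 dispersed-dot (Bayer) threshold matrix, normalized to (0, 1).
BAYER_4X4 = np.array([[ 1,  9,  3, 11],
                      [13,  5, 15,  7],
                      [ 4, 12,  2, 10],
                      [16,  8, 14,  6]]) / 17.0

def ordered_dither(gray):
    """Binarize a grayscale image in [0, 1] by tiling a fixed threshold matrix."""
    h, w = gray.shape
    ty, tx = BAYER_4X4.shape
    thresh = np.tile(BAYER_4X4, (h // ty + 1, w // tx + 1))[:h, :w]
    return (gray > thresh).astype(np.uint8)

def error_diffusion(gray):
    """Binarize with Floyd-Steinberg error diffusion (7/16, 3/16, 5/16, 1/16 weights)."""
    img = gray.astype(np.float64).copy()
    h, w = img.shape
    out = np.zeros((h, w), dtype=np.uint8)
    for y in range(h):
        for x in range(w):
            new = 1.0 if img[y, x] >= 0.5 else 0.0
            err = img[y, x] - new
            out[y, x] = int(new)
            if x + 1 < w:
                img[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    img[y + 1, x - 1] += err * 3 / 16
                img[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    img[y + 1, x + 1] += err * 1 / 16
    return out

if __name__ == "__main__":
    # A horizontal gray ramp stands in for a continuous-tone original.
    ramp = np.tile(np.linspace(0.0, 1.0, 256), (64, 1))
    print(ordered_dither(ramp).mean(), error_diffusion(ramp).mean())

    The mean dot coverage of each binary output should track the mean gray level of the ramp; it is exactly this tone-reproduction property that the simulated degradation of laser intensity and photoconductor voltages perturbs over the copy run.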

    Problem of the isochronous hairspring.

    Full text link
    Thesis (M.A.)--Boston University. This item was digitized by the Internet Archive. The system of the hairspring and balance wheel used to regulate watches and chronometers does not perform an exact simple harmonic motion. For the proper functioning of the watch, the deviation from isochronism of the vibrations, caused by the reaction force at the fixed end point for any distortion of the spring, must be as small as possible. This deviation is correspondingly small if the shape of the hairspring is designed so that the reaction force at the end point is small. In theory only, perfect isochronism can be accomplished by a flat spring having one end free, so that for any distortion of the spring the tangent at the end point moves parallel to itself; however, the free-end spring cannot be realised in a watch. Phillips' solution to the problem expresses, in essence, that the reaction force at the end point is proportional to the displacement of the end point that would occur for a free spring. His solution is based on the problem of finding a possible curve such that the terminal point has a given trajectory or displacement and, simultaneously, at any point of the trajectory the tangent has an imposed direction. [TRUNCATED]
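    For background, the ideal case against which the deviation from isochronism is measured can be stated compactly (a standard textbook idealization, not taken from the thesis itself): a hairspring exerting a restoring torque proportional to the angular displacement of a balance wheel of moment of inertia I obeys

M = -k\theta, \qquad I\ddot{\theta} + k\theta = 0, \qquad T = 2\pi\sqrt{I/k},

    so the period T is independent of amplitude; it is precisely this amplitude independence that the reaction force at the fixed end of a real spring perturbs.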

    Assessing the Health of an Established Urban Church to Develop a Strategic Plan for Growth

    Get PDF
    The health of a church is solely dependent on the health of its leadership. The more the leadership imitates Christ (Ephesians 5:1-2), the more the church resembles the image of Christ (Romans 8:29). This work seeks to determine the kind of healthy church leadership that generates a healthy church culture for multiplying disciples. The purpose of this study is to assess the health of an established urban church in midtown Manhattan and to evaluate the perception that one large church is more effective than multiple churches, in order to discover an evangelistic approach that may help this and other churches in a similar context to grow spiritually as well as in numbers. This research project therefore examines the currently available literature on Christian leadership, church health, and church models in an attempt to discover models or practices that might help this particular church be more effective in its multiplication of disciples. Using the Church Health Analysis Questionnaire by Gene A. Getz, this dissertation evaluates the health of this established church and utilizes T4T (Training for Trainers) to assess the change in the leadership's disciple-making approach.

    Singapore International Film Festival 2014

    Get PDF
    NOTES FROM A SLIGHTLY SMALL ISLAND: SINGAPORE INTERNATIONAL FILM FESTIVAL. Following a two-year hiatus that included a complete revamping of its structure and organization, the Singapore International Film Festival returned for its 25th edition (4-14 December 2014) bigger and better than ever, with a re-branding effort that changed the former 'SIFF' into the current 'SGIFF'. Part of the inaugural Singapore Media Festival, SGIFF featured over 147 films from 50 countries spread over ten days and eleven sections, with a team headed by executive director Wahyuni Hadi and director-programmer Zhang Wenjie. It may be safe to say that the revitalized SGIFF heralds a new golden age of Singaporean and Southeast Asian cinema...

    Effects of template mass, complexity, and analysis method on the ability to correctly determine the number of contributors to DNA mixtures

    Full text link
    In traditional forensic DNA casework, the inclusion or exclusion of individuals who may have contributed to an item of evidence may depend on the assumed number of individuals from which the evidence arose. Typically, the minimum number of contributors (NOC) to a mixture is determined by counting the number of alleles observed above a given analytical threshold (AT); this technique is known as maximum allele count (MAC). However, advances in polymerase chain reaction (PCR) chemistries and improvements in analytical sensitivities have led to an increase in the detection of complex, low-template DNA (LtDNA) mixtures for which MAC is an inadequate means of determining the actual NOC. Despite the addition of highly polymorphic loci to multiplexed PCR kits and the advent of interpretation software that deconvolves DNA mixtures, a gap remains in the DNA analysis pipeline, where an effective method of determining the NOC needs to be established. The emergence of NOCIt, a computational tool that provides the probability distribution on the NOC, may serve as a promising alternative to traditional, threshold-based methods. Utilizing user-provided calibration data consisting of single-source samples of known genotype, NOCIt calculates the a posteriori probability (APP) that an evidentiary sample arose from 0 to 5 contributors. The software models baseline noise, reverse and forward stutter proportions, stutter and allele dropout rates, and allele heights. This information is then utilized to determine whether the evidentiary profile originated from one or many contributors. In short, NOCIt provides information not only on the likely NOC but also on whether more than one value may be deemed probable. In the latter case, it may be necessary to modify downstream interpretation steps such that multiple values for the NOC are considered or the conclusion that most favors the defense is adopted. Phase I of this study focused on establishing the minimum number of single-source samples needed to calibrate NOCIt. Once determined, the performance of NOCIt was evaluated and compared to that of two other methods: the maximum likelihood estimator (MLE), accessed via the forensim R package, and MAC. Fifty (50) single-source samples proved sufficient to calibrate NOCIt, and the results indicate that NOCIt was the most accurate method of the three. Phase II of this study explored the effects of template mass and sample complexity on the accuracy of NOCIt. The data showed that accuracy decreased as the NOC increased: for 1- and 5-contributor samples, the accuracy was 100% and 20%, respectively. The minimum template mass from any one contributor required to consistently estimate the true NOC was 0.07 ng, the equivalent of approximately 10 cells' worth of DNA. Phase III further explored NOCIt and was designed to assess its robustness. Because the efficacy of determining the NOC may be affected by the PCR kit utilized, the results obtained from NOCIt analysis of 1-, 2-, 3-, 4-, and 5-contributor mixtures amplified with AmpFlSTR® Identifiler® Plus and PowerPlex® 16 HS were compared. A positive correlation was observed between kits for all NOCIt outputs. Additionally, NOCIt yielded higher accuracies for 1-, 3-, and 4-contributor samples amplified with Identifiler® Plus and for 5-contributor samples amplified with PowerPlex® 16 HS.
    The accuracy rates obtained for 2-contributor samples were equivalent between kits; therefore, the effect of amplification kit type on the ability to determine the NOC was not substantive. Cumulatively, the data indicate that NOCIt is an improvement over traditional methods of determining the NOC and yields high accuracy rates for samples containing sufficient quantities of DNA. Further, the results of the investigation into the effect of template mass on the ability to determine the NOC serve as a caution that forensic DNA samples containing low target quantities may need to be interpreted using multiple or different assumptions on the number of contributors, as this assumption is known to affect the conclusion in certain casework scenarios. As a significant degree of inaccuracy was observed for all methods of determining the NOC at severely low template amounts, the data presented also challenge the notion that any DNA sample can be utilized for comparison purposes. This suggests that the ability to detect extremely complex LtDNA mixtures may not be commensurate with the ability to accurately interpret such mixtures, despite critical advances in software-based analysis. In addition to the availability of advanced comparison algorithms, limits on the interpretability of complex LtDNA mixtures may also depend on the amount of biological material present on an evidentiary substrate.
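    A minimal Python sketch of the threshold-based MAC rule described above (not of NOCIt, whose probabilistic model is not reproduced here). The locus names, peak heights, and the 50 RFU analytical threshold are illustrative assumptions only.

import math

AT_RFU = 50.0  # analytical threshold in RFU (assumed value for illustration)

def min_contributors_mac(profile, at=AT_RFU):
    """Return the minimum NOC implied by counting alleles above the threshold.

    Each autosomal contributor can show at most two alleles per locus, so the
    locus with the most detected alleles sets the minimum number of contributors.
    """
    max_alleles = 0
    for locus, peaks in profile.items():
        n = sum(1 for height in peaks.values() if height >= at)
        max_alleles = max(max_alleles, n)
    return math.ceil(max_alleles / 2)

# Hypothetical two-locus profile: allele label -> peak height (RFU).
example_profile = {
    "D8S1179": {"12": 420.0, "13": 390.0, "14": 95.0},
    "FGA":     {"20": 610.0, "22": 580.0, "23": 120.0, "24": 60.0},
}
print(min_contributors_mac(example_profile))  # -> 2, i.e. ceil(4 / 2)

    As the abstract notes, this counting rule breaks down for low-template mixtures, where dropout can hide alleles and stutter can add them; that gap is what probabilistic tools such as NOCIt are meant to address.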

    Overview of Hydraulics and Simiyu River Sediment Input into Magu Bay, Lake Victoria, Tanzania

    Get PDF
    Lake Victoria is experiencing multifaceted environmental and ecological problems, and a study of these problems needs a multidisciplinary approach to establish the cause-effect relationships. This study focuses on the hydraulics of Lake Victoria's Magu Bay and on the Simiyu riverine input of suspended sediments and their distribution in the Bay. Sampling in the river mouth and the Bay was conducted aboard an 8 m wooden boat with an outboard engine. Turbidity and current speed and direction were measured using an AANDERAA multi-sensor self-recording current meter, model 9 (RCM9). Water depth was measured using a FISHIN' BUDDY II echo-sounder. Geographical positions of the sampling locations were obtained using a Garmin 12 GPS. Suspended sediment concentrations were determined by sampling the water, filtering, and weighing the sediments. Data on rainfall and water discharge were obtained from the Mwanza Meteorological office. Statistical analysis shows that a cumulative rainfall of 1043 mm and the corresponding discharge of 98.5 m³/s have a return period of 5 years, while a return period of 50 years corresponds to a cumulative rainfall of 1403 mm and a discharge of 156.7 m³/s. The concentration of suspended particles was highest at the river mouth (1573 mg/l at station 1) and decreased exponentially with distance from the river (0.9 mg/l at station 8, the station farthest offshore from the river mouth). Turbidity was likewise higher towards the river and decreased with distance from the river mouth. Within the first half kilometre longitudinally from the river mouth, most of the suspended sediment is deposited; its concentration is attenuated exponentially as C/C0 = e^(-kx), where C is the suspended sediment concentration at distance x from the river mouth, C0 is the concentration at x = 0 (at the river mouth), and k = 2.1 is the attenuation coefficient. Both surface and bottom currents exhibited the same trend as the sediment concentration. At the mouth of the river, the surface and bottom currents are northward. Within the first half kilometre from the river mouth, the current speed is strongly attenuated, from 0.54 m/s to 0.07 m/s; from there onwards, the currents reverse, flowing almost opposite to the river flow. The Simiyu River is a major sediment contributor to the bay, ranging from zero on days with no water discharge to about 20,000 tons/day over the observed period. It is recommended that a comprehensive, long-term study covering all river sediment inputs into the lake be undertaken.
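    A small Python sketch that evaluates the attenuation relation reported above. The river-mouth concentration and the coefficient k = 2.1 are taken from the abstract; treating x in kilometres is an assumption made for illustration, since the abstract does not state the unit explicitly.

import math

C0_MG_PER_L = 1573.0   # concentration at the river mouth (station 1), mg/l
K_ATTENUATION = 2.1    # attenuation coefficient (per km, assumed)

def concentration(x_km):
    """Suspended sediment concentration C(x) = C0 * e^(-k*x) at x_km from the river mouth."""
    return C0_MG_PER_L * math.exp(-K_ATTENUATION * x_km)

for x in (0.0, 0.5, 1.0, 2.0):
    print(f"x = {x:.1f} km -> C = {concentration(x):.1f} mg/l")

    Under this assumed unit, roughly two thirds of the river-mouth concentration is lost within the first half kilometre, consistent with the observation that most of the sediment is deposited close to the mouth.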