
    Facing the Challenge of Data Transfer from Animal Models to Humans: the Case of Persistent Organohalogens

    A well-documented fact for a group of persistent, bioaccumulating organohalogen contaminants, the polychlorinated biphenyls (PCBs), is that appropriate regulation was delayed by, on average, up to 50 years. Some of the delay may be attributed to the fact that the science of toxicology was in its infancy when PCBs were introduced in the 1920s. Nevertheless, even after the development of modern toxicology, the story repeated itself 45 years later with the polybrominated diphenyl ethers (PBDEs), another group of compounds of concern for public health. The question is why. One possible explanation may be the low coherence between experimental studies of toxic effects in animal models and human studies. To explore this further, we reviewed a total of 807 PubMed abstracts and full texts reporting studies of the toxic effects of PCBs and PBDEs in animal models. Our analysis documents that human epidemiological studies of PBDEs stand to gain little from animal studies, for the following reasons: 1) the significant delay between the commercialisation of a substance and studies in animal models; 2) experimental exposure levels in animals that are several orders of magnitude higher than exposures in the general human population; 3) the limited set of evidence-based endocrine endpoints; 4) the traditional testing sequence (adult animals, then neonates, then foetuses), which postpones investigation of the critical developmental stages; 5) the limited number of animal species with human-like toxicokinetics and physiology of development and pregnancy; and 6) the lack of experimental outcomes suitable for epidemiological studies. Our comparison of published PCB and PBDE studies underscores an important shortcoming: history has, unfortunately, repeated itself. Broadening the crosstalk between the various branches of toxicology should therefore accelerate the accumulation of data and enable timely and appropriate regulatory action.

    Spelling Progress Bulletin Stress: Syllable and Phrase, by

    Consideration of stress is crucial both to the teaching of word perception and to proposals for an initial learning medium or an all-out spelling reform. This consideration is especially valid for American English because it is a syllable-stress language. Furthermore, some "working" knowledge of pitch and juncture, as well as of the stress-grammatical patterns of language (speech), is needed to introduce learners to the graphic system. Syllabic Stress. There are two types of stress. First, syllabic stress: the prominence, or importance, given to the first syllable of given /'givən/, to the second syllable of about /a-'bout/, and to the first of marvelous /'mar-və-ləs/. Second, phrasal (or sentence) stress: variations in the levels of energy used in speaking, such as the stress given to /'liv/ in We will deliver it to you. This last type of stress indicates the relationships between words in a phrase; for example, stressed the /'thē/ said in isolation versus unstressed the /thə/ in the phrase the boy. Syllabic stress (prominence given to syllables in words) overlaps phrasal stress (prominence given to a word within phrases or sentences). It is distinctive from Spanish, for example, because English

    Spelling Progress Bulletin Dedicated to finding the causes of difficulties in learning reading and spelling. The 3rd International Conference on Reading & Spelling

    on Spelling: Research and Reform, to be held July 31-Aug. 3 in Edinburgh, Scotland. Accommodations available at the Pollock Halls of Residence. Topics will include recent research on cognitive processes in spelling, comparative spelling in other languages and its effects for learners and fluent readers, advances in spelling for electronic communication, improvements in teaching spelling, experiments testing propositions for spelling reform, and progress in the development and implementation of improvements in orthography. There will also be a lighter side. The conference will follow the 1981 U.K.R.A. Conference, also in Edinburgh. Participants can go on to the I.R.A. European Conference in Finland that follows it, and the aspects of greatest interest to reading educators will come early in the programme to facilitate this. Pollock Halls offers the advantage of a small, relaxed conference in central Edinburgh. Call for Papers: early offers are invited, with provisional titles and abstracts of papers for consideration, so that the programme can be drawn up. Abstract: Spelling is one of the "tool" subjects that comprise the major curriculum area known as language arts. As spelling is an important factor in fostering communication between individuals, the individualized education plans of educable mentally retarded students should contain specific instructional strategies leading to the acquisition of this skill. The purpose of this article is to review the learning characteristics of educable students with regard to spelling and to present several instructional approaches. A description of language arts as a segment of the academic curriculum normally includes such areas as reading, spelling, writing, speech, and sometimes listening. These "tool" skill areas are critically important instructional fields in that certain levels of proficiency are necessary if adequate communicative skills are to be developed. Students who perform in the educable mentally retarded range of intellectual development exhibit particular deficiencies in learning specific competencies in several of the language arts areas. Therefore, the individualized education plans (IEPs) that are required for these students as a result of P.L. 94-142, the Education for All Handicapped Children Act of 1975, should include instructional strategies that address identified deficiencies in the language arts. A comprehensive discussion of all the language arts and their significance for IEPs of educable students would be too extensive for the scope of this article. As a result, the authors concentrate primarily on spelling. A selective review of relevant literature is provided as a rationale supporting the value of acquiring spelling skills. In addition, numerous instructional procedures are provided.

    Spelling Progress Bulletin Winter 1969

    Coming Attractions: Numalittera, or how to write in Math for the Computer, by Ivor Darreg. A minimal change system of simplified spelling. A study of spelling errors. Nonsense prose as a test of adequacy for a phonetic alphabet. A study of rules used in teaching spelling. Late News: Mr. Eugene Kelly, formerly Pres. of Coca Cola of Canada before his retirement, has (Nov. 25, 1969) left the income from his estate of over $6 million to further the success of the Initial Teaching Alphabet, it was announced by Prof. J. R. Block, Director of the i.t.a. Foundation (in America), at Hofstra Univ., Hempstead, New York 11550. The making of this gift coincides with the recent publication of the long-awaited British Schools Council Report on the tests and use of the Initial Teaching Alphabet in British schools. This comprehensive government-sponsored report concluded that: "It would appear that the best way to learn to read in traditional orthography is to learn first to read in the Initial Teaching Alphabet." "It is extremely gratifying to me," said Sir James Pitman at a dinner given him by the Academy for Educational Development at the home of Mr. & Mrs. Douglass Cater in Washington, D.C., "that a philanthropist such as Mr. Kelly had the foresight to leave such substantial resources to a cause which is so intimately related to the growth of men's minds. Only a literate man can realize his full potentiality, and I believe that the i.t.a. is an extraordinarily potent weapon in the war on ignorance everywhere." Sir James Pitman was in the U.S.A. at the invitation of Congressman R. C. Pucinski, Chairman of the General Subcommittee on Education, House of Representatives. He testified on i.t.a.'s use in American and British schools before the Congressional Subcommittee on Education.
-o0o- [Spelling Progress Bulletin Winter 1969 p3] Alphabet-amation, by Ivor Darreg. More than two thousand years ago, the stone cutters of Ancient Rome were at work, and their customary practice determined the shapes of capital letters, many of which survive in those forms to this very day. In deference to their memory, you will find such inscriptions on buildings as MVSEVM, recalling that the rounded form of capital U is a comparatively modern invention. Even when so revolutionary an invention as the typewriter appeared on the scene just about a century ago, its makers copied quite slavishly the serifs (spurs or short lines at the ends of certain letter-elements) originated by those Ancient Roman stone-cutters, tho there was no real need to do so. Similarly, there have been almost no innovations in letter-shapes since then, despite the extensive use of Linotype and Monotype machines, or even the new phototypesetters. Influenced by this inertia, many spelling reformers have given up all hope of adding any new letters to the alphabet and, with noble resignation, try to work with the traditional 26 letters. They take the typewriter keyboard and the customary appearance of the printed page of English with the blind obeisance due a law of physics or a religious precept. If pressed on this point, they usually have recourse to some economic argument: "It would cost too much to change. Think of the staggering expense for new type, and the impossible cost of transcribing from one orthography to another." As recently as ten years ago, these and other such arguments still seemed plausible. But in this last decade, the computer and its accessories have entered into almost every phase of our lives, and printing cannot remain an exception. Indeed, it hasn't.
Computers programmed to insert hyphens at the ends of lines, and to convert the code from typewriter keyboards into a very different code that will operate Linotype or other typesetter keyboards, are already in operation in many newspaper plants. There is now an electric typewriter which produces a magnetic tape recording which can be edited, altered, and erased before the typed material is transmitted to a computer. This means that automation is affecting the field of automation itself! That is, punched paper tape and punched cards are giving way to invisible recordings on magnetic tape that can be erased and reused indefinitely. Does the story end there? Hardly. The big news at present is OCR: optical character recognition. Why? Because the appetite of a computer and its associated apparatus (technically known as "peripherals") for data is incredibly large. What point in building computer insides to work at microsecond (one-millionth of a second) and nanosecond (one-billionth of a second) speeds, if hundreds of girls are condemned thereby to the boring, repetitive, and necessarily slow work of punching thousands of cards? Or the not much faster work of punching tape? Even the somewhat speedier method of typing onto magnetic tape still interposes the human eye and hand between the printed page and the computer's input, with all the possibilities for error, and the fantastic disparity in speed between human and machine is scarcely alleviated. This disparity in speed entails great expense to provide "buffer storage" and "interfacing." When computer time is quoted in dollars per second, it can't stand idle for the proverbial wink of a gnat's eyelash. Almost as expensive is time on telephone lines, leased wires, radio links, etc. provided to get data from one place to another. These facilities cost money even when standing idle for a split instant.
So if a machine could be taught to read, the gathering of data could be largely automated, and many people would be released from monotonous jobs to do something more interesting and more human. This process has already begun: optical character recognition is an accomplished fact, tho still in its infancy. Not only that: one segment of our economy couldn't wait for OCR, and so MICR was invented: magnetic ink character recognition for bank checks. The ink used to print those queerly-shaped numerals contains magnetic material rather like the coating on magnetic recording tape, and pickup heads something like those on tape recorders can "read" this ink. Thus you have before you an example of the influence of automation on the design of printed characters. The numeral 7 has been distorted to the point that it looks like a question mark (or the I.P.A. symbol for the glottal stop, to be exact). Machines are stupid and unthinking. While human beings have brains enough to tell, by consulting the context, whether zero or the capital letter O is meant, and similarly, to tell whether small l or the numeral one is meant (in most cases but not all), a machine, particularly one that must not cost too much, cannot tell what meaning a symbol is to have. Accordingly, this similarity between letter- and numeral-signs can no longer be tolerated. This has led to the design of new type-faces (and typewriter type) that machines will be able to read. Depending on how much you can or are willing to spend, an optical character reader can deal with numerals only, numerals and a few capital letters, capital letters only, or a complete alphabet in both lower and upper case with a few mathematical signs thrown in. Experimental machines can even read hand-printed numerals, if these are written in pre-located "boxes." Some machines can read several styles and/or sizes of type, but so far the machine-reading of cursive handwriting is for the future.
The usual process of reading is a scanning process, rather like the way in which a TV camera breaks up the picture into lines and dots so that it can be transmitted fragment by fragment. The coarseness or fineness of the scanning pattern determines how well the machine can read, and recognition of small details naturally costs more. (This scanning process was foreshadowed by an invention for the blind, which converted lines of print on a book page into groups of whistling tones via photocells.) Very high reading speeds are now possible, and this will increase still more in time to come. However, for better reliability, the material is preferably printed in a special machine-recognizable font, so many typewriter manufacturers are already selling specially-fitted typewriters for this purpose. Thus the economic pressure is now operating for change and innovation in letter-shapes, rather than against change, as it has been for centuries. Now this means that some of the opposition to a new spelling, and to additional letters for the English alphabet, is crumbling. But, and this is a big but, these new letters must be designed with machine-reading in mind. Moreover, the alphabet, and thus the spelling, must be remodled with yet another kind of machine governing the design: eventually there will be automatic reading-out-loud machines. These optical character recognizers will actuate synthetic speech devices, doubtless thru a digital-to-analog converter, and for that we must have a phonetic spelling. If we continue using silent letters, they will get pronounced, and if we insist on digraphs and trigraphs, they will get pronounced as the separate letters are, and not as the special sounds for which we are too stingy to assign proper new letters. To build a machine that could recognize the th in other as a unit sound, when its properly programed result would be as in outhouse, would be expensive, if even possible.
Multiply this by the ten other digraphs now used, plus the long vowel sounds, and the result would be an astronomical cost. Punctuation marks will probably make their own noises, a la Victor Borge, because machines cannot be made that will think and interpret our names for them. Italics will come out in a special tone of voice, while capital letters... now let's see, what would a capital K sound like? How would you learn the new spelling? By one of these reading-aloud machines, of course. Read along with it and listen, and soon you would learn even a completely new alphabet. Write or type in it, and see if the new machine could read it aloud as it should. So there goes the big bugaboo, how to learn a new spelling, up in a thin wisp of smoke. Special computer programs would also take care of the gargantuan task of retranscribing the old orthography into the new, altho a certain amount of human editing would be required. That is, the machines could read existing books and produce printing plates, no laborious retyping or reading and copying being required. Admittedly, some of this sounds like science fiction, but it is "breathing down our neck," and if we don't get ready for it, we will be obsolete. -o0o- [Spelling Progress Bulletin Winter 1969 p3] 3. Must we always have absurd spelling? by K. G. Aberdeen, Sask., Canada. "For the want of a nail the shoe was lost, For the want of a shoe the horse was lost, For the want of a horse the rider was lost, For the want of a rider the battle was lost, And all for the want of a horseshoe nail!" So goes an old children's poem, and there's more truth than poetry in it. It is also true that in our education system and in international communication there is an obvious, but unsolved, detail which is causing trouble everywhere. That detail is the inconsistent and absurd spelling of words in the English language.
It is widely known that some of our spelling is ridiculous, but only on careful study is it found that our spelling is less than 20% phonetic! We get so used to many words such as one, was, is, any, two, eight, that we don't realize the difficulties they present to Grade I pupils and foreigners. Many people say that spelling reform would be desirable but that it would be impossible to attain. Do they really mean that in this space age such a housekeeping chore is too difficult to put into effect? In this age of scientific achievement, does it make sense to use in our written communication a system which grew up like an unattended garden that has never been cultivated? Russia, Turkey, Norway, and many other countries have reformed their spelling, even tho before reform their spellings were already more reliable than ours. Why cannot we? Having been a school teacher for 40 years, I have observed the difficulties caused by spelling irregularities: they affect all people from Grade I to university, and those in all walks of life. I have been interested in spelling reform for 30 years and have gathered a good deal of information from Canada, the United States, England, Australia and Sweden, where a keen interest is being taken and much good work is being done. I am now retired and anxious to do whatever I can to aid the progress of spelling reform. The New Democratic Party of Canada has a proven record of putting "humanity first." So I believe it should be in favor of spelling reform, because reform would increase the efficiency of our language, make it easier to learn to read and spell, decrease the number of dropouts from our schools, render unnecessary the endless search for better methods of teaching reading, and, last but not least, cut the cost of education and the time to graduation. Certainly these points are worthy of serious consideration. Therefore, I propose to present to the N.D.P.
national convention a paper calling for the party to adopt, as a plank in its platform for the next federal election, the establishment of a commission to study the need for and the advantages of a reform in the spelling of English, and to work for its use in Canada. Some party members and delegates to the convention may consider this matter of too little importance to include in our election program. But anything that so vitally affects the future of generations of our children deserves serious thought and discussion with dissatisfied teachers and unhappy parents, and then a decision on whether it is not worthwhile to make the English language easier to attain for the millions of foreigners who struggle to use it. Many of them have said that if English were regularized, it would soon be the international world language. Think of what that would mean for better international understanding, not to mention commerce, scientific advancement, etc. As with the horseshoe nail in the rhyme, let us not allow something small and apparently inconsequential to undermine and retard our people. Be unselfish in your consideration of changes in spelling that would hinder you little yet be of tremendous aid to future generations and foreigners. [Spelling Progress Bulletin Winter 1969 pp4,5] 4. The Phonics Dilemma, by Bruce A. Lloyd* *Western Mich. Univ. Reprinted from Reading Horizons, vol. 10, no. 1, Fall 1969. The quicksand of confrontation in phonics methodology has been the either-or dichotomy exemplified by the nature-nurture controversy of yesteryear. The proponents of heredity as the prime dictator of human growth and development had well-polished arguments for their position, and so did those who believed in the preeminence of environmental factors. In the phonics controversy, the oversimplification of viewpoint was equally clear-cut: either we teach phonics as synthesis or we teach phonics as analysis, but we cannot do both.
However, modern phonics instructional theory and practice indicates that perceptive teachers are taking both handles and doing just that, up to a point. The Synthetic Approach. Through the process of synthesis, the reader looks at each letter of a word, says the sound of the letter, and puts the sound together with the next letter. This procedure is used with all the letters in the word, to the end that the reader will be able to pronounce the word when he has put all of the sounds together. At first glance, this system might appear to work, and work well, for pupils attacking unknown words: words not in their sight vocabulary. But all too frequently in actual practice, more confusion than enlightenment was generated. Not only is the meaning of the word not forthcoming, but word pronunciation is equated with reading. The problem of synthesis can be illustrated as follows: take the word BAT. Readers were supposed to say the sound of the letter B first, and it usually came out something like BUH. Probably the vowel A gave no trouble, and the reader would say A. So far he had BUHA. Finally, he looked at the letter T and probably said TUH. Putting all these sounds together, he came up with BUHATUH, which is a far cry from the sounds heard in the word BAT. The Analytic Approach. Analysis, on the other hand, called upon the reader to look at the word as a whole, to find familiar parts, and to see which phonics rules could be applied. From the known parts, the reader could determine the rest of the word, provided he knew and could apply techniques such as initial consonant substitution, initial consonant blend or digraph substitution, and/or the substitution of phonograms. For example, if the reader saw the word MAT and did not know it, but did know the word BAT, then all he had to do was substitute the sound of M for the B sound and arrive at the correct pronunciation.
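The contrast between the two approaches can be sketched in a few lines of code. This is a minimal illustration of the argument, not a phonics engine: the letter-to-sound table is a made-up approximation of the "BUH"/"TUH" sounds the article describes, and the substitution function assumes the two words share everything after the initial consonant.

```python
# Naive synthetic phonics: sound out each letter in turn and concatenate.
# The isolated consonant sounds carry an intrusive "UH" vowel, which is
# exactly why letter-by-letter blending misfires on whole words.
LETTER_SOUNDS = {  # hypothetical approximations, as in the BAT example
    "B": "BUH", "A": "A", "T": "TUH", "M": "MUH",
}

def synthesize(word):
    """Letter-by-letter blending, as the article describes it."""
    return "".join(LETTER_SOUNDS[ch] for ch in word)

# Analytic substitution: reuse a known word that shares the same phonogram.
def substitute(unknown, known, known_pronunciation):
    """Swap the initial consonant of a known rhyming word, if the
    phonogram (everything after the first letter) matches."""
    if unknown[1:] == known[1:]:
        return unknown[0] + known_pronunciation[1:]
    return None  # no shared phonogram; the technique does not apply

print(synthesize("BAT"))                 # BUHATUH - far from the real word
print(substitute("MAT", "BAT", "BAT"))   # MAT - correct by analogy
print(substitute("DOG", "BAT", "BAT"))   # None - pattern does not fit
```

The `None` case mirrors the article's point that substitution only works for word elements that fit the confining pattern; irregular spellings fall outside it entirely.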
Unfortunately, this technique, like all other techniques used with our presently spelled English words, has its advantages and its disadvantages, or limitations, as well. Looking for known word parts or word families (phonograms) is an acceptable word attack skill, but its use is restricted to those word elements that are: (1) known to the reader, and (2) fit the confining pattern. Likewise, not all English words fit these nice, neat patterns, because of the ridiculous, inconsistent, irregular spellings which are an ever-present roadblock to learning to read. To illustrate the limitations of analysis techniques, take a look at the word TOGETHER. A reader may analyze the components TO and GET and HER, and not be able to continue reading, because that is not the meaning of the word. Another fly in the phonic analysis ointment is the methodology based on rules. Readers were taught the many generalizations and the exceptions thereto, with the assumption that they would look at a word, think of the generalization(s) appropriate to it, and come up with the pronunciation. For example, the basic (and comparatively unusable) generalization regarding two adjacent vowels goes something like this: "when two vowels come together in a word, the first vowel usually takes the long sound (says its name) and the second vowel is silent." That rule is illustrated by words such as seat, boat, and hail. Recent studies have found this rule to be less than fifty percent effective, as there are more words that do not conform (break, lead, and said) than there are that do. Moreover, a number of rules of English phonic analysis have been shown to be of considerably less utility than was once thought. So it would seem that the analysis technique using known word parts and/or phonics generalizations has also left something to be desired. The Modern Approach.
This brings us to the basic question of the present dilemma, namely, what is the proper place of phonics in present-day, meaningful reading instructional programs? Synthesis has its limitations and so does analysis. What is the reading teacher to do? Perhaps the best answer rests with the proper use-stress continuum. Phonics can be defined as the correct association of speech sounds with their corresponding symbols. In other words, there is a phoneme-grapheme relationship (imperfect as it is), and readers need to be taught the correct phoneme response to the appropriate grapheme. Herein lies the problem. Because of the imperfections and inconsistencies in English spellings, with the corresponding lack of utility in either analysis or synthesis, how much stress should be given to modern phonics instruction in today's reading programs? To answer this question, we must look to the actual use of phonics, as the reader needs the appropriate skills to apply in attacking words. Initially, the young or inexperienced reader has a limited sight vocabulary and is faced with the problem of attacking many of the words he meets. So there are many printed symbol groups whose pronunciation needs to be unlocked in order for the reader to read, and to read better. The real problem, then, rests with the reader's recognition vocabulary (his store of sight words), simply because the larger the number of words he recognizes instantly, the easier it is to read and to comprehend. The fewer words he knows and/or recognizes, the more he will have to rely on word attack knowledge. The larger the vocabulary of understanding (recognition vocabulary) acquired by the pupil, the more effective will be his use of phonic analysis. Once the word is pronounced, the appropriate mental associations must be made with the word; then the individual can continue reading with understanding. For the reader who has a limited vocabulary of understanding, using the dictionary is the best resource.
Phonic skills do not give the reader word meaning. Even if the reader can "sound out" or pronounce the unknown word through the application of phonic analysis, he is still unsure of the meaning and must resort to context or the dictionary. Phonic analysis does not provide word definitions. These come from the reader's previous experiences. Mature readers follow a similar pattern. They also find phonic analysis skills of service in the pronunciation of unknown words. Such individuals usually look at words, find familiar parts, attach sounds to symbols, synthesize correctly, and come up with a pronunciation. Then, if the set of sounds is in the reader's vocabulary of understanding and he recognizes this from some previous experience, the meaning becomes apparent and he continues reading. On the other hand, even if he can say the word, he may

    Spelling Progress Bulletin Dedicated to finding the causes of difficulties in learning reading and spelling

    Announcements: United Kingdom Information Technology Year, 1982. As this issue goes to press, the United Kingdom is well into its promotion and publicity for Information Technology Year 1982, in which the British Government is spending £1.2 million to publicise modern developments in electronic communication, microprocessors, and video equipment. All of these were non-existent in their present forms only a few decades ago. Two hundred years ago, the only major piece of information technology was the printed word. The basis for storing information, the writing system, was invented several thousand years ago.