Wednesday, April 2, 2025

PERSONAL MEDITATIONS AS THIS YEAR'S HOLY WEEK APPROACHES

Thoughts about the Tree of Knowledge
in the Garden of Eden

By Jose A. Carillo


I have
given it a lot of thought, and now I suspect that the original Tree of Knowledge in the Garden of Eden was not a living plant but a powerful computer. The Bible is surprisingly silent about the nature of that tree, so artists and writers through the ages have felt free to variously picture it as an apple tree, a fig tree, a pear tree, a dragon’s blood tree, even a banana tree. I understand that in a 13th-century cathedral somewhere in France, there was even a fresco that showed Eve finding a serpent coiled around a giant branching European mushroom, the mildly toxic and hallucinogenic Amanita muscaria, drawn with Provençal innocence to represent the tree that gave us our much-dreaded mortality. These images of the Tree of Knowledge are as charming as the Romans envisioning their messenger-god Mercury as a runner with winged feet, as frightening as the early Christians sketching the devil as a thoroughly beastly creature with a serpent’s snout and bat wings, and as heavenly as the Renaissance artists conjuring archangels with majestic, blindingly white eagle’s wings.

        Adam and Eve and the Tree of Knowledge (C.206-1928), lead-glazed earthenware
        made by Thomas Toft, © The Fitzwilliam Museum, The University of Cambridge

All of this ancient imagery, however, miserably fails to capture the essence of a device or icon that is supposed to represent the most powerful source of wisdom and instruction the world has ever known. An apple tree, a banana tree, or a vine-like mushroom as the Tree of Knowledge? This seems to me to stretch the credulity of even a nine-year-old grade-schooler much too much! I would therefore rather think of the Tree of Knowledge as a Pentium 4 personal computer with a 56 kbps fax modem, hooked up by a powerful Internet server to the World Wide Web, capable of directly feeding on the 2.5 billion documents accessible on the Internet and of sifting through 520 billion more that are publicly accessible in other databases.* I cannot think of any other compendium or structure, no matter how massive, that could draw on such a huge database and merit “Tree of Knowledge” as a sobriquet, much less make this database accessible to even the small populace of the Garden of Eden close to the time of Creation.

Of course I realize that myriad conceptual objections can be raised against this seemingly whimsical intellectual construct. Chief of these is the question of how the Pentium 4 and the Internet could have gotten themselves into the Garden of Eden in the first place. Could it be that they had managed to quietly transport themselves back in time and install themselves into the Tree of Knowledge, or else disguise themselves as the tree itself? Those fixated on time’s immutability would of course deem this too farfetched, as improbable as the tales of extraterrestrial visitations peddled by the Swiss writer Erich von Däniken. But it is at least not as preposterous a concept as a fruit tree being the source of all human understanding and wisdom. A tree as a source of life, yes, like our coconut with its proverbial one thousand and one uses, from food to shelter to medicine to fuel and lumber; but just any tree as the source of all knowledge, I really wonder.

Imagined digital art rendition of The Tree of Knowledge

 And what about the paradox that would result if we believed that the Tree of Knowledge drew its power from a state-of-the-art Pentium? Would that belief still hold if we consider that the computer and the Web are actually the culmination of a series of small and big inventions that sprang from Adam and Eve having eaten the fruit of the Tree of Knowledge itself? Remember that the computer became possible only because somewhere early in time, man discovered and learned how to harness fire, then found a way centuries later to use it to melt the tiny particles of glass in sand into wafers of silicon, then developed a method for converting these wafers into transistor chips and into the extremely powerful motherboards and processors that are the heart of the modern computer. Remember, too, that the Internet and the Web are of much more recent vintage. It was only in 1973 that the Internet came into being, the happy result of American research into technologies to interlink computer networks of various kinds. Sixteen years later, in 1989, the British computer scientist Tim Berners-Lee invented the World Wide Web to unify and integrate the Internet’s global information and communication structure. Since then the Web has expanded into a global network of networks, enabling computers of all kinds—including yours and mine—to directly communicate and share services throughout much of our planet.

What is perhaps little appreciated in this dizzying train of inventions is that the modern computer and the Web have been essentially a continuing but silent Hindu-Arabic-European-American co-production, and that at its root were the ancient Indo-European languages and the Hindu-Arabic number system. We know, of course, that these twin foundations of our civilization moved into Europe and jumped across the English Channel into England, polishing themselves into the English language and into the Hindu-Arabic numerals that we know so well today. It really is no wonder that Boolean algebra, the mathematical system of representing logical propositions that became the foundation for the modern computer, was developed by the English mathematician and logician George Boole in the very same soil that produced the wonder of English literature that was William Shakespeare. The Chinese may have invented paper, the abacus, and gunpowder, and the Romans may have built an empire that extended all the way to Africa and to the rivers of Mesopotamia in what is now modern Iraq, but I simply cannot conceive of the modern computer being built from Chinese script or from the Roman numeral system, with which no stable building taller than the Roman Colosseum could be built because the system simply could not multiply and divide numbers properly.

That the Tree of Knowledge could not have been a fruit tree but a computer linked to the Web may remain debatable, and I will not quibble with that. But to me, one thing is clear and certain: the computer and the World Wide Web have made the Tree of Knowledge much more accessible and closer to us than ever before, and it would be a tragedy, if not outright foolish, for anyone not to learn to freely partake of its fruits. (Written in 2002)

From English Plain and Simple: No-Nonsense Ways to Learn Today’s Global Language © 2004 by Jose A. Carillo. Copyright 2008 by The Manila Times Publishing Corp. All rights reserved.

-------
*Since this essay was written, of course, the Pentium 4 processor has been supplanted in personal computers by much more advanced and powerful processors like the Core Duo, and Google has grown even more explosively, from 2,469,940,685 indexed web pages in 2002 to over 30,000,000,000 today. It can thus be said that the computing machines and the online search engine capability that I had described glowingly in this 2002 essay are now obsolete. (2009)


Tuesday, March 25, 2025

THE POWER OF SUASIVE DICTION

Giving a Touch of Authority to Our Prose 
By Jose A. Carillo 


“What a pair we make,” whispered the Prince of Wales to the pilloried presumptive royal knight William in the riotously charming 2001 film A Knight’s Tale, “both trying hard to hide who we are, both unable to do so.”

For those who have not seen the movie, the prince was constrained to shed his disguise as a monk among the lynching mob to save the disgraced knight, who a few days earlier had spared him from the ignominy of certain defeat by refusing to joust with him in a tournament. The knight, through the machinations of a villainous duke, had been unmasked as a lowly thatcher’s* son masquerading as a member of royalty, leading to his arrest and humiliation on the pillory.



This medieval morality tale gives a powerful insight into the crucial need to speak and act in keeping with who we think, presume, or pretend we are. When we write, in particular, we must use language that conveys our thoughts in ways that validate and support our own self-concept or projection of ourselves. Caesar’s wife must not only be chaste but must also look and sound chaste. The professor must really look and sound professorial. The presidentiable** must really look and sound presidentiable. To fail at this in both civilized and uncivilized society—or not to have the wisdom or guile to at least sustain the charade—is to invite catastrophe, which is precisely what brought the presumptive knight to the pillory for public lynching.

Be that as it may, our most potent tool for becoming credible is what the linguists call suasive diction. This is using language to persuasively convey facts and the speaker’s feelings toward those facts. No instrument is more potent for doing that, of course, than the writer’s or speaker’s vocabulary. Our words define us. Whether armed with excellent research or dubious information, whether motivated by good or bad intentions, we can turn off the audience with awkward or leaden words, or hold it in thrall with engaging words and well-turned phrases. It is largely through word choice, in fact, that we establish our credibility and rapport with our audience. Short of coercion or the force of arms, rarely can persuasive communication take place without this credibility and rapport.

The most basic technique for suasive diction is the proper use of the pronouns of power, namely “we,” “us,” “our,” “they,” and “them.” These innocent-looking pronouns can confer a sense of authority—the illusion of authority, if you will—on our written or spoken statements far beyond what the first-person singular can give. The first-person “I” and “me” speak only for the solitary communicator; on the other hand, the collective “we” and “us” speak for an entire group or institution, which people normally take for granted as less fallible and less prone to vainglory than the individual—hence presumed to be more credible, more authoritative.

This, for instance, is why newspaper editorials routinely use the institutional “we” although they are usually crafted by a solitary writer not so high on the paper’s editorial totem pole; it’s also why tyrants and despots of every stripe and persuasion always invoke “the right vested in me by God/ law/ the sovereign people” to seize power or hold on to it, and why candidates of paltry qualification and virtue invariably invoke “the people’s great desire for change” or “divine signs in the sky” as their passport to public office.

Of course, “we,” “us,” “our,” “they,” and “them” work just as well as pronouns of solidarity. They foster a stronger sense of closeness and intimacy with the audience, and can more easily put audiences at ease with what the speaker has to say. In contrast, the first person “I” often comes across as too one-sided and self-serving, particularly in writing, while the second person “you” can sound too pedantic and intimidating. We stand a much greater chance of getting a fair hearing from those antagonistic to our position by making them think that we are actually on their side.



Even if we are good at using the pronouns of power and solidarity, however, we must not for a minute believe that they are all we need to achieve suasive diction. The facts supporting our contention must be substantial and accurate. Our opinions must be truly informed, not half-baked, and our logic must be sound and beyond reproach. Otherwise, we may have to put on an act like that of the seemingly enlightened prince in A Knight’s Tale, lying to the lynching mob about the parentage of William the thatcher’s son, then justifying that lie by nonchalantly invoking royal infallibility: “He may appear to be of humble origins, but my personal historians have discovered that he is descended from an ancient royal line. This is my word, and as such is beyond contestation.”

A big lie indeed, but said with the confidence of a true royal. (March 16, 2004)

From the weekly column “English Plain and Simple” by Jose A. Carillo in The Manila Times, March 16, 2004 issue, © 2004 by The Manila Times Publishing Corp. All rights reserved.

-------
*Thatcher – A thatch is a sheltering cover, as for a house roof, made of straw, grass, or—in our domestic usage—nipa or buri. A thatcher is therefore someone whose occupation is to install a thatch or that kind of roofing. Of course, the name “Thatcher” became a media mainstay for many years when the feisty Margaret Hilda Thatcher was Britain’s prime minister for 11 years, from 1979 to 1990.

**I use the term “presidentiable” here with some strong misgivings, for it is not even recognized in any respectable dictionary. But during every national election season, it forms part of the Philippine journalistic and political vocabulary much too strongly and can't be completely ignored.

Tuesday, March 18, 2025

WRITE TO GET YOUR MESSAGE ACROSS CLEARLY, UNDERSTANDABLY

The sensible way to write 
By Jose A. Carillo

People ask me sometimes if there’s a quick formula for effective writing. My answer is yes, I know of one, but I can’t guarantee how quickly it will work for anyone aspiring to write better. In fact, like most people who make a living from writing and editing, I arrived at the formula not in just one burst of enlightenment but through years and years of practice. That formula, reduced to its simplest terms, is this:

Effective Writing = Good semantics + Good syntax + Sensibility

Before discussing the formula, I want to make it clear that by effective writing, I don’t necessarily mean great prose. I only mean writing that gets our message across clearly and understandably. If you have the natural gift or innate facility to write novels, plays, or movie scripts that can impress audiences, that will be great! But if you happen not to have it, don’t simply bewail your fate. It’s never too late to make yourself do much better with the written word. So long as you clearly understand how the formula’s three variables work, that goal should be well within your reach.

                                            IMAGE CREDIT: BRITISH COUNCIL.PH

The first writing variable is, of course, our vocabulary. I use the term semantics rather loosely for this variable so it can cover not only the acquisition of words but also the understanding of their various meanings. This variable is actually the easiest to load in our favor. In English, by just learning 10 new words every day, we can enrich our vocabulary by over 3,600 words a year; by the time we are 50 years old, there could be close to 200,000 words at our command. That would be about the same number as all of the basic words in the English language outside of archaic words, scientific terms, and jargon—more than we will really ever need to write effectively.

The second variable, syntax or grammar, is admittedly more difficult to master. We are all supposed to have started learning it as toddlers. But since English is only a second language for most of us (here I’m referring to Filipinos in particular), we learned to speak the language only gradually and discovered the intricacies of its grammar much later. Gaps in our English syntax are therefore inevitable. Thus, even if we have a superior intellect, we may not necessarily be able to speak or write English as well as its native speakers do.

An excellent grasp of English semantics and syntax will be great, of course, but we all know that this in itself won’t be enough. It doesn’t guarantee the ability to write good prose. There are in fact many highly educated people whose spoken English is beyond reproach, but who can’t write a clear, understandable, and interesting sentence that goes longer than six words. Their problem is that when they sit down to write, alone and without the stimulus of a live listener, they can only write stilted, unfocused thoughts addressed to no one in particular. In short, their writing doesn’t communicate adequately.

This happens because people in general don’t know that effective writing actually needs a very important third variable. Some people call it “sensitivity,” but I prefer the term sensibility for the variable in writing that creates sensation, feeling, and understanding in the reader through the written word. The “sensible” writer is one who can make his prose resonate or connect with the unseen reader. The ability to achieve this resonance is obviously a much more elusive factor than either semantics or syntax, and it is unfortunate that it’s not well recognized and formally taught in schools. What’s often taught as the principal goal of writing is self-expression, which is the opposite of sensibility.

The obsession with self-expression is, I think, the single biggest reason why many people, although intelligent, cannot write effectively. They don’t realize that to work with a reader, writing actually must do the exact opposite of self-expression. They become so busy giving vent in writing to what they feel and think, what they know, and what they believe in. They are unable to grasp the fact that for readers to understand and appreciate these things, they must write only with words, meanings, and mental images that are already in their readers’ heads. The reality is that these will often not be the words, meanings, and mental images that come naturally to us.

The sensible way to write, therefore, is to clearly understand that we are not writing for ourselves but for others. Writing is essentially speaking silently to an unseen listener. Our terms of reference when we write shouldn’t be our own intellect or accumulated knowledge; they should instead be our best estimate of the quality of mind and temperament of our readers. And it should be obvious by now that the only levels of vocabulary, grammar, and language that will work for this purpose are primarily those of our readers, not our own.

This brings us back to the formula I offered at the very outset of this essay:

Effective Writing = Good semantics + Good syntax + Sensibility 

Only by carefully balancing this equation can we really hope to make our writing truly clear, understandable, and interesting to our target readers.

This essay first appeared in my “English Plain and Simple” column in The Manila Times on January 3, 2003.




Wednesday, March 5, 2025

MASTERING A BASIC BUT OFTEN BADLY POSITIONED ENGLISH WORD

Once and for all, let’s come to grips
with the proper use of “however”


Who can say right now that he or she has totally mastered the usage of “however,” that slippery word that can’t seem to stay put in just one place to make a sentence yield a desired meaning? I have this feeling that not very many can answer that question with an unqualified “Yes!” In my case, for instance, even after dealing with “however” for the umpteenth time in my writing and editing work, I still sometimes catch myself vacillating over where to position it in certain sentence constructions. This is because experience has taught me, sometimes at great risk of social or professional embarrassment, that “however” can make subtle or profound changes in meaning and nuance—even in function—when it’s toggled across clauses and phrases or across sentences. And sometimes, “however” won’t do justice at all to the idea I want to express, so I need to discard it and replace it with a more compliant conjunction or adverb.

To put some rough science to the usage of “however,” I wrote a two-part essay about how it works for my English-usage column in The Manila Times in November 2005. I am now posting that essay once again in this week’s edition of Jose Carillo’s English Blogspot to give you a much surer footing when building sentences with this very useful but sometimes exasperatingly difficult-to-position word.

The use and misuse of “however” as a function word

One of the most misunderstood and misused words in English could very well be “however,” which works either as a conjunction or as an adverb. This is because many writers, no matter what their writing style may be and no matter how good their English may look, often tumble and fumble when using this very basic and very important function word.


                                                 IMAGE CREDIT: PROWRITINGAID.COM

Consider these representative samples of “however” misuse that I have gathered (all italicizations of quoted text mine):

(1) From an online essay on legal matters: “Correctly, both decisions are cited as saying that conversations and correspondence between the President and public officials are privileged…Once a firm decision, however, has been reached, like who will pay for the Venable contract, the conclusion reached is a matter of public concern no longer covered by privilege.”

(2) From a state university’s admission prospectus:If however, some graduation requirements are completed beyond the deadline, the student must register during the succeeding semester in order to be considered a candidate for graduation as of the end of that semester.”

(3) From an online update on international trade: “The proposal of the United States…has provided a big push to the negotiations. Trade analysts however are quick to point out that the US and EU proposals amount to nothing more than empty promises once again.”

(4) From a newspaper opinion column: “Anyway, the bill proposing the punitive tax has gone through committee deliberations and has been elevated to the plenary stage. Though this has more to do with legislators in customary fashion, humoring rather than indulging their colleague who is spearheading the bill, however, nonsensical its content and intent.”

The sentence in Item 1 shows the classic case of “however” misplacement. Here, “however” works as an adverb to mean “on the other hand” or “in contrast,” so it should logically be placed either at the beginning of the second sentence, where it can link this sentence firmly to its antecedent sentence, or right after the subordinate clause of the second sentence has been stated fully.

But the problem is that many writers habitually sneak “however” just anywhere in their sentences except up front, creating those abrupt interruptions of thought that needlessly bewilder readers. I think this is the result of being taught by English teachers who foist the grammatically, structurally, and semantically ruinous rule never to begin a sentence with “however,” and about this rule I have more to say later.

The quickest way to make cliffhanger “however” constructions smoother and clearer is to put “however” up front: “Correctly, both decisions are cited…However, once a firm decision has been reached, like [on] who will pay for the Venable contract, the conclusion reached is a matter of public concern no longer covered by privilege.” For even better rhythm, though, “however” can be deferred until the subordinate clause has been stated fully: “Correctly, both decisions are cited…Once a firm decision has been reached, however, like [on] who will pay for the Venable contract, the conclusion reached is a matter of public concern no longer covered by privilege.”

It’s likely that the strong resistance to using “however” to begin sentences has also led to the awkward “however” placements in Items 2 and 3. Note that in Item 2, even if the requisite comma after “if” is supplied to make the sentence structurally correct, the sentence would still sound stilted. Simply putting “however” up front fixes the problem, though: “However, if some graduation requirements are completed beyond the deadline, the student must register…”

In Item 3, on the other hand, the dysfunctional placement of “however” makes it difficult for readers to fathom what that word is supposed to be doing. Putting “however” up front clarifies the logic of the statement: “The proposal of the United States…has provided a big push to the negotiations. However, trade analysts are quick to point out that the US and EU proposals amount to nothing more than empty promises once again.”

In Item 4, “however” also functions as an adverb, this time to mean “no matter how” and modifying the adjective “nonsensical.” But setting it off between commas has turned the second sentence into gibberish, even if that second comma might have been placed there not by the writer but by a proofreader. For the sentence to make sense, that second comma has to be dropped so that “however” can logically form part of the phrase that modifies the word “bill”: “Though this has more to do with legislators in customary fashion, humoring rather than indulging their colleague who is spearheading the bill, however nonsensical its content and intent.”

Now let’s examine the awkward consequences of forcing “however” to do a job that’s better performed by the conjunction “but.”

Take a look at the dysfunctional placements of “however” in the following passages:

(1) From a religious website: “To this, we agree. However in recent years, certain details about the Days have become ‘common knowledge’ to friends and family members of Dazers.”

(2) From a civil society website: “Steiner’s ideas found widespread acceptance in a Europe devastated by the First World War. However the hyperinflation of post World War I Germany demolished practical attempts by Steiner and his colleagues to make the threefold society with an active and independent cultural sphere a reality.”

(3) From an online Philippine festival backgrounder: “(Magellan) died in the encounter. That was on April 27, 1521. The remnants of Magellan’s men were however able to return to Spain to report the incident and the possibility of conquest.”

In Item 1, “however” is supposed to mean “on the other hand,” but without the requisite comma to set it off from the contrasting clause, it erroneously gives the sense of “no matter if” and makes the sentence nonsensical. Adding the comma makes the correct sense emerge: “To this, we agree. However, in recent years, certain details about the Days have become ‘common knowledge’ to friends and family members of Dazers.” The rhythm of the statement gets even better when “however” is made to follow “in recent years” instead: “To this, we agree. In recent years, however, certain details about the Days…”

But there’s actually a much more efficient way of constructing such “however” statements. The problem, though, is that many people are afraid to use it because of this other dubious grammar rule taught by some English teachers: Never use “but” to begin a sentence. This rule, which is meant to discourage incomplete sentences, isn’t of much practical value. On the contrary, using “but” instead of “however” to begin sentences makes their construction more forthright and their meaning clearer, as in this reconstruction of the sentence in Item 1: “To this, we agree. But in recent years, certain details about the Days have become common knowledge…

In Item 2, “but” can also do a much better job than “however” in delivering the contrastive idea. Take a look: “Steiner’s ideas found widespread acceptance in a Europe devastated by the First World War. But the hyperinflation of post World War I Germany demolished practical attempts by Steiner and his colleagues to make the threefold society…” Similarly, the historical vignette in Item 3 flows much better when “but” is used to begin the third sentence: “(Magellan) died in the encounter. That was on April 27, 1521. But the remnants of Magellan's men were able to return to Spain to report the incident and the possibility of conquest.”

Indeed, unless we want a very strong contrast, “but” is often a better choice than “however” in setting two ideas in opposition. And we need not worry about the claim of some grammarians that “but” is not dignified enough to begin sentences in formal writing. The American Heritage Dictionary of the English Language, for one, assures us that “‘but’ may be used to begin a sentence at all levels of style.”

Now we can perhaps already agree on these more sensible ground rules for using “however” and “but”:

(1) Ignore the misguided caveats against beginning a sentence with “however” or “but.” Those rules only impede clear expression and the logical development of ideas. Up front, both “however” and “but” work just fine when used to mean “on the other hand” or “in contrast.” When “however” is used to mean “nevertheless,” though, the best position for it may not be up front in the sentence, and putting it within the sentence may sometimes also be inappropriate. Using “but” instead can usually fix the problem: “The trip is long and costly. But the destination is worth the trouble.”

(2) Although beginning a sentence with “however” is perfectly acceptable, we must minimize doing it. “However” is an extremely emphatic conjunctive adverb, one that tends to sound more important than the contrastive or opposing idea that it introduces. Functionally, “however” serves best as a conjunctive adverb in compound sentences where two independent clauses are strongly set in opposition: “They want the house; however, their money simply is not enough.”

When we use “however” primarily for such compound constructions, we also get the bonus of not being forced to use it much too often to begin sentences, which admittedly can make our sentences sound too irritatingly legalistic.

--------------
This essay combines and condenses an original two-part version I wrote for my weekly column “English Plain and Simple” for the November 14 and 21, 2005 issues of The Manila Times. The original two-part version, in essentially the same form, later appeared as Chapters 109 and 110 of my book Give Your English the Winning Edge, © 2009 by Jose A. Carillo. All rights reserved.


Wednesday, February 19, 2025

GETTING USED TO NEW ENGLISH COINAGE

The perils of using back-formations 
By Jose A. Carillo 

I’ll admit that I am rather finicky in my choice of words, rarely giving in to the temptation of using nice-sounding words of doubtful meaning or origin. In fact, when the time came for me to put together my early English-usage newspaper columns into my first book, I became downright obsessive about my vocabulary. I was therefore supremely confident—“smug” is perhaps the better word—that when my first book, English Plain and Simple: No-Nonsense Ways to Learn Today’s Global Language, finally went to press, I had tied up whatever vocabulary loose ends I might have overlooked in my original column pieces due to the pressures of newspaper deadlines.

A few weeks after the book came out, however, I got very upset when someone took issue with my use of the word “enthused” in this sentence: “In time, distracted and enthused by English-language stylists with comparable if not greater facility with prose, I gave up my search for both the writer and the book.” (“Rediscovering John Galsworthy,” chapter 39, page 116). The comment, which was part of an incisive post-publication critique by an extremely discerning reader, was this: “Enthused is a back-formation, one disapproved of by some careful writers/smug pedants.”

 

    IMAGE CREDIT: THEQUIRKSOFENGLISH.BLOGSPOT.COM


True enough, I discovered to my consternation that “enthuse,” which means “to show or express enthusiasm,” is not a well-accepted word. According to The Columbia Guide to Standard American English, this back-formation from the word “enthusiasm” has continued to be looked upon with distaste despite its having entered the English lexicon as far back as 1827. Apparently, the guidebook observed, this distaste for “enthuse” stems from people’s “dislike for the external emotional display and manipulation” conveyed by the word itself.

Merriam-Webster’s 11th Collegiate Dictionary, on the other hand, while similarly noting disapproval for “enthuse,” qualifies that “current evidence shows it to be flourishing nonetheless on both sides of the Atlantic especially in journalistic prose.” Even so, had I known that “enthuse” was still far from widely respectable, I would have avoided using it rather than risk being labeled as a less-than-careful writer.

Indeed, using back-formations can be perilous because as words coined from previously existing words, many of them have yet to prove themselves as valid and genuinely useful additions to the language. After all, a back-formation typically results from extracting what is wrongly supposed to be the root word from an existing longer word, when in fact that longer word is the root word itself. This often happens in the case of words ending in “-er,” “-ar,” “-or,” or “-ion,” not a few of which are thought to be verbs-turned-“doer”-nouns because their endings look like suffixes.

For instance, the noun “peddler” is presumed to be the root word “peddle” with an “r” added to it, but “peddler” is, in fact, the root word itself and “peddle” is simply a back-formation created by dropping the “r” from “peddler.” The verb “donate,” on the other hand, was formed by replacing the “-ion” of “donation” with “-e.” The noun “donation,” however, is actually the root word here, even if “donate” sounds and structurally looks more like the root word itself.

The same back-formation process has produced such words as “edit” (from the root word “editor”), “emote” (from “emotion”), “accrete” (from “accretion”), “aesthete” (from “aesthetics”), “burgle” (from “burglar”), and “televise” (from “television”). The difference is that through continuing usage that has whittled down opposition by vocabulary gatekeepers, these back-formations have become generally accepted English words.

Of course, it’s probably only a matter of time before “enthuse”—along with such dreadful back-formations as “liaise” (from “liaison”), “surveil” (from “surveillance”), “elocute” (from “elocution”), “incent” (from “incentive”), and “aggress” (from “aggression”)—similarly gains respectability. Until then, however, I think it would be prudent to put the usage of “enthuse” on hold, as I have done by discarding it in my revision for the succeeding printings of my book.

This is not to say, however, that we should eliminate back-formations altogether from our vocabulary. What would have happened to English if people simply hadn’t come up with such catchy words as “scavenge” (from “scavenger”), “diagnose” (from “diagnosis”), “escalate” (from “escalator”), “tweeze” (from “tweezers”), “jell” (from “jelly”), and “sleaze” (from “sleazy”)? Even the most exacting pedants have already given up their resistance to these words, for along with scores of other back-formations, they have already proven their semantic mettle as concise and forceful expressions of new ideas for which no single words had existed before.

The perils of using still unacceptable back-formations will always be there, of course, but no matter. We can easily deal with them by simply checking with a good dictionary each time we encounter words that just don’t seem to look or sound right.

This essay, which forms Chapter 103 of my book Give Your English the Winning Edge, first appeared in my weekly column “English Plain and Simple” in The Manila Times, ©2009 by the Manila Times Publishing Corp. All rights reserved.

Wednesday, February 12, 2025

LOOKING BACK TO THE ORIGINS OF THE DAY OF HEARTS

 The Real Score About Valentine’s Day     
 By Jose A. Carillo



“If you must write about Valentine’s Day,” my wife Leonor admonished me, “don’t be a spoilsport. By all means take a break from your grammar columns, but don’t try to take away the romance from Valentine’s.”

“Oh, don’t worry, Leonor,” I said, “I won’t be a spoilsport. Why would I want to do that? On the contrary, I want to tell lovers all over the world that they are right on target in doing the things they do on Valentine’s Day. I mean, you know, exchanging love tokens, whispering sweet nothings, having dinner by candlelight—good, old romance the way it should be.”

“Then you’ve got nothing really new to say,” she said. “You’ll just recycle the same old story that everybody recycles this time of year.”

“Not with this one, Leonor. I have a new thesis: that people should thank their lucky stars they can celebrate Valentine’s Day in a way not so different from how the ancient Romans did it. As you know, those people started it all almost a thousand years before the Christian evangelists came to Europe. They had this much-awaited love festival on February 14, precisely the same day as today’s Valentine’s Day. It went by another name, of course. They called it the Lupercalia.”

“Umm...interesting,” Leonor said. “Tell me more about it.”

“The Lupercalia, in plain English, was the ‘Feast of the Wolf-God.’ It was an ancient fertility rite in honor of a god who protected sheep from the wolves. Its high point was a mating game, a lottery for young, unmarried men and women. The organizers would write the names of qualified, interested women on small pieces of parchment, then drop them into a big vase. Each qualified male drew one piece from the vase, and the woman whose name was on that piece became his date or ‘steady’ for one whole year.”

“That simple? Unacquainted couples were paired with no courtship, no legal and religious mumbo-jumbo?”

“Yes, Leonor, and they had a whole year to find out if they were temperamentally and sexually compatible. If they were, of course, they married and raised a family.”

“How wonderfully uncomplicated, but how unromantic! And my heart bleeds for the young couples that had an eye for each other beforehand. With, say, 1,000 women’s names in that lottery, the probability of a woman getting picked by a man she already liked would be next to zilch; so were the chances of a young man picking the woman he really liked. And the chances of a mutually attracted pair being mated? That’s 1/1,000 multiplied by 1/1,000 or one in a million, right?”


“Right, Leonor! A priori romances simply couldn’t bloom unless the partners decided to mutually violate the rules. But there was one good thing going for that lottery, I think: it leveled the playing field for love and procreation. It must have exquisitely churned and enriched the gene pool of the ancient Romans.”

“Maybe so, but don’t you think their ritual was so elemental, so...shall we say, ‘uncivilized’?”

“That’s saying it mildly, Leonor. It scandalized the early Christian missionaries. They found it decadent, immoral, and, of course, unchristian. So they tried to change it by frying it with its own fat, so to speak.”

“How?”


“Well, the clerics simply revoked the practice of writing the names of young, unmarried women on the pieces of parchment. They wrote on them the names of the Christian saints instead. And you know what they offered to the young, unmarried man who picked the name of a particular saint?”

“What?”

“The privilege of emulating the virtues of that saint for one whole year.”

“What spoilsports, those clerics! Why would any sensible lover, whether male or female, want to play that sort of game? For Pete’s sake, that lottery was for love and romance and chance encounters, not for sainthood!”


Valentine was imprisoned by the Romans circa 270 A.D. for violating
a ban on performing marriages during wartime, then was stoned to
death on February 14, Lupercalia Day.

“That’s right, so the Romans resisted the new mechanics and stuck to the old. It was two centuries before the evangelists again tried to stamp out the Lupercalia in a big way. In 490 A.D., Pope Gelasius canonized a Roman by the name of Valentine. He was, by tradition, a priest martyred 220 years before for violating a ban on performing marriages during wartime. Valentine was stoned to death on a February 14, Lupercalia Day, so his feast day was conveniently made to coincide with it. In a sense, the clerics finally succeeded in Christianizing the ancient rites, but only in name and only edgewise, in a manner of speaking. As history would prove, no power on earth could stamp out its earthly and earthy attractions.”

“You’ve got a lovely story there,” Leonor said, “and you kept your promise of not being a spoilsport. So Happy Valentine’s Day, my love!”

“For you, Leonor, Happy Lupercle’s Day just this once, OK?” (2004)


------------------------
This essay in conversation form first appeared in my English-usage column in The Manila Times in 2004 and subsequently formed Chapter 142 of my book Give Your English the Winning Edge.

Tuesday, February 4, 2025

THE GRAMMAR OF NEGATION IN ENGLISH

Mastering the fine art of negation in English 
By Jose A. Carillo


We know that to affirm something to be true is much easier and more pleasant to do than to declare it to be untrue. This is because doing the latter often involves negating what somebody else holds to be true—a situation that could cause bad feelings, wounded pride, acrimonious exchange, or even vicious and protracted debate. It is therefore important for us to develop negation to a fine art, the better to defuse the pain and unpleasantness for the one being refused, rebutted, contradicted, denied, lied to, or denigrated.

The staple negation adverbs in English are, of course, “no,” “not,” “never,” and “without.” In addition to them, however, the language uses a remarkably wide range of devices for lexical negation (words with negative connotations) and affixal negation (positive words negated by affixes). I surveyed these negation devices in an essay that I wrote for my English-usage column in The Manila Times in the early 2000s, and I am now reposting it in this week’s edition of Jose Carillo’s English Blogspot for the benefit of Forum members and nonnative English speakers who may need a refresher on how to say “no” without causing offense.

Forming negative sentences correctly

Without any doubt, the adverb “no”—abetted by its semantic cousins “not,” “never,” “without,” and several others with a negative bent—is the most subversive word in the English language. Look how “no” undermines and negates every single thought and idea to which it latches on: “No, I don’t like you.” “No, I have never loved you.” “No, go away; my life will be much better without you.” And if you look back at the adverbial phrase “without any doubt” that begins the first sentence above, you would see how the word “without” totally reverses the sense of “doubt” to “certainty.” Overwhelmingly powerful, “no” and its cohorts can quickly and very efficiently demolish every declarative or affirmative statement that we can think up in the English language.

 IMAGE CREDIT: ENGLISHHINTS.COM

We can see that to negate entire statements, “no” takes a commanding position at the very beginning of sentences. It does so with brutal efficiency: “No swerving.” “No entry.” “No, sir, minors aren’t allowed here.” On the other hand, when “no” has to do the negating within a sentence, it often assigns “not” to take its place, commandeers an auxiliary verb, and positions “not” right after it: “The woman drove.” “The woman did not drive.” “The woman will not drive.” Of course, we already know that when “not” does this, the main verb relinquishes the tense to the auxiliary verb. In the examples above, in particular, the auxiliary “did” carries the past tense and the modal auxiliary “will” marks the future, while the main verb takes the bare stem “drive.”

The pattern of negation is slightly different in the perfect tenses. The adverb “not” simply inserts itself between the auxiliary verb and the main verb, with the main verb remaining in the past participle form even as the negation is consummated: “The woman has driven.” “The woman has not driven.” The important thing to remember is that “not” always positions itself between the helping verb and the main verb; for it to do otherwise would be grammatically fatal: “The woman not has driven.” “The visitors not have eaten.”

In contrast, “never” is a movable negator, certainly much more versatile than “not.” Watch: “The woman never drives.” “Never does the woman drive.” “The woman has never driven.” “Never has the woman driven.” “The woman never has driven.” “Never” is negation in its emphatic form—demolishing an idea to the extreme.

The adverb “no,” of course, can routinely negate any element by denoting absence, contradiction, denial, or refusal: “Under no circumstances will Claudia’s offer be accepted.” “I see no sign of reconciliation.” “The cities of Sodom and Gomorrah are no more.” “Have you no conscience?” The adverbs “not” and “never” work in much the same way: “Not a single drop of rain fell last summer.” “She will always be a bridesmaid, never a bride.”

But there’s one major caveat on “not”: it’s risky to use it in sentences that take the “all…not” form, because that construction is inherently ambiguous. Take this sentence: “All of the women in the district did not vote for the lone female candidate.” This sentence is semantically problematic; it could mean that some of the women did not vote for the lone female candidate, or that none of the women voted for her. Better to remove the ambiguity by fine-tuning the negation to yield the desired meaning. The first option: “Not all of the women in the district voted for the lone female candidate.” The second option: “None of the women in the district voted for the lone female candidate.”

The same caveat should also be observed when using “not” with the adjective “every,” as in this ambiguous sentence: “Every candidate did not meet the voters’ expectations.” Better: “None of the candidates met the voters’ expectations” or “All of the candidates failed to meet the voters’ expectations.”

Apart from using “no,” “not,” and “never,” we can also use the lexical semantics of negation as well as affixal negation to reverse the sense of things. Lexical negation is simply the negative structuring of sentences by using words with negative denotations, such as “neither,” “nor,” “rarely,” “hardly,” and “seldom.” Affixal negation, on the other hand, negates positive words through the use of the affixes “un-”, “im-”/“in-”/“il-”, “dis-”, “de-”, and “-less,” as in “unnecessary,” “imperfect,” “ineffective,” “illegal,” “disregard,” “decamp,” and “useless.”


IMAGE CREDIT: HAGARLANGUAGES.WORDPRESS.COM


When using these negative affixes, however, we must always remember to drop the “no,” “not,” or “never” in the sentence if our true intention is to negate the statement. Failure to do so will result in a grammatically incorrect double negative. “It is not illegal to steal,” for instance, means the opposite of the intended “It is illegal to steal”; the two negatives cancel out to yield “It is legal to steal”—with all its dire consequences to civilized society.

------------------
From the book Give Your English the Winning Edge by Jose A. Carillo © 2009 by the author, © 2010 by the Manila Times Publishing Corp. All rights reserved.