Sunday, November 18, 2007

CIDER, MAIZE, AND GRATITUDE

a homily delivered by the Rev. Dr. Tim W. Jensen
at the First Parish Church in Portland, Maine
Sunday November 18th, 2007


READING: “After Apple-Picking” by Robert Frost

***
One of the great things about being historically-minded is that it really can (and often does) give a person an entirely different perspective on just about everything in life. I know that a lot of people think of history as “boring” -- just a lot of talk about war and politics and the kind of people who are interested in that sort of thing, plus trying to memorize a bunch of meaningless dates that all sound the same after a while, or the names of people you’ve never met and are never going to meet because they went to meet their Maker long before any of us were even born. But this is just the superficial view. History is really all about people just like you and me; in its most extensive understanding, it’s the study of everything that Human Beings have ever done or thought or felt since, well, the beginning of time. It’s about tradition and heritage, but mostly it’s about understanding why things are the way they are by learning how they used to be, and how they got to be this way.

For example, take this symbolic communion meal of cornbread and cider we’re about to celebrate. I’m sure you’ve all probably heard that “an Apple a Day Keeps the Doctor Away,” but how many of you knew that, according to historians, the apple is probably the earliest fruit actually cultivated by human beings? That’s really pretty amazing when you think about it. Before there were any vineyards or olive groves, or any cultivated citrus fruits; before peaches, pears, plums, figs, dates, cherries, apricots and all the rest, there were apple orchards. So you see, there’s a reason that “A is for Apple.” And there are over 7500 different varieties or “cultivars” known today, all of which are descended from a single ancestral variety, which can still be found growing wild in the mountains of Central Asia, in the region between the countries of Kazakhstan, Tajikistan, and China.

Modern apples basically come in three different types. There are the sweet, so-called “dessert” apples (which are the kind that you can just pick and eat right off the tree, or nowadays more typically after bringing them home from the supermarket); then there are cooking or baking apples (which are generally a lot more tart than the dessert apples, but release their more subtle flavors when cooked); and finally there are cider apples -- which are far and away the majority of the cultivars, which makes a lot of sense when you stop to think about it. Because of all the beverages which have historically been available to human beings (including plain old water), cider is both one of the simplest and one of the safest...not to mention one of the tastiest.

Furthermore, thanks to what some would call the “miracle” of fermentation, cider also gets “hard,” even if you pretty much just leave it alone -- which makes it both easy to store and preserve, and also gives it all sorts of other historically desirable qualities. As those of you who may have read Michael Pollan’s excellent book The Botany of Desire already know, this is basically what the legendary “Johnny Appleseed” was doing when he planted all those apple seeds out in the Ohio territory back at the start of the 19th century. John Chapman was essentially a very eccentric, mystically-inspired Swedenborgian real estate speculator, who tried to anticipate the westward expansion of the young United States, and planted his orchards in such a manner that by the time the pioneer farmers caught up, there would be mature apple trees waiting for them. And although as an adult he apparently never even owned a pair of shoes, when he died he reportedly left his sisters an estate worth several million dollars.

This same intoxicating quality of apples also puts a rather interesting twist on the story of Adam and Eve in the Garden of Eden. You know, I’ve never really understood the doctrine of Original Sin, at least not on a theological level; but maybe it was really just that Adam and Eve decided to throw a little party, drank a little too much cider, did some things that maybe they shouldn’t have and which they regretted and were ashamed of later, and then tried to cover it all up; but naturally God found out anyway, and threw them out to fend for themselves. Nothing particularly original about THAT story, is there? It’s more like the oldest mistake in the book. Which is another big advantage of being interested in history; we get to learn from the mistakes of our ancestors, rather than having to make them all again ourselves.

The Cornbread, of course, has a history all its own. Maize, or “Indian Corn” as it was known to the Europeans, is a New World crop, native to the Americas; and for the Native Americans it was one of the “Three Sisters” (along with beans and squash) that provided much of the basis for their diet. The three crops were grown together in fields of small, cultivated hills -- the cornstalk doubling as a beanpole for the beanstalk, with the squash planted around the base, and a fish head at the bottom of the hole for fertilizer. This was the agricultural technique which, according to folklore, Squanto taught to the Pilgrims at Plymouth -- and combined with plentiful fish and game, as well as other native American plants like potatoes, tomatoes, wild rice, wild onions, and of course here in this part of the world, blueberries and maple syrup, it allowed the indigenous inhabitants to eat pretty well most of the time. Cooking was easy. Often they simply combined the meat and vegetables into a thick soup called sagamite, or else steamed their food in the ground just like we would at an old-fashioned clambake today.

And sometimes they would make and eat popcorn, or grind the dried kernels into a coarse cornmeal and cook it as a quick bread, kind of like a tortilla. But it wasn’t until the arrival of the Europeans that people started to bake actual cornbread and eat it as a staple of their diet, just as they baked bread with the milled flour of more traditional cereal grains like wheat or rye: grains which, like the colonists themselves, were brought over from the old country and planted here in the Americas. But until these grains were well established in the New World, cornbread was a staple of the Pilgrim diet: a creative combination of the old and the new, of innovation and tradition which is now an important part of our own cultural heritage as well.

Which brings me to the point of all this culinary history. When we think of the traditional Christian Communion -- the Eucharist -- we think of a symbolic meal comprised of the two staple foods of the ancient world -- bread baked from wheat, and wine fermented from grapes. It is a reenactment of a traditional Passover meal, but more importantly, it is a making sacred of that which is ordinary: a sacramental act to commemorate Christ’s sacrifice on the cross.

But the symbolism also works on at least two other levels as well. Communion is a celebration of community -- not just in the sense of breaking bread together, but also symbolized by the very foods themselves: think of how the grain of the fields and the fruit of the vine become as one in the bread and the wine. And then there is also the fact that these foods are alive with yeast, which is what causes the wine to ferment, and the bread to rise (even if, for Passover, it is baked before it has the chance to rise too much, to symbolize the sudden urgency of the Exodus from Egypt). And in the New Testament, these are both metaphors for the Kingdom of God as well: the New Wine which the old wineskins cannot contain, or the leaven which was hid in a measure of flour, until the whole loaf was leavened.

The food we serve at our own symbolic meal shares these same properties. But it also reflects the heritage of THIS region of the world, and the bringing together of traditional English and Algonquin cuisines, just as they did at the celebration of the First Thanksgiving so many years ago.

And at the end of the day, it really is about Giving Thanks, and expressing our gratitude for the great gift that is life itself. We were, each of us, born into this world naked and helpless. But through the compassion and generosity of others -- beginning most commonly with our parents and immediate family, but including as well friends and neighbors, members of the extended community (including our communities of faith) and of society as a whole, we are protected and nurtured and helped to grow to maturity.

And the ONLY appropriate response to this great gift is one of Gratitude, combined with the commitment to imitate the example of our ancestors, with our own generous and compassionate service to others whose needs are often even greater than our own.

Sunday, November 11, 2007

Armistice and Remembrance

a sermon preached by the Rev. Dr. Tim W. Jensen
at the First Parish Church in Portland, Maine
Armistice Day, Sunday November 11th, 2007


***
In 1933, the members of the Oxford Union, a student debating society affiliated with England's Oxford University, approved the following proposition: "Resolved, that this House will in no circumstances fight for its King and Country." In his subsequent history of the Second World War, Winston Churchill pointed to this "shameful" resolution as an example of the "lethargy and blindness" which caused the British nation "to cower from the menace of foreign peril, frothing pious platitudes while foemen forged their arms." He added: "It was easy to laugh off such an episode in England, but in Germany, in Russia, in Italy, in Japan, the idea of a decadent, degenerate Britain took deep root and swayed many calculations."

One of the supposed lessons of the Second World War, which we have heard repeated so often in our own time to justify American military operations in foreign lands, is that the appeasement of tyrants by reasonable men and women only fuels the fires of their evil aspirations; that peace is best maintained through strength, and by constant vigilance in the defense of freedom. But the students of the Oxford Union in 1933, who as children had helplessly witnessed from afar as their fathers, their uncles, and their elder brothers perished senselessly in the mechanical slaughter of the trenches along the Western Front, had drawn from that experience very different lessons of war and peace. Their resolution reflected not their decadence, nor their degeneracy, and certainly not their cowardice (as they would so shortly have the chance to demonstrate), but rather their profound commitment that never again should the civilized world allow itself to become engulfed by warfare.

Today is Armistice Day, the anniversary of the end of the First World War. And I want to use this opportunity to talk a little about issues of war and peace in a larger context, because it seems to me that much of the original spirit of this holiday has been lost in recent times, and especially in recent days. We now celebrate "Veterans Day,” on which we honor the sacrifices of those who served in wartime; we talk about the heroes of the “Greatest Generation,” who defended freedom and democracy from the threat of Fascist totalitarianism a half-century ago, and how their legacy has now descended on to us. We talk passionately of the need to “support our troops,” regardless of how we may feel personally about the policies of our government which have put them in harm’s way. But we tend to ignore the original sense of the word "Armistice" -- literally, a setting aside of arms. And I personally would like to see a little more of that sentiment observed on this holiday.

The desire for genuine peace, it seems to me, is a universal concern among religious people of good faith, and has been for as long as human history has been recorded. "Blessed are the Peacemakers,” it tells us in Scripture, “for they shall be called God's Children." Peacemaking is more than just the elimination of the threat of war. It is also an active, dynamic, creative way of living which seeks to cultivate the seeds of true harmony and justice even as it cuts away at the roots of conflict and discord, and thus it invariably operates at two distinct levels. The first level might be thought of as one of policy -- the pragmatic things which governments or other organizations do or fail to do in order to avoid the possibility of war. The second level is one of individual contribution and commitment -- a devotion to peace based on values and principles which are fundamentally religious in nature.

These two levels of peacemaking -- policy and personal commitment -- are quite distinct, although they are also profoundly interdependent; and both of course are subject to the "judgment of history," to which our politicians so frequently appeal. Yet the lessons both of history and of religion are by no means always clear or unambiguous. If, indeed, Churchill was correct in attributing at least some of the blame for the Second World War to the strident pacifism typified by the students of the Oxford Union, it is equally important to remember that it was the similar sentiments expressed by students and others in the 1960's which eventually brought an end to our nation's military involvement in Southeast Asia, just as it was a naive application of the opposite "lesson" which got us involved in that conflict in the first place. Familiarity with policy without the corresponding personal investment simply reduces us to the status of "armchair strategists" -- war and peace become somebody else's problem, while we stand around the sidelines and second-guess. And likewise commitment and action without a solid understanding of the lessons of history leaves open the very real possibility of becoming part of the problem rather than part of the solution -- resulting in a situation where well-intentioned acts merely serve to create the opposite effect from what was intended.

The tragic irony of our current situation in Iraq and Afghanistan is rooted in precisely this kind of “disconnect” between the pragmatic and the idealistic. I am enough of a historian to know that there are times when the use of military force is an appropriate option. There are times, in fact, when it is the only option. But the methods we use to pursue our goals must never be allowed to undermine the very values we aspire to defend. How can we claim to be champions of freedom and democracy when we so freely disregard the same democratic liberties that so many American veterans have fought and died to protect? When emotions run high, as they did in the days immediately following the 9/11 attacks, it is easy to motivate people to follow a course of action which promises to strike decisively at the heart of the problem, and then to justify those actions with the claim that desperate times call for desperate measures. But the real lesson of history is that actions, ultimately, speak louder than words; that rhetoric can only conceal reality for a limited time; and that when our deeds contradict our cherished values and principles, our values and principles become the ultimate losers, and we ourselves become our own worst enemies.

By the 11th of November, 1914, a mere three months after the outbreak of hostilities, the Great Powers of Europe found themselves locked into a stalemated war of attrition which none of them had wanted, but which national pride and rigid mobilization schedules had drawn them to like moths to a candle. After the initial German offensive was blunted by French reinforcements literally rushed to the front in Parisian taxicabs, and the bloody battles in Flanders during the "race to the sea," in which four-fifths of the original British Expeditionary Force were killed, the conflict became deadlocked in a seemingly endless routine of bombardments, raids, and "standing to," in which 5000 men might perish on a "quiet" day, and casualties soared into the hundreds of thousands during "major" offensives, which often resulted in only a few hundred yards of territory lost or gained and the "exchange [of] one wet-bottomed trench for another."

Writing in her book The Guns of August, historian Barbara Tuchman observes that: "...with the advent of winter came the slow deadly sinking into the stalemate of trench warfare. Running from Switzerland to the Channel like a gangrenous wound across French and Belgian territory, the trenches determined the war of position and attrition, the brutal, mud-filled, murderous insanity known as the Western Front." Only after both sides had succeeded in slaughtering an entire generation of young men, while at the same time bankrupting their economies and subjecting their civilian populations to various degrees of hardship and privation, did the influx of fresh troops and war materiel from across the ocean help break the stalemate and cause the German government to sue for peace.

Twenty-five years later, French and German armies once again faced each other along the Western Front, from behind the fortifications of the Maginot and Siegfried lines, in an episode which became known as the Phoney War or "Sitzkrieg." This time the French soldiers were under orders not to fire at Germans they observed moving on the other side of the no-man's land -- because, after all, it would only encourage them to fire back.

Subsequent historical analysis suggests that had the Allies acted decisively within the first few weeks or months of the war, Hitler might easily have been defeated in short order -- indeed, the officers of the German General Staff, who also recalled the terrible lessons of 1914, were ready to dispose of their Fuhrer themselves and sue for peace at the first opportunity. But instead the Allies refused to act, and when the Blitzkrieg finally fell in the west, in the spring of the following year, the fortifications of the Maginot line were rapidly bypassed by the German Panzers, France fell in a matter of weeks, and only the evacuation of the British Expeditionary Force in small, civilian-owned boats from Dunkirk preserved the possibility of any resistance in the west.

Perhaps no one could have foreseen the coming of a Hitler in 1918, when at the 11th hour of the 11th day of the 11th month the guns stopped firing along the Western front. Confined to a military hospital recovering from a poison gas attack, this insignificant Austrian corporal felt betrayed and disgraced by his country's surrender to the Allies; while the punitive conditions of the Treaty of Versailles created more than enough resentment among the vanquished to enable his later rise to power on a platform of cultural pride, ethnic hatred, and restored national honor.

Had the victors of the Great War agreed to a just and honorable peace, Hitler might simply have remained a failed artist and frustrated member of the lunatic fringe. It's difficult to say about these things. But the inevitable temptation to punish our enemies rather than behaving generously in victory is rarely a pattern conducive to real peace. Peacemakers everywhere might well take to heart the sentiments of Abraham Lincoln, who served as Commander in Chief during America’s bloodiest and most bitter war, that one best destroys one's enemies by making them one's friends.

That opportunity existed on Armistice Day in 1918, and in many ways it should remain a valid agenda for all peacemakers today. Woodrow Wilson's "Fourteen Points," first articulated shortly after America's entry into the Great War, provide an insightful context for the understanding of such a peace which has not lost its currency even after four generations. I know 14 points may seem like a lot -- in fact, the French Premier of the time, Georges Clemenceau, pointed out that "The Good Lord had only ten!" -- but the essence of Wilson's vision can be summarized without the need for delving into his specific proposals for individual nation-states.

Wilson called for an end to secret treaties and military alliances, and their replacement by "Open covenants of peace, openly arrived at" and diplomacy which "shall proceed frankly and always in the public view." He insisted on "Absolute freedom of navigation upon the seas," and "the removal, so far as possible, of all economic barriers and the establishment of an equality of trade conditions among all the nations consenting to the peace." He called for a general reduction of national armaments "to the lowest points consistent with domestic safety," and for the "free, open-minded and absolutely impartial adjustment of all [territorial] claims," with "strict observance of the principle that in determining questions of sovereignty, the interests of the populations concerned must have equal weight with the equitable claims of the government whose title is to be determined."

Above all, Wilson called for the formation of "a general association of nations...under specific covenants for the purpose of affording mutual guarantees of political independence and territorial integrity to the great and small alike." This basic framework of international law, embodying so much of that simple schoolhouse ethic of fair play on a level field, is the legacy which the college professor and Nobel Laureate who served as our 28th President has left to posterity; and on many levels it might still serve well as a foundation for our nation's current foreign policy.

Franklin Roosevelt’s “Four Freedoms” articulated many of these same sentiments in a much simpler and more straightforward manner. To the traditional American liberties of Freedom of Expression and Freedom of Belief, FDR added “Freedom from Want” and “Freedom from Fear” – two very tangible benefits of basic personal and economic security which flow from a long and lasting Peace, and perhaps define, in pragmatic and palpable terms, the “blessings of Liberty” we hope to secure for ourselves and our posterity. And yet it seems to me that something is horribly wrong when we believe that our own liberty can only be secured through the violent domination of others, and at the expense of their safety and prosperity.

When it comes to the more personal, spiritual aspects of Armistice Day, the lessons are not so easy to summarize. The experience of war on almost any level frequently results in two almost entirely contradictory realizations. The first is a healthy level of cynical skepticism concerning anything which has not been adequately tested by fire; and the second an equally irrational optimistic hope that the sacrifices made by one’s self and one's comrades have not been in vain, and that beyond the unspeakable horror of the battlefield lies an equally unspeakable promise of a better way, which somehow can and will redeem the lives of those who have suffered and died on our behalf, and bring meaning to an activity which is intrinsically without meaning.

Without this optimistic belief in the redemptive power, not so much of violence, but of personal sacrifice, perhaps there would never be another war. Yet without it there could certainly be no hope of an enduring peace either -- for without a willingness to stand up for those who cannot stand up for themselves, in time the innocent would once more fall victim to the ambitious, and fear and avarice again displace the values of tolerance and compassion at the heart of our society.

And so we must continue to speak out in defense of those whose voices have been silenced, and who are no longer capable of defending themselves, in the naive expectation that somehow, someday, it will all make a difference. This skeptical hope, this cynical optimism born of suffering and sacrifice, is perhaps the most critical legacy of Armistice Day. It is a lesson we simply cannot afford to forget if we truly wish to create a safer, more prosperous, more peaceful world....

“In Flanders Fields the poppies blow, between the crosses row on row, that mark our place; and in the sky the larks, still bravely singing, fly scarce heard amid the guns below. We are the Dead. Short days ago we lived, felt dawn, saw sunset glow, loved and were loved, and now we lie in Flanders fields. Take up our quarrel with the foe: to you from failing hands we throw the torch; be yours to hold it high. If ye break faith with us who die we shall not sleep, though poppies grow in Flanders fields....”

In his ground-breaking work of literary and social criticism The Great War and Modern Memory, former Rutgers University professor and World War Two veteran Paul Fussell draws a sharp distinction between the somber tone of the first nine lines of John McCrae’s famous poem, and the “recruiting poster rhetoric” and "propaganda argument...against a negotiated peace....” articulated in the final six. “Words like stupid and vicious would not seem to go too far,” Fussell rages. “It is grievously out of contact with the symbolism of the first part, which the final image of poppies as sleep-inducers fatally recalls.”

And yet sometimes keeping faith with the dead is far more complicated than simply renewing old quarrels, and taking up the torch from failing hands and carrying it once more into the breach. Sometimes holding high the torch demands an entirely different set of actions and attitudes altogether....

{the following bracketed passage was dropped from the sermon preached on Sunday morning, in the interest of time}

[In an on-line essay written just this past week, military historian Lt. Col. William Astore (Ret.) describes a scenario in which “...the world's finest military launches a highly coordinated shock-and-awe attack that shows enormous initial progress. There's talk of the victorious troops being home for Christmas. But the war unexpectedly drags on. As fighting persists into a third, and then a fourth year, voices are heard calling for negotiations, even ‘peace without victory.’ Dismissing such peaceniks and critics as defeatists, a conservative and expansionist regime -- led by a figurehead who often resorts to simplistic slogans and his Machiavellian sidekick who is considered the brains behind the throne -- calls for one last surge to victory. Unbeknownst to the people on the home front, however, this duo has already prepared a seductive and self-exculpatory myth in case the surge fails....”

“The United States in 2007?” Astore asks. “No, Wilhelmine Germany in 1917 and 1918, as its military dictators, Field Marshal Paul von Hindenburg and his loyal second, General Erich Ludendorff, pushed Germany toward defeat and revolution in a relentless pursuit of victory in World War I. Having failed with their surge strategy on the Western Front in 1918, they nevertheless succeeded in deploying a stab-in-the-back myth, or Dolchstoßlegende, that shifted blame for defeat from themselves and Rightist politicians to Social Democrats and others allegedly responsible for losing the war by their failure to support the troops at home.....The German Army knew it was militarily defeated in 1918. But this was an inconvenient truth for Hindenburg and the Right, so they crafted a new ‘truth:’ that the troops were ‘unvanquished in the field.’ So powerful did these words become that they would be engraved in stone on many a German war memorial....”

“Given the right post-war conditions,” Astore concludes, “the myth of the stab-in-the-back can facilitate the rise of reactionary regimes and score-settling via long knives -- just ask Germans under Hitler in 1934. It also serves to exonerate a military of its blunders and blind spots, empowering it and its commanders to launch redemptive, expansionist adventures that turn disastrous precisely because previous lessons of defeat were never faced, let alone absorbed or embraced. Thus, the German military's collapse in World War I and the Dolchstoß myth that followed enabled the even greater disaster of World War II....”]

I profoundly doubt that the nine million French, German, British, Russian, Austrian, Belgian, Italian, Turkish, and American soldiers (I could go on)...soldiers who were slaughtered in the carnage of that Great “War to End All Wars,” rested much more peacefully knowing that within a generation, another estimated seventy million soldiers and civilians would be joining them in that euphemistic “sleep” from which no one ever wakes. The danger of appeasing foreign tyrants is only half the lesson; for the other half, we must look to the Students of the Oxford Union, who understood in ways which we can never fully understand, the dangers of forgetting the unavoidable horror of war itself, and of blind obedience to authority which is out of touch with the human consequences of its commands....



READING: The "Four Freedoms"
Franklin D. Roosevelt's Address to Congress January 6, 1941

In the future days, which we seek to make secure, we look forward to a world founded upon four essential human freedoms.

The first is freedom of speech and expression -- everywhere in the world.

The second is freedom of every person to worship God in his own way -- everywhere in the world.

The third is freedom from want -- which, translated into world terms, means economic understandings which will secure to every nation a healthy peacetime life for its inhabitants -- everywhere in the world.

The fourth is freedom from fear -- which, translated into world terms, means a worldwide reduction of armaments to such a point and in such a thorough fashion that no nation will be in a position to commit an act of physical aggression against any neighbor-- anywhere in the world.

That is no vision of a distant millennium. It is a definite basis for a kind of world attainable in our own time and generation. That kind of world is the very antithesis of the so-called new order of tyranny which the dictators seek to create with the crash of a bomb.

To that new order we oppose the greater conception -- the moral order. A good society is able to face schemes of world domination and foreign revolutions alike without fear.

Since the beginning of our American history, we have been engaged in change -- in a perpetual peaceful revolution -- a revolution which goes on steadily, quietly adjusting itself to changing conditions -- without the concentration camp or the quicklime in the ditch. The world order which we seek is the cooperation of free countries, working together in a friendly, civilized society.

This nation has placed its destiny in the hands and heads and hearts of its millions of free men and women; and its faith in freedom under the guidance of God. Freedom means the supremacy of human rights everywhere. Our support goes to those who struggle to gain those rights or keep them. Our strength is our unity of purpose.

To that high concept there can be no end save victory.

Sunday, November 4, 2007

The Time of Your Life

a sermon preached by the Rev. Dr. Tim W. Jensen
at the First Parish Church in Portland, Maine
Sunday November 4, 2007 - Dia de los Muertos


PRAYER:

Some days the Spirit summons us to bow our heads in reverence, and kneel humbly before the awesome presence and power of all Creation.

Other days we are inspired to lift our eyes skyward to the horizon and toward the heavens, to cast our gaze upon the hills from whence comes our strength.

The Spirit moves where it will; we hear the sound of it, but we know not whence it comes nor whither it goes.

We feel its presence like the wind upon our faces; like a rustling breeze amidst the branches of trees.

The cold, bitter, biting winds of winter, which leave our lips numb and chilled.

The fresh, fragrant gusts of blossoming life in spring.

A cool, summer sea breeze blowing gently over the face of the water.

A breath of fresh air on a brisk autumn morning.

But whatever the season of the year,

And whatever the season of our lives,

The Spirit Calls to us....

Speaking to us out of the whirlwind,

Speaking to us out of the silence,

Calling to us to give Voice to its Truth.

Through our words,

And our deeds,

And our lives....


***
[Extemporaneous Introduction: “Emergency Back-up Sermon Generator”]

I’ve been thinking an awful lot about Time this past week, and not just in terms of this whole annual “Spring Forward, Fall Back” routine. Rather, I’ve been thinking about Time as a measure of our Lives, and of the various ways in which we, as post-modern 21st century men and women, do or do not live our lives in time to the rhythms of the Universe.

And with these reflections has come a momentary period of contemplation upon the nature of Time itself as well; and how our understanding of Time -- what it is, what it means -- has changed over the years as a result of our changing lifestyles. For example, is Time fundamentally linear or circular? Does it progress from beginning to end, or rather repeat itself seasonally for all eternity? Or maybe it’s a little of both. Or maybe it’s really neither. Maybe all the Time we will ever truly know or have is right now in this moment: and the past is just a memory, the future merely a dream. And how can we be sure from one moment to the next whether what we THINK we are experiencing is really real, and not simply some figment of our imagination, an orderly structure we impose upon our subjective experience of a fundamentally chaotic Universe?

We can come back to these abstract metaphysical speculations any time we like; they’ve been around since time immemorial, and I’m not really sure that anyone has actually figured them out yet...except maybe Stephen Hawking. But I do just want to remind everyone here once again why we observe this particular holiday: because this is once more the time of the year when the ancient, pre-Christian Northern European “pagans” -- the Germans, the Scandinavians, the Celts -- believed that the World of the Living and the World of the Dead were at their closest proximity.

And if we simply pause and take a moment to look at the world from their perspective, we’ll see that it all makes perfect sense. This is the Season when the great Circle of Life enters into its period of dark, cold, deathlike dormancy. From life to death to rebirth in the spring, the cycle repeats...yet here in the heart of Autumn is the threshold between the last lingering days of the living and the eternal night that is death. It is a liminal time, when the boundaries are indistinct, and spirits might move freely from one realm to the other.

And of course, over time, and with the coming of Christianity, All Hallows Eve became All Saints Day, and the Feast of All Souls, (and in some Latin American cultures, Dia de los Muertos -- “The Day of the Dead”). Just as the birth of Christ came to be commemorated four days following the longest night of the year; and the miracle of Easter, of course, recurs annually in the Spring, on the first Sunday following the first full moon following the vernal equinox. Human beings learned how to tell time in the first place from the heavens. The only real problem is that now we know that the Clockwork Universe tends to run a little slow.

Most of us probably don’t think about it that often, but the reason there are seven days in a week and four weeks to a month is that it takes roughly 28 days for the moon to wax and wane from new to full to new again. But for some inexplicable reason, although mathematically there are 360 degrees in a circle, there are actually 365 days to a year, which tends to throw a little glitch into the easy and elegant symmetry of 60 seconds to a minute and 60 minutes to a degree, all of it so neatly divisible by Pi.

And what a difference a day makes. A day, of course, is the amount of time it takes for the earth to rotate once on its axis: a period of time which, by convention, consists of twenty-four hours of daylight and darkness, give or take a few seconds. But the problem, of course, is that the earth wobbles as it spins, and its axis also has a little tilt to it, which means that depending on how far north or south you may be at any given time of the year, within that same 24-hour period some “days” are noticeably longer than others...or at least that part of the day that happens in daylight.

And when does a day properly begin anyway? Does the new day dawn at sunrise, when the rooster crows and awakens all within earshot from their sleep? Or perhaps it’s more reasonable to wait until sunset, at the end of the day, which would logically mark the beginning of the next day as well. Or perhaps it’s really most logical to begin the new day at the Meridian -- high noon -- when the Sun is directly overhead, and thus equidistant in terms of time between dawn and dusk. This is the way a sundial works: one of the world’s oldest and most reliable timepieces. And it is also how sailors at sea have historically marked the time, since it is so essential to their ability to calculate their location, no matter where they may find themselves upon the globe.

But to begin a new day at midnight -- as we do these days -- seems completely arbitrary, especially since without some sort of artificial timepiece, there is really no good way of even telling when midnight is.

Time was that every local community set its clocks by the heavens; they looked up into the sky, figured out when the sun was directly overhead, set the big hand and the little hand straight up to twelve noon, and thus divided the day evenly into two equal halves: Ante-Meridian and Post-Meridian, AM and PM. It wasn’t until the development of railroads in the 19th century that the perceived need for more reliable timetables created a push for “Standard” time -- so that noon in Portland would be the same as noon in Boston or noon in New York, even though the sun shines on us a lot sooner here “Down East” than it does in those other places.

And once the timekeepers learned that they could break faith with the heavens and tinker with time, all sorts of mischief was soon in the works. Farmers have always tended to work from dawn to dusk, regardless of when or where they have lived. But modern office and factory workers tend to work in eight-hour shifts (typically from nine to five), forty hours a week...so as the days grow longer and the evenings more pleasant, why not simply move nine AM a little earlier in the day, so that folks can save a little more daylight for the evening after work?

And of course, the great irony is that the further we drift from living in harmony with the natural rhythms of the seasons, the more we become a civilization of clock-watchers laboring under artificial light, only to feel like there simply aren’t enough hours in the day to get everything done that we want to get done. At the end of the day, it often seems as though all that our many so-called time-saving technologies have done for us is to accelerate the pace of life itself, leaving us with less time left over for ourselves than our ancestors enjoyed even just a generation ago.

Then again, as the saying goes, we can always sleep when we’re dead....

None of us really knows for certain the measure of our days. We can consult the actuarial tables; we can look to our family medical histories; we can simply cross our fingers, close our eyes, and try not to think about it very much at all. But it’s all just speculation; a little educated guess-work. At some point each of us figures out that we probably have a lot less life left in front of us than is already behind us, but even this is simply an abstraction, because let’s face it -- the past IS history, while each new day is a new beginning. And likewise, there are many who would say that it’s not so much the time we have left to live, as it is the amount of life we squeeze out of the time we do have left...

But the real secret, it seems to me, to truly getting the most out of life in the time we have been given, is to learn how to live life fully present in each moment. And believe me, this is really hard, especially for those of us whose imaginations tend to fly off at the drop of a hat far into the distant future, while at the same time lingering nostalgically over days long past, and procrastinating shamelessly about whatever is near at hand.

But to aspire to live life fully present in the moment, taking each day as it comes while still moving forward toward some future goal -- patiently, persistently, tenaciously...one day, one step, at a time -- still cherishing those fond memories of good times we might wish would last forever, and letting go of those bitter memories that only hold us back -- it’s a worthy ambition, even though we may never fully achieve it before we die. To live each day as if it were our first, and not just potentially our last...it’s the challenge we all face every day of our lives, no matter how many years we may have already lived.

Which brings us at long last to that age-old question, what happens to us after we die?

The short answer is easy: nobody really knows -- or at least not anybody alive today. And the Scientific answer isn’t really all that much more complicated. Our hearts stop beating, we take our final breath, the synapses in our brains fire for the last time, and the complex organic compounds that make up our bodies slowly but inexorably begin to decay, returning once more to the earth from whence they came.

But is that really the end of “us”? -- a few pounds of chemicals and an awful lot of H2O, recycled back into the system to be used in some other combination. What about our individuality, our unique personality, our “soul” -- that essential “spark” that makes us who we are, and which gives our life meaning?

And I’m sure there are some who would suggest that if this really WERE all that there is, then maybe our lives ARE meaningless....

But I know in my heart that our lives HAVE meaning. I know how much the lives of the people commemorated on this table by these pictures, and these flowers, and these candles, have meant to all of you. And so I know that there is more to life than merely living, and that our deaths are merely another moment in time.

Pray with me now, won’t you?

Loving Creator of all that is, who gives us life and gives our lives meaning... We dwell in this place for but a brief time, yet within this eyeblink on the face of eternity, so much has been given to us. And so we give thanks for this great gift of life, and for the lives of all those who have touched our own, and helped make us who we are today, in this moment. May our own lives speak as testimony to their worthiness, and may their presence among us never be forgotten....