Saturday, September 15, 2018

Climate Seasons and Astronomy Confused: A Societal Blind-Spot

In the Northern Hemisphere, in the northern E.U. and U.S., it is ludicrous to claim that winter does not begin until December 21st. If we go by that claim, the Christmas season is in the fall—joined by Halloween (and Thanksgiving in the U.S.). Similarly, September brings autumn cooling rather than three more weeks of summer. In many areas, leaves turn fall colors well before September 21st. As a matter of fact, “Climate scientists define summer as the three months from June 1 through August 31st.”[1] Why, then, do meteorologists on television, at least in the U.S.—that vaunted superpower—announce that fall officially begins on September 21st? They even show “fall begins” on the 21st in the week’s forecast. Similarly, the fools show “winter begins” just four days before Christmas, on the winter solstice. That solstice is in the winter—not the beginning of it.
Just for added confusion, astronomers use the names of the meteorological seasons for the four quarters of the Earth’s orbit around the Sun. These quarters are only loosely related to the climatic seasons. The “winter” quarter begins on December 21st, the “winter” solstice, when the perpendicular rays of the sun strike their most southern “line” on the Earth; the “summer” solstice, well into summer on June 21st, occurs when the perpendicular rays are furthest north in the Northern Hemisphere. On September 21st and March 21st, the perpendicular rays fall on the Earth’s equator. The solstices are thus associated with daylight extremes, whereas the equinoxes mark balanced day and night.
In short, to conflate meteorology and astronomy is a logical error, which is bound to lead to mistakes and confusion. It surpasses comprehension why weather folks on television apparently do not know that, climatically, winter in the Northern Hemisphere is December, January, and February; spring comprises March, April, and May; summer runs from June through the end of August; and fall’s months are September, October, and November. At the very least, the weather “personalities” should be familiar with weather record-keeping. It makes absolutely no sense to announce on, say, September 10th that the previous summer had overnight lows much above average, and then say on September 21st that fall has begun. Such a “feat” contorts human nature itself, and yet the blind-spot has unfortunately endured.
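The month-to-season grouping meteorologists actually use for record-keeping is simple enough to state as a lookup. Here is a minimal sketch (the function name is my own, not any official API), encoding the Northern Hemisphere meteorological seasons as described above:

```python
# Sketch: Northern Hemisphere meteorological seasons, as climate
# record-keepers define them (Dec-Feb winter, Mar-May spring, etc.).
# The function name is illustrative, not from any standard library.

def meteorological_season(month: int) -> str:
    """Return the Northern Hemisphere meteorological season for a month (1-12)."""
    seasons = {
        12: "winter", 1: "winter", 2: "winter",
        3: "spring",  4: "spring", 5: "spring",
        6: "summer",  7: "summer", 8: "summer",
        9: "fall",    10: "fall",  11: "fall",
    }
    return seasons[month]

# September 10th is already fall, meteorologically, weeks before the equinox.
print(meteorological_season(9))   # fall
print(meteorological_season(12))  # winter
```

By this grouping, a September 10th broadcast is reporting on a summer that ended August 31st, which is exactly why announcing "fall begins" on the 21st is incoherent.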

[1] Doyle Rice, “Can’t Sleep on It: Nights Are Hottest on Record,” USA Today, September 7, 2018.

Friday, March 2, 2018


In Berlin at the Brandenburg Gate on 11/11/11 in 2011, costumes were the norm in the evening as revelers celebrated the numeric convergence. I suspect that, unlike the Chinese, the Europeans were struck by the convergence itself rather than by any good luck attached to the numerology. I myself was struck by the convergence alone. Both at 11:11am and 11:11pm, I was surprised that other Americans around me seemed to be either ignorant of the alignment or utterly indifferent to it. It occurred to me that just as a given time-date system is artificial, so too are human cultures—which include political and economic values that are stitched together by leaders who peddle meaning to the masses. Both our systems and our ideologies are all too limiting, yet we can find meaning in them. Perhaps this is ultimately why we have them and the leaders that trumpet them or suggest new ones. I contend that 11/11/11 too plays into the human instinct for sense-making, especially in terms of visual and cognitive symmetries.

At 11:11am on 11/11/11, I limited my “celebration” to sending out some emails to friends and a general tweet to mark the moment for posterity; curiously, the people around me did not seem aware of the convergence. At 11:11pm, I was at a bar/restaurant listening to a band of old geezers play classic rock (and, sadly, a few Jimmy Buffett songs) from the 1970s. The only convergence in the 1970s was inflation and unemployment in the double digits. In spite of my protestations, even the people I was sitting with seemed utterly indifferent to the coming convergence—even as I took off my watch for emphasis! Still nothing; it was like watching a train go by on its own momentum. A few people across the room were checking their cellphones and BlackBerrys, but, alas, for more pedestrian purposes than to keep an eye on the coming cosmic convergence. As I rather blatantly went to the lighted doorway to better see my watch at “the moment,” I felt utterly alienated from my own people. It was a case of the one and the many.
When the moment came, as I watched the five numbers on my digital watch all briefly display “11,” I felt like I was on Mars, enjoying the thrill of my own private “Earth” moment while the Martians continued to sip their red brew. No, I was not drinking, so I did not really think I saw aliens (they are all in Arizona, after all). Rather, I was struck by the divergence in values even amid the convergence in numbers. There wasn’t even a clock in the room! Had I been the manager there, I would have tried to arrange a date-time digital “clock” on a screen. Would the people have counted down the seconds? Would they have paid any attention to it? Walking back to my seat, I wondered whether I wasn’t some reincarnated European reborn in the Midwest as some bizarre joke from Descartes’ divine deceiver, or perhaps I was over-estimating the Europeans’ interest in the convergence. Perhaps it’s simply that I’m too innately unique—a man destined to forever be without a country.
About thirty minutes after 11:11pm, I was chatting with a middle-aged man who had been fired as a band teacher at a local high school. Our conversation came around to political economy. “Greed is good,” he stated in perfect seriousness, his eyes aimed directly at me like bullets. I reacted as if I had been stunned by a taser. No wonder the guy’s students obeyed him. As for the gaping inequality in wealth in the U.S., he insisted that people should be allowed to accumulate without limit—even when they already have tens of billions of dollars. “That’s what America is all about,” he nearly shouted above the din of the band. How dare this even be questioned! The man was voicing values held by enough Americans that he was expressing a major strand of American culture that I could not dismiss as an aberration or quirk. When I claimed that representative democracy itself could be at risk if private wealth gets even more concentrated in a few hands, he replied that the rich would never let America be ruined because they have a vested interest in the system. “The rich created this system,” he reminded me. Sure enough, the delegates at the U.S. constitutional convention in 1787 were largely creditors, deeply concerned over Shays’ Rebellion, the debtors’ revolt that had occurred in Massachusetts just a year earlier. That the debtors had fought in the war without yet being paid and still had to make payments on their farm debt mattered not at all to the “Founders.” Was America founded on selfishness and greed? The former band teacher replied, “Yes, of course,” as if there were not a thing wrong with that. I was absolutely stunned. I felt like I had been transported to Mars. I countered that even if a bunch of rich guys founded the United States, greed can result in people acting against their own self-interest, paradoxically even as they are narrowly obsessed with it. “America can collapse from its own weight on top,” I added as though it were a fact.
As I said this, I had already concluded that I was horribly at odds with a major plank in the American lexicon—namely, that economic liberty should not be limited, even at hundreds of billions of dollars being held by one person. In fact, the lack of limit, even when a constraint would be for the good of the system itself, is held by many as a virtue—something to be proud of. That a signature of greed is its lack of limitation is no problem because greed itself is taken to be a virtue. I felt as though I were visiting another planet, though this time without even my own private amusement in watching the 11’s match up on my watch. Beyond the cultural ideology, I saw in the former band teacher a sordid selfishness that could only be utterly unapologetic, given its nature. All I could say was, “Well, we just disagree. Have a good night. Nice to have met you.” I wondered if the rest of the world had come to say the same thing to the American “tourist” (i.e., ideology) even while admiring our political stability and wealth.
Of course, people can get carried away not only with power and money, but also with convergences such as 11:11 on 11/11/11 in terms of luck, causality, and metaphysics. In this respect, American culture is more solid than, say, that of the Chinese. As David Hume argues, we do not understand causality as much as we think. Hence, superstition is a perennial temptation—especially in religion, where the lapse is almost always invisible to the beholder. In numerology, the number one represents a beginning or gateway. Having several number ones presumably reinforces the validity of the “beginningness” quality. In other words, the “vibrational frequency of the prime number” increases its power such that its attributes are multiplied. In the case of the number one, the attributes of “new beginnings” and “purity” are significantly magnified in power in 11/11/11, presumably reaching their zenith at 11:11 (a.m. and p.m., or just once on the 24-hour clock). The fallacy, which I suspect took hold in China, is to say that the increase in power means that a beginning is more apt to occur empirically and even metaphysically. We can resist this temptation to get carried away with even rare line-ups in our own systems, which, after all, are artificial because they are invented and instituted by people. In other words, even though it is a human instinct, sense-making need not overflow into metaphysical significance. We cannot say that acknowledging 11/11/11 opens up a gateway in one’s life. Rather, a person can actively start something irrespective of the numbers, even if only by spotting and seizing an opportunity.
A numeric alignment can hold its own significance within its own system for the human mind. That is, the significance can be felt even as it is known to be contrived and thus arbitrary from outside the system. As I stood in the lighted doorway waiting for my watch to briefly line up its various numbers to 11:11:11 on 11/11/11 while the rest of the room was fixated on the band (or the walls, or themselves), I presumed no metaphysical significance at all in terms of some beginning about to occur in my life; rather, it was the convergence itself—the fleeting and rare alignment—that galvanized my interest. The sudden turn from 1999 to 2000 carried a similar sort of significance in terms of numbers in a particular dating system. People did not need to presume the issuance of a new era or good luck to get excited at 11:59pm on December 31, 1999 about the next minute being so different. Yet was it? Something can be felt as significant even as it is known to be arbitrary, and such significance can just as easily be set aside.
Admittedly, it was more difficult to get excited about New Year’s Eve in 2005 or even 2010, given the significance of 2000. Similarly, on 11/11/11, a sense of complacency could have set in regarding convergences of ones. The year 2011 alone contained an extraordinary number of them:
1:11:11 on 1/1/11
11:11:11 on 1/1/11
1:11:11 on 1/11/11
11:11:11 on 1/11/11
1:11:11 on 11/1/11
11:11:11 on 11/1/11
1:11:11 on 11/11/11
11:11:11 on 11/11/11
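The eight entries above are simply the cross product of 2011’s four all-ones dates with its two all-ones clock times. A minimal sketch (variable names are my own) generates the same list:

```python
# Sketch: enumerate 2011's all-ones date-time convergences.
# Dates are in U.S. month/day/year order; times are on a 12-hour clock,
# so each combination actually occurs twice daily (a.m. and p.m.).

from itertools import product

ones_dates = ["1/1/11", "1/11/11", "11/1/11", "11/11/11"]
ones_times = ["1:11:11", "11:11:11"]

# Four dates x two times = eight convergences.
convergences = [f"{t} on {d}" for d, t in product(ones_dates, ones_times)]

for c in convergences:
    print(c)

print(len(convergences))  # 8
```

The combinatorial structure also explains why 11:11:11 on 11/11/11 is the maximum: no later repdigit date in the Gregorian calendar can supply more matching digits in a valid month, day, and time at once.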
However, how many of these did the average person observe? I myself completely missed 1:11pm on 11/11/11 even though I was fixated on 11:11am and 11:11pm. I must have been “out to lunch” at 1:11pm. Although it would be 100 years before 11/11/11 would happen again, it would be “only” 10 years and a few months before 2:22pm (forget 2:22am!) on 2/22/22. Technically speaking, missing a “2” (2/ rather than 22/) means that the multiplied power of the “2” will be somewhat less. Trinitarians will have reason to get excited over 3/3/33 at 3:33pm, which will be the day after Ash Wednesday in 2033. However, the number of 3’s is one less than the number of 2’s in 2/22/22. Barring significant life-extending advances in medical science, 11:11 on 11/11/11 in 2011 was the best it could get in terms of the number of numbers in a numeric date-time convergence for those adults who happened to witness that convergence.
That this topic holds any significance whatsoever is, I suspect, due to the propensity of the human mind to seek and admire order. In terms of symmetry alone, the eye naturally gravitates to 1111111111 rather than 1645564336. When a gambling machine with three windows of spinning pictures shows the same picture in all three, we are naturally astonished. Even so, three lemons do not mean bad luck any more than three apples mean good health in the coming year. 11/11/11 is not an alignment by chance, even if the Gregorian calendar itself need not have been adopted when it was. Even so, such a planned or arranged alignment is inherently pleasing to the eye and holds significance for the mind, especially if the convergence is rare and fleeting. It is as though everything makes sense, but only for a moment, and then it is past. In fact, it is this basic feature of the mind—that which I call the sense-making instinct—that is the basis and appeal of a leader’s vision to followers and to an organization or society as a whole. The social reality that is formulated and preached is like a series of ones in a chaotic world of fractal order and disorder.

Wednesday, January 3, 2018

On the "Wedding of the Century": History Made or Manufactured?

In hyping the royal wedding of William and Kate in the E.U. state of Britain, the media even in other E.U. states applied the title “The Wedding of the Century,” in spite of the fact that the century was only in the second year of its second decade. It is rather presumptuous for people alive at such a time to claim so much for their time, and therefore for themselves. Lest our self-constructed bubble unexpectedly burst, we might let some air out of it in a controlled manner such that our bloated egos can survive without too much bruising.

It is interesting how those of us who were adults in the last decade of the 20th century did not look back to any such weddings around 1911 that might have been labeled then as “the wedding of the century.” I do not even know if there were any such royal weddings back then that might qualify. Having seen the film “The King’s Speech” a decade into the twenty-first century, I came to know a bit about the British royals of the late 1930s, but even then I could only draw a blank from Queen Victoria to the abdication made out of love. Even in terms of American rather than European history, the twentieth century begins for me at the end of World War I and takes off with the roaring twenties—that opening act ending with the ensuing economic drama in 1929.

From the standpoint of 2011, it seems a tad early for us to be labeling anything in our time as definitive for the century. From the standpoint of people who will be adults in the 2090s, people like me are like the people who were born around the time of (or just after) the war between the USA and CSA (wrongly called a civil war, as the CSA was a separate country rather than a faction contending to take over the USA) and died of old age during the 1930s or at the time of WWII. Such people were practically forgotten by the people who were adults in the (American) States during the 1990s. That is to say, we who vaunt our events “of the century” will barely register to those people who will be in a position to look back on the century. I suspect that they will look back to the people who will have been in their prime during the 2050s through the 2070s.

Those people who celebrate the coming of the next century will look back to celebrities similar to how I looked as far back as Fred Astaire and Cary Grant. Even The Jazz Singer, whose “Mammy” pierced the silent screen in 1927, is barely on my radar screen—as if the coming of sound in moving pictures were merely the start of the century (as though the previous three decades had been projected silently on a blank screen).

I do not know much at all about the days of Grover Cleveland and Teddy Roosevelt--that is, before World War I. Similarly, adults alive in 2095 may barely know who George W. Bush and Barack Obama were, and yet our world (at least in the U.S.) is dominated by discussion about those presidents.

In the first few years of the second decade of the twentieth century, the U.S. Supreme Court found the Standard Oil Trust to be in illegal restraint of trade under the Sherman Antitrust Act. At the end of the 1990s, the Glass-Steagall Act separating investment and commercial banking was repealed without any hint of the progressive movement that had given rise to the Sherman Antitrust Act. Corporations had long since won the day.

In 1913, U.S. constitutional amendments were ratified changing how U.S. Senators were to be chosen (state governments no longer being directly represented in the U.S. Government) and expanding how the U.S. Government could tax its citizens (at the expense of state taxes). By the end of that century, American federalism was nearly invisible—Washington, D.C. having become the focal object in terms of policy.

In 1914, World War I began in Europe. In 2011, the last remaining American veteran of that war died. Memories of that war had long since faded—the Austro-Hungarian Empire having been replaced by the Nazis and Japanese in the world’s collective consciousness. I suspect that in 2095, 9/11 as “permanently etched in our memory” will no longer be so, just as December 7th had faded from “living in infamy” by 1995. Pearl Harbor was certainly eclipsed by 9/11. In 2011, Pearl Harbor is all but forgotten as Americans feel profound sympathy for the Japanese suffering in the wake of the 9.0 earthquake and tsunami and cheer the death of Osama bin Laden as justice.

From the standpoint of my desk in 2011, I look out onto a vast field of time that is as yet unknown and utterly undeveloped. I cannot even imagine what will go down in the 2030s or 2040s. That people not yet born and thus yet to be married will look back on that now-empty field as crowded gives me great pause as to the significance of my time and what claims I can properly make concerning events today. In a way, I feel like I am living in a time before time—before memories yet to be remembered even in the same century.

See related essay: "On the 'Wedding of the Century': Royalty as Natural or Exaggerated?"

Wednesday, November 15, 2017

Getting the Seasons Officially Wrong: A Case of a Category-Mistake

Joel Achenbach of the Washington Post has not quite turned the corner with respect to spring, and the seasons in general. You see, “season” is used in two distinct though related ways in English. It can refer to four distinct weather/plant-life conditions or to the four parts of the earth’s orbit around the sun. Given the tilt of the Earth, the two are related but they do not occur together. While Achenbach acknowledges that the vernal equinox typically on March 21st “is a moment of time specified by the motion of the Earth around the sun,” he refers to this as the official start of the meteorological spring. In actuality it is not. In the Northern Hemisphere, meteorologists record data from December, January and February as winter and March, April and May as spring. So in March 2012, meteorologists could already conclude that the preceding winter had been the fourth warmest since the record-keeping began.

Consider the insanity in claiming on December 19th in the Northeast, the Northern Midwest, or on the Northern plains or further west that it is still fall. Yet even television weather people make the mistake of representing the winter solstice—the shortest day of the year in the Northern Hemisphere—as meteorological and botanical rather than astronomical in nature (to say nothing of the sheer stupidity in ignoring the obvious winter conditions of snow and ice). Meanwhile, still other people render the astronomical event as religious in nature. The ascendancy of the evergreen on the longest night is religious for those people even as it is botanical to others.

For my purposes here, it is sufficient to note that astronomy is distinct from meteorology and botany. The latter two are relatively coincident as phenomena. To make meteorology and botany wait on an astronomical mark conflates different categories that do not cohere. It is not surprising that such thinking results in some rather obvious mistakes.

I argue that the same sort of cognitive flaw takes place in comparing a state in one empire-scale union (e.g., France) with another entire empire-scale union (e.g., the U.S.). The respective unions’ states are equivalent both in scale and politically in being semi-sovereign. Citizens in California could just as well say, “In California we do X (that really could be just about anything), while in the E.U. you do Y.” This statement seems strange on both sides of the pond, yet no one bats an eyelid at: “In the Netherlands we do X, while in the U.S. you do Y.” The asymmetry is based on European states’ rights (i.e., the antifederalist movement in American terms) and (frankly) American ignorance at America’s expense. So too, there is ignorance in a meteorologist announcing changes in the weather seasons based on astronomical bench-marks in the Earth’s orbit even as meteorologists do not use those bench-marks in recording data.

I suppose it is in reaction to the meteorologists’ inexcusable carelessness in conflating meteorology and astronomy (they are meteorologists, after all) that I note the beginning of a season by the actual shift in weather conditions. That is, I go by the empirical conditions on the ground. March of 2012 was warm even for spring across the Northeast, Mid-Atlantic, and Midwest. A day or two into the warmth, I naturally started referring to spring as having arrived. Sure, winter could have returned, and then it would have been winter for those days, so the seasons can refer to particular weather conditions (being meteorological). Yet the seasons can also refer to more long-standing clusters of botanical/meteorological conditions. In that March, as soon as the flowers and buds were visible, that only added to my determination to say to folks, “Well, spring has arrived”—meaning on a more long-standing basis than just a warm spell. No one even put up a fight—even as the weather folks on television were still marveling that spring was still weeks away. They just made themselves out to be stubborn idiots, frankly. When they finally “celebrated” the arrival of the astronomical event of there being no tilt in the Earth’s relationship to the sun as if the event were meteorological, the weather “personalities” resembled people who get to a party hours late and announce that the party has officially begun. People at the party obviously know it has been going on for hours, and naturally look confused and ask the host, “Who invited those idiots?” Unfortunately, it doesn’t do any good to talk to a television set; the talking heads keep right on going, completely sure of themselves.

In March of 2012 in many of the northern republics of the U.S., it would have been crazy not to refer to the conditions on the ground, which included daffodils and even tulips flowering and bushes and even trees budding, as spring. Insisting that weeks in the 70s and even 80s are still winter just points to the fault in using the names of the weather/plant seasons to refer to the astronomical quadrants of the Earth’s orbit. There is no “spring” in outer space. If anything, it is a perpetual winter, though even this analogy fails. Furthermore, spring arrives at different times in North America, depending on how far north one happens to be. It really is a regional affair.

The standardization of record-keeping (e.g., spring as March, April, and May) is entirely reasonable, but the category mistake with astronomy goes too far, cognitively and logically as well as empirically. Sticking to such a mistake even while making such obvious blunders (such as that 80 degrees with flowers blooming is still winter, or 25 degrees with snow is still fall) is a strange choice that suggests a certain mentality, given that the weather person could simply stay mum on the issue rather than say something that can easily be anticipated as looking stupid. Why even announce that it is still winter while standing outside in shorts among flowers? Why go to the trouble of announcing a category mistake as if it were valid? I suspect that part of the answer is an over-valuing of things being official. Besides the artificiality in such hypertrophy, the mentality involves the flaw of denial. “It doesn’t matter whether it is pouring outside; if there is no rain in my gauge, it is not raining.” In a way, this is a version of lying. In refusing to admit an empirical observation on account of an ideological value one holds (excessively), one is willing to lie.

It is not as if the meteorological and astronomical change together. At the very least, the distinctly astronomical nature of the vernal equinox in March (when the earth has no tilt relative to the sun—which happens also at the equinox in September) should be specified rather than implying that the matter is meteorological in nature. Even though the days are getting longer in March (an astronomical matter), it takes time for the air to warm. Accordingly, the meteorological and botanical are not coincident with the astronomical. Treating an astronomical event as if it were meteorological is thus an error; it is at the very least misleading. To willingly mislead just to be official is the sordid mentality that perpetuates this ongoing category mistake.

Joel Achenbach, “A Warm, and Official, Embrace of Spring,” The Washington Post, March 20, 2012.

Friday, June 9, 2017

The Intellectual Aesthetic

Nonmaterial, or intangible, things should not be dismissed in assessing what makes life worthwhile. Why a nonmaterial thing makes someone happy is not necessarily obvious even if it does. One such thing that makes me happy can be termed, intellectual aesthetics. This term requires some unpacking, as does my point that such an aesthetic can trigger happiness. In fact, the commonly presumed association between aesthetics generally and happiness is itself in need of a closer look.

Aesthetics are typically associated with the eye—meaning sight. A beautiful painting, photograph, or picturesque vista can be said to be aesthetically pleasing. Looking out over the Grand Canyon in Arizona, for instance, can prompt a person to feel a sense of awe. Beholding such a sight is also typically associated with happiness, even if the question of why a beautiful view, or an object such as a painting or sculpture, makes the beholder happy remains open. Receiving a present or a lot of money, or the birth of a child, is quite understandably associated with happiness. Optical aesthetics requires more in the way of explanation.

That a theory or philosophical system of thought could make an intellectual happy rather than merely impressed is particularly difficult to understand, for the aesthetics of an idea or relation between them is itself barely recognized even among intellectuals. An analogy may be helpful. I liken a theory to a physical model of a molecule. The balls are the ideas and the sticks connecting the balls represent reason. A theory thus has a particular “shape,” albeit in the mind’s eye, hence nonmaterial, or intangible. This mental shape itself, rather than merely the ideas themselves, can cause pleasure in the beholder.

Friday, September 30, 2016

A Comet’s Cosmic Song: Evidence of Plato’s Justice as Harmony of the Spheres?

On September 30, 2016, the European Space Agency’s Rosetta spacecraft ended its mission orbiting Comet 67P. The mission added knowledge on how planets came together and how life arrived on Earth. “One of Rosetta’s key findings is that comets are probably not the source of Earth’s water.”[1] I submit that of even greater importance is a finding that can be indexed as philosophical in nature.

In particular, the European Space Agency “released audio of a ‘cosmic song,’ created by the magnetic fields oscillating in the trail of particles flying off the comet.”[2] Specifically, movements in the comet’s magnetic field are caused by solar particles hitting and electrically charging the comet’s atmosphere. The “song” resembles the sounds that whales make. Perhaps it resonates with our music as well. For one thing, the beat of the cosmic song is regular and the pitch varies from “note” to “note.”

Philosophically, the finding may confirm Plato’s theory of justice as “the harmony of the spheres” being in line with the harmonies of a well-ordered city (polis) and mind (psyche). Justice “just is” the harmony between and within these three things—the universe, the city, and the mind. The harmony, as with our music, has mathematical aspects: pitches (e.g., low, middle, and high notes) and intervals of duration (e.g., eighth, quarter, half, and whole notes). The comet’s “cosmic song” adds to the accumulating empirical support for Plato’s theory.

It would be really astounding were the mathematical-musical vibrations of a reason-ordering-the-passions human mind and the vibrations of a reason-ordered city in sync with the vibrations given off by suns, planets, and comets—with those respective mathematical-musical vibrations in harmony with each other. That such a confluence is itself just blows the mind.

The implication is that keeping your passions under the control of your reasoning ability puts you in a very subtle sense in harmony with the “cosmic song” of the observed comet. Perhaps people enjoy music so much because it can serve as a mediator helping the mind to be well-ordered (hence suitable for order-imposing reasoning) and in sync with the harmonies of the heavenly spheres, including comets. It is worthwhile simply pondering how justice as we typically construe the term boils down to that “syncness.”

[1] Kenneth Chang, “Rosetta Mission Ends With Spacecraft’s Dive Into Comet,” The New York Times, September 30, 2016.
[2] Ibid.

Thursday, April 2, 2015

God's Gold

Stepping away from the abstruse academic writing that characterizes a treatise such as Godliness and Greed, I have written a book entitled God's Gold, which is geared to general college-educated readers who are interested in historical Christian takes on how greed is related to profit-seeking and wealth. In the text, I try to explain why the historical trajectory in dominant Christian attitudes toward profits and wealth moved from anti-wealth to pro-wealth. Did the religion stray, or is a pro-wealth view stitched into the very fabric of the religion? Put another way, how much are Christian theology and ethics of the world rather than merely in it? What can business practitioners who self-identify as Christian do to keep their work and compensation free from the taint of greed? The book is geared to answering these questions.