Category: Get Smarter

A gratitude for today, and this moment

There’s a catch-all prayer in Judaism for gratitude. I’ll use my own transliteration here (that is, I’m going to write the Hebrew words using English letters in a rough pronunciation), but the prayer is called the shehechiyanu. The full prayer roughly translates to, “Blessed are you, Lord our God, who has granted us life, sustained us and enabled us to reach this day.” It goes like this:

Baruch atah adonai, eloheinu melech ha’olam, shehechiyanu v’kiyamanu v’higiyanu lazman hazeh (hear it here).

It’s said a lot. The beginning of every holiday. When friends gather for the first time. When family gathers for the first time in a while. The first time you perform a commandment in the new year (such as giving to charity or going to synagogue). The first time eating a particular food in the new year.

It’s an eleven-word gratitude practice you can use any time you need one. Twenty words if you want to use the English translation I gave.

In case you want to go deeper:

• My Jewish Learning points out the shehechiyanu is a reminder to stay present.

• The Trust Center for Early Education at Temple Ohabei Shalom points out that the shehechiyanu is a good marker for observing otherwise overlooked events in our children’s lives; birthdays, sure, but also physical growth, science projects and recitals.

• Two rabbis at a Texas synagogue give a sermon on shehechiyanu, including the importance of being alive in regard to prayer.

• Rabbi Yirmiyohu Kaganoff takes a deep dive on when the oral tradition tells us to recite shehechiyanu — and when not to. Note: Bracha is usually translated as “blessing,” in the sense of “a blessing over a meal,” not “the post-op nurse was a blessing” (you’ll also see it rendered as “prayer”).

• Here’s more deep discussion from Rabbi Avi Zakutinsky.

Incidentally, the way rabbis Kaganoff and Zakutinsky discuss the question of when to say shehechiyanu — with reference to various texts, many of them conflicting — is how Jews discuss matters of faith throughout history. It can be very interesting. One example is the argument several rabbis have in regards to when you can say evening prayers. Some say they should be said after sundown but before midnight. Others say evening prayers can be said after midnight but not after first light. Still others argue that the prayers may be said at any time before someone goes to bed, even if it is before sundown or after first light.

George Washington’s Thanksgiving proclamation

I’ve been watching, sporadically, the lessons on Thanksgiving they’re teaching my five-year-old niece over Zoom, since that’s how kindergarten is taught these days.

It is, of course, the child-friendly version, in which British religious refugees, the pilgrims, spend a couple of months struggling under harsh conditions to cross the Atlantic in a too-small ship called the Mayflower, land in a strange place, and thankfully bump into some very friendly native peoples who welcome the settlers, show them where the fish are and which land is arable, and then sit down to a nice meal at harvest time.

In 1621, if you saw an unexpected boat coming, it was probably someone who wanted to steal your stuff. This meal might have happened, but it probably didn’t happen the way it’s taught.

Last year about this time, I wrote a bit about the Thanksgiving origin myth. Really, it’s a gratitude-for-the-harvest holiday, which is something practiced in many traditions.

In 1789, George Washington, early in the first presidency in our nation’s history, declared a day of Thanksgiving for November 26, which happens to be the date the Thanksgiving holiday falls on this year.

It was declared as a day for Americans — a newly minted collective — to thank God for the gift of a new nation. Here is his proclamation, laid down October 3 of that year.

By the President of the United States of America. a Proclamation.
Whereas it is the duty of all Nations to acknowledge the providence of Almighty God, to obey his will, to be grateful for his benefits, and humbly to implore his protection and favor—and whereas both Houses of Congress have by their joint Committee requested me “to recommend to the People of the United States a day of public thanksgiving and prayer to be observed by acknowledging with grateful hearts the many signal favors of Almighty God especially by affording them an opportunity peaceably to establish a form of government for their safety and happiness.”
Now therefore I do recommend and assign Thursday the 26th day of November next to be devoted by the People of these States to the service of that great and glorious Being, who is the beneficent Author of all the good that was, that is, or that will be—That we may then all unite in rendering unto him our sincere and humble thanks—for his kind care and protection of the People of this Country previous to their becoming a Nation—for the signal and manifold mercies, and the favorable interpositions of his Providence which we experienced in the course and conclusion of the late war—for the great degree of tranquillity, union, and plenty, which we have since enjoyed—for the peaceable and rational manner, in which we have been enabled to establish constitutions of government for our safety and happiness, and particularly the national One now lately instituted—for the civil and religious liberty with which we are blessed; and the means we have of acquiring and diffusing useful knowledge; and in general for all the great and various favors which he hath been pleased to confer upon us.
and also that we may then unite in most humbly offering our prayers and supplications to the great Lord and Ruler of Nations and beseech him to pardon our national and other transgressions—to enable us all, whether in public or private stations, to perform our several and relative duties properly and punctually—to render our national government a blessing to all the people, by constantly being a Government of wise, just, and constitutional laws, discreetly and faithfully executed and obeyed—to protect and guide all Sovereigns and Nations (especially such as have shewn kindness unto us) and to bless them with good government, peace, and concord—To promote the knowledge and practice of true religion and virtue, and the encrease of science among them and us—and generally to grant unto all Mankind such a degree of temporal prosperity as he alone knows to be best.
Given under my hand at the City of New-York the third day of October in the year of our Lord 1789.
Go: Washington

Think about your pumpkin pie a little differently this year.

Bayt al-Hikma 2.0: Knowledge and the limitations of language

Once upon a time, the story goes, everyone on Earth spoke the same language. One day, a bunch of people got together and said, “Let’s build a tower up to the heavens so we can be equal with God.” And God looked down upon their arrogance and made them all speak different languages.

So goes the biblical story of the Tower of Babel, and it seems like there might be evidence such a tower actually existed, as a temple in Babylon.

Aside: Contrary to popular belief, this is not where we get the word babble.

Once people start speaking different languages, it gets very difficult to communicate. It’s one thing when you and your neighbor simply can’t understand each other the way you understand other members of your community. It’s another thing altogether when one person claims to be able to bridge the gap between you and your neighbor by translating.

It takes a fair bit of trust in a stranger to allow him (or her, but give me the grace of just picking one the rest of the way?) to broker communication between you and someone else. And it invests a lot of power in that stranger.

As we know from our adages, power corrupts; absolute power corrupts absolutely.

Throughout the Middle Ages, that absolute power was centered in The Church. Only priests had a direct line to God. They were the only translators. Priests told everyone — commoners and royalty alike — who was good and who was a sinner. If you paid a priest enough, you would be absolved of your sins.

And then Johannes Gutenberg invented the printing press and put Bibles into wide circulation; a few decades later, Martin Luther translated the Bible into German, and together they laid the groundwork for the Reformation.

Once more people could read the Bible for themselves, they could see the corruption in the Church.

In about 1570, the Church took back its power by standardizing the Mass in Latin, a language that had ceased to be anyone’s everyday speech a thousand years before. It took about 400 years before the Vatican allowed vernacular services.

With the rise in wealth and power of the U.S. over the past century, especially in the wake of anti-German sentiment following World War I, English has become nearly a global lingua franca — that is, basically any more-or-less cosmopolitan area you can visit in the world, you’re likely to be able to conduct at least some business in English.

A regional lingua franca becomes important in some parts of the world where there are still many different groups speaking many different languages. In some parts of Africa, French is the lingua franca — some people might speak Wolof or Xhosa but also French and perhaps English.

Technically, English is the lingua franca of the U.S., too — we don’t have a federally dictated official language.

I lay this out because, as a native English speaker, I have never in my 43 years found myself in a place where I must use a different language, and I may very well never. I used to be able to get by in Spanish, and I could perhaps understand it better than I could speak it; even in Israel, I didn’t really need to know anything beyond “yes,” “no,” and “English, please.”

There are very few websites I can’t read or have translated on demand, and very few books that aren’t written in or translated into English.

Basically, I can consume any information I want. Even growing up going to synagogue (and the associated religious education), we were encouraged to read English translations and ask all the questions we wanted — and to form some of our own opinions.

How different is that from the Roman Catholic Church of most of the past two thousand years?

But it turns out there are still plenty of people controlled by language.

Arabic, for instance, is one of the most common languages spoken by internet users, but less than one percent of the web is available in Arabic, according to Ideas Beyond Borders, a group launching a project called Bayt al-Hikma 2.0, a project to translate books and articles into Arabic.

This isn’t just about any books and articles. It’s about getting ideas that gatekeepers don’t want getting into closed societies translated into Arabic.

The original Bayt al-Hikma, or House of Wisdom, was a library housing works translated from Greek and, possibly, Sassanian Persian sources. Built in the eighth century, it was destroyed in 1258 when the Mongols invaded Baghdad.

Yes, that Baghdad. Most of the world is a lot older than the U.S. A lot older.

I learned about this from listening to Melissa Chen on Joe Rogan’s podcast. Apparently there’s no word in Arabic for feminism. They had to create a special project to translate a Wikipedia article on Marie Curie into Arabic to show that there were female scientists.

No one had brought in secular or atheistic works like Enlightenment Now or Waking Up.

That’s controlling a population, much like the Church used to do.

When you get to finally learn about new concepts such as humanism and advanced education of women, now you’re opening up your society to abundance — and it’s abundance for the masses, not for the region’s elites.

Remember, there is not a finite amount of awesome in the world. As Thomas Jefferson famously pointed out, if I use my candle to light yours, you don’t take half my flame. We are now both fully illuminated.

We need more of this in the world.

How do we pick a winner? The Condorcet paradox and our crazy primary season

Remember a couple of weeks ago when I said I’d do a deeper dive on some new-to-me concepts? Well, primary season here in the U.S. seems like a good time to look at the Condorcet Paradox.

Let’s start with a brief look at Simpson’s paradox, a closely related counting problem.

Simpson’s paradox is when a trend that shows up in each of several groups of data reverses once you combine the groups, which makes it possible to use the same statistics to argue either side. This video makes it easy, but if you don’t want to take the four minutes to watch it, it goes something like this:

Let’s say you’re testing two drugs for effectiveness over two days. On Day 1, Drug A helps 63 of 90 people (70%) and Drug B helps 8 of 10 people (80%). On Day 2, Drug A helps 4 of 10 people (40%) and Drug B helps 45 of 90 people (50%).

Drug B won on both days, so it’s the more effective drug, right? Well, wrong. If you add them up over the two days, Drug A helped 67% of people while Drug B helped 53% of people.
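If you want to verify the reversal yourself, here’s a quick Python sketch using the numbers from the example above:

```python
# Simpson's paradox: Drug B wins on each day, yet Drug A wins overall.

def rate(helped, total):
    """Fraction of people the drug helped."""
    return helped / total

# (helped, total) per day
drug_a = {"day1": (63, 90), "day2": (4, 10)}
drug_b = {"day1": (8, 10), "day2": (45, 90)}

for day in ("day1", "day2"):
    a, b = rate(*drug_a[day]), rate(*drug_b[day])
    print(f"{day}: A={a:.0%}  B={b:.0%}  winner={'B' if b > a else 'A'}")

# Combine the two days and the trend reverses.
a_total = rate(sum(h for h, _ in drug_a.values()),
               sum(t for _, t in drug_a.values()))
b_total = rate(sum(h for h, _ in drug_b.values()),
               sum(t for _, t in drug_b.values()))
print(f"overall: A={a_total:.0%}  B={b_total:.0%}  "
      f"winner={'A' if a_total > b_total else 'B'}")
# → B wins day1 and day2, but A wins overall (67% vs. 53%)
```

The trick is in the group sizes: each drug’s good day was the day it treated only 10 people, so the small samples dominate the daily percentages but not the totals.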

You can go a little deeper with this video, which seems to prove that I’m a rabbit, or something like that.

The Condorcet variant applies specifically to elections involving more than two people (or options). It says collective preferences can be cyclic, so there may be no single, unambiguous winner to pick.

The video below goes into why, and shows how, in a field of five candidates, a single round of voting can yield five different winners, depending upon how you decide to count.

Let’s move this briefly into the realm of U.S. politics. Briefly, because I’m probably not going to get it right if I go on too long.

First off, if you haven’t been following what’s going on in the Alabama Democratic Party (and really, unless you’re a Democrat in Alabama, why would you?), spend five minutes watching a little something my colleague Kyle Whitmire put together last October.

It’s a perfect overview of politics in the U.S. these days.

Remember Paul Ryan? He was speaker of the House of Representatives until early 2019. That ended 13 months ago and I have to ask if you remember him, which is a good illustration of the American attention span. He didn’t initially endorse Donald Trump during the 2016 election season, so his approval rating took a nosedive and he changed his mind.

Changing one’s mind is the sort of thing that in 2004 Republicans said, essentially, disqualifies you from office.

So basically, if you’re stubborn, refuse to incorporate new evidence into your opinions and only hold people who root for the other team accountable, it’s all good. Noah was 900 years old when he built a giant boat by himself, the Earth is flat and our planet is the center of the universe, because if you never accept new information, you’re all good on the old stuff!


In the biblical story of the Garden of Eden, when Eve eats the fruit of the Tree of Knowledge, Adam and Eve realize they’re naked. It’s not that they weren’t naked before, they simply didn’t notice, or understand what that meant.

That’s what happens here. Nothing changes with our election, we just get a better understanding of what we’re looking at when we see numbers.

A little knowledge, on one hand, is a dangerous thing. Ignorance, on the other, is bliss. Maybe. I kind of like having the control of knowing more stuff.

Polls might be the biggest problem with our elections, especially in primary season, when there can be a dozen or more candidates. You can lead a field of 12 with less than 10% of the vote. You could pick up just 12% of the vote and still lead every rival by 50% (11 candidates at 8% each is 88%, leaving you with 12%, which is 50% more of the vote than any single rival).

It’s nice to be leading, but who wants to brag about being the candidate that 88% of people aren’t interested in? [The snarky answer, of course, being someone running against a bunch of candidates that 92% of voters aren’t interested in.]
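The fragmented-field arithmetic is easy to check. A quick Python sketch of the hypothetical 12-candidate field above:

```python
# A made-up 12-candidate field: 11 rivals at 8% each, the leader at 12%.
field = [8] * 11 + [12]
assert sum(field) == 100  # the whole electorate, in percent

leader, rival = max(field), min(field)
print(f"leader: {leader}%, each rival: {rival}%")
print(f"lead over any single rival: {(leader - rival) / rival:.0%}")
print(f"voters not backing the leader: {100 - leader}%")
# → a 50% lead over any single rival, yet 88% of voters aren't backing you
```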

Polls also don’t mean much when we get down to two candidates (while technically we’re never down to only two candidates, our de facto two-party system marginalizes third-party candidates and, for the most part, they tend to show up as rounding errors, at least over the past couple of decades — Ross Perot certainly took votes away from George H.W. Bush in 1992 and Ralph Nader probably had something of an impact in some states in 2000).

Polls don’t matter much for three reasons:

(a) If you poll people nationally, you’re gauging the popular vote, which doesn’t determine who wins the election, since we have a representative democracy that uses an electoral college. Donald Trump received 2.9 million fewer votes than Hillary Clinton in 2016. He was the fifth president to win office while losing the popular vote.

(b) We don’t have the capacity as individuals to process large sets of data all at once. The more meaningful poll — and let’s be honest, not all polls are generalizable — would be percentage of voters in Alaska voting for a candidate, plus the percentage of voters in Alabama voting for a candidate, and Arizona and Wyoming and Maine and Oregon and Texas and … you get the idea. The reason that’s more meaningful is the electoral college system. Unfortunately, with the exception of perhaps a handful of outliers, you can’t just give people fifty-plus sets of percentages and expect them to be able to extrapolate the pertinent information.

(c) The Condorcet paradox. When we poll for Candidate A vs. Candidate B vs. Candidate C, you run into the problem where people prefer candidate A to candidate B, candidate B to candidate C, and candidate C to candidate A. Who wins? As we saw in the video above, it all depends on how you decide to count.
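Here’s a minimal sketch of that cycle in Python, using three hypothetical voter blocs (the candidate names and ballots are made up for illustration):

```python
# The Condorcet paradox: three voter blocs with cyclic preferences,
# so every candidate loses some head-to-head matchup.
from itertools import combinations

# Each tuple is a ranking, best first; the int is how many voters hold it.
ballots = [
    (("A", "B", "C"), 1),
    (("B", "C", "A"), 1),
    (("C", "A", "B"), 1),
]

def prefers(ranking, x, y):
    """True if this ranking puts x above y."""
    return ranking.index(x) < ranking.index(y)

def head_to_head(x, y):
    """Return the majority winner of a two-way race between x and y."""
    x_votes = sum(n for r, n in ballots if prefers(r, x, y))
    y_votes = sum(n for r, n in ballots if prefers(r, y, x))
    return x if x_votes > y_votes else y

for x, y in combinations("ABC", 2):
    print(f"{x} vs {y}: {head_to_head(x, y)} wins")
# A beats B, B beats C, and C beats A: a cycle, so the "winner"
# depends entirely on which matchups (or counting rule) you use.
```

With these ballots, A wins a plurality count no better than anyone else, yet every candidate is a head-to-head loser to someone, which is exactly why different counting rules can crown different winners.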

To learn

I’m not going to turn this into a Masonic blog, but this year I’ll be doing a lot of study in Freemasonry, so it’s going to pop up here from time to time with lessons to generalize to all of us.

Because we don’t write stuff down, there is occasionally a debate — sometimes running for centuries — as to what something means. The phrase, “to learn to subdue my passions and …,” is one such piece that causes confusion.

The question concerns whether there should be a comma after “to learn” — are we instructed “to learn to subdue our passions,” or are we instructed to learn and to subdue our passions? I didn’t finish the phrase above (Masons will know how it finishes), but is it a two-item list, or a three-item list?

The arguments are interesting, particularly the more recent writings. Those who believe it is a three-item list tend to have an argument along the lines of, “Of course we’re instructed to learn! How else could we improve?” while the people on the other side have an argument along the lines of, “Of course we don’t need to be instructed to learn! How insulting to think we would need to be told that!”

There’s a clear admonition, then, from people on both sides: Learn.

The word learn has been with us for a long time. Since Old English, in fact, which was very much like German and was spoken in England until the Norman invasion of 1066 started the language down the road to Middle English. (The Great Vowel Shift came centuries later, roughly 1400 to 1700.)

Put simply, to learn means “to acquire knowledge.”

Almost everything we do is, in some way, learning. We take in new information all the time. Even driving the same road to work every day, you pass different cars parked along the route, trees missing their leaves, houses being painted, grass being mowed along the highway, you get the idea. We take in new information, we process it, it’s something we’ve learned.

But does it make us better?

If we’re always looking to improve — and it’s my opinion that we should — we should seek to learn more.

Some people say make your strengths stronger; some say shore up your weaknesses. I say it doesn’t matter. Be curious. Learn new things. If you come across something unexpected you enjoy, go deeper. Use your library. Spend some time on YouTube. Sign up for The Great Courses Plus or Udemy or get yourself enrolled in some MOOCs.

Get smarter, get better.

Thanksgiving and the origin myth

Every year, my aunt would make a pumpkin-pecan pie, with pecans in concentric circles covering the entire top of the pie. Every year, someone would grab the center pecan (it would always be a different family member). Every year, my aunt pretended to be angry when the pie made it to the table.

My parents’ home in Springfield, Mass., was the gathering place for my family for years. By “my family,” I mean it. My siblings and me, aunts, uncles, cousins, occasionally girlfriends and boyfriends and later spouses, grandparents, sometimes a stray college buddy with no other plans over the break.

We opened up the dining room table, added on an extra folding table or two if necessary and dug up every chair in the house, including the dusty chairs inherited from past generations that folded in various directions we had to rediscover.

Some years, there were 25 people around my parents’ table.

There were, of course, memories made. The year of the Really Bad Jell-o Mold. The year my younger cousin ate an onion like an apple for $5. The one time we ate so early we had to get some pizza later. The time the chocolate-covered strawberries never made it to the dessert table and by the time my aunt was on her way home and called to let us know, we’d already demolished them while watching The Three Stooges.

Occasionally a friend would drop by late, after everyone was gone and the family was basically passed out watching television.

That would be about 8 p.m., of course.

Now the family has spread out some. We’ve lost a generation in some parts of the family, and gained a generation in others. My parents host more like eight or 10 people; there are two, sometimes three branches of the family that get together at different locations.

Those are my family’s Thanksgiving traditions. I’m sure, if you’re in the U.S. or Canada, your family has some, too, be it a family meal or volunteering at a shelter or a protest.

How did we get here, though? What is the origin of us making hand-turkeys and playing pilgrims-and-Indians in elementary school? And am I even allowed to write those words in today’s world?

Most of what we know about the first Thanksgiving celebrations comes from Of Plymouth Plantation, a diary by William Bradford, who was on-and-off governor of what was then Plymouth Colony for about 30 years between 1621 and 1657, says Country Living.

Bradford noted in his manuscript that the pilgrims of Plymouth had enjoyed an especially good harvest in the fall of 1621. In honor of their good fortune, they planned a meal to celebrate and give thanks for the abundance of food. The local Wampanoag natives had worked along with the pilgrims to hunt, fish, and gather much of that food—and they’d even taught the pilgrims about many of those tactics in the first place. For that reason, they joined in to give thanks for it all (and yes, there was a cooked fowl dish, noted Bradford, but no mention of pie!).

“Prayers of thanks and special thanksgiving ceremonies are common among almost all religions after harvests and at other times,” notes Wikipedia. “The Thanksgiving holiday’s history in North America is rooted in English traditions dating from the Protestant Reformation. It also has aspects of a harvest festival.”

The Reformation was a time in the 16th century when England broke away from the Roman Catholic Church.

The first Thanksgiving wasn’t actually a Thanksgiving, points out National Geographic. Those early pilgrims who landed at Plymouth celebrated Thanksgiving with a fast, not a feast. Really, it was a harvest celebration, and a commemoration of a peace treaty that early English settlers and the Wampanoag signed about seven months earlier.

That treaty lasted 50 years, by the way.

But, National Geographic goes on to say:

• A year before the first Thanksgiving, the pilgrims raided Native American graves.

When the pilgrims arrived in Cape Cod, they were incredibly unprepared. “They were under the persistent belief that because New England is south of the Netherlands and southern England, it would therefore be warmer,” says Mann. “Then they showed up six weeks before winter with practically no food.”
In a desperate state, the pilgrims robbed corn from Native American graves and storehouses soon after they arrived; but because of their overall lack of preparation, half of them still died within their first year.

• The pilgrims could only settle at Plymouth because thousands of Native Americans, including many Wampanoag, had been killed by disease.

• The peace that led to the first Thanksgiving was driven by trade and tribal rivalries.

Before the Wampanoag suffered losses from disease, they had driven Europeans like John Smith away. “Now,” says [historian Charles] Mann, “the Wampanoag [were] much weaker because of the disease, and they’re much weaker than their hated adversaries, the Narragansett.”
Ann McMullen, curator at the National Museum of the American Indian, says that the Wampanoag weren’t necessarily looking to make alliances against the Narragansett; but “because the Wampanoag were in a slightly weakened position,” they realized that an alliance with the pilgrims “could fortify their strength.”

“There are always two sides of a story,” say the writers at Native Hope. “Unfortunately, when it comes to the history of Thanksgiving, generations of Americans have been taught a one-sided history in homes and schools.”

While the peace treaty with the Wampanoag may have held after that first harvest dinner, the settlers went on to slaughter other tribes, including the local Pequot.

Thanksgiving Is a Day of Mourning for Some Native Tribes
It’s important to know that for many Native Americans, Thanksgiving is a day of mourning and protest since it commemorates the arrival of settlers in North America and the centuries of oppression and genocide that followed after.
For the last 48 years, the United American Indians of New England have organized a rally and day of mourning on November 22nd. Here’s what they have to say about this choice to mourn on Thanksgiving:
“Thanksgiving day is a reminder of the genocide of millions of Native people, the theft of Native lands, and the relentless assault on Native culture. Participants in National Day of Mourning honor Native ancestors and the struggles of Native peoples to survive today. It is a day of remembrance and spiritual connection as well as a protest of the racism and oppression which Native Americans continue to experience.”

Other Native Americans still get together with friends and family, they go on to write, since the concept of Thanksgiving — giving without the expectation of reciprocity — is a Native tradition.

Susan Bates expounds a bit on the slaughter:

The story began in 1614 when a band of English explorers sailed home to England with a ship full of Patuxet Indians bound for slavery. They left behind smallpox which virtually wiped out those who had escaped. By the time the Pilgrims arrived in Massachusetts Bay they found only one living Patuxet Indian, a man named Squanto who had survived slavery in England and knew their language. He taught them to grow corn and to fish, and negotiated a peace treaty between the Pilgrims and the Wampanoag Nation. At the end of their first year, the Pilgrims held a great feast honoring Squanto and the Wampanoags.
But as word spread in England about the paradise to be found in the new world, religious zealots called Puritans began arriving by the boat load. Finding no fences around the land, they considered it to be in the public domain. Joined by other British settlers, they seized land, capturing strong young Natives for slaves and killing the rest. But the Pequot Nation had not agreed to the peace treaty Squanto had negotiated and they fought back. The Pequot War was one of the bloodiest Indian wars ever fought.
In 1637 near present day Groton, Connecticut, over 700 men, women and children of the Pequot Tribe had gathered for their annual Green Corn Festival which is our Thanksgiving celebration. In the predawn hours the sleeping Indians were surrounded by English and Dutch mercenaries who ordered them to come outside. Those who came out were shot or clubbed to death while the terrified women and children who huddled inside the longhouse were burned alive. The next day the governor of the Massachusetts Bay Colony declared “A Day Of Thanksgiving” because 700 unarmed men, women and children had been murdered.

Here’s the thing: History is written by the winners. It always has been.

While I do believe it’s important to acknowledge the truth in history, the fact is the history of humans is filled with people killing people who looked, sounded or believed differently from them, and then inflicting their beliefs on the defeated group.

So let’s do this, maybe: Let’s remember there seriously is plenty for everyone and learn to live with our differences moving forward. Let’s acknowledge that history is full of some people being shitty to other people for stupid reasons. But let’s allow our good traditions — gathering with friends and family, eating, being grateful, volunteering at shelters, whatever it is we do — to shine through.

Why do we celebrate birthdays?

It’s my birthday. Thanks for the well-wishes, but if you really want to do something for me, here are some charities you could support:

St. Jude Children’s Research Hospital
Fight for the Forgotten
Shriners Hospitals for Children
Pencils of Promise
Charity: Water

I’m 43 years old today. According to Patton Oswalt, I’m allowed to celebrate again in seven years.

But why do we celebrate birthdays, anyway? And why cake? And candles?

Fun fact: Oct. 5 is considered to be the most common birthday in the United States, with the least common being May 22. Everybody do the math? That makes New Year’s Eve the most common conception night, and sometime in mid-August the least. If you’re counting back from November 20, I guess Dad had a good Valentine’s Day.

We started celebrating our birthdays because of gods. It goes something like this: When ancient Egyptians crowned a pharaoh, the pharaoh was believed to transform into a god. Egyptians would celebrate the anniversary of the pharaoh’s becoming a god.

The first birthdays.

Later, the Greeks held annual celebrations of their gods, and, during the celebration of Artemis, the moon goddess, they would make moon-shaped cookies and add candles to mimic the light of the moon.

Next, the Romans began celebrating the anniversaries of the birth of their male friends and relatives.

Later, the Germans added cakes to celebrations, and finally, in the 1930s, we get the song “Happy Birthday.”

I’m not sure when we started celebrating the birthdays of women and children, but hey, it’s nice to have everybody on board, especially when there’s cake involved.

Early Christians declined to celebrate birthdays, declaring it a pagan ritual. That sounds a little odd to me, given Christmas trees, the Easter Bunny and the celebrations of All Saints’ Day and Day of the Dead.

Modern Christianity has evolved on this (obviously). Some consider birth as a beginning, and a reason to be grateful.

Veterans Day

We continue our exploration of U.S. holidays with Veterans Day. This will be a short one; there’s not a lot of waffling over having a holiday like there was with Flag Day, and no long trek through competing holidays like with Memorial Day.

Here’s the short version of the evolution of Veterans Day:

• It began as Armistice Day, celebrating WWI veterans, on Nov. 11, 1919.
• In 1954, President Eisenhower changed the name to Veterans Day and made it a celebration of all veterans.
• In 1968, Congress passed the Uniform Holidays Bill and moved it to the fourth Monday in October when the law went into effect in 1971.
• In 1975, President Ford moved it back to Nov. 11.

Not a whole lot of hemming-and-hawing over stuff — except now we know why most holidays give us a three-day weekend (or four-day weekend, in the case of Thanksgiving), but Veterans Day is the one day that can totally mess with your head.

Given the meaning, maybe it ought to.

Anyway, let’s look at the long(er) version.

From History Channel:

Veterans Day originated as “Armistice Day” on Nov. 11, 1919, the first anniversary of the end of World War I. Congress passed a resolution in 1926 for an annual observance, and Nov. 11 became a national holiday beginning in 1938. Unlike Memorial Day, Veterans Day pays tribute to all American veterans—living or dead—but especially gives thanks to living veterans who served their country honorably during war or peacetime.

In Britain, two minutes of silence are observed at 11 a.m. on November 11 each year.

Here is President Wilson’s message to Americans declaring Armistice Day on November 11, 1919, per Wikipedia.

A year ago today our enemies laid down their arms in accordance with an armistice which rendered them impotent to renew hostilities, and gave to the world an assured opportunity to reconstruct its shattered order and to work out in peace a new and juster set of international relations. The soldiers and people of the European Allies had fought and endured for more than four years to uphold the barrier of civilization against the aggressions of armed force. We ourselves had been in the conflict something more than a year and a half.

With splendid forgetfulness of mere personal concerns, we remodeled our industries, concentrated our financial resources, increased our agricultural output, and assembled a great army, so that at the last our power was a decisive factor in the victory. We were able to bring the vast resources, material and moral, of a great and free people to the assistance of our associates in Europe who had suffered and sacrificed without limit in the cause for which we fought.

Out of this victory there arose new possibilities of political freedom and economic concert. The war showed us the strength of great nations acting together for high purposes, and the victory of arms foretells the enduring conquests which can be made in peace when nations act justly and in furtherance of the common interests of men.

To us in America the reflections of Armistice Day will be filled with solemn pride in the heroism of those who died in the country’s service, and with gratitude for the victory, both because of the thing from which it has freed us and because of the opportunity it has given America to show her sympathy with peace and justice in the councils of nations.

The VA outlines the differences between Veterans Day and Memorial Day:

Memorial Day honors service members who died in service to their country or as a result of injuries incurred during battle. Deceased veterans are also remembered on Veterans Day but the day is set aside to thank and honor living veterans who served honorably in the military — in wartime or peacetime.

Veterans Day fact sheet from the VA »

Holidays in liminal spaces: Where Halloween, Day of the Dead originated

Upon that night, when fairies light
On Cassilis Downans dance,
Or owre the lays, in splendid blaze,
On sprightly coursers prance;
Or for Colean the rout is ta’en,
Beneath the moon’s pale beams;
There, up the Cove, to stray an’ rove,
Amang the rocks and streams
To sport that night;

Amang the bonie winding banks,
Where Doon rins, wimplin, clear;
Where Bruce ance rul’d the martial ranks,
An’ shook his Carrick spear;
Some merry, friendly, countra-folks
Together did convene,
To burn their nits, an’ pou their stocks,
An’ haud their Halloween
Fu’ blythe that night.
      ——Robert Burns, from Halloween
        One of the earliest appearances of the word “Halloween” in print

In the U.S., anyway, we have an uncomfortable relationship with death. You might call it “out of sight, out of mind.” If they’re dying of natural causes, very few people die in the home anymore, and if they do, we shove them out the door as quickly as possible.

We get mildly uncomfortable when we hear Doug Stanhope tell a (hilarious) story about not so much assisting as bar-backing his mother’s suicide.

When we opt for open-casket visitations, the individual has usually been pumped full of preservatives and made up to look alive, but more often than not ends up looking like a creepy silicone doll.

But there are cultures that don’t look at death with such fear. One Japanese custom has family members sorting cremated remains with chopsticks. The Jewish prayer of mourning, the kaddish, doesn’t mention death at all, and is instead something of a plea for peace and blessings in our lives.

That brings us to a couple of autumn holidays: Halloween, celebrated each year on October 31, and the Day of the Dead, celebrated from October 31 through November 2 every year.

Both have roots in the Gaelic and Celtic pagan celebration Samhain (pronounced SOW-en). That millennia-old holiday commemorates the end of the harvest season and the beginning of the darker part of the year. It is considered a liminal time, when spirits could cross into the land of the living, and celebrants believed they had to appease the spirits to ensure their cattle and land would survive the winter.

Dia de los Muertos, or Day of the Dead, melds Samhain with ancient Aztec and Toltec ceremonies. It’s also tied to the Catholic All Saints’ Day (November 1) and All Souls’ Day (November 2).

The holiday is very much a celebration of the lives of deceased family members, whose photos are placed with candles, flowers and other offerings on household ofrendas, or altars. As with Samhain, it’s a liminal time; deceased ancestors can visit the living.

While the jack-o-lantern is a common symbol of Halloween (more on that later), the skull is the symbol of the Day of the Dead. Revelers paint their faces, and parade in bright colors.

Printer and cartoonist José Guadalupe Posada created the most famous of the Day of the Dead skeletons, La Calavera Catrina, which is a re-imagining of the Aztec goddess of the underworld, Mictecacíhuatl.

If you want a beautiful, if Disney-fied, family-friendly look at the Day of the Dead, watch Coco.

Halloween, meanwhile, has roughly the same origins as Day of the Dead minus the Aztec influence. The night before All Saints’ Day was deemed “All Hallows’ Eve” by the Catholic Church sometime around the mid-ninth century.

The Scots poet Robert Burns contracted Hallows’ Eve to Halloween in a 1785 poem.

Celebrations of spirits around Halloween turned into mischievous gatherings, which turned into modern-day trick-or-treating.

The legend of the jack-o-lantern is an interesting one I never knew. It seems as though a town drunk named Jack trapped Satan up a tree, and promised he’d let the devil come down if the Prince of Darkness never took his soul. Jack died, as people tend to do, and being a conniving drunkard, he didn’t get into Heaven, but the devil kept his end of the bargain, and wouldn’t take Jack’s soul. He did give Jack some flame to light his way through the world, and there is your jack-o-lantern.

The official Jehovah’s Witness website uses Bible verses to decry various aspects of Halloween.

I hope you’re enjoying this rundown of mostly secular holidays we’re doing this year. It’s been fun to learn about the origins of some of these celebrations we take for granted.

What is creativity, and where does it come from?

What, then, is creativity? It is the innate quest for originality. The driving force is humanity’s instinctive love of novelty — the discovery of new entities and processes, the solving of old challenges and disclosure of new ones, the aesthetic surprise of unanticipated facts and theories, the pleasure of new faces, the thrill of new worlds.
            ——Edward O. Wilson, The Origins of Creativity

Until very recently, neuroscience had literally no clue what creativity looks like in the brain. In one book I read, published in 2015, the author said neuroscience just hadn’t figured out what creativity is yet.

If you read our series on happiness (Part 1 | Part 2 | Part 3) or empathy (Part 1 | Part 2 | Part 3 | Part 4), you’ll know there are some things we have quite a bit of information on.

But our knowledge of the brain is constantly expanding, and now we finally do have some understanding of the parts of the brain involved in creativity — or innovation or creating novel things, whether they be cell phone designs, symphonies or programming languages, however you’d like to state it.

I’m going to attempt in the first part of this series to not simply write a book report on Elkhonon Goldberg’s Creativity: The Human Brain in the Age of Innovation, a 2018 book on just what I’d been hoping to learn.

But first, a reminder from Susan Reynolds:

You have the world’s greatest supercomputer, with over 20 billion brain cells (neurons) and the possibility of forming 100 trillion synapses (neuronal connections) at your disposal. And here’s the kicker: In many cases, the types of connections that form are up to you — they are guided by your experiences, your decisions, and your desires.

Yes, your brain is amazing, it can make more connections than you can count, and you can choose your own adventure, as it were.

We spell out words like billion and trillion for the sake of clarity, but let’s write those numbers out in digits, just for laughs, and see if we can put them in perspective.

You have over 20,000,000,000 brain cells.

You can form over 100,000,000,000,000 neuronal connections.

For perspective:
• A million seconds is 11 and a half days.
• A billion seconds is almost 32 years.
• 20 billion seconds is 634 years.
• 100 trillion seconds is 3,170,979 years.
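If you want to check those conversions yourself, here’s a quick sketch; the list above works out to 365-day years, so that’s what this uses:

```python
# Verify the seconds-to-time figures above, using 365-day years.
SECONDS_PER_DAY = 60 * 60 * 24            # 86,400
SECONDS_PER_YEAR = SECONDS_PER_DAY * 365  # 31,536,000

print(1_000_000 / SECONDS_PER_DAY)              # ~11.57 days
print(1_000_000_000 / SECONDS_PER_YEAR)         # ~31.7 years
print(20_000_000_000 / SECONDS_PER_YEAR)        # ~634 years
print(100_000_000_000_000 / SECONDS_PER_YEAR)   # ~3,170,979 years
```

The last figure lands exactly on the 3,170,979 years quoted above.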

Sure, some of those neuronal connections do things like help you zip your pants, get plugged up with television advertising jingles from your teen years, and bring you back to your childhood when you smell something your mother used to cook.

But you’ve got a couple left over to help you create something new.

How does that happen?

It has to do with two parts of the brain that have only been given their just due in the past half-century or so.

While Goldberg argues that the whole brain is required for creativity — there doesn’t appear to be a distinct neural pathway that creativity follows, like the one we saw with James Fallon in our empathy series — the primary parts of the brain linked with creativity are the prefrontal cortex and the right hemisphere.

Goldberg says these two areas have been treated like “Cinderellas” by psychology and neuroscience until very recently. We only stopped doing frontal lobotomies in the 1960s (this procedure severed the connection between the prefrontal cortex and the rest of the brain), and shock therapy was applied to the right hemisphere way more than to the left, which deals with language.

Let’s back up a second and take a look at the concepts we’re dealing with (you know by now I’m a fan of definitions and etymology).

Creativity. Here’s a definition, slightly modified to take the word create out of the definition:

the ability to transcend traditional ideas, rules, patterns, relationships, or the like, and to make meaningful new ideas, forms, methods, interpretations, etc.; originality, progressiveness, or imagination.

The word dates back to about 1800, so it hasn’t been in use very long.

Innovation. Meaning “the act of introducing new things or methods,” we’ve been using the word innovation in this way since the 1540s.

Novelty. This is admittedly different from the other two. Creativity and innovation are types of human output; novelty really is a type of input. It means “the state or quality of being new, or unique; newness,” and dates all the way back to the late 1300s.

So really what we’re getting at is making something new.

There are three commonly studied brain networks, Goldberg writes:

• The Central Executive Network (CEN)
• The Default Mode Network (DMN) — you’ll remember this is the brain’s chatterbox
• The Salience Network (SN)

The CEN and DMN, you might remember, work antagonistically — that is, when one is working, the other is off.

The CEN is what’s at work when we tackle a complicated task.

The DMN is what’s at work when we have no task, or we’re on “autopilot” (but not in flow). Consider, for example, when you have a repetitive task that you’ve done a lot and don’t need to concentrate on, like cooking eggs or driving. If something goes wrong, your CEN takes over.

The SN was discovered more recently and has been studied much less, but it seems to operate as a one-way switch that turns off the DMN and turns on the CEN. For example, if you’re coasting along on the highway, letting your mind wander, your DMN is at work. Say there’s a sudden downpour, visibility drops, cars start slowing and suddenly you need to pay attention; your SN will knock your CEN into drive. Your DMN may take over again later when everything clears up, but your SN doesn’t flip that switch back; it only works in the one direction.
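The one-way nature of that switch can be sketched as a toy state machine. To be clear, this is purely an illustration of the logic described above, not anyone’s actual neuroscience model:

```python
# Toy illustration of the Salience Network (SN) as a one-way switch:
# the SN can hand control from the DMN to the CEN, but it never
# switches things back; the DMN simply resumes when the task ends.
class Brain:
    def __init__(self):
        self.active = "DMN"  # mind-wandering by default

    def salience_event(self):
        """Something salient happens: the SN forces the CEN on."""
        self.active = "CEN"

    def task_resolved(self):
        """The demanding task ends; the DMN drifts back on its own.
        The SN plays no part in this direction."""
        self.active = "DMN"

brain = Brain()
brain.salience_event()   # sudden downpour on the highway
assert brain.active == "CEN"
brain.task_resolved()    # rain clears, the mind wanders again
assert brain.active == "DMN"
```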

And it’s salience that’s important here, it seems. Something salient is whatever is leaping out at you right now.

In setting up neuroscience experiments, Goldberg asserts, scientists have generally been concerned with what is correct — press a button when the light is red, or decide whether this picture is the same as the last picture — instead of what is salient, that is, what I’m thinking about (what’s relevant) right now. Creativity, innovation and the like are more about salience, which is only just starting to be studied.

We’re going to talk a lot more about how our brains process information — and novelty, in particular, given our quickly-changing digital world — over the next couple of parts of this series, but we should start with a basic understanding of how our brains process the world.

Our brains work fundamentally on pattern recognition, Goldberg writes. It’s why we know a person we’ve never met is a person and why we know a tree we’ve never seen before is a tree. It’s the same with complex ideas — we just need to take the patterns and put them all together.

When we fail to recognize a pattern, however, such as an object we’ve never seen before, or a doctor who’s never seen a set of symptoms, “the brain has encountered a novel challenge. A subjective feeling of being confronted by novelty is mostly the result of the left hemisphere’s failure to find a solution by failure of the incoming information to resonate with any of the previously formed attractor networks.” (Goldberg, 25)

And that’s where creative problem solving steps in.

Let’s also discuss metacognition, since that will come up again.

The prefrontal cortex is responsible for metacognition, which is to say that it takes all the things the other parts of the brain are doing and ties them together into something a little more cohesive. Goldberg likens this to building with a Lego set. If all the bits of information we have stored in our brains are Lego pieces, he writes, the prefrontal cortex — which doesn’t fully mature until our early-to-mid-30s — is responsible for shuffling the pieces around, assembling, disassembling and reorganizing everything.

The ability to combine the elements of old ideas into new configurations is essential for generating new ideas and new concepts, which in turn is essential to creativity. This is how the figments of human imagination like mermaids and unicorns came into being, and this is how many important scientific ideas, technological inventions, and artistic creations have been born. (Goldberg, 47-48)
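That recombination idea can be toyed with in a few lines of code. This is purely a metaphor, not a model of anything in the brain: stored pieces, pairable into novel wholes, the way a mermaid is a woman’s top half plus a fish’s bottom half.

```python
# A playful sketch of recombination: new concepts as novel
# pairings of familiar stored pieces.
from itertools import permutations

pieces = ["woman", "horse", "fish", "eagle"]

# every ordered pairing of two different pieces
hybrids = [f"{top} top + {bottom} bottom"
           for top, bottom in permutations(pieces, 2)]

print(len(hybrids))                          # 12 pairings from just 4 pieces
print("woman top + fish bottom" in hybrids)  # the mermaid is in there
```

The point of the metaphor: even a tiny stock of pieces yields a combinatorial explosion of configurations, which is why the shuffling work of the prefrontal cortex is so generative.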

A lot of this work we do when we’re asleep or otherwise in silence.

In Why We Sleep, Matthew Walker writes that “Sleep provides a nighttime theater in which your brain tests out and builds connections between vast stores of information. … In ways your waking brain would never attempt, the sleeping brain fuses together disparate sets of knowledge that foster impressive problem-solving abilities” (p. 132).

In other words, when you go to sleep and shut off that conscious brain, your dreaming brain will go buck wild.

REM sleep, he notes, offers the “benefit of fusing and blending those elemental ingredients together, in abstract and novel ways. During the dreaming sleep state, your brain will cogitate vast swaths of acquired knowledge and then extract overarching rules and commonalities — ‘the gist.’ We awake with a revised ‘Mind Wide Web’ that is capable of divining solutions to previously impenetrable problems” (p. 219).

We’ll get to more prescriptive measures in the fourth (and final) part of this series, but if you’re trying to piece together some new stuff you took in during the day, sleep on it.

Next up: An argument for creativity.