Coronavirus: Personal responsibility, public responsibility, truth and vetting

It’s been a crazy couple of months, right?

It’s so difficult to vet information about something as confusing and fast-moving as a new virus — and that’s what COVID-19, the disease caused by the new coronavirus, is — and knowing what to do and when to do it is tough.

The most level-headed discussion I’ve heard so far is Joe Rogan’s podcast with Michael Osterholm, head of the Center for Infectious Disease Research and Policy (CIDRAP) at the University of Minnesota. Keep in mind that any up-to-the-minute facts will quickly become outdated, but you can’t avoid that when you’re just having an off-the-cuff conversation.

There are a few things we should consider, and some things that will be interesting in retrospect.

We talked about vetting good information on JKWD a bit ago, so let’s start there.

I will admit that I have no concept of what the “average” person is getting for news about coronavirus. I work closely with 10 newspaper-affiliated websites for 40 hours a week. I probably see 100 coronavirus-related stories per shift, in addition to the items I seek out in my preferred local, state and national media outlets, my Twitter feed (including the governor of Georgia, the CDC and WHO) and whatever else happens to cross my eyes and ears while I’m not working.

Most of you probably aren’t getting 750-plus pieces of information a week unless you’re sitting there glued to it at all times, and if you are, you should probably stop that immediately. It’s exhausting, and it’s going to be around for a while.

But I recently talked to someone at a local business who had no idea that Italy was entirely shut down and had seen thousands of deaths.

If you want to be informed about this — and I think you should be; I’ll address that when I discuss personal responsibility — don’t go overboard, but choose wisely. Read your local newspaper website, and maybe a couple local TV websites. Check the website of the biggest newspaper in your state, and, if it doesn’t have good capital coverage, the site of the paper in the capital area (in many states, the biggest market is the capital, but that’s not true across the board — Illinois and Pennsylvania are two examples). Check traditionally reliable sources like the New York Times and Wall Street Journal.

If you want international coverage, the BBC is always a good place to turn.

Find places that report the facts. If they make recommendations and/or criticisms, make sure they’re backed up with reported facts, not viewpoints from politicians. For example, “three people died in such-and-such city” is a fact. “Only a few people died in such-and-such city” is a viewpoint. Tell the families of those three people that it was “only a few people.” Now extrapolate that to the hundreds that are dying a day in some places.

It’s easy to get people to freak out too much or not enough by putting a viewpoint on our facts. But people are smarter than we give them credit for — let’s give them the facts and let them make informed decisions.


Let’s talk responsibility, and two types, which I’ll call personal and public.

Personal responsibility pertains to the things you owe yourself — information gathering, self-reflection, good habits, etc.

Public responsibility pertains to the things you owe to others — to not infect your neighbors and family, to tell people the truth as you understand it, etc.

Most adults will probably get this virus, even if they show mild or no symptoms (that’s important, because you can have no symptoms and still pass it along to people you come in contact with). Flu pandemics seem to be the model we can learn from (long read: an NIH workshop summary on the Spanish flu of the early 20th century). Some 1.4 billion people tested positive for the H1N1 flu (about 20 percent of the world’s population, including minors) a little over a decade ago, and at the high end of estimates, it killed about 575,000 people (about 0.04 percent).

If you figure that you didn’t get tested if you had no symptoms or only had mild cold-like symptoms, a lot more people had it.
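For scale, here’s a quick sanity check of the H1N1 figures above — a rough sketch that assumes a 2009 world population of about 6.8 billion (my assumption, not a number from the reporting):

```python
# Rough sanity check of the H1N1 figures quoted above.
world_pop_2009 = 6.8e9   # approximate world population in 2009 (assumed)
infected = 1.4e9         # high-end estimate of H1N1 infections
deaths = 575_000         # high-end estimate of H1N1 deaths

print(f"Infected share of world population: {infected / world_pop_2009:.0%}")
print(f"Deaths as a share of the infected: {deaths / infected:.3%}")
```

Those work out to roughly the 20 percent and 0.04 percent quoted above.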

With COVID-19, we’re also having a problem with getting enough tests, so our current numbers are underreported because technically, if you can’t get tested, you don’t have it.

As I’m writing this, Georgia’s (the US state, not the country) reported cases are increasing between 40 and 50 percent per day. I know of one person who said she couldn’t get tested at a local health center despite being immunocompromised and showing symptoms, because they didn’t have any tests. The local health department said they were waiting on tests, and the local office of the Centers for Disease Control and Prevention (CDC) was also waiting on tests. So if some high-risk people aren’t able to get tests, you can see how most people who are asymptomatic or only get mild cases (about 80 percent of people who test positive, by the way) won’t get tested.
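To put a 40-to-50-percent daily increase in perspective, here’s a back-of-the-envelope calculation (it assumes steady exponential growth, which real outbreaks only roughly follow) of how fast reported cases double at those rates:

```python
import math

# Doubling time under steady exponential growth: ln(2) / ln(1 + r)
for daily_growth in (0.40, 0.50):
    doubling_days = math.log(2) / math.log(1 + daily_growth)
    print(f"{daily_growth:.0%} per day -> cases double every {doubling_days:.1f} days")
```

At those rates, reported cases double roughly every two days.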

No matter how high the numbers climb, we won’t be reporting all cases.


Centers for Disease Control info & guidelines | World Health Organization information


Even health professionals don’t seem to be able to predict how the virus will hit any one individual. We know that the respiratory repercussions seem to be really bad, so people with COPD or asthma or who are smokers are likely to get hit pretty hard, but sometimes they’ll get a mild case.

Which means that no matter how healthy you are, you can’t predict how it will actually hit you if you get it. So try not to get it.

And in case you are asymptomatic, you can’t predict how it will affect other people if you pass it along, so try not to. You’re probably going to pass it along to people you live with. But there are some easy steps you can take to make sure you don’t give it to anyone else — steps like stay the heck away from them.

We have an elderly neighbor. She has adult children who stop by and check in on her, and I imagine they call her, too. When we go to the grocery store, we call her to see if she needs anything.

Our responsibility to her in delivering things to her should be to wash our hands, wipe down the goods we’re bringing her, put them in a separate bag, place them outside her door, and ring her bell or call her to let her know they’re there.

Our responsibility to ourselves in that case would also be to wash our hands again after touching her bell.

It’s not that difficult, but it’s important.

We also owe it to ourselves, if we wish to stay healthy, to go out only for important things, like getting food and medicine. Yeah, it stinks staying home. But think about things like going out to eat. Do employees at your favorite restaurants have paid sick leave? Are they likely to stay home if they’re mildly symptomatic — say, a little sore throat and a sniffle, like if they had a cold? Probably not, if they don’t have much in the way of sick days.

Our trusted officials (such as elected and appointed members of government, but also the heads of organizations like hospitals and urgent cares and shopping malls and the sorts of places people gather) also have a responsibility to us. We’ve asked them to lead in times of crisis, and this is most certainly one. Give us facts. When you give us directives and/or suggestions, back them up with facts, because people will be more compliant if they understand why.

Viruses aren’t partisan — they don’t stop outside your mouth and ask who you voted for before deciding whether to infect you — so neither should the directives handed down, particularly by government.


When this pandemic ends, there will be some interesting things to study, from a social science perspective (we know that biologists and virologists and geneticists will do their thing).

• Which leadership traits did the countries that did the best and worst at containing the virus share? China, where the virus started, was slow to admit its existence. The US, fairly far from the epicenter of the pandemic, didn’t take it seriously at the start.

• Which cultural traits did the countries that did the best and worst at containing the virus share? Did people in collectivist cultures stay home when told more readily than people in individualist cultures?

• What cultural and leadership traits correlated with the least economic interruption and quickest recovery? What measures had the highest impact, both positive and negative?

• From a media perspective, how does this new era of reader-driven content selection and bottomless news hole affect coverage, especially deep reporting?


Stay healthy and safe, folks, and don’t overwhelm yourself with too much information.

Moving the goalposts, changing the rules

When you’re training a dog, consistency is key. There’s a language barrier, so you’re teaching with either a carrot or a stick. The thing is, the dog has to understand what the reward or punishment is for.

This isn’t a post about training dogs. Hang with me for a bit.

There are some things that are innate to dogs that you need to teach at human scale. One trainer taught us about walking. The alpha walks in front. Always. The non-alpha dog’s hip doesn’t pass the alpha’s hip, and when the alpha stops, a line drawn between its front toes and extended past is the stop line for non-alpha dogs; crossing this line is a challenge to alpha status.

We can translate that pretty easily into human actions. When you’re walking a dog on a leash, the dog’s nose and even neck might extend past you, but as soon as the dog’s front hip passes you, the dog is leading the walk. When you stop with, say, your toes together, the dog might step in front of that line, but if you’re alpha, it should retreat behind your toes as it stops.

You need to decide how to communicate that you will reward for compliance or punish for non-compliance, but that’s something a dog understands.

Something dogs don’t do innately is sit on command. They’ll do it when they understand it’s what you want, but you have to find a way to explain to the dog that sitting is what you want.

Once you establish these rules, any deviation from them is confusing to the dog. You can add new commands, but you can’t change the rules. You can change the reward or punishment, but it has to be for the same thing. The easiest way to explain this is shifting from a steady reward to a variable reward. To teach the behavior, you reward every time. To keep it going, you reward at random.

That’s a Skinner experiment, by the way. If you give a rat a treat every time it hits a lever and then stop the treats, it will only hit that lever a few times and then give up. If you give a rat a treat sometimes when it hits the lever and then stop the treats, it will keep hitting that lever for a while before it gives up.

So you can change the reward (or punishment), but again, you can’t change the rules.

It’s the same with people. Even the most flexible among us are unlikely to keep trusting someone who changes the rules.


What you can do, though, is move the goalposts.

It’s one thing I’ve been doing with my daughter. She knows the rules are, if she wants something that is somewhere she can reach it, it’s up to her to go get it. Sometimes, though, that something moves.

We go to the library for story time. There’s an area rug that’s a reasonable place for kids to crawl around, and, at the back of the rug there are two long upholstered benches, which effectively help section off the reading area.

We arrive early so we can pick out some books to read at home for the week, and to get some moving around in. Typically, I’ll put her somewhere on the rug in a seated position, put our tote on one end of the bench, and walk over to one bookcase and start grabbing books.

When the baby makes her way to the bench, pulls herself up and moves near the bag, I bring it all the way to the other end of the bench and choose books from another bookshelf.

That’s moving the goalpost. I didn’t change the rules — the bag is still there when she reaches it and she still has to move to it — but I did move the goalpost.

You have to do this carefully, of course. If I never let her reach the goalpost, she’ll just give up. If you’re setting numbers goals for employees, you can’t keep moving the goalposts, or they’ll never bother trying to reach the first one.

But you can set additional goalposts. You could set an additional incentive at 10 percent over the initial goal, for instance.

When I take the bag and bring it to a bookshelf on the other side of the bench — so that my daughter would have to either climb over the bench, crawl under it or go around it — now I’ve changed the rules. Instead of doing something she knows how to do and practicing it, I’ve made her solve a new problem. And yes, we certainly do that, but not as part of a game with different, known rules.


Now here’s an exercise for you.

• Where in your life are you changing the rules for others? This could include not properly explaining the rules — are you making covert contracts? — or changing the game entirely, such as changing from requiring your three-year-old to eat his vegetables to get dessert to using the toilet to get dessert without first setting up the new game?

• Where in your life are you changing the rules for you? Are you setting yourself up to fail, or worse, letting yourself off the hook?

• What goalposts can you move for yourself? Can you have better relationships, save more money, create more experiences, improve your health?

• What goalposts can you move for others? Can you challenge your friends, spouse or coworkers to be better?

Discovery: Old promises and where social media can be

Back in 2008 and 2009, before the colleges and businesses flooded Twitter, a bunch of us Central New Yorkers met on the social platform. I spent New Year’s Eve 2009-2010 with Twitter friends. Two of them were involved in my wedding, as officiant and DJ. Our photographer was a friend before Twitter, but that’s where we firmed up our friendship.

Fast-forward a decade, and Twitter was a cesspool. Facebook, and to some extent Instagram, became somewhat the same.

I wrote:

About six weeks ago, I was feeling buried in hatred. The bile and ugliness that has become a substitute for discourse in today’s world was too much. I went over to my Twitter account and deleted almost 30,000 tweets — 10 years of thoughts, connections, replies and occasionally a joke that didn’t land.
 
Something miraculous happened: nothing. Nobody said anything. Nobody unfollowed my now-empty account. I was hurt for about 12 seconds, and then I remembered that the reason I left the service in the first place was the community I’d built was gone. The followers were still there. The friends I’d made were still my friends. But Twitter was only serving as a shouting board; there didn’t seem to be any more listening.

But it turns out it’s in how you use it.

We know that algorithms are meant to make you miserable. It’s most evident with Facebook, but it can kill your Google News feed, your YouTube feed (especially if you let the auto-play run), your Twitter feed and basically any other network that runs on an algorithm trying to keep your attention.

Recently, though, I’ve tried to make things fun for me again. My Instagram feed seems to be almost chronological now, and I’ve turned off the auto-play on YouTube — that’s where it just picks a related video to play so eventually you get deeper into a rabbit hole — and I’ll tell you what I’ve done with Facebook and Twitter to make them more fun.

For Facebook, I joined a couple of groups that I enjoy being part of. Then I added an extension to my Chrome browser called News Feed Eradicator for Facebook. It does what it sounds like: dumps your feed.

When I visit Facebook, I see the usual stuff on the left and right rails, and in the middle, where you would normally see first the option to post a status update and then your feed, I see the option to post a status update and a randomized quote instead.

If I want to see what a specific friend or business is posting, I can go to that profile, but generally speaking, I just go to my groups when I feel like it.

It turns out there are good conversations taking place on Facebook; you just have to know where to look. Or where to not look.

For Twitter, I did the thing that a lot of us did back in 2008 or whenever we joined: I hijacked some follow lists. Specifically, I went down the list of people Douglas Rushkoff and Chuck Klosterman follow.

It pretty quickly led me to the sort of thing I used to love finding, before you could tweet threads. (I’m going to go down the rabbit hole on a couple of these items in the coming weeks because I find them interesting.)

Here’s the unthreaded version (that also wasn’t a thing in 2009, by the way).

If you’re fed up with social media but either have to be on it for work or have that dopamine addiction you feed with likes and such, know that you can make things better for yourself without going cold turkey.

Little lessons from family

Next week on JKWD, we talk to author Dillon Barr. One of the things he writes about in his book The Happiness Gap is the little lessons we learn from people in our family.

He tells of a lesson in compassion, taught by his dad. When he was a kid, he did the very little-boy thing of using a magnifying glass to burn ants; his father pointed out that the ants had done nothing to deserve it.

I talked about my grandfather; we called him Zadie. His wife, my grandmother, was Bubbie.

When Bubbie died, the rabbi came to the condo to learn about her for the eulogy. He asked how long the courtship was, and without skipping a beat, Zadie told him, “61 years.”

I was the one to stay with Zadie the night following the funeral, to see that everything was OK. Before we retired for the night, I asked him what tomorrow looked like. He told me we just get up and go about our day; that’s what we do.

In a long-surviving couple, when the man dies, the woman gets more involved in the communities they were involved in, whether that’s the senior center or a book club or a stitch-n-bitch circle.

When the woman dies, the man usually dies fairly quickly. Men don’t often form friendships in adulthood. Zadie was a Mason and a Shriner, though, so at least he had people he could call on.

He lived another 13 months. He wouldn’t let the grandkids visit him near the end, but he took our calls every day. When it came time for him to lay down his working tools, he waited until his three children (my mom, aunt, and uncle) could be at his bedside. He told them to keep the family together.

And we are. My aunt and uncle, and some of the cousins, still live near each other and see each other. Our branch of the family — my parents, my sister, my brother — live in a different part of the country, and we still gather frequently, often at my home.

When I sit at my desk — when I’m working, or writing a blog post, or recording a podcast, for example — I’m looking straight across at a picture of Zadie from his upsherin, the traditional first haircut a three-year-old Jewish boy gets. It’s a good reminder that he’s here and keeping an eye out (yes, that’s the hair from the haircut, which would have been done in late 1926, and yes, it’s hung straight despite the angle I took the photo of the frame from).

What are lessons you learned from your family?

Losing our humor: Speech, consequences and throwing stones

At the end of 2019, Don Imus died. For almost 50 years, he was one of those people media outlets like to call a “controversial radio talk show host.” When Howard Stern entered the picture, that moniker turned into “shock jock.”

Imus was good at his job. His job was to talk to a soundboard, engage people he couldn’t see, and keep listeners. He gave and raised money to help beat childhood cancer, but he also taunted a lot of people, frequently in a mean-spirited way. Certainly not everyone was upset to see him go.

In 2007, he called the Rutgers women’s basketball team a bunch of “nappy-headed hoes.” It was more racist than jokey, given his history, previous and subsequent. In 1993, he’d called journalist Gwen Ifill, who was black, a “cleaning lady.” He used anti-Semitic terms to refer to Stern in 1984. When talking about former NFL player Pacman Jones’s arrests in 2008, he asked, “what color is he?”

Like all of us, he had some good, and some bad. He got fired a couple of times, but, as I mentioned, he was good at his job, and he always found a place to land.

This isn’t about Don Imus, though. It’s about talking, and, more specifically, the consequences of speech in an ever-more-public world.


The first amendment to the U.S. Constitution reads thus:

Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.

Of course, there are limitations to each of these freedoms. You can’t sacrifice children, even if your religion calls for it. You can’t incite a mob to violence or make serious threats of violence, even though that limits speech. You actually have to get a permit in most places to peaceably assemble.

And remember, these are only legal freedoms. They have nothing to do with how the masses respond, rightfully or wrongfully.

I work for a collection of newspaper-affiliated websites. Most of the sites allow commenting on their stories, and some stories garner a lot of comments. Hundreds. Sometimes thousands. We try to keep an eye on comments, and disable comments that we feel violate our community rules. Those who violate them severely or repeatedly may see their accounts blocked for a day or pulled altogether.

Personally, I think our rules are clear and easy to understand. We reformatted a section of our user agreement to take out the legalese. Read them here:

Differences of opinion make for great discussion, but please do not abuse other users through name calling or ad hominem attacks.
 
Do not post dehumanizing material. This means content that is racist, obscene, xenophobic, homophobic, misogynistic or bigoted against individuals or groups. Please help the community by flagging such content.
 
Please use common sense. Do not violate anyone’s privacy by posting identifying information or encouraging anyone else to do so. Do not encourage violence or criminal activity.
 
Please stay on topic. Posts that criticize moderation or distract from the article’s topic by introducing unrelated hot-button topics may be removed.
 
Please be thoughtful. Comments that negatively characterize broad groups of people may be removed. Such assertions, which may feel satisfying to write, are unlikely to change anyone’s mind and make it significantly more difficult to have a productive discussion.
 
Ask: Is it true? Is it necessary? Is it kind?

I don’t believe all comments need to be kind, but I do believe they shouldn’t be unnecessarily mean.

We are frequently accused of being opponents of free speech. When asked about free speech personally, I present these two caveats:

(1) Freedom of speech does not mean freedom from consequences (i.e., say what you want, but you might get fired, kicked out of a restaurant, or punched in the face).

(2) Right to free speech does not mean right to a platform. Nobody is required to publish your book or make your movie, and our sites are not required to let you post whatever you want (and for that matter, YouTube, Twitter and Facebook can say farewell whenever they want; they’re all businesses that you don’t own).


Enter Chuck Bonniwell. You’re probably by now familiar with Bonniwell’s story, though you may have forgotten since it happened almost three weeks ago.

A talk show host for Colorado radio station KNUS, Bonniwell commented that we needed “a nice school shooting” to interrupt the House impeachment proceedings against Donald Trump.

He and his co-host immediately knew it was the wrong thing to say. You can’t un-say things when the mics are live and so is the broadcast, but you can sure try to backpedal.

And this, obviously, was something there’s no backpedaling out of. Bonniwell was unceremoniously fired, which, of course, he should have been. It’s what happened next that shows how far we’ve fallen.

Most people recognized this was a joke. Most people recognized it was a bad one. No one thinks Bonniwell shouldn’t have been fired. No one actually wants a school shooting.

Once upon a time, local radio was local radio. Only people listening would have heard about this. A story like this, maybe the local paper picks it up. Maybe the wire picks it up from there and it’s read and heard all over the country.

Maybe a few million people would get angry in the privacy of their own living rooms. But now with social media, not only can they get angry in the privacy of their own living rooms aloud for the world to see, they can tag Bonniwell and make sure he has an opportunity to know what they all think of him.

Everyone.

Based on this one misguided thing he said — a thing he knows was misguided. It’s not even like he thinks it was a good thing to say. Seconds after it came out of his mouth, he knew it wasn’t funny, he was going to be fired, and for a while, he was going to be that guy who joked about a school shooting.

But no one is prepared for the entire world telling them they’re a bad person. Especially since most of the world doesn’t understand the difference between a bad person and a person who says a bad thing.

If a four-year-old hits a classmate, you don’t tell the four-year-old she’s a bad person. You tell her that hitting people is a bad thing to do. If you tell her she’s a bad person, it’s not long before she believes she’s a bad person only capable of doing bad things.

I know absolutely nothing about Bonniwell. I didn’t even listen to the clip. He might be a horrible human being, but probably not. He has a wife and he had a job at a place where other people work, in a decent-sized market, which means he probably had his start somewhere else.

But if, I don’t know — hundreds of thousands? millions? — of people on social media call him evil and bad and horrible, maybe he actually becomes those things. You say something to someone enough, they believe it.

Don Imus? Not exactly the picture of tolerance, but not everything he did was terrible. People are nuanced.


I remember the exact moment I became aware of the things we are now calling “cancel culture” and “virtue signaling.”

A PR representative named Justine Sacco flew to South Africa. She tweeted this: “Going to Africa. Hope I don’t get AIDS. Just kidding. I’m white!”

By the time she’d landed, she was trending on Twitter, with #HasJustineLandedYet. Someone came to the airport in Cape Town to take her picture to show she had, in fact, landed. She was fired from her job.

I took part in the piling on.

If you don’t know someone, it’s hard to tell on Twitter if they’re telling a joke, unless it’s super well-crafted. Or their bio actually identifies them as a comedian, or the account as a satirical account.

Obviously, that’s a terrible thing to say. But if you were a comedian, you could probably get away with it.

Cancel culture is the notion that, if you’ve ever said or done something awful, you are a horrible human being and you don’t deserve to ever have a job or a family or anything. There is no such thing as reform, as learning, or jokes.

Think James Gunn, the director who was fired from the Guardians of the Galaxy franchise after someone pointed out some offensive, not-funny-anymore, decade-old tweets. He was later reinstated, because he’s good at his job and he doesn’t actually think rape is funny.

Or Kevin Hart stepping down as 2018 Oscars host after someone found some homophobic tweets from 2009 up through 2011.

You know, because people can’t evolve, ever. That’s not true, actually. Most people are constantly evolving. If you hold everyone up to who they were 10 or 20 years ago, you’ll never see who they are now.

Jon Ronson wrote a story for the New York Times Magazine about Sacco, and he later wrote a book about people who had been shamed publicly.

Virtue signaling, on the other hand, isn’t about the person who once said something bad, it’s about the person pointing it out. It’s a way to say, “I, too, am outraged by this.” When you post on Facebook about something most of your friends will agree with, you’re virtue signaling. When I piled on to #HasJustineLandedYet, I was virtue signaling.

I try not to do it anymore. If I’m tweeting angrily about something, I try to back it up with reasoned thought. If I can’t, I don’t tweet it, or I admit that I don’t know why I’m angry.


So what’s wrong with pointing out problem speech when we see it?

On the surface, nothing. But what happened to the days of pulling the offender aside and saying, “hey, that was kind of messed-up”?

What we do now is more akin to lining up, pointing our fingers at the offender and saying, YOU MESSED UP! and maybe tossing some rocks on your way out.

And much like the stoning scene in “The Life of Brian,” most of us do it anonymously or semi-anonymously.


The path forward

I’m not sure how much uglier our public discourse can get. I can’t remember the last time I saw a thoughtful discussion in an online textual forum. That might have something to do with how we read online. On a big screen (like a laptop), we read in an F shape — we read the first couple of paragraphs in full, then we scan down the beginning of subsequent paragraphs, occasionally reading a full line. That’s why a lot of long-form writing is broken up by so many photos and videos and lines, so that if you skip, say, seven or eight paragraphs, maybe you’ll pick back up for a couple here and there.

On a small screen (like a cell phone), we read as long as our attention lets us. Our attention spans are getting shorter and shorter.

And when we do read long-form, we tend not to respond in long form.

I think what’s going to happen, for the thoughtful, is a move away from social media as a forum. It will become more and more a marketing space. Podcasts, YouTube and IGTV are probably the future, although more than likely they’re the present, and I’m just behind. I mean, look at this here; I still write a blog. Most of the YouTube videos I watch are video versions of podcasts that I would be listening to on my phone if I weren’t sitting in front of the computer for work. I almost never turn the sound on for an Instagram video; I’ve never clicked through to “watch the full IGTV video.”

This sounds pessimistic, and it really is my view that we’re not in a great place. It’s also my view that, if we want this to change, we will need to alter our course significantly.

I also think it’s worth altering our course and fixing it. The thing about hard work? It’s hard. That doesn’t mean we shouldn’t try. Onward.

Give a shit

“It’s so hard to get people to give a shit,” Whitney Cummings tells Bert Kreischer.

And she’s talking about everything — #MeToo, truth, your friends.

Cummings had just come off a social media blackmail incident. She took a selfie in the shower, put it on Instagram and Twitter, and then realized her nipples were showing. She took it down, but not before someone downloaded the image and began contacting her, asking for money in exchange for not posting the photo.

She took charge, posted the image, and some of her friends in the comedy world added to the thread with some of their own embarrassing pics.

They gave a shit about her, and proved it publicly, sometimes in funny ways most of us in the “normal” work world never would have.

The more I think about it, this is the essence of what I want Better Humanhood to be, in three words.

Give a shit.

Why? Because it’s important. We post belittling memes on Facebook. We yell at strangers on Twitter. We find something we don’t like on page 147 of a new book and we declare everything in that author’s collected works worthless. We taunt people for not knowing the answer to every conceivable question, and we refuse to allow that people grow smarter as they age.

Yet when we sit down and speak to people in real terms, without hypotheticals and without making up some mysterious “other,” we can disagree civilly and maybe even learn from each other.

The people we are when we’re having a sandwich and a beer together, uninhibited by the expectations of people in our in-group or the need to virtue signal (that is, declare a view popular among people you wish to impress, totally devoid of nuance)? Those are the people we should be all the time.

If Red Sox and Yankees fans can sit together in the stands, eat hot dogs and enjoy the game — meanwhile hating with every fiber of their being the other’s favorite uniform — surely we can disagree on the best way to move our world forward without throwing bricks at each other.

If we can’t give a shit about each other, why even be here? Be better.

The myth of celebrity

I was listening to Sean Lennon on Marc Maron’s podcast recently. Yes, that’s one of the sons of Beatle John Lennon and Yoko Ono.

He put a couple of things in perspective for me.

One was the role of a dad (which is something I think about every day, by the way). He discussed how surreal it was to watch thousands of people gather in Central Park every year on his dad’s birthday for years after John Lennon was killed. He said he hears all the time from people, “you have no idea what your dad meant to me.”

Which he feels kind of hijacked his dad’s death from him — this wasn’t just a man who moved him with his music. John Lennon was the guy who cut his meat for him at dinner and taught him how to put on pants.

[Pause. We’ll get back here, I promise.]

The other thing Sean Lennon said was that he tries not to spend too much time with famous people he admires, because eventually, whatever expectation he had of them on a pedestal crumbles, and he can see their humanity.


Joan Jett has been a rock star my entire life. She first recorded a version of “I Love Rock & Roll” in 1979 (the hit version came a couple years later). I turned three that year.

I had the opportunity to have a brief phone interview with her, in 2002. She was open to chatting but generally quiet, something that also came across in person the couple of times I’ve been able to meet her; much different from her giant on-stage persona.

I don’t remember a lot of that 10-minute interview, more than a decade and a half later. I do remember asking about politics in our post-9/11 world; she had things to say and not enough time to say them. But here’s a thing that still stands out to me: I asked her what it was like to have to play the same song, night after night, 300 or more performances a year.

Because if you went to a Joan Jett concert and she didn’t play “I Love Rock & Roll,” you’d be upset.

I imagined it would get old, especially, at that time, more than 20 years on.

In fact, she said, it was the opposite. Imagine everyone in the audience, every night, anticipating the song all night, the cheering when they heard the opening bars, and then hearing everyone — everyone — singing the song she made famous. (No, she didn’t write it, but it’s been four decades since people were flocking to see Alan Merrill, and if you’ve heard his version, with or without the Arrows, it was probably because you went searching for where it came from.)

It doesn’t matter how the rest of your night is going. If it was the worst day ever, and you had that waiting for you at the end of the day — standing in front of 100 or 1,000 or 20,000 people, smiling and singing your music to you — your day ends well.

Because celebrities are just people. They have bad days, too. Jett happens to have a go-to bad-day killer.


If you’re a fan of the Evil Dead films and TV series, Bruce Campbell comes across in real life basically the same way he comes across on the screen. Not quite so arrogant, maybe, but he is confident and he doesn’t have time for your bullshit.

He tells crowds at speaking events that, when it comes time for Q&A or meet-and-greet or book signings or whatever, no, he will not say “woodshed.”

“I am not your monkey,” he says.

[There’s a scene in one of the movies when the visual and audio are from different takes; you hear the word “woodshed,” but Campbell’s character, Ash, doesn’t move his lips.]

Just like the sibling who’s sick of hearing the same embarrassing story about themselves every Thanksgiving for decades, it’s just not funny to him anymore.


Celebrity is an old word, dating to the late 1300s, when it was related to celebrations of religious or social rites. By the 1600s it came to mean fame, and it was in the 1800s that it came to mean a person.

Of course, it also still means fame. So, a celebrity is a person with celebrity.

Celebrate good times. Come on!


Sean Lennon wears pants. When he was a little kid, someone taught him how to put pants on; that someone happened to be one of the most famous songwriters the world has ever known. And that someone was killed when Sean Lennon was five years old.

Apart from all of the celebrations of his dad’s life on the anniversary of John Lennon’s murder, Sean Lennon is reminded of his father every year on his own birthday: Sean and John were both born on October 9, 35 years apart.

In December of this year, John Lennon will have been gone for 39 years. He lived only a couple of months past his 40th birthday.

Who knows what he would have done artistically? He and Yoko Ono were married for 11 years, about 10 of them after The Beatles broke up. Sean Lennon credits her for his music and film career — she’s very good at the technical aspects, and she creates across a variety of art forms — so John Lennon had plenty of prospects in the realm of creative partnership.

It seems they — John and Yoko — never cared much for the expectations of the crowd. In fact, looking back on the creative arc of The Beatles, the same is true. This is a band that recorded “I Want To Hold Your Hand,” “She Loves You” and a version of “Twist and Shout” in 1963 and four years later they were churning out “Lucy in the Sky With Diamonds” and “Strawberry Fields Forever,” followed in 1968 by the likes of “Happiness is a Warm Gun” and “Rocky Raccoon.”

But more than artistically, what could he have done as a human and a father?

Sure, his wealth, like that of many other people, could have done a lot of good in the world if he wanted it to. But it’s his relationships with his wife and his sons that really could have made the biggest impact.

Kim Kardashian may be helping get people out of jail for minor drug offenses, but it’s the everyday parenting that matters most: feeding her kids, teaching them how to use the toilet and all the things that get them ready for life (socially unusual names like Psalm, Saint, North and Chicago notwithstanding). That work might not have the widest impact, but it certainly has the deepest.


A reminder, then, that celebrities are people. I think the biggest issue for celebrities is a societal cocktail of myths composed of expectations, ownership and dehumanization.

I want to start with dehumanization. I think it applies especially to youth stars. We gave Justin Bieber, Britney Spears and Lindsay Lohan millions of dollars when they were teenagers. There are well over a thousand malls in the U.S. that don’t allow teens without supervision.

Translation: We can’t trust kids to handle themselves with $10 in a mall food court, but we’re shocked when teenagers who get more money than most people ever see before they get their driver’s licenses go off the rails?

This shows exactly that celebrities are just people. If you were a fairly well-disciplined teen who only occasionally got into a little mischief, what were the factors keeping you from going absolutely out of your mind crazy? Maybe you needed that job to pay your car insurance. You needed your parents to not ship you off to some juvenile detention center. Mrs. Murphy down the street was always watching out for who was causing trouble.

But what if you didn’t need a job or anybody to buy food or keep a roof over your head? What if millions of people admired you? And what if almost as many millions of people would cheer if you fucked up?

That’s the dehumanization of celebrities.

Expectations and ownership are somewhat related. Expectations may come regarding creative output, or they may come regarding interaction, which I think is implied ownership. Some examples.

Do you think the “Across the Universe” Beatles may have had a different fan base from the “Ticket to Ride” Beatles? I’m sure there were some people who came in at “Ticket to Ride” and stuck around for “Across the Universe,” but my guess is that any contemporary fans of both those iterations came in at “jai guru deva om” and worked their way backward to “She’s got a ticket to ride and she don’t care,” while most of the people who profess to enjoy both the early and the later sounds weren’t picking up the music at the time it came out.

That’s expectation for creative output.

The “ownership” side can be a little more tricky. Think for a minute about JD Salinger. His most famous work was published in 1951 and almost 70 years later it still sells a quarter million copies each year, even though The Catcher in the Rye is most certainly widely available at your local used book store and library.

You might remember that when he died in 2010 people wrote tributes as to how much he and his work would be missed. But at that point he hadn’t published anything new in 45 years and hadn’t given a public interview in 30 years. If you were on Twitter in 2010 to “honor” him, chances are you could barely put together a sentence the last time anybody had heard from him.

Yet the public felt such ownership over him that it entirely ignored his decades of staying out of the public eye.

Another aspect of expectation and ownership is the notion that celebrities are only celebrities because the public lets them be, and can dictate all the interaction. But a lot of being famous is about putting in hard work — just because someone else does an entirely different kind of work than you do doesn’t mean they’re not working their ass off at it. So if someone famous wants to just eat dinner with their family? No, they don’t owe you a selfie.

And if someone doesn’t want to shake your hand in the restroom? Get over it. That’s a person, not someone you own.


As the case of JD Salinger shows, it’s nearly impossible to just stop being a celebrity. According to Kurt Cobain’s suicide note, it really was not fun.

For example when we’re back stage and the lights go out and the manic roar of the crowds begins, it doesn’t affect me the way in which it did for Freddie Mercury, who seemed to love, relish in the love and adoration from the crowd, which is something I totally admire and envy. The fact is, I can’t fool you, any one of you. It simply isn’t fair to you or me. The worst crime I can think of would be to rip people off by faking it and pretending as if I’m having 100% fun.

(Yes, I’m aware there are theories that it wasn’t really a suicide.)

Treat people well. And treat celebrities like they’re people, because they are. Except Lassie and Benji, who were dogs.

Gaining some perspective

A professor in my introductory philosophy class in college used to open class with something like a brain teaser to tweak our perspective. Here’s one:

An electric train is traveling at 70 mph due west. A harsh wind is blowing 40 mph at 30 degrees east-southeast. Which direction and at what speed is the steam blowing?

Now, this is a little easier when you can read back through it, but the answer is, of course, “there is no steam; it’s an electric train.”

Another one was:

Arizona has the largest population of people with asthma. Why?

Some answered that the air must be very polluted. The real answer, though, is that the air is so clean that people with severe asthma move there in large numbers, since it’s so easy to breathe.

It’s a matter of taking a few moments, thinking about the information provided, and gaining some perspective.

Back in the first part of our series on empathy, we gave Roman Krznaric’s definition of empathy from Empathy: Why It Matters and How To Get It as “the art of stepping imaginatively into the shoes of another person, understanding their feelings and perspectives, and using that understanding to guide your actions” (p. x).


Perspective can be difficult, particularly if you systematize your world.

If it’s not obvious already, we’re using perspective to mean “the state of one’s ideas, the facts known to one, etc., in having a meaningful interrelationship.” You might be familiar with the word in terms of spatial relationships (such as making a two-dimensional drawing appear three dimensional by sending lines — such as streets and buildings — toward a vanishing point), and that’s actually how the word originates, as the “science of optics.”

The first appearance of “perspective” as a sort of mental outlook comes in 1762.

For an exercise in perspective, wander on back through history 50 years at a time. To hear interesting perspectives you may not have considered, listen to some Revisionist History.


Let’s make an argument for considering perspectives different from your first take.

First, it’s at the very least an interesting mental exercise, and we spend an awful lot of time not doing any actual thinking. Thanks, Facebook and YouTube!

Next, you don’t even have to see something from another person’s point of view. If you can step outside of your own emotional response to an event or an assertion, it shows you have control over your emotions and your thoughts. Not in a way that shows you don’t feel, but in a way that shows you’re not ruled by emotion, which is particularly important when you’re in a potentially dangerous situation, especially if you’re responsible in such a situation for a family or other group.

Being able to consider various perspectives gives you power, plain and simple.

Let’s now consider seeing something from other people’s perspectives.

You might learn something, particularly about context. I’m going to pick a little bit on President Trump right here (I know, been a while since I’ve done that), in particular with two things, once during the campaign, and once more recently.

The first is “America First.” We actually wrote about this a few years ago in a discussion of context. The America First Committee shut down when the U.S. entered World War II.

On December 7, 1941, the Japanese Navy pulled a surprise attack on the U.S. naval base at Pearl Harbor in the Hawaiian Territory (Hawaii was not yet a state). The attack marked America’s entry into World War II.
 
It also marked the beginning of the end of the America First Committee, a large anti-war group that shut down on December 10 of that year.
 
This wasn’t the tie-dyed hippie peace, love and understanding anti-war movement we all know from movies about Vietnam. And it wasn’t the “hate the war, love the troops” anti-war groups we know from the more recent American wars.
 
This was a “we’re white Protestant Americans, screw everybody else” group. They were hard-line isolationists. They wanted to make sure America didn’t bail out Europe (you know, again, like after the first World War). They wanted America to turn away Jews fleeing the Holocaust. They wanted to shut the borders, cut off aid, and rely on homegrown everything — avoid all international trade as long as possible.

Meanwhile, in a speech this past spring, Trump told a room full of Jewish Republicans that America is full (he also called Benjamin Netanyahu their prime minister, even though Netanyahu is Israel’s prime minister and the people he was speaking to were Americans).

In 1939, the MS St. Louis, a ship carrying more than 900 Jewish refugees from Europe, showed up on American shores. They were told America was full, and they were sent back to Europe, where hundreds of them would become Holocaust victims.

And, in reality, America can’t possibly be full. They’re building 31 new houses two blocks away from me. If America weren’t taking new people, the only people who could go into those houses would be vacating 31 other houses, so there’s certainly room for 31 more families. Just saying.

If you have the ability to see the world from someone else’s perspective, it makes it pretty easy to not hurt someone intentionally.

And another reason to consider others’ perspectives is to get an understanding of their communicated intent. Communication is a two-way street, so when you are on the receiving end, maybe take a minute and determine what the person you’re listening to really meant when they said something.

Plant parents, ‘first world problems’ and human improvement

I was sitting at work, reading through the wire, and came across the headline, “Millennials fall in love with houseplants.”

I was worried for a moment that this was going to be about some moron who wanted to push legislation to allow us to marry our houseplants. Thankfully, we’re still better than that. The real story is that millennials are reviving the market for houseplants.

It got me thinking, though, about all the stuff we’ve blamed on millennials. Here’s the thing: it’s our fault, those of us who are members of Gen X, who are Boomers, and further down the line. We do it on purpose, and every generation hates what they allowed the next generation to do.

Let’s first be clear on whom we’re talking about. Millennials are not teenagers. As I write this in early 2019, some millennials are approaching 40. Depending upon whom you ask, millennials are the cohort born between 1980 and 1994; I’m going to use the cohort studied in social science, defined as those born between 1981 and 1996.

That means that right now, the millennial generation (so called because its members were coming of age around the turn of the millennium) comprises people between 22 (turning 23 this year) and 38 years old. Those are people at a fairly wide range of points in their lives, from within the first five years of their careers to billionaire entrepreneurs.

It’s also a somewhat split generation, technologically. I’m a late Xer, born in 1976. I first got the internet in college; it was dial-up and we didn’t have Mosaic available, so the web was image-less. An older millennial who got the internet at the same time was getting it in middle school and by the time they got to college, most universities had some sort of T-1 connection and images were easy to download.

Millennials born later, however, got high-speed internet in their homes before they went to high school. The youngest millennials got high-speed internet in their homes before they were in kindergarten.

That’s a huge technological advance, and it’s rare.

Think about the technology someone born in 1990 had by the time they were in high school versus the technology someone born in 1940 had by the time they were in high school.

The iPhone came out in 2007. By 1955, half of all homes had a black-and-white television.

Imagine if we’d just skipped animal-drawn plows and gone straight from hand-tilling to big commercial tractors, in the course of a generation and a half or so. That’s what happened from young Boomers and older Xers to millennials.

If every generation shakes their heads and says, “kids these days,” this is a leap beyond the typical gaps in music, clothing and pastimes.

My wife’s grandmother died in 2015 at the age of 99. She went around with her physician father in a horse and buggy making house calls in Syracuse when she was a little girl. In her later years, she sat in an easy chair getting a tour of her grandson’s house in Japan on a computer screen while they had a nice chat.

We are in a weird time.


Do a Twitter search for #FirstWorldProblems. We’re not trying to cure polio anymore. We’re not looking for a solution for all the horse scat on our city streets. We’re not trying to find a reasonable place to dig a hole for our waste that will be far enough away from our homes and wells that we’re not going to get sick but close enough that we’re not so worried about bears.

On the one hand, there’s, “Really? This is what you complain about?” On the other hand, there’s, “Hey, we don’t have any real problems, so we’re just going to make up stuff to complain about.”

Look, normally, we innovate incrementally. But occasionally, we get this huge jump. At the end of the 19th century, the biggest technological problem we faced was a more efficient way to get horse feces off the streets of urban areas. So many people traveled by horse and buggy that it was hard to keep your boots clean while walking, the smell was crazy, and disease was everywhere, especially when it rained.

Some people tried inventing new things to collect and dispose of the fecal matter.

But then someone invented the car, and people largely stopped traveling by horse and buggy.


We used to think no human could run a sub-four-minute mile. Then, in 1954, a British runner named Roger Bannister did it. His record lasted about a month and a half.

Over the past decade, more than a dozen U.S. runners have broken four minutes in the mile every year.

We are hugely adaptable to any challenge put in front of us. Read The Rise of Superman by Steven Kotler. As soon as we see something accomplished we previously thought was impossible, lots of people manage the feat and improve upon it.


So what happens when we’re not sure what needs accomplishing?

The problems millennials were told they’d have to solve while they were growing up are not the problems they’ll have to solve at all.

Which means they’ll either tackle problems we hadn’t thought of yet, or they’ll create problems to tackle.

I don’t think this is a generation that will get much satisfaction out of incremental improvement. People in this generation are making millions on Instagram and YouTube. They’re making billions creating things like Facebook and Snapchat.

A booming houseplant industry isn’t the only thing coming out of the latest installment of, “Hey, you! Kids! Get off my lawn!”

And the generation that’s grown up since, entirely in the internet world? We’re not going to recognize people or the planet when Generation Z has matured and shown its identity.

Nobody stops progress; we either get on the train or get run over by it.

From empathy to kindness and compassion

We said in the first installment in this series on empathy that we’re at least as interested in kindness and compassion as we are in empathy, and a lot of the arguments in this installment grow out of the notion that we should put together moral codes and systems that result in kindness and compassion, without requiring of people the neural work of empathy.

As we mentioned last time, religions and governments are two types of institutions that attempt to instill such codes, but they frequently do so in a manner that sets up us-versus-them scenarios: if you don’t subscribe to my faith, my attempt to impose a moral code on you based on that faith simply makes you angry.

Similarly, one need look no farther than our own Congress here in the U.S. to see an us-versus-them mentality on even the smallest thing, never mind if we were to try to instill moral rules instead of merely a code of governance.

First off, why even have a code? In her book Ordinary Grace, Kathleen Brehony relates an old Middle Eastern saying: “Trust Allah, but tie your camel to a post” (p. 174).

In other words, trust in a benevolent power, but recognize there are still assholes around.


The arguments against empathy as a moral code largely boil down to empathy not being enough. Of course it’s not. Nothing is; if we only needed one thing, we wouldn’t have had to build a whole system.

We do know from the first part of this series that empathy is a specific response, both emotional and physical, that can be identified in the brain.

“Some people use empathy as referring to everything good,” writes Paul Bloom in Against Empathy, “as a synonym for morality and kindness and compassion. And many of the pleas that people make for more empathy just express the view that it would be better if we were nicer to one another” (p. 3).

Well, yes, it would be better if we were nicer to one another. That’s the whole point. But we can’t force empathy. Some people just don’t have it, as we saw with James Fallon in The Psychopath Inside.

That’s why we create systems.

But, Bloom writes, a system based on empathy is not enough; it doesn’t reach enough people. We can empathize with one or two people simultaneously, he points out, but not millions. “This perverse moral mathematics is part of the reason why governments and individuals care more about a little girl stuck in a well than about events that will affect millions or billions” (p. 34).

In his book Practical Wisdom, Barry Schwartz argues for what he calls practical wisdom, and notes that empathy is part of the schema.

[E]mpathy — the capacity to imagine what someone else is thinking and feeling — is critical for the perception that practical wisdom demands. Such empathy includes both cognitive skill — the ability to perceive the situation as it is perceived by another — and emotional skill — the capacity to understand what another person is feeling (p. 21).

Ultimately, empathy is an emotion, and it can take away our ability to make good decisions. “Feelings are compelling,” write Greg Lukianoff and Jonathan Haidt in The Coddling of the American Mind, “but not always reliable. Often, they distort reality, deprive us of insight, and needlessly damage our relationships” (p. 34).

Schwartz describes a hospital janitor whose job description is so focused on cleaning that it doesn’t even mention people. The janitor, however, understands he’s also there to make patients and their families welcome. It’s his job to vacuum the lounge, but maybe he deviates from his normal path because a family who has been there all day is napping. That’s empathy at play in the right way — the janitor’s emotional quotient is high enough that he is comfortable diverting from what his bosses want.

On the other hand, Schwartz describes a doctor who had such empathy for an elderly cancer patient that he spared the patient the indignity of rolling him over to check for bed sores, and the patient went into septic shock. The episode showed too much empathy and not enough detachment.


So maybe empathy isn’t always good?

As we mentioned near the end of the post on systematizing moral codes, humans can make exceptions that machines can’t. A robot would have saved the cancer patient’s life by checking for bed sores, because that’s what its programming said to do. But the robot also would have vacuumed the lounge, waking the family who was finally getting some rest.

“There is a long history of suspicion that emotion is the enemy of good reasoning and sound judgment, and rightly so,” Schwartz notes. “Emotions can often control us instead of the reverse” (p. 21).

But emotion isn’t all bad, he points out later. “Practical wisdom is not simply knowing the right thing to do but actually being motivated to do it. And often it is emotion that propels us to act” (p. 75).

Whatever it is that propels us to act, we still need to determine what it is we should act upon — what, exactly, are the actions we should take?

In The Moral Arc, Michael Shermer writes about morality.

Morality involves how we think and act toward other sentient beings and whether our thoughts and actions are right (good) or wrong (bad) with regard to their survival and flourishing. … [A] principle of moral good is this: if other persons are involved in an action, then always act with their good in mind, and never act in a way that it leads to their loss or suffering (through force or fraud) (pp. 334-5).

Bloom argues that morality is so important to us it seems to have evolved to be an ingrained feature of humans. “There is a lot of evidence,” he writes, “that the foundations of morality have evolved through the process of natural selection. We didn’t think them up” (p. 6).

Schwartz calls his practical wisdom a “moral skill” (p. 8).

Bloom does point out some problems with empathy that address its specificity. “Empathy causes us to overrate present costs and underrate future costs,” he writes (p. 55), pointing to the murder rate in Chicago.


Way more schoolchildren are killed every year in Chicago than were killed at Sandy Hook. Donations poured into Newtown after the shooting despite the fact that it’s a wealthy community and they had no place to put all the stuff and asked people to stop donating, but most people don’t send anything to Chicago (p. 32).

Humans also, as a species, create tribes, and we are more likely to support those in our own tribe. “Intellectually,” Bloom writes, “a white American might believe that a black person matters just as much as a white person, but he or she will typically find it a lot easier to empathize with the plight of the latter than the former” (p. 31).

Empathy isn’t the only thing we need to make good moral choices, he argues. “There is more to kindness and morality than empathy. To think otherwise is either to define empathy so broadly as to gut it of all context” (p. 26).

“Consider things like compassion and concern,” he writes. “You don’t empathize with people dying of malaria but you certainly feel compassion or concern for them” (pp. 40-41).

Writing in Enlightenment Now, Steven Pinker considers the wisdom of crowds. If you were to ask a hundred people how many jelly beans were in a jar or how much a prize pig weighed, you’d get a wide range of answers, but you would find that when you averaged out the answers, you’d be pretty close to the correct figure.
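The jelly-bean claim is easy to check numerically. Here’s a minimal simulation, assuming a hypothetical jar of 850 beans and guessers whose errors are roughly bell-shaped; both numbers are invented for illustration, not taken from Pinker:

```python
import random
import statistics

random.seed(42)  # fixed seed so the illustration is reproducible

TRUE_COUNT = 850  # hypothetical number of jelly beans in the jar

# Each guesser is individually noisy: off by a couple hundred either way.
guesses = [TRUE_COUNT + random.gauss(0, 250) for _ in range(100)]

crowd_estimate = statistics.mean(guesses)
avg_individual_error = statistics.mean(abs(g - TRUE_COUNT) for g in guesses)
crowd_error = abs(crowd_estimate - TRUE_COUNT)

print(f"crowd estimate: {crowd_estimate:.0f} (error {crowd_error:.0f})")
print(f"typical individual error: {avg_individual_error:.0f}")
```

With independent, unbiased errors, the average of n guesses has roughly 1/√n the spread of any single guess, which is why the crowd estimate lands far closer to the true count than the typical individual does.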

The wisdom of crowds can also elevate our moral sentiments. When a wide enough circle of people confer on how to treat each other, the conversation is bound to go in certain directions … we’d be wiser to negotiate a social contract that puts us in a positive-sum game: neither gets to harm the other, and both are encouraged to help the other (p. 28).

If we’ve done that down the generations, perhaps Bloom’s assertion that some morality has evolved with us through natural selection is true.

“For all the flaws in human nature,” Pinker continues, “it contains the seeds of its own improvement, as long as it comes up with norms and institutions that channel interests into universal benefits” (p. 28).

In other words, we don’t need to set up all these us-versus-them scenarios. It benefits us to create win-win propositions to move humanity forward.

In his manifesto Team Human, Douglas Rushkoff really puts forth why we might want to keep pushing win-win solutions:

We cannot be fully human alone. Anything that brings us together fosters our humanity. Likewise, anything that separates us makes us less human, and less able to exercise our individual or collective will (pp. 3-4).

Species in the wild have grown to cooperate, and that’s how they survive. Maybe we’ll learn that for ourselves, too, whether it means moving toward cooperation through empathy, or through something more akin to kindness and compassion.
