Fascinating Facts and Trivia

Professor Christy Wampole notes that the essay form has become very personal:

Essayism consists in a self-absorbed subject feeling around life, exercising what Theodor Adorno called the “essay’s groping intention,” approaching everything tentatively and with short attention, drawing analogies between the particular and the universal. Banal, everyday phenomena — what we eat, things upon which we stumble, things that Pinterest us — rub elbows implicitly with the Big Questions: What are the implications of the human experience? What is the meaning of life? Why something rather than nothing? Like the Father of the Essay, we let the mind and body flit from thing to thing, clicking around from mental hyperlink to mental hyperlink: if Montaigne were alive today, maybe he too would be diagnosed with A.D.H.D.

The essayist is interested in thinking about himself thinking about things. We believe our opinions on everything from politics to pizza parlors to be of great import. This explains our generosity in volunteering them to complete strangers. And as D.I.Y. culture finds its own language today, we can recognize in it Arthur Benson’s dictum from 1922 that, “An essay is a thing which someone does himself.”

Anyone who has read even a little science fiction is familiar with one of the most ominous threats to space exploration: the idea that an astronaut will lose his mind and attack other astronauts or destroy the mission itself. It’s a real enough threat that NASA is developing technology to prevent emotional meltdowns before they happen. Katie Drummond reports:

NASA is conducting its own research on the issue. Last week, the agency handed out a $1.3 million contract to psychologists at Michigan State University to further the development of a psychosocial sensing “badge” that astronauts would wear during their mission to the red planet. The pocket-sized badges, says project leader Steve Kozlowski, PhD, will be designed to track physiological markers of an astronaut’s psychological health — like blood pressure and heart rate — as well as the dynamics of their social interactions. “You can never ensure that nothing bad will happen,” Kozlowski said. “But a coherent means of assessing interactions and stress … is one way to protect against any negative outcomes.”

Are there pencils other than the No. 2? According to Mental Floss, yes:

Almost every syllabus, teacher and standardized test points to the ubiquitous No. 2 pencil, but are there other choices out there? Of course.

Pencil makers manufacture No. 1, 2, 2½, 3, and 4 pencils—and sometimes other intermediate numbers. The higher the number, the harder the lead and lighter the markings. (No. 1 pencils produce darker markings, which are sometimes preferred by people working in publishing.)

The current style of production is modeled after pencils developed in 1794 by Nicolas-Jacques Conté. Before Conté, pencil hardness varied from location to location and maker to maker. The earliest pencils were made by filling a wood shaft with raw graphite, which led to the need for a recognized, trade-wide method of production.

Writing in the New Yorker, Sam Sacks argues that debates about enduring literature are less about the artistic merit of the works themselves and more about the social issues they raise:

A look through the Classics section of bookstores—in America or any of the Western democracies—bears out de Tocqueville’s instincts. The offerings are wide-ranging, tilting toward diversity and inclusion. But, more to the point, artistic brilliance is no longer the most important determining factor. What makes a classic today is cultural significance. Authors are anointed not because they are great (although many of them are) but because they are important.

In other words, the current criteria for classics are more a matter of sociology than of aesthetics. That’s why prose-toilers like George Orwell and Aldous Huxley are securely fixed in the canon while masters such as Frank O’Connor and Eudora Welty could easily be left out. “1984” and “Brave New World” are embedded in the weave of language and history, but what does Welty have going for her apart from stylistic perfection? Henry Miller survives—and will continue to survive—because the country once found him shocking enough to censor. (Likewise, D. H. Lawrence might very well be a footnote if not for “Lady Chatterley’s Lover.”) There’s better prose in the average issue of Consumer Reports than in most Upton Sinclair novels, but “The Jungle” triggered actual legislative reform and will therefore last as long as the United States does.

Farai Chideya, writing in The Nation, argues that American media is dangerously white and elite:

When I was a kid, my family loved watching science fiction films and television shows. Some of them, from Star Trek to Soylent Green, featured a multiracial band of humans, plus various sentient life forms. But in other features—let’s say the awesomely campy Logan’s Run—everyone (or nearly) in the future was white. My family suspended disbelief for the duration of the movie. Then, depending on our mood, we either laughed at or lamented the idea that anyone thought the future would be monochrome, except for the pantsuits.

Today I feel like I’m watching that movie all over again. This time, it’s called The Future of Journalism, and we can’t afford to suspend our disbelief….

A report by the Radio Television Digital News Association, meanwhile, found that in 2011, when 35.4 percent of Americans were considered “minorities,” only 20.5 percent of those employed in television were people of color, and, shockingly, only 7.1 percent of radio employees—a sharp drop in that medium since 1990.