Fascinating Facts and Trivia.

It was the work of a couple of cartographers:

We call ourselves Americans today because of the map’s makers, Martin Waldseemüller and Mathias Ringmann, young clerics in the cathedral village of St.-Dié, France. By incorporating early New World discoveries, their map reached beyond the canonical descriptions of Old World geography handed down from Ptolemy in the second century. On a lower stretch of the southern continent, the mapmakers inscribed the name “America” in the mistaken belief that Amerigo Vespucci, not Columbus, deserved credit for first sighting a part of that continent, South America.

Or possibly they favored Vespucci because he held more firmly to the growing consensus that this was indeed a New World, not the Indies (as Columbus so wanted to believe), and because he wrote more colorfully than Columbus about the people he encountered.

Writing in The American, Reuven Brenner argues that we need to reject the traditional four-year model for a college degree:

There are at least 16 million youngsters enrolled in post-secondary education, with approximately 4 million graduating every year. Assume that from now on, each year, 4 million students join the labor force a year earlier. Each generation would stay one year longer in the labor force. How much annual income and how much wealth would this generate?

Assume that after graduation the average salary would be just $20,000 and remain there. With 4 million students finishing one year earlier, this would add $80 billion to the national income during that year. Or at an average annual income of $40,000, it would add $160 billion. Assume now that the additional $80 billion in national income would be compounding at 7 percent over the next 40 years. This would then amount to an additional $1.2 trillion of wealth – for just one generation of 4 million students joining the labor force a year earlier at a $20,000 salary. At $40,000, this would amount to $2.4 trillion by the fortieth year – again, for just one generation of 4 million people joining the labor force a year earlier. The added wealth depends on how rosy one makes the assumptions about salaries or compounding rates. Add 10, 20, or 30 generations, each starting to work a year earlier, and the numbers run into the tens of trillions of dollars.
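Brenner's compounding arithmetic can be checked with a few lines of code. The sketch below is illustrative only; the function name and structure are mine, while the inputs (4 million graduates, $20,000 or $40,000 salaries, 7 percent compounding over 40 years) are the assumptions stated in the excerpt.

```python
# Back-of-the-envelope check of Brenner's numbers: one cohort of graduates
# joins the labor force a year earlier, adding a one-time bump to national
# income that then compounds for 40 years.

def added_wealth(graduates, salary, rate=0.07, years=40):
    """Future value of the extra income from one cohort working a year earlier."""
    income = graduates * salary          # immediate addition to national income
    return income * (1 + rate) ** years  # compounded over `years` at `rate`

income_20k = 4_000_000 * 20_000
print(f"${income_20k / 1e9:.0f} billion immediate income at a $20,000 salary")
print(f"${added_wealth(4_000_000, 20_000) / 1e12:.1f} trillion after 40 years at 7%")
print(f"${added_wealth(4_000_000, 40_000) / 1e12:.1f} trillion at a $40,000 salary")
```

Running it reproduces the article's figures: $80 billion in immediate income, about $1.2 trillion after 40 years of 7 percent compounding, and about $2.4 trillion at the higher salary.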

Advocates for universal health care are often heard declaring that health care should be a “universal right.” Avik Roy, at Forbes, argues that whether one considers health care a positive or negative right, the claim is problematic:

It’s a great applause line, isn’t it, to say that “health care is a universal human right.”

But after the applause has died down, we’re left with the question that the left rarely takes time to answer: what is health care?

Let’s say there’s a new treatment for terminal prostate cancer, one that extends your life, on average, by two months. The treatment costs one million dollars per patient. Does every American have a right to that treatment? Is two months of life worth a million dollars?

What if I smoke two packs a day, and I come down with chronic obstructive pulmonary disease, a costly chronic condition? Do I have a right to the money of other people, in order to care for a disease that I, in all likelihood, brought upon myself?

A progressive might respond that we need to provide basic health care to everyone, so that no one is left dying on the street after getting hit by a bus. But we already provide “free” emergency care to every American. So what else counts as basic health care? Is Viagra health care? Is all health care a right, or just some? And who decides? These are the questions that no applause line can adequately answer.

While Descartes famously argued that animals “eat without pleasure, cry without pain, grow without knowing it; they desire nothing, fear nothing, know nothing,” modern scientific research is demonstrating that they may have much more cognitive capacity than we have recognized. John Jeremiah Sullivan explains in Lapham’s Quarterly:

New animal behaviors and capacities are observed in the wild, often involving tool use—or at least object manipulation—the very kinds of activity that led the distinguished zoologist Donald R. Griffin to found the field of cognitive ethology (animal thinking) in 1978: octopuses piling stones in front of their hideyholes, to name one recent example; or dolphins fitting marine sponges to their beaks in order to dig for food on the seabed; or wasps using small stones to smooth the sand around their egg chambers, concealing them from predators. At the same time neurobiologists have been finding that the physical structures in our own brains most commonly held responsible for consciousness are not as rare in the animal kingdom as had been assumed. Indeed they are common. All of this work and discovery appeared to reach a kind of crescendo last summer, when an international group of prominent neuroscientists meeting at the University of Cambridge issued “The Cambridge Declaration on Consciousness in Non-Human Animals,” a document stating that “humans are not unique in possessing the neurological substrates that generate consciousness.” It goes further to conclude that numerous documented animal behaviors must be considered “consistent with experienced feeling states.”

Artist Stefen Chow and economist Lin Hui-Yi wanted to visually represent what poverty means in different parts of the world, so they created a project called The Poverty Line, which offers a “visual portrayal of items found in that country that could be bought by a person living at the poverty line.”

For developed countries, where there is relatively updated household consumption data, we focus on the average daily amount that a person at the poverty line would spend on food. For developing countries, we use the average daily amount that a person at the poverty line earns/spends. We have faced challenges in trying to develop a calculation method that makes sense across different countries’ systems, and this is our way of bringing it together.