Author: Dan Lewis
A really interesting tidbit buttresses this theory: If your fingers’ nerves are somehow severed from the rest of your nervous system (and therefore your brain), they no longer prune up in water. This almost certainly means that the pruning effect is triggered by the brain and is not simply related to absorption.
But other wrinkles (pardon the pun) remain. So far, we have not been able to determine whether pruney fingers do, in fact, help us grip wet things better. Also, there is demonstrable evidence that the processes behind the first theory—blood vessels contracting, skin absorbing water, etc.—do occur, which again suggests that there is a lot more going on there than we know.
BONUS FACT
Water can be used to check patients for brain damage, using something called the caloric reflex test. Typically, when cold water is introduced into a person’s ear canal, his or her eyes will reflexively “look” toward the opposite ear. But when warm water is put in the person’s ear, he or she will “look” toward the ear with the water in it. People with significant damage to the brain stem do not have the same reaction.
Go to a mirror and look at either of your eyes. Then, while keeping your head still, look at the other one. As you do this, your gaze will change targets, as you are now looking at something different than before. But your eyes will not appear to move.
Now, go find a friend and repeat the experiment. Ask him or her to tell you if your eyes move as you glance from one eye to the other. Invariably, your friend will tell you that your eyes did indeed move—and obviously so. Switch roles and the illusion becomes obvious: Your friend, staring into the mirror, is moving his or her eyes—but unlike the rest of the world, sees no movement.
What’s going on here? Our brains are protecting themselves from the fuzzy, blurry imagery we’d otherwise “see” as our eyes glance quickly from point to point. That movement—called a “saccade” (pronounced “sah-COD”)—is simply too quick for our brains to deal with. So the brain, in effect, ignores what the eyes see, in a phenomenon called “saccadic masking.” Instead of processing and recording the blurred image otherwise caused by the eye movement, the brain replaces that milliseconds-long moment with a still image of the second item your eyes look at. This image replacement can create an eerie effect if you quickly dart your eyes at an analog clock, causing the clock’s second hand to appear momentarily frozen in time (known as the “stopped clock effect”).
During these saccadic masking moments, we are, effectively, blind. According to some, these tiny moments of time lost down the memory hole add up to as much as thirty to forty-five minutes a day—out of the 1,440 minutes in a day, that works out to roughly 2 to 3 percent of our lives spent temporarily blind.
BONUS FACT
The eyes of most birds do not move. In order to keep their world from bouncing around as they move, these birds have developed the ability to keep their heads in the same place, relative to the rest of the world, even if the rest of their bodies are in motion. That’s why chickens, turkeys, pigeons, and other birds bob their heads as they walk—they’re trying to keep their eyes steady relative to the ground. It also helps them with depth perception. Turkeys, for example, have eyes on opposite sides of their heads, and therefore have no natural 3-D vision; the bobbing provides extra visual information so they can estimate relative distances. This does not mean they have worse vision than we humans do, however. Turkeys can turn their necks much farther than people can, allowing them to see a full 360 degrees around.
Every November, many American families gather around the table, feasting on a Thanksgiving meal—the centerpiece of which is a turkey. It’s a celebration of many things, but it dates back to 1621, when European settlers (“Pilgrims,” as American elementary school children will surely tell you) marked the harvest with a similar celebration.
Turkeys are indigenous to what is now the United States and Mexico; in fact, Europeans first came into contact with turkeys roughly 500 years ago, upon their discovery of the New World. So how did the turkey (the bird) end up with the same name as Turkey (the country)? Let’s follow that bird’s history from the New World to the Old.
As far as we can tell, the first European explorers to discover (and eat) turkey were those in Hernán Cortés’s expedition in Mexico in 1519. Spanish conquistadors brought this new delicacy back to Europe, and by 1524 it had reached England. The bird was domesticated in England within a decade, and by the turn of the century, its name—“turkey”—had entered the English language. Case in point: William Shakespeare used the term in Twelfth Night, believed to have been written in 1601 or 1602. The lack of context around his usage suggests that the term had widespread reach.
But the birds did not come directly from the New World to England; rather, they came via merchant ships from the eastern Mediterranean Sea. Those merchants were called “Turkey merchants,” as much of that region was part of the Turkish Empire at the time. Purchasers of the birds back home in England thought the fowl came from the area, hence the name “Turkey birds” or, soon thereafter, “turkeys.” To this day, we’re simply carrying on the mistake of a few confused English-speaking Europeans.
But not all languages follow this misconception. Others, such as Hebrew, get the origin just as wrong, but in the other direction. The Hebrew term for turkey, transliterated as tarnagol hodu, literally translates to “chicken of India,” furthering the Elizabethan-era myth that New World explorers had found a route to the Orient. This nomenclature for the bird is so widespread that it makes a mockery of the historical basis for the term “turkey” in English. Why? Because the Turkish word for turkey isn’t “turkey.” It’s “hindi.”
BONUS FACT
As for Turkey, the country? The story isn’t as interesting. The word Turkey—actually, Türkiye in Turkish—can be broken up into two parts. “Türk” is a reference to people, potentially meaning “human beings” in an archaic version of the Turkish language. The “-iye” suffix most likely meant “land of.”
At $10–15 a pound, lobster is priced too high to be anything other than a delicacy. Even if you bring home the crustacean alive and cook and prepare it yourself, it is going to be an expensive meal. So for most Americans, lobster is a menu item reserved for a special occasion. Weekly would be a lot; twice weekly is out of the question. Only in a dream, perhaps.
Or, if you were a rather poor worker in the early part of the nineteenth century, a nightmare.
Lobsters are very plentiful in coastal New England, particularly in Maine and Massachusetts. The Pilgrims likely dined on lobster at the first Thanksgiving, and there are tales of two-foot-high piles of lobsters simply washing up on the shores during that time period. And where things are plentiful, they’re often cheap. Today, one can find high-quality lobster in Maine at about $5 a pound—much less than the going rate anywhere else. But from the 1600s through much of the 1800s, Maine and the surrounding coastal areas were the only places one could reasonably find fresh-cooked lobster. As reported by the late David Foster Wallace in the similarly deceased Gourmet magazine, before we had the infrastructure and equipment required to ship live lobsters around the country (and later the world), the crustaceans were killed before they were cooked, just like almost any other animal. Precooked lobster meat in hermetically sealed cans doesn’t taste very good; Wallace noted that the protein-rich meat was used as “chewable fuel,” not as the culinary draw we think of today.
Massive quantities of cheap food that doesn’t taste very good … that’s not a recipe for thrilling dinner guests. It is, however, a solution to other problems—such as, how do you feed prison inmates or indentured servants? That’s exactly where the lobster meat ended up. Wallace says that some states had rules insisting that inmates not be fed canned lobster meat more than once per week, and other sources note that indentured servants often demanded that their contracts limit the lobster meals to no more than twice weekly.
Once lobsters could be transported, alive, over long distances, the Maine lobster canneries began closing up shop. In the 1880s, lobster started to become a much-sought-after entree in Boston and New York, and over the next few decades, that custom spread across the country. By World War II, lobster was a high-priced treat—officially. Most foods were subject to wartime rationing, but not lobster because of its designation as a delicacy.
BONUS FACT
Generally speaking, expensive versions of a food item taste better than their cheaper counterparts. (After all, that’s why they’re more expensive.) Lobster—the cooked-live variety—is the exact opposite. Lobster quality is rated based on the hardness of the shell: lobsters that have recently shed their old shells (and are growing new ones) typically have the sweetest and most desirable meat. Unfortunately, they also have the least meat and are the hardest to transport, because their new shells are still so soft. So these lobsters stay local to New England and are served only in places where lobster is relatively common. Lower-quality lobsters, with harder shells and more meat, are shipped around the world and are the only lobsters offered to those faraway, captive markets. Because of this, a lobster in Europe will almost certainly be of lower quality than one in Maine, but will cost up to ten times as much.
The Internet. The automobile. Toilet paper. All these have been heralded as the greatest things since sliced bread. Which means that sliced bread, itself, has to be a pretty amazing thing.
Turns out, it is. But in order for us to collectively learn that lesson, the American government had to ban it.
Sliced bread—machine-sliced, that is—was almost invented in 1912, when a man named Otto Frederick Rohwedder came up with a prototype and blueprints for an automated bread-slicing machine but lost his work to a fire. Undeterred, Rohwedder rebuilt the machine held in his mind’s eye. In 1928, he had a machine up and running, and by July of that year, sliced bread was being machine-produced for the masses. The marketing behind the product set the stage for the expression used today, as Rohwedder’s bread was advertised as “the greatest forward step in the baking industry since bread was wrapped.” By 1930, Wonder Bread had switched to a sliced-bread product, selling machine-sliced bread nationwide for the first time.
As expected, sliced bread caused bread consumption to increase. But when World War II rationing came to the forefront of the American economic war effort, Food Administrator Claude Wickard targeted the greatest invention of the century (so to speak). Hoping to reduce the amount of wax paper used in general—because Wickard presumed, for some reason, that sliced bread required more wax paper than its unsliced counterpart—and also hoping to reduce bread prices, he banned the sale of sliced bread domestically, effective January 18, 1943. The ban was immediately met with protestations from housewives, who argued that their household efficiency was being crippled by a short-sighted rule. Because, after all, sliced bread was the greatest thing in recent memory.
Local politicians took the matter into their own hands. New York City mayor Fiorello LaGuardia noted that bakeries could use their own bread-slicing machines, selling the freshly baked (and sliced) product directly to their customers. But the Food Distribution Administration put the kibosh on that several days later, requiring the cessation of any commercial bread slicing in order to protect those bakeries that either did not have bread-slicing machinery or wished to do their all for the country’s war effort.
In any event, the ban was short-lived. On March 8, 1943, sliced bread was once again allowed in the United States, its greatness preserved for generations present and future.
BONUS FACT
Toilet paper is not the greatest thing since sliced bread—it can’t be, because toilet paper predates sliced bread by more than fifty years. Commercial toilet paper was invented in 1857 by a New Yorker named Joseph Gayetty, who sold packs of 500 sheets (each containing a watermark with his name) for fifty cents. Its marketing language called the product “the greatest necessity of the age,” so perhaps sliced bread is the greatest thing since toilet paper.
Go to any grocery store bread aisle and you’ll find—one hopes!—bread. Most of the bread does not just sit on the racks as is; typically, the loaves are wrapped in bags, held shut with a twist tie or a plastic tag. And you may notice that many of those ties and tags are colored—blue, orange, green, or a litany of other hues. In many cases, the colors vary even within the same brand; the shelf of Wonder Bread may have tags of five different colors.
Laziness? Rampant colorblindness in the factory? Or maybe bread makers just don’t care? Nope. For some, it’s a quality assurance tactic.
For more than a decade, Internet folklore claimed that the tags were quick visual clues that indicated the day of the week the bread was baked. The urban legend held that stock clerks could easily identify loaves that were no longer fresh by looking for tags of a certain color. This would save a lot of time, as manually checking expiration dates is labor-intensive. In theory, so the legend goes, bread makers used these ties to make the supermarkets’ jobs easier, thus making it much less likely that a customer would have a bad experience.
According to urban legend fact-checker Snopes, this piece of Internet folklore is—a rarity!—mostly true. Many bread manufacturers use different color tags each day, in order to help ensure that what reaches the end consumer is of high quality.
But there is no need to try to crack the code—and, in fact, it probably isn’t possible for the average consumer, because there isn’t only one system. Although news reports about the bread tags (and, even more often, the e-mails forwarded around) suggest that savvy consumers can avoid getting a stale loaf simply by paying attention to the tags, that isn’t the case. In general, the color system is intended for the supermarkets, whose employees should be removing the old bread before it goes stale. Moreover, the color-coding system is not standardized across brands; each manufacturer can adopt its own system, if it adopts one at all. For example, when a CBS San Francisco reporter followed up on the Snopes report, she found that at least one company simply printed the expiration date on its (always light blue) tags. As a consumer, therefore, you’re better off simply checking the sell-by date.