The Splendour of the Ordinary

Spending a lot of time with museum collections and history, as I’m lucky enough to be allowed to, means grappling with the surprising timeline of inventions and “normalisations” (the point when using something that’s been invented becomes normal). The gap between the two can be large: the first video call was made in 1927, for example, but it wasn’t until the 2010s that Skyping became something unremarkable. It also means coming to appreciate just how much difference developments do and don’t make to the way people live their lives.

There’s an old adage that if something was common when you were growing up, then that’s what’s “normal” forever; for people growing up in a period of constant, high-speed technological change, that “normal” might be the change itself rather than the technology. For people growing up before portable, instant communication methods, that “normal” might just be “not being available”.

This is one of the reasons the Science Museum’s Making the Modern World gallery is cool. It takes things that each generation of us thinks of as normal and presents them as inventions, as museum exhibits to be looked at and thought about differently: when did they become normal, everyday things? What was it like when they were first becoming normal?

It’s not just the present ordinary, either, but things that were once ordinary and have since become archaic…

Model of a contraceptive pill, unknown maker, c. 1970

The results of endocrinological research seem so commonplace that it’s hard for younger generations to fully get their heads around the idea that as recently as the 1950s, the only really effective form of birth control was the condom, reliant wholly on proper use and on people with penises agreeing to use it; there was no functional method of stopping periods, and no hormone therapy for menopausal people. It’s often said that the contraceptive pill sparked the (heterosexual) sexual revolution (it might be better understood as part of the overall landscape) by placing control over pregnancy firmly in the hands of the people getting pregnant.

The truth is that hormones (the endocrine system) remain a profoundly mysterious area not only to non-scientists (and journalists in particular) but to medicine itself; their operation is complex and wide-reaching, and they’re subject to a lot of urban legend. With increasing numbers of transgender people feeling safe enough to use hormone replacement therapy to achieve comfort and physical and mental wellbeing, it’s possible that research will broaden too, both into sex hormones (testosterone, oestrogen and progesterone) and into the many, many other hormones that go into making the human body work and grow.

Roadside advertising board for mobile telephone repair shop, unknown maker, 2012

While mobile phone technology has progressed a long way in less than one human lifetime, from “only radios can do that and only over a short distance” through to “I’m just going to video call my friends in Australia from Edinburgh, hold on”, the need to repair goods has been around forever. Many thousands of people throughout history have made a good living out of fixing the things that other thousands of people broke, after all.

Not everyone finds this ingenuity, thrift, and entrepreneurial spirit as uplifting a sign of the adaptability of humankind, however. Apple, one of the largest tech companies in the world, has an almost pathological aversion to letting non-employees repair its goods. Fear of industrial espionage may play a large part in this, but it’s something of an own goal: many former generations of engineers got their start repairing broken household electronics.

Fortunately, ingenuity and the desire to share knowledge know few bounds, and guidance on how to conduct mobile phone repairs is absolutely everywhere.

Bottle of cod liver oil, Ministry of Food, 1939-1955 

Understanding what human beings need from our diets is still very much an ongoing process, as anyone who’s read a newspaper in the last forty or so years will have noticed. It seems like every day a new substance either gives you cancer or cures it, makes you fatter or helps you lose weight, raises or lowers one cholesterol or another, or just isn’t present in our diets in the quantities we need.

The truth is that there are some absolutely essential components to the human diet: proteins, fats, carbohydrates, fibre, minerals and various vitamins. Vitamins themselves weren’t discovered until the twentieth century, but the effects of their absence (and often the appropriate cures) were known for some considerable time before that: scurvy (vitamin C deficiency), beriberi (vitamin B1 deficiency), rickets (vitamin D deficiency), pellagra (vitamin B3 deficiency), and xerophthalmia (vitamin A deficiency).

Scurvy’s solution was discovered in fruit, particularly those high in “ascorbic acid” (literally, “anti-scurvy acid”); xerophthalmia requires vitamin A, found in abundance in carrots. So in a roundabout way, carrots really will help you to see in the dark, or at least not have a miserable time with a progressive eye disease.

Rickets, caused by an absence of vitamin D, was the bane of Victorian England. Poor diet and the country’s naturally sun-starved weather led to generations of industrial kids with bendy, bow-legged bodies. In 1918 Sir Edward Mellanby discovered that supplementing an otherwise bland diet with cod liver oil strengthened developing bones, and entire generations of children had a daily dose from bottles like the one pictured to look forward to, which was, let’s face it, not very nice (it comes in capsules now).

Continuity and normality for a country at war (and during the lean times after it) were important both for winning the war and for recovering afterwards: the Ministry of Food oversaw rationing, including the provision of bottles of cod liver oil to prevent the war generation from growing up with rickets to go with their abandonment issues.

It’s difficult to imagine that the population of the UK would, in the current day, accept food from their government, or believe that an official body might have useful information to share. How future governments may respond to food shortages is also something we can’t fully predict.

Shelvador electric compression domestic refrigerator, c. 1934-1935

Refrigeration is so commonplace that it’s easy to forget it might ever have been revolutionary or destructive to sections of the economy, but it’s fair to say that the development of both industrial and domestic refrigeration dramatically changed social habits, life expectancy, and at least one industry.

Prior to refrigeration in homes, the ice-delivery industry boomed: delivery of blocks of ice for use in an icebox was a regular necessity for keeping fresh groceries–milk in particular–from going off.

Regular, often daily shopping and deliveries were necessary to ensure fresh food in households. At best you might be able to preserve some food in a nice cool cellar (where rats often got at it), or on a cold windowsill in winter, but none of these situations were ideal, and former generations may well have ended up with Accidental Cheese even more often than university students with no sense of the passage of time and a capacious fridge…

Domestic and industrial refrigeration have also had profound implications for medicine. Take insulin: doses need to be kept cool, in the clinic and in the home. Managing type 1 diabetes without insulin is extremely difficult, and there is very little reputable evidence of it being done successfully even now. And insulin is far from the only medication that needs refrigeration to keep it stable and useful!

Miraculous though refrigeration is, it’s also had a darker side. Apart from increasing the energy demands of every home, until the late 90s and early 00s most fridges and freezers ran on CFCs (chlorofluorocarbons), which have had a damaging effect on the earth’s atmosphere, accelerating climate change. It seems the cost of cooling our food and medicines has been heating our planet.

Early electric bulb, unknown maker

We live in a world bathed in artificial light: in many cities wildlife has adapted to what is effectively perennial dusk, often just eternal day in a hundred different hues. Light is available to us at the clap of a hand, when we walk into a room, in any colour we want (including ones we can’t see), whenever we want, in our pockets, on our bodies, perfectly imitating sunlight… so we take it somewhat for granted. 

But electric light hasn’t been with us for as long as it feels like. Even gas lamps, dangerous as they were, only arrived in the 18th century (outside of China). Prior to that, oil lamps and candles were the only available options: hot, dim, often quick to run out, and difficult to get lit in the first place (once again, China led the world, inventing matches significantly before anyone else managed it).

Keeping our homes and workplaces lit has reduced eye strain, lengthened available working hours, improved safety, and given many the option to improve their understanding and education by reading after work is done (not really a possibility by candlelight, or with poor eyesight). Keeping our streets lit (which curiously started much earlier: the 17th century in England, though it had been going since the 4th century CE in Antioch) made them a safer prospect, at least for a while, and helped usher in the concept of night-life and 24hr cities, once again expanding how much time people could and did spend both working and playing.

The toll of electric light on the environment is immediately visible in the form of light pollution, and less immediately in the form of our civilisation’s ever-increasing energy needs (although lighting makes up a small part of the overall demand compared to, say, producing aluminium). Nevertheless, the question of reducing energy consumption for lighting (and its associated pollution) has weighed on people’s minds, leading various cities to push for LED street lighting (a controversial topic) and other bodies to explore bioluminescence (light produced naturally by algae and bacteria) as alternatives to incandescent or low-energy bulbs.

Silver mirror from a set of woman’s toilet articles, artist unknown, 1st Century CE Roman

Often a series of inventions and discoveries throughout history will try to address the same need, repeating on a theme until the most convenient iteration develops. Figuring out how to make a mirror that works as well as a still pool of water (or better, although it’s often hard to imagine things that have no basis in previous experience) took a lot of doing, and for many centuries highly polished metal or obsidian shards were the only way to achieve it. While these could be enormously reflective, getting enough metal to create a reasonably-sized surface rigid enough not to warp, bend, or dent (distorting the reflection) was an expensive business. And most metals oxidise (silver and gold being the exceptions), making them more expensive still. Obsidian, plentiful in some parts of the world, doesn’t always come in large sheets, and its reflections are dark and ghostly.

In fact, it wasn’t until the 12th century CE that Europe, at least, began to get the hang of manipulating glass and silvering it to the extent that mirrors could be produced. This began in Venice and spread across the rest of Europe in the 1500s and 1600s, associated initially with wealth and influence (paying to have something so fragile imported from Venice was, after all, going to be costly). For more on the history of mirrors, there’s a charming summary from the Joukowsky Institute for Archaeology and the Ancient World.

Mirrors had a fairly important role to play in the development of telescope technology, followed by microscopy, and are of course a vital part of road safety.

Easily-portable, vast-scale glass mirrors would seem to be the end of progress in this area, but the number of times I’ve seen people at tube stations at 5am trying to use their front-facing phone camera as a mirror to apply mascara or put in contact lenses suggests otherwise. How we present ourselves to the world now relies not only on mirrors but on changing how we’re reflected in them, applying moving filters that map to our facial features without any apparent surprise that this is possible at all!

And because the human species loves information almost as much as it loves looking at itself, we’ve begun to take up a new hobby: creating smart mirrors that allow us to overlay information–even information on the history of mirror-making–over our own reflections.

Carriage clock: Eight-day spring-driven movement with alarm and repeat, Louis Antoine Breguet, 1830-1835

Although these days a carriage clock is usually just thrown into the bargain when trying to secure an insurance or pension deal, there was a time when these were objects of desire, greatly sought-after. Just as an Apple Watch is now the mark of a certain type of person, so a pocket-watch once was (although it could be argued that it still is, of a somewhat different type), and at one point merely owning a clock at all was a sign of serious affluence.

Something as simple as telling the exact time was once almost unimaginable, and concepts of smaller units of time more or less didn’t exist. Sundials, candle-clocks, water-clocks (which led to some of the earliest developments of clockwork), and hour-glasses, often ornate and exquisitely beautiful, stand testament to humanity’s attempts to measure out periods of time shorter than mornings and afternoons.

But now beautiful, exact timepieces like this one, and their vital navigational function, have become semi-obsolete as everyone relies on their phones for the practical business of time-keeping. (The search for a navigational solution no doubt had a serious impact on the speed at which time-keeping improved: as the world expanded for Europeans, the need to navigate it safely became ever more important. It’s worth noting that plenty of cultures outside Europe had already nailed the art of exact navigation by other means, but Europeans weren’t in a position to discuss it at that point.)

And yet we’re still drawn to the convenience of a watch on the wrist, whether for displaying the time or for displaying more vital information. The sound of a clock ticking is a story-telling cue in many forms of audible media, while the high-speed movement of clock hands is a convenient visual shorthand for the sped-up passage of time. There’s also evidently a wide market for embellishments made from the inner workings of analogue clocks. It’s just not always obvious how complex and brilliant a machine a clock is, because it’s so commonplace and seen as so infallible: hence the phrase “running like clockwork”.

So we’re surrounded, day by day, with little miracles that have become mundane, and day by day things which were once “can’t do withouts” are done without, and become less relevant, closer to being curiosities than necessities; the things which were once against nature become commonplace, and may, one day, become ancient history. And we can’t always predict which.

Related: Scenes from everyday life, Against Nature
