Reforming Our Food Culture

Steven Poole declares that ‘Western culture is eating itself stupid’. His book You Aren’t What You Eat (2012) pokes fun at the snobbery, fads and celebrity culture that attend ‘foodie’ culture.
The term ‘foodie’ emerged in the 1980s, but the practice of discussing the enjoyment of food is much older: in France it became socially acceptable at the start of the nineteenth century.
We dispense with that ‘gastronomy’, ‘the art and science of delicate eating’, at our peril. Gastronomy enjoins restraint and reflection and is ‘the common bond which unites the nations of the world’, according to Jean-Anthelme Brillat-Savarin, one of its prime movers.
A gastronomic sensibility is valuable to our health, motivating us to consume a wide range of nutrients. The challenge is to reconcile our enjoyment with considerations of environmental impact and of health. Exploring our pleasure should also make us sensitive to those who live with insufficiency.

Stuffed and Starved

In the West we eat too much, and in the South they eat too little. Despite increasing globalization we have not addressed that contradiction. A billion people are now overweight or obese in the developed world where, shamefully, 50% of food is wasted; almost as many are undernourished or starving in the developing world.
It should be a straightforward matter of handing over our excess, but with the best will in the world this approach will not work: transport networks, functioning bureaucracies and peaceful conditions are all required, and dumping our surpluses deprives Third World farmers of income and of the incentive to innovate and improve.
Moreover, much of what gives rise to obesity in the West is connected to over-consumption of junk food. A world cannot be fed on soft drinks. Our working-class neighbourhoods are often ‘food deserts’ without access to fresh, healthy and competitively-priced food. There gastronomy cannot take root.
Meanwhile in the Third World, real deserts are expanding as droughts become more prolonged and land resources are mismanaged. Exponential population growth and failing states leave much of sub-Saharan Africa in food insecurity.

The Green Revolution

The seemingly limitless supply of food we have in the West can be explained by the so-called Green Revolution which occurred in agriculture after World War II. It involved the deployment of high-yielding strains of common cereals in combination with synthetic fertilizers and pesticides derived from fossil fuels. A hectare of wheat which previously yielded two tons can now yield eight. Similar feats were achieved with other common grains.
Nobel laureate Norman Borlaug is regarded as its instigator. He and his collaborators corrected a structural deficiency in the stalk of wheat which could not support heavy grains. Previously the most fruitful plants collapsed under the weight of their own seeds before maturity. Borlaug’s group developed dwarf strains that could stand up to the weight of bulbous grains, thereby doubling yields. Today, almost every kernel of wheat consumed by man and beast is derived from Borlaug’s selective breeding.
But the resulting monocultures have increased vulnerability to disease; according to the authors Fraser and Rimas in Empires of Food: ‘Today our landscape is a lot like that of Ireland and Sri Lanka immediately before the famines. We devote much of our earth to a very small number of crops’. Borlaug’s strains also depend on polluting and finite fossil fuels to survive.
Much of the increased yield is fed to livestock; only 20% of US corn is eaten directly by humans. The Green Revolution has made animal products affordable, but the cost of maintaining this, in terms of global warming and energy use, is becoming apparent.
Last year’s disastrous corn harvest in America is bringing the issue into sharp focus. A choice is unfolding between maintaining the affordability of two icons of American life: the hamburger and the motor car. The livestock industry is petitioning to weaken or abolish the ‘ethanol mandate’, the requirements Congress set for the use of corn as automotive fuel, on the grounds that it could bring about a collapse in meat production.

Pre-domesticated Varieties

Research conducted by Unilever may have revealed the nutrition of the future. Many pre-domesticated varieties of plants contain significantly higher levels of nutrients than the varieties currently grown. An older variety of apple, the Egremont Russet, has up to 10 times more of a particular phytonutrient than some modern varieties. The researchers hypothesise that this will prove just one example of older plant varieties being richer in nutrients and fibre.
Dr Mark Berry, who led the research, said: ‘The plants we eat today like fruits and vegetables have often been bred and selected on their weight-based yield per acre of land, and not necessarily on the nutrient content of the produce.’ He added: ‘Perhaps a better strategy for human health, not to mention sustainable agriculture, would be to buy plants not based on their weight, but on their nutrient content.’
This view reflects research into pre-domesticated cereal grains, which show strikingly higher protein content than modern cultivars.
A gastronomic sensibility prizes this variety. Instead of artificially manipulating conditions with synthetic inputs, we can isolate a wide variety of strains deemed suitable to particular locations. Different regions can express distinctive terroir from carefully selected crops.
This diversity will make our crops more resilient. Biodiversity can even be harnessed to increase productivity through permacultures and forest-gardening.
These varieties can even play a role in addressing the obesity epidemic. The decreased nutritional value of many foodstuffs is affecting satiety levels. We can consume hundreds of calories of sugar in a soft drink without triggering the hormonal signals that let our brain know we’ve had enough. Foods richer in nutrients and fibre confer greater satisfaction.
By shifting away from the production of animal products, which requires far greater use of land, energy and water resources, we can easily find room for lower-yielding, nutrient-dense varieties. With a raised gastronomic awareness we might also waste less.

Food Sovereignty

But how can the cultivation of lower-yielding strains have any relevance for developing countries which confront the challenge of scarcity?
Many scientists argue that GMO technology offers solutions and are attempting to develop biological nitrogen fixation in crops such as wheat, which would allow them to survive without synthetic fertilizers. They dangle the prospect of decreased energy dependency and pollution, but admit that successful adaptation is many decades away and may never be achieved. Yet the advance of GMOs also decreases diversity and could have unforeseen effects.
A more sensible approach is for farmers to develop a wide variety of strains suited to different conditions. Lower-yielding varieties might prove more bountiful as the ensuing diversity would be less susceptible to disease and less dependent on polluting inputs derived from fossil fuels. Decrying a prevailing ‘industrial’ model of development in the Third World, Concern Worldwide argue: ‘smart site-specific agroecological approaches that increase production, conserve natural resources and are tailored to specific human and environmental conditions should be favoured’.
It may be that raising education levels, advancing gender equality and increasing access to the internet will bring great rewards to farmers in the Third World. Indigenous development can occur rather than the familiar story of Europeans bringing progress.

Shifting Diets

Complete self-sufficiency for most countries based on a wide variety of pre-domesticated and native crop varieties would be difficult to achieve, but increasing diversity could benefit our agriculture and improve nutrition.
A global community must retain surpluses to confront shortages. By shifting away from livestock production in the developed world we can produce more food and improve its nutritional quality. A reduction in the consumption of animal products should bring health benefits.
A shift in global diets is required to confront the challenges of obesity, global warming, peak oil and growing populations. A gastronomic sensibility can help inform our choices.

An Enduring Legacy – Lessons from the Great Famine

(Published in Village Magazine, November 2012)

Who was to blame for the Great Famine? This thorny question rears its head with the recent publication of the Atlas of the Great Irish Famine by Cork University Press. We may accept the detached assessment of the American economic historian Joel Mokyr, expressed some years ago, that ‘Ireland was considered by Britain as an alien and even hostile country… the British simply abandoned the Irish and let them perish’; but we should not ignore how many Irish Catholics profited from this great rupture in our history, which led to a population reduction of over two million due to starvation and emigration. The enduring legacy must be explored.

Irish people at the time were treated as second-class citizens by their government; relief for desperate, hungry victims was not a statutory right under the Irish Poor Law, as it was under its English equivalent. Successive failures of the potato crop between 1845 and 1850, caused by the blight Phytophthora infestans, did not prompt the market intervention that occurred when grain harvests failed in England. Irish grain continued to be exported, and insufficient cheap maize was purchased on the international market at key points. Moreover, the infamous Gregory clause of the Irish Poor Law denied relief to tenants holding more than a quarter acre unless they surrendered their tenancy, which turned it into a charter for land clearance and consolidation.

But in emphasising the inaction of remote authorities in Westminster we overlook the gains made by Catholic Irish farmers holding substantial farms above 20 acres. In one contribution to the Atlas Kerby A. Miller writes: ‘an unknown but surely very large proportion of Famine sufferers were not evicted by Protestant landlords but by Catholic strong and middling farmers, who drove off their subtenants and cottiers, and dismissed their labourers and servants, both to save themselves from ruin and to consolidate their own properties.’

A commitment to laissez faire, as well as a sense of providentialism that cast natural occurrences as part of a divine plan, informed the thinking of the leading British policy-makers of the time, foremost among them the Assistant Secretary to the Treasury, Charles Trevelyan, who was responsible for relief measures. He concluded afterwards: ‘The result in Ireland has been to introduce other better kinds of food, and to raise the people, through much suffering, to a higher standard of subsistence.’ To the enduring chagrin of Irish nationalists he was knighted for his services in 1848.

The response of British authorities can be situated within a larger context of a shift in Imperial policy and an ongoing Agricultural Revolution whereby: ‘Farming changed from being an occupation primarily concerned with extraction from the soil into one involving the purchase of raw materials which were processed to produce a saleable product.’

The repeal of the Corn Laws in 1846 was the great triumph of laissez faire. In contrast to most European states, where protection was extended to farmers, agriculture in the British Isles was thrown open to the free market.
Those who derived wealth from industry rather than land would henceforth guide British policy. Free trade would drive down the cost of food in the ‘workshop of the world’. Regions of the Empire would specialise in the production of particular foodstuffs for sale on the international market, a development made possible by the steamship. In contrast, in France during the same period a high proportion of production continued to be consumed on the farm or within the locality.

Politically, the occasionally benign paternalism of the landed aristocracy would no longer hold sway. The first editor of The Economist, James Wilson, answered Irish pleas for public assistance with the claim that ‘it is no man’s business to provide for another’.

Within this constellation Ireland would supply beef and dairy for its near neighbour; tillage and horticulture, particularly as carried out by peasants at a subsistence level, were to be abandoned. By 1900 pastoral farming dominated as never before. It hardly mattered that a succession of Land Acts (1869-1904) had transferred ownership to former tenants: those independent farmers would continue to generate ‘saleable products’ for the market.

An old way of life died for good as a result of the Great Famine. Subsistence communities, known as Clachan, were wiped out. Granted, Irish peasants were unwitting architects of their own demise: plentiful potatoes allowed for early weaning, which generated exponential population growth; the population stood at almost 9 million by 1845.

Parts of Ireland had some of the world’s highest population densities, but according to Mokyr the country was not overpopulated on the eve of the Great Famine. It was the switch to pasture that made it so. Fernand Braudel observes: ‘If the choices of a society are determined solely by adding up calories, agriculture on a given surface area will always have the advantage over stock-raising; one way or another it feeds ten to twenty times as many people.’

Perhaps improvements in education levels, especially with the advent of free primary education in 1831, could have encouraged family planning and improved employment prospects. A more ordered transition to modernity might have occurred instead of the fearful flight to cities such as Liverpool, Glasgow and New York. But this would have required a government committed to the welfare of the population, and a settlement of the land question whereby gross inequalities, the legacy of seventeenth-century conquest, were extinguished. However, Miller observes that even ‘Catholic nationalists (wealthy farmers and townsmen) as well as the overwhelming majority of the Catholic clergymen were much too conservative to countenance a peasant assault on Irish property relationships.’

A genuine revolution in land-ownership might have achieved this, but the demise of smallholders made the Land War of the 1880s a battle for the spoils of the Great Famine.

Exploring these ‘what ifs’ is counterfactual history, but it is important to recognise that the Great Famine was not inevitable and that the system of land-usage dominated by livestock for the international market, which endures to this day, is a recent innovation. One Protestant landowner, referring in the 1850s to this shift, said: ‘the extermination of humans and the substitution of brute animals for the human race on the soil of Ireland, is not an improvement grateful to my mind.’

Prior to the Great Famine Irish peasants were comparatively healthy. Irishmen’s heights were greater than those of equivalent Englishmen in a variety of occupations and situations, and life expectancy was greater than that of most other Europeans, with the exception of the Danes and the English.

They had a sparse diet relying primarily, but not exclusively, on the potato; the crop actually occupied only one third of the land under tillage in the 1840s. They also consumed oats (especially in Ulster), vegetables, wheat and barley, buttermilk, and whatever could be foraged in the form of seaweed, shellfish, berries and nuts. For most, meat was a rarity. With a settlement of the land question, diets might have become more varied, drawing on these locally-sourced ingredients, with less reliance on the potato.

The second half of the nineteenth century saw a dramatic shift in diet away from what was produced locally; beef and dairy were only for the tables of the well-off in Ireland. Between 1859 and 1904 sugar consumption rose tenfold and with it came increasing mortality from diabetes. Baker’s bread became the staple, and sugary tea the succour of the poor. This was Trevelyan’s idea of a ‘higher standard of subsistence’.

In an article written in 1913 George Russell (A.E.) observed of the transition: ‘There is no doubt that the vitality of the Irish people has seriously diminished, and that the change has come about with a change in the character of the food consumed. When people lived with porridge, brown bread and milk as the main ingredients in the diet, the vitality and energy of the people was noticeable, though they were much poorer than they are now… When one looks at an Irish crowd one could almost tell the diet of most of them. These anaemic girls have tea running in their veins instead of blood. These weakly looking boys have been fed on white bread’.

It is worth considering the effect of colonisation on the eating habits of the Irish, who transitioned to a diet that was itself a product of colonisation, a trend that has continued. As Homi Bhabha puts it: ‘Although colonised subjects endeavour to imitate or mimic the behaviour of the coloniser, the mimicry is always imperfect – almost the same but never quite’.

In response to colonisation we invented sporting codes, but because our colonisers had a stunted gastronomic culture we did not invent a cuisine of our own. As a food culture has emerged in Britain in recent times, a pallid mimicry has followed here: our versions of Nigella and Jamie are neither as sultry nor as charming.

A self-respecting Irish gastronomy might hark back to the tradition of the Clachan, instead of the present models of taste that favour the livestock produce of land clearances. The food of the Clachan was light, wholesome and ecologically sensible. It should appeal to the contemporary gastronome.
Moreover, recent research by Goodland and Anhang estimates that up to 51% of anthropogenic greenhouse gas emissions emanate from livestock farming. It may be a sad irony of history that Irish livestock farming will indirectly contribute to famines in the Third World as climate change brings drought and ecological catastrophes.

(http://villagemagazine.ie/index.php/2012/11/lessons-from-the-famine/)

Too Much of a Bad Thing

(London Magazine, December 2011)
So many tears have been shed for sugar that by rights it ought to have lost its sweetness.
Maguelonne Toussaint-Samat

The English palate, especially the working class palate, now rejects good food almost automatically.
George Orwell

Type ‘Haiti’, ‘Dominican Republic’ and ‘border’ into an image search on Google. A split-second cyber-miracle later, a startling aerial photograph of the portion of the island of Hispaniola shared by those countries appears. The Dominican side is blanketed in verdant forest with occasional yellow patches, but to the west in Haiti green has given way to arid yellow.

The stark contrast reveals the environmental devastation that sugarcane agriculture has wrought, dissolving forests as if enamel from teeth. According to the World Wildlife Fund it has ‘caused a greater loss of biodiversity on the planet than any other single crop’. This is compounded by over-population, a legacy of sugarcane’s labour-intensive agriculture, which leaves Haiti with a mere 1% of forest cover. Next door, the Dominican Republic retains 28%.

By the end of the 18th century Haiti, then known as Saint-Domingue, was the cash cow of the French Empire, accounting for two-thirds of its overseas trade. A plantation system based on slave-labour brought fantastic wealth to its ruling class: ‘rich as a Creole’ entered popular parlance.

The Haitian Revolution (1791-1804) ended that iniquitous system, and former slaves came to power for the first time. But sugarcane’s scars fester on the body politic, as on the landscape, and Haiti was crippled by huge debts from its inception after France compelled its former colony to pay massive compensation to dispossessed plantation owners. Outside interference continued, latterly emanating from the United States. The ills of a system that generated Papa Doc and the Tonton Macoute originate not in the frailty of the Haitian people but in the after-effects of the insatiable, mainly European, appetite for sugar.

Sugarcane originates in Papua New Guinea but is now cultivated in many tropical countries that enjoy hot and wet conditions. It even reached far-flung Easter Island where archaeologists have discovered the highest incidence of cavities and tooth decay of any known prehistoric people. First processed into solid sugar in India around 350 AD, cultivation and consumption then moved steadily westwards. It is said that sugar followed the Koran.

First treated as a spice, it was rarely encountered in Europe prior to 1000 AD, but it became a fixture in aristocratic cookery during the Crusades. After the fall of Acre (1291), cultivation moved to Cyprus and soon spread throughout the Mediterranean world.

Desserts were not a feature of medieval banquets; pricey refined sugar was used sparingly in otherwise savoury dishes. Only after Catherine de Medici’s marriage to Henry II of France in 1533 did the idea of ending a meal with a sweet course become de rigueur for the few who could afford it. Most Europeans would not have encountered sugar prior to the 18th century, but by 1900 it had become a staple, especially in England. According to anthropologist Sidney Mintz: ‘the diet of a whole species was gradually being re-made’.

Colonisation of the New World serviced Europe’s growing addiction. Settlers, beginning with Christopher Columbus, grew it, and more than elusive gold, sugarcane offered a real El Dorado. But production was dependent on slavery, a pernicious system that first exhausted and then extinguished the native Arawak population before Africans were resorted to: approximately 13 million endured the murderous indignities of the Atlantic crossing, and of the 11 million who survived, 6 million were destined for sugarcane plantations, where ‘the deadliest form of slavery’ prevailed. In those appalling conditions a new species of racism emerged in which Africans, ‘the sons of Ham’, were often treated worse than livestock. The racist language of the plantation survives to the present day, co-opted by successive political movements that relegate fellow humans to the status of inferior animals. Eric Williams argues that ‘slavery was not born of racism; rather, racism was the consequence of slavery’.

According to Elizabeth Abbott: ‘Whites relied on blacks to produce their sugar, counted them as their biggest capital investment, enslaved and mistreated them, vilified their race, sexually assaulted and fell in love with them, and lived dependent on and surrounded by them.’ The cruelty catalogued in Abbott’s book, Sugar: A Bittersweet History, is shocking, and its legacy is the continued instability of post-plantation societies. With the demise of most of the French West Indies the British West Indies dominated the market, although countries such as Brazil gained increasing market share in the era of free trade that followed the repeal of the Corn Laws in 1846.

The slave trade was prohibited in 1807, but full emancipation only arrived in the British Empire in 1833. Slavery on sugarcane plantations endured until 1888, when it was finally stamped out in Brazil, and Europeans and Americans continued to consume slave-produced sugarcane until that point. Abolition was the fruition of a long and worthy campaign, but the system that replaced it, indentured labour, involving the transport and virtual incarceration of coolie labourers from India and China, was almost as bad. It has left a further legacy of racial tension in the West Indies and in places farther afield such as Fiji.
Humans have a natural inclination towards sweet food, and refined sugar (sucrose) is its purest expression. In sweetness our bodies recognise easily-digestible caloric value. But as adults we rarely enjoy food that is purely sweet, usually preferring a balance of tastes. It is important, however, to be wary of the bitter taste, as it may indicate indigestibility or even poison; a child’s aversion to coffee or beer is quite understandable. Over time most of us acquire a taste for strong, bitter substances, often for the stimulation and even intoxication they impart as much as for any nutritional benefit.

According to Sidney Mintz: ‘sweet-tasting substances appear to insinuate themselves more quickly into the preferences of new consumers while bitter substances are “bitter-specific”’. Thus, ‘liking watercress has nothing to do with liking eggplant [aubergine] for instance.’ A sweet tooth is not discerning: the taste of sucrose derived from cane or beet is virtually identical, and High Fructose Corn Syrup (HFCS) has much the same character – witness Coca-Cola’s successful substitution of cheaper HFCS for sucrose in 1984. Trying to substitute the bitter flavour of root beer for bitter cola would be another matter.

The increased sucrose consumption which began at the end of the 18th century at all social levels was predicated on low price, but also on a seductive combination with chocolate, coffee and tea. These bitter drug-foods became cheap and plentiful for Europeans at precisely the same time: the end of the 18th century. Sucrose took the edge off their bitterness, while that bitterness balanced excessive sweetness. Coffee, tea and chocolate consumption would not have taken off in isolation, but equally sucrose alone would not have had the same appeal.

Mintz says that in England tea ‘triumphed over the other bitter caffeine carriers because it could be used more economically without losing its taste altogether’. In reaction to the heady days of the gin-soaked 18th century the temperance movement lauded it as ‘the cup that cheers but does not inebriate’. For impoverished workers of the Industrial Revolution, tea in combination with sucrose provided calories, as well as stimulation and an enduring social ritual. Mintz argues, persuasively, that cheap sucrose was an important fuel for workers in the Industrial Revolution. Over-worked and under-paid, they now had access to fast food that would get them through the day.

Horrendous slave-labour in the West Indies was providing energy for harsh wage-labour in Britain. Moreover, Eric Williams argues that huge profits generated from sugarcane ‘fertilized the entire productive system of the country’. It also provided jobs directly, manufacturing items required by plantations including iron-collars, handcuffs and shackles, tongue depressors, and ball-and-chains originally designed for medieval torture.

Voltaire’s (d. 1778) dictum that England has 42 religions but only 2 sauces contrasts that society’s piety with its lack of enthusiasm for cooking. Bernard Kaufmann argues that such a hotbed of Puritanism was unusually predisposed to sucrose: ‘religious asceticism is suspicious of anything that is fatty or bloody, but is defenceless against things that are sweet’. At a time when an all-pervading spirit of ‘thou shalt not’ held sway, sucrose, dissolved in water or used to preserve, did not seem a gluttonous indulgence. It could also replace the sweetness of frowned-upon alcohol.

Writing about his countrymen from the vantage of the late 1940s, the historian C. R. Fay asserts: ‘Tea which refreshes and quietens, is the natural beverage of a taciturn people, and being easy to prepare it came as a godsend to the world’s worst cooks’. But arguably the very popularity of tea contributed to the decline of English cookery. A pot of tea with sucrose, only commonly accompanied by milk by the start of the twentieth century with the advent of refrigeration, was the urban answer to the cauldron of soup that traditionally sustained rural communities. Its simple preparation, warm reassurance and even slight suppression of appetite removed the need for hot food in a hard-working society where time was increasingly short. Moreover, the failure to provide infrastructure to cope with mass urbanization in 19th-century England made it necessary to boil water to make it safe until improvements in sanitation arrived in the 1890s. Tea made water potable and palatable.

In many poor urban families an expensive piece of meat was reserved for the male bread-winner while the rest of the family subsisted on sweet tea, ballasted with shop-bought bread and butter or margarine and jam, composed of over 50% sucrose. This under-nourishment of children and babies in utero had long term health consequences. According to Floud et al in The Changing Body, over the course of the 19th century average final heights of men (an important nutritional indicator) in England actually declined slightly from the average at the start of the century (168.6cm to 168.0cm).

Tea, while a diuretic, has some health benefits (particularly if it is green tea), but sucrose is considered nutritionally ‘empty’, except as a short-term source of energy. The effects of over-consumption, now defined very conservatively by the NHS in its dietary guidelines as above 10% of daily caloric intake, can be extremely damaging. Henry Hobhouse describes the process: ‘the body becomes used to a feast/famine syndrome in the blood sugar, and this produces an addiction which is chemical, not psychological’. Thus, ‘a vicious circle is created in which the victim becomes hooked on a constant flow of industrial sugar to the bloodstream and cuts down on fibre… as sugar consumption inhibits the production of starch and fibre-converting enzymes’. A preference for less nutritious white bread is coupled with, and reinforces, a sucrose addiction, as the enzymes required to digest whole grains are ‘killed by industrial sugar’. Furthermore, consumption of refined sugar does not trigger the release of the hormone leptin, which informs the brain that we are sated. This explains why it is possible to drink highly caloric soft drinks during and after meals without feeling full.

In 1900 sucrose was supplying nearly one-fifth of the calories in the English diet, almost double the maximum limit recommended today. Despite the virtual end of sustained food shortages, and certainly of famines, a series of nutritional surveys conducted among working-class families across Britain at that time suggested that not only the urban poor, but also ‘the bulk of the semi-skilled workers, the routine clerical workers, and even those of the skilled artisan class’, were likely to be undernourished. Sucrose, it would seem, was the food of the poor.

Greater diversity entered the diet after World War I, which brought better nutrition (and led to increased average heights and life expectancy), but the English sweet tooth endured. In the 1930s George Orwell still observed an unhealthy addiction in The Road to Wigan Pier: ‘plenty of people who could afford real milk in their tea would much sooner have tinned milk – even that dreadful tinned milk which is made of sugar and cornflour and has UNFIT FOR BABIES on the tin in huge letters’.

Refinement of sugar beet into sucrose commenced at the start of the 19th century, gaining ground especially during the Napoleonic Wars when France was denied access to the West Indies. By 1880 beet production nearly equalled that of sugarcane. Although beet cultivation is not environmentally hazardous, the end product is equally unhealthy. From the late 1970s, especially in America, sucrose was joined by another refined sugar derived from maize: HFCS. Farm subsidies, introduced by Richard Nixon in the 1970s, maintain its low price. It is even sweeter than sucrose and has identical harmful effects. Sucrose consumption has not declined in the United States, but HFCS consumption now exceeds it. Consumption is disproportionately high among the poor, many of whom subsist on HFCS-laden fast foods in which it forms an unhealthy trinity with saturated fat and salt. Its use is rising inexorably elsewhere. It was recently calculated that of an estimated 47 billion beverage servings consumed by humans daily, 1 billion are servings of Coca-Cola.

The success of HFCS can also be attributed to the emergence of nutritional advice in the US and elsewhere in the 1970s promoting ‘low fat’ diets. A product could be advertised as ‘low fat’ yet still contain vast quantities of cheap HFCS. Big Food has maintained this nutritional confusion through powerful lobbies.

The consequence of large-scale addiction is the public health crisis of obesity. We may now live longer than ever but our potential to live still longer and in good health is threatened. Refined sugar seems to be the greatest culprit. According to nutritionist Patrick Holford: ‘There is no question in my mind that increased sugar consumption is driving not only obesity and diabetes but heart disease and breast cancer’.

Obesity is the plague of our time, with most developed countries converging on the US rate of over 50% of the population. The concomitant rise in type 2 diabetes is afflicting children at increasingly young ages. One wonders why governments, medical professionals, chefs and gastronomes have been so slow to address the issue. A zero-tolerance approach should be adopted, advocating a near-total exclusion of refined sugar in view of its addictive quality. The present NHS guideline seems inadequate. According to Floud et al, the ‘evidence suggests that the rise in obesity represents one of the major challenges which needs to be faced if European populations are to build on the advantages which a century of economic and social progress have bequeathed.’

Sweetness can be derived from safe sources in which fibre is present. As Dr Robert Lustig, whose lecture ‘Sugar: The Bitter Truth’ has been viewed almost two million times on YouTube, says: ‘When God created the poison he packaged it with the antidote’. Natural sugars are accompanied by fibre. The problem arises when the antidote is removed, i.e. when a plant is refined into a slow-acting poison.

Refined sugar is not only responsible for expanding waistlines and a range of preventable diseases; according to Holford, adolescents consuming sugary drinks become ‘more disruptive and less able to concentrate’ in school. A variety of mental health problems have also been associated with over-consumption of refined sugar.

Refined sugar has always had its apologists. In 1715 Dr Frederick Slare wrote an encomium to it as a tooth-cleaning powder, a hand lotion, a healing powder for minor wounds and, above all, an essential treat for babies and ‘the ladies’ to whom his treatise was dedicated.

Even the iconic Che Guevara was seduced: ‘The entire economic history of Cuba has demonstrated that no other agricultural activity would give such returns as those yielded by the cultivation of sugarcane. At the outset of the Revolution many of us were not aware of this basic economic fact because a fetishistic idea connected sugar with our dependence on imperialism and with the misery of the rural areas, without analysing the real causes: the relation to the unequal balance of trade.’ After the fall of its main trading partner, the Soviet Union, Cuba discovered the cost of its dependence on that monoculture and has only belatedly turned to mixed agriculture to address its needs. Moreover, the requirements of sugarcane sustain an autocratic mode of agriculture that exacts a terrible price on the natural environment, as well as on workers. Finally, the end product is nutritionally empty.

Most surprisingly, Elizabeth Abbott in the closing chapter of Sugar: A Bittersweet History opines that the successful conversion of sugarcane into biofuel in Brazil has ‘a redemptive quality’ in ‘the narrative of sugar’s story’. Here she departs from the thrust of her argument, perhaps wishing to end on a positive note after telling such a harrowing tale. She disregards her own findings about Brazilian sugarcane agriculture’s continued encroachment on ‘former pastureland and ecologically-sensitive wetlands’, as well as the unequivocal findings of the WWF. The siren-song of refined sugar has no limit, it would seem.

It seems quite appropriate that refined sugar and the motor car in which that biofuel is used should join in an unholy alliance. Both were once the preserve of aristocrats, but access is now near universal. As the prevalence of each increases, any initial benefits decline: cities become thronged with traffic, and energy dips, or even hypoglycaemia, follow refined sugar’s brief high. Mechanized locomotion and instant energy are coiled in a warm, corpulent embrace; 19% of American meals, mostly fast food, are eaten in a car.

(http://inpressbooks.co.uk/products/the-london-magazine-december-2011-january-2012)