Allergic diseases have been described since antiquity, but major advances in the understanding of their pathogenic mechanisms and clinical implications date to the end of the 19th and the first half of the 20th century [2, 3]. Over the last decades, meanwhile, the prevalence of allergic diseases has increased remarkably, paving the way to the definition of "allergy epidemics". But what happened in the second half of the 19th century that gave rise to this phenomenon? As far as we know, at least two major changes took place that led to an increase in allergic diseases. The first was a major change in living and social conditions: family size decreased, people progressively moved from the countryside to the cities, and public hygiene improved consistently, with the widespread availability of clean water and the use of antibiotics, to name just two.
The second was the increasing concentration of airborne pollens, due to the diffusion of new grasses, such as Lolium perenne in England in the late 19th century, and to the growing presence of weed grasses brought about by changes in farming techniques and the expansion of arable farming. By the mid-forties, hay fever had become such a severe health problem in the New York City area that a ragweed eradication campaign was initiated by the city council.
While the link between the increase in airborne pollens and the allergic diseases those pollens cause is clear, the same cannot be said for the first explanation.
In 1989, David P. Strachan published in The Lancet the results of his epidemiologic investigation of 17,414 British children who were followed up until the age of 23. He aimed to investigate the relationship between the increase in hay fever and sixteen different perinatal, social, and environmental factors. What he found was a striking association of hay fever with family size and position in the household during childhood. In his evaluation, the single most influential variable was the number of older children in the household, and he hypothesized that allergies could be prevented during childhood by cross-infections among family members, facilitated by unhygienic contact with other siblings. This was the birth of the "hygiene hypothesis", which to this day remains a milestone in explaining the rise of allergic diseases [6, 7].
In the last decades of the 20th century, new information came from in vitro studies of CD4+ T-lymphocyte subpopulations. T helper 1 (Th1) cells were characterized as T lymphocytes that predominantly produce, in response to microbial stimuli, interleukin-2 (IL-2), interferon γ (IFN-γ), and tumor necrosis factor β (TNF-β), subsequently referred to as "Th1 cytokines"; these cytokines were recognized as playing a prominent role in defense against most infectious agents. Th2 cells, on the contrary, were characterized by the predominant production of IL-4, IL-5, IL-9, and IL-13 (Th2 cytokines), which gave rise to an eosinophil-rich immune response, primarily implicated in immunity against parasites and other multicellular pathogens but less crucial for immune responses in modern westernized countries.
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
Copyright (c) 2016 Filippo Fassio and Fabio Guagnini