How Long Has Humanity Been At War With Itself?

Deborah Barsky

02-08-2024 ~ Is large-scale intraspecific warfare an inescapable part of the Homo sapiens condition, or can our species strive to achieve global peace?

The famous American astronomer Carl Sagan once said, “You have to know the past to understand the present.” But can we ever know the history of human origins well enough to understand why humans commit large-scale acts of appalling cruelty against other members of our own species? In January 2024, the Geneva Academy was monitoring no fewer than 110 armed conflicts globally. While not all of these reach mainstream media outlets, each is equally horrific in terms of the physical violence and mental cruelty we inflict on each other.

Chimpanzees, our closest living relatives, are known to partake in violent intraspecific skirmishes, typically to preserve privileged access to resources in response to breaches of territorial boundaries. But only humans engage so extensively in large-scale warfare.

Do massive acts of intra- or interpopulational violence conform to Darwinian precepts of natural selection, or are they a competitive response to the stresses of living in such large populations? Looking back in time can help us find answers to such questions. Evidence preserved in the archeological record can tell us when, and under what conditions, the preludes to warlike behaviors emerged. Scientific reasoning can then transform this information into viable hypotheses that we can use to understand ourselves in today’s world.

As archeologists continue to unearth new fossil evidence at an increasing rate, so too are they piecing together the human story as one of complex interactions played out by a growing number of different species of the genus Homo that lived during the tens of thousands of years preceding the emergence—and eventual global dominance—of our own species: Homo sapiens. In fact, scientists have recognized more than a dozen (now extinct) species of Homo that thrived over the millennia, sometimes sharing the same landscapes and occasionally even interbreeding with one another. Millions of years of hybridization are written into the genomes of modern human populations.

Although we know very little about what these paleo-encounters might have been like, progress in science and technology is helping archeologists piece together the puzzle of the interspecific human relationships that played out so long ago and contributed to making us who we are today. In spite of these advances, the fossil record remains very fragmentary, especially for the older phases of human evolution.

First consider early Homo, in the form of H. habilis (“handy man”), so named because a significant increase in stone tool-making is recognized following its emergence some 2.8 million years ago in East Africa. The evidence for the beginnings of this transformational event, which would set off the spiraling evolutionary history of human technological prowess, is relatively sparse. But such ancient (Oldowan) toolkits become more abundant from this time forward, at first in Africa and then, by around 1.8 million years ago, in Eurasia. Throughout this period, different kinds of hominins adopted and innovated stone tool-making, socializing it into normalized behavior by teaching it to their young and transforming it into a cutting-edge survival strategy. We can clearly observe the positive repercussions of this major advancement in our evolutionary history in the growing number of archeological sites and their expanding geographical spread. Unevenly through time, Oldowan sites throughout the Old World begin to yield more numerous artifacts, attesting to the progressive demographic trends associated with tool-making hominins.

Tool-making was a highly effective adaptive strategy that allowed early Homo species (like H. georgicus and H. antecessor) to define their own niches within multiple environmental contexts, successfully competing for resources with large carnivorous animals. Early humans used stone tools to access the protein-rich meat, viscera, and bone marrow of large herbivore carcasses, nourishing their energy-expensive brains. Those brains show significant increases in volume and organizational complexity throughout this time period.

But were these early humans also competing with one another? So far (and keeping in mind the scarcity of skeletal remains dating to this period), the paleoanthropological record has not revealed signs of intraspecific violence among Oldowan peoples. Their core-and-flake technologies and simple pounding tools do not include items that could be defined as functional armaments. While a lack of evidence does not constitute proof, we might consider recent paleodemographic estimates, backed by innovative digital modeling methods and an increasing pool of genetic data, which indicate relatively low population densities during the Oldowan.

Isolated groups consisted of few individuals, organized perhaps into clan-like social entities, widely spread over vast, resource-rich territories. These hominins invested in developing technological and social skills, cooperating with one another to adapt to the new challenges posed by the changing environmental conditions that characterized the onset of the Quaternary period some 2.5 million years ago. Complex socialization processes evolved to perfect and share the capacity for technological competence, abilities that had important repercussions on the configuration of the brain that would eventually set humanity apart from other primates. Technology became inextricably linked to cognitive and social advances, fueling a symbiotic process, now firmly established, between anatomical and technological evolution.

By around one million years ago, Oldowan-producing peoples had been replaced by the technologically more advanced Acheulian hominins, generally attributed to H. erectus sensu lato. This phase of human evolution lasted nearly one and a half million years (from roughly 1.75 million to around 350,000 years ago) and is marked by highly significant techno-behavioral revolutions whose inception is traced back to Africa. Groundbreaking technologies like fire-making emerged during the Acheulian, as did elaborate stone production methods requiring complex volumetric planning and advanced technical skills. Tools became standardized into specifically designed models, signaling cultural diversity that varied geographically and creating the first land-linked morpho-technological traditions. Ever-greater social investment was required to learn and share the techniques needed to manipulate these technologies, as tools were converted into culture and technical aptitude into innovation.

In spite of marked increases in site frequencies and artifact densities throughout the Middle Pleistocene, instances of interspecific violence are rarely documented, and no large-scale violent events have been recognized so far. Were some Acheulian tools suitable for waging interpopulational conflicts? In the later phases of the Acheulian, pointed stone tools with signs of hafting, and even wooden spears, appear in some sites. But were these sophisticated toolkits limited to hunting? Or might they also have served other purposes?

Culture evolves through a process I like to refer to as “technoselection,” which in many ways can be likened to biological natural selection. In prehistory, technological systems are characterized by sets of morphotypes that reflect a specific stage of cognitive competence. Within these broad defining categories, however, we can recognize anomalies or idiosyncratic techno-forms that constitute latent potential within a given system. As with natural selection, this potential takes the form of structural anomalies that may be selected for under specific circumstances and, through inventiveness, developed into new or even revolutionary technologies. Should they prove advantageous for dealing with the challenges at hand, these innovative technologies are adopted and developed further, expanding upon the existing foundational know-how and creating increasingly larger sets of material culture. Foundational material culture therefore exists in a state of exponential growth, as each phase is built upon the preceding one in a cumulative process perceived as acceleration.

I have already suggested elsewhere that the advanced degree of cultural complexity attained by the Late Acheulian, together with the capacity to produce fire, empowered hominins to adapt their nomadic lifestyles to more constrained territorial ranges. Thick depositional sequences containing evidence of successive living floors, recorded in the caves of Eurasia, show that hominins were returning cyclically to the same areas, most likely in step with seasonal climate change and the migration routes of the animals they preyed upon. As a result, humans established strong links with the specific regions within which they roamed. More restricted ranging caused idiosyncrasies to appear within the material and behavioral repertoires of each group: specific ways of making and doing. As they lived and died in lands that were becoming their own, so too did they construct territorial identities that contrasted with those of groups living in neighboring areas. As cultural productions multiplied, these imagined cultural “differences” sharpened, engendering the distinguishing notions of “us” and “them.”

Even more significant, perhaps, was the emergence and consolidation of symbolic thought processes, visible, for example, in cultural manifestations whose careful manufacture took tool-making into a whole new realm of aesthetic concerns rarely observed in earlier toolkits. By around 400,000 years ago in Eurasia, Pre-Neandertal and then Neandertal peoples were conferring special treatment on their dead, sometimes even depositing them with objects suggestive of nascent spiritual practices. These would eventually develop into highly diverse social practices, like ritual and taboo. Cultural diversity was the keystone for new systems of belief that reinforced the imagined differences separating territorially distinct groups.

Anatomically modern humans (H. sapiens) appeared on the scene some 300,000 years ago in Africa and subsequently spread into lands already occupied by other culturally and spiritually advanced species of Homo. While maintaining a nomadic existence, these hominins were undergoing transformational demographic trends that resulted in more frequent interpopulation encounters. This factor, combined with the growing array of material and behavioral manifestations of culture (reflected in artifact multiplicity), provided a repertoire through which hominin groups could stand in contrast with one another. At the same time, the mounting importance of symbolic behaviors in regulating hominin lifestyles reinforced both real (anatomical) and imagined (cultural) variances. Intergroup encounters favored cultural exchange, inspiring innovation and driving spiraling techno-social complexity. They also provided opportunities for the sexual exchanges necessary for broadening gene-pool diversity and avoiding inbreeding. Meanwhile, a higher number of individuals within each group would have prompted social hierarchization as a strategy to ensure the survival of each unit.

While much has been written about what Middle Paleolithic interspecific paleo-encounters might have been like, in particular between the Neandertals and H. sapiens, solid evidence is lacking to support genocidal hypotheses or popularized images of the latter annihilating the former through violence. Today, such theories, fed by last century’s suppositions about the relative techno-social superiority of our own species, are falling by the wayside. Indeed, advances in archeology now show not only that we interbred with the Neandertals, but also that Neandertal lifeways and cerebral processes were of comparable sophistication to those of the modern humans they encountered. Presently, apart from sparse documentation of individual violent encounters, there is no evidence that large-scale violence caused the extinction of the Neandertals or of other species of Homo thriving coevally with modern humans. That said, it has been observed that the expansion of H. sapiens into previously unoccupied lands, like Australia and the Americas, coincides ominously with the extinction of megafaunal species. Interestingly, this phenomenon is not observed in regions with a long record of coexistence between humans and mega mammals, like Africa or India. It has been hypothesized that animals unfamiliar with modern humans lacked the instinct to flee and hide from them, making them easy targets for mass hunting.

If large-scale human violence is difficult to identify in the Paleolithic record, it is common in later, proto-historic iconography. Evidence of warlike behavior (accumulations of corpses bearing signs of human-induced trauma) appears toward the end of the Pleistocene and after the onset of the Neolithic period (nearly 12,000 years ago) in different parts of the world, perhaps in relation to new pressures brought on by climate change. Arguably, sedentary lifestyles and plant and animal domestication—hallmarks of the Neolithic—reset the social and cultural norms of hunter-gatherer societies. Additionally, it may be that the amassing and storing of goods caused new inter-relational paradigms to take form, with individuals fulfilling different roles according to their capacity to benefit the group to which they belonged. The capacity to elaborate an abstract, symbolic worldview transformed land and resources into property and goods that “belonged” to one or another social unit, in relation to claims on the lands upon which they lived and from which they reaped benefits. The written documents of the first literate civilizations, relating mainly to the quantification of goods, reveal the effects of this transformational period of intensified production, hoarding, and exchange. Differences inherent in the kinds of resources available in environmentally diverse parts of the world solidified unequal access to the goods invested with “value” by developing civilizations and dictated the nature of the technologies that would be expanded for their exploitation. Trading networks were established, and interconnectedness favored improvements in technologies and nascent communication networks, stimulating competition to obtain more, better, faster.

From this vast overview, we can now see more clearly how the notion of “others,” which arose in the later phases of the Lower Paleolithic, was key to kindling the behavioral tendencies that sustain the production-consumption mentality born after the Neolithic and still in effect in today’s overpopulated capitalist world.

Evolution is not a linear process, and culture is a multifaceted phenomenon, but it is the degree to which we have advanced technology that sets us apart from all other living beings on the planet. War is not pre-programmed into our species, nor is it an inevitability of our modern, globalized existence. Archeology teaches us that it is a behavior grounded in our own manufactured perception of “difference” between peoples living in distinct areas of the world with unequal access to resources. A social unit will adopt warlike behavior as a response to resource scarcity or other external challenges (for example, territorial encroachment by an ‘alien’ social unit). Eradicating large-scale warfare thus begins with using our technologies to create equality among all peoples, rather than to develop ever more harmful weapons of destruction.

From the emergence of early Homo, natural selection and technoselection have developed in synchronicity through time, transforming discrete structural anomalies into evolutionary strategies in unpredictable and interdependent ways. The big difference between these two processes at play in human evolution is that the former is guided by laws of universal equilibrium established over millions of years, while the latter exists in a state of exponential change that is outside of the stabilizing laws of nature.

Human technologies are transitive in the sense that they can be adapted to serve different purposes in distinct timeframes or by diverse social entities. Many objects can be transformed into weapons. In the modern world plagued by terrorism, for example, simple homemade explosives, airplanes, drones, or vans can be turned into formidable weapons, while incredibly advanced technologies can be used to increase our capacity to inflict desensitized and dehumanized destruction on a scale never before attained.

Meanwhile, our advanced communication venues serve to share selected global events of warfare, numbing the public into passive acceptance. While it is difficult to determine the exact point in time when humans adopted large-scale warfare as a viable behavioral trait, co-opting their astounding technological prowess to compete with one another in response to unprecedented demographic growth, there may yet be time for us to shift this trajectory toward resiliency, cooperation, and exchange.

By Deborah Barsky

Author Bio:
Deborah Barsky is a writing fellow for the Human Bridges project of the Independent Media Institute, a researcher at the Catalan Institute of Human Paleoecology and Social Evolution, and an associate professor at the Rovira i Virgili University in Tarragona, Spain, with the Open University of Catalonia (UOC). She is the author of Human Prehistory: Exploring the Past to Understand the Future (Cambridge University Press, 2022).

Source: Human Bridges

Credit Line: This article was produced by Human Bridges.




Amrus Natalsya: The Last, Farewell And Thank You

Amrus Natalsya – Image: YouTube

02-03-2024 ~ Legendary Indonesian artist Amrus Natalsya passed away at the age of 90 on January 31, 2024, at 7:30 p.m. The artist, who came of age just as his country emerged from colonialism, leaves behind nearly seven decades of creative work. His work bore witness to the tumultuous history of Indonesia’s independence and unfinished revolution, and the long-held aspirations of the Indonesian people.

Amrus was born on October 21, 1933, in the North Sumatran town of Natal (from which his name “Natalsya” was taken). In 1954, after graduating high school, he entered the Academy of Fine Arts of Indonesia in Yogyakarta and began his lifelong commitment to his craft of painting and wood sculpting.

His artistic career was never an individual pursuit but was molded by the situation of his country, which had just gained formal independence on August 17, 1945. As the young Amrus was learning to create new worlds out of paint and wood, so too was the young nation of Indonesia trying to build an independent path out of the ravages of centuries of colonialism.

At a student exhibition on Javanese culture, Amrus’s wooden sculpture entitled “A Forgotten Blind” was purchased by President Sukarno himself. Indonesia’s first post-independence president is perhaps best known for convening the 1955 Bandung Conference. At Bandung, while Amrus was still at the Academy of Fine Arts, leaders of 29 newly or soon-to-be independent countries representing half of the world’s population stood up together against imperialism.

Seeing the August Revolution of 1945 as incomplete, many young artists like Amrus set themselves to the task of building an anti-imperialist and independent national culture that would pave the way to a socialist revolution. This was a period when organizations of Left artists were thriving. Lekra (Lembaga Kebudayaan Rakyat, or “The Institute for People’s Culture”) was one of these organizations, founded on August 17, 1950, just five years into Indonesia’s independence.

Lekra was likely the largest cultural organization not affiliated with a state ever to have existed. As the cultural front of the Communist Party of Indonesia (PKI), it would grow over the next 15 years to more than 200,000 members—1.5 million including its supporters—until its life was cut short by the 1965 coup. The coup became the pretext for the persecution and killing of millions of communists and sympathizers in the months that followed.

Amrus was one of the many artists arrested in 1965; he was not released until 1973. Four years before his arrest, he had helped establish the Sanggar Bumi Tarung collective, joined by like-minded leftist artists including Misbach Tamrin and Djoko Pekik. Forming part of Lekra, the collective and its members not only produced new artworks but also developed theory around their creations. One of the key principles Lekra developed was Turun ke bawah, or turba (“descend from above”), which was concretized at Lekra’s first national congress as a theory to guide the artist’s work. It meant going down to the grassroots to work, eat, and sleep alongside laborers, landless peasants, and fishermen. They believed that only with a feeling and understanding sharpened by the life of the people—or rakyat—can an artist adopt a kerakyatan approach, which means creating artwork that serves the people.

Amrus was a firm believer in and practitioner of turba, which he saw as both a source of knowledge and an inspiration for creation. He lived among Central Javanese peasants. In one instance, after learning of a local land dispute that resulted in the deaths of eleven peasants, he created one of his most famous wood sculptures. The work was more than a beautiful object: it became a record of an event, an analysis of class struggle, and an embodiment of the Lekra principle kreativitas individual dan kearifan massa (“individual creativity and the wisdom of the masses”).

Most importantly, Amrus, like other political artists of his generation, defied the belief that art could be separated from politics. For Lekra artists, “politics was in command,” meaning that the foremost creative task was to create artwork that opposed imperialism and advanced the Indonesian revolution. But, for them, politics must always find its careful balance with artistry and aesthetics, and, like Amrus, an artist must be tirelessly committed to building his or her craft.

Amrus held his last solo exhibition in Jakarta in 2019. The rooms were filled with his signature “wood paintings,” which are intricate, colorful, and captivating. He created worlds with wood, three-dimensional scenes of everyday working-class life—a street market, a festival in Chinatown, a fishing village—where ordinary children, families, and people were always represented in multitudes. Every human being became a protagonist in the collective scenes that Amrus sculpted. As it was his last exhibition, the show was befittingly entitled “Terakhir, selamat tinggal dan terima kasih” (“The last, farewell and thank you”).

Amrus’s passing marks the final chapter of over seven decades of making artwork that served his people and an Indonesia whose revolution never saw its completion. Likewise, it has been nearly six decades since the horrors of 1965, when both the PKI and Lekra were effectively erased from political life and living memory. A Lekra poet, Putu Oka Sukanta, said, “Formal organizations can disappear; party organizations can be abolished, but the spirit lives if it is right.” May the spirit of Amrus, of Lekra, of Bandung live on for the artists of today, still seeking the path of making artwork that serves the people.

By Tings Chak

This article was produced by Globetrotter.

Author Bio:
Tings Chak is a researcher and art director at Tricontinental: Institute for Social Research.

Source: Globetrotter




Can Democracy Survive The Morbidly Rich?

Thom Hartmann – Photo: en.wikipedia.org

02-02-2024 ~ So, Donald Trump won Iowa. A crazed billionaire who wants to “suspend the Constitution” and claims the right to murder his political enemies. It doesn’t have to be this way.

Imagine.

Fox “News” shuts down (or just decides to only tell the truth), and most of the steam goes out of the rightwing populist MAGA movement that its billionaire owners helped create in the U.S. Insurrectionist members of Congress find themselves in jail facing sedition charges, as the previous leader of the country is under criminal investigation for taking help from Putin’s Russia. The government begins the process of decriminalizing abortion nationwide. The Supreme Court’s Citizens United decision is overturned and, without the flow of billionaire money directing their votes, Congress begins to actually pass laws that reflect the desires of the majority of the people.

Unless you read international newspapers like The Financial Times, odds are you have no idea that this same scenario is now playing out in Poland, which for the previous eight years had been suffering under a Trump-like administration.

Last summer, progressive Polish politician Donald Tusk promised he was going to clean up that country with an “iron broom.” Few took his promise seriously, but after being sworn into office last month, he’s actually doing it. As Maciej Kisilowski writes for The Financial Times, just a few days after Tusk became prime minister:

“… Poland’s politicized public television station, notorious for its xenophobic, homophobic and racist messages, abruptly went dark. Tusk’s culture ministry summarily dismissed the station’s board and stopped broadcasts to prevent the outgoing leadership from inflaming tensions by airing live the takeover of the group’s headquarters.”

Last week two of the top rightwing politicians in Poland were arrested for abuse of power during the previous regime, while only a few hundred people showed up to protest the end of rightwing programming on the nation’s main television network.

Last Thursday there was a march to protest the new progressive prime minister, but the reaction was both tepid and nonviolent. As Kisilowski writes for The Financial Times:

“These decisive, if heavy-handed, actions come at a time when democrats globally are searching for strategies to deal with populists. In the U.S., for example, there is intense debate about whether the protracted legal cases against Donald Trump are serving to boost his campaign to return to the White House. Perhaps Tusk’s approach offers hope.”

Dislodging the death grip the GOP has on American politics will be more difficult than in Poland, in large part because five corrupt Republicans on the U.S. Supreme Court legalized political bribery with their Citizens United decision.

The 2020 election cost over $14 billion. The 2016 election cost only $6.5 billion. But in 2008, two years before Citizens United, spending hadn’t even hit $1 billion, coming in at a mere $717 million. As the executive director of the money-tracking site opensecrets.org, Sheila Krumholz, said:

“Total outside spending is surprisingly high for this point in the cycle—we’re already at nearly $230 million. That’s more than twice the previous record through this point in the cycle, which was back in 2016. But it’s more than five times as much as was spent by this point in the last presidential cycle in 2020.”

Rightwing fascist-adjacent billionaires have used that open door to severely corrupt our political system, leading to massive gridlock when it comes to anything average people want.

Meanwhile, billionaires got tax cuts and deregulation making them vastly richer at the same time the companies that made them rich refuse to even commit to paying their workers a living wage (fewer than one percent of the world’s top companies have made such a commitment).

As a result, Princeton scholars Martin Gilens and Benjamin Page famously found that the odds of average Americans’ political desires being translated into policy are about the same as “random noise.”

On the other hand, the new American system created by Republicans on our Supreme Court works quite well for the morbidly rich. The people Gilens and Page refer to as the “economic elites” frequently get everything they want from the political class.

They wrote that we still have the “features” of democracy, like elections, but ended their paper with this cautionary note:

“[W]e believe that if policymaking is dominated by powerful business organizations and a small number of affluent Americans, then America’s claims to being a democratic society are seriously threatened.”

So, here we are.

If, in a zoo somewhere, a single chimpanzee had risen up and taken all the food from all the other chimpanzees and was hoarding it without being challenged, scientists from all over the world would be trying to figure out what was wrong with that chimpanzee and why the others tolerated his theft.

As Oxfam International pointed out in a report timed to correspond with the kickoff of the billionaire love-fest at Davos this week, the five richest men in the world—four of them Americans—saw their net worth more than double over the past three years, from $405 billion to $869 billion.

Since those five corrupt Republicans on the Court also ruled in Citizens United that corporations are “persons” with nearly full access to the Bill of Rights—including the right to use money to pay off politicians (the Republicans on the Court call it First Amendment-protected “free speech”)—the corporations that are producing these billionaires are also gouging American consumers as hard and fast as they can.

As Oxfam noted (keep in mind, “profit” means, essentially, “the money that’s left over after all of our business expenses, including payroll, that we, the owners, get to split up and keep for ourselves”):

“Mirroring the fortunes of the super-rich, large firms are set to smash their annual profit records in 2023. 148 of the world’s biggest corporations together raked in $1.8 trillion in total net profits in the year to June 2023, a 52 percent jump compared to average net profits in 2018-2021. Their windfall profits surged to nearly $700 billion.”

That money isn’t going to workers, though, who’ve seen their real, inflation-adjusted wages fall worldwide in the same period. Instead:

“The report finds that for every $100 of profit made by 96 major corporations between July 2022 and June 2023, $82 was paid out to rich shareholders. … It would take 1,200 years for a woman working in the health and social sector to earn what the average CEO in the biggest 100 Fortune companies earns in a year.”

Thanks to Clarence Thomas’ tie-breaking vote in Citizens United, Americans are not getting what they want. Which is another way of saying that the morbidly rich, and the judges and politicians they’ve bought, are actively breaking our democratic republic.

On the eve of the 2016 election of Donald Trump, for example, the Progressive Change Institute did a nationwide survey of likely voters. The results were stark:

— 84 percent want the government to negotiate drug prices.
— 79 percent support expanding Social Security (which Haley and DeSantis both just last week promised to cut).
— 78 percent want “fair trade” that ends shipping our jobs overseas.
— 77 percent want to tax corporations that have moved jobs overseas.
— 77 percent want universal free pre-kindergarten.
— 74 percent want all Americans to be able to buy into Medicare-for-All.
— 71 percent support a massive infrastructure spending program aimed at rebuilding our broken roads and bridges and putting people back to work.
— 70 percent want free college at all public universities.
— 68 percent want a guaranteed minimum income.
— 67 percent want the government to be the employer of last resort to end unemployment (like FDR did).
— 66 percent want the morbidly rich to pay at least a 50 percent income tax (currently the average American billionaire pays around 3 percent).
— 65 percent want the big banks broken up and a return to local banking.
— 64 percent want net neutrality so your billion-dollar corporate internet and email providers can’t monitor everything you do online and sell that information.
— 63 percent want public financing of elections to get billionaire money out of them.
— 60 percent want the Post Office to offer inexpensive public banking.

President Biden and Democrats in Congress got some of the infrastructure work done, tried to end much of America’s nearly $2 trillion in student debt (until they were blocked by Republican lawsuits and six Republicans on the Supreme Court), and got a small bit of the “Green New Deal” incorporated into the Inflation Reduction Act. But otherwise there’s a lot that Americans want and deserve that they’re not getting.

Why? Because the morbidly rich control much of our political process right now, and most of our media.

As former Supreme Court Justice Louis D. Brandeis famously said:

“We must make our choice. We may have democracy, or we may have wealth concentrated in the hands of a few, but we cannot have both.”

This year’s election will not only decide whether we’re going to let Trump or a similar Republican end American democracy; it will also, if enough of us show up, determine if Citizens United can be legislatively overturned and we can purge our political system of the cancer of big money.

Democrats almost did it in 2022: the For the People Act would have reversed significant parts of Citizens United and provided for public funding of, and more transparency around, elections. It passed the House and won the simple Senate majority that the Constitution requires.

Even the Republican filibuster could have been overcome if we hadn’t been betrayed by Joe Manchin and Kyrsten Sinema in the Senate. Make sure everybody you know is registered to vote. The stakes have never been higher.

By Thom Hartmann

Author Bio:
Thom Hartmann is a talk-show host and the author of The Hidden History of Neoliberalism and more than 30 other books in print. He is a writing fellow at the Independent Media Institute and his writings are archived at hartmannreport.com.

Source: Independent Media Institute

Credit Line: This article was produced by Economy for All, a project of the Independent Media Institute.




2024 Will See Wave Of Minimum Wage Hikes — But The Impacts Won’t Be Felt Evenly

Jeannette Wicks-Lim – Photo: Political Economy Research Institute

02-02-2024 ~ Ten million US workers will see wage increases this year, but inflation has eroded our tools for reducing inequality.

On January 1, 2024, the minimum wage increased from coast to coast. Indeed, 22 states and more than 40 cities and counties experienced wage increases in 2024 — most of them approaching $15. More states will follow with minimum wage increases later in the year. Undoubtedly, this is mainly the result of underpaid workers organizing and fighting for a decent living wage over the past decade, but is a minimum wage enough? Moreover, where are the increases in minimum wage taking place?
A look at the map shows that the South still trails behind. Could this be due to racism in the labor market? In the exclusive interview with Truthout that follows, progressive economist Jeannette Wicks-Lim shares her research findings and insights on the politics and economics of increasing minimum wages. Wicks-Lim is a research professor at the Political Economy Research Institute (PERI) at the University of Massachusetts Amherst and specializes in labor economics with an emphasis on the low-wage labor market.

C. J. Polychroniou: The federal minimum wage is still $7.25, but higher minimum wages went into effect in early January in 22 states, while three more (Florida, Nevada and Oregon) will increase their minimum wages later in 2024. How many and what sort of workers are going to be affected by these wage increases, and how significant are these minimum wage hikes in improving the overall standard of living for low-wage workers?

Jeannette Wicks-Lim: First, to your question about how many and what sort of workers will benefit from these minimum wage increases: An excellent resource for understanding the basic facts of who will benefit from these minimum wage increases is the Economic Policy Institute.

There are two basic types of raises that result from these increases — mandated and ripple-effect raises. Mandated raises are the raises that get workers up to the new, higher minimum wage level. Ripple-effect raises are raises that employers may choose to give workers who are earning a little more than the minimum wage — often workers with more experience than the entry-level workers who are more likely to earn minimum wage rates. Employers may do this so that the workers with more experience continue to earn more than entry-level workers after the minimum wage goes up. Overall, about 10 million workers — counting both types of raises — will receive wage increases. This represents about 15 percent of the workforce in the states implementing minimum wage hikes. So the impact, in terms of the share of affected workers, is substantial.

Due primarily to occupational segregation, women are overrepresented in low-paying jobs, so they tend to receive a disproportionate share of the raises. Low-wage jobs are concentrated in food services, for example, particularly in fast-food restaurants. Workers in other service industries, such as home care aides and child care workers, are also well represented among those whose wages rise with minimum wage hikes.

Black, Indigenous, and other workers of color also tend to be overrepresented among the workers who receive raises from minimum wage hikes when looking within the states adopting them. However, it’s important to note that a majority of Black Americans do not reside in the states raising their minimum wage, so that, overall, such workers are underrepresented among the pool of beneficiaries of minimum wage hikes. This reflects the fact that southern states, especially those that formerly made up the Confederacy, tend to have weak labor standards. This is clearly visible when you look at a map of where the minimum wage hikes are happening. The history of racism in the U.S. — especially in how the labor market operates — shows up clearly here.

Next, to your question about how minimum wage raises affect the living standards of low-wage workers: I know from the multiple studies my colleagues at PERI and I have done that the earnings of minimum wage workers typically contribute 40 percent of their household’s income, so a meaningful raise to these workers’ pay rates has a meaningful impact on their household incomes. These workers are typically from lower- to middle-income households.

Now, some of the minimum wage hikes taking effect are quite small — they only account for inflation rather than raising the wage floor in inflation-adjusted terms. So it is workers in the states that lift the wage floor in real, inflation-adjusted terms who can expect meaningful boosts to their living standards. Roughly speaking, about half the states raising their minimum wages are adopting increases in the range of 3 percent — these reflect, generally speaking, inflation-related adjustments. Another quarter of these states have minimum wage hikes in the range of 7 percent, and the remaining quarter are raising their wage floors by more than 10 percent.

In the states with minimum wage hikes of at least 10 percent, my back-of-the-envelope calculations suggest that the average worker who receives a raise will be able to boost their household’s income by about 4 percent (10 percent raise x 40 percent contribution to household income). This is a meaningful increase. For a low-income household with three people (say, an income of $35,000), that translates to an annual increase of nearly $1,400. At the same time, given the rate of inflation in the last few years, even the smaller inflation-adjustment raises serve as important safeguards to at least maintaining workers’ living standards.
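For readers who want to verify the arithmetic, here is a minimal Python sketch of the back-of-the-envelope calculation described above. The function name and the inputs (a 10 percent raise, a 40 percent contribution to household income, a $35,000 household income) are illustrative assumptions drawn from the interview, not part of PERI’s published methodology.

def household_income_boost(raise_share: float, earnings_share: float) -> float:
    # Fractional boost to household income when one earner's pay rises by
    # raise_share and that earner contributes earnings_share of household income.
    return raise_share * earnings_share

boost = household_income_boost(0.10, 0.40)  # 10 percent raise x 40 percent contribution
income = 35_000                             # example three-person, low-income household
print(f"Income boost: {boost:.0%}")                # Income boost: 4%
print(f"Annual increase: ${income * boost:,.0f}")  # Annual increase: $1,400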

Why is it that half of the country is raising the minimum wage while the other half is not? In this context, can you talk about the politics behind minimum wage increases? For instance, there have been strong public campaigns to raise wages and end subminimum wages since the outbreak of the COVID-19 pandemic.

I’m not a political scientist, so my views on this are more impressionistic than based on any research I’ve done. What seems to be an important determinant of whether a minimum wage hike passes or not is how directly voters can weigh in on the question.

What I’ve observed is that minimum wage hikes are typically extremely popular at the ballot box. When voters get a chance to weigh in directly on whether their state should strengthen this labor standard, they typically vote in favor in large numbers. It seems that voters see minimum wage hikes as reasonable, fair, and potentially beneficial to themselves and the economy. This is a popular issue even across the two major political parties. Take Florida as an example: 71 percent of its voters voted in favor of establishing that state’s minimum wage in 2004. In that same year, the majority of these Florida voters voted for Republican presidential candidate George W. Bush.

As we move further away from voters having a direct say, as with minimum wage hikes proposed by state legislators, the results seem more mixed, and partisan politics appears to play a stronger role. Getting a minimum wage hike through a state legislature dominated by the Republican Party is, it seems, nearly impossible. Again, this isn’t my area of research, but what this suggests to me is that business interests are more successful in lobbying Republican-leaning legislators than Republican-leaning voters. That, in turn, suggests that Republican-leaning legislators benefit more from these business interests than Republican-leaning voters do. Still, states have increasingly taken up the role of strengthening this wage floor as the federal rate has stagnated. In 1980, for example, only two states — Connecticut and Alaska — operated with a state minimum wage higher than the federal rate. Now, as you note, about half the states in the country are increasing their minimum wage rates while the federal rate remains unchanged. I would bet that the states that have a higher minimum wage than the federal level, yet typically have Republican-leaning legislatures, got their minimum wage through a ballot initiative.

At the federal level, there is no option to put this question directly to the voters. States that operate with the federal rate have seen long stretches with no minimum wage hike: from 1982 to 1989 (eight years), again from 1998 to 2006 (nine years), and now from 2010 to today (at least 14 years). The last federal increase, in 2009, came as the final step of a phased increase signed into law by a Republican president and a Democratic-led U.S. Congress. Since then, none have passed.

Do increases in the minimum wage affect the labor market?

I’ll focus on one of the important ways that the minimum wage impacts the labor market: it has played an important role, at least historically, in reducing wage inequality. Prior to 1980, the minimum wage rose roughly in step with worker productivity, so the lowest wages increased in a meaningful way over time. And, because of ripple-effect raises, these increases would basically pull the bottom of the wage distribution up closer to the middle. One important feature of the minimum wage to notice is that while ripple-effect raises extend the benefits of minimum wage hikes to workers earning above the minimum wage, this pool of beneficiaries is really limited to workers at the low end of the wage distribution. High-wage workers, on the other hand, do not get raises from changes to the minimum wage. As a result, minimum wage-induced raises have the effect of compressing the wage distribution — i.e., reducing overall wage inequality. I say “historically” because this role has weakened in recent decades, especially in the states that operate only with the federal minimum, as the federal minimum wage has lost a significant amount of its inflation-adjusted value.

Another major way that the minimum wage, again historically, reduced wage inequality occurred when the Fair Labor Standards Act strengthened the labor standard by expanding which workers would be protected. Prior to 1966, the standard did not apply to a large segment of the low-wage workforce, excluding occupations in which Black workers were overrepresented due to racial discrimination. These occupations included farming and various service sector jobs. In their 2021 study, economists Ellora Derenoncourt and Claire Montialoux found that when these types of occupations became newly protected in 1966, the federal minimum wage reduced the earnings gap between Black and white workers by roughly one-fifth.

The minimum wage is regarded by many economists as an important tool of public policy for combating poverty and inequality. Do minimum wages deliver what they promise?

I’ve already touched on how the minimum wage impacts inequality, so let me focus here on the question of poverty. I think of the minimum wage as primarily a policy tool that helps low-income to middle-income households. Because the minimum wage has no statistically discernible impact on employment (positive or negative), it is a policy tool that benefits employed workers most directly. Those who are unemployed or not in the labor force can benefit only indirectly from minimum wage hikes, e.g., through the earnings of other household members, or because the hikes maintain the labor standards of existing jobs so that when such individuals become employed, they too benefit from the wage floor. Other policies need to supplement robust minimum wages to effectively tackle poverty: economic policies to move the U.S. economy to full employment, policies that subsidize traditionally unpaid work (e.g., child care and elder care), and policies that reduce wealth inequality so that more households are able to weather financial shocks, such as medical emergencies and spells of unemployment.

This interview has been lightly edited for clarity and length.

Source: https://truthout.org/




How Media Companies Can Meet Their Climate Commitments – And How Readers Can Help

Laura Lee Cascada – Photo: lauraleecascada.com

02-01-2024 ~ A few simple recipe-curation strategies can help news organizations achieve their own climate goals.

The global shift toward plant-forward diets, particularly in wealthy countries, is consistently recognized as one of the most effective ways to reduce global greenhouse gas emissions.

Given the significant influence media wields in shaping cultural norms, news outlets have a unique opportunity to promote plant-based eating, mainly through the recipes they offer their audiences. Through several simple behavioral nudges, news outlets can guide their readers toward healthier, climate-friendly eating—while aligning their recipe sections with their purported values around responsible climate reporting.

Environmental and Public Health Dangers of Factory Farming
Industrial animal agriculture is an enormous driver of planetary harm. According to a June 2018 article published in the journal Science, meat, aquaculture, dairy, and eggs contribute about 56 to 58 percent of food’s greenhouse gas emissions, yet they only provide 18 percent of our calories. Meat and dairy production is also a leading cause of biodiversity loss, freshwater use, chemical pollution, and resource shortages.

While up to 8,000 gallons of water are needed to produce just a pound of beef, a pound of tofu only requires about 300 gallons, according to an article in UCLA Sustainability. “[W]ithout meat and dairy consumption, global farmland use could be reduced by more than 75 percent—an area equivalent to the U.S., China, European Union, and Australia combined—and still feed the world,” states an article in the Guardian.

Because of the crowded, filthy conditions inherent to factory farms, animal agriculture further contributes to a range of public health risks of particular concern in the post-pandemic world. According to the Centers for Disease Control and Prevention, poultry and egg production is associated with foodborne pathogens such as salmonella and campylobacter, and these farms are incubators of many potentially dangerous influenza viruses.

Given these dangers, an article in the American Journal of Lifestyle Medicine points out that besides doing away with the cruel confinement practices associated with animal agriculture, there is a need “to research, develop, and invest in innovative plant-based and cultivated meat technologies to move away from raising billions of feathered and curly-tailed test tubes for viruses with pandemic potential to mutate within.”

Farmed animals are also the largest source of antibiotic usage in the U.S. at about 73 percent—based on figures provided by the Natural Resources Defense Council in December 2022—leading to the rise of antibiotic-resistant superbugs. The nonprofit Farm Forward revealed in August 2022 that even meat labeled as antibiotic-free may still contain these drugs.

Mainstream Media’s Role in Creating a Climate-Friendly Food System
Many large news organizations have made public their own internal goals of helping the climate. The New York Times, for example, says that the newspaper “recognize[s] the effects of climate change, and we are taking action to reduce our own impact. We are committed to reducing our carbon footprint and identifying opportunities to improve the sustainability of our business operations.”

In 2021, the Times debuted “The Veggie” newsletter and dedicated more space in their digital cooking section for vegetarian and vegan recipes, reviews, and tips. “The call for vegan and vegetarian foods is growing louder every day, and as the volume increases, the food world is responding,” wrote Maxwell Rabb in an article on the Beet about the Times initiative.

Internally and externally, media groups play an outsized role in shaping both public sentiment and cultural norms. Fact-based coverage of climate change has gradually improved in recent years, with many outlets now espousing a commitment to scientifically reporting on the issue. For example, several top publications now have climate-dedicated verticals, while the Guardian introduced new guidelines in 2019 for language and images used in climate stories to reflect the issue’s urgency. But for most news outlets, a glaring blind spot around climate remains—in their food sections.

A 2023 Faunalytics and Sentient Media report on the number of times animal agriculture was mentioned by climate media found that only 7 percent of the 1,000 climate articles surveyed mentioned animal agriculture. Six times as many articles mentioned transportation.

Given that animal agriculture’s emissions rival those of the entire transportation sector, the media’s heavy emphasis on electric vehicles and petrol has obscured our food system’s outsized role in the climate crisis.

Considering the lack of journalistic attention given to industrial animal agriculture, it is unsurprising that most news outlets have yet to align their recipe sections with their commitment to responsible climate coverage. Yet these publications have a unique opportunity—and responsibility—to profoundly shape cultural norms by promoting climate-friendly, plant-based foods.

One of the few recipe hubs bold enough to make the link explicit is Condé Nast-owned Epicurious, which in 2021 “left beef behind.” Citing ruminants’ inefficiency and rising beef consumption in the U.S., Epicurious pointedly framed its new policy at rollout as “not anti-beef but rather pro-planet.” Beef does not feature in any Epicurious-originated recipes, articles, or newsletters published after April 2021—though, counterproductively, the site still cross-posts beefy recipes from sister brand Bon Appetit.

Besides Epicurious, most recipe sites have been silent on this topic.

Investigating Mainstream Media’s Recipe Curation
I work at the Better Food Foundation (BFF), serving as the senior director of campaigns. We wanted to determine whether mainstream media outlets had begun aligning their recipe curation with the climate-conscious values they espouse. So, with support from Sentient Media, we worked with a data scientist to analyze the recipe sections of the top four UK-based—BBC, the Guardian, the Independent, and ITV—and top four U.S.-based—Associated Press, New York Times, Washington Post, and Yahoo News—media outlets with a track record for responsible climate reporting (reporting that adheres to the scientific consensus) in 2023.

Among these eight outlets, we found that five predominantly feature meat-based recipes. Only the Washington Post, the Guardian, and Yahoo News had fewer than half of their recipes categorized as “omnivorous.”

Notably, the Washington Post and the Guardian both expressed their intent to curate climate-friendly food content, indicating some level of awareness about their role in addressing the climate-diet link not only through reporting but also through modeling sustainable behavior.

We offered every outlet a chance to respond to our findings, and one significant reply came from Joe Yonan of the Washington Post, who explained, “We are always looking for ways to help our readers lead better lives, according to their own definition, and recipes play a large role in that… [M]ore and more readers are looking for help making climate-friendly decisions about all aspects of their lives, food included, and we want to respond to that.”

Looking Ahead
So, where do we go from here? For most mainstream news outlets, there is an untapped opportunity to curate more plant-forward recipe sections. Our report on the findings mentioned above and an accompanying webinar offer five scientifically proven strategies that use behavioral “nudges” to guide readers toward more climate-friendly meals.

These tips, part of a concept we call DefaultVeg, make plant-based foods the primary or default option without taking away the choice to select meat or dairy. A 2022 study conducted by Food for Climate League in collaboration with BFF, Greener by Default, and researchers at Boston College on three Sodexo campuses found that when implemented consistently, this strategy increased the take rate of plant-based meals from 30.8 percent to 81.5 percent—significantly cutting food’s environmental footprint.

Suggestions to News Organizations
We suggest the following changes in the recipe sections of news outlets:

Maintain a 2-to-1 ratio of plant-based to animal-based recipes. For every omnivorous entree, there should be two vegan recipes, or at the very least, one vegetarian and one vegan recipe.

Present plant-based options first, by default. Feature climate-friendly options prominently within search results and recipe collections by listing them first.

Designate editors’ picks or seasonal recommendations as plant-based by default. When creating features like a summer BBQ showcase or Thanksgiving dishes, emphasize plant-based grilling options or vegan dishes like roasted squash and vegan mashed potatoes.

Substitute animal-based ingredients with plant-based alternatives in popular recipes. Many recipes can easily become vegan by swapping out specific ingredients without sacrificing the flavor or quality of the dish. For instance, use vegan mayo instead of egg-based mayo in potato salad—taste testers won’t notice the difference.

Include a climate score for each recipe based on ingredient emissions intensity. Present climate-friendly options as a priority, following a successful approach demonstrated in online simulations and randomized clinical trials.

Shifting toward plant-based eating is among the most powerful changes we as individuals can make to benefit the planet. Media, which has a major impact on popular culture and behavior, can help facilitate this shift on an enormous scale by presenting plant-based foods as the norm rather than the exception. Making the simple changes mentioned above to their recipe sections will also help media outlets better adhere to their own stated climate reporting values—and maintain their status as trusted news sources with an audience that’s increasingly hungry for climate-friendly food.

And readers have a role to play, too. They can help by writing letters to editors of newspapers and magazines asking them to make their recipe sections more climate-friendly and increase their coverage of plant-based eating.

By Laura Lee Cascada

Author Bio: Laura Lee Cascada is the senior director of campaigns for the Better Food Foundation, where she runs innovative campaigns to shift us to a plant-based food norm. She has a master’s degree in environmental policy from Johns Hopkins University and has spent the past 15 years campaigning for a more sustainable world. Laura is a published novelist and the founder of the Every Animal Project, a collection of true tales reshaping our relationship with animals. She is a contributor to the Observatory.

Source: Independent Media Institute

Credit Line: This article was produced by Earth | Food | Life, a project of the Independent Media Institute.




The Animal Feed Industry’s Impact On The Planet

Vicky Bond – Photo: The Humane League

01-30-2024 ~ The diet of factory-farmed animals is linked to environmental destruction around the globe.

In some parts of the continental United States, you might drive through a nearly unchanging landscape for hours. Stretching for miles and miles, vast swaths of soil are dedicated to growing crops—corn, grains, fruits, and vegetables that make up the foundation of our food system.

The process seems highly efficient, producing enormous quantities of food every year. But only a small percentage of these crops will go toward feeding humans. According to a 2013 study conducted by researchers at the Institute on the Environment at the University of Minnesota and published in the journal Environmental Research Letters, a mere 27 percent of crop calorie production in the United States actually feeds humans. So what happens to the rest?

Some crops are used for the production of ethanol and other biofuels. But the vast majority—more than 67 percent of crop calories grown in the U.S.—are used to feed animals raised for human consumption.

Rather than feeding people, these crops feed the billions of chickens, cows, pigs, and other animals who live and die on factory farms. And that’s a problem.

The issue is that feeding humans indirectly—essentially, making animals the caloric middlemen—is a highly inefficient use of food. “For every 100 calories of grain we feed animals, we get only about 40 new calories of milk, 22 calories of eggs, 12 of chicken, 10 of pork, or 3 of beef,” writes Jonathan Foley, PhD, executive director of the nonprofit Project Drawdown, for National Geographic. “Finding more efficient ways to grow meat and shifting to less meat-intensive diets… could free up substantial amounts of food across the world.”
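
To see what those losses look like in practice, here is a small worked sketch that restates Foley’s figures as conversion efficiencies; the 1,000-calorie input is an arbitrary example chosen for round numbers.

# Sketch: feed calories returned as food calories, using the ratios
# quoted above (e.g., 100 calories of grain yields ~3 calories of beef).
CALORIES_OUT_PER_100_IN = {
    "milk": 40,
    "eggs": 22,
    "chicken": 12,
    "pork": 10,
    "beef": 3,
}

feed_calories = 1000  # arbitrary example input

for product, out_per_100 in CALORIES_OUT_PER_100_IN.items():
    returned = feed_calories * out_per_100 / 100
    print(f"{feed_calories} feed calories -> {returned:.0f} calories of {product} "
          f"({100 - out_per_100}% of the energy lost)")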

This shift toward growing and consuming food more sustainably has become especially important, with up to 783 million people facing hunger in 2022, according to the United Nations. Research indicates that if we grew crops exclusively for humans to consume directly, we could feed an additional 4 billion people worldwide.

Farming has always loomed large in American politics, history, and identity. But the idyllic farming we may imagine—rich piles of compost, seedlings poking through the soil, and flourishing gardens of diverse fruits and vegetables—has transformed into factory farming, a highly industrialized system far removed from earth and soil. Animal feed is essential for the sustenance of this industry—supplying the cattle feedlots, broiler chicken sheds, and egg factories that increasingly make up the foundation of our food system.

What Factory-Farmed Animals Eat
Take a moment to picture a farm animal enjoying dinner. Are you imagining a cow grazing on grass or perhaps a chicken pecking at the ground, foraging for seeds and insects? In today’s factory farming system, the “feed” these animals eat is far removed from their natural diets. Rather than munching on grass or insects, most animals on factory farms eat some type of animal feed—a cost-effective mixture of grains and proteins, often laced with antibiotics, designed to make them grow as quickly as possible.

The ingredients in animal feed don’t just matter to the animals’ health. They also impact human health—especially since the average American consumes 25 land animals yearly. Researchers have noted that animal feed ingredients are “fundamentally important” to human health impacts. As author and journalist Michael Pollan puts it: “We are what we eat, it is often said, but of course that’s only part of the story. We are what what we eat eats too.”

So, what are the main ingredients used in animal feed today?

Corn and Other Grains

In 2019, farmers planted 91.7 million acres of corn in the U.S. This equals 69 million football fields of corn. How can so much land be devoted to a single crop—especially something many people only eat on occasion?

The answer is that corn is in almost everything Americans eat today. It’s just there indirectly—in the form of animal feed, corn-based sweeteners, or starches. The U.S. is the world’s largest producer, consumer, and exporter of corn. And a large percentage of all that corn is used for animal feed, supplying factory farms across the country.

While “cereal grains”—such as barley, sorghum, and oats—are also used for animal feed, corn is by far the number one feed grain used in the U.S., accounting for more than 96 percent of total feed grain production. Corn supplies the carbohydrates in animal feed, offering a rich energy source to increase animals’ growth.

Unfortunately, what this system offers in efficiency it lacks in resilience. Numerous researchers have expressed concern about the vulnerability of the food supply that is so reliant on a single crop. “Under these conditions, a single disaster, disease, pest, or economic downturn could cause a major disturbance in the corn system,” notes Jonathan Foley in another article for Scientific American. “The monolithic nature of corn production presents a systemic risk to America’s agriculture.”

Soybeans

When you think about soybeans, you might imagine plant-based foods like tofu and tempeh. However, the vast majority of soybeans are used for animal feed. Animal agriculture uses 97 percent of all soybean meal produced in the United States.

While corn is rich in carbohydrates, soybeans are the world’s largest source of animal protein feed. As with corn, Americans may not eat much soy directly in the form of tofu, tempeh, and soy milk—in fact, 77 percent of soy grown globally is used to feed livestock, and only 7 percent is used directly for human consumption, according to a 2021 Our World in Data article—but they do consume soy indirectly through animal products like meat and dairy.

Soy production comes at a high cost to the environment. It is heavily linked to deforestation, driving the destruction of forests, savannahs, and grasslands—as these natural ecosystems are converted to farmland—and “putting traditional, local livelihoods at risk.” Critical habitats, like the Cerrado savannah in Brazil, are being razed to clear space for soybean production to meet the global demand for animal feed. More than half of the Cerrado’s 100 million hectares of native landscape has already been lost, with livestock and soybean farming being major contributors to this destruction.

“Most soybean-driven land conversions in Brazil have happened in the Cerrado,” said Karla Canavan, vice president for commodity trade and finance at World Wildlife Fund, in 2022. “The corridor [Cerrado] is like an inverted forest that has enormous roots and is a very important carbon sink. … Unfortunately, more than 50 percent of the Cerrado has been already converted into soybean farmlands.”

It’s a common misconception that plant-based soy products like tofu drive global deforestation. In reality, the vast majority of soy is used for animal feed. To fight this tragic habitat destruction, it’s far more effective to replace meat with soy-based alternatives.

Animal Protein and Waste

Editor’s note: The following section contains graphic descriptions that may disturb some readers.

It’s not just plants like corn and soybeans that go into animal feed. The factory farming industry has a long history of feeding animals waste and proteins from other animals. In 2014, outrage ensued when an investigation by the Humane Society of the United States revealed that pig farmers were feeding animals the intestines of their own piglets. At a huge factory farm in Kentucky, workers were filmed eviscerating dead piglets and turning their intestines into a puree that was being fed back to mother pigs.

This wasn’t an isolated atrocity, either. The executive director of the American Association of Swine Veterinarians commented in 2014 that the practice was “legal and safe” and was meant to immunize the mother pigs against a virus called porcine epidemic diarrhea, according to the New York Times. And pigs aren’t the only animals who are effectively turned into cannibals by the factory farming industry.

Farmers were only prohibited from feeding cow meat to other cows following concerns about bovine spongiform encephalopathy (BSE), more commonly known as mad cow disease. The U.S. Department of Agriculture notes on its website that BSE may have been caused by feeding cattle protein from other cows. The practice was banned in 1997—but, notably, only because of the risks to human health and not out of concern for the cows.

Antibiotics

Another key ingredient in animal feed likely doesn’t come to mind when you think about animal nutrition: antibiotics, which are commonly mixed into the food given to animals across the country.

On factory farms, animals are confined in extremely crowded, filthy facilities—the perfect conditions for spreading illness and disease. Not only do antibiotics allow animals to survive the conditions in these facilities, but they also encourage animals to grow unnaturally large and fast. Drugs are administered through food and water, starting when the animals are just a few days old.

The meat industry’s excessive antibiotic use has been directly linked to antimicrobial resistance (AMR), a massive threat to human health. As antibiotics kill off susceptible bacteria, the survivors that remain gradually adapt to withstand the attacks, becoming resistant to antibiotics over time.

AMR means that conditions that should be easy and affordable to treat—like ear infections—can become life-threatening. The World Health Organization calls it “one of today’s biggest threats to global health, food security, and development,” according to a News-Medical article, and the British Society for Antimicrobial Chemotherapy projects it will kill four times as many people per year as COVID-19 did in 2020.

Additives and Preservatives

Along with the mixture of corn, soybeans, and a cocktail of antibiotics, animal feed may also contain a range of additives and preservatives. The Code of Federal Regulations provides a long list of additives legally permitted in animals’ food and drinking water. These include “condensed animal protein hydrolysate” (produced from meat byproducts of cattle slaughtered for human consumption), formaldehyde, and petrolatum—to name a few.

Unfortunately, many of these additives and preservatives have been linked to adverse human health impacts. For example, formaldehyde, which is classified as a known human carcinogen by the National Toxicology Program, is commonly used in animal feed to reduce salmonella contamination. In 2017, following concerns about farmworkers being exposed to the harmful substance, the European Commission voted to ban feed producers from using formaldehyde as an additive in animal feed.

Animal Feeding Operations
To understand the true impact of animal feed, we must look at animal feeding operations. Of all the animals in our food system today, 99 percent live on factory farms—enormous, vertically integrated operations designed to make as much profit as possible (at the expense of animals, people, and the environment). The transition to using animal feed has been closely intertwined with the transition to this type of large-scale factory farming.

The official term for a factory farm is concentrated animal feeding operation or CAFO. As the name implies, these operations are laser-focused on feeding large numbers of animals until they reach “slaughter weight,” after which they are killed and turned into products.

The faster an animal reaches slaughter weight, the more quickly the industry profits. So factory farms have dialed in on the most efficient way to feed animals in the shortest amount of time. Rather than grazing on pasture, animals are confined in stationary cages or crowded sheds and given feed that will increase their growth rates—even while it hurts their health.

Take cows, for example. Along with sheep and other grazing animals, they are known as “ruminants”—because they have a rumen, an organ perfectly designed to transform grass into protein. But the industry feeds cows corn instead of grass because it brings them to “slaughter weight” much faster than grazing. Sadly, this high-starch diet can disturb a cow’s rumen, causing painful conditions such as severe bloat, acidosis (similar to heartburn), and other digestive problems.

When it comes to feeding animals on factory farms, here are some key industry terms to know:

Growth rate: This is the rate at which an animal grows, or how quickly the animal reaches “slaughter weight.” Sadly, most factory farm animals are bred to grow so quickly that their health suffers. Chickens raised for meat frequently develop bone deformities, muscle diseases like white striping, and heart problems. Many chickens have difficulty walking, or even just standing, due to painful lameness caused by their fast growth rate.

Feed conversion ratio: This is the ratio between the amount of feed an animal eats and the amount of body weight the animal gains. In practice, the industry works to minimize this ratio, feeding animals as little as possible to make them grow as quickly as possible. (A worked example follows this list.)

Selective breeding: This is the practice of breeding two animals to produce offspring with a desired trait. For example, the poultry industry breeds birds who quickly develop outsized breast muscles. In the meat industry, selective breeding is generally used to optimize both feed conversion ratio and growth rates.
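
As a worked example of how the first two terms fit together, here is a minimal sketch of the feed conversion ratio calculation; the broiler figures are hypothetical, chosen only to illustrate the formula.

# Sketch: FCR = feed consumed / body weight gained. A lower ratio is
# what the industry calls "more efficient." The figures are hypothetical.
def feed_conversion_ratio(feed_consumed_kg, weight_gained_kg):
    return feed_consumed_kg / weight_gained_kg

# A broiler chicken that eats 4.2 kg of feed and gains 2.4 kg of body weight:
print(feed_conversion_ratio(4.2, 2.4))  # 1.75, i.e., 1.75 kg of feed per kg of gain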

Animal Feed Industry Impacts
Overall, factory farming is incredibly resource-intensive and harmful to the environment. From agricultural runoff to water waste and pollution, CAFOs are responsible for some of humanity’s worst climate impacts.

“Livestock farms generate about 70 percent of the nation’s [United States] ammonia emissions, plus gases that cause global warming, particularly methane,” according to the Public Broadcasting Service. The practice of growing crops for animal feed is one of the worst drivers of environmental destruction—leaving biodiversity loss, deforestation, and greenhouse gas emissions in its wake.

Deforestation
Growing enough crops to feed the huge numbers of animals that support human meat consumption requires vast amounts of land, which results in massive deforestation. Forests worldwide are systematically being cleared and replanted with monocrops (such as the corn and soybeans mentioned earlier) to meet the demand for animal products—and therefore, animal feed.

Brazil, for example, is the world’s biggest beef exporter. In the Amazon rainforest—nearly two-thirds of which is part of Brazil—crops for animal feed are one of the primary drivers of deforestation, damaging an essential habitat for countless species. Deforestation rates have averaged nearly 2 million hectares yearly since 1995 in the Amazon, or about seven football fields every minute.

Meanwhile, farmland expansion accounts for 90 percent of deforestation worldwide, “including crops grown for both human and animal consumption, as well as the clearing of forests for animal grazing,” according to a July 2022 article in Sentient Media.

Deforestation eliminates one of our best defenses against climate change, as healthy, intact forests provide a crucial ecosystem service: carbon sequestration. Forests safely store more carbon than they emit, making them powerful “carbon sinks” critical to maintaining a stable climate. When we destroy forests for farmland and other uses, we remove that carbon sink and release the carbon that had been stored there into the atmosphere.

Biodiversity Loss and Extinction Threat
Naturally, deforestation goes hand in hand with biodiversity loss—of which animal agriculture is also a key driver. A 2021 study found that land use conversions to support the “global food system” are a primary driver of biodiversity loss. Tragically, researchers project that more than 1,000 species will lose at least a quarter of their habitats by 2050 if meat consumption continues at the same rate.

At the UN Biodiversity Conference (COP15) in Montreal in December 2022, delegates warned that if our land-intensive eating habits don’t change, more and more critical species will go extinct. As author and journalist Michael Grunwald points out in the New York Times: “[W]hen we eat cows, chickens, and other livestock, we might as well be eating macaws, jaguars, and other endangered species.”

Water Use
Along with vast amounts of land, growing crops for animal feed requires enormous quantities of water. In the U.S. alone, more than 60 percent of freshwater was used to grow crops in 2012, and around 2.5 trillion gallons of that went toward crops for animal feed that same year. Corn, soybeans, and the other grains used in animal feed require about 43 times more water than grass or roughage, which animals could access if they were allowed to graze.

Soil Degradation
The intensive farming practices required to grow vast amounts of crops—like corn and soybeans—even take a toll on the soil.

Healthy soil contains millions of living organisms, which naturally replenish and recycle organic material and nutrients. Soil filters water, stores carbon, and allows for carbon, nitrogen, and phosphorus cycles that are critical for life on Earth.

But intensive farming practices, like growing “monocultures” (huge amounts of one crop like corn or soybeans), can degrade soil and deplete critical nutrients. Not only do these farming practices disrupt the soil’s natural processes, but they can also reduce the amount of carbon stored in soil—a huge problem in the face of climate change. Left unchecked, this kind of intensive agriculture, closely intertwined with factory farming, can damage the soil beyond repair.

Change Is Possible
The impacts of our animal-based food production system are far-reaching and complex. The intensive farming practices that supply animal feed for factory farms are destroying our water, air, and soil—and harming countless animals raised in food supply chains. But there is hope. It’s not too late to build a better food system from the ground up.

The movement to build a healthier food system is growing every day. Around the world, people are advocating for systemic change—from plant-based food options to better treatment of farmed animals. In fact, according to a March 2022 article in Phys.org, “switching to a plant-based diet in high-income nations would save an area the size of the EU worldwide.” Moreover, if just one person follows a vegan diet, an average of 95 animals will be spared each year, according to the book Ninety-Five: Meeting America’s Farmed Animals in Stories and Photographs.

Concerned citizens and consumers can also hold corporations accountable for animal abuse and environmental degradation—by pressuring companies to adopt more sustainable practices. Already, several large meat producers and fast food and supermarket chains have stopped keeping pigs in gestation crates after people expressed “disgust” at the practice. According to the New York Times, “[T]he tide is turning because consumers are making their preferences known.”

By Vicky Bond

Author Bio: Vicky Bond is a veterinary surgeon, animal welfare scientist, and the president of The Humane League, a global nonprofit organization working to end the abuse of animals raised for food through institutional and individual change. She is a contributor to the Observatory. Follow her on Twitter @vickybond_THL.

Source: Independent Media Institute

Credit Line: This article was produced by Earth | Food | Life, a project of the Independent Media Institute.