Boston’s first Italian American mayor, Thomas M. Menino, addresses a crowd at Faneuil Hall. (Courtesy of Pam Donnaruma and the Post-Gazette)
It was with great sadness that we received the news today of Mayor Tom Menino’s passing. An enormously popular public servant, Menino was not only Boston’s first Italian American mayor but would also become its longest-serving mayor in history. To remember him, we’d like to present the following passage from The Boston Italians, Stephen Puleo’s tribute to the vibrant Italian American citizens of Boston who, like Menino, transformed the city around them. First published in 2007, some of Puleo’s facts might seem dated, even poignant in hindsight, but we think it captures the spirit of Mayor Menino, a man who ushered Boston from the troubles of the last century and into the promise of the new millennium.
An enormous mural in Mayor Tom Menino’s outer office virtually covers one wall and beckons visitors to study its details. Painted by Menino’s cousin, the scene depicts the mayor’s grandfather sitting in his Italian village, awaiting passage to America. Across a wide body of water that dominates the painting is the skyline of an American city, its shores a two-week voyage away in real life but just a few inches away on the canvas. The mayor describes the painting with pride; it is, he says, the beginning of the Menino story in the United States. Without Thomas Menino’s monumental decision to leave Grottaminarda, Avellino, and travel to a strange country, his grandson would never have had an opportunity to make his own special history in Boston. Thomas Menino settled in Boston’s Hyde Park section, at the far western corner of the city, a neighborhood his grandson still cherishes and lives in today, and from which he built the political base that has enabled him to lead the city for more than a dozen years.
One thing I had noticed about the academic study of religion is that scholars invariably study their own. I do not just mean that Mormons write books about Mormons or Catholics about Catholics. It goes deeper than that: mainline Protestants typically observe people much like themselves, as do Orthodox Jews. My membership in none of the above, it turned out, had given me something of an academic advantage. I may have lacked the insights that come from lifelong involvement with one particular faith. But in return I was widely viewed as someone writing about religion with no particular axe to grind. When a referee was needed, there I was. Those I studied generally treated me as an outsider but also as one making a special effort to understand them. Far from feeling excluded from their world, I felt, if anything, a bit wary about the warm embrace they offered.
Yet the fact that I had spent so much time among deeply religious Christians made me increasingly aware of two ways in which my differences with them were insurmountable: I was Jewish by background and nonreligious by conviction. For me, the two had always been intertwined. My parents were not themselves religious, nor for that matter strongly committed to any ideology. (I recall my father telling me that when he grew up, everyone he knew was either a socialist, a communist, or a Zionist, but that he had managed to avoid all such identification.) Nonetheless my parents felt Jewish enough to arrange a bar mitzvah for me, and so without much conviction on their part or mine, I did my religious duty at the age of thirteen. That has pretty much been it. I do at times read the Old Testament—the prophets in particular appeal to me—but I cannot say that the angry God pictured therein is one I find especially attractive. It is not just that I have a hard time envisioning God creating the world and then meddling with it when we human beings displease him. The religious side of Judaism is as much about practice as it is about belief, and even in this realm I feel no urge to honor the tradition by following rules that at best seem arbitrary and at worst absurd. Although I know my share of rabbis, and even though I admire their learning and commitments to social justice, I cannot bring myself to regularly attend the synagogues of any of American Judaism’s major branches. The only times I enter a shul are when I am invited to speak in one. I study religion but do not practice it, not even the one in which I was ostensibly raised.
Christine Byl reading from Dirt Work late last year, photo courtesy Mollie Foster
Near the top of the list of my greatest riches is the gang of artists I call friends: poets and painters, musicians and quilters, collagists and photographers. Our conversations, across medium and genre, stimulate me to consider the world at angles skew to my default impulses, and push my work to places I would not know how to take it on my own. We talk about books we’ve read—the new or the old, the overrated, the flat-out brilliant—and music we’ve rediscovered (’80s REM, anyone?). We talk about art that makes us wince, shiver, flounce, or rage. We talk about the process of making, and our tools (words, paint, sound) and the tasks the tools are applied to—elegy, play, witness, and praise.
Over the past year or so, one conversational theme has recurred among us more than any other, rivaling even the old standbys, “Balancing Procrastination and Discipline” & “Does Art Really Matter?” Over beers, walking the dog and in stolen asides at conferences, we return again and again to this: How to negotiate the terrain that up-thrusts when art abuts commerce? We vent and bemoan how it seems you can’t be a writer any more without also being a spokesperson. We worry that we spend too much or not enough time shepherding work through the world. Even as we celebrate each other’s external triumphs—this prize, that grant, a fundraiser goal met, a book contract signed—we admit, in bit-off sentences, to a vague internal shame that underlies moments when a thing we make becomes a thing to buy. Because a thing to buy is necessarily a thing someone must sell. And more and more, we’re told, that someone is us.
BOSTON, MA September 12, 1974: A large crowd gathers in South Boston's Columbus Park to protest federal court-ordered busing of black students to all-white neighborhood schools. A prominent sign at right reads 'Whites Have Rights,' while members of the 'South Boston Information Center,' a militant anti-busing organization, are visible wearing white caps among the crowd.
This year, on the 40th anniversary of the explosion that was Boston busing, it’s time to be clear: busing wasn’t just about black and white. It was also about green—who had some in their pockets, and who didn’t.
Busing was the best thing that ever happened to Whitey Bulger.
In the years leading up to the 1974 busing plan, my neighborhood—South Boston—was perceived as the bastion of white supremacy and privilege in Boston. After all, some of the city’s most powerful politicians were from South Boston, and the most egregious symbol of white supremacy in Boston, school committee member (later city councilor) Louise Day Hicks, was a resident of the affluent and beautiful shoreline of South Boston’s City Point. Although the reasoning behind the State Board of Education’s busing plan will forever remain a mystery, I have had to presume that this perception was the motivation for a plan that—disastrously—included busing students from predominantly black Roxbury to Irish-American South Boston and vice versa, even though both groups were desperately poor with desperately underfunded schools.
As if Mercury itself somehow knew how badly I wanted out, Thursday night of Spirit Week was the last town fire I ever took part in. At dusk, the homecoming amoeba prepared to parade through town. Its course had already been set, and it was an expanded version of the path all the students took the day of the explosion at the McCandless car dealership: starting at the elementary school and ending at the high school where a homecoming bonfire waited to be lit.
In the elementary school parking lot, someone handed me a bucket of candy, and I had to pull my hands out from the cuffs of my sweater to hold it. The air was getting colder now. Inside the bucket, I found peppermints, Smarties, and butterscotch in golden cellophane, all the hard candy flavors I used to collect as a child when I’d stood on the side of the road, watching the floats as they passed by and dreaming of the day when I would get my turn.
FERGUSON, MO - OCTOBER 13: Author and activist Cornel West protests outside the Ferguson police station on October 13, 2014 in Ferguson, Missouri.
Sometime in the afternoon of Monday, October 13—Columbus Day or Indigenous Peoples’ Day, depending on whom you ask, and the culmination of a weekend of organized protests in Ferguson, Missouri—news started trickling through Twitter and other media platforms that Cornel West and several other clergy members and activists had been arrested while trying to peacefully enter the Ferguson Police Station and request a meeting with Ferguson Police Chief Thomas Jackson. Photos from the moment are striking, instantly iconic: Dr. West knocked off his feet, grimacing mid-fall, his rain-streaked glasses reflecting the neon green riot jackets of the officers lined in front.
WASHINGTON, DC - JANUARY 22: Pro-choice activists hold signs as marchers of the annual March for Life arrive in front of the U.S. Supreme Court.
“You walk into our surgery center and it’s so cold and scary. There’s no art. The lights are bright, the recovery rooms smell like bleach. All the staff are wearing gowns and head and foot covers and the patients have to wear the same thing … and there’s nothing comforting about it. The warmth is gone.”
As recent events in Texas have made clear, when it comes to abortion care, the worst outcome of the current onslaught of state-imposed targeted regulations of abortion providers (TRAP laws) is the forced closing of clinics. But even clinics in affected states that manage to stay open suffer costs. The words above were spoken to me by an administrator of an abortion clinic in Pennsylvania, one of 23 states that have passed legislation stipulating that abortion clinics must conform to the requirements of an ambulatory surgical center (ASC). ASC legislation, in essence, demands that clinics conform to the physical standards of hospitals, with regulations about such matters as hallway widths, heating and ventilation equipment, and janitor storage space. Moreover, as part of the ASC regime, clinics must adopt certain hospital-like policies, such as sterile environments, that are more stringent than those pertaining to other outpatient facilities. Although the Supreme Court temporarily blocked Texas from enforcing these ASC provisions, many of the state’s clinics have been facing the prospect of shuttering under the extreme financial burden of physically enacting the required changes.
SEATTLE, WA - OCTOBER 13: People cheer during Indigenous Peoples' Day celebrations at the Daybreak Star Cultural Center. Earlier that afternoon, Seattle Mayor Ed Murray signed a resolution designating the second Monday in October to be Indigenous Peoples' Day instead of Columbus Day.
Last week, after Native American activists successfully lobbied the city of Seattle to change Columbus Day to Indigenous Peoples’ Day, historian Roxanne Dunbar-Ortiz wrote an open letter to President Obama urging him to put an end to the federal holiday honoring Christopher Columbus—a man linked to the enslavement, mutilation, and genocide of the Indigenous people he encountered on his exploration and subsequent conquest of the “New World.” A corresponding WhiteHouse.gov petition has generated tremendous response, confirming that support for the idea of honoring Indigenous people over Columbus Day’s “metaphor and painful symbol of [a] traumatic past,” as Dunbar-Ortiz describes in the letter, has spread throughout the general public. Dunbar-Ortiz, whose An Indigenous Peoples’ History of the United States was published last month, spoke with us recently about the book and about how Indigenous people remain a dynamic, diverse, and necessary force in the world today.
Beacon Broadside: What are a couple of the most enduring myths about Indigenous history?
Roxanne Dunbar-Ortiz: I think the myth of disappearance—the myth of not being here now, of being people of the past. That’s a kind of unspoken, unconscious genocide that takes place over and over and over again, not just in the past. But of course, genocide doesn’t mean the total elimination of a people. It can mean that, but in world history it’s not meant that. There are still Jews in the world. There are still Armenians. There are still Cambodians. Even though we call each of those cases genocide. But with Native peoples it’s different. So I think that’s the main myth, this idea of eliminationism, to do away with the Indians. And it’s sort of confusing and painful for people who don’t know US history, or know only a version of it—that is, the settler colonial narrative—because they don’t know what to do with the fact that there are still Indians. The narrative really does away with Native people.
Historian and activist Roxanne Dunbar-Ortiz wrote a letter to President Obama requesting that the US end its celebration of Christopher Columbus, a symbol of colonization and genocide for Native American nations and communities. Tell the world that we should honor the many contributions of Indigenous People instead of the conquest of one man. Sign the petition on WhiteHouse.gov to add your voice!
“Our nation was born in genocide. . . . We are perhaps the only nation which tried as a matter of national policy to wipe out its indigenous population. Moreover, we elevated that tragic experience into a noble crusade. . .” —Martin Luther King, Jr.
Dear President Obama:
“Columbus Day” was made a federal holiday in 1934, when Native American nations and communities had little voice to protest the celebration of the onset of colonization and genocide in the Western Hemisphere. In the era of global decolonization of the second half of the 20th century, Native Americans remained colonized. Columbus Day is a metaphor and painful symbol of that traumatic past, although the United States did not become an independent republic until nearly three centuries after Columbus’s first voyage. None of Columbus’s voyages touched the continental territory now claimed by the US. Yet, the United States soon affirmed that a 15th century Papal Bull, known as the “Doctrine of Discovery,” applied to the Indigenous nations of North America. This remains US law in claiming that Native nations are “domestic, dependent nations” with no inherent rights to the land.
Cornel West’s Black Prophetic Fire is both a new look at six revolutionary African American leaders and a rousing call for more “fire” in what West calls the Black prophetic tradition, a reframing of the social order in terms of radical justice. As Dr. West writes in the introduction,
The deep hope shot through this dialogue is that Black prophetic fire never dies, that the Black prophetic tradition forever flourishes, and that a new wave of young brothers and sisters of all colors see and feel that it is a beautiful thing to be on fire for justice and that there is no greater joy than inspiring and empowering others—especially the least of these, the precious and priceless wretched of the earth!
Note: An earlier version of this piece ran in the Huffington Post.
Yom Kippur starts tonight, with the moving and mournful Kol Nidre service, and I am grateful to belong to a radically inclusive community of interfaith families, families that will mark this High Holy Day together. As it happens, Chelsea Clinton and Marc Mezvinsky welcomed a baby girl to their interfaith family, just a week ago today. As interfaith parents, they now face decisions about the religious affiliation and education of their interfaith child. Baby Charlotte arrived just after the autumnal equinox, when the nip in the air reminds us of the passage of time, and many interfaith families are making the annual decision about whether to affiliate with a church, a synagogue, or neither. Or both.
In 1935, Howard Thurman, one of the most influential African American religious thinkers of the twentieth century, took a pivotal “Pilgrimage of Friendship” to India that would forever change him—and that would ultimately shape the course of the civil rights movement in the United States. When Thurman became the first African American to meet with Mahatma Gandhi, he found himself called upon to create a new version of American Christianity, one that eschewed self-imposed racial and religious boundaries, and equipped itself to confront the enormous social injustices that plagued the United States during this period. Gandhi’s philosophy and practice of satyagraha, or “soul force,” would have a momentous impact on Thurman, showing him the effectiveness of nonviolent resistance. After the journey to India, Thurman’s distinctly American translation of satyagraha into a Black Christian context became one of the key inspirations for the civil rights movement, fulfilling Gandhi’s prescient words that “it may be through the Negroes that the unadulterated message of nonviolence will be delivered to the world.”
Today, on the 145th anniversary of Mahatma Gandhi’s birth, we look back to that meeting in 1935, when the idea of nonviolent civil disobedience passed from India’s spiritual leader to the man who would deeply influence an entire generation of black ministers and civil rights leaders—among them Martin Luther King Jr.
The conversation then turned, in the words of Desai, to “the main thing that had drawn the distinguished members to Gandhiji,” his philosophy of ahimsa (nonviolence) and satyagraha (civil disobedience campaigns). “Is non-violence from your point of view a form of direct action?” Thurman asked. “It is not one form,” Gandhi replied, “it is the only form.” Nonviolence, Gandhi said, does not exist without an active expression of it, and indeed, “one cannot be passively nonviolent.” Gandhi went on to lament that the term had been widely misunderstood. Ahimsa was a Sanskrit word with deep resonance in all of South Asia’s ancient karmic religions, Buddhism, Hinduism, and (especially) Jainism, in which ahimsa stood for a commitment to refrain from harming living things. He felt there was no good English language equivalent for ahimsa, so he created the term nonviolence (the earliest usage in the Oxford English Dictionary, citing Gandhi, is from 1920), but told Thurman that he regretted the fact that his coinage started with the “negative particle ‘non.’ ” On the contrary, Gandhi insisted nonviolence was “a force which is more positive than electricity” and subtler and more pervasive than the ether.
Center City gay-bashing suspects (courtesy Philadelphia Police)
On September 11, 2014, around 11pm, a gay male couple walking home through Philadelphia’s fashionable Center City was accosted and badly beaten by a group of 12 well-dressed white 20-somethings, both men and women, who shouted anti-gay epithets before and during the attack. Both victims ended up in the hospital, one of them beaten so badly he suffered broken bones in his face and had to have his jaw wired shut. The case has attracted a lot of attention both because the alleged perpetrators were so clean-cut and apparently well-to-do and included women as well as men—not the stereotype of who commits “hate crimes”—and for the way it was solved: via social media. After the police released street-side surveillance video showing a group of young people walking away from the crime scene, citizens of the Twitter universe began circulating the videos, matching faces to Facebook updates, and eventually pointing the police to suspects. Three suspects—two men and one woman: Philip Williams, Kevin Harrigan, and Kathryn Knott—have since been arrested, and each has been charged with two counts of Aggravated Assault, two counts of Simple Assault, two counts of Recklessly Endangering Another Person, and one count of Criminal Conspiracy.
Protesters at the People’s Climate March, photo by Tom Hallock
Climate change is happening, and faster than scientists expected. Polar ice caps are melting faster, island nations are going underwater, the ocean is acidifying and warming. In the US, we are suffering catastrophic droughts from California to Texas, along with severe flooding in the East. The answer is to stop burning fossil fuels, but the World Meteorological Organization says that, in 2013, CO2 rates in the atmosphere were rising faster than ever. So what can we do? On Sunday, September 21st, hundreds of thousands of people from around the globe converged in Manhattan to show the world exactly how critical the issue of climate change is to them, and to demand action. Beacon’s Senior Editor Alexis Rizzuto and Associate Publisher Tom Hallock were there to bear witness. We recently spoke with them about their experiences at the march and why climate change is fast becoming one of the most important issues of our time.
Labeled a “domestic terrorist” by the McCain campaign in 2008 and used by the radical right in an attempt to castigate Obama for “pallin’ around with terrorists,” Bill Ayers is in fact a dedicated teacher, father, and social justice advocate with a sharp memory and even sharper wit. Public Enemy, now available in paperback, tells his story from the moment Ayers and his wife, Bernardine Dohrn, emerged from years on the run and rebuilt their lives as public figures, often celebrated for their community work and much hated by the radical right. In the following excerpt from the book’s prologue, Ayers tells the story of how he came to play an unlikely yet prominent role in the 2008 presidential election.
Spring 2008, Chicago
It was a mid-April evening, the sweet smells of springtime upon us and the last light reluctantly giving way outside the front window, when my graduate seminar ended and everyone pitched in to clean up. A dozen of my students were spread out in our living room, cups and dishes scattered everywhere, small piles of books and papers marking specific territory. Until a moment before, all of us had focused intensely on the work at hand: thesis development, the art of the personal essay, and the formal demands of oral history research. As a professor for two decades, my favorite teaching moments often popped up during these customary potluck seminars at our home—something about sharing food in a more intimate personal setting, perhaps, or disrupting the assumed hierarchy of teacher authority, or simply being freed from the windowless, fluorescent-lit concrete bunkers that passed for classrooms at my university. But the seminar was done for this evening, and as students began to gather their things, a self-described “political junkie” clicked on the TV and flipped to the presidential primary debate, well under way by now, between Hillary Clinton and the young upstart from Chicago, Barack Obama.
It’s that time of the year again, a time when readers, writers, and publishers everywhere are reminded of the fragility of free speech, even within a country that purportedly protects it. Though this will be the 32nd year of the annual freedom to read celebration, the reality is that book banning is still distressingly common. “It takes guts to take a stand against censorship,” free speech activist Chris Finan recently remarked in response to the banning of Emily M. Danforth’s teen novel The Miseducation of Cameron Post. Finan is president of the American Booksellers Foundation for Free Expression and author of From the Palmer Raids to the Patriot Act: A History of the Fight for Free Speech in America, the first comprehensive history of free speech in America for general readers, and a book that should be required reading for Banned Books Week.
This Sunday, September 21st, concerned citizens from across the globe are convening in New York City for what’s being called the largest climate march in history. Over 100,000 participants will march two miles through the streets of Manhattan “to demand bold action on climate change.” For those who are planning to march, or for those who wish to take action from afar, we’ve compiled a list of essential titles that raise awareness about impending climate change—the most pivotal environmental crisis humankind has yet faced:
Senator Maria Cantwell’s proposed bill to strip the NFL of its nonprofit status is the latest strike in the ongoing effort to pressure the Washington Redskins to change their mascot. Cantwell joins a growing chorus of opponents to the disparaging name. Back in January, the National Congress of American Indians created a powerful PSA that outlined the issue in just a few words: “Native Americans call themselves many things. The one thing they don’t...” The ad ends with a close-up image of the Washington Redskins logo. The implication is clear.
During the late seventeenth century, Anglo settlers in New England began the routine practice of scalp hunting and what military historian John Grenier identifies as “ranging”—the use of settler-ranger forces. By that time, the non-Indigenous population of the English colony in North America had increased sixfold, to more than 150,000, which meant that settlers were intruding on more of the Indigenous homelands.
It’s been an interesting summer for those of us who study the legal and cultural developments surrounding mobile devices. At the end of June, the US Supreme Court unanimously(!) ruled that law enforcement officials must get a search warrant before reviewing the contents of a cellphone seized during an arrest. [See Riley v. California, 573 U.S. ___ (2014)]. This may well have been the most important pro-privacy decision in the past 45 years, and it deserved far more attention and celebration than it received.
The discussion of the Court’s cellphone decision, however inadequate, was utterly swamped by the media monsoon following the news that nude photos of numerous celebrities (perhaps more than 100, including such cultural icons as Jennifer Lawrence, Kate Upton, Mary E. Winstead, and Kirsten Dunst) had been hacked from their Apple iCloud accounts. Wholly apart from sucking the oxygen out of the global news cycle for the better part of a week, the massive celebrity hack made it clear that when it comes to privacy, nothing sells like sex.