Starting in 1967, I began writing regularly for the Times’ Sunday Arts & Leisure section. The section’s editor, Seymour Peck, a flinty New Yorker, had me write columns on movies, theatre, rock music, and television as well as on art, extending my capacities, while cracking down on my flakiness. He practically invented me as a functioning professional.
My uptown feats didn’t impress people whom I looked up to in the downtown art scene, where anti-bourgeois hardheadedness and minimalist disdain for the “literary” reigned. They were contemptuous of the Times. I was Peter the poet, a relative nobody. Advice to aspiring youth: in New York, the years that you spend as a nobody are painful but golden, because no one bothers to lie to you. The moment you’re a somebody, you have heard your last truth. Everyone will try to spin you—as they should, with careers to think of. For about a dozen years, I hung out, drank, and slept with artists who didn’t take me seriously. I observed, heard, overheard, and absorbed a great deal.
One drunken night, a superb painter let me take a brush to a canvas that she said she was abandoning. I tried to continue a simple black stroke that she had started. The contrast between the controlled pressure of her touch and my flaccid smear shocked me, physically. It was like shaking hands with a small person who flips you across a room.
A sign that year promised a new museum in 2010. But when 2010 came, only a small portion of display space had changed, given over to marking the 50th anniversary of Congolese independence. The exhibit did a considerably more honest job than the one five years earlier, but it, too, was temporary, gone after a few months. Finally, in 2013, the museum announced that it was closing down for a complete revamping, and would reopen in 2017.
Behind the scenes, occasionally leaking into the press, tensions remained. That was hardly surprising, given that Europeans were spending a huge sum—the renovation bill would eventually total $83 million—to portray Africa to the world. Half a dozen scholars from Belgium’s African-diaspora community were recruited as an advisory committee, but they had to sign nondisclosure agreements, were given no authority, and came to feel that their advice was being ignored. Eventually the committee stopped meeting. One imaginative historian-anthropologist who worked at the museum for a time suggested that Africans should be invited to build a museum-within-the-museum portraying how they saw Belgium, but this idea was considered too radical. The year 2017 passed, and the museum remained closed.
In the early 2000s, Amsterdam locals frustrated with noise pollution demanded that the city do something to address it. The low frequencies and long wavelengths worked against traditional barrier solutions, but airport staff realized that plowed fields in the area seemed to dampen the noise — especially ones with a particular spacing of ridges.
Reverse-engineering this phenomenon, landscape architects from the firm H+N+S, working with artist Paul De Court, drew on the work of acoustician Ernst Chladni to create more than 100 grassy pyramids specifically designed to address this site-specific problem.
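The physics here comes down to wavelength: low-frequency engine noise travels in waves several meters long — too long for conventional sound barriers, but comparable to the spacing of plowed furrows. A back-of-envelope sketch (the frequency range is an illustrative assumption, not a figure from the article):

```python
# Why ridge spacing matters: acoustic wavelength = speed of sound / frequency.
# Ground-level jet noise is concentrated at low frequencies (roughly
# 30-100 Hz is assumed here purely for illustration).

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 degrees C

def wavelength(frequency_hz: float) -> float:
    """Acoustic wavelength in meters for a given frequency."""
    return SPEED_OF_SOUND / frequency_hz

for f in (30, 60, 100):
    print(f"{f} Hz -> wavelength of {wavelength(f):.1f} m")
```

At 30 Hz the wavelength is over 11 meters — which is why a fence-height barrier does little, while a field of ridges spaced on the same order as the wavelength can scatter and dampen the sound.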
Emojis used as punctuation can be split up into two subgroups based on positioning around other non-emoji text: emojis surrounding and framing non-emoji text, and emojis as terminal punctuation.
In the category of emojis as punctuation framing non-emoji text is the ✌️Victory Hand used on either side of a word or string of words as air quotes. The first time I noticed this was on writer and commentator Jemele Hill’s Twitter account.
Hill could have used non-emoji quotation marks, but she didn’t. The choice to use emojis here is intentional. The inherently playful nature of emoji adds a layer of irony to her tweets. Her mocking tone and sense of humor shine through in this choice. You can feel her rolling her eyes when you read her tweets, and that might have been lost with your basic quotation marks. Additionally, air quotes are functionally different from quotation marks because they represent a speech act combined with a finger gesture. Quotation marks in written text don’t tend to directly evoke the combination of someone physically speaking and gesturing in this way.
We are in the midst of determining if the kind of monoculture that thrived during the broadcast era can exist when many forms of media are opt-in: we can watch whatever we want, when we want. The “digital monoculture” could refer to the array of popular, recognizable reference points that have arisen and are accessed through the on-demand internet, whether it’s Game of Thrones or “Baby Shark.”
The universality that monoculture entails is valuable, because what everyone already knows is what they are likely to keep consuming — hence the overwhelming popularity of reboots and sequels. It’s also why Netflix paid $100 million in 2018 to keep Friends on its platform for another year (WarnerMedia later outbid Netflix, $85 million a year for five years, to put the show on HBO Max).
Big-budget productions try to worm material into the now-fragmented monocultural framework by manufacturing new universal reference points, as The Mandalorian has with its insta-meme Baby Yoda. Piggybacking on old monoculture is less risky than starting from scratch. The most successful example of digital-native monoculture might be Netflix’s Stranger Things, which merited the iconic public display of billboards in Times Square (not coincidentally, it’s also a show that references decades of pop culture).
Businesses turn their own intellectual property into self-reproducing mini-monocultures because monopolies are easiest to monetize. The streaming platforms and their various signature franchises form walled gardens, a metaphor that also recalls the scientific definition of monoculture: nothing else is allowed to grow there; there is no cross-pollination. Surely the monoculture isn’t quite dead if Netflix viewers have consumed a combined 500 million hours of Adam Sandler movies.
Mercedes’s von Hugo, then, thinks that the ethical problems will be outweighed by the fact that cars will be better drivers overall. “There are situations that today’s driver can’t handle, that . . . we can’t prevent today and automated vehicles can’t prevent, either. The self-driving car will just be far better than the average human driver,” he told Car and Driver.
He also points out that, even if the car were to sacrifice its occupants, it may not help anyway. The car may end up hitting the crowd of school kids regardless. “You could sacrifice the car. You could, but then the people you’ve saved initially, you don’t know what happens to them after that in situations that are often very complex, so you save the ones you know you can save.”
But the facts in this instance are unambiguous: The president of the United States attempted to use his political power to coerce a foreign leader to harass and discredit one of the president’s political opponents. That is not only a violation of the Constitution; more importantly, it is profoundly immoral.
The reason many are not shocked about this is that this president has dumbed down the idea of morality in his administration. He has hired and fired a number of people who are now convicted criminals. He himself has admitted to immoral actions in business and in his relationships with women, of which he remains proud. His Twitter feed alone—with its habitual string of mischaracterizations, lies, and slanders—is a near-perfect example of a human being who is morally lost and confused.
- A useful chart for your next museum (or anything) meeting. Thanks, Margaret!
By far the most effective part of the Affordable Care Act, in terms of helping Americans get care, was simply expanding Medicaid. But what many Democrats and liberals were most excited about was the bill’s many experimental and technocratic attempts to “bend the cost curve”—reduce costs without price controls—and “improve quality,” mainly by giving insurers incentives to strive for outcomes that market forces alone weren’t pushing them to aim for. The signature example of this may be the “Cadillac tax,” which was designed to nudge companies to force employees onto cheaper insurance plans with greater cost sharing—a tax built on the belief that one of the primary drivers of health care cost inflation was people taking advantage of their too-generous employers and greedily consuming more health care than they needed. The tax never went into effect. The individual mandate, similarly designed to force the healthiest young invincibles to enter the market to bring down costs, is equally dead. And a decade into the ACA, it has become more apparent than ever that the best way to reduce America’s absurd health care costs would simply be a single-payer program.
Required Reading is published every Saturday, and consists of a short list of art-related links to long-form articles, videos, blog posts, or photo essays worth a second look.