felixsalmon.2002
Thursday, February 28, 2002
Gerhard Richter
There is very little in the way of curatorial interjection in MoMA's Gerhard Richter retrospective. There are only two wall texts: one at the beginning, saying, essentially, that writing about Richter is futile; and one half-way through, giving the necessary background to the great October 18, 1977 suite of paintings. The labels are discreetly far from the works: this retrospective is all about letting the paintings speak for themselves. While it is possible to applaud MoMA's curatorial minimalism, it's also important to realise that in this case necessity was probably the mother of virtue.
Any attempt to pigeonhole Richter, to place him in a context, is bound to fail. It's not hard to see the influences here, from Vermeer all the way through Clyfford Still to Andy Warhol. But it's also impossible to see Richter as a follower of any movement. Similarly, although Richter's paintings could only have been painted by a German, this is not German Art in the sense that Norman Rosenthal might understand it. What Robert Storr, the curator, has done is to make a strong case that Gerhard Richter is one of the great painters of the 20th Century. This will come as no surprise to most art-savvy Europeans, but by all accounts Richter has never been given his due over here, somehow being overshadowed by the likes of Anselm Kiefer and Georg Baselitz. This exhibition should change all that.
The title of the exhibition, Forty Years of Painting, makes a very clear statement that this is not going to be a comprehensive retrospective: it intends to deal only with Richter's paintings. That's great: they're more than strong enough on their own. The show then does exactly what it says it will do: it presents us with forty years of painting, in chronological order, some 188 paintings in all. This simple device shows Richter to be almost unique among painters. Some artists do essentially the same thing throughout their careers; others, like Picasso, change their styles frequently. With Richter we see great breadth, to be sure, but rather than moving from one painterly idiom to the next, he is quite happy juggling three or four simultaneously. To take just one example, his Onkel Rudi (a black-and-white photograph of his late uncle grinning in his Nazi uniform not long before he was killed, painstakingly transferred to canvas and then blurred with a squeegee) is more or less contemporaneous with his colour paintings, where he simply applied a series of different colours, in rectangles, onto a gridded canvas. Both have been hugely influential: the first broke new ground in making highly-charged political statements out of highly-charged personal history, while the second, even if it hadn't been copied by Damien Hirst, was always a very important piece of early Conceptualism.
Richter's multiplicity of styles does make it hard to see the connections between his works. They are few, it must be said: Richter has an urge to paint, but he also has an urge to dismantle metanarratives, to discredit political or painterly ideals. And before I go on, I must make it clear that Richter can by no means be reduced to any simple technique: there's a lot more to his paintings than anybody can put down in writing. Yet from the beginning of this show to the end, we see one thing in every room: a highly adept facility with the squeegee, Richter's tool of choice for unpainting his works, for smearing the surface of the canvas.
Sometimes, Richter will use this technique on his photorealist canvases, with the ostensible effect of blurring the subject matter. In some paintings, the squeegee has been so heavily applied that the original photograph can be hard to discern. But after seeing this show, it starts to dawn on the viewer that the wielding of the squeegee is an act of creation as much as it is one of destruction. This is most obvious in Richter's abstract canvases: by far the most beautiful paintings in the retrospective, they're the result of an almost obsessive sequence of adding, blurring and removing hundreds of layers of paint. But it is the same mechanism which gives the abstract paintings their beauty that gives the photographic paintings their soul. In blurring the sharp edges of contemporary images, Richter creates something very personal, and very new. You could say he's done for the family snapshot or newspaper image what Jasper Johns did for the American flag: he celebrates it, elevates it to art, and yet changes it profoundly in the process, creating something unique and beautiful. Don't just go see this exhibition; go see it twice. These paintings grow on you, even when you're not there: while you're asleep in the week after you see the show the first time, they'll become part of your visual vocabulary. This is not the sort of art we're used to these days, the kind of thing where you look at it, get it, and then decide that it's either great or crap. Richter's work is much more subtle: his paintings want to be looked at like Titians, not like Warhols. While it's relatively easy to immerse oneself in the painterly surface of the abstract works, much of the rest of the show has that Modern Art intellectual edge to it: there's a temptation to say "I don't get it" and move on. Either resist that temptation, or, better, simply return a week later. There'll be a whole new show waiting.
Wednesday, February 27, 2002
Michael Finkel and the New York Times Magazine
As any regular Romenesko reader knows, the New York Times Magazine has fired a contributing editor, Michael Finkel, for using "improper narrative techniques" in his profile of a Malian teenager who worked on a cocoa plantation. The basic facts of the case are simple: Finkel conflated the stories of many different boys to tell a story which wasn't true. Furthermore, parts of his story weren't even true of any boy at all, let alone the subject of the piece. That's the sort of thing which you might get away with at Seventeen, but it's just not done at the New York Times. Finkel understands that, and doesn't blame the Times for firing him. Neither do I.
But after reading an old profile of Finkel, and a new story in the New York Observer today about the whole affair, a slightly more well-rounded picture emerges. Finkel's prose is spare and clean, and a huge amount of work goes into it: "the dirty dark secret is that it takes me forever to make it sound like I just wrote it in a minute," he says. Since the New York Times gig was by far his most high-profile job, it stands to reason that Finkel would have put even more effort than usual into the story he submitted to them after travelling to Mali. The problem was that he hadn't found what he was looking for: he'd pitched a child-slavery story, and found only boys working for very low pay and in great hardship. If anything, the story was that aid agencies were playing rather fast and loose with the slavery label, because it suited their own ends. The Observer talked to the photographer who travelled with Finkel to Mali, Chris Anderson.
These drafts, it would seem, involved much more than tweaks: they were full-scale rewritings and restructurings. A huge amount of time, work and frustrated effort was going into this story, and Finkel was very keen to get it out of the way so that he could move on to other projects: specifically, a mountain-climbing expedition in Nepal. Only the New York Times editors ever saw the original drafts, so only they know for sure, but I would be very surprised if the first couple of drafts had anything ethically wrong with them. The problem was that Finkel had travelled to Mali to report one story, and was now being pressured to write another: one which his reporting really couldn't sustain. According to Adam Moss, the editor of the New York Times Magazine, Finkel wrote the story which ended up being published "as an exercise, to put down on paper what he wanted to write," although that hardly explains how it got into the hands of the magazine without any indication that it wasn't true.
Freelance journalism is a risky gig, for reporters and editors both. Stories get pitched and commissioned before they're fully reported, and most of the time the story which emerges at the end is different from that envisaged at the beginning. Problems arise when, after the reporting has been done, the resulting story turns out to be very different indeed from the one which was commissioned. Reporters are generally happy to go back to a magazine with a tale along the lines of "as I was reporting on X, I stumbled across a much bigger story: Y". What they're less keen on doing is saying "the more I looked into X, the more I realised there really wasn't a story there at all." It's the job of editors to help journalists find the angle and write a good story, even when the reporting doesn't necessarily back up the original commission. A lot of the time, the editor will propose that the reporter phone a few more people and get the extra material needed for the new story, as it's now envisaged. The problem in this case was that no one was proposing that Finkel return to Mali, and there was no way that the extra material could be obtained over the phone. Eventually, Finkel gave the New York Times what it wanted: a compelling third-person account of a typical "child slave".
I've written a similar kind of account myself, although in a very different context. My story was written directly from my notes, a couple of hours after I talked to the subject of the piece. Even though the conversation was still fresh in my mind, I relied heavily on what I wrote down as he was talking. Finkel, by contrast, doesn't take any notes at all when he's talking to his subjects, only writing down his "impressions" later in the day. Furthermore, he wrote the whole New York Times story without referring to his notes once. With all the good will in the world, I have great difficulty imagining how such a story could ever be wholly accurate. To that we must add the fact that the story was being written in order to pacify an editor who had already rejected several drafts. In such a context, it becomes clearer how the noteless writer ended up producing something rather better than the story his reporting could actually support.
While Michael Finkel is certainly to blame for this affair, I think there is a lesson here for all features editors: beware stories which change significantly after the reporting has been finished. All editors should go back to their writers and ask for more material when more material is needed.
But if what they got originally was short on crucial elements, then they should bear in mind the fact that there's a good chance the journalist hasn't had any access to those elements. If you're an editor and you get back a revised draft which adds in stuff which wasn't in the original and which couldn't have been obtained in the mean time, proceed only with extreme caution.
Thursday, February 21, 2002
Kinsley on mammograms
Michael Kinsley is no longer editor of Slate. He gave up that post amidst rather a lot of publicity last week, saying that he would continue at the online magazine as a columnist. That was good news: Kinsley, when he's on form, which is most of the time, is one of the most perspicacious columnists in America. He's particularly good at using subtle but decisive logical arguments to cut through standard political rhetoric. Kinsley's first column as a full-time writer, however, makes one wonder whether his talent as a columnist was in some way reliant upon his day job. It's about the Great Mammography Debate, and it's terrible.
In a nutshell: The medical industry in the US is very keen on regular mammograms for women over 50, and possibly even for women over 40. But recent research out of Denmark reviewed the available data and concluded that the benefits of regular mammograms were statistically zero: women who got tested had the same chance of dying of breast cancer as women who didn't. Since then, a National Cancer Institute advisory panel has concluded there is insufficient evidence to prove that mammography reduces breast cancer deaths, although the controversy still rages.
Kinsley's contribution to the debate is worthless. He starts off with four paragraphs of sheer irrelevance, about official NIH advice, dating back to 1997, about when and how often women should have mammograms. Kinsley then moves on to pretty much his only argument in the whole piece: one he repeats, but never bothers to argue.
Kinsley, of all people, knows that correlation is not causation. It may be true that almost everyone knows women who have survived breast cancer; it may even be true that most of those women's cancers were discovered by means of a mammogram. It is certainly true that if a malignant tumor is not discovered and removed, then it is very likely to kill. But from none of this can we conclude that mammograms save lives. If the breast cancer would have been discovered in time by other means, then the mammogram has done no good. The problem is that advances in treatment have not kept pace with advances in mammography. A mammogram will show growths maybe 20mm in diameter, growths easily confused with many other markings on the film. The trouble is that medicine really doesn't know what to do when it encounters a growth that small: quite often the best thing to do is nothing, and see if it grows. Continues Kinsley:
This is incredibly disingenuous. For one thing, it completely ignores all of Kinsley's own throat-clearing at the beginning of the article: his whole argument from here on in applies as much to 16-year-olds as it does to 40-year-olds. There is a serious case that mammograms have no benefit: it has been made, in excruciating statistical detail, by a pair of Danish scientists. To say that there is none, and to base that assertion on a statement that has nothing to do with mammography, is to commit a logical solecism quite atypical for Kinsley. Kinsley's column becomes even more unhinged towards the end, however, after he's done some really rather bizarre and off-topic chin-scratching about how much anxiety a woman feels when she's been diagnosed with a certain probability of having breast cancer. Are you ready?
It's obvious that if someone decides to have a biopsy or surgery on the basis of a mammogram's results, then the mammogram has caused the biopsy or surgery. What's not at all obvious is that said biopsy or surgery has a "large chance" of being life-saving, and a "small chance" of being unnecessary. Kinsley simply asserts this, without citing any evidence at all. The whole argument in this piece, insofar as there is one, seems to rest on the idea that the Danish research thinks that the downside to having a mammogram in some way balances out the upside. But that's not what the research says at all: the research says that even if there were no downside at all (no cost and inconvenience of the mammogram, not to mention the pain and anxiety of the biopsies and surgeries which would otherwise not happen), there still wouldn't be any upside to having mammograms: people live just as long without them. Now, it's possible that the Danes are wrong about this, although most of the arguments against them seem to rely far too heavily on the fact that breast-cancer deaths have fallen since mammography was introduced, which is something that can be explained by advances in treatment as opposed to diagnosis. But to miss the point entirely, and to start tilting at straw men: that's most unlike Mr Kinsley. I think he should go back to editing.
Monday, February 18, 2002
Lord of the Rings
I guess it's partly my fault: I made a bet with Stefan just after Lord of the Rings came out, and as part of that bet I promised him that if the film got nominated for either Best Director or Best Picture at the Oscars, then I would go see it. So it got nominated, so I went to see it. I mean, how bad can it be? It's now at #2 on the IMDB top 250, in some extremely exalted company. How bad can it be?
It was dreadful. If we're talking IMDB top 250 here, you need to go down to #65 and Braveheart before you even start getting close to a movie as bad as this one. Actually, it was a lot worse than Braveheart. It felt like the bastard offspring of Phantom Menace and Conan the Barbarian, only stretched out to 178 of the longest minutes I've ever had the misfortune to spend in a cinema. If I hadn't been with two friends, and if I hadn't been in the middle of a row, I'd've got up and walked out less than halfway through. I would happily give up my bottle of 1990 Pol Roger never to have seen the thing.
The film opens with an epic battle scene (natch), but rather than start us off with a bang like Gladiator did, Peter Jackson decides to treat us to the sort of dreadful computer-generated animation which we've all grown to hate through seeing the likes of The Mummy Returns or the Phantom Menace. Immediately, we're in the mindset of watching a video game, rather than an Oscar-worthy film. Not coincidentally, the only other film of 2001 I can think of which is as bad as this one was Tomb Raider. The weird thing is that the nauseating overreliance on CGI continues well past the tens-of-thousands-of-orcs scene and well into the innocent-hobbits-frolicking-in-bucolic-bliss act which takes up most of the next half hour. The Shire looks like nothing so much as the set of Teletubbies, with acid-green rolling hills and cute wide-eyed little people.
Into the Shire rides Dumbledore, I mean Obi-Wan Kenobi, I mean Gandalf, a wizard (you can tell by the pointy hat) played by some Grand Old British Actor or other. And Ian McKellen (for it is he) turns out to be the best thing in the film: quite up to Alec Guinness's hammy-but-not-too-hammy standards. The way in which Peter Jackson sends him on his way after a fight to the death with the incarnation of the Forces of Darkness, faced off against him on a slender bridge over a bottomless pit, is a little bit much, however. And Darth Vader is an infinitely more convincing baddy than the Balrog. (In fact, rather than try and come up with something original, Jackson simply nicks the Balrog straight from the Night on Bald Mountain part of Fantasia.)
Come to think of it, was there any scene in the whole film which didn't feel as though we'd seen it a hundred times before? The dark riders galloping out of the fortress gate, the collapsing staircase, the swordfights in the woods, the helicopter shot of our plucky heroes dwarfed by nature as they paddle down the river... it was cliché after cliché after cliché. The genius of a film like Raiders of the Lost Ark is that it takes a tired genre and revitalises it; this film looks as though it was cobbled together out of offcuts from Hollywood's recycling bin.
The rest of the cast generally looks like it's been chosen solely on the basis of how blue their eyes are, although Jackson makes sure that Frodo, the purest of them all, also has the bluest eyes. The Fellowship, note, is 100% white, while most of their opponents are shades of brown.
It's frankly racist, the way in which the Aryan Fellowship happily hacks its way through hordes of sub-humans at every available opportunity, and it completely defeats the whole point of a Fellowship made up of various different races. That dwarf is tokenism, although he does get the only (deliberate) laugh line in the whole film. It reached the point where I could feel the whole audience sighing in exasperation every time we had to endure another close-up of Frodo's large blue eyes. It's a bit like Zoolander: he has the same expression throughout the film, whether he's impaled by a monster's weapon or whether he's just seen Queen Galadriel Nerwendë Artanis Alatáriel of the Galadhrim for the first time. And when we're not subjected to Elijah Wood's face, we're subjected to his hand, outstretched, with the ring sitting in the palm, over and over again.
I could moan about the plot, too, I suppose, but there's little point: this type of film never has any plot to speak of. But I think it is reasonable to point out that there's no structure to the film either: it's plucky Frodo and his gang versus yet another threat and... repeat, with no indication that they're actually getting anywhere, or that this latest obstacle is particularly dangerous compared to all the others. I mean, thousands of orcs in the caves scarper when they see the Balrog, and we managed to cope with the Balrog alright, but then a couple of hundred orcs later on in the film manage to capture a pair of hobbits and kill one of our blue-eyed heroes.
A lot of the weakness of the film can certainly be laid squarely at the door of JRR Tolkien and his ridiculously over-written epic. Jackson wanted to be reasonably faithful to the book, which explains the hilariously archaic dialogue and the directionless episodic nature of the screenplay. All the same, it makes for very bad cinema. And the dreadful mattes, especially elf-land, which looks like it was nicked straight from Thomas Kinkade, Painter of Light, are unforgivable.
Is there really appetite for another six hours of this? Why does it seem to have any chance at all of winning multiple Oscars? Although I hated Braveheart, Forrest Gump and Titanic with a passion, at least I could see where the Academy was coming from. But if they give this thing multiple gongs after having ignored all the great films in its genre for the past 20 years, I will be mute with incomprehension. Is Peter Jackson really a better director here than Steven Spielberg was in any of the Indiana Jones films? Is Lord of the Rings better than Crouching Tiger, Hidden Dragon? This film has achieved the impossible: it's made me actually want a Ron Howard film to clean up at the Oscars. That takes some doing.
Friday, February 15, 2002
Books and Chomsky
Cast your mind back, if you will, one month. The bestseller lists at the time were dominated by right-wing screeds; Michael Kinsley even wrote a column about it, which is helpful, because we can use it to remember where those books once stood. The most successful, then as now, was Bias: A CBS Insider Exposes How the Media Distort the News, Bernard Goldberg's screed against the liberal tilt of America's TV networks. It was Top Book, back then; now, it stands at #7 on Amazon's bestseller list. One month ago, Kinsley found five right-wing books in the Amazon top 15; now, Bias stands alone in that group. Its former position at #1 is now held by The Wisdom of Menopause: Creating Physical and Emotional Health and Healing During the Change, by Christiane Northrup, M.D.
Goldberg's fellow-travellers, it turns out, have done pretty poorly. Pat Buchanan, bless his cotton socks, is still up there: his charmingly-titled The Death of the West: How Dying Populations and Immigrant Invasions Imperil Our Country and Civilization has slipped merely to #25 from #11. Meanwhile, Barbara Olson's The Final Days: The Last, Desperate Abuses of Power by the Clinton White House has dropped down to #154 from #7, Peggy Noonan's When Character Was King: A Story of Ronald Reagan has fallen to #263 from #10, and Bill O'Reilly's The No-Spin Zone: Confrontations with the Powerful and Famous in America now languishes at #3,046, down from #8 a month ago.
Meanwhile, who do we find at #8 on the list but Michael Moore, of all people. His Stupid White Men ...and Other Sorry Excuses for the State of the Nation! has got one of the main criteria of success (a really long title) down pat, but nobody seems to have told him that you need to be a right-winger to write a bestseller. The ellipsis and exclamation mark, however, are obviously there to make up for the lack of a colon: Moore's book is one of only three in the top 15 without one. But the point is that Moore is well up there already, and his book hasn't even been published yet.
Could it be that the right-wing books on the bestseller lists were only a temporary blip? After all, the next political book after Buchanan's on the list is 9-11, the admirably short-titled pamphlet by, of all people, Noam Chomsky. Could it be that the nation's flag-waving fervor in the wake of the terrorist attacks has been replaced by second thoughts? Have Americans suddenly developed an interest in how the world sees them and what their nation has been up to overseas in the past few decades?
The weird thing is that Chomsky is definitely losing the intra-Left war against the likes of Christopher Hitchens over whether or not America was to blame for the events of September 11. Conventional wisdom on the Upper West Side these days has it that Chomsky might be a great linguistic theorist, but that he's a little bit over the edge when it comes to politics. Which is why it's unfortunate for Chomsky that the latest issue of the New York Review of Books carries a fabulous article (the first of two, no less) about Chomsky by his old sparring partner John Searle. Searle purports to be reviewing Chomsky's latest philosophical book, New Horizons in the Study of Language and Mind, but in fact he aims squarely at the whole history of Chomsky's theories, past and present. He spends a good amount of time gloating, in the politest possible fashion, over the failure of Chomsky's early linguistic project, and then jauntily skewers the bizarre animal into which it has now morphed.
The whole thing makes me very happy. I never got very much into Chomsky during my undergraduate days, but what little I understood of him seemed ridiculous on its face. I was always given to feel rather sheepish about my opinions, however: many much more learned and intelligent philosophers than I seemed to take Chomsky's theories about language being hard-wired into the brain perfectly seriously. The fact that Chomsky himself now seems to have abandoned them makes me very thankful I didn't waste many hours trying to understand them further. My only regret is that Chomsky seems to be far too busy bashing his political drums to spend much time rebutting Searle: we probably won't see his response in the pages of the New York Review. Indeed, although Searle is too polite to say so, even the book under review is not a genuine scholarly work so much as it is a collection of relatively short essays and papers which repeat themselves as much as they gloss over any really knotty problems. Chomsky's obsession with conspiracy theories centering on Henry Kissinger has distracted him from his linguistics: maybe deep down, he always knew the latter was going nowhere.
Wednesday, February 13, 2002
Air travel redux
Last Monday, I flew to New York from Toronto, where I'd gone to watch my sister become a Ph.D. I was booked onto a 9am flight, but I got there early enough that I had time to make it onto the 8am flight.
Tuesday, February 5, 2002
Mail.com redux
So I got something approaching a proper reply from mail.com. (If you don't know what I'm talking about, I wrote about it in January, but suffice to say that they promised me a free-for-life email forwarding service, and then told me they were charging for it.) Here it is, with annotations:
"Dear Felix,"
This is already a huge improvement on anything I've received from them in the past: my very first email from mail.com personally addressed to me!
"Thank you for contacting Mail.com regarding our email forwarding..."
Well, they certainly understand the problem.
"In accordance with the User..."
My, what a long sentence! Smells like it was drafted by a lawyer to me. What they don't mention is that I never consented to the User Agreement. I signed up with iname before it merged with mail.com, and as far as I know I never had to click through any legalese at all. (The internet was far more innocent then.) The idea that "any and all claims that attempt to extend the notion of a free email account to a free forwarding service are misguided" is hilarious; even the idea that I'm attempting to extend anything is pretty laughable. When I signed up for the iname service, a free email account and a free forwarding service were coterminous. There WAS nothing else: web-based email systems had not yet arrived on the scene. The irony here is that it's mail.com doing the extending: it's their attempt to extend the notion of a free email account to a free way of accessing one's email over the web which is misguided. I know that internet newbies these days tend not to understand that there's a lot more to the internet than the web, but mail.com is surely no internet newbie. What use, for instance, would mail.com's service be to my sister when she moves to Antarctica later this year? She will have email access, but no web access. And what about blind people who find it a lot easier to read emails than web pages?
"Rest assured, your email account in and of itself..."
Wow, breaking their promise to their clients is essential to their continued existence? I'd better change my email address anyway: I don't hold out much hope for their continued existence in that case. Given the challenge of increasing the number of people using their email service, they chose to do so not by making it better but by making it worse. Both Microsoft and Yahoo have wildly popular free web-based email systems. Mail.com had two edges over them: it had a large range of domain names to choose from, and it offered free email forwarding. Now it only has one edge. They've given up a huge advantage: a lot of people, like me, hate web-based email. We consider it slow and clunky, with insufficient functionality; you can't even store all your old emails. There are lots of very good email clients out there, and a lot of them are free. Why give all that up and migrate to a web-based email system? Someone isn't thinking straight here, and I'm pretty sure it's mail.com.
Monday, February 4, 2002
Self-esteem
The New York Times Magazine this weekend has an interesting, if slightly confused, article about how self-esteem is overrated. It's based on research by, among others, Nicholas Emler of the LSE and Roy Baumeister of Case Western Reserve University. Here are the nut grafs:
This rings true, as far as it goes. The Enron debacle, for instance, was probably caused by the high self-esteem of the company's top brass: their feeling that they were so amazing that they could do anything. On the other end of the spectrum, it takes an incredible amount of arrogance (which is what we term an excess of self-esteem) to send thousands of innocents to their deaths, as Osama bin Laden did in September. I can also easily believe that statistically, people with high self-esteem are no more successful (financially, academically, whatever) than those with low self-esteem.
However, the author of the article, Lauren Slater, misses two crucial points. The first is that people want to have high self-esteem because they want to be happy. It almost seems too obvious to mention, but Slater doesn't mention it, so I will: if you feel good about yourself, you're probably more likely to feel good in general. The second point is that there is such a market for self-help self-esteem books and the like because people are aspirational. The American Dream is that anybody can become President, or a multimillionaire. Statistically speaking, virtually nobody will, of course. Class mobility, I believe, is not much greater in the US than it is anywhere else. But the country is built on aspirationalism, which is why poor southerners vote Republican and why welfare mothers in the inner cities, when asked, favour the abolition of welfare. When you're aspirational, you look to the very top, and what you see is, overwhelmingly, men with very high self-esteem. Take Bill Clinton: the sheer chutzpah he showed when he denied having sexual relations with "that woman" is exactly the sort of thing that the average Oprah watcher would love to have. For while high self-esteem might be an extremely bad indicator of success, I would hazard that extreme success (rising to the very top of a corporate or political greasy pole) is probably a very good predictor of high self-esteem.
It's the desire to Be Like Them which drives people into the arms of groups like the Landmark Forum, a rather scary organisation which seems to break people psychologically in order to persuade them to spend more money on its courses. People join the Forum because they think, probably correctly, that it will give them a certain extra drive, will boost their self-confidence. Looking at the Forum members I know in New York, that will mean that they will take on scary amounts of debt, safe in the knowledge of their future success. That can have one of three effects: it can cause a horrible spiral of indebtedness; it can cause a bubble of prosperity which lasts for a while and then bursts; or it can succeed, creating a lot of genuine wealth. Since it's very hard to differentiate the third from the second (see Enron), and since the Forum is a very large organisation, it's easy to see how they can find a lot of success stories with which to wow prospective members.
And so the search for self-esteem continues, for good reasons and for bad. As Slater points out in her article, the main beneficiaries of this societal obsession, apart from the motivational speakers, are the psychoanalysts. But you can hardly blame the likes of the Landmark Forum for getting in on the act as well.
Monday, February 4, 2002
Smith
Brian McNally is a smart guy. He's got a new restaurant open in the East Village, and he's working the media as masterfully as ever. This time, post-9/11, the spin is that it's low-key, it's not about the glitz and the glamour, it's just another neighborhood restaurant. Yeah, right, Mr McNally: you've been open less than a week, and already the Sunday Styles section of the New York Times is gushing over sightings of "Julianne Moore, Ellen Barkin, Sophie Dahl and Mr. McNally's onetime roommate, Anna Wintour of Vogue."
So of course I had to pop over there and check it out. It's called Smith, although you have to read the media coverage to know that: the name appears nowhere on the awning, the door, or the menu. It's chaos inside, as you might expect from a brand-new joint. The woman who takes my name at the door promptly disappears and isn't seen again for the rest of the evening; the barmen seem to spend half their time apologising for the lack of various spirits and the other half trying to work the cash register; the dishes are curiously unaccompanied by cutlery; the bread arrives in the middle of the first course; the waitress knows nothing about the food. All these things will be ironed out, I would guess, sooner rather than later.
What needs no ironing out is the food, which is fantastic. Remember that this is Brian, not Keith: a man who aspires to something slightly greater than the souped-up French bistro food of Balthazar and Pastis. I don't know how he's done it, but within days of opening up, I have to say he's got the best kitchen in the East Village. It's certainly not pretentious: their sardines are served in the can (and they're delicious, when accompanied by Smith's flavourings). But it is ambitious: I ordered sweetbreads followed by rabbit ravioli, both of which were mouth-wateringly delicious, bringing out the earthier flavours while remaining sweet and succulent. And the bread: oh, the bread. When it arrived, it was quite simply the best rye bread I think I've ever had in my life. I have no idea where it came from, but this stuff can give Bouley a run for his money.
The one real disappointment is the decor: this is visually a very bland restaurant indeed, both in the rear dining room and in the front bar area. Apparently there are murals coming, but there is no feel of fabulousness, of being a destination at all. It's not even cosy or comfortable. Still, I can highly recommend a trip to the restaurant: after all, located as it is on the same block as Chez Es Saada, Prune and the Tasting Room, there's no shortage of alternatives to go to if you fail to get a table. Just hope they haven't changed the bread.