A childhood friend reminded me the other day that, back during the Oscar season of 1989-90, I stood before a live television audience and passionately implored them to see a movie called My Left Foot, starring a little-known British actor and playing in only a single local cinema. The television program was the closed-circuit morning news show of Roland Park Middle School in Tampa, Florida, and the audience my not especially cinephile-minded fellow sixth graders—but nevertheless, I made the case. That same year, I convinced our school librarian (and morning show “executive producer”), Nina Masky, to exchange Oscar picks with me and to appear on camera as we opened each other’s sealed envelopes, revealing who would take home Oscar “if we picked the winners.” (Apologies to Roger Ebert and the late Gene Siskel, from whom I shamelessly lifted the concept.)

Back then, Oscar night was, for me, something close to a high holiday. I would insist—testing my parents’ patience at every step—on seeing as many of the nominated films as possible in advance, even when it entailed a day trip to neighboring Sarasota or Orlando to catch some small foreign or independent film that didn’t have enough muscle to penetrate a “B” market like Tampa. And when the hallowed evening arrived, I would camp out in front of the television, stay up past my bedtime, and watch every second of the telecast with feverish intensity.

Those were, I dare say, banner years for Oscar. Billy Crystal was the host of record (never since equaled), striking just the right balance of ego massaging and deflating in that high-powered room, while a wave of exciting independent movies (like My Left Foot and the same year’s Sex, Lies, and Videotape) was crashing upon Hollywood’s shores, loosening the big studios’ ironclad grip on the awards. Harvey Weinstein was but a 400-pound gorilla back then, or maybe that’s just the rosy tint of memory. Sure, there were head-scratching upsets (like the Original Screenplay victory of Dead Poets Society over Do the Right Thing, the latter of which also failed to secure a Best Picture nomination), but it was all more than enough to entrance a budding young film critic for whom Hollywood seemed an enchanted kingdom as far away as Oz.

I’d like to say that was the last moment at which I took the Oscars seriously, but that wouldn’t be entirely true. Only a half-dozen years later, I even found myself at the gates of the Emerald City, covering the ceremony from inside the official Shrine Auditorium press room as a cub reporter for the USC Daily Trojan, where I also published other Oscar-related commentaries, such as who would be nominated versus who should, and so on. Of course, this was 1995, before the term “awards season” had entered the lexicon, let alone erupted into a year-round industry whose dutiful foot soldiers begin prognosticating next year’s Oscars before this year’s red carpet has been rolled up. And while VHS screeners of eligible films had, at this point, been arriving in Academy households for more than a decade, we were still a few years away from the total victory of the small screen over the large where movie viewing is concerned—as compelling an explanation as any for the outcome of this year’s Best Picture race. But I’m getting a little ahead of myself.

So the budding critic grew older and wiser and, in his tenure at the L.A. Weekly, reported on Oscar mostly from a distance, investigating the beleaguered Documentary and Foreign Language competitions and, in 2006, even crossing swords with Roger Ebert over that year’s eventual Best Picture winner, Crash. But I preferred to leave the big night itself in the hands of my famous colleague Nikki Finke, whose real-time, Oscar-night blog dispatches were often more entertaining than the increasingly bloated and self-important ceremony itself. Still, I continued to watch, out of some combination of habit, nostalgia and professional obligation. Which brings me to the story of how I ended up in Los Angeles over Oscar weekend 2011, without so much as a single invitation to politely refuse (film programmers being even smaller fish than critics on the pre- and post-Oscar schmoozing circuit), save for one to not watch the Oscars in the company of a few like-minded individuals. More on that in a minute.

Originally, I wasn’t even supposed to be in L.A. last weekend, but when a meeting scheduled for later this month got moved up, I changed my plans accordingly, not even realizing at first the confluence of the dates. Or did I, unconsciously drawn back into the fold like one of Nathanael West’s locusts swarming down on the Hollywood sign, or—to borrow an image from one of this year’s more overrated Best Picture nominees—an amputee who feels the phantom sensation of a missing limb? In any case, there I was, spending Saturday evening watching the Independent Spirit Awards with a childhood friend (the same one mentioned earlier in this post) and his fiancée, and enjoying them, even if the comic stylings of host Joel McHale were a far cry from John Waters (the Billy Crystal of the Spirits) in his prime.

Traditionally handed out in a tent on the beach in Santa Monica on All Oscar’s Eve, the Spirits have taken their share of knocks from critics and other industry observers—and not entirely without reason—for further blurring the distinction between “independent” and “studio” movies that began the moment the Walt Disney Company acquired Miramax Films in 1993. In the nearly two decades since, the Spirits have become as much of an Academy bellwether (or “foreplay,” to quote one critic friend) as the Golden Globes, with four of this year’s five Spirit nominees for Best Feature, and five of the six nominees for Best Female Lead, also earning Oscar nods. Does that mean Oscar has gotten more indie? Or that the Spirits have sold their soul in the name of a starry red carpet and TV contract?

Probably a little of both, and yet the Spirits are still the only room in town where you will find the likes of Natalie Portman and James Franco sharing the stage with the filmmaker brothers Josh and Benny Safdie, who won a much-deserved Spirit for their remarkable second feature, Daddy Longlegs, in a category for films with budgets under $500,000, named for the seminal American independent filmmaker John Cassavetes. Not yet 30, the Safdies are among the few “indie” directors (along with their star and fellow filmmaker, Ronald Bronstein, and Cold Weather director Aaron Katz) of whom it can be said that they are continuing down the path blazed by Cassavetes rather than toiling slavishly in his footsteps. No matter that their films have been seen by fewer Oscar voters than were present in that Santa Monica tent on an uncommonly cold and windy SoCal afternoon. If genuinely independent American cinema—which is to say independent of both mind and means—has a future, and a face, this is it.

Around the time the Safdies are giving their charmingly awkward acceptance speech, my cell phone buzzes with a cryptic text message concerning the next day’s festivities: “The plan: some people are coming over for spaghetti and meatballs, a motley crew, and a few of them will go into the master bedroom where that nauseating spectacle will be on. Those of us who are sane will be chatting in the living room and perhaps playing some Rock Band.” The sender is a filmmaker friend who travels freely between the worlds of low-budget indies and mid-budget studio fare, has himself been nominated for a variety of industry accolades, and is considered by many to be one of the best and the brightest of his generation. Intrigued by the idea of not watching the Oscars with someone who could very plausibly be nominated for one in the not too distant future, I accept, even as I suspect that neither of us will have the guts to go through with it.

Yet, when I arrive at his house the next afternoon, the smell of simmering Bolognese filling the kitchen, the Oscars are nowhere to be seen—though they can ever so faintly be heard, a muffled din emanating from the depths of the master bedroom, where his wife and a few other invited guests are huddled around the television, watching the arrivals. My director friend tells me that, in fact, he never watches the Oscars, because the movies he loves rarely win, and because he can’t bear the spectacle of the losers, the four other nominees left smiling plastic smiles for the ever-watchful cameras when someone else’s name is called. And unlike in decades past, he adds, when a Woody Allen or a Stanley Kubrick could regularly blow off the Oscars without a care, nowadays if you’re nominated you have to show up or you’ll look like a jerk—as the army of bloggers will dutifully remind everyone. “But isn’t it an honor just to be nominated?” I counter. He does not seem convinced.

Then, as 5:30 PM PST comes and goes, and the few remaining stragglers file into the bedroom, I decide to dig in my heels and join my friend in his Oscar-less alterna-universe. Sometimes, I reason, it’s good to shake yourself free of tradition. And who knows: Perhaps not watching the Oscars will be akin to the moment in The Matrix when Neo resolves to take the red pill and suddenly, for the first time, sees the world as it really is? Or perhaps not. As it turns out, not watching the Oscars is a bit like being in your twenties and deciding to prove your independence by not going home for Thanksgiving or Christmas; you can do it, and you may feel triumphant in the moment, but afterwards it seems a bit silly, and no one really cares.

Ironically, as I was not watching the Oscars, New York Times critic (and my fellow LA Weekly alumnus) Manohla Dargis was busy attending the ceremony for the first time—an experience she writes about in a wonderful piece for this coming Sunday’s Arts & Leisure section, but already available on the Times website. As Dargis points out, in the 58 years that the Oscars have been televised, media coverage surrounding the event (not least by the Gray Lady herself) has increased in inverse proportion to weekly movie attendance in the U.S. and to ratings for the Oscar show itself, and never more dramatically than in the last 20 years. Fueled by the internet media boom, awards season now stretches towards infinity, with stars and filmmakers showing up on every red carpet from New York to Palm Springs, in some cases collecting as many gilded trinkets for a single film as A-listers of previous generations amassed over an entire career.

Just as the wide reporting of weekly box-office returns in major news outlets (beginning in the 1980s) irrevocably conflated a movie’s artistic value with its commercial heft, so too has the 24/7/365 Oscar cycle further shifted attention away from the movies themselves and towards the wall of noise surrounding them. Some films are now christened front-runners or also-rans as early as summer, before they have even been widely seen (if at all), while dozens if not hundreds of pundits now appear to make their living from this annual guessing game. Which explains why they seem intent on making it drag on for as long as possible.

All of this has not gone unnoticed by the Academy, which has tried to maintain its awards-giving supremacy in recent years by moving its dates up one month (which only caused all the other awards shows to move up their dates by a month), trying to appear more inclusive (10 Best Picture nominees instead of five), relegating tributes and honorary awards to a separate ceremony in November, and hiring younger, hipper hosts (which, in light of this year’s ratings, may point towards Clint Eastwood as host of Oscar night 2012). Not that the Academy should worry too much. No TV program, or movie, can hope to command the kind of audience that was possible decades ago, when there were only three networks and “social media” was limited to the personal ads. Ratings aside, the Oscars are still the only real game in town, the molten core that fuels all the ancillary hype, the only award around anyone actually cares about winning.

Meanwhile, back at the Oscars blackout, around the 7:00 hour, as the pasta was ready to be served and the starstruck troops decamped from the bedroom, my friend’s principled resistance met its match in the form of his wife's love of Oscar-night fashion, and the awards appeared on the 60-inch living room plasma screen after all. The volume remained at a low hum, and between the kitchen noise and conversation and the gaggle of small children circling about our feet, I can’t say I “watched” the show, but I caught the highlights, including the dispiriting (if foregone) victory of director Tom Hooper and The King’s Speech over David Fincher and The Social Network.

For months, the awards-season cognoscenti, who tend to think in more reductive clichés than even most studio executives, had been envisaging a pitched battle between these two Oscar heavyweights, characterizing The King’s Speech as the people’s movie, a triumphant underdog story (with respect to King George VI and to the movie’s executive producer, Harvey Weinstein), while The Social Network was deemed the erudite critics’ darling, a movie of all head and no heart. Never mind that both films were quite warmly embraced by critics (including this one) and performed just about equally well at the global box office ($245 million for Speech versus $221 million for Social Network as of this writing). The real difference is that one film is at best a passing fancy, while the other is a momentous and lasting achievement that will claim a place of honor in the canon of great American movies, alongside quite a few prior Best Picture losers.

In the days leading up to Oscar, two critics not generally thought of as awards hawks, David Thomson and Karina Longworth, offered two of the more thoughtful explanations as to why The King’s Speech would carry the day, and I will here offer yet one more: The King’s Speech is, at its core, a movie about acting, as King George V himself comments to his stammering son in one key scene. It is about how the advent of radio turned politicians into performers, and it is above all about how an amateur Australian actor, Lionel Logue, coaches King George VI towards the great performance of his career. That surely makes The King’s Speech one of the most flattering portraits of the acting profession ever recorded on film, and we must never forget that the Academy’s largest voting bloc is made up of actors. By comparison, The Social Network, while not expressly a film about acting, is very much a study in narcissism, unchecked ambition and betrayal—in short, the very lingua franca of working one’s way to the top of the Hollywood food chain. In the case of Fincher v. Hooper, the reasoning is simpler. Like such perennial Oscar bridesmaids as Alfred Hitchcock (who never won), Stanley Kubrick (honored only for visual effects), and Steven Spielberg and Martin Scorsese (both passed over for decades, until they became certified éminences grises), Fincher has a talent so prodigious, a mastery over both the art and commerce of making movies so total, that he quite frankly scares people. Nor does he seem particularly keen on awards, which may scare some people even more.

And so the 2011 Oscars came to a close with a chorus of schoolchildren performing “Over the Rainbow,” an ideal anthem for movies that show us the world as we wish it to be rather than how it really is. Hollywood has long strived to convince us that those two worlds can be one and the same, and on Oscar night, for a few fleeting moments, we may even believe it—until we remember that the great and powerful Oz was himself just a two-bit actor who fancied himself the equal of kings.