SXSW 2012 (or How Online Publishing is Like X-Men)

26 Mar

I was in Austin earlier this month for SXSW Interactive, one of the world’s biggest digital media festivals. Below is a roundup I wrote for Sparksheet.

The Austin Convention Center

Over the past decade or so the formula for success in online publishing has been something like this:

Post a lot of content.

Keep it short.

And do it fast.

The assumption here is that people look to the internet for simple, snackable content in “real time.” And with its reverse chronological template, the web’s first indigenous news medium – the blog – was designed to deliver just that.

But over the course of the weekend here at SXSW, this model has finally been challenged, and it seems as though fast, short and abundant may be giving way to slow (read: thoughtful), long (read: in-depth) and scarce (read: quality over quantity).

Curating quality

Maria Popova and David Carr

This web publishing paradigm shift became apparent to me during a Saturday morning session entitled “The Curators and the Curated,” featuring an all-star panel with David Carr, the curmudgeonly New York Times media critic, Mia Quagliarello, content curator for digital magazine app Flipboard, Max Linsky, co-founder of longform.org, Maria Popova, the blogger behind Brain Pickings, and Noah Brier, co-founder of brand curation tool Percolate.

Although the panelists failed to see eye to eye on the monetization question (Carr: “I’m so glad you’re all here to repackage and repurpose me. By the way, that’s how I eat!”), they agreed that content should be judged on relevance rather than timeliness.

Popova decried what she called the “newsification of the web,” while Carr lamented “the tyranny of the new.” Linsky insisted that “new stories and old stories get clicked on the same amount” and claimed that his site experienced no decrease in traffic when it scaled back the number of daily posts (although this wasn’t mentioned in the panel, that squares with Salon’s recent revelation that the online magazine’s traffic actually increased after it committed to posting less, but better, content).

SX-Men: Gawker vs. Slate

The old and new paradigms of web publishing came head-to-head in the form of two simultaneous sessions Sunday morning (I managed to catch about half of each, running from the Austin Convention Center to the Hilton next door).

The first session was a live Q&A with Nick Denton, the founder of mega-popular blog network Gawker Media. Denton, who doesn’t so much court controversy as seduce it, defended Gawker’s gossipy, nouveau yellow style of journalism, summing up Gawker’s philosophy as “don’t consider too much before you put it down on the page.”

Prompted by interviewer Anil Dash to reveal the contents of a voice message from Brian Williams (Denton recently alienated the veteran news anchor by publishing a snarky email Williams had sent him), Denton joked, “I’m not getting page views out of this so what’s the point?” Which sums up Gawker’s editorial mandate pretty neatly.

Professor X and Magneto playing chess in an X-Men film

The second session featured David Plotz, the editor of online magazine Slate, in conversation with Evan Ratliff, a contributor to Wired magazine and the editor of mobile publishing platform The Atavist.

As Plotz explained in a Sparksheet Q&A last year, Slate sees itself as a bastion of long-form journalism on the web and encourages staffers to spend months reporting on pet projects that manifest themselves as multi-part, print magazine-length pieces.

The session was dubbed “140 Characters vs. 14,000 Words,” but Plotz said that “it would be a mistake to think of social media as the enemy of long-form.” On the contrary, Plotz argued that by satiating our thirst for quick news and pithy headlines, Twitter was “driving out” what he called “commodity news” and “aggregation journalism” (Plotz didn’t say where Slate’s own news aggregation site, The Slatest, fits into all of this).

Although Plotz didn’t call out Gawker by name, my guess is that he would put Denton’s content in the latter category. As I tweeted during the session (in a geeky nod to X-Men), Denton and Plotz are sort of like the Magneto and Professor X of web journalism, two very different sides of the same coin. Only time will tell whose vision for the future of web content will win. But I guess it’s pretty clear which one we’re rooting for.

Social Media Make Me Who I Am

14 Nov

Social media make me who I am. This has nothing to do with my job or with spending lots of time on social media (because I don’t, really). Social media make me who I am because of all the decisions social media force me to make with every tweet, every status update and every blog post.

Social media force me to decide what music I like, what my political views are, or even what I did tonight. Of course, those views and preferences and facts exist regardless of Facebook or Twitter or YouTube. But whether I share them with my online networks, and on what terms, comes down to a bunch of tiny, semi-conscious editorial decisions. I may decide to tweet about the new Radiohead album, but not the new Barenaked Ladies. I may post a photo from my trip to Austin for SXSW, but not my trip to Miami to stay with my snowbird grandma. I’m happy to take a potshot at the Tea Party, but my thoughts on Israel/Palestine may be a little too nuanced and touchy to expose to Likes, comments or @replies.

You may call the product of these editorial decisions my “brand,” but that makes it seem like I’m trying to sell something. I prefer to call it my public self. Personally, I’m not ready to embrace the sort of radical transparency espoused by digital utopians like Jeff Jarvis. There are some things I’d prefer to keep to myself. Part of this is probably J-School baggage. A journalist never carries a sign at a protest or reveals which candidate he supports, my journalism professors taught me. Part of this is probably middle child syndrome. When I was five I refused to let my birthday party guests sing happy birthday, my parents love to remind me; I didn’t like the attention.

Picture a Venn diagram with two circles, one private, one public. Our social media selves exist in the sweet spot where the circles intersect. The big innovation of Google+ was that it allowed us to keep those circles apart. But, as Farhad Manjoo has noted, keeping them apart can be tedious, “like creating a seating chart for your wedding.” I don’t want to make all those decisions all the time. I don’t want to cultivate a “family self” and a “professional self” and a “high school friends” self.

Having one public self forces me to decide how sarcastic I am (trying to tone it down), how silly I’m willing to look (pretty silly) and what I’m ready to go to bat for (black licorice, Entourage, Beyoncé). And over time those decisions form a pretty complete picture of who I am. Whether I Like it or not.

Where I Am

15 Nov

I’m a bad blogger, and not just because it’s been four months since my last update. Yes, the previous post is dated February 28th, but I actually turned this site into my travel blog while I was in Europe this summer. Realizing how out of place those posts were, I decided to move them to my new travel writing page. That got me thinking about what this blog is about and whether my original premise – beginning a career at the end of journalism – still holds up two years later. So consider this post a clearing of the throat, an attempt to figure out how to move forward by looking at where I am in my career and how I got here.

I launched this site just over two years ago when I truly was beginning a career at the end of journalism (as we know it). It also seemed like I was beginning a career at the end of the world as we knew it. Just a month earlier, Lehman Brothers had filed for Chapter 11 bankruptcy protection, triggering the biggest and scariest financial crisis since – as you’ve heard countless times before – the Great Depression. Barack Obama’s historic election victory was a week away and the world was in a state of nervous anticipation.

At the time, I was living at the epicenter of all this, a journalism student working as a reporter in Washington, DC. I remember visiting the White House for the first time and being more impressed by the modest, less iconic building next door, the Treasury Department. If an unpopular and overwhelmed lame-duck President lived at 1600 Pennsylvania Avenue, 1500 Pennsylvania Avenue surely contained the men and women who would or would not save the world.

Six weeks and four days after election night I returned to Montreal. I finished up my thesis, overhauled the blog, and began searching for a job at what might have been the worst time, in the worst industry, in the worst market I could have picked – an English journalism job in Montreal during the heart of the Great Recession. Because of my American degree, I was eligible to work in the States for a year and sent off countless CVs and cover letters for entry-level newspaper jobs and magazine internships as far-flung as New Mexico and the Northwest Territories.

Of course, I was competing with hundreds of recently laid-off veteran reporters and fellow fresh J-School graduates for a rapidly dwindling number of positions. I got a couple of “Wait, can you even work in the States?” emails and a few “Sorry, not hiring but we’ll keep your resume on file” formalities, but that was pretty much it. In the meantime, I honed my craft writing for Masc magazine and at one point nearly moved to Ottawa to write Web copy for Michael Ignatieff, who I could have sworn would be Prime Minister by now.

During this time, I did what every good job hunter is supposed to do. I reached out to former teachers and mentors, set up meetings with people in the field whom I admired (or had some tenuous connection to) and cold-called a few wish-list publications. I had a great phone chat with Jordan Timm, then at The Walrus and now at Canadian Business, who is a friend of a friend of my girlfriend. Jordan was generous and helpful and said that if my heart wasn’t 100% in the game, that if there was anything else in the world I’d be happy to do other than journalism, I should run for my life. I spoke to my thesis advisor at B.U., the amazingly empathetic Boston Globe alum Mitch Zuckoff, and essentially asked his permission to do something else – something that’s not quite journalism – until things picked up. I spoke to my former editor at the New London Day, who said she would love to hire me if only they were hiring.

Is Branded Content Journalism?

25 Feb

Like lots of young J-School graduates, I have one foot in the traditional journalism world and one foot…somewhere else. One former schoolmate works as a “community manager” at an Internet startup. Another friend is a Web editor/SEO specialist at an online newspaper. Yet another sunlights as “communications director” for a government agency.

As I’ve discussed before, I’m the editor of a lightly-branded media and marketing blog called Sparksheet, which is both an independent-minded industry publication and a strategic corporate property. So I’m always navigating the line between editorial and advertorial, zealously guarding my journalistic independence and integrity while making sure not to embarrass the company or its clients.

Which brings me to this recent blog post by Sally Gethin. A self-proclaimed “old fashioned journalist,” Gethin edits a respected inflight entertainment industry newsletter (yes, such a thing exists). Although I can’t say for certain, the post seems to be a thinly-veiled attack on Sparksheet.

(Note: For some reason she seems to have deleted the post. But here’s another Web lesson for Ms. Gethin: online content is forever. You can find a cached version here – just scroll down a few posts to “When news becomes clutter”).

Indeed, most of the post is an inchoate rant. She blames the Internet for killing investigative reporting. She laments that “There is too much online ‘chatter’ going on.” Regarding Twitter, she contends that “just the word itself defames the notion of real debate.” Really?

But the question of whether branded content should be regarded as credible journalism is a legitimate one. So here is my response (originally posted as a comment on her blog):

As a fellow journalism school graduate and someone who works in the branded media space, I couldn’t disagree more.

First, the idea that the Internet and “digital media” are killing investigative journalism is ludicrous. Check out websites like ProPublica, Spot.us and Talking Points Memo, which have picked up the investigative torch dropped by newspapers, magazines and TV stations that are no longer willing or able to invest in proper muckraking.

It’s a shame that so many legacy media outlets are struggling. But “old fashioned journalists” and media executives are far from blameless. Ignoring what happened to the music industry in the face of Napster and iTunes, they failed to grasp the impact digital media would have on their outdated and inefficient business models (low subscription costs, print classifieds, untargeted ads, etc.). Instead of seeing the Internet as an opportunity, they saw it as a threat, and leaner, keener outlets rose up to fill the void.

From i to Wii: The Decade in Technology

31 Dec

I was recently asked to contribute a short piece to a London-based magazine’s “noughties” roundup. The assignment? Sum up the decade in technology – everything from new media and music, to gadgets and games. In 300 words. Here’s what I came up with:

In November, Forbes magazine pronounced Apple’s Steve Jobs “CEO of the decade.” The runners-up? Microsoft boss Bill Gates and Google co-founders Larry Page and Sergey Brin. At a time when financial giants, manufacturing moguls and media barons are cutting their losses, the geeks have inherited the earth.

This was the decade in which technology went mainstream. Gadgets and gateways—some loaded with “apps” and run in the “cloud”—fill our pockets, furnish our living rooms and power our offices. For most of the decade, it was all about “You.” From iPods and iMacs to MySpace and YouTube, the noughties made technology personal. Blogs, search engines and aggregators turned newspapers and other mass media “old.” On-demand entertainment made movie and television watching into custom experiences. eBay, Amazon and online banking transformed your laptop into a private commercial hub. And file sharing and MP3s rendered music labels and number one hit records unnecessary. In 2006, “You” were even named Time Magazine’s “Person of the Year.”

Then came social media and Web 2.0, and suddenly it was all about “us.” With a world of knowledge at our fingertips, we rediscovered our desire to connect to the world (and not just to our friends via email or instant messenger). Enter Facebook, Twitter, Skype and Google Wave. Even the iPod—that perfect vessel of bespoke gratification—morphed into the iPhone, a humanistically designed connector and communicator. Napster and “Just Do It” gave way to Wikipedia and “Yes We Can.”

But this decade’s achievements have also raised the bar for what comes next. In an age when everyday people can point a cursor at an image of the globe and zoom in on their kitchen window, we’ve become hard to impress. We’re shocked and appalled when our GPS doesn’t recognize a new roundabout or when a Blu-ray disc won’t play on our Nintendo Wii. Google, Apple and Microsoft have proven that anything is possible. These days the only thing surprising about technology is its limits.

Content, Design, Experience: Notes from #UXMTL

26 Nov

Photo by celinecelines via Flickr

As I wrote in my last post, the Internet may enable us to connect with countless people from all corners of the world. But that only fuels our desire for face-to-face meetings and personal connections. That’s why ~~God~~ ~~Google~~ people invented Tweetups, or small grassroots get-togethers of local Twitter users and like-minded ~~geeks~~ ~~twits~~ people. OK, geeks.

Tonight I went to an event organized by UXMTL, “a group that aims to help Montreal organizations create more enjoyable, useful and meaningful connections with their audiences, through User Experience Design,” as they put it.

Ironically, the venue was having connectivity issues (and I’m currently iPhone-less), so I wasn’t able to share my notes in real time. Normally, banging on my laptop or phone when people are talking makes me feel like a tool. At these events, being offline makes me feel naked.

Better late than never though, right? Here are some highlights from the evening’s panel discussion. Note that these are paraphrases/interpretations, not direct quotes.


Lessons From BlogWorld 2009

7 Nov

I spent a few days last month at the BlogWorld conference and New Media Expo in Las Vegas. I didn’t gamble a cent – I like to say I’m not dumb enough to play a game of chance, not smart enough to play a game of skill – but I learned a lot, tweeted a lot, and met heaps of interesting, engaging people. I even got to see Beatles LOVE courtesy of Cirque du Soleil. The show was magical and it was fun to watch 20-something Eastern European acrobats dance like ’60s-era Yanks to “Back in the U.S.S.R.” But getting back to the learning part, in the spirit of making sure what happens in Vegas stays online, here are a few old and new media lessons from BlogWorld:

1. Online vs. traditional journalism is not a zero-sum game

Despite some stinging comments hurled at CNN anchor Don Lemon during one panel, I was surprised by how much love “legacy media” were getting at BlogWorld. NYU journalism prof Jay Rosen advocated using search data to determine what readers care about. Blogcritics publisher Eric Olsen waxed nostalgic about the tactile experience of print magazines. Rather than eye each other suspiciously, old and new media types shared best practices and ideas for preserving quality journalism.

I Miss Today’s Papers

29 Sep

Last month Slate announced that it was pulling the plug on Today’s Papers, its popular daily summary of the morning journals, and replacing it with The Slatest, a thrice-daily aggregator of “the 12 most important news stories, blog entries, magazine features, and Web videos of the moment.” Like many diehard Slatees, I was shocked. TP had become the prologue to my mornings. It was a quick, concise read that made me feel reasonably well informed before starting my day. But I soon chalked up my initial reaction to nostalgia. After all, Slate’s editors were right. The news cycle is no longer daily. And newspapers aren’t the only players driving it. Surely, as an online editor, I should be the last person to cling to such a relic.

But now it’s clear to me that Slate got it all wrong. The lesson of online news is not that readers want their news all the time and from countless sources. It’s that they can afford to be pickier about when and from what medium they get it. Sometimes that may still be from the newspaper at the breakfast table. At other times, it may be via smartphone on the way to the pub. In any case, news organizations need to add value to the news by providing either content or context. I don’t need Slate to tell me what the 12 most important news stories are right now. That’s what my RSS feeds and Twitter and Digg and the myriad other aggregators that have emerged in the 14 years since Slate introduced TP are for.

Sparks and see-through silos

2 Jun

While we’re on the subject of branching out from traditional journalism, check out Sparksheet, which launched about an hour ago. It’s a new media and marketing blog by Spafax, the custom publishing company behind enRoute and other slick “branded” magazines. I will be helping them produce content and already have a couple of posts up about airlines on Twitter (some get it; others, not so much) and Robert Scoble’s social media starfish. Like I said, journalism is changing and the walls between ad shop, think tank and newsroom are coming down – and being replaced with windows.


J-School Baggage

31 May

I recently took the train to Ottawa for a job interview at a partisan political organization. Over sushi and seaweed salad, my interviewer asked whether I was afraid, if I took the job, I’d be shunned by the mainstream media gods. I said I wasn’t; I lied. But everything I said next was true.

I told him that journalism is changing, that first-person narratives, argumentative essays and cheeky personal blogs were all the rage in our battered journalistic landscape. I said that the best bloggers—the Josh Marshalls and Michael Geists and Andrew Sullivans—are serious intellectuals whose reporting is grounded in sound research and reasoning. Yet they all have strong voices and crystal-clear points of view. I told him that, in any case, I didn’t go to J-School because it was my lifelong dream to be a newspaper reporter. I wasn’t even particularly keen on current affairs until university.

Rather, I pursued a master’s in journalism because I was interested in everything – from music, literature and pop culture, to politics, religion and technology – and figured a life in reporting would allow me to dip in and out of various disciplines and worlds. I chose journalism because it seemed like a worthwhile endeavor, what Churchill and Herzl and Orwell did before moving on to grander things. Besides, a year and a half of school would allow me to hone my writing chops and put off getting a job for another 18 months.

I told him all this, and I meant it. But why, despite all that, did I still feel as though I had betrayed some fundamental instinct or ethic that I never had in the first place?