Breaking the News, One Tweet at a Time
"just setting up my twttr"
On March 21, 2006, Twitter co-founder Jack Dorsey used 24 characters to send out the first-ever tweet in a test of his new program's capabilities. After months of further back-and-forth testing to iron out the remaining bugs, the service went live on July 15. (Fun fact: Twitter was originally known as "twttr" because the twitter.com domain was already taken at the time of its launch; Dorsey and Co. had to wait another six months before they could legally buy the domain and add a couple of vowels to the name.)
Here's another fun fact: in a 2009 interview with the LA Times, Dorsey revealed that while brainstorming names for the new site, one of the candidates was "twitch," but as he explained, "'twitch' is not a good product name because it doesn't bring up the right imagery." With that logic, one has to wonder what went through his head when the actual Twitch streaming service went live on June 6, 2011. Then again, he probably didn't care all that much; by the time Twitch launched, Twitter already had around 100 million users.
Early reviews were mostly positive. At the July 15, 2006 launch, a TechCrunch writer remarked, "If this were a new startup, a one or two person shop, I'd give it a thumbs up for innovation and good execution on a simple but viral idea." The same writer, however, warned of a potential privacy issue, noting that "Every user has a public page that shows all of their messages. Messages from that person's extended network are also public. I imagine most users are not going to want to have all of their Twttr messages published on a public website." Even the Los Angeles Times conceded that Twitter could be the new face of the Internet, even if the way it went about it wasn't exactly earth-shattering, remarking that "Twitter works by hypercharging social networks such as those on MySpace or Friendster...The result, according to [CEO] Jack Dorsey, the force behind Twitter, 'brings you closer to everyone, because you know what everyone is doing, things you would never imagine.'"
Other writers were less forgiving, with one on Slate going so far as to openly mock the idea:
"For the ultimate in solipsism, check out Twitter.com, a site where—once you register—you can answer the question, 'What are you doing?' At 7:47 a.m. on Monday, for example, Lynda was going to get a glass of cold water.
"This raises more questions than it answers. Did she get it? Was it cold enough? Tragically, we'll never know until someone starts a site about what you were doing before what you're doing now. Or possibly an interactive site about what you are going to do next after you finish doing what you're doing now. There could be multiple options. People could vote. Hey, someone call Google. We're rich!"
Snark and condescension aside, it soon became clear that Twitter was more than just a place for people to post about their daily trips to the bathroom (though that did happen: a 2008 article on Twitter's early days told the story of a New York City venture capitalist who loathed the idea so much that, to underscore how ridiculous and pointless he found the service, he used it solely to document when he was using the toilet. His mindset changed when a steam pipe burst and he used Twitter for the first time the way it was originally intended: he posted about something that actually mattered.)
From a traditional media standpoint, those whose only experience with social media was something like MySpace (Facebook didn't start allowing non-school-affiliated users until May 2006, two months after Dorsey sent the first tweet) would have seen yet another site on the scene as infringing on their turf, a sort of "We were here first, so we make the rules" mentality.
The news media had made its feelings about Twitter known, but the fact that it's still around 15 years later means Dorsey and Co. were doing something right. So what changed? The media realized that instead of being a hindrance to their reporting, Twitter could be an asset. With the arrival of the iPhone a year later in 2007, journalists were no longer shackled to their computer desks and could break a story as fast as their fingers could type on their phone screens (or until they ran out of characters; it took another 11 years after launch for the Twitter brain trust to up the character count from a measly 140 to the 280 it stands at today). They still had their pens and notebooks as backups, but thanks to Twitter, those were no longer the first things reporters reached for when Something Important was happening. The only thing left to worry about was whether the phone had enough juice in the tank; the last thing any reporter needed (or wanted) was to miss out on breaking a story because their phone decided to up and die right in the middle of it.
Because of this, news organizations soon adopted the "If you can't beat them, join them" adage and started using Twitter to break their own news instead of forcing readers to come to their websites. Not only was it faster, but since the majority of a news company's readers were probably on Twitter anyway to feed their own social media consumption (or addiction, if you're the condescending sort), it only made sense to be on the same platform as the audience. Nowadays, virtually every large-scale news organization on the planet (and plenty of lesser-known ones) has its own Twitter handle; in fact, Twitter has become such a necessity in reporting that it's no longer surprising to see individual journalists with handles of their own.
Twitter also made it possible for anyone with a phone (and the app, of course) to be a citizen journalist, able to report the news whenever and wherever it happens around them, sometimes even getting the story out faster than professional journalists. While they obviously aren't held to the same standards, these citizen journalists nevertheless play an invaluable role in the news hierarchy, supplementing traditional reporting and providing critical first-person accounts that otherwise wouldn't have been possible.
Unfortunately, the speed and ease of use that Twitter provides comes at a cost: the First Rule of Journalism, an obligation to the truth (or, in other words, "Trust but Verify"), often gets sidelined in favor of being the first to break a story. Because a journalist always has their phone with them (and, by extension, Twitter), they can get caught up in the moment and skip over the specifics of the story, which can have dangerous consequences.
Nowhere was this more evident than on June 25, 2009, when Michael Jackson died of cardiac arrest. Not only was Twitter so overwhelmed with messages that the site crashed, but many news outlets had so much trouble separating fact from speculation (Was he dead? Was he in a coma?) that hours went by before people knew the truth. They forgot the First Rule: just because you trust your sources doesn't mean you should accept everything they say as fact.
As you can see, Twitter can be a double-edged sword: it provides a speed and ease of use that traditional journalism can't match, yet at the same time the "Trust but Verify" rule can get pushed aside in the rush to be first.
About the Author: Ronald Hamilton, Jr. holds a Bachelor of Arts in Mass Communication and Media Studies from Arizona State University's Cronkite School of Journalism and Mass Communication. When he's not writing for COM-GAP, Ronald volunteers with the nonprofit Live Forever Project.