
Behind the Scenes at Citizen Journalism startup AllVoices – Mediashift, September 2009


An interesting look behind the scenes at Citizen Journalism startup AllVoices – from PBS Mediashift:

[Video: blip.tv embed, posts_id=2607293]

There are some interesting observations and comments in this video that shed light on the core competencies of the future news organization. From the first part of the video, a few observations:

  • Low cost structure – The company employs relatively few staff, and each person wears a lot of different hats – the prototypical startup
  • Community Management – Strong emphasis on Community Management, and the role of the Community Manager
  • Copyright – A need to manage copyright violations – for both professional and user-generated content – which AllVoices handles with NLP algorithms that flag sequences of 5 identical terms (see the sketch after this list)
  • Marketing – The Community has become the evangelist for AllVoices, which has helped AllVoices tremendously in creating buzz. People promote their content on Social Networks and other sites. AllVoices depends on its community to do its marketing for it … it’s all about the Community.
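A quick aside on that copyright point: flagging runs of five identical terms is essentially n-gram "shingling". Here is a minimal Python sketch of the idea; the function names, the threshold, and the example text are my own illustrations, not AllVoices' actual implementation.

```python
def shingles(text, n=5):
    """Break text into overlapping n-word sequences ("shingles")."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def looks_copied(candidate, source, n=5):
    """Flag a candidate report that shares any n-word run with a source article."""
    return bool(shingles(candidate, n) & shingles(source, n))

# A lifted sentence shares a 5-word run with the original article:
original = "The mayor announced the new budget plan on Tuesday afternoon"
report = "Breaking: the mayor announced the new budget plan, locals react"
print(looks_copied(report, original))  # True
```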

The second part of the video (starting at approx. 5:29) is an interview with AllVoices’ CEO Amra Tareen and VP of Social Media, Erik Sundelof. There are some insightful quotes in this segment. Here are a few:

Amra Tareen: So when AllVoices started, what we wanted to do was create a place where people could report regardless of where they are, from any device – cell phone, computer, using MMS, SMS, e-mail, or just going to the website.

When they send us something, what we want to do is geolocate – where exactly is it coming from? In AllVoices, we can detect locations down to any place greater than 500 people … So any city in the world we can detect where the message or report is coming from.

And then we try to geolocate, based on the IP address, based on the cell phone number, based on any tags the user adds to their text.
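The geolocation she describes layers several signals, falling back from the most specific to the least. Here is a hypothetical sketch of that fallback order (IP address, then phone number, then user-supplied tags); the lookup tables below are toy stand-ins, not AllVoices' actual geo data.

```python
# Toy lookup tables standing in for real geo databases (illustrative only).
IP_PREFIX_TO_CITY = {"203.81.": "Karachi", "66.249.": "Mountain View"}
DIAL_CODE_TO_COUNTRY = {"+92": "Pakistan", "+1": "United States"}

def geolocate(ip=None, phone=None, tags=()):
    """Resolve a report's origin, preferring the most specific signal available."""
    if ip:
        for prefix, city in IP_PREFIX_TO_CITY.items():
            if ip.startswith(prefix):
                return {"level": "city", "place": city}
    if phone:
        for code, country in DIAL_CODE_TO_COUNTRY.items():
            if phone.startswith(code):
                return {"level": "country", "place": country}
    for tag in tags:
        # Fall back to any location-like tag the user attached to the report.
        return {"level": "tag", "place": tag}
    return {"level": "unknown", "place": None}

print(geolocate(ip="203.81.45.10"))     # {'level': 'city', 'place': 'Karachi'}
print(geolocate(phone="+14155550100"))  # {'level': 'country', 'place': 'United States'}
```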

Continuing,

Amra Tareen: Now there are two types of content that come into AllVoices. One is “user reported”, the other is what our system aggregates from news sources and news feeds all around the world.

So, first, we geolocate, we categorize – whether you’re talking about Politics, Conflict and Tragedy, Sports, Entertainment. Then what we do is break it down, do contextual analysis to a “bag of words”.

Then based on that bag of words, … we want to showcase the user report, as well as create context around that report by aggregating related information.

… Since we already break it down into keywords, we know what the tags are for that user report. But we let the user add the tags themselves. Because sometimes the machines are not always as accurate as the user is. And that’s what we’ve learned – AllVoices is based on Machine Learning and the Community, and the Community always corrects the Machine Learning.
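The "bag of words" step she mentions just means dropping word order and keeping term counts, then using overlap between bags to pull in related coverage. A rough Python sketch of that matching step follows; the stopword list, the cosine measure, and the threshold are illustrative assumptions, not AllVoices' actual pipeline.

```python
import math
from collections import Counter

STOPWORDS = {"the", "a", "an", "of", "in", "on", "and", "is", "to"}  # illustrative

def bag_of_words(text):
    """Reduce text to counts of content words, ignoring order."""
    return Counter(w for w in text.lower().split() if w not in STOPWORDS)

def cosine(a, b):
    """Cosine similarity between two bags of words (0 = unrelated, 1 = identical)."""
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(c * c for c in a.values())) * math.sqrt(sum(c * c for c in b.values()))
    return dot / norm if norm else 0.0

def related(report, candidates, threshold=0.3):
    """Aggregate candidate articles whose bag of words overlaps the user report."""
    bag = bag_of_words(report)
    return [c for c in candidates if cosine(bag, bag_of_words(c)) >= threshold]
```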

So some interesting stuff here. Once again (that is, I have strongly advocated this position in previous posts), the future of Journalism will be significantly about a balance between Machine Learning and the Community … and the many, many technologies that support the interface between the two.
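As a concrete example of that interface, "the Community always corrects the Machine Learning" can be as simple as letting user-supplied tags extend or override the machine-extracted ones. A minimal, hypothetical sketch (the "-tag" removal convention here is my own invention, not an AllVoices feature):

```python
def merge_tags(machine_tags, user_tags):
    """Combine machine-extracted tags with user corrections; user input wins."""
    removed = {t.lstrip("-") for t in user_tags if t.startswith("-")}  # e.g. "-sports" drops a bad tag
    added = [t for t in user_tags if not t.startswith("-")]
    kept = [t for t in machine_tags if t not in removed]
    return list(dict.fromkeys(kept + added))  # dedupe while preserving order

print(merge_tags(["politics", "sports"], ["-sports", "elections"]))
# ['politics', 'elections']
```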

Let’s see what Erik Sundelof, AllVoices’ VP Social Media, has to say:

Erik Sundelof: If you are doing cell phone reporting or “in the field” reporting, you have to bring in the context, and [show people that context].

At AllVoices, we try to bring in all the different content and media types … By doing this, you will also be able to determine how credible a particular report is.

If user content is coming in very short, very opinionated pieces – which I really think is what Citizen Journalism should be about, bringing in the more emotional side, and telling what is really going on on the ground – that doesn’t mean that it’s fact-checked. But you can’t fact-check the complete flow of information in free-form. So you have to apply technology on top of it.

How does AllVoices’ system deal with “hoaxes” reported by the community?

Erik Sundelof: The way we are attacking the “hoaxes” problem is through “credibility”. A hoax is just another story. We’re still going to apply the same methodology, because everything is a computerized [algorithm]. So this means if the hoax comes in, and no one is talking about it, then it will just drop off the system. It will still have a page, because it’s a free publishing platform. So you will get your page, but it won’t show in the landing pages because no one will view it.

Amra Tareen: And each page has a credibility rating. So every report in AllVoices has a 5-bar credibility rating. So based on the activity level, and based on the similarity of content we find on AllVoices and off of AllVoices, I think the likelihood of a hoax being reported is small, compared to some person individually fact-checking and trying to figure it out.
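The 5-bar rating sounds like a blend of activity signals (views, comments) and corroboration (similar content found on and off AllVoices), binned onto a 1-to-5 scale. Here is a hypothetical sketch; the weights and caps are made up for illustration and are not AllVoices' formula.

```python
def credibility_bars(views, comments, similar_on_site, similar_off_site):
    """Map activity and corroboration signals onto a 1-5 bar credibility rating."""
    activity = min(views / 100 + comments / 10, 2.0)                       # capped activity signal
    corroboration = min(similar_on_site * 0.5 + similar_off_site * 1.0, 3.0)
    return max(1, min(5, round(activity + corroboration)))                 # clamp to 1..5 bars

# A hoax with no corroborating coverage scores low and drops off the landing pages;
# a widely echoed report scores high.
print(credibility_bars(views=20, comments=0, similar_on_site=0, similar_off_site=0))    # 1
print(credibility_bars(views=500, comments=30, similar_on_site=4, similar_off_site=6))  # 5
```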

Interesting perspectives – again, particularly around the intersection of machine learning and crowd-sourced journalism and content.

Tip of the hat to Stephen Konrath’s blog News 3.0: The Future of Journalism, where I first came across this video in this post.

glenn
