Submissions/Learning & Evaluation on Wikimedia projects: asking and answering questions with data

This submission is on the wait list for Wikimania 2013.

Submission no.
5056
Subject no.
C7 or T5
Title of the submission
Learning & Evaluation on Wikimedia projects: asking and answering questions with data
Type of submission
Author of the submission
Jonathan T. Morgan, Evan Rosen
Country of origin
United States of America
Affiliation
Wikimedia Foundation
E-mail address
jmo25@wikimedia.org, erosen@wikimedia.org
Username
Jtmorgan, Erosen
Personal homepage or blog
Abstract

There are a lot of great research tools available to Wikimedians who want to find out what's happening on their home projects or evaluate the impact of events, programs and community-driven initiatives. But figuring out how to evaluate (what questions to ask, what tools to use, and how to demonstrate impact) can be tricky. In this presentation, we will walk through a few examples of how to conduct an evaluation and provide links to some new helpful resources for collecting and analyzing data that are available to all community members.

Detailed proposal

At the Wikimedia Foundation we use a variety of tools for gathering and analyzing data, including Limn, Wikistats, the User Metrics API, Tool Labs tools and research databases, and even simple surveys and guerrilla usability tests. Many of these tools are being modified and developed to enable broader usage within our community and to help community members better understand the work that is being done: to monitor the effects of a program or initiative, to articulate the impact of a grant, to identify promising new strategies, and to build consensus and make data-driven decisions for a fitter, happier and more productive wiki experience. In this talk we will go over some of the tools we use in the WMF Grantmaking Learning & Evaluation team and give a few examples of the way we frame questions, collect and analyze data, and figure out what the results mean.
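
To make this concrete, here is a minimal sketch of one such question: how many edits did each member of a small cohort make during an event? It queries the Tool Labs database replicas directly; the host, database, credentials path, cohort, and date range are illustrative placeholders following the Tool Labs conventions of the time and will need adjusting for your own tool account.

  # Count the edits each user in a small cohort made during an event window,
  # using the Tool Labs database replicas. Connection details are illustrative.
  import os
  import pymysql

  COHORT = ['ExampleUser1', 'ExampleUser2']        # usernames to evaluate (placeholders)
  START, END = '20130401000000', '20130501000000'  # MediaWiki timestamps (YYYYMMDDHHMMSS)

  conn = pymysql.connect(read_default_file=os.path.expanduser('~/replica.my.cnf'),
                         host='enwiki.labsdb', db='enwiki_p')
  with conn.cursor() as cur:
      for user in COHORT:
          # The revision table has one row per edit; rev_user_text holds the
          # editor's username in the schema exposed by the replicas.
          cur.execute("""SELECT COUNT(*) FROM revision
                         WHERE rev_user_text = %s
                           AND rev_timestamp BETWEEN %s AND %s""",
                      (user, START, END))
          print(user, cur.fetchone()[0])
  conn.close()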

We will frame our presentation around real-world case studies, drawn from the experiences of WMF grantees and community members, that use common (and powerful) research techniques and simple tools: comparing user cohorts with the User Metrics API and gathering feedback with online surveys. We will cover some of the most important considerations when doing this kind of research: how to gather the right sample, decide what to measure, and draw useful conclusions from your results. We hope to leave this session with input into the future direction of our tools, and we hope that you leave this session with a better understanding of where to find tools for evaluation and how to incorporate them into your wiki-work!
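
As a preview of the cohort-comparison case study, the sketch below requests a single metric for two cohorts from the User Metrics API. The base URL, date parameters, and the bytes_added metric name are assumptions based on the API's documented cohort/metric pattern, and the cohort names are placeholders for cohorts you have already registered; consult the API documentation for the exact endpoints and options.

  # Fetch one metric for two cohorts from the User Metrics API and print the
  # raw responses side by side. URL, parameters, and metric name are assumed.
  import requests

  API = 'https://metrics.wikimedia.org/cohorts'          # assumed base URL
  PARAMS = {'start': '2013-04-01', 'end': '2013-05-01'}  # assumed date parameters

  def get_metric(cohort, metric='bytes_added'):
      """Request a metric for a registered cohort (illustrative endpoint)."""
      resp = requests.get('%s/%s/%s' % (API, cohort, metric), params=PARAMS)
      resp.raise_for_status()
      return resp.json()

  # Compare an edit-a-thon cohort against a control group of similar newcomers;
  # both cohort names are placeholders.
  for name in ('editathon_participants', 'newcomer_control'):
      print(name, get_metric(name))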

Track
Technology and Infrastructure
Length of presentation/talk
25 Minutes
Language of presentation/talk
English
Will you attend Wikimania if your submission is not accepted?
Yes
Slides or further information (optional)
Special requests

If both this submission and the User Metrics API submission are accepted, it could be useful to schedule them in the same session, as they will complement one another.


Interested attendees

If you are interested in attending this session, please sign with your username below. This will help reviewers decide which sessions are of high interest. Sign with four tildes (~~~~).

  1. Daniel Mietchen (talk) 23:10, 22 April 2013 (UTC)
  2. Jwild (talk) 18:10, 30 April 2013 (UTC)
  3. Sharihareswara (WMF) (talk) 01:39, 1 May 2013 (UTC)
  4. Heatherawalls (talk) 18:58, 6 May 2013 (UTC)
  5. sats (talk) 16:00, 8 May 2013 (UTC)
  6. Ocaasi (talk) 21:51, 8 May 2013 (UTC)
  7. Atropine (talk) 14:08, 10 May 2013 (UTC)