Wikipedia:Wikipedia Signpost/2010-02-08/Dispatches
Content reviewers crucial to setting standards
Content review processes such as featured article candidates, featured list candidates, good article nominations, A-class reviews and peer review are crucial to setting Wikipedia's standards. This dispatch looks at how participation in each of these processes changed during 2009.
Featured articles
Featured articles represent Wikipedia's best work, and achieve this status after a review open to the whole Wikipedia community. Editors can support the article's promotion if they believe it meets all the featured article criteria.
In 2009, 522 articles were promoted to featured status.
- Annual increase in FAs down 37%
- FAC reviews down 26%
- FAC reviewers down 36%
- FAC "nominators only" up 250%
- FAR participants down 32%
In 2009 there were 991 FACs (522 successful, 469 unsuccessful), which attracted a total of 9,409 reviews. 1,434 editors were involved with the FAC process, of whom 224 were nominators only, 302 were both nominators and reviewers, and 908 were reviewers only. A successful FAC had, on average, reviews from 12 different people, while an unsuccessful FAC had reviews from 9. In 78% of all FACs, one particular editor was among the reviewers.
Thus, compared to 2008, 28% fewer people participated in the FAC process in 2009, producing 26% fewer reviews. The decline among reviewers was even steeper, at 35%, while the number of editors nominating an article but not reviewing others increased by 250%.
Articles can also lose featured status through the Featured article review process. Editors who believe an article no longer meets the featured article criteria can list it at FAR. Ideally one or more editors will take on the task of bringing it up to standard. The FAR process showed a similar decline in participation in 2009. Last year there were 219 FARs (157 demoted, 62 kept), and 767 editors participated in reviews. In 2008 there were 263 FARs (143 demoted, 120 kept), and 1129 editors participated. The number of editors participating thus dropped by 32% in 2009.[3]
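The headline percentages follow directly from the raw counts quoted above. As a sanity check, a short Python sketch (using only numbers stated in this article) replays the arithmetic:

```python
# All figures below are quoted in the article; this just replays the arithmetic.

# FAC participants in 2009, broken down by role:
nominators_only = 224
both_roles = 302
reviewers_only = 908
total_participants = nominators_only + both_roles + reviewers_only
print(total_participants)  # 1434, matching the stated total

# FAR participation: 767 editors in 2009 against 1129 in 2008.
far_decline = (1129 - 767) / 1129
print(round(far_decline * 100))  # 32 (percent), matching the stated drop
```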
Featured lists
- Annual increase in FLs down 38%
- FLC participants down 23%
- FLRC participants up 31%
Similar processes to FAC and FAR exist for primarily list-based content: featured list candidates (FLC) and featured list removal candidates (FLRC). In 2009, 500 lists were promoted to featured status.
FLRC bucked the trend, having 235 people involved in 114 reviews, compared to 179 in 72 reviews in 2008.[5] The increased number of lists having their featured status reviewed is possibly a consequence of the large growth of the featured list process in 2008.
Good articles
- Annual increase in GAs down 11%
- GA participants down 25%
A-Class review
- WP:MILHIST A-Class reviews up 40%
- Number of WP:MILHIST ACR participants steady
On the A-class front, participation held up better: the Military history WikiProject's A-class review (ACR) process conducted 40% more reviews in 2009 than in 2008, with the number of participants remaining steady.
Peer review
- PR reviewers down 37%
- PR "nominators only" down 11%
- Three editors provided 43% of 2009 reviews
In 2009 a peer review was requested for 1,478 articles, resulting in 2,062 reviews. Of these, 891, or 43%, were carried out by just three editors: Ruhrfisch (343), Finetooth (332) and Brianboulton (216).[10] They were assisted by a further 730 reviewers making one or more review comments, and another 503 editors nominated articles for PR but did not review others.[11] Once again, these numbers are down on the previous year. In 2008, 2,090 articles had a peer review. For technical reasons the number of reviewers could only be determined for the period February to December;[12] in this period 1,028 editors reviewed PRs and a further 499 nominated articles for PR without commenting on others. In the corresponding period of 2009 the numbers were 645 (37% lower) and 449 (11% lower) respectively.[11]
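The concentration of peer reviewing in a few hands can be verified from the counts quoted above, using only figures stated in this article:

```python
# Review counts for the three most prolific 2009 peer reviewers, as quoted above.
top_reviewers = {"Ruhrfisch": 343, "Finetooth": 332, "Brianboulton": 216}
top_total = sum(top_reviewers.values())
print(top_total)  # 891 reviews

total_reviews = 2062  # all peer reviews carried out in 2009
print(round(100 * top_total / total_reviews))  # 43 (percent), as stated
```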
How can I help?
Start reviewing articles!
Notes
- ^ Source: Wikipedia:Featured article statistics.
- ^ a b These figures were obtained by counting the number of links to the User or User talk namespaces from editors' signatures on the individual FAC pages. Queries like this one to the Wikipedia API provided the data in an easy-to-parse form. The nominators' usernames were obtained by parsing the HTML of the monthly archive pages (e.g. Wikipedia:Featured article candidates/Featured log/January 2009 or Wikipedia:Featured article candidates/Archived nominations/January 2009), and recording the usernames listed after the string "Nominator(s)".
- ^ These figures were obtained by counting the number of links to the User or User talk namespaces from editors' signatures on the individual FAR pages. Queries like this one to the Wikipedia API provided the data in an easy-to-parse form. This method probably overestimates the number of users involved, as it counts links to users who, as significant contributors to the article, were notified of the FAR.
- ^ Source: Template:Featured list log.
- ^ a b These figures were obtained by counting the number of links to the User or User talk namespaces from editors' signatures on the individual FLC or FLRC pages. Queries like this one to the Wikipedia API provided the data in an easy-to-parse form. The number of reviewers cannot be separated from the number of nominators, as was done in the FA case, because the nominators were not listed in a standardised form until February 2009.
- ^ Source: Wikipedia:Good articles.
- ^ Source: Revision history statistics of Wikipedia:Good article nominations.
- ^ The Military history and Tropical cyclones WikiProjects had active A-class review departments in 2009.
- ^ These figures were obtained by counting the number of links to the User or User talk namespaces from editors' signatures on the individual ACR pages. Queries like this one to the Wikipedia API provided the data in an easy-to-parse form.
- ^ Source: Wikipedia talk:Peer review.
- ^ a b These figures were obtained by counting the number of links to the User or User talk namespaces from editors' signatures on the individual PR pages. Queries like this one to the Wikipedia API provided the data in an easy-to-parse form. The nominators' usernames were obtained by finding the creator of each individual peer review page (e.g. Wikipedia:Peer review/Gilbert Foliot/archive1) using API queries like this one.
- ^ The category January 2008 Peer Reviews does not exist.
- ^ User:Ruhrfisch at Wikipedia:Wikipedia Signpost/2008-09-15/Dispatches.
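The counting method described in several of the notes (links from review-page signatures into the User and User talk namespaces) can be sketched in Python. The API parameters named in the comments (action=query, prop=links, plnamespace) are real MediaWiki API features, but the response below is a hypothetical stand-in for a live query, and the page title and usernames are invented for illustration:

```python
# Sketch of the counting method from the notes: query the MediaWiki API for
# links from a review page into the User (ns 2) and User talk (ns 3)
# namespaces, then count distinct usernames. A live query would look like:
#   https://en.wikipedia.org/w/api.php?action=query&prop=links&plnamespace=2|3
#     &titles=<review page>&format=json
import json

def count_participants(api_response: dict) -> int:
    """Count distinct users linked from a review page's signatures."""
    users = set()
    for page in api_response["query"]["pages"].values():
        for link in page.get("links", []):
            # "User:Alice" and "User talk:Alice" both map to "Alice"
            users.add(link["title"].split(":", 1)[1])
    return len(users)

# Hypothetical API response for one review page (not real data):
sample = json.loads("""
{
  "query": {
    "pages": {
      "12345": {
        "title": "Wikipedia:Featured article candidates/Example/archive1",
        "links": [
          {"ns": 2, "title": "User:Alice"},
          {"ns": 3, "title": "User talk:Alice"},
          {"ns": 2, "title": "User:Bob"}
        ]
      }
    }
  }
}
""")
print(count_participants(sample))  # 2 distinct participants
```

As the notes caution, this method can overcount when non-participants (for example, significant contributors notified of a review) are also linked from the page.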
Discuss this story
Kudos to Dr pda for a lot of work to put together a fine article, and to all those content reviewers who do so much invaluable work for the Wiki! Now let's get some more reviewers at all of these content review processes; promotion of articles depends on conscientious reviewers as much as it does on the content builders. SandyGeorgia (Talk) 17:23, 8 February 2010 (UTC)[reply]
I think a big problem here is that reviewing is a pretty thankless task, which receives little attention from other editors (particularly GA reviewing). I've reviewed a few articles, but stopped doing it as it felt like too much work for too little reward; while being a good content writer gets you a GA or FA star, being a good reviewer gets you nothing. Perhaps there should be a Reviewers' Barnstar, to encourage more participation in this area? Or perhaps we should be asking 'who reviews the reviewers?'... Robofish (talk) 14:01, 9 February 2010 (UTC)[reply]
I also applaud the article for drawing attention to the review drought, but I was rather disappointed to discover that no attempt was made to include the A-class reviews that occur here in with the count. As a practical matter, while such reviews cover only a small percentage of article reviews on Wikipedia, they are still a review process. I think it irresponsible for the post to have made such a glaring omission in this story. TomStar81 (Talk) 23:54, 9 February 2010 (UTC)[reply]
It strikes me that the type of reviewing has changed. Looking back at one of my early FAs, Wikipedia:Featured article candidates/Seabird, there are plenty of drive-by supports (people who come in and support without nitpicking too much). More recent FAs have fewer actual reviewers, but the reviewers tend to take a fine tooth comb to the article, seemingly much more so than in the past, although I concede this may be because it is so much harder to get a peer review to iron out problems. In all my recent reviews the main sticking points are usually related to prose and finicky style stuff. I would guess that a tightening of prose standards puts people off - I can't contribute anything meaningful in that area, so I'm scared off reviewing or offering opinions. That said, I can offer opinions on content, so I promise to start reviewing again at least from that point of view and let our more literate and stylish editors worry about that stuff. Sabine's Sunbird talk 03:42, 10 February 2010 (UTC)[reply]
This is very interesting, thanks! Is there any chance of getting the figures for 2007? It is difficult to draw reliable conclusions from only two data points. --Tango (talk) 09:49, 10 February 2010 (UTC)[reply]
It's not worth it for a POV-pusher to spend an extra 10-15 hours in the modern era to get the prose and formatting etc. done when they could be writing more tripe. The 30k TFA hits were a big carrot in the old days, and some guys started their FACs with soapboxing comments about why the historical incident in question was important with a strong nationalist bent etc.... Still, as for toxicity, 2006 was a very turbulent year in terms of political wiki-riots and I remember some people who were very famous headkickers, thug admins and enforcers in those days complain that the now way-outdated 2006 standards were hard and that FAC was "nasty". Those were the days of 1-line drive-by supports when it was common for 10 guys from one wikiproject to just turn up and pile on; I remember one guy casting about 50 votes in three hours. Gee, some of those thug admins in those days were soft as jelly. Thank goodness there aren't FAs like in those days that just copied and paraphrased a few web-bios/encyclopedia articles and mixed them together YellowMonkey (vote in the Southern Stars photo poll) 02:54, 16 February 2010 (UTC)[reply]
Number of reviews or reviewers is not a good measure, IMO. Reviews in past years were a lot shorter; just a support or oppose and a sentence or maybe two. Reviews now tend to be much more detailed. A word count per review page might be a good thing to also look at. I would not be surprised at all if the effort, expressed in word count, is trending higher. Number of edits to articles and byte count changed in articles during the review period would also be a good measure. But what constitutes a review has changed so a simple head count is not comparing the same thing. --mav (Urgent FACs/FARs/PRs) 00:05, 21 February 2010 (UTC)[reply]