My initial instinct when seeing this question was to draw attention to a process/policy that I think most people wouldn't even think about - that an answer being verifiably wrong is not a reason to delete it; it should be downvoted instead.
This has often confused me. I understand the argument - I just don't agree. The primary issue with the argument is that it relies on users actually voting and leaving comments to inform future users that the answer is wrong. Unfortunately, if no one is around to leave those breadcrumbs, the answer is just an answer.
As such, I went searching for data to help show how voting has changed since 2015 but ended up leaning on data I could get without SEDE, like the (all-time) vote history of the 20 highest-volume voters on SO in 2025.
I discovered that only five of the 20 upvote more than they downvote and that, combined, 87% of their 1.4 million all-time votes were downvotes. That would be great for my argument, except that 75% of those votes are on questions, and only two of the 20 vote more on answers than questions.
While I know that not all users will follow the patterns of these 20, I wondered: if the top voters are voting so little on answers...
- Is anyone voting on answers?
- Is the foundation the platform was built on even still sound?
- If people don't vote on answers, is SO even useful?
So, I needed more info. I collected all-time and recent data about both questions and answers, looking far more broadly than score, using on-site search - meaning my numbers don't include deleted posts. I also chose to only look at open questions, and only at data through the end of 2024, so nothing was too new to have gotten votes. I'd say it's fair to call my numbers a "best case scenario".
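As a sketch of what that data collection looks like, these are the kinds of queries Stack Overflow's on-site search operators support (the specific queries below are illustrative examples, not necessarily the exact ones I ran):

```
Open questions from the period:   is:question closed:no locked:no wiki:no created:2022..2024
...with no answers:               is:question closed:no created:2022..2024 answers:0
...with an accepted answer:       is:question closed:no created:2022..2024 hasaccepted:yes
```

The result counts from searches like these are where the table totals and percentages below come from.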
I looked at:

- Are there still more answers than questions? Yes, about 1.3 answers per question.
- What percentage of questions are still getting at least one answer? 55%.
- What percentage of questions get at least two answers? 20%.
  - Well, I guess you don't need to rank answers if 74% of answered recent questions only have one answer.
- Do recent questions ever get lots of answers? Yes - 603 questions from 2022-2024 have 10 or more answers.
  - Fun fact - 3 of those qualify as "Unanswered".
- What percentage of answers have a score of 0 or less? 53%.
  - Wait... so half of recent non-deleted answers have a score of zero or less? In fact, half have a score of exactly 0. Only 3% of recent answers on SO have a score less than 0.
- Questions are doing better, right? Worse. 63% of recent, non-deleted questions have a score of 0 or less; 55% have a score of exactly 0.
You'd think that - as the person who (literally) brought you the "1 rep to vote" concept - I'd have looked at data like this to support the case for increasing voting on SO, and that I'd be talking about stuff I learned in 2023. You would be wrong. If anyone looked at this data, it wasn't shared with me, as far as I can remember.
So, here's most of the info I got from my searches on Stack Overflow -
"Answerable" question data
Here's a comparison of the "answerable"† question status on Stack Overflow up to the end of 2024 ("All-time") and for 2022-2024. Due to the Roomba cleanup that deletes unanswered, zero-score questions after a year, I'll also include info for 2022 - 17 Feb 2024, labeled "Roomba" in the table.

| Question category | All-time | % | 22-24 | % | Roomba | % |
|---|---|---|---|---|---|---|
| "Answerable" questions† | 23.1m | N/A | 2.5m | N/A | 2.1m | N/A |
| Question score >0 | 11.4m | 49% | 901k | 36% | 792k | 37% |
| Question score <1 | 11.7m | 51% | 1.6m | 64% | 1.3m | 63% |
| Questions with at least one answer | 20.0m | 87% | 1.9m | 75% | 1.7m | 78% |
| Questions with at least two answers | 8.0m | 34% | 479k | 19% | 429k | 20% |
| Questions with "Accepted" answer - % of all Qs | 11.8m | 51% | 928k | 37% | 822k | 39% |
| Questions with "Accepted" answer - % of Qs with >0 answers | 11.8m | 87% | 928k | 90% | 822k | 90% |
| Questions with zero answers | 3.1m | 13% | 637k | 25% | 461k | 22% |
| "Answered" questions§ | 15.8m | 69% | 1.3m | 51% | 1.1m | 54% |
| "Answered" due to accepted answer - % of all Qs | 2.3m | 10% | 266k | 11% | 233k | 11% |
| "Answered" due to accepted answer - % of "Answered" | 2.3m | 15% | 266k | 20% | 233k | 20% |
| "Unanswered" questions§ | 7.3m | 31% | 1.2m | 49% | 967k | 46% |
| "Unanswered" with at least one answer - % of "Unanswered" | 4.2m | 57% | 588k | 48% | 506k | 52% |
| "Unanswered" with at least two answers - % of "Unanswered" | 870k | 12% | 91k | 7% | 80k | 8% |

† - "Answerable" means questions that are not deleted, closed, locked, or wikis. There are 24.2 million total non-deleted questions.

§ - "Answered" and "Unanswered" refer to whether a question has an accepted answer or at least one answer with a score of >0.
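Unless otherwise labeled, each percentage column is just that row's count over the "Answerable" total for the same period. A quick sanity check of two all-time rows (using the rounded counts from the table, so a point of drift is possible):

```python
# Recompute two all-time percentage columns from the table's rounded counts (in millions).
answerable = 23.1           # "Answerable" questions, all-time
at_least_one_answer = 20.0  # questions with at least one answer
zero_answers = 3.1          # questions with no answers

pct_with_answer = round(100 * at_least_one_answer / answerable)  # 87
pct_no_answer = round(100 * zero_answers / answerable)           # 13
print(pct_with_answer, pct_no_answer)  # 87 13
```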
Over the last two years, questions are faring worse at getting any answer at all, while getting a "validated" answer is even less likely and relies more on the asker accepting one. The likelihood of getting multiple answers is lower as well.
It's important to remember that these numbers only reflect questions that are currently answerable - not all questions that were asked. This table excludes everything the community has already acted on to close and/or delete or that askers have deleted on their own.
Even the "Roomba" column - what I think of as the "best case scenario" - shows the current trajectory isn't looking great, with nearly half of recent "Answerable" questions considered "Unanswered" by the system and 22% having no answers at all.
Answer data
So, now we know how likely a question is to be "Answered" - let's see what the answers look like. Here's their status on Stack Overflow up to the end of 2024 ("All-time") and for 2022-2024. Note: this table includes answers to questions that aren't "Answerable".
| Answer category | All-time | % | 22-24 | % |
|---|---|---|---|---|
| All Answers | 35.9m | N/A | 3.3m | N/A |
| Answer score <1 | 14.6m | 41% | 1.8m | 53% |
| Answer score <0 | 811k | 2% | 106k | 3% |
| Answer score >2 | 7.7m | 21% | 334k | 10% |
| Answer score >9 | 1.7m | 5% | 36k | 1% |
| Accepted Answers | 12.4m | 34% | 986k | 30% |
| Accepted Answer score <1 - % of accepted | 2.8m | 23% | 306k | 31% |
| Accepted Answer score <0 - % of accepted | 70k | 0.6% | 9k | 0.9% |
Over half of answers created in the last two years have a score of 0 or less - mostly a score of 0. Far fewer answers have a score of two or greater than did in the past. While some of this is likely due to accumulation of votes over time, it certainly calls into question some of the core tenets of the platform.
I don't know how to explain this succinctly - the platform, company, community, and users each have ideas of what this platform is, what it is "supposed" to be, and what it "could" be - but these ideas each seem flawed.
Top voters data
While by no means representative of all voting, here are the all-time voting habits of the top voters on SO so far in 2025 (as of Feb 12). Note - these numbers do not reflect the makeup of votes cast this year, since that info isn't available. This covers all users who have cast at least 1,000 votes in 2025.
The top 20 voters in 2025 fall into a few categories:
- Exclusively upvote - 3 upvote >98% of the time
  - 1 votes <25% on questions, 2 vote 50-75% on questions
- Primarily upvote - 2 upvote 75-80% of the time
- Primarily downvote - 9 upvote 2-25% of the time
  - 1 votes 25-50% on questions, 5 vote 50-75% on questions, 3 vote >75% on questions
- Exclusively downvote - 6 upvote <2% of the time
  - 1 votes 50-75% on questions, 5 vote >75% on questions
Overall (all time):
- 1.4 million votes
- 75% question votes
- 87% downvotes
- 21k average reputation, 6.5k median
- 9 years average account age, 10 years median
- The two newest voters (2 months and 1 year) upvote >98% of the time and are the only users with <2k votes.
- The 6 lowest-rep users (<2k) are equally split between primary upvoters and primary downvoters, as are the two highest rep (>100k).
While my interpretation is debatable and the data is incomplete due to deleted posts, the points I take from this are:
- Despite there being more answers posted per year, the top voters vote on questions more than answers.
- At this volume, primary upvoters tend to be lower reputation but primary downvoters are represented at all reputation levels.
Bigger than wrong answers
So... since I pulled all of this info and it seems to be a "best case scenario"... I've been really trying to figure out how to frame this as an answer to the question. I agree with a lot of the answers already here. I've long championed many of them personally, and over the years I've informally collected data and user sentiment about them - material that frequently just ended up living rent-free in my head. What was the use of writing down proposals knowing nothing would ever come of them?
Seeing these numbers is scary and leaves me questioning whether the core design of the platform actually supports the original intention and searching for ideas that would lead to a dramatic shift towards making the platform successful. But I do know that deleting verifiably wrong answers instead of (not) downvoting them isn't going to solve anything.
Is Stack Overflow delivering on its core purpose?
What are we even supposed to be doing anyway?
This platform was established to solve the problem of programmers struggling to find trustworthy solutions to issues they face in a miasma of forums, blogs, and other resources, sometimes behind paywalls. It intended to do that by having a durable library of high-quality questions and answers that anyone could access or contribute to freely. It would be curated and maintained by a community of experts and the content "validated" by voting - which would rank answers to help the best "bubble up" to the top. This is covered in the Tour.
The cake is a lie
And by "cake", I mean the Tour.
Ask questions, get answers, no distractions
Increasingly, people don't get a single answer, let alone multiple answers. Historically, fewer than 15% of questions have no answers, but 25% of questions asked between 2022 and 2024 don't have one. And while nearly 35% of questions all-time have at least two answers, fewer than 20% of questions asked between 2022 and 2024 do.
Good answers are voted up and rise to the top.
In actuality, 40% of all answers and 50% of answers from the last two years have a score of 0. Only about 20% of answers all-time have a score of >2 - and that's true of only 10% of answers posted in the last two years.
You earn reputation when people vote on your posts
Technically true, but with so few posts earning any upvotes, it's unlikely a user will actually earn enough reputation to unlock privileges without hitting the jackpot.
Cutting away
At some point, things stopped working as intended - and some may have never worked well. In some cases, things changed - the people, the world, the platform. In other cases, they stayed the same when they needed to change. I appreciate questions like this one because they encourage us to look at the big picture, question the status quo, and identify what is working, what needs to change, and what we think will have the most impact.
We need to stop acting like the platform is OK.
If you can look at the data above and think a few small changes here and there will take the site back to its 2010 glory... please help me understand that. Did you even know - really know - that things were that bad?
Company people - I understand that it's "bad for business" to talk about your tentpole product going down the toilet but you need to stop acting like the site just needs different content types to appeal more to younger users or some minor "pain points" need to be addressed so that people are more willing to ask/answer questions. The recent community product roadmap blog post states:
the Question Assistant helps new askers improve their question and boost the likelihood of getting an answer by 12%.
This isn't even accurate, based on the data shared on MSO. While the "Success rate" increased by 12% - from 40% to 44% - getting an answer only increased by 6% - from 43% to 46%. By relying on a relative percentage change instead of stating what percentage of questions get answered, you obscure the fact that the answered rate is so terrible in the first place.
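To make the relative-vs-absolute distinction concrete (with generic illustrative numbers, not the MSO data): a "12% boost" to a 40% answered rate is a relative change, and it looks very different from 12 percentage points:

```python
# Relative vs. absolute percentage change - illustrative numbers only.
base_rate = 0.40  # hypothetical answered rate

relative_boost = base_rate * 1.12  # "12% more likely" -> 0.448 (44.8%)
absolute_boost = base_rate + 0.12  # 12 percentage points -> 0.52 (52.0%)

print(f"relative: {relative_boost:.1%}")  # relative: 44.8%
print(f"absolute: {absolute_boost:.1%}")  # absolute: 52.0%
```

Either way, a "12% boost" framed as a relative change still leaves the majority of questions unanswered.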
This is the thing - the system is failing, which makes it difficult for users to actually find value in the platform and leaves the site looking like a ghost town - further depressing people's interest in participating. It also makes it difficult for y'all to validate that any changes being made are actually doing good - if no one is voting, you can't really tell whether post quality got better.
Start talking about the major platform problems in public - honestly, without marketing lingo to spin it. When you can't acknowledge things are broken and be specific, the community is unlikely to believe that you actually see what's broken. Step back and talk about the whole forest, not just one tree at a time. Present potential solutions you think might address those problems as concepts, not half-built features. Recognize the expertise of users by inviting critique and alternative ideas from the community.
Community people - y'all have been trying to draw attention to many of these issues for years. The answers here show that many things are up for consideration when it comes to restructuring the platform but the responses to the One rep to vote post really woke me up to how difficult it is as a staff member to even have these discussions.
Y'all have really valid and key concerns that even someone like me doesn't always see - that's why the community discussion is so valuable and why it's frustrating to feel unheard. But some of y'all tend to present those concerns as insurmountable or inevitable or couple valid risks with untested assertions of the scale of abuse or damage.
I understand that when things feel inevitable and people feel like they have no control, it's common to overstate or exaggerate arguments for impact in an effort to get someone to hit the brakes. It does work - but only so much. Eventually, people will just start ignoring what's said and dismiss it as hyperbolic. The thing is, with so many individual voices, even if the specific people speaking up against something may vary, it can seem like the community is always using this tactic, causing it to lose impact very quickly.
You don't have to trust the company to work with them. When you respond to staff posts, aim to create the same quality of meta answer that's expected on main sites - complete answers with explanations and supporting documentation. If something shouldn't be done, explain why and give an alternative way to address the issue. Try to understand the goals of the question and feel free to propose a frame challenge. These answers actually give staff things to think about they may not have realized or considered. Thanks to those already doing this!
Stop investing in projects that don't address the core problems
While making it easier for people to write good questions or encouraging people to write answers might make the numbers investors look at go up, they actually add to the platform's problems: there's too much content to curate, too few people to do it, and no tools to support community efforts.
Adding new content types (with minimal or no moderation) will exacerbate that problem and prevent the community from actually addressing it because now they have a new section of the site with minimal moderation tooling that also has to be curated. Banking on people who want to chat in discussions deciding to stick around and answer questions or review is unrealistic.
Fix what's already broken before creating more stuff.
Consider - if people aren't coming here any more, is it really because they want different content types, or is it because they can't actually find the content they need to address their problems? Do you honestly think you'll be able to compete with YouTube, Reddit, Discord, and other existing platforms people are already using?
I don't have all of the solutions - no one person does. That's why this community is such an amazing resource. There needs to be a vision and that needs to be a collaborative effort. If everyone's working towards the same goal, things can come together quickly.
But that requires everyone. I apparently failed to actually state my point (thanks Shadow). Over the last few years, I've seen a widening gulf between the company and the community and a lot of polarization of the amazing people on either side. I know (many of) those people and believe they are kind and capable and honest. I also know they're often stressed and anxious and human - they make mistakes, forget simple things, or rush in without enough information. No one I know actually wants this platform to fail.
As I stated in a comment:
We need to stop seeing the company and the community as effigies of everything that's "bad" about their respective entity as a whole. We need to start thinking beyond their actions and looking to understand the true reasons for them rather than making uncharitable assumptions. That requires being vulnerable and open about things - which isn't easy in the situation we're in. But I don't think that the platform can succeed without us working together to find a way forward unclouded by secrecy and ulterior motives (or the assumptions of them).