
Hotness tweaks: reduce open boost, add cp reveal boost #4150

Merged

hlbmtc merged 7 commits into main from claude/issue-4146-20260122-1742 on Feb 4, 2026

Conversation

SylvainChevalier (Contributor) commented Jan 23, 2026

Implements changes requested in #4146

Changes

  • Reduced question open hotness boost from 20 to 10
  • Added hotness boost of 20 for cp_reveal_time (community prediction reveal)

Generated with Claude Code

Summary by CodeRabbit

  • New Features

    • Added a CP Reveal Time score as a new ranking factor for questions.
  • Changes

    • Removed the approval-score factor from post ranking.
  • Reduced the Open Time hotness boost base from 20 to 10, resulting in lower aggregate hotness totals.
  • Tests

    • Updated tests to reflect removal of the approval score and recalibrated expected hotness values.

coderabbitai (Bot) commented Jan 23, 2026

📝 Walkthrough

Removed the Approval-score component from post hotness and added a new CP Reveal Time hotness component for questions; updated tests to remove approval-score expectations and adjust totals accordingly.

Changes

  • Hotness Service (posts/services/hotness.py): Removed _compute_hotness_approval_score(post) and its entry in POST_HOTNESS_COMPONENTS. Added _compute_question_hotness_cp_reveal_time(question) and registered it in QUESTION_HOTNESS_COMPONENTS. Adjusted the Open Time decay base from 20 to 10.
  • Unit Tests (tests/unit/test_posts/test_services/test_hotness.py): Removed references and tests for _compute_hotness_approval_score. Updated expected hotness totals and related assertions to reflect removal of the approval-score contribution.

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

Poem

🐰 I nudged a number, hopped from score to time,
Approval slipped away, CP reveal starts to chime.
Tests adjusted their sums, the totals rearranged,
A little hotness shuffle — the meadow's gently changed. 🥕✨

🚥 Pre-merge checks: ✅ 2 passed

  • Description Check: ✅ Passed (check skipped; CodeRabbit's high-level summary is enabled)
  • Title Check: ✅ Passed (the title accurately summarizes the main changes: reducing the open boost from 20 to 10 and adding a CP reveal boost)



📜 Recent review details

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between daaabde and a8a29bf.

📒 Files selected for processing (1)
  • posts/services/hotness.py
🧰 Additional context used
🧠 Learnings (1)
📚 Learning: 2026-01-15T19:29:58.940Z
Learnt from: hlbmtc
Repo: Metaculus/metaculus PR: 4075
File: authentication/urls.py:24-26
Timestamp: 2026-01-15T19:29:58.940Z
Learning: In this codebase, DRF is configured to use IsAuthenticated as the default in REST_FRAMEWORK['DEFAULT_PERMISSION_CLASSES'] within metaculus_web/settings.py. Therefore, explicit permission_classes([IsAuthenticated]) decorators are unnecessary on DRF views unless a view needs to override the default. When reviewing Python files, verify that views relying on the default are not redundantly decorated, and flag cases where permissions are being over-specified or when a non-default permission is explicitly required.

Applied to files:

  • posts/services/hotness.py
🔇 Additional comments (3)
posts/services/hotness.py (3)

53-58: Open-time boost reduction looks good.
Cleanly applies the requested lower base while keeping the existing guard/decay behavior intact.


61-66: CP reveal boost computation is solid.
The guard against future reveal times prevents premature boosts.


80-84: Component list update is consistent.
Including the CP Reveal Time Score alongside the other question components matches the new calculator.




github-actions (Bot) commented Jan 23, 2026

🧹 Preview Environment Cleaned Up

The preview environment for this PR has been destroyed.

  • 🌐 Preview App: ✅ Deleted
  • 🗄️ PostgreSQL Branch: ✅ Deleted
  • ⚡ Redis Database: ✅ Deleted
  • 🔧 GitHub Deployments: ✅ Removed
  • 📦 Docker Image: ⚠️ Retained (auto-cleanup via GHCR policies)

Cleanup triggered by PR close at 2026-02-04T19:10:32Z

SylvainChevalier (Contributor, Author) commented:
@claude let's remove the approval hotness boost completely.

claude (Bot) commented Jan 23, 2026

Claude finished @SylvainChevalier's task in 55s.


Task: Remove approval hotness boost ✓

  • Read current hotness.py implementation
  • Remove approval hotness boost code
  • Commit and push changes

Summary

Removed the approval hotness boost completely from posts/services/hotness.py:

  • Deleted the _compute_hotness_approval_score function
  • Removed the "Approval score" entry from POST_HOTNESS_COMPONENTS

The PR now includes three hotness adjustments:

  1. Reduced question open hotness boost from 20 to 10
  2. Added hotness boost of 20 for cp_reveal_time (community prediction reveal)
  3. Removed approval hotness boost entirely

Changes pushed to claude/issue-4146-20260122-1742.


SylvainChevalier (Contributor, Author) commented:
@claude can you see the failed unit tests in the github actions?

claude (Bot) commented Jan 23, 2026

Claude finished @SylvainChevalier's task.


GitHub Actions Test Failures - RESOLVED ✓

  • Check CI status for this PR
  • Identify failed workflow runs
  • Analyze test failure logs
  • Fix test file to remove approval hotness references
  • Commit and push fix
  • Update test expectations based on CI results
  • Commit and push updated expectations
  • Verify all tests pass

Summary

Yes, I can see the failed unit tests in GitHub Actions! The issue was that the test file was trying to import _compute_hotness_approval_score, which was removed when we deleted the approval hotness boost.

Changes Made

Commit 3c6d4f2 - Fix unit tests after removing approval hotness boost:

  1. Removed import of deleted _compute_hotness_approval_score function
  2. Removed test_compute_hotness_approval_score test (no longer applicable)
  3. Updated test_handle_post_boost expected values:
    • Initial hotness: 20 → 0 (no approval boost)
    • After UP boost: 45 → 20
    • After DOWN boost: 3 → -10

Commit 1d1c4b8 - Update test expectations after hotness formula changes:
Updated expected values to reflect the reduced open boost (20 → 10):

  • test_compute_question_hotness active question: 13.0 → 10.5
  • test_compute_question_hotness resolved question: 6.25 → 5.625
  • test_compute_question_hotness annulled question: 1.25 → 0.625
  • test_compute_post_hotness: 110.9 → 109.025

CI Status

  • Run Unit Tests: PASSING
  • PR #4150: PASSING
  • ⏳ Run Integration Tests (Playwright): in progress
  • ⏳ PR Preview Environment: in progress

All unit tests are now passing!



@SylvainChevalier marked this pull request as ready for review on February 3, 2026 at 18:15
coderabbitai (Bot) left a comment

Actionable comments posted: 0

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
tests/unit/test_posts/test_services/test_hotness.py (1)

166-207: ⚠️ Potential issue | 🟡 Minor

Verify the expected total hotness calculation.

The expected value of 109.025 should be verified. The comment on line 185 says the second question "Will be scored as 18" but let me verify:

  • Question 2 open time (2025-04-11, 7 days ago): decay(10, open_time) = 2.5
  • No resolution, no movement specified → Total for Q2: 2.5

The inline comment says "scored as 18" but with the reduced open boost (10 instead of 20), this should now be ~2.5, not 18.

Components breakdown:

  • Max subquestions score: Q1 has both an open-time and a resolution component: decay(10, 2025-04-04) + decay(20, 2025-04-11) = 0.625 + 5 = 5.625. Q2 has only the open-time component: decay(10, 2025-04-11) = 2.5. The max is Q1 with 5.625, so the inline comment saying Q2 "Will be scored as 18" is outdated.
  • Total Boosts Score: 100
  • Posted comments score: 2
  • Net post votes score: 1
  • Relevant ITN news: 0.5 - 0.1 = 0.4 (the article was created at the freeze_time date, 2025-04-18, so no decay applies)

Total: 5.625 + 100 + 2 + 1 + 0.4 = 109.025

The test value is correct, but the inline comments are stale.
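Numerically, this total can be checked in a few lines, using the quadratic decay shape these figures imply (the decay helper below is an inference from the test values, not the repository's actual implementation):

```python
def decay(value: float, age_days: float) -> float:
    # Assumed shape: full value up to 3.5 days, then quadratic falloff.
    return value if age_days <= 3.5 else value * (age_days / 3.5) ** -2

# Q1 (resolved): opened 14 days ago, resolved 7 days ago.
q1 = decay(10, 14) + decay(20, 7)   # 0.625 + 5 = 5.625
# Q2 (open): opened 7 days ago.
q2 = decay(10, 7)                   # 2.5
# Max subquestion + boosts + comments + votes + ITN news.
total = max(q1, q2) + 100 + 2 + 1 + 0.4
```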

📝 Proposed fix to update stale comments

The inline comments on lines 177 and 185 are outdated after the open boost reduction. Consider updating them:

         group_of_questions=factory_group_of_questions(
             questions=[
-                # Will be scored as 15
+                # Will be scored as 5.625 (max)
                 create_question(
                     question_type=Question.QuestionType.BINARY,
                     open_time=make_aware(datetime.datetime(2025, 4, 4)),
                     scheduled_close_time=make_aware(datetime.datetime(2025, 4, 10)),
                     resolution_set_time=make_aware(datetime.datetime(2025, 4, 11)),
                     resolution="yes",
                 ),
-                # Will be scored as 18
+                # Will be scored as 2.5
                 create_question(
                     question_type=Question.QuestionType.BINARY,
                     open_time=make_aware(datetime.datetime(2025, 4, 11)),
                     scheduled_close_time=make_aware(datetime.datetime(2025, 4, 25)),
                 ),
             ]
         ),
🧹 Nitpick comments (1)
tests/unit/test_posts/test_services/test_hotness.py (1)

11-20: Consider adding test coverage for the new CP reveal time score.

The new _compute_question_hotness_cp_reveal_time function was added but there are no dedicated tests for it. Consider adding a test case that exercises the CP reveal time hotness component.

Would you like me to generate a test case for the CP reveal time score component?

📜 Review details

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 4bb9e1e and daaabde.

📒 Files selected for processing (2)
  • posts/services/hotness.py
  • tests/unit/test_posts/test_services/test_hotness.py
🧰 Additional context used
🧠 Learnings (1)
📚 Learning: 2026-01-15T19:29:58.940Z (the same DRF default-permission learning from hlbmtc, Repo Metaculus/metaculus PR 4075, quoted in full in the earlier review above)

Applied to files:

  • posts/services/hotness.py
  • tests/unit/test_posts/test_services/test_hotness.py
🧬 Code graph analysis (1)
tests/unit/test_posts/test_services/test_hotness.py (3)
posts/services/hotness.py (2)
  • compute_post_hotness (160-161)
  • handle_post_boost (210-227)
tests/unit/conftest.py (1)
  • user1 (46-48)
posts/models.py (2)
  • Vote (1063-1077)
  • VoteDirection (1064-1066)
🔇 Additional comments (8)
posts/services/hotness.py (4)

53-58: LGTM! Open time decay value reduced from 20 to 10.

The change aligns with the PR objective to reduce the question open hotness boost.


80-85: LGTM! Component registry updated correctly.

The new "CP Reveal Time Score" component is properly added to QUESTION_HOTNESS_COMPONENTS.


151-157: LGTM! Approval score removed from POST_HOTNESS_COMPONENTS.

The removal aligns with the PR comments requesting complete removal of the approval hotness boost.


61-67: LGTM! New CP reveal time score component follows established patterns.

The implementation correctly mirrors _compute_question_hotness_open_time with appropriate null/time checks before applying decay.

tests/unit/test_posts/test_services/test_hotness.py (4)

59-68: Verify the expected hotness calculation for active question.

The expected value of 10.5 should be verified:

  • Open time (2025-04-11, 7 days ago): decay(10, open_time) = 10 * (7/3.5)^-2 = 10 * 0.25 = 2.5
  • Movement score: 20 * 0.4 = 8 (question is OPEN status)
  • Total: 2.5 + 8 = 10.5

The calculation appears correct.


69-81: Verify the expected hotness calculation for resolved question.

The expected value of 5.625 should be verified:

  • Open time (2025-04-04, 14 days ago): decay(10, open_time) = 10 * (14/3.5)^-2 = 10 * 0.0625 = 0.625
  • Movement: 0 (question is not OPEN, so movement is ignored)
  • Resolution time (2025-04-11, 7 days ago): decay(20, resolution_set_time) = 20 * (7/3.5)^-2 = 20 * 0.25 = 5
  • Total: 0.625 + 0 + 5 = 5.625

The calculation appears correct.


82-92: Verify the expected hotness calculation for unsuccessfully resolved question.

The expected value of 0.625 should be verified:

  • Open time (2025-04-04, 14 days ago): decay(10, open_time) = 10 * (14/3.5)^-2 = 0.625
  • Movement: 0 (no movement specified, and question not OPEN)
  • Resolution time: 0 (resolution is "annulled" which is in UnsuccessfulResolutionType, so ignored)
  • Total: 0.625

The calculation appears correct.


210-228: Verify the boost/bury hotness calculations.

  • Line 220: Initial hotness is 0 (no components contribute) ✓
  • Line 224: After UP boost, score = (0/4) + 20 = 20, so hotness = 20
  • Line 228: After DOWN bury, score = -(20/2) - 20 = -30, total boosts = 20 + (-30) = -10, so hotness = -10

The calculations are correct.
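That boost/bury arithmetic can be reproduced with a small standalone sketch. The /4 and /2 divisors and the flat 20 step are taken from the calculation above; handle_post_boost is the real entry point in posts/services/hotness.py, but its internals here are an assumption:

```python
BOOST = 20

def boost_delta(current_hotness: float, up: bool) -> float:
    # UP: a quarter of the current hotness plus a flat +20.
    # DOWN: minus half of the current hotness minus a flat 20.
    if up:
        return current_hotness / 4 + BOOST
    return -(current_hotness / 2) - BOOST

total_boosts = 0.0
hotness = 0.0                                     # no other components contribute
total_boosts += boost_delta(hotness, up=True)     # (0/4) + 20 = 20
hotness = total_boosts                            # 20
total_boosts += boost_delta(hotness, up=False)    # -(20/2) - 20 = -30
hotness = total_boosts                            # 20 + (-30) = -10
```

Note the DOWN delta is computed against the already-boosted hotness, which is why a single bury more than cancels a single boost.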


@hlbmtc merged commit 36a8369 into main on Feb 4, 2026
14 checks passed
@hlbmtc deleted the claude/issue-4146-20260122-1742 branch on February 4, 2026 at 19:10