Conversation


regenrek commented on Jan 26, 2026

summary

  • Centralize rate limit helpers and enforce CF-only IP parsing.
  • Add opt-in forwarded-header support for non-Cloudflare deployments.

motivation

  • Remove duplicated logic, block spoofing by default, and avoid breaking non-CF setups.

what's included

  • Shared rate-limit helper in convex/lib/httpRateLimit.ts.
  • convex/httpApiV1.ts updated to use helper.
  • Optional forwarded IP support gated by TRUST_FORWARDED_IPS (sketched below).
  • Tests for CF-only and opt-in behavior.
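
A minimal sketch of the IP resolution the shared helper might contain, assuming a standard `Request` object and an environment variable for the opt-in. The name `getClientIp` and the `TRUST_FORWARDED_IPS` flag come from this PR; the exact body below is illustrative, not the merged code.

```ts
// convex/lib/httpRateLimit.ts (sketch, not the merged implementation)

/**
 * Resolve the client IP used as the rate-limit key.
 * Only Cloudflare's cf-connecting-ip header is trusted by default;
 * x-forwarded-for is honored only when TRUST_FORWARDED_IPS=true is set
 * for non-Cloudflare deployments.
 */
export function getClientIp(request: Request): string {
  const cfIp = request.headers.get("cf-connecting-ip");
  if (cfIp) return cfIp.trim();

  if (process.env.TRUST_FORWARDED_IPS === "true") {
    const forwarded = request.headers.get("x-forwarded-for");
    if (forwarded) {
      // The left-most entry is the original client in a well-formed chain.
      return forwarded.split(",")[0].trim();
    }
  }

  // No trusted header: all such requests share one "unknown" bucket.
  return "unknown";
}
```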

what's not included

  • Download endpoint changes or dedupe logic.
  • UI changes.

tests

  • bun run test
  • bun run lint

affected files

  • convex/httpApiV1.ts
  • convex/lib/httpRateLimit.ts
  • convex/lib/httpRateLimit.test.ts

prompt

# ClawdHub Security Hardening

## Goal & Success Criteria

- Block download inflation: rate limiting plus per‑IP/day dedupe on ZIP downloads.
- Fix IP spoofing: trust only cf-connecting-ip.
- Files tab shows a full file viewer plus warnings for dangerous commands.
- Each PR uses Conventional Commits, runs the full test suite, and is opened from the regenrek fork.

## Non‑goals / Out of Scope

- Replace download stats with installs.
- Auth‑gated downloads or paid access.
- Deep static analysis beyond warning heuristics.

## Assumptions

- Rate limits: new “download” tier tighter than “read”.
- IP trust: CF‑only, no fallback.
- PRs live on regenrek fork, not upstream.

## Proposed Solution

- Single canonical rate‑limit/IP module (Convex best‑practice: no duplicated logic).
- Dedupe table keyed by hashed IP + skill + day bucket; increment only once (key derivation sketched below).
- UI file viewer loads via existing getFileText; warnings from regex scan.
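
One way to derive that dedupe key is to hash the IP so raw addresses never reach the database and to bucket by UTC day. `hashIp` and `getDayStart` are hypothetical helper names used only for illustration.

```ts
// Hypothetical helpers for the dedupe key (names are illustrative).

/** SHA-256 the client IP so only a hash is ever stored. */
export async function hashIp(ip: string): Promise<string> {
  const bytes = new TextEncoder().encode(ip);
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

/** Start of the current UTC day in ms, used as the day bucket. */
export function getDayStart(now: number = Date.now()): number {
  const DAY_MS = 24 * 60 * 60 * 1000;
  return Math.floor(now / DAY_MS) * DAY_MS;
}
```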

### Alternatives Considered

- Reuse “read” limits in convex/httpApiV1.ts:634 — too permissive.
- Fallback to x-forwarded-for — violates CF‑only requirement.
- Store raw IPs — privacy risk.

## System Design

- Increment flow: new mutation recordDownload handles dedupe + stats increment atomically.
- Cleanup: cron job prunes dedupe rows older than N days.
- UI: SkillDetailPage Files tab extracted to SkillFilesPanel with viewer + warnings.
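
The warning heuristic can stay as simple as a pattern table matched against the text the viewer already loads via getFileText. The patterns below are examples of the kind of rules intended, not the exact list for SkillFilesPanel.

```ts
// Example dangerous-command heuristics (patterns are illustrative).
const DANGEROUS_PATTERNS: Array<{ pattern: RegExp; warning: string }> = [
  { pattern: /rm\s+-rf\s+\//, warning: "Recursive delete of a root path" },
  { pattern: /curl[^|\n]*\|\s*(ba|z)?sh/, warning: "Pipes a remote script into a shell" },
  { pattern: /chmod\s+777/, warning: "Sets world-writable permissions" },
];

/** Returns the warnings triggered by a file's text content. */
export function scanForWarnings(fileText: string): string[] {
  return DANGEROUS_PATTERNS
    .filter(({ pattern }) => pattern.test(fileText))
    .map(({ warning }) => warning);
}
```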

## Interfaces & Data Contracts

- recordDownload mutation args: { skillId: Id<'skills'>, ipHash?: string, dayStart: number }.
- downloadDedupes schema: { skillId, ipHash, dayStart, createdAt }.
- Rate limit config includes download: { ip, key } in shared helper.
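
Expressed with Convex validators, the contracts above could look roughly like this. The field names mirror the PR description; the index name, the "unknown" fallback, and the handler details are assumptions.

```ts
// convex/schema.ts (excerpt, sketch)
downloadDedupes: defineTable({
  skillId: v.id("skills"),
  ipHash: v.string(),
  dayStart: v.number(),
  createdAt: v.number(),
}).index("by_skill_ip_day", ["skillId", "ipHash", "dayStart"]),
```

```ts
// convex/downloads.ts (sketch)
import { mutation } from "./_generated/server";
import { v } from "convex/values";

export const recordDownload = mutation({
  args: {
    skillId: v.id("skills"),
    ipHash: v.optional(v.string()),
    dayStart: v.number(),
  },
  handler: async (ctx, { skillId, ipHash, dayStart }) => {
    const key = ipHash ?? "unknown";
    const existing = await ctx.db
      .query("downloadDedupes")
      .withIndex("by_skill_ip_day", (q) =>
        q.eq("skillId", skillId).eq("ipHash", key).eq("dayStart", dayStart)
      )
      .first();
    if (existing) return; // already counted for this IP today

    await ctx.db.insert("downloadDedupes", {
      skillId,
      ipHash: key,
      dayStart,
      createdAt: Date.now(),
    });
    // ...increment the skill's download stats here, in the same mutation,
    // so the dedupe check and the counter stay atomic.
  },
});
```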

## Execution Details

PR 1 — IP + Rate‑Limit Helper (fix/refactor)

- Extract applyRateLimit/getClientIp from convex/httpApiV1.ts:634 into convex/lib/httpRateLimit.ts.
- Update convex/httpApiV1.ts to use shared helper.
- Enforce CF‑only IP parsing.
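
After the extraction, a handler in convex/httpApiV1.ts would shrink to roughly the call below. The applyRateLimit signature (and the idea of it returning a ready-made 429 Response) is an assumption about the helper's shape, and listSkillsV1 is a hypothetical handler name.

```ts
// convex/httpApiV1.ts (sketch of a handler after the extraction)
import { httpAction } from "./_generated/server";
import { applyRateLimit, getClientIp } from "./lib/httpRateLimit";

export const listSkillsV1 = httpAction(async (ctx, request) => {
  const ip = getClientIp(request); // CF-only unless TRUST_FORWARDED_IPS is set
  const limited = await applyRateLimit(ctx, { tier: "read", ip });
  if (limited) return limited; // helper returns a 429 Response when over the limit

  // ...existing handler logic continues unchanged.
  return new Response(JSON.stringify({ ok: true }), {
    headers: { "Content-Type": "application/json" },
  });
});
```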

## Testing & Quality

- Full suite per PR: bun run test and bun run lint.
- Add unit tests for dedupe/rate limit in convex/downloads.test.ts.
- Extend handler tests to cover CF‑only IP logic.
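
A unit test for the CF-only behavior could look like this, assuming the getClientIp sketch above and a Vitest-style describe/it/expect API (swap the import for bun:test if that is what `bun run test` runs).

```ts
// convex/lib/httpRateLimit.test.ts (sketch)
import { describe, expect, it } from "vitest";
import { getClientIp } from "./httpRateLimit";

describe("getClientIp", () => {
  it("trusts cf-connecting-ip", () => {
    const req = new Request("https://clawdhub.example", {
      headers: { "cf-connecting-ip": "203.0.113.7" },
    });
    expect(getClientIp(req)).toBe("203.0.113.7");
  });

  it("ignores x-forwarded-for when TRUST_FORWARDED_IPS is unset", () => {
    const req = new Request("https://clawdhub.example", {
      headers: { "x-forwarded-for": "198.51.100.1, 10.0.0.1" },
    });
    expect(getClientIp(req)).toBe("unknown");
  });
});
```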

## Risks & Mitigations

- Missing CF header → requests fall back to the shared "unknown" key; mitigate by documenting the Cloudflare requirement.
- Dedupe table growth → cron pruning (sketched below).
- Viewer perf → relies on existing 200KB cap.
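
Convex's cron scheduling covers the pruning mitigation. The function names and the 7-day retention below are illustrative stand-ins for the N-day window mentioned above.

```ts
// convex/crons.ts (sketch)
import { cronJobs } from "convex/server";
import { internal } from "./_generated/api";

const crons = cronJobs();

// Prune dedupe rows once a day; 7 days is an example retention, not a decision.
crons.daily(
  "prune download dedupes",
  { hourUTC: 3, minuteUTC: 0 },
  internal.downloads.pruneDedupes,
  { olderThanDays: 7 }
);

export default crons;
```

```ts
// convex/downloads.ts (sketch)
import { internalMutation } from "./_generated/server";
import { v } from "convex/values";

export const pruneDedupes = internalMutation({
  args: { olderThanDays: v.number() },
  handler: async (ctx, { olderThanDays }) => {
    const cutoff = Date.now() - olderThanDays * 24 * 60 * 60 * 1000;
    const stale = await ctx.db
      .query("downloadDedupes")
      .filter((q) => q.lt(q.field("dayStart"), cutoff))
      .collect();
    for (const row of stale) {
      await ctx.db.delete(row._id);
    }
  },
});
```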


vercel bot commented Jan 26, 2026

@regenrek is attempting to deploy a commit to the Amantus Machina Team on Vercel.

A member of the Team first needs to authorize it.
