
Conversation

Collaborator

@wofferl wofferl commented Jan 28, 2026

Summary

This PR adds timeouts to all HTTP client requests in the feed fetcher to prevent endless loading when adding or importing a feed, especially with CDNs such as Akamai.
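
For reference, the two Guzzle options involved are connect_timeout (time allowed to establish the connection) and timeout (time allowed for the whole request). A minimal sketch of the pattern, assuming Guzzle's Client; the example URL is illustrative, while the values match the ones added in this PR:

    use GuzzleHttp\Client;
    use GuzzleHttp\Exception\TransferException;

    $client = new Client([
        'base_uri' => 'https://feeds.example.org/',  // illustrative URL
        'connect_timeout' => 3,   // seconds to establish the TCP connection
        'timeout' => 10,          // seconds for the complete request/response
    ]);

    try {
        $response = $client->request('GET', 'feed.xml');
    } catch (TransferException $e) {
        // a connect or response timeout lands here instead of hanging forever
    }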

Checklist

@wofferl wofferl added 3. to review API Impact API/Backend code labels Jan 28, 2026
@Grotax Grotax requested a review from Copilot January 29, 2026 04:49
Contributor

Copilot AI left a comment

Pull request overview

This PR adds HTTP request timeouts to prevent endless loading when adding or importing feeds, particularly with CDN services like Akamai. It addresses issues #3488 and #3525 by ensuring all HTTP client instantiations have both connection and response timeouts configured.

Changes:

  • Added connect_timeout of 3 seconds to all HTTP client configurations
  • Added operation-specific timeout values (3 seconds for HEAD requests, 10 seconds for favicon downloads)
  • Updated changelog to document the fix

Reviewed changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 4 comments.

  • lib/Config/FetcherConfig.php: Added a 3-second connect timeout to the main client configuration
  • lib/Fetcher/FeedFetcher.php: Added timeouts to the hasLastModifiedHeader (3s connect / 3s total) and downloadFavicon (3s connect / 10s total) methods
  • CHANGELOG.md: Added an entry documenting the fix for feed fetcher requests getting stuck

Comment on lines +566 to +568
'base_uri' => $base_url,
'connect_timeout' => 3,
'timeout' => 10,

Copilot AI Jan 29, 2026

The array elements use 2-space indentation instead of the standard 4-space indentation used throughout the codebase. This is inconsistent with the PSR-2 coding style followed by this project. Please adjust the indentation to use 4 spaces for each level.
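
For comparison, the same options block with four-space indentation would look like this (surrounding method context omitted):

    $client = new Client([
        'base_uri' => $base_url,
        'connect_timeout' => 3,
        'timeout' => 10,
    ]);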

Comment on lines 134 to 139
$client = new Client([
'base_uri' => $url,
'connect_timeout' => 3,
'timeout' => 3,
]);
$response = $client->request('HEAD');

Copilot AI Jan 29, 2026

The Client here is constructed with base_uri set directly from the user-supplied url (via FeedApiController::create() → FeedServiceV2::create() → hasLastModifiedHeader($url)) and then used to issue a HEAD request, which exposes a server-side request forgery (SSRF) vector. An authenticated user can submit a feed URL pointing to internal hosts or metadata endpoints (e.g. 169.254.169.254 or other private IPs) and cause the Nextcloud server to make HTTP requests to systems that are not otherwise reachable by that user. You should validate and restrict outbound targets before creating the client (e.g. reject local/loopback/metadata IPs and non-HTTP(S) schemes, or route through a hardened HTTP client/allow-list) so user-provided URLs cannot target internal infrastructure.
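
A rough sketch of the kind of pre-flight check described above; the helper name and the exact blocking policy are assumptions, not code from this PR:

    // Hypothetical helper: accept only public http(s) targets.
    function isSafeOutboundUrl(string $url): bool
    {
        $parts = parse_url($url);
        if ($parts === false || !in_array($parts['scheme'] ?? '', ['http', 'https'], true)) {
            return false; // reject non-HTTP(S) schemes
        }
        $host = $parts['host'] ?? '';
        // Resolve the host; literal IPs are checked directly.
        $ips = filter_var($host, FILTER_VALIDATE_IP) ? [$host] : (gethostbynamel($host) ?: []);
        foreach ($ips as $ip) {
            // FILTER_FLAG_NO_PRIV_RANGE blocks 10/8, 172.16/12 and 192.168/16;
            // FILTER_FLAG_NO_RES_RANGE blocks 127/8 and 169.254/16 (cloud metadata).
            if (!filter_var($ip, FILTER_VALIDATE_IP, FILTER_FLAG_NO_PRIV_RANGE | FILTER_FLAG_NO_RES_RANGE)) {
                return false;
            }
        }
        return $ips !== [];
    }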

Comment on lines +565 to 572
$client = new Client([
'base_uri' => $base_url,
'connect_timeout' => 3,
'timeout' => 10,
]);
$response = $client->request(
'GET',
$favicon_url,

Copilot AI Jan 29, 2026

In downloadFavicon() a new Client is created with base_uri taken from the feed’s base_url (ultimately derived from the user-provided feed URL) and then used to perform a GET request to favicon_url, which again allows an authenticated user to trigger server-side HTTP requests to arbitrary hosts. By crafting a feed or favicon URL that points to internal services or cloud metadata endpoints, an attacker can use the News app as an SSRF gadget to reach internal network resources from the Nextcloud server. Add strict validation/allow-listing for both base_url and favicon_url (e.g. block local/loopback/metadata addresses and unsupported schemes, or funnel all outbound HTTP through a secured client service) before issuing these requests.
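
Sketch of how both URLs could be guarded with the hypothetical helper above before the client is built; the failure behaviour is an assumption, the real method may need to throw instead:

    // Sketch only: bail out before constructing the Client in downloadFavicon().
    if (!isSafeOutboundUrl($base_url) || !isSafeOutboundUrl($favicon_url)) {
        return null; // assumed failure path
    }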

Co-authored-by: Copilot <[email protected]>
Signed-off-by: Sean Molenaar <[email protected]>
Collaborator Author

wofferl commented Jan 29, 2026

@Grotax @SMillerDev
Copilot has a point here, which also applies to the calls for the scraper and the actual FeedFetcher.

We could use Remote Host Validation from nextcloud or \OCP\Http\Client\IClientService.
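
For reference, a rough sketch of how \OCP\Http\Client\IClientService is typically injected and used (class and method names here are illustrative); whether its options cover everything needed for the feed fetcher is exactly the question below:

    use OCP\Http\Client\IClientService;

    class ExampleFetcher
    {
        public function __construct(private IClientService $clientService)
        {
        }

        public function fetch(string $url): string
        {
            // Nextcloud's bundled client applies its own remote host validation
            // before connecting, which is what makes it attractive here.
            $response = $this->clientService->newClient()->get($url);
            return (string)$response->getBody();
        }
    }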

@SMillerDev
Contributor

Looking at https://github.com/nextcloud/server/blob/master/lib/public/Http/Client/IClientService.php we won't be able to set what we want there, so we'll need to do validation ourselves. I'd recommend making it a separate PR though.

@Grotax Grotax merged commit 002a1e4 into nextcloud:master Jan 29, 2026
27 checks passed