p2p: Make HTTP downloads fail fast if using p2p to download

Failing fast when downloading via p2p is desirable because if we're
disconnected from the peer we're downloading from, chances are good
that it's not coming back. For example, the peer could have gone to
sleep (the user shut the lid) or gone out of range. This is unlike the
non-p2p path, where we can assume much better connectivity.
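
As a rough sketch of the intent, the fetcher can pick a much smaller
retry budget when the source is a peer. The names and values below are
illustrative assumptions, not the exact constants added by this change:

  // Illustrative only: constant names and values are assumptions.
  const int kDownloadMaxRetryCount = 20;    // regular HTTP(S) downloads
  const int kDownloadP2PMaxRetryCount = 2;  // p2p downloads: fail fast

  int GetMaxRetryCount(bool is_p2p_download) {
    // A peer that disappeared (machine asleep, out of range) is unlikely
    // to come back, so give up quickly and fall back to the regular
    // HTTP(S) path, which can assume better connectivity.
    return is_p2p_download ? kDownloadP2PMaxRetryCount
                           : kDownloadMaxRetryCount;
  }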

Also introduce new constants instead of hard-coded numbers and move
some existing constants to constants.h.
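
For illustration, the redirect limit referenced by the tests below could
be declared in constants.h roughly as follows; the value shown is an
assumption, only the name kDownloadMaxRedirects comes from the diff:

  // constants.h (sketch)
  namespace chromeos_update_engine {

  // Maximum number of HTTP redirects a download may follow.
  const int kDownloadMaxRedirects = 10;

  }  // namespace chromeos_update_engine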

BUG=chromium:260426
TEST=Unit tests pass
Change-Id: Id2f1d0c60907caec06c4bdff3c70871d9f3eb20d
Reviewed-on: https://chromium-review.googlesource.com/64830
Reviewed-by: David Zeuthen <zeuthen@chromium.org>
Commit-Queue: David Zeuthen <zeuthen@chromium.org>
Tested-by: David Zeuthen <zeuthen@chromium.org>
diff --git a/http_fetcher_unittest.cc b/http_fetcher_unittest.cc
index 7386bb3..f49c563 100644
--- a/http_fetcher_unittest.cc
+++ b/http_fetcher_unittest.cc
@@ -935,7 +935,7 @@
   ASSERT_TRUE(server->started_);
 
   string url;
-  for (int r = 0; r < LibcurlHttpFetcher::kMaxRedirects; r++) {
+  for (int r = 0; r < kDownloadMaxRedirects; r++) {
     url += base::StringPrintf("/redirect/%d",
                               kRedirectCodes[r % arraysize(kRedirectCodes)]);
   }
@@ -951,7 +951,7 @@
   ASSERT_TRUE(server->started_);
 
   string url;
-  for (int r = 0; r < LibcurlHttpFetcher::kMaxRedirects + 1; r++) {
+  for (int r = 0; r < kDownloadMaxRedirects + 1; r++) {
     url += base::StringPrintf("/redirect/%d",
                               kRedirectCodes[r % arraysize(kRedirectCodes)]);
   }