<TeXmacs|1.99.16>

<project|rehash.tm>

<style|tmmanual>

<\body>
<subsection|Where to download from?>

How can rehash-remirror know a `classical' download location for
downloading content-addressed files, or a substitute server for
downloading the narinfos? If rehash-remirror operates as a proxy, then an
`obvious' download location presents itself. When used as a substitute
server, rehash-remirror needs to be configured to fetch substitutes and
narinfos from a set of `real' substitute servers (though only a single
narinfo will be returned).
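The substitute-server mode described above implies a configured list of
upstream servers to query. A minimal sketch, in Python for illustration;
the server list and function name are assumptions of this sketch, while
the <samp|/\<less\>hash\<gtr\>.narinfo> URL layout is the one
<samp|guix publish> serves:

```python
# Hypothetical configuration for rehash-remirror's substitute-server mode;
# the variable and function names are illustrative, not an actual API.
SUBSTITUTE_SERVERS = [
    "https://ci.guix.gnu.org",
    "https://bordeaux.guix.gnu.org",
]

def narinfo_urls(store_hash):
    """URLs to query for a narinfo, one per configured `real' server."""
    return [f"{server}/{store_hash}.narinfo" for server in SUBSTITUTE_SERVERS]
```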
<subsection|What if the hash isn't in rehash and GNUnet?>

If mapping the hash to a GNUnet FS URI doesn't succeed within a
(configurable) reasonable time frame, rehash-remirror will download from a
`real' substitute server (and afterwards insert the newly found substitute
into GNUnet FS and rehash). It will still try to download via rehash +
GNUnet FS in the background (maybe waiting just a little longer will do
the trick; perhaps the file isn't well-represented yet and needs some
attention to propagate), and the earliest completed download will be used.
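The racing strategy above can be sketched as follows. This is a minimal
illustration assuming hypothetical downloader callables, not
rehash-remirror's actual interface; the timeout value is likewise
illustrative:

```python
import concurrent.futures
import time

GNUNET_TIMEOUT = 0.5  # the configurable `reasonable time frame' (illustrative)

def fetch_with_fallback(fetch_gnunet, fetch_substitute):
    """Race a GNUnet FS download against a substitute-server download.

    The GNUnet download gets the whole time frame to itself; on timeout,
    a download from a `real' substitute server starts, but the GNUnet
    download keeps running in the background, and the earliest completed
    download wins.  Both arguments are hypothetical downloader callables.
    """
    with concurrent.futures.ThreadPoolExecutor(max_workers=2) as pool:
        gnunet = pool.submit(fetch_gnunet)
        try:
            # Wait only for the GNUnet FS download, up to the time frame.
            return gnunet.result(timeout=GNUNET_TIMEOUT)
        except concurrent.futures.TimeoutError:
            # Fall back to a substitute server, keeping the GNUnet
            # download alive; return whichever finishes first.
            substitute = pool.submit(fetch_substitute)
            done, _ = concurrent.futures.wait(
                {gnunet, substitute},
                return_when=concurrent.futures.FIRST_COMPLETED)
            return next(iter(done)).result()
```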
<\problem>
What if the GNUnet FS URI would be found by waiting just a little longer?
Does <samp|guix publish> support range queries? Is downloading part of an
lzip- or gzip-compressed file possible? Could <samp|guix publish> compress
the <samp|nar> in parts? If only a few parts are missing, perhaps
downloading them without compression would be acceptable? For simplicity,
this is ignored for now, but some experimentation would be useful.
</problem>
<subsection|What if the GNUnet FS URI is bad?>

If the hash of the file downloaded via GNUnet FS doesn't check out, the
bad hash mapping will be removed and the download will be retried via a
`real' substitute server.
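A minimal sketch of this verification step, assuming a hypothetical
in-memory hash-to-URI mapping and a fallback downloader (the names and
the use of SHA-256 here are assumptions of this sketch):

```python
import hashlib

def verified_download(expected_sha256, gnunet_data, mappings, fallback):
    """Check a GNUnet FS download against the expected hash.

    On a mismatch, drop the bad hash->URI entry from `mappings` (a
    hypothetical dict) and retry via `fallback`, a hypothetical
    substitute-server downloader taking the expected hash.
    """
    if hashlib.sha256(gnunet_data).hexdigest() == expected_sha256:
        return gnunet_data
    mappings.pop(expected_sha256, None)  # remove the bad mapping
    return fallback(expected_sha256)     # retry via a `real' server
```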
<\remark>
In the worst case, twice as much data as necessary may be downloaded, if
the downloads via GNUnet and a `real' substitute server succeed at
approximately the same time.
</remark>
<\problem>
Can we tell other peers the hash is bad? This might require sending the
whole file to other peers, unless the other peers have a file by that
hash locally, in which case it requires the other peer to hash a
potentially large file (perhaps this is acceptable as a low-priority,
CPU- and IO-throttled job?).
</problem>
<subsection|What if multiple substitute servers have a narinfo?>

rehash-remirror will choose the `best' narinfo, as <shell|guix substitute>
does. Currently, the heuristic is to prefer low (compressed) file sizes.
</body>

<\initial>
<\collection>
<associate|save-aux|false>
</collection>
</initial>