
<TeXmacs|1.99.16>

<project|rehash.tm>

<style|tmmanual>

<\body>
  <subsection|Where to download from?>
  How can rehash-remirror know a \<#2018\>classical\<#2019\> download
  location for content-addressed files, or a substitute server from which
  to download narinfos? When rehash-remirror operates as a proxy, an
  \<#2018\>obvious\<#2019\> download location presents itself. When used as
  a substitute server, rehash-remirror needs to be configured with a set of
  \<#2018\>real\<#2019\> substitute servers from which to fetch substitutes
  and narinfos (though only a single narinfo will be returned).
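  The two modes described above can be sketched as follows. This is a
  hypothetical illustration, not rehash-remirror's actual code; the server
  URLs and the <verbatim|mode> strings are made up for the example.

```python
from urllib.parse import urlsplit

# Hypothetical configured list of 'real' substitute servers.
CONFIGURED_SERVERS = ["https://ci.guix.gnu.org", "https://bordeaux.guix.gnu.org"]

def download_locations(mode, requested_url=None):
    """Return candidate 'classical' download locations for a request."""
    if mode == "proxy":
        # The request being proxied already names the server to download from.
        parts = urlsplit(requested_url)
        return [f"{parts.scheme}://{parts.netloc}"]
    # As a substitute server, fall back to the configured 'real' servers.
    return list(CONFIGURED_SERVERS)
```

  In proxy mode the answer is derived per-request; in substitute-server mode
  it is static configuration.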
  <subsection|What if the hash isn't in rehash and GNUnet?>
  If mapping the hash to a GNUnet FS URI does not succeed within a
  (configurable) reasonable time frame, rehash-remirror will download from a
  \<#2018\>real\<#2019\> substitute server (and afterwards insert the newly
  found substitute into GNUnet FS and rehash). It will still try to download
  via rehash + GNUnet FS in the background (perhaps waiting just a little
  longer will do the trick, or the file simply isn't well-represented yet
  and needs some attention to propagate), and the earliest completed
  download will be used.
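  The race between the two download paths can be sketched with threads; this
  is only an illustration of the \<#2018\>earliest completed download
  wins\<#2019\> behaviour, under the assumption that <verbatim|via_gnunet>
  and <verbatim|via_substitute> are hypothetical blocking callables that
  return the downloaded data.

```python
import concurrent.futures

def race_downloads(via_gnunet, via_substitute, uri_timeout=5.0):
    """Start the GNUnet FS download; if it has produced nothing within
    uri_timeout seconds, also start the classical substitute download,
    keep both running, and return whichever result arrives first."""
    pool = concurrent.futures.ThreadPoolExecutor(max_workers=2)
    gnunet = pool.submit(via_gnunet)
    done, _ = concurrent.futures.wait([gnunet], timeout=uri_timeout)
    if not done:
        # Mapping/downloading via GNUnet FS is taking too long: also try a
        # 'real' substitute server, without cancelling the GNUnet attempt.
        substitute = pool.submit(via_substitute)
        done, _ = concurrent.futures.wait(
            [gnunet, substitute],
            return_when=concurrent.futures.FIRST_COMPLETED)
    pool.shutdown(wait=False)
    # The earliest completed download wins.
    return done.pop().result()
```

  A real implementation would also verify the winner's hash and cancel or
  reuse the loser, details this sketch omits.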
  <\problem>
    What if the GNUnet FS URI could be found by waiting just a little
    longer? Does <samp|guix publish> support range queries? Is downloading
    a part of an lzip- or gzip-compressed file possible? Could <samp|guix
    publish> compress the <samp|nar> in parts? If only a few parts are
    missing, perhaps downloading these without compression would be
    acceptable? For simplicity, this is ignored for now, but some
    experimentation would be useful.
  </problem>
  <subsection|What if the GNUnet FS URI is bad?>
  If the hash of the file downloaded via GNUnet FS doesn't check out, the
  bad hash mapping will be removed, and the download will be retried via a
  \<#2018\>real\<#2019\> substitute server.

  <\remark>
    In the worst case, twice as much data as necessary may be downloaded,
    if downloading via GNUnet and via a \<#2018\>real\<#2019\> substitute
    server succeed at approximately the same time.
  </remark>
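  The verify-then-retry step can be sketched as below. The helper names
  (<verbatim|fetch_via_gnunet>, <verbatim|fetch_via_substitute>,
  <verbatim|drop_mapping>) are hypothetical stand-ins for rehash-remirror's
  actual operations, and SHA-256 is assumed as the content hash for
  illustration.

```python
import hashlib

def fetch_with_verification(expected_sha256, fetch_via_gnunet,
                            fetch_via_substitute, drop_mapping):
    """Download via GNUnet FS, verify the hash, and on mismatch forget the
    bad hash -> URI mapping and retry via a 'real' substitute server."""
    data = fetch_via_gnunet()
    if hashlib.sha256(data).hexdigest() == expected_sha256:
        return data
    # The GNUnet FS URI was bad: remove the mapping and fall back.
    drop_mapping(expected_sha256)
    return fetch_via_substitute()
```

  Verifying the substitute server's response as well would be prudent, but
  is omitted here for brevity.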
  <\problem>
    Can we tell other peers that the hash is bad? This might require
    sending the whole file to other peers, unless the other peer has a file
    by that hash locally, in which case the other peer has to hash a
    potentially large file (perhaps this is acceptable as a low-priority,
    CPU- and I/O-throttled job?).
  </problem>
  <subsection|What if multiple substitute servers have a narinfo?>
  rehash-remirror will choose the \<#2018\>best\<#2019\> narinfo, as
  <shell|guix substitute> does. Currently, the heuristic prefers low
  (compressed) file sizes.
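  The heuristic amounts to a minimum over the candidate narinfos' compressed
  sizes. The record shape below is a simplification invented for this
  sketch; real narinfos carry many more fields.

```python
def best_narinfo(narinfos):
    """Pick the 'best' narinfo: the one with the smallest (compressed)
    file size, mirroring the heuristic described above."""
    return min(narinfos, key=lambda n: n["file_size"])

# Hypothetical candidates from two substitute servers.
candidates = [
    {"server": "https://ci.guix.gnu.org", "file_size": 123456},
    {"server": "https://bordeaux.guix.gnu.org", "file_size": 120000},
]
```

  Other heuristics (server priority, compression method) could be layered on
  top later without changing the interface.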
</body>

<\initial>
  <\collection>
    <associate|save-aux|false>
  </collection>
</initial>