#5 Better/alternate duplicate file handling for "cleanup" operation

Open
rpdelaney opened this issue 6 years ago · 0 comments

If I download user/subreddit content after a cleanup, the duplicates are downloaded again.

Workarounds/solutions:

  • Instead of removing duplicate files, hardlink them.
  • Store metadata in the directory about already downloaded files (so they are not downloaded again unless requested specifically).
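The first workaround could look something like the following sketch: instead of deleting a duplicate, replace it with a hardlink to the first copy with the same content, so re-downloads see an existing file but no extra disk space is used. The `hardlink_duplicates` helper is hypothetical, not part of the downloader.

```python
import hashlib
import os
from pathlib import Path

def hardlink_duplicates(root: str) -> int:
    """Replace duplicate files under `root` with hardlinks to the first copy.

    Files are considered duplicates when their SHA-256 digests match.
    Returns the number of files replaced with hardlinks.
    """
    first_seen: dict[str, Path] = {}
    replaced = 0
    for path in sorted(Path(root).rglob("*")):
        if not path.is_file():
            continue
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        original = first_seen.get(digest)
        if original is None:
            first_seen[digest] = path
        elif original.stat().st_ino != path.stat().st_ino:
            # Same content but a separate inode: swap in a hardlink.
            path.unlink()
            os.link(original, path)
            replaced += 1
    return replaced
```

Note this only works when all copies live on the same filesystem, which is the usual case for a single download directory.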