#5 Better/alternate duplicate file handling for "cleanup" operation

Open
rpdelaney opened this issue 6 years ago · 0 comments

If I download user/subreddit content after a cleanup, the duplicates are downloaded again.

Workarounds/solutions:

  • Instead of removing duplicate files, hardlink them.
  • Store metadata in the directory about already downloaded files (so they are not downloaded again unless requested specifically).
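The first workaround could look something like the following sketch: hash every file under the download directory and replace byte-identical copies with hardlinks to the first one seen. This is a hypothetical illustration, not the tool's actual cleanup code; the function name and behavior are assumptions.

```python
import hashlib
import os
from pathlib import Path

def hardlink_duplicates(root: str) -> int:
    """Replace files under `root` that have identical content with
    hardlinks to the first such file found. Returns how many files
    were converted to hardlinks. (Hypothetical helper, not part of
    the downloader itself.)"""
    first_seen: dict[str, Path] = {}  # content hash -> first path seen
    linked = 0
    for path in sorted(p for p in Path(root).rglob("*") if p.is_file()):
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        original = first_seen.setdefault(digest, path)
        if original is not path and not path.samefile(original):
            path.unlink()            # remove the duplicate copy
            os.link(original, path)  # recreate it as a hardlink
            linked += 1
    return linked
```

After this pass, each duplicate filename still exists, so a later run sees the file on disk and skips re-downloading it, while the content is stored only once. Hardlinks require all paths to live on the same filesystem.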