#5 Better/alternate duplicate file handling for "cleanup" operation

Open
Opened 6 years ago by rpdelaney · 0 comments
rpdelaney commented 6 years ago

If I download user/subreddit content after a cleanup, the duplicates are downloaded again.

Workarounds/solutions:

  • Instead of removing duplicate files, hardlink them (a rough sketch follows below).
  • Store metadata in the directory about already downloaded files (so they are not downloaded again unless requested specifically).
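As a rough illustration of the hardlink idea (not the tool's actual cleanup code), a cleanup pass could replace each duplicate with a hard link to the first copy it keeps, so the file name is still present on disk when the next download run checks for it. The directory path, function names, and the SHA-256 hashing choice below are assumptions for the sketch only.

```python
# Sketch: replace duplicate files with hard links to the first copy,
# so duplicate names survive cleanup and are not re-downloaded later.
# Paths and hashing strategy are illustrative, not the tool's real logic.
import hashlib
import os
from pathlib import Path


def file_digest(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def hardlink_duplicates(directory: Path) -> None:
    """Replace duplicates in `directory` with hard links to the first copy seen."""
    seen: dict[str, Path] = {}
    for path in sorted(directory.iterdir()):
        if not path.is_file():
            continue
        digest = file_digest(path)
        original = seen.get(digest)
        if original is None:
            seen[digest] = path
        elif not path.samefile(original):
            path.unlink()            # drop the duplicate's data...
            os.link(original, path)  # ...but keep its name as a hard link


if __name__ == "__main__":
    hardlink_duplicates(Path("downloads"))
```

Hard links only work within a single filesystem, and the downloader would still need to treat an existing file name as "already downloaded" for this to prevent re-fetching.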