As long as there's a valid clone available and no other git commands
fail, we allow `git fetch` to fail and proceed with processing commands.
Even if internet connectivity is down, that alone shouldn't prevent sync
from functioning.
The primary motivation for this change is that the Trash Guides repo is
expected to be relocated soon, and I do not want that move to break the
program during the window between the relocation and when I can update
the URL.
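
A minimal sketch of the new behavior, in Python purely for illustration
(the function name and log message are hypothetical, not the actual
implementation):

```python
import subprocess

# Illustrative only: a failed `git fetch` (e.g. no connectivity, or a
# moved remote) is logged and swallowed so that sync can proceed against
# the existing clone. Failures of any other git command remain fatal.
def update_repo(repo_dir: str) -> None:
    try:
        subprocess.run(["git", "-C", repo_dir, "fetch"], check=True)
    except subprocess.CalledProcessError as exc:
        print(f"git fetch failed ({exc}); continuing with existing clone")
```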
When doing a `sync --preview`, new custom formats are not created, so
they never receive an ID greater than `0`. Because of this, a dictionary
that tracks duplicates by ID produced nonsensical warnings about
duplicate scores.
We now index by Trash ID instead of Format ID, which is more accurate.
A previous commit (SHA: `76040df`) should have been marked as a breaking
change, but was not. Thus, this commit also serves to mark that breaking
change for versioning purposes.
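
A rough sketch of the indexing change (field names such as `trash_id`
are illustrative, not the actual data model):

```python
# In preview mode, every uncreated custom format keeps an ID of 0, so
# keying this dictionary on the service-assigned format ID lumps all new
# formats into one bucket and triggers bogus duplicate-score warnings.
# Trash IDs are unique per guide entry and exist before creation.
def find_duplicates(formats: list[dict]) -> list[str]:
    seen: set[str] = set()
    duplicates: list[str] = []
    for cf in formats:
        key = cf["trash_id"]  # previously: cf["format_id"]
        if key in seen:
            duplicates.append(cf["name"])
        seen.add(key)
    return duplicates
```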
For most CF specifications, there is only one element in the `fields`
array, and each of its objects contains a `value` property. One
particular specification, however, deviates from this assumption:
"SizeSpecification" has been observed with *two* field objects.
The logic for parsing custom format specifications no longer assumes
that the `fields` array contains only one element.
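
A sketch of the corrected parsing, with illustrative field names:

```python
# Collect the value from every object in `fields` rather than assuming a
# single element. For example, "SizeSpecification" carries two field
# objects (e.g. a minimum and a maximum size), so reading only
# fields[0]["value"] would silently drop data.
def parse_spec_fields(spec: dict) -> dict[str, object]:
    return {field["name"]: field["value"] for field in spec["fields"]}
```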
Fixes #178
Due to changes in v4.1.1, zero-value cache entries were sometimes
written to the cache. Such entries are now treated as invalid, and a
match by name is performed instead.
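
Roughly, the fallback looks like this (all names are illustrative):

```python
# A cached format ID of 0 can never refer to a real custom format, so it
# is treated as "no valid cache entry" and the format is matched by name
# against what the service actually has.
def resolve_format_id(cache: dict[str, int], trash_id: str, name: str,
                      service_formats: dict[str, int]) -> int | None:
    cached_id = cache.get(trash_id, 0)
    if cached_id > 0:
        return cached_id              # valid cache hit
    return service_formats.get(name)  # zero-value entry: match by name
```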
Fixes #160
Due to [an issue][1] with the `actions/upload-artifact` action, binaries
lose their permission bits and `xattr` properties when uploaded.
Composite actions `upload-tar` and `download-tar` have been added; they
tarball the artifacts before uploading so those properties are retained.
[1]: https://github.com/actions/upload-artifact/issues/38
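
The real `upload-tar` and `download-tar` actions are YAML composite
actions; the Python sketch below only illustrates why a tar round-trip
helps. Tar entries record mode bits, which the artifact upload otherwise
drops (`xattr` preservation would additionally need GNU tar's `--xattrs`
flag, which `tarfile` does not handle):

```python
import tarfile

# Illustrative only: pack files into a tarball before upload so that
# permission bits survive the artifact round-trip.
def pack(paths: list[str], archive: str = "artifacts.tar") -> None:
    with tarfile.open(archive, "w") as tar:
        for path in paths:
            tar.add(path)  # mode bits are recorded in each tar entry

def unpack(archive: str = "artifacts.tar", dest: str = ".") -> None:
    with tarfile.open(archive) as tar:
        tar.extractall(dest)  # recorded mode bits are restored on extract
```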