I have a directory which contains millions of small files (output from a big data program). I need to delete that directory, but when I type the standard rm * I get:
zsh: sure you want to delete more than 100 files in data/output [yn]? y
zsh: argument list too long: rm
The files all have the same prefix with a unique number appended, like this:
data-12343
data-12344
... etc
So I can't even use wildcard patterns to trash the files in a piecemeal way. I'm looking for advice and tips on how to do this efficiently and in an automated way.
Thanks.
Comments:

- `ls | xargs rm` and it did the trick! – Apr 28 '19 at 12:55
- You shouldn't use `ls` for things like this: a better option would be `printf './%s\0' * | xargs -0 rm`. Please refer to Why you shouldn't parse the output of ls(1) – steeldriver Apr 28 '19 at 14:59
- Use `find` and pipe the results to `xargs` like so: `find . -type f -print0 | xargs -0 rm` – Jeff Apr 28 '19 at 15:34
- If `rm *` failed why would `printf ... *` work? – Chris Davies Apr 28 '19 at 16:06
- In both `zsh` (which the OP is using) and `bash`, `printf` is a shell builtin and so not subject to the `ARG_MAX` limit – steeldriver Apr 28 '19 at 16:32
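A minimal sketch of the approaches from the comments above, run against a small scratch directory (`/tmp/demo-output` is a hypothetical path chosen for the demo; substitute your own `data/output`). The key idea in each variant is that the kernel's `ARG_MAX` limit only applies to argument lists passed to an exec'd program, so we avoid ever expanding millions of filenames onto one command line:

```shell
# Set up a scratch directory with a few files mimicking the data-NNNNN naming.
mkdir -p /tmp/demo-output
for i in 12343 12344 12345; do
    touch "/tmp/demo-output/data-$i"
done

# Option 1: let find perform the deletion itself. No external argument
# list is ever built, so ARG_MAX never comes into play.
find /tmp/demo-output -type f -name 'data-*' -delete

# Option 2: stream NUL-delimited names to xargs, which batches them into
# argument lists that fit under the limit (safe for arbitrary filenames).
#   find /tmp/demo-output -type f -print0 | xargs -0 rm

# Option 3 (zsh/bash only): printf is a shell builtin, so expanding *
# inside it is not subject to ARG_MAX; xargs again feeds rm in batches.
#   cd /tmp/demo-output && printf './%s\0' * | xargs -0 rm
```

Options 2 and 3 are shown commented out so the demo only deletes once; all three end in the same result. `find ... -delete` is generally the fastest since it skips spawning `rm` entirely, though `-delete` is a common extension (GNU/BSD find) rather than strict POSIX.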