One of my directories is filled with a huge number of files. I want to find out what kinds of files they are and which kinds are so numerous.
Here is what happens when I try some commands:
ls -l | wc -l
1514340
ls | head -n 4
2004112700001.htm
2004112700002.htm
2004112700003.htm
2004112700004.htm
ls *.xml | head -n 4
20041127.xml
20041225.xml
20050101.xml
20050108.xml
ls -l *.htm | wc -l
bash: /bin/ls: Argument list too long
0
# Any other ls command with *.htm or *.* fails too.
I understand that wc -l has to wait until the output of ls -l *.htm is entirely done before it can start analyzing it, and because that output is too big, it fails.
Is that truly what is happening?
What is the right way to make the ls command work with wc -l in this case? Is there a way to ask the wc command to start asynchronously, before the output is entirely completed?
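In case it helps, here is a sketch of the alternatives I am considering, which avoid handing the expanded glob to an external command (not yet tested on this directory, and assuming none of the file names contain newlines):

find . -maxdepth 1 -name '*.htm' | wc -l
# find prints one matching name per line, so no huge argument list is ever built

printf '%s\n' *.htm | wc -l
# printf is a bash builtin: the glob still expands, but the execve argument limit does not apply

ls | sed 's/.*\.//' | sort | uniq -c | sort -rn
# count files per extension, to see which kinds are so numerous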
wc isn't failing because the output is too big, nor is the pipe overflowing. ls is not even starting, because *.htm expands into too many arguments for it. – muru Jun 04 '20 at 06:29

I have nothing but .htm files; no .html file, for example. – Marc Le Bihan Jun 04 '20 at 07:06

*.htm expands to 2004112700001.htm 2004112700002.htm 2004112700003.htm 2004112700004.htm ..., then ls is run with all those filenames as arguments, which exceeds the argument length limit. Whether or not you have a .html file makes no difference. Please see the dupe. – muru Jun 04 '20 at 07:08

Isn't *.htm the arg[0] that ls, a C program, takes to resolve a file filter with the classical findFirst/findNext functions? How would ls succeed in expanding *.htm into a list of files? By doing an ls itself? – Marc Le Bihan Jun 04 '20 at 07:17

ls doesn't expand anything. The shell does. See, e.g., https://unix.stackexchange.com/q/17938/70524 – muru Jun 04 '20 at 07:28
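To illustrate what the comments describe, a small sketch (the trace output below is abbreviated and hypothetical): the shell expands the glob before the command starts, and the expanded argument list must fit under the kernel's limit:

set -x        # ask bash to print each command after expansion
ls *.xml
# the trace shows something like: + ls 20041127.xml 20041225.xml 20050101.xml 20050108.xml ...
set +x
getconf ARG_MAX
# the kernel's limit on the combined length of the arguments and environment passed to execve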