Check disk usage of files returned by find when filenames contain spaces
I'd like to output the total disk space for a collection of files output by find.
One of my files has spaces in its name, which is causing du to return a 'No such file or directory' message for it.
chris@chris-x1c6:/media/E/2Videos$ du -ch $(find . -maxdepth 1 -iname "*syed*")
du: cannot access './The': No such file or directory
du: cannot access 'Case': No such file or directory
du: cannot access 'Against': No such file or directory
du: cannot access 'Adnan': No such file or directory
du: cannot access 'Syed': No such file or directory
du: cannot access 'S01E01': No such file or directory
du: cannot access '1080p.WEB.H264-AMRAP.mkv': No such file or directory
4.0G ./The.Case.Against.Adnan.Syed.S01E02.In.Between.the.Truth.1080p.AMZN.WEB-DL.DDP5.1.H.264-NTb.mkv
4.0G ./The.Case.Against.Adnan.Syed.S01E03.1080p.WEB.H264-AMRAP.mkv
3.5G ./The.Case.Against.Adnan.Syed.S01E04.Time.is.the.Killer.1080p.AMZN.WEB-DL.DDP5.1.H.264-NTb.mkv
12G total
I've tried dealing with the spaces by piping the output into sed and either wrapping the filenames in quotes or escaping the blanks with backslashes, but neither approach lets du recognize a filename containing blanks.
It's a little confusing because this works:
chris@chris-x1c6:/media/E/2Videos$ du -ch ./The\ Case\ Against\ Adnan\ Syed\ S01E01\ 1080p.WEB.H264-AMRAP.mkv
4.1G ./The Case Against Adnan Syed S01E01 1080p.WEB.H264-AMRAP.mkv
4.1G total
But this doesn't:
chris@chris-x1c6:/media/E/2Videos$ du -ch $(find . -maxdepth 1 -iname "*syed*" | sed 's/ /\\ /g')
du: cannot access './The\': No such file or directory
du: cannot access 'Case\': No such file or directory
du: cannot access 'Against\': No such file or directory
du: cannot access 'Adnan\': No such file or directory
du: cannot access 'Syed\': No such file or directory
du: cannot access 'S01E01\': No such file or directory
du: cannot access '1080p.WEB.H264-AMRAP.mkv': No such file or directory
4.0G ./The.Case.Against.Adnan.Syed.S01E02.In.Between.the.Truth.1080p.AMZN.WEB-DL.DDP5.1.H.264-NTb.mkv
4.0G ./The.Case.Against.Adnan.Syed.S01E03.1080p.WEB.H264-AMRAP.mkv
3.5G ./The.Case.Against.Adnan.Syed.S01E04.Time.is.the.Killer.1080p.AMZN.WEB-DL.DDP5.1.H.264-NTb.mkv
12G total
Is there a better way to deal with this?
-
Recommended reading: BashFAQ #20: How can I find and safely handle file names containing newlines, spaces or both? – Gordon Davisson 1 hour ago
2 Answers
What if we let find handle the filenames?
find . -maxdepth 1 -iname '*syed*' -exec du -ch {} +
-
Thought I'd tried this (though may have omitted the +). But works like a charm, many thanks! – Chris 9 hours ago
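Since the comment mentions possibly having omitted the `+`, here is a quick sketch of the difference it makes, using a throwaway directory with invented filenames. With `+`, find batches every match into a single du invocation, so `-c` produces one grand total; with `\;`, du runs once per file and each invocation prints its own total.

```shell
# Scratch directory with space-containing names (illustrative only)
tmp=$(mktemp -d)
touch "$tmp/file one.mkv" "$tmp/file two.mkv"

# '+' batches all matches into one du call: one combined total
find "$tmp" -maxdepth 1 -type f -exec du -ch {} +

# '\;' runs du separately for each file: a "total" line per invocation
find "$tmp" -maxdepth 1 -type f -exec du -ch {} \;

rm -rf "$tmp"
```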
How about this?
find . -maxdepth 1 -iname '*syed*' -print0 | xargs -0 du -ch
Explanation of options:
find – What you were using to find the files
-print0 – Terminate each result with a null character, which is the one character that cannot occur in a filename
xargs – Assembles arguments to a command from standard input (stdin)
-0 – Expect the incoming arguments to be separated by null characters
du -ch – The command to which you want to pass the file arguments
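As a quick sanity check, the pipeline can be exercised against a throwaway directory (the filenames below are invented for illustration):

```shell
tmp=$(mktemp -d)
touch "$tmp/The Case S01E01.mkv" "$tmp/The Case S01E02.mkv"

# NUL-delimited names survive the pipe intact, spaces and all
find "$tmp" -maxdepth 1 -iname '*case*' -print0 | xargs -0 du -ch

rm -rf "$tmp"
```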
As for why your proposed sed way of escaping doesn't work: the \ characters you're adding arrive after the shell has already finished quote and escape processing. The output of the command substitution is only word-split on whitespace, so each space-delimited word becomes a separate argument and the backslashes are passed along as literal characters, not treated as escapes.
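A minimal way to see this ordering problem, using toy strings rather than real filenames: backslashes produced inside a command substitution are never re-parsed as escapes; word splitting simply carries them along.

```shell
# The substitution emits 'a\ b', but the shell has already finished
# escape processing; it only word-splits the result on whitespace:
set -- $(printf '%s\n' 'a\ b')
echo "$#"   # 2 arguments, not 1
echo "$1"   # a\   (the backslash is a literal character)
echo "$2"   # b
```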
My solution with xargs ensures that each argument is a path from find, regardless of spaces.
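For completeness: if GNU coreutils is available, du itself can consume a NUL-delimited list, which skips xargs entirely. This is a sketch assuming GNU du's `--files0-from` option, demonstrated on a throwaway directory:

```shell
tmp=$(mktemp -d)
touch "$tmp/name with spaces.mkv"

# '-' tells du to read the NUL-separated file list from stdin
find "$tmp" -maxdepth 1 -iname '*spaces*' -print0 | du -ch --files0-from=-

rm -rf "$tmp"
```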
-
This is helpful, commentary on sed in particular, thank you. Looks like xargs and the -exec option work in a similar fashion per
find's man page: ...The command line is built in much the same way that xargs builds its command lines. – Chris 9 hours ago