Reporting number of files in Subdirectories, Bash
I'm working on a Win10 computer, but I usually work in Git Bash or in the Linux subsystem.
I'm trying to get the number of files in all subdirectories of a specified directory.
This is similar to the question "How to report number of files in all subdirectories?", but the difference is that my subdirectories do not all have the same number of levels. I have something like:
Dir1/sub1
Dir1/sub1/subsub1
Dir1/sub2
Dir1/sub3/subsub3/subsubsub3
I tried
shopt -s dotglob; for dir in */; do all=("$dir"/*); echo "$dir: ${#all[@]}"; done
playing around with the number of levels to search (*/, */*/* and so on),
but I cannot really get what I'm looking for, which is something like:
Dir1/sub1: Number of files
Dir1/sub2: Number of files
Dir1/sub3: Number of files
Tags: bash

– Faustino Delgado, asked Feb 13 at 15:46 (edited Feb 13 at 16:45)
You'd want a report of files in each of the directories sub1, sub1/subsub1, sub2, sub3, sub3/subsub3, and subsubsub3? Or just for sub1, sub2, and sub3? If this second option, should sub1 and sub3 count files in their subdirectories too?
– roaima
Feb 13 at 22:48
5 Answers

Answer from Kusalananda (answered Feb 13 at 17:02, edited Feb 25 at 13:29):
#!/bin/bash

shopt -s dotglob nullglob

topdir='./Dir1'

for subdir in "$topdir"/*/; do
    find "$subdir" -type f -exec echo . \; |
    printf '%s: %d\n' "${subdir%/}" "$( wc -l )"
done
This small bash script would output a list of pathnames of subdirectories of $topdir followed by the number of regular files found (anywhere) under each of those subdirectories.

The script loops over all subdirectories of $topdir and, for each, it runs the find command

find "$subdir" -type f -exec echo . \;

This outputs a dot on an otherwise empty line for each regular file found under $subdir. We output a dot because these are easy to count (filenames can contain newline characters).

The dots are piped to

printf '%s: %d\n' "${subdir%/}" "$( wc -l )"

Here, printf is used to format the output. It takes the subdirectory path (with the final slash removed) and the count of files. The count of files comes from wc -l, which will count the dots coming over the pipe from find (strictly speaking, it does not count the dots but the newlines). Since printf itself is not reading its standard input stream, the input is instead consumed by wc -l.

Setting the nullglob and dotglob shell options at the start allows us to skip the whole loop if there are no subdirectories under $topdir (that's with nullglob) and also to include hidden directory names under $topdir (that's with dotglob).
By changing
topdir='./Dir1'
into
topdir=$1
you can get the script to take a directory path as its only command line argument.
You may speed the find up radically by changing it into the slightly more complex

find "$subdir" -type f -exec sh -c 'for pathname do echo .; done' sh {} +

(the rest of the loop should be left as it is). This runs a really small in-line shell script for batches of found files, instead of running echo for each file. This would be much quicker assuming echo is a built-in command in the sh shell. (You may want to change sh -c to bash -c to be sure of that.) When -exec echo . \; is used, find would execute /bin/echo, which would be slow to do for each file.
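For convenience, here is a minimal sketch that combines the two changes suggested above, taking the directory as the first command line argument and using the batched -exec form (the script name is hypothetical):

#!/bin/bash
# count-per-subdir.sh (hypothetical name)
# Usage: ./count-per-subdir.sh Dir1
shopt -s dotglob nullglob

topdir=$1

for subdir in "$topdir"/*/; do
    # One dot per regular file, printed in batches; wc -l counts the dots.
    find "$subdir" -type f -exec sh -c 'for pathname do echo .; done' sh {} + |
    printf '%s: %d\n' "${subdir%/}" "$( wc -l )"
done

Run as ./count-per-subdir.sh Dir1, it would print one "Dir1/subN: count" line per immediate subdirectory, counting regular files at any depth below it.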
Answer from Adrian (answered Feb 13 at 16:32, edited Feb 13 at 18:01):
I'm not familiar with Git Bash on Windows, but I'll assume that whatever platform you're running this script on, you have these installed:

- bash v4.x or higher (macOS users will need to install a more recent version via Homebrew or something)
- GNU find -- really, any standard Unix find will do, just not the MS-DOS/Windows version (which is more like grep)

Assuming the above, this script should do the trick:
#!/bin/bash
# USAGE: count_files <dir> ...

declare -A filecount

# Tell bash to execute the last pipeline element in this shell, not a subshell
shopt -s lastpipe

# Run through all the user-supplied directories at one go
for d in "$@"; do
    find "$d" -type f | while read f; do
        [[ $f =~ ^(${d%%/}/[^/]+)/ ]] && (( filecount["${BASH_REMATCH[1]}"]++ ))
    done
done

# REPORT!
for k in "${!filecount[@]}"; do
    echo "$k: ${filecount[$k]}"
done
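A hypothetical invocation, assuming the script above is saved as count_files.sh and the directory layout from the question (the counts shown are made up):

bash count_files.sh Dir1
# Possible output; the order of an associative array's keys is unspecified:
# Dir1/sub1: 5
# Dir1/sub3: 7
# Dir1/sub2: 2

As the comments below point out, reading find's output line by line with read assumes that no pathname contains a newline character.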
I should really learn how to use Bash properly. This did the job very nicely; how do I cite you for showing me the script?
– Faustino Delgado
Feb 13 at 17:03
@FaustinoDelgado Just point back to this answer. The permalink can be found by clicking on the "share" link at the bottom of the answer.
– Adrian
Feb 13 at 17:06
Does this work on directories given with absolute pathnames? What about pathnames containing newlines? It also seems to count the number of files in the given directories, not in their subdirectories, as asked for in the question, but that may just be me not understanding your code. Care to describe what you're doing?
– Kusalananda
Feb 13 at 17:23
@Kusalananda How can you introduce newlines in path names? I do not have that problem, fortunately, yet.
– Faustino Delgado
Feb 13 at 17:39
@Kusalananda And for full disclosure, I just noticed that I’d accidentally edited out a trailing slash from my regex. Time for bed. :)
– Adrian
Feb 13 at 18:06
Answer from Stéphane Chazelas (answered Feb 25 at 14:22):
With GNU utilities:
find Dir1 -mindepth 2 -type f -printf '%P\0' |
  awk -F/ -vRS='\0' '{n[$1]++}; END{for (i in n) print i ": " n[i]}'
Counting only regular files for each of the subdirectories of Dir1.
Outputs something like:
sub1: 3
sub2: 30
sub3: 13
sub4: 3
sub5: 3
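To see why awk's first /-separated field is the top-level subdirectory: find's %P format prints each pathname with the starting point (here Dir1/) stripped off. A small, made-up illustration:

mkdir -p Dir1/sub1/subsub1 Dir1/sub2
touch Dir1/sub1/a Dir1/sub1/subsub1/b Dir1/sub2/c

# tr is used here only to make the NUL-delimited output readable; order may vary:
find Dir1 -mindepth 2 -type f -printf '%P\0' | tr '\0' '\n'
# sub1/a
# sub1/subsub1/b
# sub2/c

With -F/ and a NUL record separator, $1 for those records is sub1, sub1 and sub2, so n[$1]++ accumulates one count per top-level subdirectory regardless of depth.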
Answer from justinpc (answered Feb 13 at 21:15, edited Feb 13 at 22:46):
find $DIR -mindepth 2 -type f -exec bash -c 'echo ${0%${0#$1/*/}}' {} $DIR \; | uniq -c
- The -mindepth 2 means we look only at files which are descendants of direct subdirectories of $DIR.
- -type f looks only at files.
- -exec bash -c "..." {} $DIR \; executes the string with the arguments {} and $DIR, where {} is substituted with each file name found by find.
- The echo part extracts the corresponding direct subdirectory of $DIR from a descendant filename (see the worked example below). See https://stackoverflow.com/questions/16623835/remove-a-fixed-prefix-suffix-from-a-string-in-bash for an explanation of what % and # do. The 0 and 1 correspond to the first and second arguments after the string respectively.
- find will list all descendants of direct subdirectories of $DIR in succession, so uniq -c will return the total number of descendant files along with the name for each direct subdirectory.
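A worked example of that nested parameter expansion, with f standing in for the $0 argument (a hypothetical found pathname) and DIR for $1:

DIR=Dir1
f=Dir1/sub3/subsub3/subsubsub3/file.txt

# ${0#$1/*/} strips the shortest leading match of "Dir1/*/",
# i.e. everything up to and including the first subdirectory:
rest=${f#$DIR/*/}      # subsub3/subsubsub3/file.txt

# ${0%...} then removes that remainder from the end of the full path,
# leaving the direct subdirectory (with its trailing slash):
echo "${f%$rest}"      # Dir1/sub3/

Because find emits the descendants of each direct subdirectory in succession, uniq -c counts the runs of identical Dir1/subN/ lines.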
Won't this show sub3 and sub3/subsub3 as two different entries?
– Stephen Harris
Feb 13 at 21:29
Oh, I see now. We need a command to extract direct subdirectories of ${DIR}.
– justinpc
Feb 13 at 21:31
I've fixed the answer.
– justinpc
Feb 13 at 22:41
Answer from fra-san (answered Feb 14 at 9:45, edited Feb 25 at 15:12):
Assuming that your bash version is at least 4.0, you were actually almost there.

You can allow your code to count files recursively with the globstar shell option. From the bash(1) man page:
If set, the pattern ** used in a pathname expansion context will match all files and zero or more directories and subdirectories. If the pattern is followed by a /, only directories and subdirectories match.
If you want to recursively count all files, including subdirectories, that are in your top-level directories:
shopt -s dotglob globstar

for dir in */; do
    all=( "$dir"/** )
    printf '%s\n' "$dir: ${#all[@]}"
done
As in the code you tried, for each of your top-level directories we populate an array with the results of pathname expansion and then display the number of its elements. dotglob is used to include files whose names start with . (hidden files).
If you want to recursively count all files except for subdirectory objects, you can just subtract the count of subdirectories from the count of all files:
shopt -s dotglob globstar

for dir in */; do
    all=( "$dir"/** )
    alldir=( "$dir"/**/ )
    printf '%s\n' "$dir: $(( ${#all[@]} - ${#alldir[@]} ))"
done
However, here I'm assuming a broad definition of "file", which, in POSIX, may refer to a regular file, a character, block or FIFO special file, a symbolic link, a socket, a directory, or whatever specific implementations may add beyond the standard. To count only a specific type of file (e.g. regular files), it may be easier to resort to a find-based solution.

Alternatively, you can extend the above code, testing for the file type in a loop:
shopt -s dotglob globstar

for dir in */; do
    all=( "$dir"/** )
    count=0
    for file in "${all[@]}"; do
        test -f "$file" && count="$(( "$count" + 1 ))"
    done
    printf '%s\n' "$dir: $count"
done
But this less convenient solution will also be significantly slower than the find-based alternative (e.g. more than two times slower than the faster one in Kusalananda's answer, tested on Linux with bash 5.0 and find 4.6).
Also note that, unlike find in its default behavior, pathname expansion with the globstar option will follow symbolic links that resolve to files, making all the above snippets include them in the counts as well. (Initially it also followed symbolic links that resolve to directories, but this behavior was changed in bash 4.3.)
Finally, to also provide a solution that does not depend on the globstar shell option, you can use a recursive function to count all regular files in the top-level subdirectories of the $1 directory:
#!/bin/bash

# nullglob is needed to avoid the function being
# invoked on 'dir/*' when * matches nothing
shopt -s nullglob dotglob

function count_files () {
    for file in "$1"/*; do
        # Only count regular files
        [ -f "$file" ] && count="$(( "$count" + 1 ))"
        # Only recurse on directories
        [ -d "$file" ] && count_files "$file"
    done
}

for dir in "$1"/*/; do
    count="0"
    count_files "$dir"
    printf '%s: %s\n' "$dir" "$count"
done
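A hypothetical invocation, assuming the script is saved as count_regular_files.sh and made executable (the counts shown are made up):

chmod +x count_regular_files.sh
./count_regular_files.sh Dir1
# Dir1/sub1/: 4
# Dir1/sub2/: 2
# Dir1/sub3/: 6

The trailing slash in the reported names comes from the "$1"/*/ glob used in the final loop.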