How to grep for same string but multiple files at the same time?

  • I have a set of log files that I need to review, and I would like to search for specific strings in those files at once. Is this possible? Currently I am using

    grep -E 'fatal|error|critical|failure|warning' /path_to_file

    How do I use this and search for the strings of multiple files at once? If this is something that needs to be scripted, can someone provide a simple script to do this?

    `grep` can take more than one file argument.

  • grep -E 'fatal|error|critical|failure|warning' *.log

    How do I get `grep` to skip directories, but still recursively check all files? `grep -E 'text' **/*` works, but it prints an error message for each subdirectory (and then correctly checks all the files in them).

    @Jorn, really you should ask a new question, but use `find . -type f -exec grep -E 'fatal|error|critical|failure|warning' {} +`

    I get `grep: invalid max count`, so I used jherran's answer instead.
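    As another way to answer the recursion question above: GNU grep's `-r` flag descends into directories itself, which avoids the per-directory error messages entirely. A minimal sketch (the `/tmp/logdemo` paths are invented for this demo):

```shell
# Tiny demo tree; the /tmp/logdemo paths are invented for this example
mkdir -p /tmp/logdemo/sub
printf 'boot ok\nfatal: disk failure\n' > /tmp/logdemo/app.log
printf 'warning: low memory\n' > /tmp/logdemo/sub/other.log

# -r makes grep descend into directories itself, so there are no
# "Is a directory" errors and nested files are still searched
grep -rE 'fatal|error|critical|failure|warning' /tmp/logdemo
```

    Each match is printed prefixed with the file it came from, including files in subdirectories.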

  • You could use something like this:

    find . -name "*.log" | xargs grep -E 'fatal|error|critical|failure|warning'

    This finds every file with the .log extension and applies the grep command to it.

    That was too easy :) Actually, all of my files are .log, so `grep -E 'fatal|error|critical|failure|warning' *.log` works too. Thanks!

    Nice work on the comprehension and adaptation, user53029!

    Why bother with `xargs` and the possibility for severe breakage on whitespace in filenames when you can just use `find . -name '*.log' -exec grep -E 'fatal|error|critical|failure|warning' {} +`?
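    If you do prefer the `xargs` pipeline, the whitespace breakage this comment describes can be avoided with null-delimited names (`find -print0` with `xargs -0`, supported by GNU and BSD tools). A sketch with an invented demo directory:

```shell
# Demo file whose name contains a space (invented path)
mkdir -p /tmp/xargsdemo
printf 'error: oops\n' > '/tmp/xargsdemo/my app.log'

# -print0 and -0 separate names with NUL bytes, so whitespace in
# file names cannot split one name into two arguments
find /tmp/xargsdemo -name '*.log' -print0 |
  xargs -0 grep -E 'fatal|error|critical|failure|warning'
```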

  • If it is simpler, you can just specify each file one after the other.

    grep -E 'fatal|error|critical|failure|warning' file1.log file2.log 

    So simple, but I was looking for this for too long :)

  • If you need to grep an arbitrary set of file names that cannot be matched by a wildcard pattern:

    grep -E 'fatal|error|critical|failure|warning' `cat <<FIN
    > file1
    > file2
    > ...
    > filen
    > FIN`

    What's the advantage over pasting the file names one after another? You could compile the file name list in a text file and then paste it.
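    Building on that comment: you can keep the file names in a plain text file, one per line, and feed them to `grep` with `xargs` instead of a heredoc. The `list.txt` name and demo files below are invented; `-d` is a GNU xargs option:

```shell
# Invented demo files and a list file naming them, one path per line
mkdir -p /tmp/listdemo
printf 'critical: kaboom\n' > /tmp/listdemo/file1
printf 'all fine here\n'    > /tmp/listdemo/file2
printf '/tmp/listdemo/file1\n/tmp/listdemo/file2\n' > /tmp/listdemo/list.txt

# GNU xargs: -d '\n' treats each line as one argument, so file names
# containing spaces still work (unlike the default word splitting)
xargs -d '\n' grep -E 'fatal|error|critical|failure|warning' < /tmp/listdemo/list.txt
```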

  • If you also want to search recursively through the files in subdirectories, you can use the command below (`egrep` is equivalent to `grep -E`):

    egrep -r "string1|string2" pathname
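    GNU grep can also restrict such a recursive search to particular file names with `--include`, combining recursion with the `*.log` filtering from earlier answers. A sketch with invented demo paths:

```shell
# Demo tree with a .log file and a decoy .txt file (invented paths)
mkdir -p /tmp/incdemo/sub
printf 'failure: unit test\n' > /tmp/incdemo/sub/run.log
printf 'failure: ignore me\n' > /tmp/incdemo/sub/notes.txt

# --include limits the recursive search to names matching the glob
grep -rE --include='*.log' 'fatal|error|critical|failure|warning' /tmp/incdemo
```

    Only the `.log` file is searched; the `.txt` file is skipped even though it contains a matching string.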
  • This was a very time-consuming task. And yes, it certainly needs to be scripted if you're going to search for multiple strings in multiple different logs at the same time. I recently had to do this, and it was quite painful. Nevertheless, it is done and ready, and it can be downloaded from the following link:

    Log Search Script Download

    The way this works is pretty simple.

    Scenario 1: Monitor ONE string in just ONE log file

    ./ localhost /var/tmp/logXray autonda /var/log/messages 60m 'can.*t.*open' '.'  1 2 single_errCheck -ndshow

    Scenario 2: Monitor MULTIPLE strings in just ONE log file

    ./ localhost /var/tmp/logXray autonda /var/log/messages 60m 'can.*t.*open_P_ntpd.*stat' '.'  1 2 multi_errCheck -ndshow

    Scenario 3: Monitor Single/Multiple strings in Multiple log files

    ./ localhost /var/tmp/logXray autonda /var/log 60m 'can.*t.*open_P_ntpd.*stat' '.'  1 2 multi_err_multi_logCheck -ndshow


    The _P_ means OR. It replaces the pipe "|" symbol because it is less likely you'll have to search for a string containing "_P_". If you don't wish to type "_P_", you can just substitute "|" for it.

    When using this script, the parameters you'll be changing frequently are:

    1. The log file or log directory to be monitored
    2. The age a log file must be for it to be monitored, i.e. do not monitor or discover any log file whose timestamp is older than 60 minutes
    3. The strings(s)/pattern(s) you want to watch for
    4. The tag - this is the second to last argument you have to supply. It records stats about the log file(s) you're monitoring under /var/tmp/logXray
    5. The log option -ndshow - This is the parameter you want to use if you wish to output the entries from the logs found matching the pattern(s) you specified. If you just want to see the total count of each pattern found, simply replace '-ndshow' with '-ndfoundmul'.

    When using '-ndfoundmul', you'll get an output similar to:

    [[email protected]]# ./ localhost /var/tmp/logXray autonda /var/log/messages 60m 'can.*t.*open_P_ntpd.*stat' '.'  1 2 blahblahA -ndfoundmul
    OK: [/var/log/messages][1]  /var/log/messages:P=(can_t_open=0 ntpd_stat=0)_F=(117s)_R=(228,228=0) 

    Solution to the Original Poster's Issue: Scan for Multiple strings in multiple log files

    ./ localhost /var/tmp/logXray autonda /var/log 60m 'fatal_P_error_P_critical_P_failure_P_warning' '.'  1 2 multierr_logCheck -ndshow

    OSes: This was tested on Ubuntu and Red Hat
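    For comparison, the same goal (several patterns across several logs, limited to recently modified files) can be sketched with standard tools only. Everything below is a generic illustration using invented demo paths, not the downloadable script:

```shell
# /tmp/scandemo stands in for a log directory such as /var/log
mkdir -p /tmp/scandemo
printf 'fatal: disk died\n' > /tmp/scandemo/a.log
printf 'nothing to see\n'   > /tmp/scandemo/b.log

# -mmin -60: only files modified in the last 60 minutes
# -H / -n: prefix each match with the file name and line number
find /tmp/scandemo -name '*.log' -type f -mmin -60 \
  -exec grep -HnE 'fatal|error|critical|failure|warning' {} +
```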

  • grep -EFn "fatal|error|critical|failure|warning|search-string" /path/to/the/file/log_file?.lo* --color=auto

    This will search for 'fatal or error or critical or failure or warning or search-string' in files whose names start with 'log_file?' and whose extension matches 'lo*', under the path /path/to/the/file/, highlight the matched string, and print the line number it was found at.

    Sure, this is a working answer, but the user asked for searching using a pattern, and you answered using a fixed search string. Sorry, but adding things that were not asked for, like the line numbering and colouring of results, is not likely to make an answer more beneficial. But there will be other questions that can likely be answered with your `grep` skills, so best of luck on your USE career!

    @zagrimsan point taken, I've added the _-E 'fatal|error|critical|failure|warning|'_ param to it.

    And hey, he specifically asked for multiple files, not a search pattern. Please read the question again.

    Quote from the Q: "search specific strings", and the question shows the search pattern (containing multiple strings to match for) he uses. You are right that the title of the question is slightly off from what he is really asking for, though. BTW, `-E` and `-F` can't be used at the same time, they are conflicting (typo?).

  • JigarGandhi's answer demonstrates the use of the asterisk wildcard. There are more of them, and you can see them by running `man 7 glob`.

    One of them that I found useful is range matching with `[]`. Since the system I work on produces sequentially numbered log files, e.g. product.log.1, product.log.2, ... product.log.200, it's handy to grep 3 or 4 sequential files with a single command, but not more. So this

    grep 'whatever' product.log.[5-7]

    will grep all files named product.log.5, product.log.6 or product.log.7. The wildcard doesn't have to be at the end, so flickerfly's answer can be simplified to

    grep -E 'fatal|error|critical|failure|warning' file[12].log

    Note also that these wildcards can be used in other commands as well, such as `cp`.
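    For instance, the same `[...]` range works with `cp`. The demo files below are invented; note that product.log.7 deliberately does not exist, since globs only expand to names that are actually present:

```shell
# Invented demo files; note there is no product.log.7
mkdir -p /tmp/globdemo/backup
touch /tmp/globdemo/product.log.4 /tmp/globdemo/product.log.5 \
      /tmp/globdemo/product.log.6 /tmp/globdemo/product.log.8

# [5-7] matches the names ending in 5 and 6 that actually exist
cp /tmp/globdemo/product.log.[5-7] /tmp/globdemo/backup/
ls /tmp/globdemo/backup
```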

  • You can also use curly braces if the files are all in the same folder.

    For example:

    grep -E 'fatal|error|critical|failure|warning' /var/log/{messages,secure,syslog,dmesg}

    If you add `-s` to the grep, it suppresses errors about missing files:

    grep -sE 'fatal|error|critical|failure|warning' /var/log/{messages,secure,syslog,dmesg}

    I was just experimenting with this myself for doing commands that work across multiple distros where it's in one file vs the other due to OS differences.

    Mail logs

    sudo grep -is [email protected] /var/log/{maillog,exim_mainlog,exim_rejectlog,mail.log,mail.err,syslog}

    Archived mail logs, using `2>/dev/null` to suppress zgrep's warnings about missing .gz files

    sudo zgrep -is [email protected] /var/log/{maillog*,exim_mainlog*,exim_rejectlog*,mail.log*,mail.err*,syslog*} 2>/dev/null

    Reference: Is there a way to refer to multiple files in a directory without retyping the whole path?

License under CC-BY-SA with attribution

Content dated before 6/26/2020 9:53 AM