Bash/grep: ignore lines over N characters

bash, find, grep

I have a find/grep command that I use to search my projects' codebase, thus:

find folder1 folder2 folderetc -print | awk '{print "\""$0"\""}' | xargs grep -n "search-string"

Something that irritates me regularly is minified JavaScript & CSS files, which have everything packed into a single line (due to a probably misguided belief that removing newline characters will make downloading JavaScript files significantly quicker). For these packed or minified files there is always a non-packed version, which is the one I want to see in my results. I never want to see the packed version, as it just results in screens and screens of unreadable garbage.

There's no systematic way that I can tell grep to avoid minified or packed files, as there's no naming convention. But it occurred to me that I could filter out results where the line is more than, say, 400 characters long. Is there a way I can do this with grep (or something else)?

Best Answer

Yes, there is:

$ echo foobar | awk 'length($0) < 7 && $0~/foo/'
foobar

Replace the echo with your find command, replace foo with the string to search for, and adjust the 7 to suit the cutoff you want (the condition keeps only lines shorter than that value).
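
One way to wire this into the original find-based pipeline is a sketch along these lines, assuming your find and xargs support -print0 and -0 (GNU and BSD versions do): let awk do both the matching and the length filtering, so grep is no longer needed. The folder names, the 400-character cutoff and "search-string" are taken from the question.

find folder1 folder2 folderetc -type f -print0 | xargs -0 awk 'length($0) <= 400 && /search-string/ { print FILENAME ":" FNR ": " $0 }'

Here /search-string/ is an awk regular expression rather than a fixed string, the FILENAME ":" FNR ":" prefix mimics grep -n output, and any line longer than 400 characters is silently skipped.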