Re: grep resistance
- To: tlug@example.com
- Subject: Re: grep resistance
- From: Fredric Fredricson <fredric.fredriksson@example.com>
- Date: Sun, 05 Nov 2000 01:11:27 +0100
- Content-Transfer-Encoding: 7bit
- Content-Type: text/plain; charset=us-ascii
- Organization: MYDATA automation AB
- References: <Pine.LNX.4.10.10011041910280.813-100000@example.com>
- Reply-To: tlug@example.com
- Resent-From: tlug@example.com
- Resent-Message-ID: <K7HZSD.A.EeD.AnKB6@example.com>
- Resent-Sender: tlug-request@example.com
- Sender: fredric@example.com
Tony Laszlo wrote:
>
> It is apparently not possible to grep
> through a directory which has a large
> number of files (perhaps the limit
> is somewhere near 1,000 or so).
> I happen to have a directory which has
> more than 10,000 files and found
> that grep gives the error
> "Argument list too long."

Are you sure it's a grep problem? Sounds like a shell problem to me.
I doubt that there is a limit on the number of files, but there is a
limit on command-line length.

> Here is a workaround which works
> pretty well:
>
> find . -type f -exec grep -n -C STRING '{}' ';' | less
>
> Problem is, it doesn't display the name of the
> file. Can this be adjusted to do so?

Another solution:

find . -type f -exec grep -n -C STRING '{}' /dev/null ';' | less

(Adding /dev/null to the list of files is an old trick to make sure
grep prints the file name even if there is only one file. Useful for
scripts.)

The best solution I can come up with is:

ls -1 | xargs grep ..... /dev/null | less

/Fredric Fredricson

(Btw. Wouldn't fgrep be faster?)
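To see why command-line length is the real constraint: the kernel's
exec() call rejects argument lists larger than ARG_MAX bytes, and the
shell's expanded * is what blows past that, not anything inside grep.
Assuming a system with the POSIX getconf utility, the limit can be
checked directly:

    getconf ARG_MAX    # prints the limit in bytes, e.g. 131072 on older Linux

The value covers the arguments plus the environment, so the practical
number of file names per command is a little lower.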
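A sketch of why the xargs version dodges the limit: xargs collects
file names from its input and runs grep repeatedly, each time with a
batch short enough for exec(). Assuming GNU find and xargs (the
-print0 and -0 options are GNU extensions, not plain POSIX), this
variant also survives file names containing spaces:

    # each grep invocation gets a batch of names that fits under ARG_MAX;
    # /dev/null again forces grep to print file names for every batch
    find . -type f -print0 | xargs -0 grep -n STRING /dev/null | less

The ls -1 version works the same way, except that xargs's default
whitespace splitting will mangle any file name that contains a space.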
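As for fgrep: it is equivalent to grep -F and treats STRING as a fixed
string rather than a regular expression, so it can indeed be faster
when no pattern metacharacters are needed, and it drops into the same
pipeline unchanged:

    # fixed-string search, otherwise identical to the grep variant above
    find . -type f -print0 | xargs -0 fgrep -n STRING /dev/null | less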