Solving the problem of a pipe being closed early by its receiver on Linux

  

The pipe in the Linux shell is a very convenient feature: the output of one program becomes the input of another, so several commands can be strung together without cumbersome temporary files. Windows has a similar usage, such as dir | more, which anyone who has used DOS commands will recognize.
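For example, paging through a long directory listing works the same way on Linux (ls and less here are just one common pairing):

$ ls -l /etc | less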


Since it is a pipe, it has an entrance and an exit, each corresponding to an application. Normally, everything the entrance application writes is accepted by the exit application, but in some special cases the exit application closes the pipe early. For example, when you only want the most recent entries of an svn log:


$ svn log | head

----------------

r137 | fwolf | 2007-05-28 13:38:47 +0800 (Mon, 28 May 2007) | 4 lines


Update the record.


svn: Write error: Broken pipe


head only needs the first 10 lines of its input (the default; the count can also be set by the user), so receiving the rest of the output would be pointless and it closes the pipe early. When svn, the application at the pipe's entrance, notices this, it exits with an error. In this example the error message is quite clear, but not every application behaves like that:


$ find . -name "*rc" | Xargs -i cat {}| Head -1

[Desktop]

xargs: cat: terminated by signal 13


The error message may not look very clear, but what it means is: xargs found that its child process cat was terminated by signal 13. Since xargs itself runs a loop, it stops the loop as soon as it sees this error; that is one part. The other part is signal 13 itself, which is generated when cat tries to write data to a pipe that has already been closed, and cat stops once it receives it. The effect on cat is much like the user pressing Ctrl+C while cat is writing its output.
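Signal 13 is SIGPIPE; kill -l confirms which name a number maps to, and a small pipeline of yes and head (no svn or xargs needed, assuming a bash shell so that the PIPESTATUS array is available) reproduces the whole situation in miniature:

$ kill -l 13
PIPE

$ yes | head -3; echo ${PIPESTATUS[@]}
y
y
y
141 0

bash reports a process killed by signal N as exit status 128 + N, so the 141 shows that yes was killed by SIGPIPE when it wrote into the pipe that head had already closed, while head itself exited normally with status 0.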


How can this problem be avoided? Very simply: arrange things so that the program after the pipe does not close the pipe early, especially when xargs is involved, since xargs stops as soon as it finds an error. For example, head can be used like this:


$ cat file | head -1


cat is still killed by signal 13, but bash does not report an error. The drawback is that this only works on one file at a time; even with a wildcard, only the first file is effectively examined, because head stops after the first line of the concatenated input. To go through several files, add a for loop:


$ for file in .*rc; do cat "$file" | head -1; done


cat still gets killed, but the for loop does not care and simply continues. Since head can also take file names directly, we can drop cat altogether:


$find . -name "*rc" | Xargs -i head -n1 {}
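A side note, assuming GNU find and xargs: if the file names might contain spaces or other characters special to the shell, the usual precaution is to pair find's -print0 with xargs -0, which also hands many names to a single head invocation:

$ find . -name "*rc" -print0 | xargs -0 head -n1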


Personally, I think this find + xargs combination is the most elegant solution: you keep find's powerful search expressions, and no program is left writing into a closed pipe. If there is nothing special about the file names, though, there is an even simpler way:


$ head -n1 .*rc


Here the wildcard is used directly in head's arguments to specify the files.
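One nice detail: when head is given more than one file, it prints a ==> filename <== header before each file's output, so you can still tell which first line belongs to which file; for example (the two file names below are just placeholders):

$ head -n1 .bashrc .vimrc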
