How to cut a large txt file into small files with a shell script on Linux

  
                

Under Linux, some txt files are quite large, which makes them inconvenient to transfer to mobile devices. Can a large txt file be turned into multiple small files? It can, using a shell script. The following describes how to cut a large txt file into small pieces with a shell script.

Solution:

1. First use the split command to cut the large file into pieces of 1,000,000 lines each.

split parameters:

-b : split by size; a unit such as b, k, or m can be appended;

-l : split by number of lines;

# Split into files of 1000 lines each

split -l 1000 httperr8007.log httperr

httperraa, httperrab, httperrac, ...

# Split into pieces of 100 KB each

split -b 100k httperr8007.log http

httpaa, httpab, httpac, ...
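To check that nothing was lost, the line counts of the pieces can be compared with the original. A quick check, assuming the line-based example above (the ?? pattern matches only the two-letter pieces, not httperr8007.log itself):

# Count lines in the original and in the combined pieces; the totals should match

wc -l < httperr8007.log

cat httperr?? | wc -l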

2. Traverse the 1,000,000-line files and cut each one into 10,000-line small files, placing each chunk's pieces in its own new directory (the part directory and piece prefix names in the script below are just example names).

#!/bin/bash
bigfile="1.txt"
# Cut the big file into 1,000,000-line chunks, prefixed with "text"
split -l 1000000 "$bigfile" text
currdir=1
# Cut each chunk into 10,000-line pieces, one directory per chunk
for smallfile in text*; do
    mkdir -p part$currdir
    split -l 10000 "$smallfile" part$currdir/piece
    currdir=$((currdir + 1))
done
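To run the script, save it (the name split_big.sh is just an example), make it executable, and launch it in the directory that contains 1.txt; afterwards each partN directory holds the 10,000-line pieces of one chunk:

# split_big.sh is an example name for the script above

chmod +x split_big.sh

./split_big.sh

ls part1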
