This article tests how efficiently a large number of files can be deleted under Linux. First, create 500,000 test files.

1. Delete with rm
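The article does not show the setup commands. A minimal sketch, assuming a test directory of `/tmp/del_test` (the path is an assumption, not from the original):

```shell
# Sketch: create 500,000 small test files.
# /tmp/del_test is an assumed path, not from the original article.
mkdir -p /tmp/del_test
cd /tmp/del_test
for i in $(seq 1 500000); do
    echo test > "$i.txt"
done
```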
rm fails outright: with this many files, the expanded argument list is too long for the shell to pass to rm.

2. Delete with find + rm
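The exact invocation is not shown in the text; a sketch of the usual find + rm form, with a couple of sample files created first so the snippet is self-contained (the path is an assumption):

```shell
# Assumed test directory; create a few sample files so the command has work to do.
mkdir -p /tmp/del_test
touch /tmp/del_test/a.txt /tmp/del_test/b.txt

# Spawn one rm process per matched file (this per-file fork/exec is what
# makes the method slow at large scale).
find /tmp/del_test -type f -exec rm {} \;
```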
About 43 minutes on my machine. I started the deletion and went off to watch a video while it ran.

3. Delete with find -delete
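Again the command line is not shown; a sketch using find's built-in `-delete` action, with sample files created first (path assumed):

```shell
# Assumed test directory with a few sample files.
mkdir -p /tmp/del_test
touch /tmp/del_test/a.txt /tmp/del_test/b.txt

# -delete removes each match inside find itself: no extra rm processes,
# which is why it is much faster than -exec rm.
find /tmp/del_test -type f -delete
```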
It takes about 9 minutes.

4. Delete with rsync
Very good and powerful.

5. Delete with Python

```python
import os
import timeit

def main():
    # Walk the test tree and remove every file found.
    for pathname, dirnames, filenames in os.walk('/home/username/test'):
        for filename in filenames:
            file = os.path.join(pathname, filename)
            os.remove(file)

if __name__ == '__main__':
    t = timeit.Timer('main()', 'from __main__ import main')
    print(t.timeit(1))
```

```
$ python test.py
529.309022903
```

It takes about 9 minutes.

6. Delete with Perl
This should be the fastest.

7. Results
Conclusion: rsync is the fastest and most convenient way to delete a large number of small files.

This is the full content of this article. I hope it is helpful for your study, and I hope you will support 123WORDPRESS.COM.