爱美女网 Spider [Preview Edition] [23.06.01] [Windows] - Happy Children's Day

A spider for imn5, the site from the previous post. This is a preview build and does not support search. It does support downloading images in webp format (a file format that has not turned up on any of the other sites).
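For reference, here is a minimal Python sketch of fetching and saving a webp image, with an optional Pillow conversion to JPEG for viewers that cannot open the format. The URL and filenames are placeholders, and the conversion step is just one way to handle webp, not necessarily what this spider does.

import requests
from PIL import Image  # Pillow; only needed for the optional conversion

# Placeholder URL, for illustration only
url = "https://example.com/albums/0001/cover.webp"

resp = requests.get(url, timeout=30)
resp.raise_for_status()

# Save the raw webp bytes exactly as downloaded
with open("cover.webp", "wb") as f:
    f.write(resp.content)

# Optionally convert to JPEG for viewers that cannot open webp
Image.open("cover.webp").convert("RGB").save("cover.jpg", "JPEG")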

"This site carries the web's newest Xiuren-studio photo sets. New releases appear as preview editions and are generally updated to full high-definition editions within two weeks. The HD editions use large 1200px images without other sites' watermarks, and the site runs no ads on either mobile or desktop." (the image site's own promotional copy)

Continue Reading

秀人集 Spider [Updated Edition] [23.05.13] [Windows]

C:\Users\obaby>F:\Pycharm_Projects\meitulu-spider\dist\xiurenji2\xiurenji2.exe
****************************************************************************************************
秀人集 Spider [Updated Edition]
Version: 23.05.13
Current server address: https://www.xiuren5.vip
Blog: http://oba.by
What do you think of big sister's domain above? Anyone who says it's bad doesn't get to use this!! Hmph!!
****************************************************************************************************
USAGE:
spider -h <help> -a <all> -q <search>
Arguments:
         -a <download all site images>
         -q <query the image with keywords>
         -h <display help text, just this>
Option Arguments:
         -p <image download path>
         -r <random index category list>
         -c <single category url>
         -e <early stop, work in site crawl mode only>
         -s <site url eg: http://www.xiurenji.vip (no trailing slash "/")>
****************************************************************************************************
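The flag list above follows the classic getopt layout. Purely as an assumption about how such a command line could be parsed in Python (this is not the spider's actual source), a short sketch:

import getopt
import sys

USAGE = "spider -h <help> -a <all> -q <search> [-p path] [-r] [-c url] [-e] [-s site]"

def parse_args(argv):
    # Options mirroring the usage text above; q/p/c/s take a value
    opts, _ = getopt.getopt(argv, "haq:p:rc:es:")
    config = {
        "download_all": False,   # -a
        "query": None,           # -q <keywords>
        "path": None,            # -p <image download path>
        "random_index": False,   # -r
        "category_url": None,    # -c <single category url>
        "early_stop": False,     # -e
        "site_url": None,        # -s <site url, no trailing slash>
    }
    for opt, arg in opts:
        if opt == "-h":
            print(USAGE)
            sys.exit(0)
        elif opt == "-a":
            config["download_all"] = True
        elif opt == "-q":
            config["query"] = arg
        elif opt == "-p":
            config["path"] = arg
        elif opt == "-r":
            config["random_index"] = True
        elif opt == "-c":
            config["category_url"] = arg
        elif opt == "-e":
            config["early_stop"] = True
        elif opt == "-s":
            config["site_url"] = arg
    return config

if __name__ == "__main__":
    print(parse_args(sys.argv[1:]))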

Continue Reading

爱看美女网 Spider [Windows] [23.03.02]

C:\Users\obaby>F:\Pycharm_Projects\sexy_girl_spider\dist\ikmn\ikmn.exe
****************************************************************************************************
USAGE:
spider -h <help> -a <all> -q <search> -e <early stop>
Arguments:
         -a <download all site images>
         -q <query the image with keywords>
         -h <display help text, just this>
Option Arguments:
         -p <image download path>
         -r <random index category list>
         -c <single category url>
         -e <early stop, work in site crawl mode only>
         -s <site url eg: https://www.ikmn.vip (no trailing slash "/")>
****************************************************************************************************
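One plausible reading of "-e <early stop, work in site crawl mode only>" is an incremental update: walk the index pages and quit once a page contributes nothing new. The self-contained sketch below runs on toy data and is an assumption about the behaviour, not the program's real logic.

def crawl_site(index_pages, early_stop=False):
    # index_pages: iterable of album-URL lists, newest page first
    seen = set()
    for albums in index_pages:
        new_on_page = 0
        for album_url in albums:
            if album_url in seen:
                continue
            # the real spider would download the album here
            seen.add(album_url)
            new_on_page += 1
        if early_stop and new_on_page == 0:
            # Nothing new on this page, so older pages are assumed to be
            # downloaded already; stop the whole-site crawl early.
            break
    return seen

# Toy data: the second page repeats an album from the first
pages = [["/album/1", "/album/2"], ["/album/2", "/album/3"]]
print(crawl_site(pages, early_stop=True))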

Continue Reading

精品美女吧 Spider [Windows] [22.12.23]

精品美女吧 Spider
Version: 22.12.23
Blog: http://www.h4ck.org.cn
****************************************************************************************************
USAGE:
spider -h <help> -a <all> -q <search> -e <early stop>
Arguments:
         -a <download all site images>
         -q <query the image with keywords>
         -h <display help text, just this>
****************************************************************************************************
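The -q mode queries albums by keyword before downloading. Below is a tiny, hedged sketch of that filtering step on made-up index data; the real program's matching rules are not shown here.

def filter_albums(albums, keywords):
    # Keep albums whose title contains every keyword, case-insensitively
    return [
        (title, url)
        for title, url in albums
        if all(k.lower() in title.lower() for k in keywords)
    ]

# Made-up data standing in for a parsed index page
albums = [("Studio A set 001", "/album/1"), ("Studio B set 002", "/album/2")]
print(filter_albums(albums, ["studio a"]))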

Continue Reading

微图坊 Spider [22.06.07] [Windows]

Change Log:

1. Install the newest Chrome before using this program.
2. Open Chrome and log in to v2ph.com.
3. The spider stops automatically after crawling 16 albums (see the sketch below the usage text).

Usage:

(venv) PS F:\Pycharm_Projects\meitulu-spider> python .\v2ph.py
Arguments:
         -a <download all site images>
         -q <query the image with keywords>
         -h <display help text, just this>
Option Arguments:
         -p <image download path>
         -r <random index category list>
         -c <single category url>
         -e <early stop, work in site crawl mode only>
         -s <site url eg: https://www.v2ph.com (no trailing slash "/")>
****************************************************************************************************
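Per the change log, the program leans on a locally installed Chrome that is already logged in to v2ph.com and stops itself after 16 albums. Below is a hedged Selenium sketch of that pattern; the profile path, the placeholder album list, and the parsing step are assumptions, not the actual implementation.

from selenium import webdriver
from selenium.webdriver.chrome.options import Options

MAX_ALBUMS = 16  # the spider stops itself after this many albums

options = Options()
# Reuse the normal Chrome profile so the existing v2ph.com login is picked up.
# The path is an assumption; point it at your own user-data directory.
options.add_argument(r"--user-data-dir=C:\Users\obaby\AppData\Local\Google\Chrome\User Data")

driver = webdriver.Chrome(options=options)
try:
    album_urls = ["https://www.v2ph.com/album/example"]  # placeholder list
    for count, url in enumerate(album_urls, start=1):
        driver.get(url)
        # ... parse driver.page_source and download the images here ...
        if count >= MAX_ALBUMS:
            break  # auto stop after 16 albums, as the change log says
finally:
    driver.quit()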

Continue Reading

KU138 Spider [22.05.23] [Windows]

****************************************************************************************************
USAGE:
spider -h <help> -a <all> -q <search>
Arguments:
         -a <download all site images>
         -q <query the image with keywords>
         -h <display help text, just this>
Option Arguments:
         -p <image download path>
         -r <random index category list>
         -c <single category url>
         -e <early stop, work in site crawl mode only>
         -s <site url eg: https://www.v2ph.com (no trailing slash "/")>
****************************************************************************************************
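The note on -s about dropping the trailing slash suggests the value is concatenated straight into page URLs. As an assumption about why that restriction exists, a small defensive normalization sketch (the path pattern is made up):

def build_page_url(site_url, page):
    # Strip a trailing slash so concatenation never produces "//"
    site_url = site_url.rstrip("/")
    return f"{site_url}/index_{page}.html"  # made-up path pattern

print(build_page_url("https://www.example.com/", 2))
# -> https://www.example.com/index_2.html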

Continue Reading