This article walks through a Python crawler that scrapes celebrity portrait photos from 美桌网 in about 50 lines of code; hopefully it offers a useful reference for developers who want to follow along.
I haven't posted an image-crawler piece for a few days, so today let's do a quick crawl of 美桌网:
The result of running it: each star gets its own folder, and each photo album gets its own subfolder inside it, which keeps everything nicely organized.
Source code:
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# @Time : 2020/12/15 18:14
# @Author : huni
# @File : 美桌网.py
# @Software: PyCharm
import requests
from lxml import etree
import os
if __name__ == '__main__':
    headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/86.0.4240.183 Safari/537.36'}
    # Star list page: every <li> links to one celebrity's page
    url = 'http://www.win4000.com/mt/star.html'
    resp = requests.get(url=url, headers=headers).text
    tree = etree.HTML(resp)
    li_list = tree.xpath('/html/body/div[4]/div/div[2]/ul/li')
    for li in li_list:
        href = li.xpath('./a/@href')[0]
        resp1 = requests.get(url=href, headers=headers).text
        tree1 = etree.HTML(resp1)
        # Pagination links of this celebrity's page, deduplicated, plus the first page itself
        href1_list = list(set(tree1.xpath('/html/body/div[4]/div/div[3]/div[1]/div[2]/div//@href')))
        href1_list.append(href)
        title = tree1.xpath('/html/head/title/text()')[0]
        # One top-level folder, then one sub-folder per celebrity
        m_path = './明星图片'
        if not os.path.exists(m_path):
            os.mkdir(m_path)
        title_path = m_path + f'/{title}'
        if not os.path.exists(title_path):
            os.mkdir(title_path)
        for href1 in href1_list:
            resp2 = requests.get(url=href1, headers=headers).text
            tree2 = etree.HTML(resp2)
            # Links to the individual photo albums listed on this page
            href2_list = tree2.xpath('/html/body/div[4]/div/div[3]/div[1]/div[1]/div[2]/div/div/ul//@href')
            for href2 in href2_list:
                resp3 = requests.get(url=href2, headers=headers).text
                tree3 = etree.HTML(resp3)
                # Number of pages (one photo per page) in this album
                page_num = int(tree3.xpath('/html/body/div[4]/div/div[2]/div/div[1]/div[1]/em/text()')[0])
                name = tree3.xpath('/html/head/title/text()')[0]
                # One sub-folder per album
                name_path = title_path + f'/{name}'
                if not os.path.exists(name_path):
                    os.mkdir(name_path)
                for i in range(1, page_num + 1):
                    # Page i of the album lives at <album>_i.html
                    every_href = href2.replace('.html', f'_{i}.html')
                    resp4 = requests.get(url=every_href, headers=headers).text
                    tree4 = etree.HTML(resp4)
                    src = tree4.xpath('//*[@id="pic-meinv"]/a/img/@src')[0]
                    jpg_data = requests.get(url=src, headers=headers).content
                    jpg_name = src.split('/')[-1]
                    jpg_path = name_path + f'/{jpg_name}'
                    with open(jpg_path, 'wb') as fp:
                        fp.write(jpg_data)
                    print(jpg_name, '下载完成')
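The only non-obvious step is the pagination inside each album: page i of an album is reached by inserting _i before .html in the album URL. A minimal sketch of that URL construction (the album URL and page count below are made-up placeholders, not values taken from the site):

album_href = 'http://www.win4000.com/example_album.html'  # hypothetical album URL
page_num = 3  # in the real script this comes from the album page's <em> tag
page_urls = [album_href.replace('.html', f'_{i}.html') for i in range(1, page_num + 1)]
print(page_urls)
# ['http://www.win4000.com/example_album_1.html', 'http://www.win4000.com/example_album_2.html', 'http://www.win4000.com/example_album_3.html']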
That's the single-threaded version; for a multi-threaded approach you can refer to my earlier posts, or see the rough sketch below.
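For example, one way to speed things up (a rough sketch under my own assumptions, not the exact code from those earlier posts) is to collect (image URL, save path) pairs while crawling and hand only the downloads to a thread pool; download_one and download_all are hypothetical helpers:

from concurrent.futures import ThreadPoolExecutor

import requests

headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/86.0.4240.183 Safari/537.36'}

def download_one(task):
    # task is one (image_url, local_path) tuple collected while crawling
    src, jpg_path = task
    jpg_data = requests.get(url=src, headers=headers).content
    with open(jpg_path, 'wb') as fp:
        fp.write(jpg_data)
    print(jpg_path, '下载完成')

def download_all(tasks, workers=8):
    # Download up to `workers` images concurrently
    with ThreadPoolExecutor(max_workers=workers) as pool:
        pool.map(download_one, tasks)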
That's it for today's coding; if you found it useful, feel free to show the author some support.
That wraps up this article on scraping celebrity portrait photos from 美桌网 with a 50-line Python crawler; hopefully it proves helpful to fellow programmers!