This article walks through crawling tourist attraction information from Qunar (去哪儿网) with Python. Hopefully it provides some useful reference for developers solving similar problems; let's work through it together.
Analysis of the crawl process:
1. Target URL: 'https://piao.qunar.com/ticket/list.htm?keyword=北京&page=1'
2. The HTTP request method is GET
3. BeautifulSoup is used to extract the required fields
4. The crawled information is saved to a local file
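Before running the full spider, it can help to confirm that the listing page is reachable and that the result container exists. The following is a minimal sketch based on the URL pattern and the result_list class used in the code below; the User-Agent header is an assumption on my part, since the original code sends a bare GET:

import requests
from bs4 import BeautifulSoup

# One-off check: fetch page 1 for Beijing and look for the result container.
# The User-Agent header is an assumption; the spider below does not set one.
url = 'https://piao.qunar.com/ticket/list.htm?keyword=北京&page=1'
resp = requests.get(url, headers={'User-Agent': 'Mozilla/5.0'}, timeout=10)
resp.encoding = 'utf-8'
soup = BeautifulSoup(resp.text, 'html.parser')
print(resp.status_code, soup.find('div', {'class': 'result_list'}) is not None)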
The full code is as follows:
import os

import requests
from bs4 import BeautifulSoup


class QuNaEr:
    def __init__(self, keyword, page=1):
        self.keyword = keyword
        self.page = page

    def qne_spider(self):
        url = 'https://piao.qunar.com/ticket/list.htm?keyword=%s&page=%s' % (self.keyword, self.page)
        response = requests.get(url)
        response.encoding = 'utf-8'
        text = response.text
        bs_obj = BeautifulSoup(text, 'html.parser')
        arr = bs_obj.find('div', {'class': 'result_list'}).contents
        with open('./qunaer/tour.csv', 'a') as f:
            for i in arr:
                # Skip bare text nodes between the attraction tags
                if getattr(i, 'attrs', None) is None:
                    continue
                info = i.attrs
                # Attraction name
                name = info.get('data-sight-name')
                print(name)
                # Address
                address = info.get('data-address')
                # Recent ticket sales
                count = info.get('data-sale-count')
                # Longitude and latitude
                point = info.get('data-point')
                # Starting price
                try:
                    price = i.find('span', {'class': 'sight_item_price'})
                    price = price.find_all('em')
                    price = price[0].text
                    f.write('{},{},{},{},{}\n'.format(name, address, count, price, point))
                except Exception as e:
                    print(e)


if __name__ == '__main__':
    citys = ['北京', '上海', '成都', '三亚', '广州', '重庆', '深圳', '西安', '杭州', '厦门', '武汉', '大连', '苏州']
    # Make sure the output directory exists before writing the CSV
    os.makedirs('./qunaer', exist_ok=True)
    with open('./qunaer/tour.csv', 'a') as f:
        # Header row: attraction name, address, ticket sales, starting price, lat/lng
        f.write('{},{},{},{},{}\n'.format('景区名称', '地址', '售票数', '起始价格', '经纬度'))
    for i in citys:
        for page in range(1, 10):
            qne = QuNaEr(i, page=page)
            qne.qne_spider()
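To spot-check the results, the CSV can be read back with the standard csv module. This is a minimal sketch; it assumes the spider above has already produced ./qunaer/tour.csv and that the file was written as UTF-8 (the interpreter's default on most Linux/macOS setups). Note that addresses containing commas will not split cleanly, since the spider joins fields with bare commas:

import csv

# Load the rows written by the spider above and print them as dicts keyed by the header row.
with open('./qunaer/tour.csv', newline='', encoding='utf-8') as f:
    reader = csv.reader(f)
    header = next(reader)
    for row in reader:
        # Rows whose address contains a comma will have extra fields; skip them here.
        if len(row) != len(header):
            continue
        print(dict(zip(header, row)))

Writing the rows with csv.writer inside the spider, instead of manual format/write, would avoid the comma problem entirely.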
That concludes this article on crawling tourist attraction information from Qunar with Python; hopefully it is helpful to fellow developers!