Author | 旧时晚风拂晓城
Editor | JackTian  Source | 杰哥的IT之旅 (ID: Jake_Internet). Please contact for reprint authorization (WeChat ID: Hc220066).
Reply "成都二手房数据" in the official account backend to get the complete dataset for this article.
This article first reviews the basic concepts of concurrency vs. parallelism, blocking vs. non-blocking, synchronous vs. asynchronous, multithreading, multiprocessing, and coroutines. It then implements an asynchronous crawler with asyncio + aiohttp to scrape Lianjia's second-hand housing listings in Chengdu, and runs a simple benchmark of its crawling efficiency against a multithreaded version.
Concurrency vs. Parallelism
Blocking vs. Non-blocking
Synchronous vs. Asynchronous
Multithreading
Multithreading is a technique, implemented in software or hardware, for executing multiple threads concurrently. A computer with hardware support for multithreading can execute more than one thread at the same time, improving overall throughput. Systems with this capability include symmetric multiprocessors, multi-core processors, and chip-level multiprocessing or simultaneous multithreading (SMT) processors. The independently running program fragments inside a program are called "threads", and programming with them is called "multithreaded programming".
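As a minimal illustration (a toy task, not from the crawler below), the following sketch runs four simulated I/O waits in separate threads; the waits overlap instead of running one after another:

```python
import threading
import time

results = []

def fetch(task_id):
    # Simulate an I/O wait, e.g. a network request
    time.sleep(0.1)
    results.append(task_id)  # list.append is atomic under the GIL

threads = [threading.Thread(target=fetch, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()  # wait for all threads to finish

print(sorted(results))  # → [0, 1, 2, 3]
```

Because the threads spend their time sleeping (waiting), the whole run takes roughly 0.1 s rather than 0.4 s.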
Multiprocessing
Every program running on a system is a process, and each process contains one or more threads. A process can also be the dynamic execution of a whole program or a part of one. A thread is a set of instructions, or a special segment of a program, that can execute independently within the program; you can also think of it as the context in which code runs. A thread is therefore essentially a lightweight process responsible for multitasking within a single program. Multiprocessing (multiprocessing) exploits the CPU's multiple cores to run several tasks in parallel at the same time, which can greatly improve execution efficiency.
Coroutines
A coroutine, also called a micro-thread or fiber, is a lightweight user-space thread.
A coroutine owns its own register context and stack. When the scheduler switches away from a coroutine, its register context and stack are saved elsewhere; when it switches back, they are restored. A coroutine thus preserves the state of its last invocation, i.e. a particular combination of all its local state, so every re-entry resumes exactly where the previous call left off.
Coroutines are essentially single-process. Compared with multiprocessing, they avoid the overhead of thread context switches and of locking and synchronizing atomic operations, and the programming model is very simple.
We can use coroutines to implement asynchronous operations. In a web-crawling scenario, for example, after sending a request we must wait some time for the response; during that wait the program can do plenty of other things and switch back only once the response has arrived. This keeps the CPU and other resources fully utilized, which is the advantage of coroutines.
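A minimal asyncio sketch of that idea (toy coroutines, not the article's crawler): three 0.2-second "network waits" overlap on one event loop instead of running back to back:

```python
import asyncio
import time

async def fetch(task_id):
    # await hands control back to the event loop during the "network wait"
    await asyncio.sleep(0.2)
    return task_id

async def main():
    # gather schedules all three coroutines concurrently
    return await asyncio.gather(*(fetch(i) for i in range(3)))

start = time.perf_counter()
results = asyncio.run(main())  # asyncio.run requires Python 3.7+
elapsed = time.perf_counter() - start
print(results)  # → [0, 1, 2], after roughly 0.2 s rather than 0.6 s
```

The crawler below uses the older `get_event_loop()` / `run_until_complete()` style, which does the same job on earlier Python versions.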
Basic crawling approach:
View the page source to locate the data we want to extract.
Inspect and analyze the page:
Each listing's information on a page sits under an li tag.
Page 1: https://cd.lianjia.com/ershoufang/
Page 2: https://cd.lianjia.com/ershoufang/pg2/
Page 3: https://cd.lianjia.com/ershoufang/pg3/
Page 100: https://cd.lianjia.com/ershoufang/pg100/
The pagination pattern is easy to spot, so we can construct the list of request URLs from it.
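The pattern above can be turned into a URL list with a short sketch (following the threaded version's special case, where page 1 has no pg suffix):

```python
# Build the request URL list from the observed pagination pattern
base = 'https://cd.lianjia.com/ershoufang/'
urls = [base] + [f'{base}pg{page}/' for page in range(2, 101)]

print(len(urls))  # → 100
print(urls[1])    # → https://cd.lianjia.com/ershoufang/pg2/
```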
The asynchronous crawler code is as follows:
import asyncio
import aiohttp
from lxml import etree
import logging
import datetime
import openpyxl

wb = openpyxl.Workbook()
sheet = wb.active
sheet.append(['房源', '房子信息', '所在区域', '单价', '关注人数和发布时间', '标签'])
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s: %(message)s')
start = datetime.datetime.now()


class Spider(object):
    def __init__(self):
        self.semaphore = asyncio.Semaphore(6)  # semaphore caps the number of concurrent coroutines, so we don't crawl fast enough to trigger anti-crawling
        self.header = {
            "Host": "cd.lianjia.com",
            "Referer": "https://cd.lianjia.com/ershoufang/",
            "Cookie": "lianjia_uuid=db0b1b8b-01df-4ed1-b623-b03a9eb26794; _smt_uid=5f2eabe8.5e338ce0; UM_distinctid=173ce4f874a51-0191f33cd88e85-b7a1334-144000-173ce4f874bd6; _jzqy=1.1596894185.1596894185.1.jzqsr=baidu.-; _ga=GA1.2.7916096.1596894188; gr_user_id=6aa4d13e-c334-4a71-a611-d227d96e064a; Hm_lvt_678d9c31c57be1c528ad7f62e5123d56=1596894464; _jzqx=1.1596897192.1596897192.1.jzqsr=cd%2Elianjia%2Ecom|jzqct=/ershoufang/pg2/.-; select_city=510100; lianjia_ssid=c9a3d829-9d20-424d-ac4f-edf23ae82029; Hm_lvt_9152f8221cb6243a53c83b956842be8a=1596894222,1597055584; gr_session_id_a1a50f141657a94e=33e39c24-2a1c-4931-bea2-90c3cc70389f; CNZZDATA1253492306=874845162-1596890927-https%253A%252F%252Fwww.baidu.com%252F%7C1597054876; CNZZDATA1254525948=1417014870-1596893762-https%253A%252F%252Fwww.baidu.com%252F%7C1597050413; CNZZDATA1255633284=1403675093-1596890300-https%253A%252F%252Fwww.baidu.com%252F%7C1597052407; CNZZDATA1255604082=1057147188-1596890168-https%253A%252F%252Fwww.baidu.com%252F%7C1597052309; _qzjc=1; gr_session_id_a1a50f141657a94e_33e39c24-2a1c-4931-bea2-90c3cc70389f=true; _jzqa=1.3828792903266388500.1596894185.1596897192.1597055585.3; _jzqc=1; _jzqckmp=1; sensorsdata2015jssdkcross=%7B%22distinct_id%22%3A%22173ce4f8b4f317-079892aca8aaa8-b7a1334-1327104-173ce4f8b50247%22%2C%22%24device_id%22%3A%22173ce4f8b4f317-079892aca8aaa8-b7a1334-1327104-173ce4f8b50247%22%2C%22props%22%3A%7B%22%24latest_traffic_source_type%22%3A%22%E7%9B%B4%E6%8E%A5%E6%B5%81%E9%87%8F%22%2C%22%24latest_referrer%22%3A%22%22%2C%22%24latest_referrer_host%22%3A%22%22%2C%22%24latest_search_keyword%22%3A%22%E6%9C%AA%E5%8F%96%E5%88%B0%E5%80%BC_%E7%9B%B4%E6%8E%A5%E6%89%93%E5%BC%80%22%7D%7D; _gid=GA1.2.865060575.1597055587; Hm_lpvt_9152f8221cb6243a53c83b956842be8a=1597055649; srcid=eyJ0Ijoie1wiZGF0YVwiOlwiOWQ4ODYyNmZhMmExM2Q0ZmUxMjk1NWE2YTRjY2JmODZiZmFjYTc2N2U1ZTc2YzM2ZDVkNmM2OGJlOWY5ZDZhOWNkN2U3YjlhZWZmZTllNGE3ZTUwYjA3NGYwNDEzMThkODg4NTBlMWZhZmRjNTIwNDBlMDQ2Mjk2NTYxOWQ1Y2VlZjE5N2FhZjUyMTZkOTcyZjg4YzNiM2U1MThmNjc5NmQ4MGUxMmU2YTM4MmI3ZmU0NmFhNTJmYmMyYWU1ZWI3MjU5YWExYTQ1YWFkZDUyZWVjMzM2NTFjYTA2M2NlM2ExMzZhNjEwYjFjYzQ0OTY5MTQwOTA4ZjQ0MjQ3N2ExMDkxNTVjODFhN2MzMzg5YWM3MzBmMTQxMjU4NzAwYzk5ODE3MTk1ZTNiMjc4NWEzN2M3MTIwMjdkYWUyODczZWJcIixcImtleV9pZFwiOlwiMVwiLFwic2lnblwiOlwiYmExZDJhNWZcIn0iLCJyIjoiaHR0cHM6Ly9jZC5saWFuamlhLmNvbS9lcnNob3VmYW5nLyIsIm9zIjoid2ViIiwidiI6IjAuMSJ9; _qzja=1.726562344.1596894309336.1596897192124.1597055583833.1597055601626.1597055649949.0.0.0.12.3; _qzjb=1.1597055583833.3.0.0.0; _qzjto=3.1.0; _jzqb=1.3.10.1597055585.1; _gat=1; _gat_past=1; _gat_global=1; _gat_new_global=1; _gat_dianpu_agent=1",
            "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/84.0.4147.89 Safari/537.36"
        }

    async def scrape(self, url):
        async with self.semaphore:
            # one session per request keeps the original structure simple;
            # reusing a single shared session would be more efficient
            async with aiohttp.ClientSession(headers=self.header) as session:
                response = await session.get(url)
                return await response.text()

    async def scrape_index(self, page):
        url = f'https://cd.lianjia.com/ershoufang/pg{page}/'
        text = await self.scrape(url)
        await self.parse(text)

    async def parse(self, text):
        html = etree.HTML(text)
        lis = html.xpath('//*[@id="content"]/div[1]/ul/li')
        for li in lis:
            house_data = li.xpath('.//div[@class="title"]/a/text()')[0]  # listing title
            house_info = li.xpath('.//div[@class="houseInfo"]/text()')[0]  # house details
            address = ' '.join(li.xpath('.//div[@class="positionInfo"]/a/text()'))  # location
            price = li.xpath('.//div[@class="priceInfo"]/div[2]/span/text()')[0]  # unit price, yuan/m²
            attention_num = li.xpath('.//div[@class="followInfo"]/text()')[0]  # followers and listing date
            tag = ' '.join(li.xpath('.//div[@class="tag"]/span/text()'))  # tags
            sheet.append([house_data, house_info, address, price, attention_num, tag])
            logging.info([house_data, house_info, address, price, attention_num, tag])

    def main(self):
        # 100 pages of data
        scrape_index_tasks = [asyncio.ensure_future(self.scrape_index(page)) for page in range(1, 101)]
        loop = asyncio.get_event_loop()
        tasks = asyncio.gather(*scrape_index_tasks)
        loop.run_until_complete(tasks)


if __name__ == '__main__':
    spider = Spider()
    spider.main()
    wb.save('house.xlsx')
    delta = (datetime.datetime.now() - start).total_seconds()
    print("用时:{:.3f}s".format(delta))
The result is as follows:
All 100 pages were crawled successfully, 3,000 listings in total, in 15.976 s.
The multithreaded crawler is as follows:
import requests
from lxml import etree
import openpyxl
from concurrent.futures import ThreadPoolExecutor
import datetime
import logging

headers = {
    "Host": "cd.lianjia.com",
    "Referer": "https://cd.lianjia.com/ershoufang/",
    "Cookie": "lianjia_uuid=db0b1b8b-01df-4ed1-b623-b03a9eb26794; _smt_uid=5f2eabe8.5e338ce0; UM_distinctid=173ce4f874a51-0191f33cd88e85-b7a1334-144000-173ce4f874bd6; _jzqy=1.1596894185.1596894185.1.jzqsr=baidu.-; _ga=GA1.2.7916096.1596894188; gr_user_id=6aa4d13e-c334-4a71-a611-d227d96e064a; Hm_lvt_678d9c31c57be1c528ad7f62e5123d56=1596894464; _jzqx=1.1596897192.1596897192.1.jzqsr=cd%2Elianjia%2Ecom|jzqct=/ershoufang/pg2/.-; select_city=510100; lianjia_ssid=c9a3d829-9d20-424d-ac4f-edf23ae82029; Hm_lvt_9152f8221cb6243a53c83b956842be8a=1596894222,1597055584; gr_session_id_a1a50f141657a94e=33e39c24-2a1c-4931-bea2-90c3cc70389f; CNZZDATA1253492306=874845162-1596890927-https%253A%252F%252Fwww.baidu.com%252F%7C1597054876; CNZZDATA1254525948=1417014870-1596893762-https%253A%252F%252Fwww.baidu.com%252F%7C1597050413; CNZZDATA1255633284=1403675093-1596890300-https%253A%252F%252Fwww.baidu.com%252F%7C1597052407; CNZZDATA1255604082=1057147188-1596890168-https%253A%252F%252Fwww.baidu.com%252F%7C1597052309; _qzjc=1; gr_session_id_a1a50f141657a94e_33e39c24-2a1c-4931-bea2-90c3cc70389f=true; _jzqa=1.3828792903266388500.1596894185.1596897192.1597055585.3; _jzqc=1; _jzqckmp=1; sensorsdata2015jssdkcross=%7B%22distinct_id%22%3A%22173ce4f8b4f317-079892aca8aaa8-b7a1334-1327104-173ce4f8b50247%22%2C%22%24device_id%22%3A%22173ce4f8b4f317-079892aca8aaa8-b7a1334-1327104-173ce4f8b50247%22%2C%22props%22%3A%7B%22%24latest_traffic_source_type%22%3A%22%E7%9B%B4%E6%8E%A5%E6%B5%81%E9%87%8F%22%2C%22%24latest_referrer%22%3A%22%22%2C%22%24latest_referrer_host%22%3A%22%22%2C%22%24latest_search_keyword%22%3A%22%E6%9C%AA%E5%8F%96%E5%88%B0%E5%80%BC_%E7%9B%B4%E6%8E%A5%E6%89%93%E5%BC%80%22%7D%7D; _gid=GA1.2.865060575.1597055587; Hm_lpvt_9152f8221cb6243a53c83b956842be8a=1597055649; srcid=eyJ0Ijoie1wiZGF0YVwiOlwiOWQ4ODYyNmZhMmExM2Q0ZmUxMjk1NWE2YTRjY2JmODZiZmFjYTc2N2U1ZTc2YzM2ZDVkNmM2OGJlOWY5ZDZhOWNkN2U3YjlhZWZmZTllNGE3ZTUwYjA3NGYwNDEzMThkODg4NTBlMWZhZmRjNTIwNDBlMDQ2Mjk2NTYxOWQ1Y2VlZjE5N2FhZjUyMTZkOTcyZjg4YzNiM2U1MThmNjc5NmQ4MGUxMmU2YTM4MmI3ZmU0NmFhNTJmYmMyYWU1ZWI3MjU5YWExYTQ1YWFkZDUyZWVjMzM2NTFjYTA2M2NlM2ExMzZhNjEwYjFjYzQ0OTY5MTQwOTA4ZjQ0MjQ3N2ExMDkxNTVjODFhN2MzMzg5YWM3MzBmMTQxMjU4NzAwYzk5ODE3MTk1ZTNiMjc4NWEzN2M3MTIwMjdkYWUyODczZWJcIixcImtleV9pZFwiOlwiMVwiLFwic2lnblwiOlwiYmExZDJhNWZcIn0iLCJyIjoiaHR0cHM6Ly9jZC5saWFuamlhLmNvbS9lcnNob3VmYW5nLyIsIm9zIjoid2ViIiwidiI6IjAuMSJ9; _qzja=1.726562344.1596894309336.1596897192124.1597055583833.1597055601626.1597055649949.0.0.0.12.3; _qzjb=1.1597055583833.3.0.0.0; _qzjto=3.1.0; _jzqb=1.3.10.1597055585.1; _gat=1; _gat_past=1; _gat_global=1; _gat_new_global=1; _gat_dianpu_agent=1",
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/84.0.4147.89 Safari/537.36"
}
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s: %(message)s')
wb = openpyxl.Workbook()
sheet = wb.active
sheet.append(['房源', '房子信息', '所在区域', '单价', '关注人数和发布时间', '标签'])
start = datetime.datetime.now()


def get_house(page):
    if page == 1:
        url = "https://cd.lianjia.com/ershoufang/"
    else:
        url = f"https://cd.lianjia.com/ershoufang/pg{page}/"
    res = requests.get(url, headers=headers)
    html = etree.HTML(res.text)
    lis = html.xpath('//*[@id="content"]/div[1]/ul/li')
    for li in lis:
        house_data = li.xpath('.//div[@class="title"]/a/text()')[0]  # listing title
        house_info = li.xpath('.//div[@class="houseInfo"]/text()')[0]  # house details
        address = ' '.join(li.xpath('.//div[@class="positionInfo"]/a/text()'))  # location
        price = li.xpath('.//div[@class="priceInfo"]/div[2]/span/text()')[0]  # unit price, yuan/m²
        attention_num = li.xpath('.//div[@class="followInfo"]/text()')[0]  # followers and listing date
        tag = ' '.join(li.xpath('.//div[@class="tag"]/span/text()'))  # tags
        sheet.append([house_data, house_info, address, price, attention_num, tag])
        logging.info([house_data, house_info, address, price, attention_num, tag])


if __name__ == '__main__':
    with ThreadPoolExecutor(max_workers=6) as executor:
        executor.map(get_house, [page for page in range(1, 101)])
    wb.save('house.xlsx')
    delta = (datetime.datetime.now() - start).total_seconds()
    print("用时:{:.3f}s".format(delta))
The result is as follows:
All 100 pages were crawled successfully, 3,000 listings in total, in 16.796 s.
Land-market data is usually published by the local public resource trading center, but often only the current week's or month's figures are shown, so we have to turn to a dedicated land website for transaction data, such as Tudinet: https://www.tudinet.com/market-0-0-0-0/
The site structure is simple: construct the paginated URLs and extract the data with XPath.
The crawler code is as follows:
import requests
from lxml import etree
import random
import time
import logging
import openpyxl
from datetime import datetime

wb = openpyxl.Workbook()
sheet = wb.active
sheet.append(['土地位置', '出让形式', '推出时间', '土地面积', '规划建筑面积', '土地地址', '成交状态', '土地代号', '规划用途'])
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s: %(message)s')
user_agent = [
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/22.0.1207.1 Safari/537.1",
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.6 (KHTML, like Gecko) Chrome/20.0.1092.0 Safari/536.6",
    "Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.6 (KHTML, like Gecko) Chrome/20.0.1090.0 Safari/536.6",
    "Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/19.77.34.5 Safari/537.1",
    "Mozilla/5.0 (Windows NT 6.0) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.36 Safari/536.5",
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1063.0 Safari/536.3",
    "Mozilla/5.0 (Windows NT 5.1) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1063.0 Safari/536.3",
    "Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1062.0 Safari/536.3",
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1062.0 Safari/536.3",
    "Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1061.1 Safari/536.3",
    "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1061.1 Safari/536.3",
    "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1061.1 Safari/536.3",
    "Mozilla/5.0 (Windows NT 6.2) AppleWebKit/536.3 (KHTML, like Gecko) Chrome/19.0.1061.0 Safari/536.3",
    "Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/535.24 (KHTML, like Gecko) Chrome/19.0.1055.1 Safari/535.24"
]
start = datetime.now()


def get_info(page):
    headers = {
        "User-Agent": random.choice(user_agent)  # rotate User-Agents
    }
    url = f'https://www.tudinet.com/market-254-0-0-0/list-pg{page}.html'
    resp = requests.get(url, headers=headers).text
    time.sleep(1)
    html = etree.HTML(resp)
    lis = html.xpath('//div[@class="land-l-cont"]/dl')
    # print(len(lis))  # 35 records per page
    for li in lis:
        try:
            location = li.xpath('.//dd/p[7]/text()')[0]  # land location
            transfer_form = li.xpath('.//dt/i/text()')[0]  # transfer form
            launch_time = li.xpath('.//dd/p[1]/text()')[0]  # launch date
            land_area = li.xpath('.//dd/p[3]/text()')[0]  # land area
            planning_area = li.xpath('.//dd/p[5]/text()')[0]  # planned construction area
            address = li.xpath('.//dd/p[4]/text()')[0]  # land address
            state = li.xpath('.//dd/p[2]/text()')[0]  # transaction status
            area_code = li.xpath('.//dt/span/text()')[0]  # parcel code
            planned_use = li.xpath('.//dd/p[6]/text()')[0]  # planned use
            data = [location, transfer_form, launch_time, land_area, planning_area, address, state, area_code, planned_use]
            sheet.append(data)
            logging.info(data)
        except Exception as e:
            logging.info(e.args[0])
            continue


def main():
    for i in range(1, 101):
        get_info(i)
        logging.info(f'抓取第{i}页数据完毕')
        # sleep to slow the crawl and ease the load on the server
        time.sleep(random.uniform(1, 2))
    wb.save(filename="real_estate_info.xlsx")


if __name__ == '__main__':
    main()
    delta = (datetime.now() - start).total_seconds()
    print(f'数据抓取完毕,用时:{delta}')
Running the crawler extracts data on 3,158 land parcels in the Chengdu area; the result is as follows:
The data is fairly clean and complete, so it can be used directly for analysis.
import pandas as pd
from pyecharts import options as opts
from pyecharts.charts import Pie
from pyecharts.globals import CurrentConfig, ThemeType

# use local pyecharts assets
CurrentConfig.ONLINE_HOST = 'D:/python/pyecharts-assets-master/assets/'
# read the data
df = pd.read_excel('real_estate_info.xlsx').loc[:, ['出让形式', '成交状态']]
# count the categories
df1 = df['出让形式'].value_counts()
df2 = df['成交状态'].value_counts()
# build the data pairs
data_pair_1 = [(i, int(j)) for i, j in zip(df1.index, df1.values)]
data_pair_2 = [(i, int(j)) for i, j in zip(df2.index, df2.values)]
# draw the pie charts
c = (
    Pie(init_opts=opts.InitOpts(theme=ThemeType.DARK, width="1100px", height="500px"))  # init options
    .add(
        "土地出让形式",
        data_pair_1,
        center=["25%", "50%"],
        label_opts=opts.LabelOpts(is_show=True),
    )
    .set_colors(['red', 'blue', 'purple'])
    .add(
        "土地成交状态",
        data_pair_2,
        center=["70%", "50%"],
        label_opts=opts.LabelOpts(is_show=True),
    )
    .set_global_opts(  # global options
        title_opts=opts.TitleOpts(title="土地出让形式&土地成交状态占比"),
        legend_opts=opts.LegendOpts(is_show=False)
    )
    .set_series_opts(  # series options
        tooltip_opts=opts.TooltipOpts(
            trigger="item", formatter="{a} <br/>{b}: {c} ({d}%)"
        )
    )
    .render("pie_.html")
)
Here we run the statistics and visualize them as pyecharts pie charts. In the available data, from September 2015 to February 2020, listing (挂牌) accounted for 67.73% of Chengdu's land transfers and auction (拍卖) for 31.45%, with tender (招标) a mere 0.82%. Less than half of the land offered went unsold or was withdrawn, while completed deals reached 65.77% of parcels. The overall closing rate is fairly high, probably because there were many interested bidders and asking prices were reasonable.
import pandas as pd
import pyecharts.options as opts
from pyecharts.charts import Bar
from pyecharts.globals import CurrentConfig, ThemeType

CurrentConfig.ONLINE_HOST = 'D:/python/pyecharts-assets-master/assets/'
df = pd.read_excel('real_estate_info.xlsx').loc[:, ['推出时间', '土地面积', '规划建筑面积']]
date = df['推出时间'].str.split('年', expand=True)[0]  # split the strings on '年' to extract the year
df['年份'] = date  # add a new year column
# strip the trailing '平' and convert to float
df['土地面积'] = df['土地面积'].str[:-1].map(float)
df['规划建筑面积'] = df['规划建筑面积'].str[:-1].map(float)
# group by year, sum, and convert the unit to 万m²
land_area = df.groupby('年份').agg({'土地面积': 'sum'}) / 10000
planned_area = df.groupby('年份').agg({'规划建筑面积': 'sum'}) / 10000
# <class 'pandas.core.frame.DataFrame'>
print(land_area, type(land_area))
print(planned_area, type(planned_area))
# 2016-2019 are complete; 2020 has only two months of data and 2015 starts in September, so drop both
years = [int(y) for y in land_area.index[1:-1]]
# keep two decimal places for the areas
ydata_1 = [float('{:.2f}'.format(i)) for i in land_area['土地面积'][1:-1]]
ydata_2 = [float('{:.2f}'.format(j)) for j in planned_area['规划建筑面积'][1:-1]]
# draw the bar chart
bar = (
    Bar(init_opts=opts.InitOpts(theme=ThemeType.DARK))
    .add_xaxis(xaxis_data=years)
    .add_yaxis(
        series_name='土地面积(万m²)',
        yaxis_data=ydata_1,
        label_opts=opts.LabelOpts(is_show=False)
    )
    .add_yaxis(
        series_name='规划建筑面积(万m²)',
        yaxis_data=ydata_2,
        label_opts=opts.LabelOpts(is_show=False)
    )
    .set_global_opts(
        xaxis_opts=opts.AxisOpts(name='年份'),
        yaxis_opts=opts.AxisOpts(name='万m²')
    )
    .set_series_opts(
        markpoint_opts=opts.MarkPointOpts(
            data=[
                opts.MarkPointItem(type_="max", name="最大值"),
                opts.MarkPointItem(type_="min", name="最小值"),
            ]
        ),
    )
    .render('bar_.html')
)
From 2016 to 2019 the land transaction area grew year by year, peaking in 2018 with a total planned construction area of 41.56 million m² (4,156.15 万m²); in 2019 the transaction area fell back somewhat from the 2018 level.
import pandas as pd
from pyecharts import options as opts
from pyecharts.charts import Bar
from pyecharts.globals import CurrentConfig, ThemeType

CurrentConfig.ONLINE_HOST = 'D:/python/pyecharts-assets-master/assets/'
df = pd.read_excel('real_estate_info.xlsx').loc[:, ['推出时间', '土地面积', '规划建筑面积']]
df['土地面积'] = df['土地面积'].str[:-1].map(float)
df['规划建筑面积'] = df['规划建筑面积'].str[:-1].map(float)
date = df['推出时间'].str.split('月', expand=True)[0]  # split the strings on '月' to get the year-month part
date = date.apply(lambda x: x + '月')  # append '月' back to each value
# print(date)
df['月份'] = date
# keep only 2019 and 2020
df1 = df[(df['推出时间'].str[:4] == '2020') | (df['推出时间'].str[:4] == '2019')]
df2 = df1.groupby('月份').agg({'土地面积': 'sum'}) / 10000
df3 = df1.groupby('月份').agg({'规划建筑面积': 'sum'}) / 10000
# print(df2)
# print(df3)
month = df2.index.tolist()
ydata_1 = [float('{:.2f}'.format(i)) for i in df2['土地面积']]
ydata_2 = [float('{:.2f}'.format(j)) for j in df3['规划建筑面积']]
bar = (
    Bar(init_opts=opts.InitOpts(theme=ThemeType.DARK))
    .add_xaxis(xaxis_data=month)
    .add_yaxis(
        series_name='土地面积(万m²)',
        yaxis_data=ydata_1,
        stack='stack1',  # stack the two series
        label_opts=opts.LabelOpts(is_show=False)
    )
    .add_yaxis(
        series_name='规划建筑面积(万m²)',
        yaxis_data=ydata_2,
        stack='stack1',
        label_opts=opts.LabelOpts(is_show=False)
    )
    .reversal_axis()  # flip into a horizontal bar chart
    .set_global_opts(
        xaxis_opts=opts.AxisOpts(name='万m²'),
        yaxis_opts=opts.AxisOpts(name='月份')
    )
    .render('reverse_bar.html')
)
Looking at the monthly figures from January 2019 to February 2020, Chengdu's land market was quite active in 2019, with large swings in transaction area; planned construction area peaked at 8.17 million m² (817.47 万m²) in December 2019. In January and February 2020 the transaction area dropped sharply, partly, perhaps, because of the COVID-19 outbreak at the start of the year.
import pandas as pd
from pyecharts.charts import Radar
from pyecharts import options as opts
from pyecharts.globals import CurrentConfig, ThemeType

CurrentConfig.ONLINE_HOST = 'D:/python/pyecharts-assets-master/assets/'
df = pd.read_excel('real_estate_info.xlsx')['规划用途']
datas = df.value_counts()
items = datas.index.tolist()
colors = ['#FF0000', '#FF4500', '#00FA9A', '#FFFFF0', '#FFD700']
# RadarIndicatorItem: radar-chart indicator configuration
labels = [opts.RadarIndicatorItem(name=items[i], max_=50, color=colors[i]) for i in range(len(items))]
value = [int(j) for j in datas.values]
radar = (
    Radar(init_opts=opts.InitOpts(theme=ThemeType.DARK))
    .add_schema(
        schema=labels
    )
    .add(
        series_name='土地规划用途占比(%)',
        data=[[round((x / sum(value)) * 100, 3) for x in value]],
        areastyle_opts=opts.AreaStyleOpts(opacity=0.5, color='blue')  # area fill color
    )
    .render('radar.html')
)
Industrial land dominates the transactions, at 43.667%; commercial/office, comprehensive, and other uses take sizeable shares, while residential land accounts for only 5.098%. This indirectly reflects Chengdu's emphasis on industrial development: according to published figures, during the 12th Five-Year Plan period Chengdu's industry grew at roughly 14.4% per year, first among the 15 sub-provincial cities, powering the city's GDP past the trillion-yuan mark.
import matplotlib.pyplot as plt
import numpy as np
import seaborn as sns
import pandas as pd
import matplotlib as mpl

df = pd.read_excel('real_estate_info.xlsx')
area = df['土地位置']
# Chengdu's main districts, counties, and cities: 9 districts, 6 counties, 4 cities
with open('test.txt', encoding='utf-8') as f:  # district names separated by '、'
    areas = f.read().split('、')
for item in areas:
    # for each district, check every row:
    # if the land location contains the district name, take the planned construction
    # area (trailing '平' stripped, converted to float), else 0 -
    # yielding 19 columns named after the districts
    df[item] = [float(df.loc[x, '规划建筑面积'][:-1]) if item in df.loc[x, '土地位置'] else 0
                for x in range(len(df['土地位置']))]
date = df['推出时间'].str.split('年', expand=True)[0]  # split on '年' to extract the year
df['年份'] = date  # add a year column
df1 = df[areas]
df1.index = df['年份']
df2 = df1.groupby('年份').sum()
# print(df2.iloc[:5, ::])  # 2020 only has data through February, so drop it
# print(type(df2.iloc[:5, ::].T))  # transpose
datas = np.array(df2.iloc[:5, ::].T)  # 2-D array: 19 rows × 5 columns
print(datas, type(datas))
x_label = [year for year in range(2015, 2020)]
y_label = areas
mpl.rcParams['font.family'] = 'Kaiti'
fig, ax = plt.subplots(figsize=(15, 9))
# draw the heat map; cmap maps numbers into a color space
sns.heatmap(data=df2.iloc[:5, ::].T, linewidths=0.25,
            linecolor='black', ax=ax, annot=True,
            fmt='.1f', cmap='OrRd', robust=True,
            )
# add axis labels and a title
ax.set_xlabel('年份', fontdict={'size': 18, 'weight': 'bold'})
ax.set_ylabel('行政区', fontdict={'size': 18, 'weight': 'bold'})
ax.set_title(r'各行政区2015-2019年的总规划建筑面积(平方米)', fontsize=25, x=0.5, y=1.02)
# hide the spines
for spine in ['top', 'right', 'left', 'bottom']:
    ax.spines[spine].set_visible(False)
# save and show the figure
plt.savefig('heat_map.png')
plt.show()
By district, every administrative area except Shuangliu County and Pi County saw some land sold each year; Longquanyi District and Qingbaijiang District had the largest transaction areas in 2018-2019, reflecting a hot land market there.