
How do I scrape page data with Python and insert it into MySQL?

Source: compiled from the web; posted by a forum user (contact the site admin for removal in case of copyright issues). 2018-08-13



Reference answer:

It should be (%s, %s, %s, %s, %s, %s), shouldn't it?

Another reference answer:

# -*- coding: utf-8 -*-"""Created on Mon Aug 31 20:05:25 2015@author: wt"""import requestsfrom bs4 import BeautifulSoupimport MySQLdbimport MySQLdb.cursorsimport sysreload(sys)sys.setdefaultencoding('utf8')#def getInfo(url):proxy_info = []headers = {'User-Agent':'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/31.0.1650.63 Safari/537.36'}page_code = requests.get('http://www.xici.net.co/nn', headers=headers).textsoup = BeautifulSoup(page_code)table_soup = soup.find('table')proxy_list = table_soup.findAll('tr')[1:]conn = MySQLdb.connect(host='10.10.21.21', user='root',                    passwd='123456', db='python', port = 3306, charset = 'utf8')                    cur = conn.cursor()for tr in proxy_list:    td_list = tr.findAll('td')    ip = td_list[2].string    port = td_list[3].string    location = td_list[4].string or td_list[4].find('a').string    anonymity = td_list[5].string    proxy_type = td_list[6].string    speed = td_list[7].find('div', {'class': 'bar'})['title']    connect_time = td_list[8].find('div', {'class': 'bar'})['title']    validate_time = td_list[9].string    # strip    l = [ip, port, location, anonymity, proxy_type, speed, connect_time, validate_time]    cur.execute("insert into proxy(ip, port, location, anonymity, proxy_type, speed, connect_time, validate_time) values(%s,%s,%s,%s,%s,%s,%s,%s)", (l[0], l[1], l[2], l[3], l[4], l[5], l[6], l[7]))    print 'success connect'    conn.commit()cur.close()conn.close()
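A note on the answer above: it is Python 2 code (reload(sys), the print statement) built on the MySQLdb driver. For reference only, a rough sketch of the same insert under Python 3 with the PyMySQL driver — assuming PyMySQL is installed and the same proxy table exists — might look like this:

# Sketch only (assumption): Python 3 / PyMySQL version of the database part.
import pymysql

conn = pymysql.connect(host='10.10.21.21', user='root', password='123456',
                       db='python', port=3306, charset='utf8')
try:
    with conn.cursor() as cur:
        # row holds the eight values scraped from one table row, as in the loop above
        row = (ip, port, location, anonymity, proxy_type,
               speed, connect_time, validate_time)
        cur.execute(
            "insert into proxy(ip, port, location, anonymity, proxy_type, "
            "speed, connect_time, validate_time) "
            "values(%s,%s,%s,%s,%s,%s,%s,%s)", row)
    conn.commit()
finally:
    conn.close()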

I had never been able to insert this data into the database, and the root cause was a mistake in my insert statement.
Originally it read:
cur.execute("insert into proxy(ip, port, location, anonymity, proxy_type, speed, connect_time, validate_time) values(%s%s%s%s%s%s%s%s)", (l[0], l[1], l[2], l[3], l[4], l[5], l[6], l[7]))

Running the program kept raising an error, and none of the many fixes I looked up explained it.
Today I finally noticed that this one statement was the problem: the %s placeholders were missing the commas between them.
The correct version is:
cur.execute("insert into proxy(ip, port, location, anonymity, proxy_type, speed, connect_time, validate_time) values(%s,%s,%s,%s,%s,%s,%s,%s)", (l[0], l[1], l[2], l[3], l[4], l[5], l[6], l[7]))

That finally solved the problem.
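One way to avoid this kind of placeholder typo is to build the placeholder string from the column list instead of typing it by hand. The snippet below is a minimal sketch, not part of the original answer; it assumes the same proxy table and the 8-element row list l built in the scraping loop above:

# Sketch (assumption): derive the "%s,%s,...,%s" placeholder string from the
# column names so the commas can never be mistyped.
columns = ['ip', 'port', 'location', 'anonymity', 'proxy_type',
           'speed', 'connect_time', 'validate_time']
placeholders = ','.join(['%s'] * len(columns))        # -> "%s,%s,%s,%s,%s,%s,%s,%s"
sql = "insert into proxy(%s) values(%s)" % (','.join(columns), placeholders)

cur.execute(sql, tuple(l))   # l is the row list built in the scraping loop
conn.commit()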
