Collecting Dianping Shop Information with Java and Crawler Proxy IPs

news/2024/7/5 19:20:46 Tags: java, crawler, json

Dianping shop URLs follow this format:
http://www.dianping.com/shop/6000000/
http://www.dianping.com/shop/6000001/

The ID after `shop` is sequential, ranging from 1 to 15 million. Many IDs do not correspond to an existing shop (those return a 404 error); the actual number of shops is around 7 million. This demo uses brute-force enumeration of the ID range, though you could also crawl the site by following links from each page.
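The enumeration approach can be sketched as follows. The `ShopIdEnumerator` class and its helpers are illustrative and not part of the original demo; only the URL format and ID range come from the article:

```java
import java.util.ArrayList;
import java.util.List;

public class ShopIdEnumerator {

    // Build the Dianping shop URL for a given numeric ID
    public static String shopUrl(long id) {
        return "http://www.dianping.com/shop/" + id + "/";
    }

    // Generate candidate URLs for an ID range; IDs with no shop behind them
    // will simply return 404 when fetched and can be skipped
    public static List<String> urlsInRange(long fromId, long toIdInclusive) {
        List<String> urls = new ArrayList<>();
        for (long id = fromId; id <= toIdInclusive; id++) {
            urls.add(shopUrl(id));
        }
        return urls;
    }

    public static void main(String[] args) {
        for (String url : urlsInRange(6000000L, 6000002L)) {
            System.out.println(url);
        }
    }
}
```

In practice you would feed these URLs to the crawler below instead of printing them.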

During collection you will quickly run into Dianping's strict anti-crawler measures: a single IP making one request per second gets a 403 error after roughly 500-1000 requests, and the IP is temporarily blocked. It is unblocked after a while, but if you refuse to give up and keep crawling heavily from a blocked IP, the block becomes permanent.
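Given the roughly 500-request ban threshold described above, one simple mitigation is to track how many requests each IP has made and retire it before the site blocks it. The `IpBudget` class below is an illustrative sketch of that idea, not part of the original demo; the threshold of 400 is an assumption chosen to stay under the observed limit:

```java
import java.util.HashMap;
import java.util.Map;

public class IpBudget {

    private final int maxRequestsPerIp;
    private final Map<String, Integer> counts = new HashMap<>();

    public IpBudget(int maxRequestsPerIp) {
        this.maxRequestsPerIp = maxRequestsPerIp;
    }

    // Returns true if the IP may still be used, and records one request against it;
    // returns false once the IP has exhausted its budget and should be rotated out
    public synchronized boolean tryAcquire(String ip) {
        int used = counts.getOrDefault(ip, 0);
        if (used >= maxRequestsPerIp) {
            return false; // retire this IP before the site blocks it
        }
        counts.put(ip, used + 1);
        return true;
    }

    public static void main(String[] args) {
        // Stay well below the observed ~500-request ban threshold
        IpBudget budget = new IpBudget(400);
        System.out.println(budget.tryAcquire("1.2.3.4"));
    }
}
```

A crawler thread would call `tryAcquire` before each request and ask the proxy API for a fresh IP whenever it returns false.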

This problem is easy to work around: use crawler proxy IPs and the 403 errors go away. This demo uses Data5U (无忧代理) proxy IPs: http://www.data5u.com/buy/dynamic.html

The code uses HtmlUnit and Jsoup:


import java.io.BufferedInputStream;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.util.ArrayList;
import java.util.List;

import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;
import org.jsoup.select.Elements;

import com.gargoylesoftware.htmlunit.BrowserVersion;
import com.gargoylesoftware.htmlunit.Page;
import com.gargoylesoftware.htmlunit.ProxyConfig;
import com.gargoylesoftware.htmlunit.WebClient;
import com.gargoylesoftware.htmlunit.WebResponse;
import com.gargoylesoftware.htmlunit.html.HtmlPage;
import com.gargoylesoftware.htmlunit.util.NameValuePair;

/**
 * This demo mainly tests the stability of (dynamic) crawler proxy IPs.
 * It works for sites such as Tianyancha (company info), eBay, Amazon,
 * Sina Weibo, court documents, classifieds, and so on.
 * It can also serve as a reference crawler project; to adapt it, modify
 * the webParseHtml method.
 */
public class TestDynamicIpContinue {

    public static List<String> ipList = new ArrayList<>();
    public static boolean gameOver = false;

    public static void main(String[] args) throws Exception {
        // How often to fetch a new batch of IPs, in seconds
        long fetchIpSeconds = 5;
        int testTime = 3;

        // Your Data5U proxy order number; IPs can only be fetched after filling it in
        String order = "88888888888888888888888888888";

        // The target URL to crawl
        String targetUrl = "http://www.dianping.com/shop/6000000/";

        // Referer header; required when crawling Taobao or Tmall
        String referer = "";
        // Enable HTTPS support
        boolean https = true;
        // Whether to print response header info
        boolean outputHeaderInfo = false;
        // Whether to execute JavaScript (enabling it slows things down)
        boolean useJS = false;
        // Request timeout in milliseconds
        int timeOut = 10000;

        if (order == null || "".equals(order)) {
            System.err.println("Please enter your (dynamic) crawler proxy order number");
            return;
        }
        System.out.println(">>>>>>>>>>>>>> Dynamic IP test started <<<<<<<<<<<<<<");
        System.out.println("***************");
        System.out.println("IP fetch interval: " + fetchIpSeconds + " seconds");
        System.out.println("Target URL: " + targetUrl);
        System.out.println("***************\n");
        TestDynamicIpContinue tester = new TestDynamicIpContinue();
        new Thread(tester.new GetIP(fetchIpSeconds * 1000, testTime, order, targetUrl, useJS, timeOut, referer, https, outputHeaderInfo)).start();

        while(!gameOver){
            try {
                Thread.sleep(100);
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
        System.out.println(">>>>>>>>>>>>>> Dynamic IP test finished <<<<<<<<<<<<<<");
        System.exit(0);
    }

    // Crawler thread: fetches the target page through a single proxy IP
    public class Crawler extends Thread {
        long sleepMs = 200;
        boolean useJs = false;
        String targetUrl = "";
        int timeOut = 5000;
        String ipport = "";

        String referer;
        boolean https;
        boolean outputHeaderInfo;

        @Override
        public void run() {
            webParseHtml(targetUrl);
        }

        public Crawler(long sleepMs, String targetUrl, boolean useJs, int timeOut, String ipport, String referer, boolean https, boolean outputHeader) {
            this.sleepMs = sleepMs;
            this.targetUrl = targetUrl;
            this.useJs = useJs;
            this.timeOut = timeOut;
            this.ipport = ipport;

            this.referer = referer;
            this.https = https;
            this.outputHeaderInfo = outputHeader;
        }
        public String webParseHtml(String url) {
            String html = "";
            BrowserVersion[] versions = { BrowserVersion.CHROME, BrowserVersion.FIREFOX_38, BrowserVersion.INTERNET_EXPLORER_11, BrowserVersion.INTERNET_EXPLORER_8};
            WebClient client = new WebClient(versions[(int)(versions.length * Math.random())]);
            try {
                client.getOptions().setThrowExceptionOnFailingStatusCode(false);
                client.getOptions().setJavaScriptEnabled(useJs);
                client.getOptions().setCssEnabled(false);
                client.getOptions().setThrowExceptionOnScriptError(false);
                client.getOptions().setTimeout(timeOut);
                client.getOptions().setAppletEnabled(true);
                client.getOptions().setGeolocationEnabled(true);
                client.getOptions().setRedirectEnabled(true);

                // For HTTPS sites, this line skips SSL certificate validation
                client.getOptions().setUseInsecureSSL(https);

                if (referer != null && !"".equals(referer)) {
                    client.addRequestHeader("Referer", referer);
                }

                if (ipport != null) {
                    ProxyConfig proxyConfig = new ProxyConfig((ipport.split(",")[0]).split(":")[0], Integer.parseInt((ipport.split(",")[0]).split(":")[1]));
                    client.getOptions().setProxyConfig(proxyConfig);
                }else {
                    System.out.print(".");
                    return "";
                }

                long startMs = System.currentTimeMillis();

                Page page = client.getPage(url);
                WebResponse response = page.getWebResponse();

                if (outputHeaderInfo) {
                    // Print response header info
                    List<NameValuePair> headers = response.getResponseHeaders();
                    for (NameValuePair nameValuePair : headers) {
                        System.out.println(nameValuePair.getName() + "-->" + nameValuePair.getValue());
                    }
                }

                boolean isJson = false ;
                if (response.getContentType().equals("application/json")) {
                    html = response.getContentAsString();
                    isJson = true ;
                }else if(page.isHtmlPage()){
                    html = ((HtmlPage)page).asXml();
                }

                long endMs = System.currentTimeMillis();

                Document doc = Jsoup.parse(html);
                System.out.println(getName() + " " + ipport + " took " + (endMs - startMs) + " ms: " + doc.select("title").text());
            } catch (Exception e) {
                System.err.println(ipport + ":" + e.getMessage());
            } finally {
                client.close();
            }
            return html;
        }

    }

    // Fetches dynamic proxy IPs on a schedule
    public class GetIP implements Runnable{
        long sleepMs = 1000;
        int maxTime = 3;
        String order = "";
        String targetUrl;
        boolean useJs;
        int timeOut;
        String referer;
        boolean https;
        boolean outputHeaderInfo;

        public GetIP(long sleepMs, int maxTime, String order, String targetUrl, boolean useJs, int timeOut, String referer, boolean https, boolean outputHeaderInfo) {
            this.sleepMs = sleepMs;
            this.maxTime = maxTime;
            this.order = order;
            this.targetUrl = targetUrl;
            this.useJs = useJs;
            this.timeOut = timeOut;
            this.referer=referer;
            this.https=https;
            this.outputHeaderInfo=outputHeaderInfo;
        }

        @Override
        public void run() {
            int time = 1;
            while (!gameOver) {
                // Stop after maxTime fetch rounds
                if (time > maxTime) {
                    gameOver = true;
                    break;
                }
                time++;
                try {
                    java.net.URL url = new java.net.URL("http://api.ip.data5u.com/dynamic/get.html?order=" + order + "&ttl&random=true");

                    HttpURLConnection connection = (HttpURLConnection) url.openConnection();
                    connection.setConnectTimeout(3000);

                    // Read the whole response; InputStream.available() only reports
                    // currently buffered bytes, so read until EOF instead of
                    // pre-sizing a buffer from it
                    java.io.ByteArrayOutputStream buffer = new java.io.ByteArrayOutputStream();
                    InputStream in = new BufferedInputStream(connection.getInputStream());
                    try {
                        byte[] chunk = new byte[4096];
                        int bytesRead;
                        while ((bytesRead = in.read(chunk)) != -1) {
                            buffer.write(chunk, 0, bytesRead);
                        }
                    } finally {
                        in.close();
                    }
                    String[] res = buffer.toString("UTF-8").split("\n");
                    System.out.println(">>>>>>>>>>>>>> IPs returned this round: " + res.length);
                    for (String ip : res) {
                        new Crawler(100, targetUrl, useJs, timeOut, ip, referer, https, outputHeaderInfo).start();
                    }
                } catch (Exception e) {
                    System.err.println(">>>>>>>>>>>>>> Error fetching IPs: " + e.getMessage());
                }
                try {
                    Thread.sleep(sleepMs);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
            }
        }
    }

    public String joinList(List<String> list){
        StringBuilder re = new StringBuilder();
        for (String string : list) {
            re.append(string).append(",");
        }
        return re.toString();
    }

    public String trim(String html) {
        if (html != null) {
            return html.replaceAll(" ", "").replaceAll("\n", "");
        }
        return null;
    }

}

Reposted from: https://blog.51cto.com/4123815/2308420

