Crawling Web Page Data with Java's HttpClient
A web crawler is a program that accesses resources on the network on our behalf. We have always visited pages on the Internet over the HTTP protocol, and a crawler is simply a program that visits those pages over that same protocol.
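To make that concrete before introducing Apache HttpClient, here is a minimal sketch of fetching a page over HTTP with nothing but the JDK. This example is an illustration added here, not part of the original article:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class RawHttpDemo {
    public static void main(String[] args) throws Exception {
        // Open a plain HTTP connection using only the JDK, no libraries
        URL url = new URL("https://movie.douban.com/top250");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);  // the raw HTML of the page
            }
        } finally {
            conn.disconnect();
        }
    }
}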
1. pom dependencies
<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpclient</artifactId>
    <version>4.5.2</version>
</dependency>
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-log4j12</artifactId>
    <version>1.7.25</version>
</dependency>

2. log4j configuration file
log4j.properties
log4j.rootLogger=DEBUG,A1
log4j.logger.com.yfy = DEBUG

log4j.appender.A1=org.apache.log4j.ConsoleAppender
log4j.appender.A1.layout=org.apache.log4j.PatternLayout
log4j.appender.A1.layout.ConversionPattern=%-d{yyyy-MM-dd HH:mm:ss,SSS} [%t] [%c]-[%p] %m%n
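The slf4j-log4j12 dependency bridges the SLF4J API to log4j, so application code logs through SLF4J and the output is formatted by the A1 appender above. A minimal sketch of how that looks (the class name here is hypothetical, not from the original article):

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class LoggingDemo {
    // SLF4J facade; the slf4j-log4j12 binding routes these calls to log4j
    private static final Logger logger = LoggerFactory.getLogger(LoggingDemo.class);

    public static void main(String[] args) {
        logger.debug("about to fetch a page");  // printed by the A1 console appender
    }
}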
3. GET request

import org.apache.http.client.config.RequestConfig;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.client.utils.URIBuilder;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.util.EntityUtils;

import java.io.IOException;
import java.net.URISyntaxException;

public class HttpGetTest {
    public static void main(String[] args) throws URISyntaxException {
        // 1. Create the HttpClient object
        CloseableHttpClient httpClient = HttpClients.createDefault();

        // Target address: https://movie.douban.com/top250?start=25
        // Build it with a URIBuilder
        URIBuilder uriBuilder = new URIBuilder("https://movie.douban.com/top250");
        uriBuilder.setParameter("start", "25");

        // 2. Create the HttpGet object with the request URL
        HttpGet httpGet = new HttpGet(uriBuilder.build());

        // Configure the request: sometimes, because of the network or the target
        // server, a request needs more time to complete, so we customize the timeouts
        RequestConfig config = RequestConfig.custom()
                .setConnectTimeout(1000)          // max time to establish the connection, in milliseconds
                .setConnectionRequestTimeout(500) // max time to obtain a connection from the manager
                .setSocketTimeout(10 * 1000)      // max time for data transfer
                .build();
        httpGet.setConfig(config); // apply the timeout configuration to this request

        System.out.println("Request: " + httpGet);

        // 3. Send the request with HttpClient and obtain the response
        CloseableHttpResponse response = null;
        try {
            response = httpClient.execute(httpGet);
            // 4. Parse the response
            if (response.getStatusLine().getStatusCode() == 200) {
                String content = EntityUtils.toString(response.getEntity(), "utf8");
                System.out.println(content);
                System.out.println(content.length());
            }
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (response != null) {
                try {
                    response.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
            try {
                httpClient.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}
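A practical caveat when crawling real sites: many servers reject clients whose request headers do not resemble a browser's. The snippet below is an illustrative addition, not part of the original article; it shows how headers could be set on the same httpGet before executing it (the header values are examples):

// Optional: present browser-like headers; some servers refuse the library default
httpGet.setHeader("User-Agent",
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36");
httpGet.setHeader("Accept", "text/html,application/xhtml+xml");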
4. POST request

import org.apache.http.NameValuePair;
import org.apache.http.client.entity.UrlEncodedFormEntity;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.message.BasicNameValuePair;
import org.apache.http.util.EntityUtils;

import java.io.IOException;
import java.io.UnsupportedEncodingException;
import java.util.ArrayList;
import java.util.List;

public class HttpParamTest {
    public static void main(String[] args) throws UnsupportedEncodingException {
        // 1. Create the HttpClient object
        CloseableHttpClient httpClient = HttpClients.createDefault();

        // 2. Create the HttpPost object with the request URL
        HttpPost httpPost = new HttpPost("http://yun.itheima.com/search");

        System.out.println("Request: " + httpPost);

        // Declare a List to hold the form parameters
        List<NameValuePair> params = new ArrayList<>();
        params.add(new BasicNameValuePair("keys", "Java"));

        // Create the form Entity object and attach it to the request
        UrlEncodedFormEntity formEntity = new UrlEncodedFormEntity(params, "utf8");
        httpPost.setEntity(formEntity);

        // 3. Send the request with HttpClient and obtain the response
        CloseableHttpResponse response = null;
        try {
            response = httpClient.execute(httpPost);
            // 4. Parse the response
            if (response.getStatusLine().getStatusCode() == 200) {
                String content = EntityUtils.toString(response.getEntity(), "utf8");
                System.out.println(content.length());
            }
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (response != null) {
                try {
                    response.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
            try {
                httpClient.close();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
}
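Form encoding is not the only option for a request body. As an illustrative variation not covered by the original article, a JSON payload could be sent with a StringEntity and an explicit content type (the endpoint and payload here are made up):

import org.apache.http.entity.ContentType;
import org.apache.http.entity.StringEntity;

// Hypothetical JSON endpoint, for illustration only
HttpPost jsonPost = new HttpPost("http://example.com/api/search");
jsonPost.setEntity(new StringEntity("{\"keys\":\"Java\"}", ContentType.APPLICATION_JSON));
// execute it with the same try/catch/finally pattern as above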
5. Connection pool

If a new HttpClient is created for every request, clients are constantly being created and destroyed. A connection pool avoids this overhead by letting requests share managed connections.
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.impl.conn.PoolingHttpClientConnectionManager;
import org.apache.http.util.EntityUtils;

import java.io.IOException;

public class HttpClientPoolTest {
    public static void main(String[] args) {
        // Create the pooling connection manager
        PoolingHttpClientConnectionManager cm = new PoolingHttpClientConnectionManager();
        // Maximum total number of connections in the pool
        cm.setMaxTotal(100);
        // Maximum number of connections per host
        cm.setDefaultMaxPerRoute(10);
        // Send requests through the pooling connection manager
        doGet(cm);
        doGet(cm);
    }

    private static void doGet(PoolingHttpClientConnectionManager cm) {
        // Instead of creating a new HttpClient every time, get one backed by the pool
        CloseableHttpClient httpClient = HttpClients.custom().setConnectionManager(cm).build();

        HttpGet httpGet = new HttpGet("https://movie.douban.com/top250");
        CloseableHttpResponse response = null;
        try {
            response = httpClient.execute(httpGet);
            if (response.getStatusLine().getStatusCode() == 200) {
                String content = EntityUtils.toString(response.getEntity(), "utf8");
                System.out.println(content.length());
            }
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (response != null) {
                try {
                    response.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
            // Do not close the HttpClient here; its connections are managed by the pool
        }
    }
}
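The timeout settings from the GET example combine naturally with the pool: they can be registered once as the client's defaults. A brief sketch along those lines, reusing the values from section 3:

RequestConfig config = RequestConfig.custom()
        .setConnectTimeout(1000)
        .setConnectionRequestTimeout(500)
        .setSocketTimeout(10 * 1000)
        .build();

// Every request issued by this client inherits both the pool and the timeouts
CloseableHttpClient httpClient = HttpClients.custom()
        .setConnectionManager(cm)
        .setDefaultRequestConfig(config)
        .build();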
Summary

This article showed how to fetch web pages with Apache HttpClient: building GET requests with URIBuilder and custom timeouts, sending form data with POST, and reusing connections through a PoolingHttpClientConnectionManager.