
A Complete Tutorial on Tree Based Modeling from Scratch (in R Python)

Reposted from:

http://www.analyticsvidhya.com/blog/2016/04/complete-tutorial-tree-based-modeling-scratch-in-python/

Introduction

Tree based learning algorithms are considered to be among the best and most widely used supervised learning methods. Tree based methods empower predictive models with high accuracy, stability and ease of interpretation. Unlike linear models, they map non-linear relationships quite well. They are adaptable to solving either kind of problem at hand (classification or regression).

Methods like decision trees, random forest and gradient boosting are popularly used in all kinds of data science problems. Hence, for every analyst (freshers included), it's important to learn these algorithms and use them for modeling.

This tutorial is meant to help beginners learn tree based modeling from scratch. After the successful completion of this tutorial, one is expected to become proficient at using tree based algorithms and building predictive models.

Note: This tutorial requires no prior knowledge of machine learning. However, elementary knowledge of R or Python will be helpful. To get started you can follow the full tutorial in R and the full tutorial in Python.


Table of Contents

  • What is a Decision Tree? How does it work?
  • Regression Trees vs Classification Trees
  • How does a tree decide where to split?
  • What are the key parameters of model building and how can we avoid over-fitting in decision trees?
  • Are tree based models better than linear models?
  • Working with Decision Trees in R and Python
  • What are ensemble methods in tree based modeling?
  • What is Bagging? How does it work?
  • What is Random Forest? How does it work?
  • What is Boosting? How does it work?
  • Which is more powerful: GBM or XGBoost?
  • Working with GBM in R and Python
  • Working with XGBoost in R and Python
  • Where to Practice?


1. What is a Decision Tree? How does it work?

Decision tree is a type of supervised learning algorithm (having a pre-defined target variable) that is mostly used in classification problems. It works for both categorical and continuous input and output variables. In this technique, we split the population or sample into two or more homogeneous sets (or sub-populations) based on the most significant splitter / differentiator in the input variables.

    Example:-

Let's say we have a sample of 30 students with three variables: Gender (Boy/Girl), Class (IX/X) and Height (5 to 6 ft). 15 of these 30 play cricket in their leisure time. Now, I want to create a model to predict who will play cricket during the leisure period. In this problem, we need to segregate students who play cricket in their leisure time based on the most significant input variable among all three.

This is where a decision tree helps: it will segregate the students based on all values of the three variables and identify the variable which creates the best homogeneous sets of students (sets which are heterogeneous to each other). In the snapshot below, you can see that the variable Gender is able to identify the best homogeneous sets compared to the other two variables.

As mentioned above, the decision tree identifies the most significant variable and the value of it that gives the best homogeneous sets of population. Now the question which arises is: how does it identify the variable and the split? To do this, the decision tree uses various algorithms, which we will discuss in the following section.

    Types of Decision Trees

The type of decision tree depends on the type of target variable we have. It can be of two types:

  • Categorical Variable Decision Tree: A decision tree which has a categorical target variable is called a categorical variable decision tree. Example: In the above scenario of the student problem, the target variable was "Student will play cricket or not", i.e. YES or NO.
  • Continuous Variable Decision Tree: A decision tree which has a continuous target variable is called a continuous variable decision tree. Example: Let's say we have a problem to predict whether a customer will pay his renewal premium with an insurance company (yes/no). Here we know that the customer's income is a significant variable, but the insurance company does not have income details for all customers. Now, since we know this is an important variable, we can build a decision tree to predict customer income based on occupation, product and various other variables. In this case, we are predicting values of a continuous variable.


Important Terminology related to Decision Trees

    Let’s look at the basic terminology used with Decision trees:

  • Root Node: It represents the entire population or sample, and it further gets divided into two or more homogeneous sets.
  • Splitting: The process of dividing a node into two or more sub-nodes.
  • Decision Node: When a sub-node splits into further sub-nodes, it is called a decision node.
  • Leaf / Terminal Node: Nodes that do not split are called leaf or terminal nodes.
  • Pruning: When we remove sub-nodes of a decision node, the process is called pruning. You can say it is the opposite of splitting.
  • Branch / Sub-Tree: A sub-section of the entire tree is called a branch or sub-tree.
  • Parent and Child Node: A node which is divided into sub-nodes is called the parent node of those sub-nodes, whereas the sub-nodes are the children of the parent node.

These are the terms commonly used for decision trees. As we know, every algorithm has advantages and disadvantages; below are the important factors which one should know.


    Advantages

  • Easy to Understand: Decision tree output is very easy to understand, even for people from a non-analytical background. It does not require any statistical knowledge to read and interpret. Its graphical representation is very intuitive and users can easily relate it to their hypothesis.
  • Useful in Data exploration: Decision tree is one of the fastest ways to identify the most significant variables and the relation between two or more variables. With the help of decision trees, we can create new variables / features that have better power to predict the target variable. You can refer to the article (Trick to enhance power of regression model) for one such trick. It can also be used in the data exploration stage. For example, when we are working on a problem where we have information available in hundreds of variables, a decision tree will help to identify the most significant ones.
  • Less data cleaning required: It requires less data cleaning compared to some other modeling techniques. It is not influenced by outliers and missing values to a fair degree.
  • Data type is not a constraint: It can handle both numerical and categorical variables.
  • Non Parametric Method: Decision tree is considered to be a non-parametric method. This means that decision trees make no assumptions about the space distribution and the classifier structure.

    Disadvantages

  • Over fitting: Over fitting is one of the most practical difficulties for decision tree models. This problem gets solved by setting constraints on model parameters and by pruning (discussed in detail below).
  • Not fit for continuous variables: While working with continuous numerical variables, the decision tree loses information when it categorizes variables into different categories.


    2. Regression Trees vs Classification Trees

We all know that the terminal nodes (or leaves) lie at the bottom of the decision tree. This means that decision trees are typically drawn upside down, such that leaves are at the bottom and roots are at the top (shown below).

Both trees work in an almost similar fashion. Let's look at the primary differences and similarities between classification and regression trees:

  • Regression trees are used when the dependent variable is continuous. Classification trees are used when the dependent variable is categorical.
  • In the case of a regression tree, the value obtained by terminal nodes in the training data is the mean response of observations falling in that region. Thus, if an unseen data observation falls in that region, we'll make its prediction with the mean value.
  • In the case of a classification tree, the value (class) obtained by a terminal node in the training data is the mode of observations falling in that region. Thus, if an unseen data observation falls in that region, we'll make its prediction with the mode value.
  • Both trees divide the predictor space (independent variables) into distinct and non-overlapping regions. For the sake of simplicity, you can think of these regions as high-dimensional boxes.
  • Both trees follow a top-down greedy approach known as recursive binary splitting. We call it 'top-down' because it begins from the top of the tree, when all the observations are available in a single region, and successively splits the predictor space into two new branches down the tree. It is known as 'greedy' because the algorithm cares only about the current split (it looks for the best variable available), and not about future splits which would lead to a better tree.
  • This splitting process is continued until a user defined stopping criterion is reached. For example, we can tell the algorithm to stop once the number of observations per node becomes less than 50.
  • In both cases, the splitting process results in fully grown trees, until the stopping criterion is reached. But a fully grown tree is likely to overfit the data, leading to poor accuracy on unseen data. This brings in 'pruning'. Pruning is one of the techniques used to tackle overfitting. We'll learn more about it in a following section.


3. How does a tree decide where to split?

The decision of making strategic splits heavily affects a tree's accuracy. The decision criteria are different for classification and regression trees.

Decision trees use multiple algorithms to decide whether to split a node into two or more sub-nodes. The creation of sub-nodes increases the homogeneity of the resultant sub-nodes. In other words, we can say that the purity of the node increases with respect to the target variable. The decision tree splits the nodes on all available variables and then selects the split which results in the most homogeneous sub-nodes.

The algorithm selection is also based on the type of target variable. Let's look at the four most commonly used algorithms in decision trees:


    Gini Index

The Gini index says: if we select two items from a population at random, then they must be of the same class, and the probability of this is 1 if the population is pure.

  • It works with the categorical target variable "Success" or "Failure".
  • It performs only binary splits.
  • The higher the value of Gini, the higher the homogeneity.
  • CART (Classification and Regression Tree) uses the Gini method to create binary splits.

Steps to calculate Gini for a split:

  • Calculate Gini for the sub-nodes, using the formula: sum of squares of the probabilities of success and failure (p^2 + q^2).
  • Calculate Gini for the split using the weighted Gini score of each node of that split.

Example: Referring to the example used above, where we want to segregate the students based on the target variable (playing cricket or not). In the snapshot below, we split the population using the two input variables Gender and Class. Now, I want to identify which split produces more homogeneous sub-nodes using the Gini index.

    Split on Gender:

  • Gini for sub-node Female = (0.2)*(0.2) + (0.8)*(0.8) = 0.68
  • Gini for sub-node Male = (0.65)*(0.65) + (0.35)*(0.35) = 0.55
  • Weighted Gini for Split on Gender = (10/30)*0.68 + (20/30)*0.55 = 0.59

Similarly for Split on Class:

  • Gini for sub-node Class IX = (0.43)*(0.43) + (0.57)*(0.57) = 0.51
  • Gini for sub-node Class X = (0.56)*(0.56) + (0.44)*(0.44) = 0.51
  • Weighted Gini for Split on Class = (14/30)*0.51 + (16/30)*0.51 = 0.51

Above, you can see that the Gini score for Split on Gender is higher than for Split on Class; hence, the node split will take place on Gender. A short Python sketch of this calculation follows.
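Below is a minimal plain-Python sketch of the weighted Gini calculation just shown. The helper names (gini_score, weighted_gini) are mine, and the numbers come from the 30-student example.

def gini_score(p_success, p_failure):
    # Gini as defined in this article: p^2 + q^2 (1 means a pure node)
    return p_success ** 2 + p_failure ** 2

def weighted_gini(nodes):
    # nodes: list of (node_size, p_success) tuples for one candidate split
    total = sum(size for size, _ in nodes)
    return sum(size / total * gini_score(p, 1 - p) for size, p in nodes)

# Split on Gender: Female (10 students, 20% play), Male (20 students, 65% play)
print(round(weighted_gini([(10, 0.2), (20, 0.65)]), 2))       # -> 0.59

# Split on Class: IX (14 students, ~43% play), X (16 students, ~56% play)
print(round(weighted_gini([(14, 0.43), (16, 0.56)]), 2))      # -> 0.51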


    Chi-Square

Chi-square is an algorithm to find the statistical significance of the differences between sub-nodes and the parent node. We measure it by the sum of squares of the standardized differences between the observed and expected frequencies of the target variable.

  • It works with the categorical target variable "Success" or "Failure".
  • It can perform two or more splits.
  • The higher the value of Chi-square, the higher the statistical significance of differences between sub-node and parent node.
  • The Chi-square of each node is calculated using the formula: Chi-square = ((Actual - Expected)^2 / Expected)^(1/2)
  • It generates a tree called CHAID (Chi-square Automatic Interaction Detector).

Steps to calculate Chi-square for a split:

  • Calculate the Chi-square for each individual node by calculating the deviation for both Success and Failure.
  • Calculate the Chi-square of the split using the sum of all Chi-square values of Success and Failure of each node of the split.

Example: Let's work with the above example that we used to calculate Gini.

    Split on Gender:

  • First we populate the Female node: fill in the actual values for "Play Cricket" and "Not Play Cricket"; here these are 2 and 8 respectively.
  • Calculate the expected values for "Play Cricket" and "Not Play Cricket"; here it would be 5 for both, because the parent node has a probability of 50% and we apply the same probability to the Female count (10).
  • Calculate the deviations using the formula Actual - Expected. For "Play Cricket" it is (2 - 5 = -3) and for "Not Play Cricket" it is (8 - 5 = 3).
  • Calculate the Chi-square of the node for "Play Cricket" and "Not Play Cricket" using the formula ((Actual - Expected)^2 / Expected)^(1/2). You can refer to the table below for the calculation.
  • Follow similar steps to calculate the Chi-square value for the Male node.
  • Now add all the Chi-square values to calculate the Chi-square for the split on Gender.

Split on Class:

Perform similar steps of calculation for the split on Class and you will come up with the table below.

Above, you can see that Chi-square also identifies the Gender split as more significant compared to Class. A Python sketch of the split-level calculation follows.
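Here is a minimal plain-Python sketch of the same split-level calculation (the article's per-cell value is the square root of (Actual - Expected)^2 / Expected, summed over the Success/Failure cells of every sub-node of the split). The helper names are mine; the numbers come from the student example.

import math

def cell_chi(actual, expected):
    # per-cell value: sqrt((Actual - Expected)^2 / Expected)
    return math.sqrt((actual - expected) ** 2 / expected)

def split_chi_square(nodes, parent_success_rate=0.5):
    # nodes: list of (n_success, n_failure) tuples, one per sub-node
    total = 0.0
    for success, failure in nodes:
        n = success + failure
        total += cell_chi(success, n * parent_success_rate)
        total += cell_chi(failure, n * (1 - parent_success_rate))
    return total

# Split on Gender: Female (2 play, 8 don't), Male (13 play, 7 don't)
print(round(split_chi_square([(2, 8), (13, 7)]), 2))   # -> 4.58

# Split on Class: IX (6 play, 8 don't), X (9 play, 7 don't)
print(round(split_chi_square([(6, 8), (9, 7)]), 2))    # -> 1.46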


    Information Gain:

Look at the image below and think which node can be described most easily. I am sure your answer is C, because it requires less information as all of its values are similar. On the other hand, B requires more information to describe it, and A requires the maximum information. In other words, we can say that C is a pure node, B is less impure and A is more impure.

Now we can draw the conclusion that a less impure node requires less information to describe it, and a more impure node requires more information. Information theory has a measure to define this degree of disorganization in a system, known as entropy. If the sample is completely homogeneous, the entropy is zero, and if the sample is equally divided (50% – 50%), it has an entropy of one.

Entropy can be calculated using the formula:

Entropy = -p log2(p) - q log2(q)

Here p and q are the probabilities of success and failure respectively in that node. Entropy is also used with categorical target variables. It chooses the split which has the lowest entropy compared to the parent node and other splits. The lesser the entropy, the better it is.

    Steps to calculate entropy for a split:

  • Calculate the entropy of the parent node.
  • Calculate the entropy of each individual node of the split and calculate the weighted average of all sub-nodes available in the split.

Example: Let's use this method to identify the best split for the student example.

  • Entropy for parent node = -(15/30) log2(15/30) - (15/30) log2(15/30) = 1. Here 1 shows that it is an impure node.
  • Entropy for Female node = -(2/10) log2(2/10) - (8/10) log2(8/10) = 0.72, and for Male node = -(13/20) log2(13/20) - (7/20) log2(7/20) = 0.93.
  • Entropy for split on Gender = weighted entropy of sub-nodes = (10/30)*0.72 + (20/30)*0.93 = 0.86
  • Entropy for Class IX node = -(6/14) log2(6/14) - (8/14) log2(8/14) = 0.99, and for Class X node = -(9/16) log2(9/16) - (7/16) log2(7/16) = 0.99.
  • Entropy for split on Class = (14/30)*0.99 + (16/30)*0.99 = 0.99
  • Above, you can see that the entropy for the split on Gender is the lowest of all, so the tree will split on Gender. We can derive information gain from entropy as 1 - entropy. A plain-Python sketch of these calculations follows.
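This minimal sketch reproduces the entropy numbers above; the helper names (entropy, split_entropy) are mine.

import math

def entropy(p):
    # two-class entropy: -p*log2(p) - q*log2(q); a pure node has zero entropy
    q = 1 - p
    if p == 0 or q == 0:
        return 0.0
    return -p * math.log2(p) - q * math.log2(q)

def split_entropy(nodes):
    # nodes: list of (node_size, p_success) tuples for one candidate split
    total = sum(size for size, _ in nodes)
    return sum(size / total * entropy(p) for size, p in nodes)

print(round(entropy(15 / 30), 2))                             # parent -> 1.0
print(round(split_entropy([(10, 0.2), (20, 0.65)]), 2))       # Gender -> 0.86
print(round(split_entropy([(14, 6 / 14), (16, 9 / 16)]), 2))  # Class -> 0.99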


    Reduction in Variance

Till now, we have discussed algorithms for categorical target variables. Reduction in variance is an algorithm used for continuous target variables (regression problems). This algorithm uses the standard formula of variance to choose the best split. The split with the lower variance is selected as the criterion to split the population:

Variance = sum of (X - X-bar)^2 / n, where X-bar is the mean of the values, X is an actual value and n is the number of values.

    Steps to calculate Variance:

  • Calculate the variance for each node.
  • Calculate the variance of each split as the weighted average of the variance of each node.

Example: Let's assign the numerical value 1 for playing cricket and 0 for not playing cricket. Now follow the steps to identify the right split:

  • Variance for Root node: here the mean value is (15*1 + 15*0)/30 = 0.5, and we have 15 ones and 15 zeros. The variance would be ((1-0.5)^2 + (1-0.5)^2 + ... 15 times + (0-0.5)^2 + (0-0.5)^2 + ... 15 times) / 30, which can be written as (15*(1-0.5)^2 + 15*(0-0.5)^2) / 30 = 0.25
  • Mean of Female node = (2*1 + 8*0)/10 = 0.2 and Variance = (2*(1-0.2)^2 + 8*(0-0.2)^2) / 10 = 0.16
  • Mean of Male node = (13*1 + 7*0)/20 = 0.65 and Variance = (13*(1-0.65)^2 + 7*(0-0.65)^2) / 20 = 0.23
  • Variance for Split on Gender = weighted variance of sub-nodes = (10/30)*0.16 + (20/30)*0.23 = 0.21
  • Mean of Class IX node = (6*1 + 8*0)/14 = 0.43 and Variance = (6*(1-0.43)^2 + 8*(0-0.43)^2) / 14 = 0.24
  • Mean of Class X node = (9*1 + 7*0)/16 = 0.56 and Variance = (9*(1-0.56)^2 + 7*(0-0.56)^2) / 16 = 0.25
  • Variance for Split on Class = (14/30)*0.24 + (16/30)*0.25 = 0.25
  • Above, you can see that the Gender split has a lower variance compared to the parent node, so the split would take place on the Gender variable. A plain-Python sketch of this check follows.
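This minimal sketch encodes "plays cricket" as 1 and "doesn't play" as 0 and reproduces the numbers above; the helper names are mine.

def variance(values):
    mean = sum(values) / len(values)
    return sum((x - mean) ** 2 for x in values) / len(values)

def split_variance(nodes):
    # nodes: list of lists of 0/1 outcomes, one list per sub-node
    total = sum(len(n) for n in nodes)
    return sum(len(n) / total * variance(n) for n in nodes)

root = [1] * 15 + [0] * 15
female, male = [1] * 2 + [0] * 8, [1] * 13 + [0] * 7
class_ix, class_x = [1] * 6 + [0] * 8, [1] * 9 + [0] * 7

print(variance(root))                                  # root -> 0.25
print(round(split_variance([female, male]), 2))        # Gender -> 0.21
print(round(split_variance([class_ix, class_x]), 2))   # Class -> 0.25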

Until here, we have learnt about the basics of decision trees and the decision making process involved in choosing the best splits when building a tree model. As I said, decision trees can be applied both to regression and classification problems. Let's understand these aspects in detail.


4. What are the key parameters of tree modeling and how can we avoid over-fitting in decision trees?

Overfitting is one of the key challenges faced while modeling decision trees. If no limit is set on the size of a decision tree, it will give you 100% accuracy on the training set, because in the worst case it will end up making one leaf per observation. Thus, preventing overfitting is pivotal while modeling a decision tree, and it can be done in 2 ways:

  • Setting constraints on tree size
  • Tree pruning
Let's discuss both of these briefly.

    Setting Constraints on Tree Size

This can be done by using the various parameters which define a tree. First, let's look at the general structure of a decision tree:

The parameters used for defining a tree are further explained below. The parameters described are independent of the tool; it is important to understand the role they play in tree modeling. These parameters are available in both R and Python, and a hedged sklearn sketch applying them follows the list.

  • Minimum samples for a node split
    • Defines the minimum number of samples (or observations) which are required in a node to be considered for splitting.
    • Used to control over-fitting. Higher values prevent the model from learning relations which might be highly specific to the particular sample selected for a tree.
    • Too high values can lead to under-fitting; hence, it should be tuned using CV.
  • Minimum samples for a terminal node (leaf)
    • Defines the minimum samples (or observations) required in a terminal node or leaf.
    • Used to control over-fitting similar to min_samples_split.
    • Generally lower values should be chosen for imbalanced class problems because the regions in which the minority class will be in majority will be very small.
  • Maximum depth of tree (vertical depth)
    • The maximum depth of a tree.
    • Used to control over-fitting as higher depth will allow model to learn relations very specific to a particular sample.
    • Should be tuned using CV.
  • Maximum number of terminal nodes
    • The maximum number of terminal nodes or leaves in a tree.
    • Can be defined in place of max_depth. Since binary trees are created, a depth of 'n' would produce a maximum of 2^n leaves.
  • Maximum features to consider for split
    • The number of features to consider while searching for the best split. These will be randomly selected.
    • As a thumb-rule, the square root of the total number of features works great, but we should check up to 30-40% of the total number of features.
    • Higher values can lead to over-fitting, but it depends on the case.
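Here is a hedged scikit-learn sketch mapping the constraints above onto the library's parameter names; the data set is synthetic and the values are illustrative only, to be tuned with cross-validation on your own data.

from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=10, random_state=0)

model = DecisionTreeClassifier(
    min_samples_split=30,   # minimum samples required in a node to split it
    min_samples_leaf=10,    # minimum samples required in a terminal node
    max_depth=5,            # maximum vertical depth of the tree
    max_leaf_nodes=20,      # cap on terminal nodes (alternative to max_depth)
    max_features="sqrt",    # features considered per split (the thumb-rule above)
    random_state=0,
)
model.fit(X, y)
print(model.get_depth(), model.get_n_leaves())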

    Tree Pruning

As discussed earlier, the technique of setting constraints is a greedy approach. In other words, it will check for the best split instantaneously and move forward until one of the specified stopping conditions is reached. Let's consider the following case when you're driving:

    There are 2 lanes:

  • A lane with cars moving at 80km/h
  • A lane with trucks moving at 30km/h
At this instant, you are the yellow car and you have 2 choices:

  • Take a left and overtake the other 2 cars quickly
  • Keep moving in the present lane
Let's analyze these choices. In the former choice, you'll immediately overtake the car ahead, reach behind the truck and start moving at 30 km/h, looking for an opportunity to move back right. All cars originally behind you move ahead in the meanwhile. This would be the optimum choice if your objective is to maximize the distance covered in the next, say, 10 seconds. In the latter choice, you sail through at the same speed, cross the trucks and then overtake, maybe, depending on the situation ahead. Greedy you!

This is exactly the difference between a normal decision tree and pruning. A decision tree with constraints won't see the truck ahead and will adopt a greedy approach by taking a left. On the other hand, if we use pruning, we in effect look a few steps ahead and make a choice.

So we know pruning is better. But how do we implement it in a decision tree? The idea is simple.

  • We first grow the decision tree to a large depth.
  • Then we start at the bottom and start removing leaves which are giving us negative returns when compared from the top.
  • Suppose a split is giving us a gain of, say, -10 (a loss of 10) and then the next split on that gives us a gain of 20. A simple decision tree will stop at step 1, but with pruning, we will see that the overall gain is +10 and keep both leaves.
  • Note that when this article was written, sklearn's decision tree classifier did not support pruning (cost-complexity pruning was added later, in version 0.22). Advanced packages like xgboost have adopted tree pruning in their implementation. The rpart library in R provides a function to prune. Good for R users! A hedged sketch of the newer sklearn pruning API follows.
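For completeness, here is a hedged sketch of scikit-learn's later cost-complexity pruning API (cost_complexity_pruning_path and ccp_alpha); the data set is a built-in toy set and the alpha sampling is illustrative.

from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Grow the tree fully, then list the effective alphas along the pruning path
path = DecisionTreeClassifier(random_state=0).cost_complexity_pruning_path(X_tr, y_tr)
for alpha in path.ccp_alphas[::10]:
    pruned = DecisionTreeClassifier(random_state=0, ccp_alpha=alpha).fit(X_tr, y_tr)
    print(f"alpha={alpha:.4f} leaves={pruned.get_n_leaves()} "
          f"test acc={pruned.score(X_te, y_te):.3f}")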


    5. Are tree based models better than linear models?

"If I can use logistic regression for classification problems and linear regression for regression problems, why is there a need to use trees?" Many of us have this question. And it is a valid one too.

Actually, you can use any algorithm. It depends on the type of problem you are solving. Let's look at some key factors which will help you decide which algorithm to use:

  • If the relationship between the dependent and independent variables is well approximated by a linear model, linear regression will outperform a tree based model.
  • If there is high non-linearity and a complex relationship between the dependent and independent variables, a tree model will outperform a classical regression method.
  • If you need to build a model which is easy to explain to people, a decision tree model will always do better than a linear model. Decision tree models are even simpler to interpret than linear regression!


    6. Working with Decision Trees in R and Python

For R users and Python users, decision trees are quite easy to implement. Let's quickly look at the code which can get you started with this algorithm. For ease of use, I've shared standard code where you'll need to replace your data set name and variables to get started.

    For R users, there are multiple packages available to implement decision tree such as ctree, rpart, tree etc.

> library(rpart)
> x <- cbind(x_train, y_train)
# grow tree
> fit <- rpart(y_train ~ ., data = x, method = "class")
> summary(fit)
# Predict Output
> predicted <- predict(fit, x_test)

    In the code above:

    • y_train – represents the dependent variable.
    • x_train – represents the independent variables.
    • x – represents the training data.


    For Python users, below is the code:

#Import Library
#Import other necessary libraries like pandas, numpy...
from sklearn import tree
#Assumed you have X (predictor) and Y (target) for the training data set and x_test (predictor) of the test dataset
# Create tree object
model = tree.DecisionTreeClassifier(criterion='gini') # for classification; you can set the criterion to gini or entropy (information gain), by default it is gini
# model = tree.DecisionTreeRegressor() for regression
# Train the model using the training sets and check score
model.fit(X, y)
model.score(X, y)
#Predict Output
predicted = model.predict(x_test)


7. What are ensemble methods in tree based modeling?

The literal meaning of the word 'ensemble' is group. Ensemble methods involve a group of predictive models working together to achieve better accuracy and model stability. Ensemble methods are known to impart a supreme boost to tree based models.

Like every other model, a tree based model also suffers from the plague of bias and variance. Bias means 'how much, on average, the predicted values differ from the actual values'. Variance means 'how different the predictions of the model at the same point would be if different samples were taken from the same population'.

If you build a small tree, you will get a model with low variance and high bias. How do you manage to balance the trade-off between bias and variance?

Normally, as you increase the complexity of your model, you will see a reduction in prediction error due to lower bias in the model. As you continue to make your model more complex, you end up over-fitting it, and your model will start suffering from high variance.

A champion model should maintain a balance between these two types of errors. This is known as the trade-off management of bias-variance errors. Ensemble learning is one way to perform this trade-off analysis.

    Some of the commonly used ensemble methods include: Bagging, Boosting and Stacking. In this tutorial, we’ll focus on Bagging and Boosting in detail.


    8. What is Bagging? How does it work?

Bagging is a technique used to reduce the variance of our predictions by combining the results of multiple classifiers modeled on different sub-samples of the same data set. The following figure will make it clearer:


    The steps followed in bagging are:

  • Create Multiple DataSets:
    • Sampling is done with replacement on the original data, and new datasets are formed.
    • The new data sets can have a fraction of the columns as well as of the rows; these are generally hyper-parameters in a bagging model.
    • Taking row and column fractions of less than 1 helps in making robust models that are less prone to overfitting.
  • Build Multiple Classifiers:
    • Classifiers are built on each data set.
    • Generally the same classifier is modeled on each data set and predictions are made.
  • Combine Classifiers:
    • The predictions of all the classifiers are combined using a mean, median or mode value depending on the problem at hand.
    • The combined values are generally more robust than a single model.
  • Note that here the number of models built is not a hyper-parameter. A higher number of models is always better than, or may give similar performance to, a lower number. It can be theoretically shown that, under some assumptions, the variance of the combined predictions is reduced to 1/n (n: number of classifiers) of the original variance. A hedged sklearn sketch of these steps follows.
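Here is a hedged scikit-learn sketch of the steps above using BaggingClassifier, whose default base model is a decision tree; max_samples and max_features are the row and column fractions just discussed, and all values are illustrative.

from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier

X, y = make_classification(n_samples=1000, random_state=0)

bagging = BaggingClassifier(
    n_estimators=100,    # number of bootstrap models to combine
    max_samples=0.8,     # row fraction per data set
    max_features=0.8,    # column fraction per data set
    bootstrap=True,      # sample rows with replacement
    random_state=0,
)
bagging.fit(X, y)
print(bagging.score(X, y))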

    There are various implementations of bagging models. Random forest is one of them and we’ll discuss it next.


9. What is Random Forest? How does it work?

Random Forest is considered to be a panacea for all data science problems. On a funny note, when you can't think of any algorithm (irrespective of the situation), use random forest!

Random Forest is a versatile machine learning method capable of performing both regression and classification tasks. It also undertakes dimensionality reduction, treats missing values and outlier values, and covers other essential steps of data exploration, and it does a fairly good job. It is a type of ensemble learning method, where a group of weak models combine to form a powerful model.


    How does it work?

In Random Forest, we grow multiple trees, as opposed to a single tree in the CART model (see the comparison between CART and Random Forest here, part1 and part2). To classify a new object based on its attributes, each tree gives a classification and we say the tree "votes" for that class. The forest chooses the classification having the most votes (over all the trees in the forest), and in the case of regression, it takes the average of the outputs of the different trees.

It works in the following manner. Each tree is planted and grown as follows:

  • Assume the number of cases in the training set is N. Then, a sample of these N cases is taken at random, but with replacement. This sample will be the training set for growing the tree.
  • If there are M input variables, a number m < M is specified such that at each node, m variables are selected at random out of the M. The best split on these m is used to split the node. The value of m is held constant while we grow the forest.
  • Each tree is grown to the largest extent possible, and there is no pruning.
  • New data are predicted by aggregating the predictions of the ntree trees (i.e., majority votes for classification, average for regression).
  • To understand this algorithm in more detail through a case study, please read the article "Introduction to Random forest – Simplified".


    Advantages of Random Forest

    • This algorithm can solve both types of problems, i.e. classification and regression, and does a decent estimation on both fronts.
    • One of the benefits of Random Forest which excites me most is its power to handle large data sets with higher dimensionality. It can handle thousands of input variables and identify the most significant ones, so it is considered one of the dimensionality reduction methods. Further, the model outputs the importance of variables, which can be a very handy feature (on some random data set).
    • It has an effective method for estimating missing data and maintains accuracy when a large proportion of the data is missing.
    • It has methods for balancing errors in data sets where classes are imbalanced.
    • The capabilities of the above can be extended to unlabeled data, leading to unsupervised clustering, data views and outlier detection.
    • Random Forest involves sampling of the input data with replacement, called bootstrap sampling. Here about one third of the data is not used for training and can be used for testing; these are called the out of bag samples. The error estimated on these out of bag samples is known as the out of bag error. Studies of out-of-bag error estimates give evidence that the out-of-bag estimate is as accurate as using a test set of the same size as the training set. Therefore, using the out-of-bag error estimate removes the need for a set-aside test set. A minimal sketch of this estimate follows.
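Here is a minimal scikit-learn sketch of the out-of-bag estimate described in the last point, on a synthetic data set; oob_score=True asks the forest to score each observation using only the trees that never saw it.

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

forest = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
forest.fit(X, y)

print("OOB accuracy:", round(forest.oob_score_, 3))   # OOB error = 1 - oob_score_
print("Importances:", forest.feature_importances_.round(3)[:5])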


    Disadvantages of Random Forest

    • It surely does a good job at classification, but not as good for regression problems, as it does not give precise continuous predictions. In the case of regression, it doesn't predict beyond the range seen in the training data, and it may over-fit data sets that are particularly noisy.
    • Random Forest can feel like a black box approach for statistical modelers – you have very little control over what the model does. You can, at best, try different parameters and random seeds!


    Python & R implementation

Random forests have commonly known implementations in R packages and in Python's scikit-learn. Let's look at the code for loading a random forest model in R and Python below:

    Python

#Import Library
from sklearn.ensemble import RandomForestClassifier #use RandomForestRegressor for regression problems
#Assumed you have X (predictor) and Y (target) for the training data set and x_test (predictor) of the test dataset
# Create Random Forest object
model = RandomForestClassifier(n_estimators=1000)
# Train the model using the training sets and check score
model.fit(X, y)
#Predict Output
predicted = model.predict(x_test)


R Code

> library(randomForest)
> x <- cbind(x_train, y_train)
# Fitting model
> fit <- randomForest(Species ~ ., x, ntree = 500)
> summary(fit)
# Predict Output
> predicted <- predict(fit, x_test)


10. What is Boosting? How does it work?

Definition: The term 'Boosting' refers to a family of algorithms which convert weak learners into strong learners.

    Let’s understand this definition in detail by solving a problem of spam email identification:

How would you classify an email as SPAM or not? Like everyone else, our initial approach would be to identify 'spam' and 'not spam' emails using the following criteria. If:

  • Email has only one image file (promotional image): it's a SPAM
  • Email has only link(s): it's a SPAM
  • Email body consists of a sentence like "You won a prize money of $ xxxxxx": it's a SPAM
  • Email is from our official domain "Analyticsvidhya.com": not a SPAM
  • Email is from a known source: not a SPAM

Above, we've defined multiple rules to classify an email into 'spam' or 'not spam'. But do you think these rules individually are strong enough to successfully classify an email? No.

Individually, these rules are not powerful enough to classify an email into 'spam' or 'not spam'. Therefore, these rules are called weak learners.

To convert weak learners into a strong learner, we'll combine the prediction of each weak learner using methods like:

    • Using the average / weighted average
    • Considering the prediction with the higher vote

For example: above, we defined 5 weak learners. Out of these 5, 3 voted 'SPAM' and 2 voted 'Not a SPAM'. In this case, by default, we'll consider the email as SPAM because we have the higher vote (3) for 'SPAM'.


    How does it work?

Now we know that boosting combines weak learners, a.k.a. base learners, to form a strong rule. An immediate question which should pop up in your mind is: 'How does boosting identify weak rules?'

To find a weak rule, we apply base learning (ML) algorithms with a different distribution each time. Each time the base learning algorithm is applied, it generates a new weak prediction rule. This is an iterative process. After many iterations, the boosting algorithm combines these weak rules into a single strong prediction rule.

Here's another question which might haunt you: 'How do we choose a different distribution for each round?'

    For choosing the right distribution, here are the following steps:

Step 1: The base learner takes all the distributions and assigns equal weight or attention to each observation.

Step 2: If there is any prediction error caused by the first base learning algorithm, then we pay higher attention to the observations having prediction errors. Then we apply the next base learning algorithm.

Step 3: Iterate Step 2 till the limit of the base learning algorithm is reached or higher accuracy is achieved.

Finally, it combines the outputs from the weak learners and creates a strong learner which eventually improves the prediction power of the model. Boosting pays higher focus to examples which are mis-classified or have higher errors under the preceding weak rules.

There are many boosting algorithms which impart an additional boost to a model's accuracy. In this tutorial, we'll learn about the two most commonly used algorithms, i.e. Gradient Boosting (GBM) and XGBoost. Before that, note that the re-weighting scheme just described is essentially what the classic AdaBoost algorithm does, sketched below.
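Here is a hedged scikit-learn sketch of AdaBoost on a synthetic data set; its default weak learner is a depth-1 decision stump, and each round re-weights the training observations as in Step 2 above.

from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

X, y = make_classification(n_samples=1000, random_state=0)

# 200 boosting rounds; each round up-weights the observations the previous
# stumps mis-classified and fits a new weak learner on that distribution
booster = AdaBoostClassifier(n_estimators=200, random_state=0)
booster.fit(X, y)
print(booster.score(X, y))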


11. Which is more powerful: GBM or XGBoost?

I've always admired the boosting capabilities of the xgboost algorithm. At times, I've found that it provides better results compared to GBM implementations, but at times you might find that the gains are just marginal. When I explored more about its performance and the science behind its high accuracy, I discovered many advantages of XGBoost over GBM:

  • Regularization:
    • Standard GBM implementations have no regularization like XGBoost has; this regularization helps to reduce overfitting.
    • In fact, XGBoost is also known as a 'regularized boosting' technique.
  • Parallel Processing:
    • XGBoost implements parallel processing and is blazingly fast compared to GBM.
    • But hang on, we know that boosting is a sequential process, so how can it be parallelized? We know that each tree can be built only after the previous one, so what stops us from making a tree using all cores? I hope you get where I'm coming from. Check this link out to explore further.
    • XGBoost also supports implementation on Hadoop.
  • High Flexibility
    • XGBoost allows users to define custom optimization objectives and evaluation criteria.
    • This adds a whole new dimension to the model and there is no limit to what we can do.
  • Handling Missing Values
    • XGBoost has an in-built routine to handle missing values.
    • The user is required to supply a value different from other observations and pass that as a parameter. XGBoost tries different things as it encounters a missing value on each node and learns which path to take for missing values in the future.
  • Tree Pruning:
    • A GBM will stop splitting a node when it encounters a negative loss in the split. Thus it is more of a greedy algorithm.
    • XGBoost, on the other hand, makes splits up to the max_depth specified and then starts pruning the tree backwards, removing splits beyond which there is no positive gain.
    • Another advantage is that sometimes a split with a negative loss, say -2, may be followed by a split with a positive loss of +10. GBM would stop as soon as it encounters the -2. But XGBoost will go deeper, see the combined effect of +8 of the splits, and keep both.
  • Built-in Cross-Validation
    • XGBoost allows the user to run cross-validation at each iteration of the boosting process, and thus it is easy to get the exact optimum number of boosting iterations in a single run.
    • This is unlike GBM, where we have to run a grid search and only a limited set of values can be tested.
  • Continue on Existing Model
    • The user can start training an XGBoost model from the last iteration of a previous run. This can be a significant advantage in certain specific applications.
    • The sklearn implementation of GBM also has this feature, so they are even on this point.

    12. Working with GBM in R and Python

Before we start working, let's quickly understand the important parameters and the working of this algorithm. This will be helpful for both R and Python users. Below is the overall pseudo-code of the GBM algorithm for 2 classes:

1. Initialize the outcome
2. Iterate from 1 to the total number of trees
  2.1 Update the weights for targets based on the previous run (higher for the ones mis-classified)
  2.2 Fit the model on a selected subsample of data
  2.3 Make predictions on the full set of observations
  2.4 Update the output with the current results, taking into account the learning rate
3. Return the final output

This is an extremely simplified (probably naive) explanation of GBM's working. But it will help every beginner understand this algorithm. A very simplified Python sketch of this loop follows.
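The sketch below mirrors the loop structure only (two classes, boosting the log-odds with small regression trees fit to the gradient); it is nowhere near a production implementation, and all names and values are mine.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_gbm(X, y, n_trees=100, learning_rate=0.1):
    pred = np.zeros(len(y))          # 1. initialize the outcome (log-odds of 0)
    trees = []
    for _ in range(n_trees):         # 2. iterate over trees
        prob = 1 / (1 + np.exp(-pred))
        residual = y - prob          # 2.1 gradient: largest for mis-classified points
        tree = DecisionTreeRegressor(max_depth=2).fit(X, residual)  # 2.2 fit model
        pred += learning_rate * tree.predict(X)  # 2.3-2.4 update with learning rate
        trees.append(tree)
    return trees, pred               # 3. final output

X = np.random.RandomState(0).randn(500, 5)
y = (X[:, 0] + X[:, 1] > 0).astype(float)
_, pred = fit_gbm(X, y)
print("train accuracy:", ((pred > 0) == y).mean())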

Let's consider the important GBM parameters used to improve model performance in Python:

  • learning_rate
    • This determines the impact of each tree on the final outcome (step 2.4). GBM works by starting with an initial estimate which is updated using the output of each tree. The learning parameter controls the magnitude of this change in the estimates.
    • Lower values are generally preferred as they make the model robust to the specific characteristics of each tree, thus allowing it to generalize well.
    • Lower values would require a higher number of trees to model all the relations, which will be computationally expensive.
  • n_estimators
    • The number of sequential trees to be modeled (step 2)
    • Though GBM is fairly robust to a higher number of trees, it can still overfit at a point. Hence, this should be tuned using CV for a particular learning rate.
  • subsample
    • The fraction of observations to be selected for each tree. Selection is done by random sampling.
    • Values slightly less than 1 make the model robust by reducing the variance.
    • Typical values ~0.8 generally work fine but can be fine-tuned further.
Apart from these, there are certain miscellaneous parameters which affect overall functionality:

  • loss
    • It refers to the loss function to be minimized in each split.
    • It can have various values for the classification and regression cases. Generally the default values work fine. Other values should be chosen only if you understand their impact on the model.
  • init
    • This affects initialization of the output.
    • This can be used if we have made another model whose outcome is to be used as the initial estimate for GBM.
  • random_state
    • The random number seed so that same random numbers are generated every time.
    • This is important for parameter tuning. If we don’t fix the random number, then we’ll have different outcomes for subsequent runs on the same parameters and it becomes difficult to compare models.
    • It can potentially result in overfitting to a particular random sample selected. We can try running models for different random samples, which is computationally expensive and generally not used.
  • verbose
    • The type of output to be printed when the model fits. The different values can be:
      • 0: no output generated (default)
      • 1: output generated for trees at certain intervals
      • >1: output generated for all trees
  • warm_start
    • This parameter has an interesting application and can help a lot if used judiciously.
    • Using this, we can fit additional trees on previous fits of a model. It can save a lot of time, and you should explore this option for advanced applications.
  • presort
    • Selects whether to presort the data for faster splits.
    • It makes the selection automatically by default, but it can be changed if needed.
  • I know it's a long list of parameters, but I have simplified it for you in an excel file which you can download from this GitHub repository. A hedged cross-validation tuning sketch follows.
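Since several of the parameters above are meant to be tuned using CV, here is a hedged sketch using GridSearchCV on a synthetic data set; the grid values are illustrative starting points, not recommendations.

from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=1000, random_state=0)

param_grid = {
    "learning_rate": [0.05, 0.1],
    "n_estimators": [100, 300],
    "subsample": [0.8, 1.0],
}
search = GridSearchCV(GradientBoostingClassifier(random_state=0),
                      param_grid, cv=5, scoring="accuracy")
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))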

For R users, using the caret package, there are 4 main tuning parameters:

  • n.trees – the number of iterations, i.e. the number of trees to grow
  • interaction.depth – the complexity of the tree, i.e. the total number of splits to perform on a tree (starting from a single node)
  • shrinkage – the learning rate. This is similar to learning_rate in Python (shown above).
  • n.minobsinnode – the minimum number of training samples required in a node to perform splitting

    GBM in R (with cross validation)

I've shared the standard code in R and Python. At your end, you'll be required to change the value of the dependent variable and the data set name used in the code below. Considering the ease of implementing GBM in R, one can easily perform tasks like cross validation and grid search with this package.

> library(caret)
> fitControl <- trainControl(method = "cv", number = 10) # 10-fold cross validation
> tune_Grid <- expand.grid(interaction.depth = 2, n.trees = 500, shrinkage = 0.1, n.minobsinnode = 10)
> set.seed(825)
> fit <- train(y_train ~ ., data = train, method = "gbm", trControl = fitControl, verbose = FALSE, tuneGrid = tune_Grid)
> predicted <- predict(fit, test, type = "prob")[,2]


    GBM in Python

#import libraries
from sklearn.ensemble import GradientBoostingClassifier #For Classification
from sklearn.ensemble import GradientBoostingRegressor #For Regression
#use GBM function
clf = GradientBoostingClassifier(n_estimators=100, learning_rate=1.0, max_depth=1)
clf.fit(X_train, y_train)


    13. Working with XGBoost in R and Python

XGBoost (eXtreme Gradient Boosting) is an advanced implementation of the gradient boosting algorithm. Its ability to do parallel computing makes it at least 10 times faster than existing gradient boosting implementations. It supports various objective functions, including regression, classification and ranking.

R Tutorial: For R users, this is a complete tutorial on XGBoost which explains the parameters along with code in R. Check Tutorial.

Python Tutorial: For Python users, this is a comprehensive tutorial on XGBoost, good to get you started. Check Tutorial.
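As a quick taste before those tutorials, here is a hedged sketch using the xgboost package's scikit-learn wrapper (assumes the package is installed, e.g. pip install xgboost); the data set is synthetic and the parameter values are illustrative.

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = make_classification(n_samples=2000, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

# reg_lambda is the L2 regularization term discussed in the previous section
model = XGBClassifier(n_estimators=300, learning_rate=0.1, max_depth=3,
                      subsample=0.8, reg_lambda=1.0)
model.fit(X_tr, y_tr)
print(round(model.score(X_val, y_val), 3))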


14. Where to practice?

Practice is the one true method of mastering any concept. Hence, you need to start practicing if you wish to master these algorithms.

Till here, you've gained significant knowledge of tree based models along with their practical implementations. It's time that you start working on them. Here are open practice problems where you can participate and check your live rankings on the leaderboard:

For Regression: Big Mart Sales Prediction

For Classification: Loan Prediction


    End Notes

Tree based algorithms are important for every data scientist to learn. In fact, tree models are known to provide the best model performance in the whole family of machine learning algorithms. In this tutorial, we covered everything up to GBM and XGBoost. And with this, we come to the end of this tutorial.

We discussed tree based modeling from scratch. We learnt the importance of decision trees and how that simplistic concept is used in boosting algorithms. For a better understanding, I would suggest you keep practicing these algorithms. Also, do keep note of the parameters associated with boosting algorithms. I hope this tutorial has enriched you with complete knowledge of tree based modeling.

Did you find this tutorial useful? If you have experience, what's the best trick you've used while working with tree based models? Feel free to share your tricks, suggestions and opinions in the comments section below.



| 日韩视频 中文字幕 视频一区 | 麻豆成人精品国产免费 | 国产suv精品一区二区五 | 亚洲第一网站男人都懂 | 无码任你躁久久久久久久 | 欧美肥老太牲交大战 | 精品久久久中文字幕人妻 | 麻豆果冻传媒2021精品传媒一区下载 | 久久精品国产99久久6动漫 | 99久久久国产精品无码免费 | 少妇愉情理伦片bd | 亚洲第一网站男人都懂 | 国产精品多人p群无码 | 少妇无码av无码专区在线观看 | 欧洲vodafone精品性 | 国产特级毛片aaaaaa高潮流水 | 国产精品va在线播放 | 国产绳艺sm调教室论坛 | 伊人久久大香线焦av综合影院 | 精品人妻中文字幕有码在线 | 精品少妇爆乳无码av无码专区 | 又大又黄又粗又爽的免费视频 | 中文字幕无码人妻少妇免费 | 国内揄拍国内精品少妇国语 | 亚洲熟妇色xxxxx欧美老妇y | 国产精品美女久久久久av爽李琼 | 国产无套粉嫩白浆在线 | 熟妇人妻无码xxx视频 | 乌克兰少妇性做爰 | 白嫩日本少妇做爰 | 国产凸凹视频一区二区 | 天堂а√在线中文在线 | 国产超级va在线观看视频 | 中国女人内谢69xxxx | 男人和女人高潮免费网站 | 日本va欧美va欧美va精品 | 国产精品毛片一区二区 | 亚洲一区二区三区偷拍女厕 | 国产av无码专区亚洲a∨毛片 | 内射爽无广熟女亚洲 | 久久精品国产大片免费观看 | 人妻熟女一区 | 亚洲国产精品一区二区美利坚 | 中国女人内谢69xxxx | 无码av最新清无码专区吞精 | 成人三级无码视频在线观看 | 亚洲乱亚洲乱妇50p | 青春草在线视频免费观看 | 两性色午夜视频免费播放 | 国产激情艳情在线看视频 | 性啪啪chinese东北女人 | 精品无码国产一区二区三区av | 妺妺窝人体色www在线小说 | 国产av无码专区亚洲a∨毛片 | 免费无码的av片在线观看 | 综合激情五月综合激情五月激情1 | 成人一在线视频日韩国产 | 国产精品-区区久久久狼 | 爽爽影院免费观看 | 婷婷综合久久中文字幕蜜桃三电影 | 精品一区二区不卡无码av | 国产综合色产在线精品 | 国产莉萝无码av在线播放 | 88国产精品欧美一区二区三区 | 国产精品人人妻人人爽 | 日本一本二本三区免费 | 欧美国产日韩亚洲中文 | ass日本丰满熟妇pics | 强奷人妻日本中文字幕 | 中文字幕人成乱码熟女app | 天天躁夜夜躁狠狠是什么心态 | 3d动漫精品啪啪一区二区中 | 精品国精品国产自在久国产87 | 国产手机在线αⅴ片无码观看 | 欧美三级a做爰在线观看 | 久久五月精品中文字幕 | 亚洲人亚洲人成电影网站色 | 无码人妻久久一区二区三区不卡 | 日本熟妇大屁股人妻 | 久久久精品欧美一区二区免费 | 成人一区二区免费视频 | 亚洲男人av香蕉爽爽爽爽 | aⅴ在线视频男人的天堂 | 久久久无码中文字幕久... | 荫蒂被男人添的好舒服爽免费视频 | 国产女主播喷水视频在线观看 | 欧美zoozzooz性欧美 | 丰满肥臀大屁股熟妇激情视频 | 精品 日韩 国产 欧美 视频 | 麻豆精品国产精华精华液好用吗 | 麻花豆传媒剧国产免费mv在线 | 国内精品九九久久久精品 | 精品午夜福利在线观看 | 水蜜桃av无码 | 亚洲男人av香蕉爽爽爽爽 | 无套内谢老熟女 | 99精品视频在线观看免费 | 国产精品亚洲а∨无码播放麻豆 | 狠狠色丁香久久婷婷综合五月 | 久久无码人妻影院 | 久久无码中文字幕免费影院蜜桃 | 国产口爆吞精在线视频 | 久久久久久九九精品久 | 中文字幕乱码人妻无码久久 | 亚洲aⅴ无码成人网站国产app | 波多野结衣一区二区三区av免费 | 77777熟女视频在线观看 а天堂中文在线官网 | 亚洲日韩中文字幕在线播放 | 一本久久伊人热热精品中文字幕 | 国产av人人夜夜澡人人爽麻豆 | 久久99精品国产麻豆 | 精品午夜福利在线观看 | 日本丰满熟妇videos | 四虎永久在线精品免费网址 | 亚洲一区二区三区偷拍女厕 | 丰满人妻被黑人猛烈进入 | 成人无码视频免费播放 | 特黄特色大片免费播放器图片 | 欧美 丝袜 自拍 制服 另类 | 久久久精品国产sm最大网站 | 国产精品igao视频网 | 97久久国产亚洲精品超碰热 | 乱人伦人妻中文字幕无码 | 性生交大片免费看女人按摩摩 | 成人免费视频在线观看 | 亚洲欧美国产精品专区久久 | 国产精品久久久久久亚洲毛片 | 国产va免费精品观看 | 亚洲小说图区综合在线 | а√资源新版在线天堂 | 亚洲一区二区三区播放 | 又大又紧又粉嫩18p少妇 | 日韩精品无码一区二区中文字幕 | 亚洲日本一区二区三区在线 | 国产sm调教视频在线观看 | 成人性做爰aaa片免费看 | 国产区女主播在线观看 | 又大又黄又粗又爽的免费视频 | 午夜福利不卡在线视频 | 荫蒂被男人添的好舒服爽免费视频 | 天下第一社区视频www日本 | 国产做国产爱免费视频 | 国产精华av午夜在线观看 | 色偷偷av老熟女 久久精品人妻少妇一区二区三区 | 久久综合九色综合欧美狠狠 | 荫蒂添的好舒服视频囗交 | 日本免费一区二区三区最新 | 亚洲日韩av一区二区三区中文 | 亚洲精品久久久久久久久久久 | 日韩在线不卡免费视频一区 | 乱人伦人妻中文字幕无码久久网 | 亚洲色大成网站www国产 | 成人无码精品一区二区三区 | 久久五月精品中文字幕 | 98国产精品综合一区二区三区 | 亚洲性无码av中文字幕 | 国产三级精品三级男人的天堂 | 国产一区二区不卡老阿姨 | 亚洲精品久久久久久久久久久 | 伊人久久大香线蕉亚洲 | 国产成人精品久久亚洲高清不卡 | 久久久精品456亚洲影院 | 国产乱人偷精品人妻a片 | 精品久久久中文字幕人妻 | 中文字幕中文有码在线 | 成年美女黄网站色大免费全看 | 成人性做爰aaa片免费看不忠 | 亚洲国产精品毛片av不卡在线 | 亲嘴扒胸摸屁股激烈网站 | 亚洲精品午夜国产va久久成人 | 久久久久99精品成人片 | 人人妻人人澡人人爽欧美一区 | 蜜桃视频韩日免费播放 | 曰本女人与公拘交酡免费视频 | 国内揄拍国内精品少妇国语 | 国产成人精品一区二区在线小狼 | 久在线观看福利视频 | 久久亚洲中文字幕无码 | 午夜精品久久久久久久久 | √天堂中文官网8在线 | 亚洲一区二区三区含羞草 | 黄网在线观看免费网站 | 激情亚洲一区国产精品 | 国产农村妇女aaaaa视频 撕开奶罩揉吮奶头视频 | 奇米综合四色77777久久 东京无码熟妇人妻av在线网址 | 中文字幕日产无线码一区 | 又大又硬又黄的免费视频 | 亚洲综合久久一区二区 | 色婷婷综合中文久久一本 | 亚洲精品美女久久久久久久 | av香港经典三级级 在线 | 日韩精品一区二区av在线 | 国产精品人人爽人人做我的可爱 | 午夜成人1000部免费视频 | 免费中文字幕日韩欧美 | 欧美成人家庭影院 | 国产网红无码精品视频 | 18无码粉嫩小泬无套在线观看 | 秋霞成人午夜鲁丝一区二区三区 | 欧美性生交活xxxxxdddd | 三上悠亚人妻中文字幕在线 | 波多野结衣aⅴ在线 | 国产明星裸体无码xxxx视频 | 美女毛片一区二区三区四区 | 精品欧洲av无码一区二区三区 | 欧美精品一区二区精品久久 | 久激情内射婷内射蜜桃人妖 | 国产精品无码mv在线观看 | 国产乱子伦视频在线播放 | 97精品人妻一区二区三区香蕉 | 永久免费精品精品永久-夜色 | 亚洲区小说区激情区图片区 | 国产精品.xx视频.xxtv | 性欧美牲交xxxxx视频 | 一个人免费观看的www视频 | 欧美日韩在线亚洲综合国产人 | 美女黄网站人色视频免费国产 | 国产极品美女高潮无套在线观看 | 高清国产亚洲精品自在久久 | 日本一本二本三区免费 | 国产精品-区区久久久狼 | 久久久久亚洲精品中文字幕 | 成 人 网 站国产免费观看 | 亚洲欧美综合区丁香五月小说 | 帮老师解开蕾丝奶罩吸乳网站 | 国产舌乚八伦偷品w中 | 欧美日韩一区二区综合 | 亚洲欧美中文字幕5发布 | 色爱情人网站 | 欧美三级不卡在线观看 | 国产精品手机免费 | 人妻体内射精一区二区三四 | 国产精品视频免费播放 | 欧美猛少妇色xxxxx | 曰本女人与公拘交酡免费视频 | 日韩无码专区 | 
无码精品国产va在线观看dvd | 一本色道久久综合亚洲精品不卡 | 欧美激情一区二区三区成人 | 日韩少妇内射免费播放 | 久久久成人毛片无码 | 日本xxxx色视频在线观看免费 | 人妻中文无码久热丝袜 | 国产精品99久久精品爆乳 | 婷婷丁香五月天综合东京热 | 99久久亚洲精品无码毛片 | 特大黑人娇小亚洲女 | 国产在线aaa片一区二区99 | 免费人成网站视频在线观看 | av无码不卡在线观看免费 | 丝袜足控一区二区三区 | 亚洲国产欧美日韩精品一区二区三区 | 色综合久久久久综合一本到桃花网 | 无码成人精品区在线观看 | 日韩亚洲欧美精品综合 | 内射爽无广熟女亚洲 | 真人与拘做受免费视频 | 国产精品沙发午睡系列 | 美女扒开屁股让男人桶 | 无码播放一区二区三区 | 国产网红无码精品视频 | 亚洲日韩av一区二区三区中文 | 小sao货水好多真紧h无码视频 | 九月婷婷人人澡人人添人人爽 | 午夜精品久久久久久久久 | 色婷婷香蕉在线一区二区 | 国产亚洲人成a在线v网站 | 疯狂三人交性欧美 | 久久久久成人片免费观看蜜芽 | 国内少妇偷人精品视频 | 国产激情精品一区二区三区 | 成人欧美一区二区三区黑人免费 | av香港经典三级级 在线 | 亚洲一区二区三区播放 | 午夜精品久久久久久久 | 欧美 亚洲 国产 另类 | 亚洲а∨天堂久久精品2021 | 久久国产精品精品国产色婷婷 | 啦啦啦www在线观看免费视频 | 性欧美大战久久久久久久 | 欧美激情内射喷水高潮 | 98国产精品综合一区二区三区 | 亚洲毛片av日韩av无码 | 又大又硬又爽免费视频 | 亚洲欧美综合区丁香五月小说 | 成人精品视频一区二区 | 日韩av无码一区二区三区不卡 | 99久久精品国产一区二区蜜芽 | 性欧美牲交xxxxx视频 | 欧美三级a做爰在线观看 | 国产婷婷色一区二区三区在线 | 久久国产精品精品国产色婷婷 | 俺去俺来也www色官网 | 亚洲aⅴ无码成人网站国产app | 老太婆性杂交欧美肥老太 | 青青青手机频在线观看 | 亚洲精品一区国产 | 欧美放荡的少妇 | 国产福利视频一区二区 | 老熟女重囗味hdxx69 | 亚洲人成网站色7799 | 熟女体下毛毛黑森林 | 日韩欧美中文字幕在线三区 | 亚洲精品一区二区三区婷婷月 | 狠狠色欧美亚洲狠狠色www | 久久久无码中文字幕久... | 人妻aⅴ无码一区二区三区 | 高清无码午夜福利视频 | 久久99久久99精品中文字幕 | 色爱情人网站 | 国产内射爽爽大片视频社区在线 | 国产精品第一国产精品 | 国产69精品久久久久app下载 | 性生交大片免费看l | 日本一卡2卡3卡4卡无卡免费网站 国产一区二区三区影院 | 大色综合色综合网站 | 精品成在人线av无码免费看 | 色综合久久久久综合一本到桃花网 | 亚洲а∨天堂久久精品2021 | 亚洲欧美色中文字幕在线 | 久久天天躁狠狠躁夜夜免费观看 | 99久久精品无码一区二区毛片 | 日本熟妇大屁股人妻 | 波多野结衣乳巨码无在线观看 | 玩弄中年熟妇正在播放 | 国内精品九九久久久精品 | 狂野欧美性猛交免费视频 | 国产成人无码午夜视频在线观看 | 午夜熟女插插xx免费视频 | 久久99精品久久久久久动态图 | 中文字幕无码av激情不卡 | 国产精华av午夜在线观看 | 青青青爽视频在线观看 | 美女黄网站人色视频免费国产 | 一本久久a久久精品vr综合 | 日本大香伊一区二区三区 | 特黄特色大片免费播放器图片 | 18精品久久久无码午夜福利 | 九九热爱视频精品 | 欧洲vodafone精品性 | 成人精品视频一区二区三区尤物 | 国产精品人妻一区二区三区四 | 亚洲欧洲无卡二区视頻 | 亚洲中文字幕成人无码 | 国精品人妻无码一区二区三区蜜柚 | 成人欧美一区二区三区黑人免费 | 欧美xxxx黑人又粗又长 | 国产成人精品一区二区在线小狼 | 老熟妇乱子伦牲交视频 | 在线а√天堂中文官网 | 国产九九九九九九九a片 | 精品一区二区三区无码免费视频 | 亚洲精品一区三区三区在线观看 | 在线精品亚洲一区二区 | 精品无码成人片一区二区98 | 日本精品久久久久中文字幕 | 丰腴饱满的极品熟妇 | 亚洲国产高清在线观看视频 | 国产亚洲日韩欧美另类第八页 | 久久伊人色av天堂九九小黄鸭 | 老熟妇仑乱视频一区二区 | 欧美一区二区三区 | 免费播放一区二区三区 | 午夜熟女插插xx免费视频 | 捆绑白丝粉色jk震动捧喷白浆 | 无码人妻精品一区二区三区下载 | 久激情内射婷内射蜜桃人妖 | 人妻无码久久精品人妻 | 性史性农村dvd毛片 | 国产精品高潮呻吟av久久 | 丝袜足控一区二区三区 | 人人妻人人澡人人爽欧美一区 | 狠狠色色综合网站 | 久久精品国产一区二区三区肥胖 | 国产成人无码区免费内射一片色欲 | 精品乱子伦一区二区三区 | 无码人妻出轨黑人中文字幕 | 中文字幕无码av激情不卡 | 久久99精品国产麻豆 | 伊人久久婷婷五月综合97色 | 国产午夜精品一区二区三区嫩草 | 成熟妇人a片免费看网站 | 99久久99久久免费精品蜜桃 | 国产偷国产偷精品高清尤物 | 在线天堂新版最新版在线8 | 久久久久亚洲精品男人的天堂 | 97精品人妻一区二区三区香蕉 | 精品国产乱码久久久久乱码 | 精品少妇爆乳无码av无码专区 | 在线观看国产一区二区三区 | 久久综合香蕉国产蜜臀av | 蜜桃无码一区二区三区 | 波多野42部无码喷潮在线 | 熟妇女人妻丰满少妇中文字幕 | 一个人看的视频www在线 | 日本爽爽爽爽爽爽在线观看免 | 久久精品丝袜高跟鞋 | 精品偷自拍另类在线观看 | 国产精品久久久久7777 | 国产精品人人妻人人爽 | 亚洲一区二区观看播放 | 亚洲小说春色综合另类 | 日韩亚洲欧美精品综合 | 欧美老人巨大xxxx做受 | 黑人巨大精品欧美一区二区 | 中文字幕人妻无码一夲道 | 又大又黄又粗又爽的免费视频 | 午夜理论片yy44880影院 | 国产乱人无码伦av在线a | 中文精品无码中文字幕无码专区 | 又粗又大又硬毛片免费看 | 亚洲无人区一区二区三区 | 图片小说视频一区二区 | 久久亚洲精品中文字幕无男同 | 亚拍精品一区二区三区探花 | 国产三级久久久精品麻豆三级 | 国产av剧情md精品麻豆 | 欧美猛少妇色xxxxx | 波多野结衣av在线观看 | 奇米综合四色77777久久 东京无码熟妇人妻av在线网址 | 狠狠色噜噜狠狠狠7777奇米 | 亚洲欧美中文字幕5发布 | 亚洲精品久久久久久一区二区 | 日产精品99久久久久久 | 欧美激情内射喷水高潮 | 又色又爽又黄的美女裸体网站 | 久久国产精品二国产精品 | 欧美高清在线精品一区 | 日本护士xxxxhd少妇 | 国产农村乱对白刺激视频 | av香港经典三级级 在线 | 性开放的女人aaa片 | 天天综合网天天综合色 | 人妻无码久久精品人妻 | 波多野结衣一区二区三区av免费 | 日韩欧美群交p片內射中文 | 99久久久无码国产aaa精品 | 免费无码一区二区三区蜜桃大 | 欧美日本免费一区二区三区 | 国产乱人偷精品人妻a片 | 久久久精品人妻久久影视 | 成人试看120秒体验区 | 亚洲无人区午夜福利码高清完整版 | 日韩亚洲欧美中文高清在线 | 中文字幕亚洲情99在线 | 精品国产一区二区三区四区 | 亚洲春色在线视频 | 国产精品99爱免费视频 | 国产精品久久久av久久久 | 5858s亚洲色大成网站www | 日欧一片内射va在线影院 | 成年美女黄网站色大免费全看 | 强开小婷嫩苞又嫩又紧视频 | 狂野欧美激情性xxxx | 国产精品亚洲а∨无码播放麻豆 | 国产婷婷色一区二区三区在线 | 无码毛片视频一区二区本码 | 亚洲自偷精品视频自拍 | ass日本丰满熟妇pics | 少妇性荡欲午夜性开放视频剧场 | 久久精品中文字幕大胸 | 日韩视频 中文字幕 视频一区 | 丰满人妻翻云覆雨呻吟视频 | 色综合久久久无码网中文 | 精品无人区无码乱码毛片国产 | 久久亚洲中文字幕精品一区 | 亚洲一区二区三区无码久久 | 欧美国产亚洲日韩在线二区 | 男女作爱免费网站 | 日韩精品a片一区二区三区妖精 | 无码人妻精品一区二区三区下载 | 亚洲国产精品成人久久蜜臀 | 午夜免费福利小电影 | 
少妇被黑人到高潮喷出白浆 | 中文字幕人妻无码一夲道 | 小泽玛莉亚一区二区视频在线 | 久久国产劲爆∧v内射 | 精品aⅴ一区二区三区 | 中文精品无码中文字幕无码专区 | 成人免费视频一区二区 | 欧美丰满少妇xxxx性 | 精品少妇爆乳无码av无码专区 | 丰满人妻一区二区三区免费视频 | 5858s亚洲色大成网站www | 高潮毛片无遮挡高清免费视频 | 国产区女主播在线观看 | ass日本丰满熟妇pics | 久久精品国产日本波多野结衣 | 99国产精品白浆在线观看免费 |