
prep(), bake(), juice(): the three steps, clearly explained

 医学数据科学 2021-04-26

Let's walk through what each of these functions does. First, let's define a recipe with a couple of steps. Notice that the data going into the recipe is the training data.

library(recipes)

cars_train <- mtcars[1:20,]
cars_test <- mtcars[21:32,]

cars_rec <- recipe(mpg ~ ., data = cars_train) %>%
  step_log(disp) %>%
  step_center(all_predictors())
cars_rec
#> Data Recipe
#> 
#> Inputs:
#> 
#>       role #variables
#>    outcome          1
#>  predictor         10
#> 
#> Operations:
#> 
#> Log transformation on disp
#> Centering for all_predictors

The preprocessing recipe cars_rec has been defined but no values have been estimated. For example, the log has not been taken for disp, and the mean has not been calculated for predictors so that they can be centered.
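As a quick aside (my own sanity check, not part of the original answer): because nothing has been trained yet, you cannot apply this recipe to data. Asking bake() for results at this stage should fail and tell you to prep() the recipe first.

# The steps in cars_rec have not been trained yet, so this should error
# (the exact message depends on your recipes version):
# bake(cars_rec, new_data = cars_test)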

The prep() function takes that defined object and computes everything so that the preprocessing steps can be executed. For example, the mean of each predictor is calculated in this example so the predictors can be centered. This is done with the training data.

cars_prep <- prep(cars_rec)
cars_prep
#> Data Recipe
#> 
#> Inputs:
#> 
#>       role #variables
#>    outcome          1
#>  predictor         10
#> 
#> Training data contained 20 data points and no missing data.
#> 
#> Operations:
#> 
#> Log transformation on disp [trained]
#> Centering for cyl, disp, hp, drat, wt, qsec, vs, am, ... [trained]

Notice that before, with the unprepped recipe, it just said Centering for all_predictors because it had not been evaluated yet. Now it has been evaluated and we know which columns are predictors and what their means are.
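If you want to look at exactly what was estimated, a small sketch (using the tidy() methods that recipes provides for recipes and their steps) is to tidy the prepped recipe and pull out the centering step:

tidy(cars_prep)               # one row per step, showing which have been trained
tidy(cars_prep, number = 2)   # the per-column means estimated for centering

The second call returns one row per predictor with the mean that will be subtracted when the recipe is applied.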

The bake() and juice() functions both return data, not a preprocessing recipe object. The bake() function takes a prepped recipe (one that has had all quantities estimated from training data) and applies it to new_data. That new_data could be the training data again...

bake(cars_prep, new_data = cars_train)
#> # A tibble: 20 x 11
#>      cyl   disp    hp   drat      wt   qsec    vs    am  gear   carb   mpg
#>    <dbl>  <dbl> <dbl>  <dbl>   <dbl>  <dbl> <dbl> <dbl> <dbl>  <dbl> <dbl>
#>  1  -0.2 -0.222 -26.2  0.355 -0.778  -1.98   -0.5   0.7   0.5  1.30   21  
#>  2  -0.2 -0.222 -26.2  0.355 -0.523  -1.42   -0.5   0.7   0.5  1.30   21  
#>  3  -2.2 -0.615 -43.2  0.305 -1.08    0.169   0.5   0.7   0.5 -1.7    22.8
#>  4  -0.2  0.256 -26.2 -0.465 -0.183   0.999   0.5  -0.3  -0.5 -1.7    21.4
#>  5   1.8  0.589  38.8 -0.395  0.0415 -1.42   -0.5  -0.3  -0.5 -0.7    18.7
#>  6  -0.2  0.119 -31.2 -0.785  0.0615  1.78    0.5  -0.3  -0.5 -1.7    18.1
#>  7   1.8  0.589 109.  -0.335  0.172  -2.60   -0.5  -0.3  -0.5  1.30   14.3
#>  8  -2.2 -0.309 -74.2  0.145 -0.208   1.56    0.5  -0.3   0.5 -0.7    24.4
#>  9  -2.2 -0.350 -41.2  0.375 -0.248   4.46    0.5  -0.3   0.5 -0.7    22.8
#> 10  -0.2 -0.176 -13.2  0.375  0.0415 -0.141   0.5  -0.3   0.5  1.30   19.2
#> 11  -0.2 -0.176 -13.2  0.375  0.0415  0.459   0.5  -0.3   0.5  1.30   17.8
#> 12   1.8  0.323  43.8 -0.475  0.672  -1.04   -0.5  -0.3  -0.5  0.300  16.4
#> 13   1.8  0.323  43.8 -0.475  0.332  -0.841  -0.5  -0.3  -0.5  0.300  17.3
#> 14   1.8  0.323  43.8 -0.475  0.382  -0.441  -0.5  -0.3  -0.5  0.300  15.2
#> 15   1.8  0.860  68.8 -0.615  1.85   -0.461  -0.5  -0.3  -0.5  1.30   10.4
#> 16   1.8  0.834  78.8 -0.545  2.03   -0.621  -0.5  -0.3  -0.5  1.30   10.4
#> 17   1.8  0.790  93.8 -0.315  1.95   -1.02   -0.5  -0.3  -0.5  1.30   14.7
#> 18  -2.2 -0.932 -70.2  0.535 -1.20    1.03    0.5   0.7   0.5 -1.7    32.4
#> 19  -2.2 -0.970 -84.2  1.38  -1.78    0.079   0.5   0.7   0.5 -0.7    30.4
#> 20  -2.2 -1.03  -71.2  0.675 -1.56    1.46    0.5   0.7   0.5 -1.7    33.9

Or it could be the testing data. In this case, the column means from the training data are applied to the testing data, because that is what happens IRL in a modeling workflow. To do otherwise is data leakage.

bake(cars_prep, new_data = cars_test)
#> # A tibble: 12 x 11
#>      cyl   disp    hp     drat      wt   qsec    vs    am  gear  carb   mpg
#>    <dbl>  <dbl> <dbl>    <dbl>   <dbl>  <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
#>  1  -2.2 -0.509 -39.2  0.155   -0.933   1.57    0.5  -0.3  -0.5 -1.7   21.5
#>  2   1.8  0.465  13.8 -0.785    0.122  -1.57   -0.5  -0.3  -0.5 -0.7   15.5
#>  3   1.8  0.420  13.8 -0.395    0.0366 -1.14   -0.5  -0.3  -0.5 -0.7   15.2
#>  4   1.8  0.561 109.   0.185    0.442  -3.03   -0.5  -0.3  -0.5  1.30  13.3
#>  5   1.8  0.694  38.8 -0.465    0.447  -1.39   -0.5  -0.3  -0.5 -0.7   19.2
#>  6  -2.2 -0.928 -70.2  0.535   -1.46    0.459   0.5   0.7   0.5 -1.7   27.3
#>  7  -2.2 -0.507 -45.2  0.885   -1.26   -1.74   -0.5   0.7   1.5 -0.7   26  
#>  8  -2.2 -0.742 -23.2  0.225   -1.89   -1.54    0.5   0.7   1.5 -0.7   30.4
#>  9   1.8  0.564 128.   0.675   -0.228  -3.94   -0.5   0.7   1.5  1.30  15.8
#> 10  -0.2 -0.320  38.8  0.075   -0.628  -2.94   -0.5   0.7   1.5  3.3   19.7
#> 11   1.8  0.410 199.  -0.00500  0.172  -3.84   -0.5   0.7   1.5  5.3   15  
#> 12  -2.2 -0.501 -27.2  0.565   -0.618   0.159   0.5   0.7   0.5 -0.7   21.4
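To convince yourself that the test set really was centered with the training means, here is a quick hand check (my addition, not part of the original answer): redo the disp column manually by log-transforming it and then subtracting the mean of log(disp) computed on the training data.

# The centering constant for disp is estimated on the *training* data,
# after the log step has already been applied.
train_disp_mean <- mean(log(cars_train$disp))
log(cars_test$disp) - train_disp_mean

These values should line up with the disp column in the baked test set above; if the recipe had (incorrectly) re-estimated the mean on the test set, they would not.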

The juice() function is a nice little shortcut: since the prepped recipe was estimated from the training data, you can get the processed training data straight back out of it. Picture yourself squeezing the prepped recipe to extract the training data you used to estimate the preprocessing parameters in the first place.

juice(cars_prep)
#> # A tibble: 20 x 11
#>      cyl   disp    hp   drat      wt   qsec    vs    am  gear   carb   mpg
#>    <dbl>  <dbl> <dbl>  <dbl>   <dbl>  <dbl> <dbl> <dbl> <dbl>  <dbl> <dbl>
#>  1  -0.2 -0.222 -26.2  0.355 -0.778  -1.98   -0.5   0.7   0.5  1.30   21  
#>  2  -0.2 -0.222 -26.2  0.355 -0.523  -1.42   -0.5   0.7   0.5  1.30   21  
#>  3  -2.2 -0.615 -43.2  0.305 -1.08    0.169   0.5   0.7   0.5 -1.7    22.8
#>  4  -0.2  0.256 -26.2 -0.465 -0.183   0.999   0.5  -0.3  -0.5 -1.7    21.4
#>  5   1.8  0.589  38.8 -0.395  0.0415 -1.42   -0.5  -0.3  -0.5 -0.7    18.7
#>  6  -0.2  0.119 -31.2 -0.785  0.0615  1.78    0.5  -0.3  -0.5 -1.7    18.1
#>  7   1.8  0.589 109.  -0.335  0.172  -2.60   -0.5  -0.3  -0.5  1.30   14.3
#>  8  -2.2 -0.309 -74.2  0.145 -0.208   1.56    0.5  -0.3   0.5 -0.7    24.4
#>  9  -2.2 -0.350 -41.2  0.375 -0.248   4.46    0.5  -0.3   0.5 -0.7    22.8
#> 10  -0.2 -0.176 -13.2  0.375  0.0415 -0.141   0.5  -0.3   0.5  1.30   19.2
#> 11  -0.2 -0.176 -13.2  0.375  0.0415  0.459   0.5  -0.3   0.5  1.30   17.8
#> 12   1.8  0.323  43.8 -0.475  0.672  -1.04   -0.5  -0.3  -0.5  0.300  16.4
#> 13   1.8  0.323  43.8 -0.475  0.332  -0.841  -0.5  -0.3  -0.5  0.300  17.3
#> 14   1.8  0.323  43.8 -0.475  0.382  -0.441  -0.5  -0.3  -0.5  0.300  15.2
#> 15   1.8  0.860  68.8 -0.615  1.85   -0.461  -0.5  -0.3  -0.5  1.30   10.4
#> 16   1.8  0.834  78.8 -0.545  2.03   -0.621  -0.5  -0.3  -0.5  1.30   10.4
#> 17   1.8  0.790  93.8 -0.315  1.95   -1.02   -0.5  -0.3  -0.5  1.30   14.7
#> 18  -2.2 -0.932 -70.2  0.535 -1.20    1.03    0.5   0.7   0.5 -1.7    32.4
#> 19  -2.2 -0.970 -84.2  1.38  -1.78    0.079   0.5   0.7   0.5 -0.7    30.4
#> 20  -2.2 -1.03  -71.2  0.675 -1.56    1.46    0.5   0.7   0.5 -1.7    33.9
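One version-dependent note I'll add: in more recent releases of recipes, juice() is superseded, and baking the prepped recipe with new_data = NULL is the recommended way to get the same processed training data back.

# In newer recipes versions, this is equivalent to juice(cars_prep):
bake(cars_prep, new_data = NULL)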

Created on 2020-06-04 by the reprex package (v0.3.0)
