
Commit 7c253c4

Merge branch 'main' into 0.5.X
2 parents: 15ba87f + c9a67bc

4 files changed: +24, -13 lines changed


DESCRIPTION

Lines changed: 3 additions & 2 deletions
@@ -6,7 +6,8 @@ Authors@R: c(
     person("Philipp", "Bach", email = "philipp.bach@uni-hamburg.de", role=c("aut", "cre")),
     person("Victor", "Chernozhukov", role="aut"),
     person("Malte S.", "Kurz", email = "mkurz-software@gmx.de", role="aut"),
-    person("Martin", "Spindler", email="martin.spindler@gmx.de", role="aut"))
+    person("Martin", "Spindler", email="martin.spindler@gmx.de", role="aut"),
+    person("Klaassen", "Sven", email="sven.klaassen@uni-hamburg.de", role="aut"))
 Description: Implementation of the double/debiased machine learning framework of
     Chernozhukov et al. (2018) <doi:10.1111/ectj.12097> for partially linear
     regression models, partially linear instrumental variable regression models,
@@ -36,7 +37,7 @@ Imports:
     mlr3learners (>= 0.3.0),
     mlr3misc
 Roxygen: list(markdown = TRUE, r6 = TRUE)
-RoxygenNote: 7.2.1
+RoxygenNote: 7.2.3
 Suggests:
     knitr,
     rmarkdown,

LICENSE

Lines changed: 2 additions & 2 deletions
@@ -1,2 +1,2 @@
-YEAR: 2019-2021
-COPYRIGHT HOLDER: Philipp Bach, Victor Chernozhukov, Malte S. Kurz, Martin Spindler
+YEAR: 2019-2023
+COPYRIGHT HOLDER: Philipp Bach, Victor Chernozhukov, Malte S. Kurz, Martin Spindler, Sven Klaassen

man/DoubleML.Rd

Lines changed: 10 additions & 1 deletion
Some generated files are not rendered by default.

vignettes/getstarted.Rmd

Lines changed: 9 additions & 8 deletions
@@ -79,7 +79,7 @@ dml_data_bonus = DoubleMLData$new(df_bonus,
 print(dml_data_bonus)
 
 # matrix interface to DoubleMLData
-dml_data_sim = double_ml_data_from_matrix(X=X, y=y, d=d)
+dml_data_sim = double_ml_data_from_matrix(X = X, y = y, d = d)
 dml_data_sim
 ```
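For readers skimming this hunk in isolation, here is a minimal, self-contained sketch of the matrix interface. The simulated `X`, `y`, and `d` below are placeholders standing in for the data the vignette generates earlier; only the `double_ml_data_from_matrix(X = X, y = y, d = d)` call is taken from the vignette itself.

```r
library(DoubleML)

# placeholder simulated data (the vignette builds its own X, y, d earlier)
set.seed(1234)
n_obs = 500
n_vars = 20
X = matrix(rnorm(n_obs * n_vars), nrow = n_obs, ncol = n_vars)
d = X[, 1] + rnorm(n_obs)
y = 0.5 * d + X[, 1] + rnorm(n_obs)

# wrap the matrices into a DoubleMLData object via the matrix interface
dml_data_sim = double_ml_data_from_matrix(X = X, y = y, d = d)
dml_data_sim
```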
@@ -94,12 +94,12 @@ library(mlr3learners)
 # surpress messages from mlr3 package during fitting
 lgr::get_logger("mlr3")$set_threshold("warn")
 
-learner = lrn("regr.ranger", num.trees=500, mtry=floor(sqrt(n_vars)), max.depth=5, min.node.size=2)
-ml_g_bonus = learner$clone()
+learner = lrn("regr.ranger", num.trees = 500, max.depth = 5, min.node.size = 2)
+ml_l_bonus = learner$clone()
 ml_m_bonus = learner$clone()
 
 learner = lrn("regr.glmnet", lambda = sqrt(log(n_vars)/(n_obs)))
-ml_g_sim = learner$clone()
+ml_l_sim = learner$clone()
 ml_m_sim = learner$clone()
 ```
@@ -111,9 +111,10 @@ When initializing the object for PLR models `DoubleMLPLR`, we can further set pa
 * The number of folds used for cross-fitting `n_folds` (defaults to `n_folds = 5`) as well as
 * the number of repetitions when applying repeated cross-fitting `n_rep` (defaults to `n_rep = 1`).
 
-Additionally, one can choose between the algorithms `"dml1"` and `"dml2"` via `dml_procedure` (defaults to `"dml2"`). Depending on the causal model, one can further choose between different Neyman-orthogonal score / moment functions. For the PLR model the default score is `"partialling out"`.
+Additionally, one can choose between the algorithms `"dml1"` and `"dml2"` via `dml_procedure` (defaults to `"dml2"`). Depending on the causal model, one can further choose between different Neyman-orthogonal score / moment functions. For the PLR model the default score is `"partialling out"`, i.e.,
+\begin{align}\begin{aligned}\psi(W; \theta, \eta) &:= [Y - \ell(X) - \theta (D - m(X))] [D - m(X)].\end{aligned}\end{align}
 
-The user guide provides details about the Sample-splitting, cross-fitting and repeated cross-fitting, the Double machine learning algorithms and the Score functions
+Note that with this score, we do not estimate $g_0(X)$ directly, but the conditional expectation of $Y$ given $X$, $\ell_0(X) = E[Y|X]$. The user guide provides details about the Sample-splitting, cross-fitting and repeated cross-fitting, the Double machine learning algorithms and the Score functions
 
 
 ## Estimate double/debiased machine learning models
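The hunk above describes the tuning options `n_folds`, `n_rep`, `dml_procedure`, and `score`. As a sketch of how these options fit together when constructing a `DoubleMLPLR` object (not part of this diff; the object name `obj_dml_plr_custom` and the specific non-default values are illustrative):

```r
# illustrative, non-default settings for the options described above;
# argument names follow the vignette text (n_folds, n_rep, dml_procedure, score)
obj_dml_plr_custom = DoubleMLPLR$new(dml_data_bonus,
                                     ml_l = ml_l_bonus,
                                     ml_m = ml_m_bonus,
                                     n_folds = 3,
                                     n_rep = 2,
                                     dml_procedure = "dml1",
                                     score = "partialling out")
obj_dml_plr_custom$fit()
```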
@@ -122,11 +123,11 @@ We now initialize `DoubleMLPLR` objects for our examples using default parameter
 
 ```{r}
 set.seed(3141)
-obj_dml_plr_bonus = DoubleMLPLR$new(dml_data_bonus, ml_g=ml_g_bonus, ml_m=ml_m_bonus)
+obj_dml_plr_bonus = DoubleMLPLR$new(dml_data_bonus, ml_l = ml_l_bonus, ml_m = ml_m_bonus)
 obj_dml_plr_bonus$fit()
 print(obj_dml_plr_bonus)
 
-obj_dml_plr_sim = DoubleMLPLR$new(dml_data_sim, ml_g=ml_g_sim, ml_m=ml_m_sim)
+obj_dml_plr_sim = DoubleMLPLR$new(dml_data_sim, ml_l = ml_l_sim, ml_m = ml_m_sim)
 obj_dml_plr_sim$fit()
 print(obj_dml_plr_sim)
 ```
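After fitting, the estimated coefficients and their inference can be inspected on the fitted objects. A brief usage sketch (not part of this diff; `summary()` and `confint()` are methods of the fitted DoubleML object, and the 95% level is just an example):

```r
# coefficient table with standard errors, t-statistics and p-values
obj_dml_plr_bonus$summary()

# pointwise confidence intervals for the target coefficient
obj_dml_plr_bonus$confint(level = 0.95)
```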
