Discussion of ‘how to make this a meta-analysis’ here
Note that we are aware of issues surrounding
“meta-analyzing work including one’s own research”
Challenges incorporating the ‘grey literature’ studies below (not yet published or peer reviewed, maybe not even indexed)
(The code below brings together these data)
For each study, collect (if applicable) or generate:
- donation (amount)
- d_donation (whether the participant donated; this can be inferred from donation)
- treat_eff_info (whether given ‘effectiveness information’)
- (shared demographics; other major treatments; any ‘noncompliance or other filtering’-type variables)
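A minimal sketch of the harmonized structure described above. The column names `donation`, `d_donation`, and `treat_eff_info` come from the list; the per-study tibbles, their values, and the `study` labels are hypothetical placeholders, not the actual data:

```r
library(dplyr)

# Hypothetical per-study tibbles already cleaned to the common schema;
# the rows and study labels are illustrative only
karlan_ffh_min <- tibble(
  study = "karlan_ffh",
  donation = c(0, 25, 0),
  treat_eff_info = c(TRUE, TRUE, FALSE)
)
kassirer_min <- tibble(
  study = "kassirer_karlan",
  donation = c(5, 0),
  treat_eff_info = c(FALSE, TRUE)
)

# Stack the studies and derive d_donation, which is implied by donation > 0
all_studies <- bind_rows(karlan_ffh_min, kassirer_min) %>%
  mutate(d_donation = donation > 0)
```

Deriving `d_donation` after the bind (rather than per study) keeps the inference rule in one place.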
Stata import from
/other_experimental_data/karlan_wood_dataverse_files/Effectiveness Dataverse Files/Effectiveness Dataverse Files/data/FFH_analysis.dta
```{r}
p_load("readstata13", "dplyr", "lubridate", "here")

karlan_ffh <- readstata13::read.dta13(
  here::here("other_experimental_data", "karlan_wood_dataverse_files",
    "Effectiveness Dataverse Files", "Effectiveness Dataverse Files",
    "data", "FFH_analysis.dta"),
  convert.factors = TRUE, generate.factors = TRUE,
  encoding = "UTF-8", fromEncoding = NULL,
  convert.underscore = FALSE, missing.type = TRUE,
  convert.dates = FALSE, replace.strl = TRUE,
  add.rownames = FALSE, nonint.factors = FALSE,
  select.rows = NULL, select.cols = NULL,
  strlexport = FALSE, strlpath = ".") %>%
  # Conversion of Stata dates (Stata's %td origin is 1960-01-01) ####
  # needlessly fancy; only 2 date vars
  mutate(across(contains("_date"),
    ~ as.Date("1960-01-01") + days(as.integer(.x)))) %>%
  dplyr::as_tibble()
```
# rdr_cbk("full_codebook_karlan.Rmd") #THIS doesn't mesh with bookdown
dp_sx_mt, created in
… reconstruct minimal creation steps from raw data here, if possible.
dv_ranks, created in
(Maybe … reconstruct minimal creation steps from raw data here, if possible.)
- Kassirer and Karlan multi-charity choice field experiment (OSF link)
Sam Kassirer and Dean Karlan ran a large-scale trial … roughly 1 million video-gamers could each make a choice among 3 charities, with roughly $1 million distributed. The environmental charity (ERP) was preferred by substantially more people than the mental health charity (MHA); the education charity supporting diversity in tech (Code.org) was by far the least favored.
One of their important between-participant treatment conditions was “whether one of the charity descriptions included ImpactMatters impact information” (by the way, this info is/was rather overoptimistic IMO).
Their interventions seem to have had ‘null effects’, and these effects (need to check) seem likely to be bounded close to zero: the focal charity did not become more likely to be chosen when the impact information was provided. The ‘frames’ used to motivate giving (vote for the charity that “does the most good”, vs. the one that makes you “feel the most good” to support, vs. no frame) also seem to have had little direct or interaction effect.
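One way to check whether an effect is ‘bounded close to zero’ is to look at the confidence interval on the difference in choice shares between the impact-information and no-information arms. The cell counts below are made-up placeholders for illustration, not the study’s data:

```r
# Hypothetical counts (NOT the actual study data): number of participants
# choosing the focal charity, out of n per arm, with vs. without impact info
chose_info    <- 4100; n_info    <- 10000
chose_no_info <- 4000; n_no_info <- 10000

# Two-sample test of proportions; the CI on the difference in shares
# gives the bounds on the treatment effect
pt <- prop.test(c(chose_info, chose_no_info), c(n_info, n_no_info))
pt$conf.int  # if this interval sits inside, say, [-0.02, 0.02],
             # the effect is bounded close to zero
```

If the interval is both narrow and centered near zero, the ‘null effect’ claim is a bounded one rather than just a failure to reject.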
- OftW ‘upselling’ email trial (add here, if you have access)