8 Meta-analysis: Impact of impact information
Discussion of ‘how to make this a meta-analysis’ goes here.
Note that we are aware of the issues surrounding “meta-analyzing work that includes one’s own research”.
There are also challenges in incorporating the ‘grey literature’ studies below (not yet published or peer-reviewed, and perhaps not even indexed).
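As a placeholder for that discussion, here is a minimal sketch of the kind of pooled model we have in mind, using the metafor package; the input data frame study_effects and its columns (yi = per-study estimate of the effectiveness-information effect, vi = its sampling variance) are hypothetical at this point.

```r
# Minimal sketch, not a committed specification: random-effects meta-analysis
# of the 'effectiveness information' treatment effect across studies.
# 'study_effects' (columns: study, yi, vi) is a hypothetical input here.
library(metafor)

res <- rma(yi = yi, vi = vi, data = study_effects,
           slab = study, method = "REML")
summary(res)  # pooled estimate plus heterogeneity stats (tau^2, I^2)
forest(res)   # per-study and pooled effects
```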
8.1 Preliminary data collection and scoping
(The code below brings this data together.)
For each study, collect (if applicable) or generate the following (see the schema sketch after this list):
- study
- wave
- treatment
- donation (amount)
- d_donation (whether donated – obviously it can be inferred from donation)
- treat_eff_info (whether given ‘effectiveness information’)
- (other demographics in common; other major treatments, any ‘noncompliance or other filtering’ type variable)
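A minimal sketch of the target combined structure, with hypothetical example rows purely to illustrate the column types (names follow the list above):

```r
# Hypothetical rows only; the real values come from the study datasets below.
library(tibble)

meta_schema <- tribble(
  ~study,            ~wave, ~treatment,      ~donation, ~d_donation, ~treat_eff_info,
  "karlan_wood",         1, "control",               0,       FALSE,           FALSE,
  "karlan_wood",         1, "effectiveness",        25,        TRUE,            TRUE,
  "bergh_reinstein",     1, "impact_info",          10,        TRUE,            TRUE
)
```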
8.1.1 Bring in and clean Karlan and Wood data
Stata import from /other_experimental_data/karlan_wood_dataverse_files/Effectiveness Dataverse Files/Effectiveness Dataverse Files/data/FFH_analysis.dta
p_load("readstata13")

karlan_ffh <- readstata13::read.dta13(
  here::here("other_experimental_data", "karlan_wood_dataverse_files", "Effectiveness Dataverse Files", "Effectiveness Dataverse Files", "data", "FFH_analysis.dta"),
  convert.factors = TRUE, generate.factors = TRUE,
  encoding = "UTF-8", fromEncoding = NULL, convert.underscore = FALSE, missing.type = TRUE, convert.dates = FALSE, replace.strl = TRUE, add.rownames = FALSE, nonint.factors = FALSE, select.rows = NULL,
  select.cols = NULL, strlexport = FALSE, strlpath = ".") %>%
  # ... Conversion of Stata dates ####
  mutate(across(contains("_date"), ~ as.Date('1900-01-01') + days(as.integer(.x)))) %>% ## needlessly fancy; only 2 date vars
  dplyr::as_tibble()
# rdr_cbk("full_codebook_karlan.Rmd") #THIS doesn't mesh with bookdown
See codebooks/full_codebook_karlan.html
8.1.2 Bring in/clean Bergh and Reinstein data (mostly done)
dp_sx_mt, created in code/ImportData.R
… reconstruct minimal creation steps from the raw data here, if possible.
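A minimal sketch of pulling dp_sx_mt into this document, under the (unverified) assumption that code/ImportData.R runs standalone and creates the object when sourced:

```r
# Sketch only: assumes code/ImportData.R is self-contained and that sourcing
# it creates dp_sx_mt in the calling environment.
source(here::here("code", "ImportData.R"))

dp_sx_mt <- dplyr::as_tibble(dp_sx_mt)
dplyr::glimpse(dp_sx_mt)
```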
8.1.3 Bring in/clean DV statistics
base_stats, dv_ranks, created in analysis/dv_input_anal.Rmd
(Maybe … reconstruct minimal creation steps from the raw data here, if possible.)
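If reconstructing from scratch is not practical, one option (sketched below, unverified) is to extract and run the code from that notebook with knitr::purl, assuming the relevant chunks are self-contained:

```r
# Sketch only: pull the R code out of the notebook that builds base_stats and
# dv_ranks, then run it; chunk self-containment is an assumption.
knitr::purl(here::here("analysis", "dv_input_anal.Rmd"),
            output = "dv_input_anal_extracted.R", documentation = 0)
source("dv_input_anal_extracted.R")  # should create base_stats and dv_ranks
```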
8.1.5 Other possible data
- Kassirer and Karlan multi-charity choice field experiment (OSF link)
Sam Kassirer and Dean Karlan ran a large-scale trial … roughly 1 million video-gamers could each choose between 3 charities, with roughly $1 million distributed in total. The environmental charity (ERP) was preferred by substantially more people than the mental-health charity (MHA); the education charity supporting diversity in tech (Code.org) was by far the least favored.
One of their important between-participant treatment conditions was “whether one of the charity descriptions included ImpactMatters impact information” (as an aside, this information is/was rather overoptimistic, in my opinion).
Their interventions seemed to have ‘null effects’, and these effects (we still need to check) seem likely to be bounded close to zero: the charity whose description included the impact information did not appear more likely to be chosen. The ‘frames’ used to motivate giving (vote for the charity that “does the most good”, the one it makes you “feel the most good” to support, or no frame) also seem to have had little direct or interaction effect.
- The OftW ‘upselling’ email trial could go here, if we have access.