weighing in on using electronic devices to collect survey data
Folks at the Development Impact blog wrote a two-part series on the benefits and drawbacks of administering surveys with an electronic device (often referred to as CAPI, short for computer-assisted personal interviewing) rather than with traditional pen and paper.* My own experience using CAPI in 2011 and again in 2012 mirrors some of the benefits and drawbacks highlighted by Development Impact. In this blog post, I summarize the Development Impact series, introduce the CAPI setup I used for surveys collected in Malawi in 2011 and 2012, and weigh in with my own thoughts on CAPI.
Part I of the Development Impact series highlights the benefits of CAPI:
- instant feedback to enumerators on their performance
- cleaner data that are easier to track
- faster access to the data
- ability to collect more confidential data
- monitoring via date/time/GPS location stamps
- easier validation for improving survey design
Part II of the Development Impact series highlights the drawbacks:
- simple mistakes can be very costly
- high start-up costs (programming, piloting, training)
- data can get lost
- data security
- potential device failure
- dependence on electricity
- limited empirical evidence of improved data quality
In 2011, I fielded a small survey (N=205) in a rural area of Zomba District, Malawi. I had purchased 4 iPads (1st-generation refurbished wifi-only 16GB models, sold at a reduced price** on Apple’s web site following the release of the iPad 2).
I tasked a research assistant with identifying potential software options. We ultimately decided on iSurvey, which offers a free app from the App Store, unlimited survey questions and observations, and use on as many devices as you wish, for $89 per survey per month. A primary benefit of iSurvey is that surveys are created on its web site, and the current version supports many features (grids, skip logic, image use, etc.). iSurvey does not require programming skills, just someone who knows how to navigate a web page. Linking an iPad to a given survey on iSurvey is simple, and once a survey has been completed on an iPad, the data are uploaded to iSurvey’s server (pending an internet connection, of course). Once data are uploaded, you can download CSV or SPSS versions of the data from iSurvey’s web site.
In 2011, iSurvey and iPads in the field worked incredibly well. There were only a handful of evenings when we couldn’t upload data from the iPads because of our lack of an internet connection,*** and on these nights, we backed up the iPads to our project laptop to ensure data weren’t lost (or, at least, it made us feel better about not being able to upload the data to the cloud). There were a number of electricity outages during August 2011, so we were diligent about charging batteries when we could and carried a vehicle inverter and fully charged laptops (to funnel charge via the USB port to iPads) with us in case we had to charge them in the vehicle on our way to the field site. Each night we uploaded data, we also downloaded data — to see if there were odd patterns or problems with specific questions or interviewers. Any problems or inconsistencies were discussed the following morning with the field team before going out for another round of interviews.
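The nightly checks can be done with very little tooling once you have the CSV export in hand. Here is a minimal sketch of the kind of check I mean; the column names (`interviewer`, `q12_income`, `q13_assets`) are made up for illustration and are not iSurvey’s actual export format:

```python
# Hypothetical nightly check on a CSV export: share of blank answers
# per interviewer, per question. High rates for one interviewer can
# flag a misunderstood question or a skipped module.
import csv
import io
from collections import defaultdict

def missing_rates(csv_text, id_col, answer_cols):
    """Return {interviewer: {question: share of blank answers}}."""
    blanks = defaultdict(lambda: defaultdict(int))
    totals = defaultdict(int)
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row[id_col]] += 1
        for col in answer_cols:
            if not row[col].strip():
                blanks[row[id_col]][col] += 1
    return {
        who: {col: blanks[who][col] / n for col in answer_cols}
        for who, n in totals.items()
    }

# Toy data: interviewer A left q12_income blank in 1 of 2 interviews.
sample = """interviewer,q12_income,q13_assets
A,1200,3
A,,2
B,900,
B,850,4
B,,1
"""
rates = missing_rates(sample, "interviewer", ["q12_income", "q13_assets"])
print(rates["A"]["q12_income"])  # -> 0.5
```

Anything this flags still needs a human conversation the next morning; the script only tells you where to look.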
The 2012 implementation was not so smooth. No one in the field in 2012 had any previous experience with the back-end of iSurvey and only the field supervisor had used the iSurvey app in the field previously (but he had joined the 2011 team late, when we were nearly done with the survey interviewing segment of data collection). Internet connections were difficult to come by, especially when the team left Zomba for more rural districts in our sample. I would check the data from the US, but with the time lag (in terms of uploading and the 7-hour time difference between Malawi and Texas) and difficulties with communicating to the Malawian team, there was not the same fast turnaround as when I was in the field checking the data each night and talking to the field team the following morning. There was also a larger field team (9 interviewers/iPads in 2012, compared to 5 in 2011).
One of the benefits mentioned by Development Impact is the ability to monitor with time stamping — and iSurvey allows for this. Given some of the timestamps I saw in the data, however, it seemed to me that some interviewers were “cheating.”**** This led to a serious fissure in the research team, to the point where I could hardly talk to them; debating what was really going on made me physically ill. Though the iSurvey team were responsive to our requests for more information about the timestamps and how unusual they seemed, the matter was never resolved. Given the tense situation it created, I’m still left wondering whether I even want to know if data have been faked… but that is material for an entire post (or series!) on data validity, and it is certainly not limited to CAPI.
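For readers wondering what “unusual timestamps” look like in practice: a household interview has a plausible minimum length, and completions well under it deserve a second look. A sketch of that check, with made-up record fields and an assumed 20-minute floor (not any official threshold):

```python
# Hypothetical timestamp check: flag interviews that finished
# implausibly fast. Field names, timestamp format, and the
# 20-minute floor are all assumptions for illustration.
from datetime import datetime

FMT = "%Y-%m-%d %H:%M:%S"

def flag_fast_interviews(records, min_minutes=20):
    """Return ids of interviews shorter than min_minutes."""
    flagged = []
    for rec in records:
        start = datetime.strptime(rec["start"], FMT)
        end = datetime.strptime(rec["end"], FMT)
        if (end - start).total_seconds() < min_minutes * 60:
            flagged.append(rec["id"])
    return flagged

records = [
    {"id": "hh_001", "start": "2012-07-14 09:00:00", "end": "2012-07-14 09:55:00"},
    {"id": "hh_002", "start": "2012-07-14 10:10:00", "end": "2012-07-14 10:16:00"},
]
print(flag_fast_interviews(records))  # -> ['hh_002'] (only 6 minutes)
```

A short interview is evidence, not proof: partial interviews, refusals, and paused devices can all produce short durations, which is part of why our own case was never resolved.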
In sum, I would definitely use iSurvey/iPads again, but I am hesitant to implement them the same way we did in 2012. Development Impact is correct in saying: “There is no computerized substitute for a well-designed, well-supervised field work effort.” We probably had too few people managing data collection in the field, and in particular no one with the time or skills to handle back-end data checking in-field. One of the benefits of paper surveys is that a supervisor can sit in a vehicle all afternoon reading through the surveys interviewers have completed, identify problems, and immediately send interviewers on “call-backs” for anything serious. Though I felt I could easily manage data-checking from afar with a team run by people I knew well and had worked with before, there is always a great distance between people collecting data and people sitting in air-conditioned offices in the US (in addition to the challenges posed by the aforementioned time lags for data uploads). This distance isn’t just a physical one. There is something about being there in the midst of it all that gives you a different sense of the data and of the team collecting it. In fact, when I pursued the alleged cheating incident, one email I received pointed out that I was not “even in the field.” As a field researcher, that is one of the worst things you can hear about yourself. But it was correct. Just because CAPI lets you see the data in “real-time” doesn’t mean it brings you closer to the subject of study.
*This topic is not so new to development blogs — see, for example, this post from the previous decade, written by light-years-ahead researcher Chris Blattman.
**It does not take infinity billion dollars to use CAPI in the field. For me, the start-up costs were $349 per iPad and $89 for the software. Data entry costs: $0. Of course, if you’re a broke graduate student, you might be entering the data yourself (in addition to the cost savings, entering the data yourself means you see all of it, which is helpful). If the cost of purchasing iPads (or whatever device you choose) is something your budget can’t cover, you might also consider borrowing from someone who has them. I am happy to loan mine out to other field researchers when I’m not using them.
*** Malawi is not like Kenya, and our iPads were 1st generation, so we couldn’t simply use a 3G connection. Instead, we connected through a signal broadcast from a Mac laptop, which itself got online via a dongle.
**** By “cheating” I mean it looked as if there were interviews that were not with real people, but rather made-up data. This was never confirmed.