By LindaKKaye

Internet-mediated Research

Internet-mediated research (IMR), often interchangeably referred to as “online research”, refers to the collection of research data via the Internet, through a functional online environment (e.g., online survey software, a website, an online forum, or a social networking site).

The British Psychological Society (BPS) defines IMR as “any research involving the remote acquisition of data from or about human participants using the internet and its associated technologies” (BPS, 2017, p. 3).

There is a distinction between reactive and non-reactive methodologies:

  • Reactive- participants interact with research materials (e.g., online surveys, online interviews, online experimental tasks).

  • Non-reactive- the researcher makes use of data collected unobtrusively (e.g., data mining, observations).


Ethics in general

All the ethical standards outlined in the BPS Code of Human Research Ethics (2014) and Code of Ethics and Conduct (2009) should be met when doing any form of research. However, there are some specific practical steps researchers should consider when doing IMR, which are largely outlined in the BPS Ethics Guidelines for Internet-mediated Research (2017). You should therefore consult these alongside the standard BPS guidelines before conducting any IMR research. The following sections will give some additional ethical and practical recommendations which will be relevant for different types of IMR.



Online surveys

Online questionnaires or surveys are an increasingly popular form of data collection for research. Commonly used platforms include Qualtrics, Bristol Online Surveys and SurveyMonkey.

Some things to be mindful of when using online surveys, relative to hard-copy questionnaires:

  • Platform- participants will complete an online survey on a range of devices, and many will use a smartphone. When building your survey, use settings and options that are mobile-compatible, to give these participants a better user experience.

  • Uptake and drop-out- these differ greatly from hard-copy questionnaires, where a researcher is typically present. Uptake is usually much lower and drop-out considerably higher with online surveys (Hoerger, 2010; Nulty, 2008).

  • Context- participants will complete online surveys in a range of different places (e.g., on the bus, in front of the TV). Be aware that there may be more “noise” in these data relative to the more controlled settings of hard-copy questionnaires.

  • Bots- you may find erroneous data generated by bots (automated programs designed to perform tasks). Melissa Simone offers some great tips for reducing the likelihood of bots interfering with your survey:

1. Include advanced branch logic in your survey.

2. Include some open-ended questions (even if these ask about random things unrelated to your actual research constructs).

3. Ask similar questions more than once to check for consistency (e.g., ask about gender twice).

4. Include a filler item which asks participants to respond in a certain way (e.g., “Please select ‘Strongly agree’ for this item”).

  • Consent- your participants will not be able to sign a consent form as they would a hard copy. Set up your first page as a consent page, with each consent statement as a separate checkbox question. Participants then check each box as an active form of consent.

  • Confidentiality- remember that the confidentiality of online data also needs to be maintained, by storing it in password-protected survey software.

  • Anonymity- to ensure you can link a participant to their data should they choose to withdraw after taking part, include an open-ended text box on the consent page and ask participants to provide a memorable code/password there.

  • Withdrawal- within the briefing stage, let participants know that they can withdraw during the study simply by closing their internet browser.

  • Intellectual property- some instruments and measures carry copyright stipulations which mean they should not be published online. Be aware of this before selecting research instruments for IMR.

  • Voluntary participation- most online survey software gives researchers the option of making questions mandatory (i.e., a participant must answer all questions on a page in order to progress). Arguably this is best avoided, as it imposes a level of force that would not exist in a hard-copy format.
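The bot-screening steps above (consistency questions, attention-check filler items, open-ended responses) can be applied as a post-collection filter. Below is a minimal sketch in Python; all column names (`gender_q1`, `attention_check`, `open_response`) are hypothetical placeholders for your own survey fields, not part of any particular survey platform.

```python
# Sketch of post-collection data-quality screening for an online survey,
# applying the bot/consistency checks described above.
# All field names here are hypothetical examples.

def passes_quality_checks(row: dict) -> bool:
    # Consistency check: the same question asked twice should match.
    if row["gender_q1"] != row["gender_q2"]:
        return False
    # Filler/attention item: respondent was asked to select "Strongly agree".
    if row["attention_check"] != "Strongly agree":
        return False
    # Open-ended question: bots often leave these blank;
    # here we only require a non-empty answer.
    if not row["open_response"].strip():
        return False
    return True

responses = [
    {"gender_q1": "Female", "gender_q2": "Female",
     "attention_check": "Strongly agree", "open_response": "I enjoy hiking."},
    {"gender_q1": "Male", "gender_q2": "Female",  # inconsistent -> flagged
     "attention_check": "Strongly agree", "open_response": "x"},
]

clean = [r for r in responses if passes_quality_checks(r)]
print(len(clean))  # 1 response survives screening
```

In practice you would log how many responses were excluded and on which check, so that the screening procedure can be reported transparently.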


Interviews/Focus groups

There is a wide range of commercially available online communication tools which can be used to host interviews and focus groups. Many of these offer free versions and may be appealing for conducting interviews. There are a number of things to bear in mind when selecting appropriate online software:

  • Confidentiality- do not use software which shares users’ data with third-party hosts. Also make sure the software you select is end-to-end encrypted (see the image below for a summary of platforms).

  • Consent- it is typical to gain an audio-recorded statement of consent alongside an electronic consent form (e.g., returned via email).

  • Anti-virus- any device used for interviews should have appropriate, up-to-date anti-virus software installed.

  • Internet network- when conducting interviews, use private WiFi networks (e.g., a home or university network) and avoid less secure public networks (e.g., those accessible in cafes, airports, etc.).

  • App permissions- many online communication tools operate as apps, so be aware of what permissions you and your participants may need to accept if you choose to use them. These may include access to device features such as the camera, microphone, storage and contacts.

  • Authentication- make sure any device used to conduct interviews or store audio data has secure authentication (e.g., passwords, or biometrics such as face, voice or fingerprint identification).

  • Recording- does the software have a recording function, to ensure you have an accurate and full account of the interview?

  • Capacity- if you are running focus groups, does the software impose a maximum capacity on audio or video calls?

  • Time zones- given that you may be recruiting widely, your participants may be geographically dispersed. Some may be in different time zones, so be aware of this when arranging mutually convenient interview times.

  • Transcription- avoid transcription services such as Otter, as these pass data to a third party for processing and therefore compromise data confidentiality.
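For the time-zone point above, it helps to convert a proposed interview slot into each participant's local time before sending invitations. A minimal sketch using Python's standard-library `zoneinfo` module (Python 3.9+); the date and time zones below are illustrative examples, not real participants.

```python
# Sketch: converting a proposed interview slot into participants'
# local times, using the standard-library zoneinfo module (Python 3.9+).
from datetime import datetime
from zoneinfo import ZoneInfo

# Hypothetical slot: 14:00 UK time on 1 May 2024 (British Summer Time).
slot_uk = datetime(2024, 5, 1, 14, 0, tzinfo=ZoneInfo("Europe/London"))

# Example participant time zones (illustrative only).
participant_zones = ["America/New_York", "Asia/Tokyo"]
for tz in participant_zones:
    local = slot_uk.astimezone(ZoneInfo(tz))
    print(tz, local.strftime("%H:%M"))
# 14:00 UK time on this date is 09:00 in New York and 22:00 in Tokyo
```

Because `zoneinfo` applies daylight-saving rules automatically, the same UK clock time can map to different local times at different points in the year, which is worth double-checking for longitudinal interview schedules.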


Smartphone data

Recent research has made good use of smartphone apps (e.g., Apple’s Screen Time) to understand people’s online behaviour. This can give more accurate data on screen use than self-reports can (Andrews, Ellis, Shaw & Piwek, 2015; Ernala, Burke, Leavitt & Ellison, 2020; Sewall, Bear, Merranko & Rosen, in press). Some considerations for using smartphones to gain objective data:

  • Gaining consent- this would typically follow the usual consent process: recruiting participants at the start of the research and obtaining informed consent.

  • What metrics- your research may focus on overall screen-time, on specific app usage (e.g., Instagram use), or both. Make sure participants are aware which apps are the focus of the study.

  • Practicalities- features such as Apple’s Screen Time record use of all apps, websites, etc. that the user has visited. Be aware that participants can change their settings to block themselves from using certain apps (i.e., “Downtime”) and set limits on how much time they use apps. Make sure participants are aware that changing settings mid-way through their participation may confound the data, and include some briefing information on how you prefer participants to respond to this issue.
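The distinction above between overall screen-time and app-specific usage can be made concrete with a small aggregation sketch. The log format and app entries below are hypothetical; real screen-time exports differ by platform and are not standardised.

```python
# Sketch: deriving both an "overall screen-time" metric and a
# "specific app" metric from per-app usage records.
# The daily_log data and its format are hypothetical examples.
daily_log = [
    {"app": "Instagram", "minutes": 42},
    {"app": "Mail", "minutes": 15},
    {"app": "Instagram", "minutes": 18},
    {"app": "Maps", "minutes": 7},
]

# Overall screen-time: sum across every app.
total_minutes = sum(entry["minutes"] for entry in daily_log)

# App-specific usage: sum only the app under study.
instagram_minutes = sum(e["minutes"] for e in daily_log
                        if e["app"] == "Instagram")

print(total_minutes)      # 82 minutes overall
print(instagram_minutes)  # 60 minutes of Instagram use
```

Deciding in advance which of these metrics the study needs, and telling participants so, avoids collecting (and having to justify storing) more usage data than the research question requires.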


Online data

This may involve using Twitter posts or other forms of publicly available online data for secondary data analysis.

  • Anonymity- never use verbatim quotes from participants when reporting the work. These can be reverse-searched to identify participants.

  • Privacy- only use data which is in the public domain, i.e., accessible without a log-in, authentication or a request for access. See the BPS IMR Guidelines (2017) for additional detail on this.

  • Informed consent- as only public platforms are used, users are behaving in what are deemed to be public spaces, on the understanding that this behaviour is observable to anyone. Therefore, no informed consent is needed to unobtrusively obtain publicly accessible data.
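One practical way to support the anonymity point above is to pseudonymise usernames before analysis, so individuals cannot be looked up from the working dataset. A minimal sketch using a salted hash from Python's standard library; the salt value and post fields are hypothetical. Note this protects the dataset itself, but verbatim post text should still never be quoted in reports, since the words can be reverse-searched.

```python
# Sketch: pseudonymising publicly collected posts before analysis.
# Usernames are replaced with salted hashes; the salt and data
# below are hypothetical examples.
import hashlib

SALT = "project-specific-secret"  # hypothetical; store securely, not in shared code

def pseudonym(username: str) -> str:
    # Same username always maps to the same short, opaque identifier.
    return hashlib.sha256((SALT + username).encode()).hexdigest()[:10]

posts = [{"user": "@exampleuser", "text": "A public post."}]
anonymised = [{"user": pseudonym(p["user"]), "text": p["text"]}
              for p in posts]
print(anonymised[0]["user"])  # a stable 10-character pseudonym
```

Using a salt means someone with the published dataset cannot simply hash candidate usernames to re-identify people, provided the salt itself is kept confidential.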


Recruitment for reactive forms of IMR

  • As with any research, make sure any written adverts have been ethically approved alongside your other ethics documentation.

  • Adverts should be concise, user-friendly (e.g., avoiding academic jargon) and informative.

  • If you are advertising on social media, carefully consider which communities are representative and relevant, and state this in your ethics documentation. This may be specific Facebook groups, community pages, your own profile, etc. Do not assume that posting adverts to your own social media profiles will garner the most representative sample for all types of research.

  • If advertising on social media, also consider using an image with key details to supplement your text advert. This can allow you to include additional information which may not fit on platforms with character limits. It is good practice to add alt text to images when uploading them online.

  • Be aware that advertising online may make information about you publicly available (e.g., profile photos, bio information, university name). Check that your profile information on any platform does not reveal anything you do not want the world to see.

  • Be considerate towards the online communities you are seeking help from. Engage with these communities prior to your research, build rapport, return to any adverts to check whether people have asked questions, and come back to show thanks at the end of the research process.



Andrews, S., Ellis, D.A., Shaw, H., & Piwek, L. (2015). Beyond self report: Tools to compare estimated and real-world Smartphone use. PLoS One, 10 (10), e0139004. doi:10.1371/journal.pone.0139004

British Psychological Society (2014). Code of Human Research Ethics. Leicester.

British Psychological Society (2009). Code of Ethics and Conduct. Leicester.

British Psychological Society (2017). Ethics Guidelines for Internet-mediated Research. INF206/04.2017. Leicester.

Ernala, S.K., Burke, M., Leavitt, A., & Ellison, N.B. (2020). How well do people report time spent on Facebook? An evaluation of established survey questions with recommendations. ACM CHI Conference on Human Factors in Computing Systems. Honolulu, Hawaiʻi.

Hoerger, M. (2010). Participant Dropout as a Function of Survey Length in Internet-Mediated University Studies: Implications for Study Design and Voluntary Participation in Psychological Research. Cyberpsychology, Behavior and Social Networking, 13 (6), 697-700. doi: 10.1089/cyber.2009.0445

Nulty, D. D. (2008). The adequacy of response rates to online and paper surveys: What can be done? Assessment and Evaluation in Higher Education, 33 (3), 301-314.

Sewall, C.J.R., Bear, T.B., Merranko, J., & Rosen, D. (in press). How psychosocial well-being and usage amount predict inaccuracies in retrospective estimates of digital technology use. Mobile Media and Communication. doi: 10.1177/2050157920902830


