1st June 2012 at 11:43 am · #2342 · Gareth Morrell (Member)
There was one issue that cropped up again and again in each contribution to the NSMNSS launch event. No, not Justin Bieber. Nor macaroni cheese (sorry, you had to be there, or at least watch the videos from the day). It was ethics. This is a fundamental issue, one that should underpin our discussions of qual and quant methods and has massive implications for the quality of social media research.
In our third workshop, researchers from across disciplines and sectors grappled with these issues and came up with the following areas that should be priorities for the network to engage with.
- Understanding the nature of the ‘data self – data body’ in social media terms; we need to explore what ‘the individual’ means in digital worlds and how that then affects how we think about protection from harm and privacy.
- Engaging with a range of social media stakeholders, and finding out from them what they think the ethical issues are, was seen as a high priority. These stakeholders include:
- social media researchers
- users of social media
- social media companies providing tools, apps and platforms
- media/data journalists
- community moderators
- funders/research commissioners
This engagement could lead to the co-creation of a set of ethical guidance notes drawn from best practice, which would be a fantastic output of the network: understanding better what people are already doing and what funders are expecting, and shaping that ethical framework ourselves rather than having one foisted on us.
- We need to consider the ethics of the social media tools and platforms themselves and understand our role in selecting which platforms and tools we use data from.
- A more sophisticated discussion of what is public and what is private is required: can we simply transfer concepts of privacy from the non-digital world to the digital, or do we need to go beyond assumptions and find out what users mean or want in relation to digital privacy?
- Exploring the protection-from-harm issues for researchers: what does digital research mean for the safety of researchers (especially in relation to community moderators, but also the ‘political’ threats of agents of the state/estate wanting to exploit data and findings for their own ends)?
- Thinking about the role of bots in creating voices in the digital world. We assume most exchanges are generated by humans, but what about the dialogues created by bots or organisations for their own ends? This plays into the bigger issue of power, where it lies in the researcher-researched relationship on the web, and the related potential threats to individual safety.
To take this really important area on, we want to engage all of you in these issues. Are they the right ones? What have we missed out?

2nd June 2012 at 12:28 pm · #2344 · NSMNSS (Member)
Farida Vis’ insightful talk on the ethics of research in new social media is now available here.

6th July 2012 at 4:59 pm · #2343 · Grant Stanley (Member)
Much food for thought. Many thanks for sharing. 🙂