
Data analysis – questionnaire

I opened my questionnaire to digital learning staff via the UAL Digital Learning Support channel on MS Teams. It was open for three weeks, from 06/11 to 27/11. Overall I had a 36% response rate, as 14 of the 39 members of the channel responded. While I would have liked a larger sample size – at least 50% would have been great – I decided to close the questionnaire there given the time constraints of the project. Reflecting on why I didn't get a larger response rate, there may have been multiple factors at play: not everyone uses MS Teams, staff absences, disinterest in the topic, lack of incentive to complete the questionnaire, and workload.
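For transparency, the response rate is simple arithmetic on the figures above; a minimal sketch in Python:

```python
# Response-rate arithmetic: 14 responses from the 39 members
# of the Teams channel.
responses, channel_members = 14, 39
rate = responses / channel_members * 100
print(f"Response rate: {rate:.0f}%")  # -> Response rate: 36%
```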

Ultimately, however, I do think that 14 responses still equates to a range of participants and perspectives. I have used Braun and Clarke's (no date) model of reflexive thematic analysis to make sense of the questionnaire results.

Reflexive thematic analysis of questionnaire results

Familiarising myself with the dataset

With my data collected, it was time to get familiar with it. Helpfully, Qualtrics has report builder functionality, so I was able to convert the responses to each question into visualisations. This helped me to take in and retain the findings from the survey. It also gave me the sense that, overall, we are in a positive position, with most respondents (79%) being somewhat familiar with digital sustainability (although, as further analysis showed, definitions of what that meant varied) and 92% being interested in training or guidance on best practice for reducing emissions in online course build and design.
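Out of interest, the headline figures can also be re-plotted outside Qualtrics. Below is a minimal sketch using matplotlib; the two percentages are the ones quoted above, while the chart itself is my own illustrative styling rather than the Qualtrics report:

```python
# Re-plotting two headline survey figures as a horizontal bar chart.
import matplotlib.pyplot as plt

questions = [
    "Familiar with\ndigital sustainability",
    "Interested in\ntraining/guidance",
]
percent_yes = [79, 92]  # headline figures quoted above

fig, ax = plt.subplots(figsize=(6, 3))
ax.barh(questions, percent_yes)
ax.set_xlim(0, 100)
ax.set_xlabel("% of respondents (n=14)")
for y, value in enumerate(percent_yes):
    ax.text(value + 1, y, f"{value}%", va="center")  # label each bar
fig.tight_layout()
plt.show()
```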

The full report I created in order to familiarise myself with the data is below.

Coding

While coding is a suggested step in the thematic analysis process, as Braun and Clarke (no date) state, "These phases do not prescribe a rigid process you must follow". Given the concise dataset I gathered, I don't believe the coding process – whereby codes are applied to the data over multiple rounds of analysis – is required. Working directly with the themes the data presents is more appropriate here.
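For context, coding would involve tagging each free-text response with short labels over several passes and then clustering recurring labels into candidate themes. A minimal sketch of what one pass could look like – the response excerpts and code labels here are entirely hypothetical, not drawn from my dataset:

```python
# Illustrative only: one round of coding free-text questionnaire
# responses. Excerpts and code labels are hypothetical.
from collections import defaultdict

coded_responses = [
    ("Turning my camera off in large meetings", ["device-use", "meetings"]),
    ("Deleting old files from shared drives", ["cloud-storage"]),
    ("Compressing video before uploading it", ["content-creation"]),
]

# Invert the coding to see which codes recur across responses;
# recurring codes are candidates for themes.
by_code = defaultdict(list)
for excerpt, codes in coded_responses:
    for code in codes:
        by_code[code].append(excerpt)

for code, excerpts in by_code.items():
    print(f"{code}: {len(excerpts)} response(s)")
```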

Themes: generating, developing, reviewing, refining, and naming

  • The first stage of this process is to generate initial themes. To do this I copied the data visualisation for each question to a Miro board, so that I could move them around and group them by theme.
    • Existing strengths + areas for growth: I split the data into these two categories, and the majority of the data signalled existing strengths in the community: 79% were familiar with the concept of digital sustainability, more than half were able to suggest guidelines or tools for reducing online carbon emissions, 92% were interested in training or guidance on best practice, and 69% were interested in being further involved in projects around this topic. One area for growth, though, was that the majority felt unconfident sharing sustainable best practice for online learning with fellow staff. This isn't a negative finding; it points to the need for a resource that helps digital learning staff do this – and this is where my intervention can come in.
  • I then looked at the free-text fields to generate further themes. Analysing this data, I discovered that digital sustainability means different things to different people, so I grouped the responses into the following categories of what we talk about when we talk about digital sustainability:
    • Hardware + hosting: This includes hardware procurement and disposal, the internet servers an organisation uses, etc. It generally falls under the remit of IT and university leadership. This was mentioned by three respondents.
    • General sustainable best practice for being online: This is general guidance for using devices, browsing the web, managing files in cloud storage, sending emails, etc. This was the most common category that the free-text responses fell under.
    • Sustainable design of online learning materials: This is specifically about how we design and deliver our courses and online learning materials. No respondent mentioned this specifically, which suggests there is a gap in guidance and understanding in this area, and also points to the complexity of the topic.
  • The last theme I gleaned from the data was around external guidelines; three resources were suggested:
    • Government Digital Sustainability Alliance: This one was new to me. It appears to be a consortium of government departments and tech companies. There is a blog that details their work, but it doesn't seem to have direct application or guidance for our context at UAL.
    • Website Carbon Calculator: I knew about this and used it to generate reports for the UAL website and Moodle (see the sketch after this list for how such reports could be fetched programmatically).
    • Wholegrain Digital: I also knew about this; it's the agency behind the Website Carbon Calculator. Its blog is a useful and comprehensive overview of the different elements that contribute to online carbon emissions, and a more condensed and accessible version of the Web Sustainability Guidelines. The guidelines themselves weren't mentioned by anyone, which isn't too much of a surprise as they are quite new and quite dense to navigate. That said, even the Wholegrain Digital blog is wordy and hard to read end to end for someone engaged with this topic, let alone for other staff we might ask to read it.
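As mentioned above, I used the Website Carbon Calculator to generate reports for the UAL website and Moodle. For anyone wanting to repeat this across many pages, the sketch below shows how such reports could be fetched programmatically. It assumes the calculator's public API endpoint at api.websitecarbon.com; the response field names and the two URLs are my assumptions and may need checking against the live service:

```python
# A rough sketch of fetching Website Carbon Calculator reports.
# Endpoint, response field names, and URLs are assumptions.
import requests

def carbon_report(page_url: str) -> None:
    """Print an estimated carbon report for a single page."""
    resp = requests.get(
        "https://api.websitecarbon.com/site",
        params={"url": page_url},
        timeout=60,  # the first scan of a page can be slow
    )
    resp.raise_for_status()
    data = resp.json()
    # 'green' flags renewable hosting; 'statistics' holds the
    # per-visit CO2 estimates (assumed field names).
    print(page_url)
    print("  green hosting:", data.get("green"))
    print("  CO2 stats:", data.get("statistics", {}).get("co2"))

# Hypothetical URLs for the two sites mentioned above.
for url in ["https://www.arts.ac.uk/", "https://moodle.arts.ac.uk/"]:
    carbon_report(url)
```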

Conclusions

  • Within the digital learning team there is a base level of existing knowledge around digital sustainability. However, this largely concerns devices and general internet usage; there was little evidence of an understanding of digital sustainability as it applies to content creation.
  • That said, there is a desire to learn more about this topic and how it connects to staff roles.
  • Online learning staff are heavily involved in training the wider university in learning technologies. We advise on the technical aspects of these tools, as well as on pedagogy for online learning and digital accessibility. However, digital sustainability is not an area staff currently feel confident advising colleagues on.
  • The resources that online learning staff know about with regard to digital sustainability are not always relevant to our practice, or they offer either only a basic level of knowledge or too much information. This can make it hard for staff to tangibly apply digital sustainability to their own practice, or to share it with others across the college.

Writing up

The write-up is constituted by this blog post and the accompanying Miro board (screenshot below), which details how I analysed the data.

Additionally, I created the infographic below to visualise what I found. Ultimately, the questionnaire signalled to me that we have some promising foundations for growth in the area of digital sustainability: familiarity with the concept, awareness of carbon calculation tools, confidence to share best practice, a desire for training/guidance, and interest in a UAL approach. These are represented as roots in the infographic. But digital sustainability cannot flourish with these roots alone; it needs UAL-specific guidance and institutional policy in order to become a fully fledged practice.

UAL digital sustainability infographic

Next steps

Analysing this data is an important part of informing what comes next in my ARP: creating an intervention. From the data I can see that the digital learning community has a baseline knowledge of this topic, but wants to do more and needs help with sharing sustainable best practice with colleagues. This data, alongside my analysis of the Web Sustainability Guidelines and the gamified approach I saw demonstrated in the UAL Carbon Literacy training [read more in this blog], will lead me to create an intervention to help develop digital sustainability at UAL.

References

Braun, V. and Clarke, V. (no date) Doing reflexive thematic analysis. Available at: https://www.thematicanalysis.net/doing-reflexive-ta/ (Accessed: 23 November 2023).
