Thursday 30 July 2015

Taking the Plunge...

The art of asking a good research question is something that I hope to develop during my time with the MALAT program. So far, developing a good question is not as easy as it sounds... but I believe that I am getting closer! I found this concise and informative PowToon on YouTube, posted by Steely Library NKU, which gives some quick tips on how to formulate a good research question.



In her article Learning How to Ask Research Questions, Susan Musante (2010) quotes a graduate student, Julia Svoboda, as stating, "The most important thing... students do is articulate a research question" (Musante, 2010, para. 3). Svoboda observed other students who were part of a research program called CLIMB. She notes that developing the skill of asking good questions takes time, and that one of the biggest hurdles for students to overcome is their fear of speaking up and asking the wrong questions. This feels very familiar to me! I have to admit, I have been hesitant to join the Unit 4 WIKI and put forth a research question... not because I don't need the practice or because I am not interested in research, but because I am afraid of asking the wrong thing... of making a mistake.

So, enough with the fear! I'm taking the plunge and have posted a first draft of a research question to the WIKI and here on my blog for everyone's feedback. My question is:

How can technology be leveraged to improve student engagement in completing course evaluations?

Why is this important to me? I believe that student feedback is a crucial element in course design and continual quality improvement. In the past, feedback forms were handed out to students during class to fill in and return anonymously to their student reps. When done in this manner, we had close to 100% return rates on student evaluations (granted, the quality and depth of the feedback was arguably low). Since switching to voluntary online student surveys, the average rate of return has dropped to less than 40% in most programs... some as low as 10%. Do students really not care? Are they overloaded by surveys? Are they disengaged because they feel that their input is not valued? Is the technology interface itself limiting participation? I'm curious to find out what factors are driving such low response rates, and what can be done to improve both the rate of return and the quality of the feedback received.

I'm thinking that the first step is to understand the student experience of completing course evaluations, perhaps through a phenomenological study (or maybe a behavioural inquiry). This could be followed by an action research inquiry to determine a better way of engaging students in course evaluations. The action research would bring together representatives from the student body, the student association, faculty, and administration.

Thoughts, anyone?


References:
Musante, S. (2010). Learning how to ask research questions. BioScience, 60(4). doi:10.1525/bio.2010.60.4.4
Steely Library NKU. (2014). Developing a research question [Video file]. Retrieved from https://www.youtube.com/watch?v=LWLYCYeCFak

3 comments:

  1. I already gave you some thoughts in the wiki... although you hadn't mentioned behavioural inquiry, which I think would be good for getting a sense of why people don't complete these online evaluations. Does it boil down to a few key things that could be acted upon, or is it quite a diverse set of reasons that will entail further research to find something actionable? Great PowToon... a lot of good points covered in a short amount of time... I'll add it to my resources... thanks!

    On another note, sometimes when I am trying to focus a research question I'll try a generative process (although I know that seems counter-intuitive). I'll just start writing about what it is I'm interested in about the topic, what I think may be going on or not going on, what others seem to be saying about my topic that I agree or disagree with, why the topic seems to have captured my attention, what has been my own experience with it... etc. Out of this more divergent thinking (kind of like what you are starting to do in this blog), I'll begin to find the nuggets of my research question.

  2. Hi Loni, thanks for your comments in both the WIKI and here on the blog. Between the two of them, and also reading other Q&A threads, I think I'm finally starting to sort out phenomenology and behavioural inquiry. The concept of what exactly a "lived experience" is has been a stumbling block for me. I think trying to fumble my way through this research question has helped me to clarify it. The lived experience is trying to understand what it's like to be a student who is asked to fill out a survey. There is no judgment as to why they do or don't... it just "is". Whereas in trying to understand why a student does or does not fill out the survey, I am interpreting their behaviour... which means I am now doing a behavioural inquiry. It's similar because I am looking at the student experience, but what I am trying to understand is different.

  3. Hi Erin,

    Thanks for sharing this PowToon presentation. It describes the complex process of formulating a good research question in a simple format that is illustrative and fun. Also, I found the difference between the online evaluation returns versus the in-class evaluation returns to be huge. I think your research question would help uncover how those low numbers could be brought back up, and it sounds like an area that needs further study. Excellent idea!
