Programme Redesign, Classroom Scalability and Technology Innovation for Bioinformatics Education
Overview
Aim 1. Online materials and reduction of practical hours. By incorporating online videos, simulations, and online exploration tools, we reduced practical classroom hours by 50%, to two hours per week, and increased time on task for inquiry-based activities. We also created materials sufficient to “flip” the lecture portion of the module and treat the module as “blended”. In addition, we were able to devote two practicals to the inquiry-based assignments, and one practical to a critical review of the first assignment.
Total classroom laboratory hours: 44 (before), 22 (after)
Classroom laboratory hours devoted entirely to inquiry or reflection: 0 (before), 6 (after)
We created some specialised software, including tools for automatic inquiry-problem generation and for exploring bioinformatics algorithms. Much of this software is a tangible, reusable output (e.g., software for exploring the internals of dynamic programming algorithms). We developed some software tools for online exercise generation, although fewer than intended (see Evaluation, below). Among the intangible outcomes: we made many mistakes, and learned by trial and error about weaknesses in our problem generation, as well as about the less valuable aspects of our software.
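The kind of dynamic-programming exploration described above can be illustrated with a minimal sketch (this is an illustration, not the project's actual software): building the full Needleman–Wunsch score matrix for global sequence alignment, so that students can inspect how each cell's value is derived from its neighbours.

```python
# Minimal sketch (illustrative, not the project's software): the
# Needleman-Wunsch dynamic-programming matrix for global alignment,
# kept in full so every cell can be inspected by students.

def nw_matrix(a, b, match=1, mismatch=-1, gap=-2):
    """Return the complete DP score matrix for aligning a and b."""
    rows, cols = len(a) + 1, len(b) + 1
    m = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):          # first column: gaps in b
        m[i][0] = i * gap
    for j in range(1, cols):          # first row: gaps in a
        m[0][j] = j * gap
    for i in range(1, rows):
        for j in range(1, cols):
            s = match if a[i - 1] == b[j - 1] else mismatch
            m[i][j] = max(m[i - 1][j - 1] + s,  # diagonal: (mis)match
                          m[i - 1][j] + gap,    # up: gap in b
                          m[i][j - 1] + gap)    # left: gap in a
    return m

matrix = nw_matrix("GATT", "GCAT")
print(matrix[-1][-1])  # bottom-right cell is the optimal global score
```

An exploration tool would then let students click through cells and see which of the three incoming moves produced each value, which is the step learners most often find opaque.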
Aim 2. Adopt technology to scale the inquiry-based laboratory. We initially proposed to use TeamMates (developed in SoC) for peer assessment. In our first implementation we instead used the Coursera MOOC platform, but found it ineffective and disliked by students. We eventually adopted the new IVLE platform in combination with the PeerMark tools within the Turnitin integration into IVLE. With adequate student and instructor training this system can be very effective, but it is complex to set up for peer evaluation.
Aim 3. Improve the systematic use of teaching feedback. Through systematic analysis of feedback we identified pain points in student learning, developed targeted additional material (such as videos or exercises), and added these resources to the module. We found that students could be overwhelmed by frequent feedback requests, so we eventually reduced regular feedback collection to the end of key learning units.

We carried out systematic evaluation of the feedback data using a novel statistical approach called structural topic modelling. We first used this method to discover specific topics raised in student feedback that were strongly associated with pedagogical change; this result is novel, tangible, and being prepared for publication. We also received (through our department and the computer centre) feedback data covering three years of teaching across the department. This large-scale analysis allowed us to discover topics that students raised in association with excellent feedback scores, as well as topics associated with areas for improvement. We believe this approach could be used for targeted professional development of teaching.