We are aiming to develop and evaluate a variety of engagement data collection tools for their richness in informing us about engagement changes and effects for teachers, students and researchers. In this post I analyse a definition and some dimensions of engagement with the aim of characterising the kind of data we should collect so as to give us the rich picture we are aiming for.
So, we're interested in 'engagement data' to inform us about 'engagement changes' and 'engagement effects'. But what do we mean by 'engagement' in this context? A definition we are working with is provided by our funders, the Open University 'Public Engagement with Research Catalyst Team':
"Defining engaged research
Excellent public engagement with research is reflected in the different ways that researchers meaningfully connect and share research with various stakeholders, user communities and members of the public. Done well, public engagement with research will generate benefits, changes and effects for all participants as they share knowledge, expertise and skills. Excellence will be demonstrated partly through recognition of the contributions that all participants make to the shaping of research agendas, the processes of conducting research, and in the products of that research"
(Holliman, 2014).
Liz has noted some shortcomings of this definition from her point of view as a researcher and teacher: the definition allows engagement to be recognised, but it does not indicate how to engage. Liz wants to know
"how to engage, what I have to do to engage, and I need to know beforehand what those actions are so I can plan my engagement and others' engagement, not have to recognise it after the event, when it's too late".
In an attempt to tie down how engagement can be characterised (defined?), I begin with the 'dimensions of engagement' also suggested by Rick Holliman. In the following paragraphs I consider the dimensions that Rick has put forward with respect to the context of our project. Rick's dimensions are 6 Ps: People, Purposes, Processes, Participation, Performance and Politics. Note that in his post Rick was writing about the particular case of how a research student should consider engagement with respect to their research project, e.g. he refers to 'the candidate'. I have modified the wording of the definition of each P with the intention of making them applicable to our context and, I hope, more generally applicable. The Participation and Performance dimensions are concerned with measures of engagement, so it is within these that I intend to put forward some initial ideas about how we can operationalise the collection of data about all the dimensions for our particular context (our context being engagement with the use of practices and tools developed within the Juxtalearn project within schools). Before starting on that, I'll throw in two dictionary definitions of 'engage' to bear in mind as we work on ways of collecting data about engagement (and possibly creating a modified definition of 'engagement' for the purposes of this project):
"13. a. To entangle, involve, commit, mix up (in an undertaking, quarrel, etc.).
14. trans. To attract and hold fast (attention, interest)"
(Oxford English Dictionary, 2014).
Now, to Rick's dimensions.

People: Who are the publics who could and should be engaging with this research?
With respect to People, four publics come to mind. Firstly, there are the teachers who may adopt and use the Juxtalearn Storyboard and tricky topics approaches. Secondly, there are the students who come into contact with these new approaches because one or more of their teachers have adopted them. Thirdly, there is school management, i.e. those who influence the kinds of teaching practices that are encouraged and adopted for mainstream use within the school. Finally, there are other researchers who could be engaging with this project. In this initial post, I will focus on the first three, i.e. those associated with the school.

Purposes: What are the aims and objectives of this engaged research?
The Juxtalearn project is exploring how creative performance through participative video making can be used to engage students in STEM subjects. This 'Evaluating ways of capturing engagement processes' project is investigating how the Juxtalearn approach could be applied in non-STEM subjects (i.e. Arts, Humanities and English subjects), and the goals of Juxtalearn are applicable in this non-STEM context:

"The main goal of JuxtaLearn is to research, develop and evaluate a pedagogical and technological framework that
- engages students curiosity in science and technology learning, through performances.
- provides assistance in video making and presentation through online tools integrated into a support system.
- provides a situated juxtaposition performance framework to progress learners to become knowledgeable about key discipline ‘threshold concepts’.
- solidifies this understanding through personal and shared reflective performances tied to formal education processes.
- involves and collates video evidence from Learners through an online Exemplar Repository highlighting best practice videos that provide knowledge of ‘threshold concepts’" (Juxtalearn website, 2014).
With respect to the Purposes of our project, we will be interested in:
- how the publics we have identified understand the purposes of the Juxtalearn research
- how members of these publics communicate with each other about Juxtalearn's research goals
- how members of these publics communicate with the research team about Juxtalearn's research goals
Processes: (1) How does the research involve relevant publics in meaningful ways? (2) When, and (3) how often, are publics to be involved? (4) Where are these interventions likely to take place, and (5) through what mechanisms?
To begin with I'll set question (1) aside to deal with later, and look at the other questions, as these should be relatively easy to answer because they are part of an 'Engagement plan' which already exists: a series of Juxtalearn interventions (i.e. consultations and workshops) has been planned and described in the project proposal. To summarise:
- 1 hour briefing session with teachers explaining the approach and storyboard creation on a tricky topic of study
- Juxtalearn workshops with students & teachers
and, after the workshops:
- Student focus group on creative and practical video-making
- Collect feedback from teachers via interviews.
Now, to consider question (1). I interpret this first question as being focused on the quality of the engagement process from the various publics' point of view, recognising that publics can be engaged at a range of levels from
- (A) engagement with a research project's (prototype) products
to
- (B) engagement with the research processes that are generating and modifying a research project's (prototype) products.
Participation: How can we measure the ways in which the publics and researchers participated?
Here are some suggestions of different things we could aim to measure:
- Participation at scheduled events
- Unscripted and unplanned interactions with researchers about the research, initiated by members of the publics
- Interactions about the research between members of the publics not involving Juxtalearn researchers.
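These three kinds of participation event could be tallied per public with a very simple structure. A minimal sketch in Python follows; the category labels are my own illustrative shorthand, not project terminology:

```python
from collections import Counter

# Illustrative participation-event categories, matching the three
# measures suggested above. The labels are my own shorthand.
CATEGORIES = {
    "scheduled_event",       # participation at scheduled events
    "public_to_researcher",  # unscripted interactions initiated by publics
    "public_to_public",      # interactions between publics, no researchers
}

def record(tally, public, category):
    """Tally one participation event for a given public and category."""
    if category not in CATEGORIES:
        raise ValueError(f"unknown category: {category}")
    tally[(public, category)] += 1

tally = Counter()
record(tally, "students", "scheduled_event")
record(tally, "students", "scheduled_event")
record(tally, "teachers", "public_to_researcher")
```

Even a tally this crude would let us compare, per public, how much of the observed participation was scheduled versus spontaneous.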
Performance: (1) What measures are proposed to explore the quality of the engagement processes? (2) How will the findings be used to improve future practice, and (3) shared with other researchers?
With respect to measuring quality, let's look again at a couple of sentences from the definition of Engaged Research provided by our funders, the Open University 'Public Engagement with Research Catalyst Team':
"Done well, public engagement with research will generate benefits, changes and effects for all participants as they share knowledge, expertise and skills. Excellence will be demonstrated partly through recognition of the contributions that all participants make to the shaping of research agendas, the processes of conducting research, and in the products of that research"
(Holliman, 2014).
So we're looking for measures of 'benefits, changes and effects' and of contributions made to the shaping of the research agenda, the research process, and research products. I wrote a little about different levels (A and B) of engagement under Processes.
For the students and the teachers, one approach to collecting data about the quality of engagement is to encourage and facilitate these publics to reflect on and document their process of engagement. For Juxtalearn, the intended effects of the engagement with both the students and teachers include that they should learn about both the tricky topic and the benefits of the video making approach. If we ask each of these publics to document the process of their learning then this should be beneficial for the individuals, and will have a side effect of informing our research. Of course, the methods through which each public or individual prefers to document their learning process may differ, so we need to make a range of different tools available. However, the prompts and questions used to elicit the knowledge must be consistent within each public, even if the tools used vary between the publics (and even between individuals). The scheduling of the prompts and questions will be determined by the schedule of the Juxtalearn interventions; for example, questions may be asked just before, during, and just after each intervention. There will also be value in prompting participants to reflect on the effects of the interventions some time after the last intervention; this could yield information about how teachers foresee longer-term engagement with the ideas behind the Juxtalearn interventions, and if or how students have been able to utilise the Juxtalearn ideas and practices. Due to the timescale of the project, 'some time after' will be approximately one or two months after the intervention.
For the third public, i.e. the school management, the scheduling of questions should be before the interventions, a week or so after (to find out if and what information about the interventions has been passed to management personnel, and their view of it), and also one or two months after the interventions (to find out if their views have changed on reflection, and if they have developed any plans related to the interventions).
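The prompt schedules described above could be captured in a simple data structure keyed by public. A minimal sketch in Python, where the day offsets and public names are illustrative assumptions on my part, not fixed project decisions:

```python
from datetime import date, timedelta

# Hypothetical prompt schedule per public, expressed as day offsets
# relative to each Juxtalearn intervention. The offsets are illustrative:
# -1 = just before, 0 = during, 1 = just after, 7 = a week or so after,
# 45 = roughly one to two months after.
PROMPT_OFFSETS = {
    "students": [-1, 0, 1, 45],
    "teachers": [-1, 0, 1, 45],
    "school_management": [-1, 7, 45],
}

def prompt_dates(public, intervention_date):
    """Return the dates on which this public should be prompted to reflect."""
    return [intervention_date + timedelta(days=d) for d in PROMPT_OFFSETS[public]]

# Example: prompts around a hypothetical workshop on 1 March 2014
for d in prompt_dates("school_management", date(2014, 3, 1)):
    print(d.isoformat())
```

Driving the SMS, blog or questionnaire prompts from one table like this would keep the timing consistent within each public, even when the tools used to deliver the prompts differ.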
Politics: (1) What is the localised political context of the publics involved with the research? (2) What is the wider context for engaged research?
The publics directly involved with the Juxtalearn research are students, teachers, and management at the schools in which the Juxtalearn interventions take place, plus the researchers. We will focus on determining the effect of the local political context, by which I mean the context within the school and its governing body, on the engagement process. The wider political context in which the publics and the school exist will affect this local context, but we will not aim to ask direct questions about this wider context.

Conclusions and next steps

In the paragraphs related to the 6 Ps (People, Purposes, Processes, Participation, Performance and Politics) I have started to unpick the kind of information we could collect so as to inform us about engagement changes and effects for the 3 publics of our initial focus, i.e. teachers, students and school management.
In my next post I will outline some draft questions for these publics, along with some ideas for mechanisms by which the questions will be asked and responses collected. Mechanisms under consideration include SMS messages, blogs, questionnaires, etc. We will have to consider whether different questions and prompts will be necessary for those who seem to be enthused by the Juxtalearn interventions vs. those who seem indifferent. To ensure a good response rate, in combination with facilitating good quality responses, the nature of the questions and tools must not be a barrier to the publics responding, i.e. they should make responding both quick and easy. In this way, I hope that we can facilitate conversations with a high proportion of the publics about their perceptions of engagement.
I think there should be a seventh 'P': place.