Saturday, April 23, 2016
Soylent Networks
I've been playing with a new (to me) analysis tool called Gephi. It's really impressive, and it quickly showed me that students mostly look at the profile pages of students in their own year, and less often at the profiles of students in other years. Below are two different renderings of the same social network data. Apologies for the lack of a colour key, but each colour corresponds to one cohort of students. The colours differ between the two renderings, but it's actually the same cohorts in the same locations on each graph, with the newest cohort at the bottom and the oldest cohorts mixed in together at the top. (That mixing is partly an artifact of my data collection - I only captured the last year or two of those cohorts' data, and in those years students from different cohorts are more mixed together in their various hospitals.)
Next steps: look at the detail within a particular cohort, and see what effect clinical school allocations, stage of the program, and PBL tutorial group have on the data.
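The within-year viewing pattern that Gephi surfaced can also be quantified directly. A minimal sketch in plain Python, using a made-up edge list of (viewer, viewed) profile visits and an invented cohort mapping rather than my real data:

```python
# Sketch: quantify within-cohort vs cross-cohort profile viewing,
# the pattern Gephi made visible. The edge list and cohort mapping
# here are made-up stand-ins for my real page-view data.
views = [("amy", "ben"), ("amy", "cat"), ("ben", "amy"), ("cat", "dan")]
cohort = {"amy": 2014, "ben": 2014, "cat": 2014, "dan": 2013}

# An edge is "within-cohort" when viewer and viewed started the same year.
within = sum(1 for viewer, viewed in views if cohort[viewer] == cohort[viewed])
fraction_within = within / len(views)
print(fraction_within)  # 0.75 - most views stay inside the cohort
```

The same count, split per cohort, would be a starting point for the per-cohort detail mentioned in the next steps.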
Wednesday, March 9, 2016
Writing!
Very happy to be making some small amount of headway on my thesis. It's up to 13,024 words - a long way to go, but now that I'm actually writing, instead of pondering and playing with data, I can see the light at the end of the tunnel.
And being the data nerd that I am, I have a progress graph, with monthly targets for the number of words I need to have written.
See that tiny bit of red line? That's me. The progress before 03/2016 was extrapolated from my current status, but the last couple of weeks is real data.
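Being the data nerd I am, the target line itself is just linear interpolation. A minimal sketch, assuming a hypothetical 80,000-word final target and twelve months remaining (both numbers invented for illustration):

```python
# Sketch of the monthly-target line behind my progress graph: a
# straight-line interpolation from the current count to a final
# target. The 80,000-word target and 12-month window are hypothetical.
def monthly_targets(current_words, target_words, months_left):
    """Evenly spread the remaining words over the remaining months."""
    per_month = (target_words - current_words) / months_left
    return [round(current_words + per_month * m)
            for m in range(1, months_left + 1)]

targets = monthly_targets(13024, 80000, 12)
print(targets[0], targets[-1])  # first monthly goal, then the final 80000
```

Plotting those targets against the actual weekly counts gives the red-line-versus-target picture.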
Thursday, July 2, 2015
On relating to software
My previous post brought up some interesting (to me) issues around how people relate to software. By this I mean whether they perceive a software tool to be just there, a kind of immutable fact of the world, with good bits and bad bits but still something that simply exists; or whether they perceive it to be a dynamic, mutable creation of human endeavour that can be changed at will. In a sense, I think most people understand that it really is the latter if they think about it, but do they behave differently when they're using it on a day-to-day basis? When a problem crops up, do they think about the person who created the problem, and what that person did wrong to make the software behave badly, or do they think of it as a deterministic, mechanical thing they need to work around?
I'm thinking out loud here - I don't know the answers. At this stage, I'm not even sure what I'd put into a journal article search to find the research on the subject. I'm sure it has been studied by someone, somewhere. But I think it might be pertinent to my research - perhaps there was a gap in my understanding of my interview subjects. Perhaps I was expecting them to think like software developers rather than like normal people. If so, it may be that I framed my questions incorrectly, and didn't understand the answers and why the answers I was getting didn't give me the information I was interested in.
A quick straw poll of my PhD study group backs this up - I just asked what their first reactions to a software problem would be, and they answered in terms of seeking help and finding workarounds. There was no mention of any thoughts about a person or a motivation behind the software; it was just a fact that needed to be attended to.
So I think I will need to do a search of the literature to find out what has been written about the differences between how tool-makers and tool-users relate to their tools, and then try to figure out how that has affected my interview outcomes.
Interview thoughts
I have (finally) finished coding my interviews in NVivo. It's a nifty system, but painfully slow, even on my quad-core i7 iMac. The outcomes were interesting.
I got a lot of feedback on the use of Facebook (as I mentioned previously). Interestingly, many of my interviewees were not big Facebook users, and a couple didn't use it for anything except their studies. One had created the Facebook account specifically to join these groups. I think this was partly an artifact of my sampling process - many of those I interviewed were those who had posted a lot of resources into my system, so they were the ones who hadn't instinctively gravitated towards Facebook as a first choice. There were mixed feelings about Facebook - some were very positive about what was happening there, seeing a lot of valuable resource sharing going on, and others annoyed by the amount of noise that was unrelated to their study needs. Several expressed disappointment about the quality of the engagement - that students were far too exam-focused and weren't interested in useful medical knowledge that was related to the curriculum but not assessable.
Regarding the use of my tools, the major complaint I got was that the resource sharing was too fine-grained. Linking resources to Learning Objectives meant that it was hard to get an overview of the resources available, or to see new ones as they were posted (Facebook, on the other hand, notifies you when there is a new post in a group you belong to). A couple mentioned that it might be better to link resources to the week, rather than to one of the 20 objectives in the week. There were a lot of other complaints, but they were largely related to the curriculum, not the system itself. I was a bit disappointed by the lack of usable suggestions for improvement, but in retrospect that's not surprising; the students hadn't been thinking as hard about my system as I had, and to them it was just another feature of the system, something floating out there in the ether, rather than a concrete set of tools that could be modified. As a software developer, I find this perspective hard to understand. I naturally look at each tool in my possession with a critical eye, and almost instinctively note the ways in which it could be better. I'm probably the kind of customer software companies hate, as I'm regularly contacting them to tell them how they could do it better. There is probably a whole field of research out there on how people relate to software and other tools.
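One cheap way to address the granularity complaint would be to derive a week-level view from the existing objective-level links, so nothing needs re-tagging. A rough sketch, with a hypothetical objective-to-week mapping and invented resource records:

```python
# Sketch: derive a week-level view of shared resources from
# objective-level links, so nothing needs re-tagging. The mapping
# and resource records are hypothetical examples, not my real schema.
objective_week = {"LO-3.1": 3, "LO-3.2": 3, "LO-4.7": 4}
resources = [
    {"title": "ECG basics video", "objective": "LO-3.1"},
    {"title": "Renal physiology notes", "objective": "LO-4.7"},
    {"title": "Heart sounds quiz", "objective": "LO-3.2"},
]

def resources_for_week(week):
    """All resources whose learning objective falls in the given week."""
    return [r["title"] for r in resources
            if objective_week[r["objective"]] == week]

print(resources_for_week(3))  # ['ECG basics video', 'Heart sounds quiz']
```

A week-level listing like this could also feed a "new this week" notification, the thing students miss from Facebook.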
In the first interview, I realized that a major motivator for students to participate in my interviews was that they wanted to get feedback through to the Faculty. I proceeded to tell each of them that I would be collating feedback and giving it to the Faculty. There ended up being a lot of feedback, some quite blistering, about the quality of the course. Each student said that overall, the experience was good, but each found significant problems with the course and the culture it promoted among students. My next step is to sort this feedback into useful groups and pass it to the head of the program.
The interviews have turned out to be extremely useful, and I probably should have conducted them a year earlier - they led to some real insights into what is happening and why it's happening, and what steps I should take next in my research.
Tuesday, April 7, 2015
What should we do about non-institutional learning environments?
I'm sitting in a group study room with a group of fellow research students, having a "Shut up and write" session. This kind of study room is a relative novelty at my University - when I studied for my undergraduate degree, there were no bookable group-study rooms (we would have made a lot of use of them, I can assure you). Our University seems to have realized that if it is handing out group assignments, it needs to ensure that students have the physical spaces in which to do them. My experience is that there aren't enough of these spaces - it's extremely hard to book a room for group study. As soon as the room bookings are made available each week, they all disappear within an hour or two.
And yet, the opposite trend seems to be happening in the online world. Ten years ago, if students in a non-IT course were asked to collaborate online, it would need to happen in a University provided space, simply because there were no other spaces known to the students in which to study. But now, students have a huge variety of online communication tools for both group and one-to-one communication. They are naturally going to these spaces rather than the University provided online spaces. I don't have the research with me now to back this claim, but it would seem that these other online spaces are (a) less restrictive than the University provided spaces, (b) allow for a wider range of types of communication (there isn't the institutional pressure to stay on topic), (c) give students greater control over how they communicate (research shows that autonomy is a big motivator), and (d) are just a better user experience - easier to use, more pleasant to look at, and simply more functional.
So what are we (as institutions, and as a community of educators) to do about this? Is this a problem? There does seem to be a knee-jerk reaction in some corners to insist that students ought to use the institutional LMS, because
- it's designed for learning: a lot of educators and researchers have spent a lot of time optimizing the theory and praxis of collaborating in discussion forums, and so it must be better,
- privacy is important, and who knows what will be visible to the world when students use tools the University doesn't know about,
- we can't control the activity, or assess it, or even know that it's happening, and
- we can't ensure that all students have equal access to the space if we don't control it.
It is probably safe to say that a space in which students have more autonomy, and which has a much more intuitive and friendly interface, is one which will enable higher quality learning. So if our goal is high quality learning then we should be encouraging students to find the best spaces in which to learn. We still need to worry about privacy and equity and those things, but they are more easily addressed. There are a range of things we can do:
- We should teach our students about how to learn, and what constitutes good learning. I wasn't taught this until I started an education PhD, but if students are taught this earlier, then they can evaluate the spaces available to them with an informed eye, and make good decisions.
- We should educate our students about privacy, as part of the professional ethics education they already receive. Don't bad-mouth patients, but especially don't bad-mouth them in a public or semi-public space. Or even a private group space where the bad-mouthing is recorded, and a screenshot could be made public by a disgruntled group member.
- We should ask students to register the spaces they are using for their courses, so that other students in the course can find them. This doesn't mean that teachers need to be given access, only that fellow students can find the spaces without relying on word of mouth. This prevents students who aren't as sociable as others from missing out on the group discussion activities. Assigning students to study groups can also aid in ensuring equitable access.
Monday, March 23, 2015
PhD Pivot time
The big outcome from my interviews has been the realization that the subjects of my study are wedded to Facebook and that there's not much I can do about that. Each interview participant has come back saying they are using Facebook because everyone else is using it, and aren't using my tools because no-one else is. Several of my interview participants were selected specifically because of their posting frequency on my system - each posted a number of times, then stopped. The interviews revealed that they stopped because they discovered that everyone else was in Facebook, so if they wanted their posts to be seen and responded to, they needed to go there. And if they wanted to see what everyone else was sharing, then Facebook was the place to be. Perhaps in a different time, with a different cohort of students, this wouldn't be the case; but where I'm conducting my study, it's inescapable.
This result changes what I can do with regard to my study. The evidence is saying that regardless of what I do, there is no incremental improvement I can make to my system that will tip the balance - the network effects of students who are familiar with Facebook drawing the rest of the cohort into Facebook are overwhelming. This is an interesting development in itself - it says a lot about the relative importance of network effects versus feature improvements. It also says a lot about my original research questions: that to be an effective social network for learning, the space needs to be (a) private, (b) dedicated to learning, and (c) linked to curriculum.
The way the students are using Facebook indicates that (a) and (b) still apply, to some extent. Students are using a combination of "closed" (anyone can see the group, but you need approval to join) and "secret" (you need to be specifically invited to join) Facebook groups, so they are protecting their privacy in that way. In one sense, these groups are more private than the ones I can offer, since mine will always be vulnerable to Faculty oversight, whereas Facebook is unlikely to care about the contents of these groups except to algorithmically deliver advertising to the participants. These groups are dedicated to curriculum, so my earlier thinking - that students don't want friends and family exposed to this study content - still holds to some extent. However, the linking to curriculum isn't there at all. Several of my interviewees said that linking to curriculum would be valuable, but in the end being in the same place as the other students was more important.
Which leads me to the conclusion that it may make sense to accept that I'm beaten, and start working with the enemy. I've started exploring the Facebook APIs, to see what is available. The features I'd like to be able to offer students include:
- Linking groups - students should be able to connect a group in the Medical Program to a Facebook group, so that links from one to the other can be created. The Medical Program's Portal can then give students links to the Facebook sites relevant to their groups.
- Curriculum linking - it would be good to give students the ability to easily link content shared in Facebook with curriculum within the Medical Program. How to do this is unclear - maybe a sharing tool within the Medical Program that pushes content to the right Facebook group? Each of the tools Facebook provides seems limited in some way that makes this difficult.
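As a first experiment, simply reading a linked group's recent posts should be possible. A sketch of what that might look like - the group id and token are placeholders, and the endpoint and fields reflect my reading of the Graph API documentation rather than tested code:

```python
# Sketch: pull recent posts from a linked Facebook group via the
# Graph API feed endpoint. The group id and token are placeholders,
# and the endpoint/fields are my reading of the docs, not tested code.
import json
from urllib.parse import urlencode
from urllib.request import urlopen

GRAPH = "https://graph.facebook.com/v2.2"

def feed_url(group_id, token, limit=10):
    """Build the URL for a group's most recent posts."""
    query = urlencode({"access_token": token, "limit": limit,
                       "fields": "message,from,created_time"})
    return f"{GRAPH}/{group_id}/feed?{query}"

def fetch_feed(group_id, token):
    """Fetch and decode the group feed (requires a valid token)."""
    with urlopen(feed_url(group_id, token)) as resp:
        return json.load(resp)["data"]

print(feed_url("1234567890", "PLACEHOLDER_TOKEN", limit=5))
```

If that works, the Portal could surface "latest posts in your group" next to the linked curriculum, which is the group-linking feature above.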
There are wider philosophical issues that come into play here, about what institutions should be doing and providing for their students, which I will cover in another post.
Friday, December 5, 2014
Interview progress
I'm well into my interviews - I have now interviewed five students about their use of my system and other social networks. I'm getting an interesting range of perspectives from students about how they use these tools and what they'd like to see from them.
Firstly, it's now very clear to me that the student body I'm studying is heavily invested in Facebook. All participants use Facebook for their studies, and no other social network. A couple had a presence on Academia.edu and ResearchGate, but there was no Twitter, Tumblr, Instagram, or other social networks in use, particularly not for their studies. Facebook has a clear first-mover advantage over any system I develop: the students all have Facebook accounts before they come to this degree, so it's natural to them to set up Facebook study groups. They are setting up whole-of-year and tutorial group Facebook groups, but also study groups of varying sizes and success rates. One very active study group has 260 members - nearly the whole cohort. From the way these groups are entrenched in the students' study lives, it would be very difficult to switch students over to a new system - they would need to be steered to use the new system from the beginning of first year, probably by academics warning them of the dangers of Facebook. But their patterns of Facebook usage have also indicated to me some features that are missing from the system as it stands - particularly, the ability to post to groups, rather than to the whole cohort. Interestingly, that facility was in my original design plan, but I abandoned it after deciding that I needed to maximize the exposure of each item to students in order to increase cross-cohort collaboration.
The students also had some interesting ideas about changes that would improve the system. A common thread was the request for more flexibility in their self-presentation on the profile pages. Currently, the pages allow them to present a small amount of information about their interests and previous degrees, but suggestions for additions (particularly from one participant) included travel plans, favourite books, medical specialty and placement interests. They would like to be able to better present themselves as professionals. Another common thread was that it would be useful to have less rigidity in how the resources were attached to the curriculum: rather than exclusively binding the resources to learning objectives, it would be useful to bind to other levels of the curriculum; and to search and be able to tag the resources.
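The tagging-and-search suggestion would be straightforward to prototype. A minimal sketch with invented resources and tags, returning a resource only when it carries every requested tag:

```python
# Sketch: free tagging plus tag search over shared resources, as the
# students suggested. The records and tag names are invented examples.
resources = [
    {"title": "Anki deck - cardiology", "tags": {"cardiology", "flashcards"}},
    {"title": "Murmur audio library", "tags": {"cardiology", "audio"}},
    {"title": "OSCE checklist", "tags": {"exams"}},
]

def search_by_tags(wanted):
    """Resources carrying every requested tag (subset match)."""
    wanted = set(wanted)
    return [r["title"] for r in resources if wanted <= r["tags"]]

print(search_by_tags({"cardiology"}))           # both cardiology resources
print(search_by_tags({"cardiology", "audio"}))  # ['Murmur audio library']
```

Tags could coexist with the learning-objective links, giving the looser binding to the curriculum that students asked for.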
The students have generally agreed that an embedded system made sense, and would be preferable to Facebook if done right, but generally pointed out that it would be difficult to do. It wasn't clear to me whether they were sincere or were just trying not to hurt my feelings - as they all knew I was the architect of the system and had a vested interest in it succeeding. But they did understand the reasoning behind linking to the curriculum, as well as having a protected space. The issue of copyright was one that several noted - they mentioned that one frequently shared item on Facebook is textbooks, and that they wouldn't be comfortable sharing those in a University controlled space.
Most students had a reasonable amount of awareness of the available functionality. But when discussing it, a few things stood out. Many didn't realize that the rating tools existed - that they could collaboratively curate the resources uploaded by their peers. And none of them were aware that staff can't see the shared resources. I had announced this early on, and I'm fairly sure that I repeated my announcement, but none of them were aware. This wasn't one of my scripted interview questions - I found out when students stated that one reason for their wariness of these tools was that staff could see what they were doing.
Lastly, I realized very quickly that after these interviews I would need to prepare some feedback to the Faculty about the students' experiences in the degree. One student stated that one of the reasons she was so keen to participate in the interviews was that it would give her a way to feed back to the University (side note: I am also a staff member, though not in the Faculty I am studying). It seems to me that it would be an ethical breach not to pass this feedback on, so in the later interviews I informed students that I would be collating the feedback relevant to the teaching of the program and giving it to the Faculty. Quite a bit of it is relevant to my project - particularly the students' descriptions of the way assessment drives their learning behaviour, limiting their interest in sharing, and in reading outside the scope of what will be assessed. It's clear that a culture has developed within the student body, driven inadvertently by the Faculty, of learning only what will be on the exam, rather than developing a deep and broad understanding of Medicine and how it fits into our world.
These have been very informative interviews; I'm planning to interview two or three more students, but I already have a collection of things I can act on, and with luck be able to deliver to students before next year's intake starts their studies.