"Are you engaged?" How do we measure user engagement on CommonGround?
Recently, our team at King’s College London has been thinking about how we can cultivate an engaged community on our online peer support platform, CommonGround. We have dedicated so much time to co-producing CommonGround with software developers and people with lived experience that we want to make the community as vibrant and active as possible.
In our previous blog, we explored what an engaged community means, and our plan to reduce the chances of CommonGround becoming a ‘tumbleweed ghost town’. We also discussed how our definition of engagement differs from the traditional view that “the more time spent logged in the better”. We have defined engagement as “any measurable activity that a user undertakes on CommonGround, including reading, reacting to, and responding to posts, as well as creating new content, editing their profile, and following other users”. We will therefore need to measure engagement in a way that reflects this broad definition, and so in this blog we explore the question: how should we measure the engagement of CommonGround community members?
A simplistic view of measuring engagement?
Often a user’s engagement is determined from their usage metrics, considering the number of logins and the total time logged in, alongside the volume of other actions such as ‘liking’ posts. This approach assumes that a highly engaged user is someone who logs in frequently and for significant periods of time. Moreover, if two users log in for an equal period of time, the user who is ‘liking’ lots of content is considered more engaged than the user who is simply scrolling through the content.
For our research trial, capturing these usage metrics will help us understand which features of CommonGround are used the most and the least, and how often users undertake different activities. This will provide insight into patterns of user activity – it might tell us that we need to raise awareness of under-used features, perhaps by creating ‘how to’ tutorials, or through a post from the engagement team drawing attention to the feature. These metrics might also highlight where we could prioritise future development of the platform; for example, if people use the 'reactions' feature often, perhaps we should consider expanding the range of reactions that we offer. However, these usage metrics do not capture users’ experiences or the motivations behind their activity. High use of reactions, for instance, could reflect different motivations: users opting for quicker or less taxing forms of engagement, or alternatively some anxiety or apprehension about creating their own original comment.
Therefore, these usage metrics in isolation will not paint a picture of how engaged our users are. Painting the full picture is essential because we know that how a user feels while engaging with CommonGround is extremely important, and that users could experience real benefits from using CommonGround in a way that might appear relatively disengaged when judged on the volume of activity alone. How are we supposed to understand whether a platform like CommonGround is liked, wanted, and needed if we do not fully understand user engagement?
The need to capture how users feel
To unpack how users feel about using CommonGround, we plan to explore their specific experiences, preferences, feelings, and any potential struggles – and how all of these vary over their time in the community. To gain these insights, some users will be interviewed after they have used CommonGround for three months. To evaluate our engagement plan, we will also delve into their opinions of the posts and interactions from the engagement team, and how they felt its presence influenced both themselves and the wider community. Our interviews will help us understand the different patterns of engagement and user experience on CommonGround, including those that users find beneficial and those they find unpleasant or potentially harmful. For instance, we could have users who log in fortnightly; on the analysis of their usage metrics alone, they would be considered disengaged users due to their infrequent activity, yet in reality each fortnightly login may have been a positive experience. It is also possible that someone who logs in and posts daily may fall into a cycle of comparing their symptoms, as they are consuming content about other people’s conditions at a high rate and volume. In this case, a user who appears highly engaged on usage metrics alone may be experiencing negative effects as a result of their engagement pattern. Understanding the nuance of user experience, and how people feel while using CommonGround, is therefore essential.
Beyond the individual user
We also know that our users are not using the platform in isolation, but rather are engaging as part of a community. It is therefore important for us to understand the sense of community on CommonGround: what are users’ feelings of membership, identity, influence, and attachment to one another? There are a couple of questionnaires that we can use to capture how people feel about their participation in the online community – these include rating how much they agree with statements such as “I feel at home in this group” and “the CommonGround platform feels like a community to me”. Understanding the user experience as both an individual and a member of the community will help shed light on engagement on CommonGround. We will also be able to consider any differences between individual experience and sense of community membership. For instance, what might it mean if people feel a strong sense of community but tell us that they did not feel satisfied by their individual engagement with the platform?
Reflecting on the complexity of engagement
While no single measure can paint the whole picture of engagement, we hope that our combination of traditional usage metrics, ratings of the sense of community, and explorations of users’ individual experiences will together build that picture.
We hope to begin to shed light on the different types of engagement in online health communities and what may foster or hinder the development of a community. We will also be able to evaluate whether our approach to engagement is liked by our users, and whether it is possible for our engagement team to successfully implement the engagement plan. Our findings will help us refine our definition of engagement and our plan for encouraging it, both for future CommonGround trials and potentially for the development of online health communities more generally. We are excited to see what we will learn when we observe the community and explore their experiences. Most importantly, however, we hope to see CommonGround becoming a place for people with long-term conditions to learn, share, support, and grow.