Using social media to measure passenger satisfaction with their train company
One of the key applications of social media analysis and measurement, and one that we have explored through a number of posts on this blog, is customer experience.
Through a pilot project with Transport Focus, whose findings were recently published, we further explored the possibilities social media offers for deriving customer insight. We wanted to share some of the key findings on the occasion of AMEC Measurement Month (#AMECMM).
Transport Focus was keen to understand the value of using metrics derived from social media conversations alongside traditional customer satisfaction survey results to identify potential correlations and future applications.
Some of the key questions that Transport Focus was looking to answer resonated with other pilots we have conducted in the past.
- How do the results of the National Rail Passenger Survey (NRPS) around Train Operating Companies (TOCs) compare with those derived from the analysis of social media posts?
- What are the key differences and how can they be explained?
- What are the key drivers of trust and satisfaction as derived from social media analysis?
- Can automated sentiment and topics offer any insights around customer satisfaction?
Compare and contrast
Broadly speaking, social metrics were in line with the NRPS but there were some key differences. For example, Arriva performed well on the NRPS but poorly on social media. This could be explained by the higher proportion of younger passengers (as identified through the profiling exercise of social media users we conducted), who are more active on social media and may be more inclined to use it as a platform for complaining or reporting issues.
Interestingly, the language and tone used by passengers of Southeastern, the worst performing company on both the NRPS and social media metrics, alongside Arriva Trains Wales (Arriva TW), was more aggressive and vitriolic compared to other TOCs, with passengers appearing to have lost trust in the service and the brand.
Matching their higher levels of satisfaction on both the NRPS and social media, Chiltern Railways and Virgin Trains saw their passengers share more positive experiences, often thanking and praising these operators for their service, experience and responsiveness, indicating a decidedly more positive relationship.
A question of trust
We have previously highlighted some examples of our work around trust on this blog and took our work further through the study by looking at drivers of trust based on social media conversations.
Cleanliness and room in carriages were the key drivers of negative mentions across all seven TOCs included in the study.
Interestingly, levels of trust were significantly lower when passengers complained about the brand than when they complained about the service alone.
Punctuality and reliability were among the key drivers that most negatively affected trust in TOC brands. Crowded trains also led passengers to believe that TOCs do not have their well-being in mind.
Manual vs. automated
One of the objectives of this pilot was to test the accuracy of automated sentiment coding. We have talked at length about the human touch in the analysis of social media mentions. Transport Focus required insight on the accuracy and potential application of automated sentiment. Our manual validation confirmed that tools are currently not able to interpret sarcasm, accurately categorise topics or provide context and insight. It also confirmed that automated sentiment was accurate 50-60 per cent of the time, a statistic in line with many of our previous comparisons. We also concluded, as we have in previous posts, that sentiment alone may not be a relevant or useful metric for measuring customer experience.
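The validation step described above can be sketched as a simple label comparison. This is an illustrative sketch only, not the pilot's actual method or data: the labels below are hypothetical, standing in for a sample of mentions coded manually by a researcher and again by an automated tool.

```python
# Hypothetical labels: manual (researcher) vs automated (tool) sentiment
# for a small sample of mentions. Values are illustrative only.
manual = ["neg", "neg", "pos", "neu", "pos", "neg", "neu", "pos", "neg", "neg"]
automated = ["neg", "pos", "pos", "neg", "pos", "neu", "neu", "pos", "neg", "pos"]

# Accuracy: the share of mentions where the automated label
# matches the manual one.
matches = sum(m == a for m, a in zip(manual, automated))
accuracy = matches / len(manual)
print(f"Automated sentiment accuracy: {accuracy:.0%}")  # → 60%
```

On this toy sample the tool agrees with the manual coder 60 per cent of the time, within the 50-60 per cent range the pilot observed.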
What has Transport Focus learnt?
There is significant value in using social media to benchmark and gain insights into customer experience and satisfaction. It can be used as a more regular barometer to understand and benchmark passenger opinions, in-between or during established surveys such as the NRPS. Key metrics such as levels of satisfaction and net sentiment taken from samples of passenger comments, coded manually through a researcher’s lens, can help identify new insights.
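Net sentiment, one of the key metrics mentioned above, is commonly calculated as the share of positive mentions minus the share of negative ones. A minimal sketch, with hypothetical counts:

```python
# Hypothetical counts of manually coded passenger mentions.
counts = {"pos": 120, "neg": 300, "neu": 80}

# Net sentiment: (positive - negative) as a share of all coded mentions.
total = sum(counts.values())
net_sentiment = (counts["pos"] - counts["neg"]) / total
print(f"Net sentiment: {net_sentiment:+.0%}")  # → -36%
```

A negative score like this would indicate that complaints outweigh praise in the coded sample; tracking the score between survey waves is what allows it to act as a regular barometer.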
The fact that insight is drawn from passengers' unprompted, 'in the moment' experiences also provides a different and complementary view to survey data. It adds invaluable colour and context which is often missing from survey verbatims.