The 7 must-knows of internationalisation data

This week on the EAIE blog, we are covering the summer issue of Forum magazine, which is dedicated to data in international education. We look at how data are collected and used by practitioners, and how they could be. Today’s blog post, the first of the week, looks at the seven main challenges that professionals in internationalisation should be aware of when working with data.

After drafting and implementing two Strategic Internationalisation Plans (SIPs), participating in the IMPI Project (Indicators for Mapping and Profiling Internationalisation) and taking part in the process of agreeing upon and approving a set of common indicators amongst Catalan universities, the Universitat Rovira i Virgili is in a good position to talk about the challenges posed by working with data. We may not have any magical solutions, but we know plenty about the challenges.
 
It is easy to say that we all need a good set of data to measure, be accountable and advocate for the internationalisation cause, both within our institutions and externally, to governments and ranking agencies. But what about the challenges? I have tried to summarise our experience in the 7 must-knows of internationalisation data. If reading this helps even one of you avoid a mistake or two, it will have been worth it!

  1. Experts at interpreting data (from our own perspective, of course!)

When we talk numbers, all of a sudden we all become experts. Throwing a percentage here and there can help us defend our case. However, it is not the indicator per se that is important, but the context and its interpretation. A university where 50% of the students are international can seem like a great achievement. Knowing that this university sits right at the border and that most of those international students come from only 10 km away does change the way we interpret the percentage, doesn’t it?

  2. Challenges of data collection

Data collection challenges start with identifying who owns the data, when they are collected, how they are processed and what a certain indicator exactly means. For instance, at a meeting during the IMPI Project, six universities spent a full morning discussing what an international student is: someone who holds a foreign passport? A foreign degree? A full-degree student with a different nationality? A mobility student? After a full morning of discussion with no agreement reached, the conclusion was that indicators are truly tricky and that, more often than not, we are comparing apples with oranges without even knowing it!

  3. Where are the magical data collection mechanisms?

Establishing mechanisms to gather the data can also be a significant challenge. If there is no incentive or clear mandate to provide the data, gathering the information will most probably not be an easy task. Why should I provide the data? What do I get in return? Linking funding to the provision of data is undeniably a very efficient way to receive these data. In 2001, the Catalan Government created a set of indicators that serves as the basis for the funding it gives to public universities, and many governments resort to this approach.

  4. What about challenges in using the data?

Internally, having a SIP may be a good way to think strategically about the goals that we want to achieve, the operational objectives needed to achieve them, the activities to develop these objectives and the indicators that can measure the results. Nevertheless, a Strategic Internationalisation Process can test anyone’s patience and endurance! Dealing with its indicators is one of the main difficulties.
 
While gathering the starting data for the SIP was not an easy task, and studying past trends in order to set the goal for each indicator was a difficult exercise, the biggest difficulty came halfway through the implementation of the plan, when we realised that the definition of certain indicators was not clear: we were not sure what to count within each indicator, and the internal source that should provide it had not been clearly identified. That is when we created a standard form per indicator containing all the relevant information, thereby agreeing on all aspects of each indicator of the SIP and making it possible to implement and evaluate them properly. In this standard form, we included: the definition, the source, the formula used to obtain the data (if needed), the reference period (academic or calendar year), the original data at the beginning of the SIP and the expected Key Performance Indicator (KPI) to attain at the end. If only we had known how to do this from the beginning, it would have saved us a lot of headaches!
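To make this concrete, here is a minimal sketch of what such a standard indicator form could look like if captured as a simple data structure. The field names follow the concepts listed above; the class name and the example values are hypothetical, not the actual URV form.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class IndicatorForm:
    """One standard form per SIP indicator (illustrative sketch)."""
    name: str                 # short label for the indicator
    definition: str           # what exactly is counted
    source: str               # internal unit or system that provides the data
    formula: Optional[str]    # how the figure is derived, if a calculation is needed
    reference_period: str     # "academic year" or "calendar year"
    baseline: float           # original value at the start of the SIP
    target_kpi: float         # expected value to attain at the end of the SIP

# A hypothetical entry for one indicator:
incoming_exchange = IndicatorForm(
    name="Incoming exchange students",
    definition="Students from partner institutions spending at least one semester with us",
    source="International Relations Office registry",
    formula=None,  # a direct head count, so no calculation is needed
    reference_period="academic year",
    baseline=250,
    target_kpi=400,
)
```

Whatever the format, the point is that every indicator carries its definition, its source and its targets with it, so that everyone counts the same thing in the same way.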

  5. Data, data and more data requests

Universities receive constant requests for data, both from within the institution and from outside (governments, agencies, studies, etc). Nevertheless, in this field one purpose does not fit all: for each purpose and each request, the data must be presented in a different way, must include different concepts and are sometimes even provided by different units within our institutions. This makes responding not only challenging but also very labour-intensive.

  6. Data as a means or a goal?

In all of this, one important thing to consider is that, when we have to gather and work with data, we tend to forget that data are not a goal in themselves but a means: a powerful means if contextualised, interpreted and used properly, of course, but not a goal.

  7. Quantitative versus qualitative?

Quantitative data are generally easier to define, but let us not forget the quantity-versus-quality debate that we all know well enough. Do we count only what can be counted, or do we really count what truly counts? Or even: can everything that counts be counted?
 
In conclusion, I can only wish all of us good luck! After all, in higher education institutions we are bound to keep referring to data for everything and nothing, in order to prove ourselves through the ‘scientific method’ that our institutions so proudly advocate.
 
EAIE members will receive their copies of Forum on their doorsteps soon, but can already download the full version online. Non-members can view the editor’s pick in the Resources Library. Gain full access to Forum by becoming an EAIE member.

Marina Casals Sala
Universitat Rovira i Virgili, Spain

Marina Casals Sala is Director of International Relations at Universitat Rovira i Virgili. She is also a previous member of both the EAIE Board and General Council, in addition to being a trainer with the EAIE Academy.