“parliamentarians and the administration may…have different understandings about the nature and the extent of services to be offered by the research service. Establishing a Service Charter may help to reduce misunderstandings by formally stating the scope of services available”
Guidelines, p. 24
The Guidelines define a service charter as
“a document that states what the research service will commit to provide to its parliamentary clientele. It includes the parameters that will guide how the products and services will be provided. Typically, it would spell out the services offered, who can access these services and the manner in which they are delivered…[Elements of a charter] include operating principles and descriptions of what is – or is not – within the scope of the research service. From this charter, it should be clear that: • the research service is offered to support parliamentary duties only; • there is no support for the educational or academic pursuit of a parliamentarian (or a member of his or her family or staff); and • personal financial, medical or legal advice will not be provided”
Guidelines, p. 24
It is not yet clear how widespread the use of ‘service charters’ is. Some services instead have ‘rules’, formally defined by their institution, and these may well incorporate at least some of the recommended elements of a service charter. Publicly available examples of charters and rules are accessible via the links below.
Examples of research service charters and rules
These are available in their original language. (NB Google Translate can provide automatic translations, even of complete files.)
European Parliament – rules of the members’ research service (in English)
Greece – rules of procedure of the scientific service of the parliament (in Greek)
United Kingdom – library & research statement of services (in English)
Lithuania – formal procedure for requesting work by the research service (in Lithuanian)
New Zealand – library and research statement of services (in English)
Thanks to all the colleagues who provided the examples above. Further examples are welcome! Thanks also to other colleagues who provided feedback on the topic.
A summary of practice on service charters
What are the reasons for having a service charter?
Reduce risk of misunderstanding of what the service will do, for whom, for which purposes
Spell out what it will not do
Set service standards and manage expectations
Clarify the priorities for service
Promote the service and its specific products
Guide staff and management, and offer them some protection
So, while the most obvious purposes are to demonstrate the quality of the service and to manage the expectations of clients, there can be additional and less explicit intentions: (a) to set standards for staff performance; (b) to manage staff behaviour, reducing the risk of over-delivery as well as under-delivery; (c) to protect staff and the service from excessive and inappropriate demands.
It is worth noting that at least one service went down the road of a service charter only to find their first version too detailed, difficult to apply consistently and a cause of inflexibility. They stepped back to a more aspirational (values-based) approach – more strategic and less operational in its promise to clients. There is also the risk that, if promised standards are not consistently achieved, client satisfaction and service reputation might even be damaged by a charter. Adopting a service charter is not an automatic and risk-free step: alongside the undoubted benefits there is a potential downside. One mitigation is to be conservative in the promise to clients: better to promise less and deliver more than the reverse!
What form of service charter?
The main options in use are:
‘Rules’, formally approved by the parliamentary authority. This gives the document force but it may be a slow process to get initial agreement and to make necessary updates later. There is always the risk of drafts being modified without full awareness of the operational consequences.
Guidelines set by the research service and translated into a communication to clients. This has the advantage of speed (to introduce and to revise) and coherence with service management – what is promised should be deliverable. But it lacks the force of Member decision.
Combination – Main principles in a formal ‘Rules’ but operational standards kept as service guidelines for flexibility.
Inspirational – A statement of the service’s purpose and commitment to certain values and to service quality. This might stand alone or might complement any of the three options above.
Which option to choose?
The choice of option depends in part on the culture of the parliament concerned. In some, a set of ‘rules’ is the only format that will be acceptable, in others there is more flexibility.
It also depends on why the service wants a charter, for example:
To set standards for staff
To make it easier to say ‘no’ to some requests
To promote your service and improve its image
How much flexibility you need
Do you in any case need to review and define your services, products and standards?
What goes into a service charter?
The list of potential inclusions is long. These are some of the main areas typically covered:
Why – The reason the service exists – mission / vision / values / the essence of what it offers Members and other clients. Elements such as independence, impartiality, professionalism, confidentiality, and relevance to Members, are likely to be prominent.
What the service offers – the content, the different products and services
Who the service is for
How the service is accessed, requested and delivered
When the service is available and how quickly it will respond
Exclusions, limits, priorities – services that are not offered; request types that are deemed inappropriate; limits to effort in response to requests; types of request and client that will take priority, especially in times of high demand
Commitments, e.g. client satisfaction, confidentiality; impartiality; ethics; quality of content, personnel and processes
What is expected of the client – a sensitive point, possibly, but it might be worth setting some limits on how clients use research and how they behave in relation to staff and to other clients.
The following methods have been identified from professional experience and from detailed returns by several parliamentary research services. The original list of 40 items used in the ongoing survey is being added to; additions are marked ‘new’ in the list below.
Further information, including results from the survey, will be added progressively, including pages on particular methods.
Intranet site/pages dedicated to the research service
The concept of a ‘research week’ was pioneered in parliamentary research services by Uganda in 2016. It has since been taken up by the parliaments of Ghana, Zimbabwe and the United Kingdom. The generic concept of ‘research week’ was not, however, invented in Uganda or for parliaments but creatively borrowed from the wider world of science communications (‘SciComm’).
According to the ‘White Book on Science Communications Events in Europe’, a ‘research week’ in SciComm might typically be used to:
promote awareness of science research activities to a wider public;
encourage dialogue between science and society;
foster a ‘scientific culture’ – public understanding of science;
develop interest in scientific careers.
It is generally intended to be a positive marketing effort rather than an occasion for critical scrutiny and debate.
The ‘White Book’ [PDF download], is a 2005 overview by EUSCEA – the European Science Engagement Association – of SciComm events which were then taking off across Europe. Today, ‘research weeks’ are widespread around the world, held by universities, academic societies, cities, regions – and parliaments.
The primary purposes of a ‘research week’ in a parliamentary context appear to be to connect the national research community with Members, to raise the profile of scientific evidence in policymaking and to ease the transmission of scientific content to the policy process. It is not ostensibly about raising the profile of the parliamentary research service with Members, even if in the Ugandan case, at least, this was the primary motivation. This post will consider ‘research weeks’ in terms of their impact with Members and on the profile of the parliamentary research service – rather than the effect on relations with the scientific community.
The Ugandan experience
Based on publicly available documents and on interviews with John Bagonza, Director of the Research Service, Parliament of Uganda and Emily Hayter, Programme Manager for INASP. Thanks to John and Emily for their assistance. Any errors or omissions are my responsibility.
The parliamentary research service in Uganda had concerns about its profile with Members and tried various measures to improve it, all of which failed. These included displays in the library and in the parliament restaurant, presentations to Members, and individual meetings with Members and staff. The service still had ‘just a few clients’ and many Members did not know about the service or that it was free to use. The service also felt it was relatively unknown to the wider research community and even to government departments.
The service discovered the concept of a ‘research week’ and wanted to use it as a big breakthrough event to raise their profile, but were unable to proceed until offered external financial support through the VakaYiko project. (The project was UK government funded and implemented by a consortium led by INASP – the International Network for Availability of Scientific Publications – a UK-based NGO.)
The first parliamentary Research Week was held on 23-26 August 2016, with the theme “Using evidence to strengthen Parliament”. The aim of the research week was to showcase the research products provided to Members of Parliament (MPs) by the Department of Research Services (DRS), and also to present other research organizations and academia as complementary sources of evidence for policy. For its first three days there were information stalls and presentations just outside the entrance to the Parliament of Uganda; it concluded on the fourth day with an off-site symposium. The event was organised jointly by DRS, the Uganda National Academy of Sciences (UNAS) and INASP.
The key targets for the research week were the MPs and staff members of parliament, and they were reached in significant numbers – 242 Members (of 427 at the time) and more than 100 staffers. High profile visitors included the Prime Minister and the Deputy Speaker as well as other office-holders of parliament. The event also attracted around 50 students and members of the general public.
Eighteen research organisations were represented at the event.
The DRS produced a report on the event in November 2016 (three months after it took place) and a paper for a meeting in London in April 2017.
The event generated 81 new research requests from Members, and the level of research requests broadly increased throughout the remainder of the parliamentary term, as reflected in the graph of outputs below. The relative decline in the most recent two years can be ascribed to the pandemic.
The profile of the research service was raised with Members and also with the parliamentary administration. The notion of using evidence from external research bodies was publicised and those bodies increased their understanding of what Members required and of how to cooperate with the parliamentary research service. The various institutions are reported to have had more contact after the event.
The research service had hoped that the success of the event would lead to additional parliamentary funding to continue running ‘research weeks’, possibly even on an annual basis. That funding has not materialised and there have been no further research weeks held. The new parliament elected in 2021 has 60% new Members and the research service is again facing a problem with its profile.
The research service did obtain a ‘small’ increase in its regular budget and it found it easier to obtain support from international collaborators after the event.
INASP brought in people from research services in Ghana and Zimbabwe to observe the exercise. Both parliaments went on to hold some form of ‘research week’.
Organising the event
The research service had virtually all 35 staff working full-time for two months to prepare the event, although the total staff input was not measured. The Director feels this timeline was very short and that 3-4 months would be optimal. The time was needed not only to organise the event but also to prepare content, including audio and video, as promotional material. In addition, the external institutions needed much more time than they were given to adapt their content to a parliamentary audience. They had to learn how to work in a parliamentary context, just as the parliament had to learn how to work with them. There were issues of unmet expectations and the consequences of a crowded programme. Seven of the 25 expected institutions did not show up, in part apparently because of these issues. Overall, though, there was enthusiasm to be involved and to be given access to parliament.
The conception of the event and its realisation were mostly the work of the parliamentary research service; INASP provided only some review of plans and technical advice. It is an event that could, then, in principle be managed by a research service with sufficient staff and resources.
There were some challenges during the event, notably in the different cultures, interests and expectations of the main participant groups – Members, academics and parliamentary researchers. The Members did not much attend the presentations that were held as side events to the exhibition, and the report on the symposium mentions an issue with their engagement. One of the key differences of interest was the Members’ focus on constituency matters – evidence about issues, needs and features of their constituency – and the academic interest in generic scientific and national policy issues.
Some risks did not materialise but are learning points for a future event. The combination of novelty, ambition and tight timetable put some pressure on planning and delivery. In addition, busy and motivated Members’ difficulty in sticking to a planned timetable and agenda can be an issue in any parliamentary event, anywhere, and can disrupt even the best planned programme.
Budget figures are not available but there were some significant costs – marquee hire, conference facilities in a hotel, branded clothing and other giveaways, speaker fees/expenses, professional printing, purchase of press coverage. Interestingly, to the extent that this was funded by INASP it was done through the Ugandan National Academy of Sciences (UNAS) and not through the parliamentary research service. The remit of INASP prohibited direct funding of parliamentary services. INASP found UNAS an invaluable partner, partly because it is an individual membership organisation and so ‘neutral’ in a landscape of research bodies each with an institutional interest. This neutral local partner was critical to connecting with the national research system. UNAS was also helpful through being able to make quick financial decisions. The parliamentary research service reported that it did not use much of its own budget and that the exhibitors were self-funding, in addition to the INASP contribution.
The INASP Programme Manager noted that the high-cost items were possibly not all functionally necessary but apparently had the desired effect in terms of attracting the target audience – the DRS knew its audience.
The large number of new requests in one week was, on the one hand, a successful result; on the other, it became a serious problem, exceeding the capacity of the service to deliver in a reasonable time. Any service organising a ‘big-bang’ event needs to consider the reputational risk of raising, then not meeting, expectations. There is no report of long-term damage in this case.
For the Director of the research service, these were the three key lessons for anyone else organising a ‘research week’:
Have enough time to prepare the event and plan it thoroughly
Engage the other institutions in good time, bearing in mind their need to adapt
Don’t go into it without adequate financial resources, including a budget for promotion
For the INASP coordinator, the involvement of a local research partner – external to the parliament and any research vested interest – was critical.
Based on correspondence with Mr Mohammed Hardi Nyagsi, former Director of Research at the Parliament of Ghana, and a report supplied by him. Any errors or omissions are my responsibility
The ‘research week’ in Ghana had a different origin and purpose to that in Uganda. In the context of a memorandum of understanding (MOU) with the Westminster Foundation for Democracy (WFD), the Parliament of Ghana established the Inter-Departmental Research and Information Group (IDRIG)
“which commenced on a pilot basis in June 2016 [consisting of six] Departments, namely Research, Library, Hansard, Committees, ICT and Public Affairs Departments. These departments hitherto, were operating in silos with a significant amount of turf protections and inter-departmental rivalry being the norm. This led to duplication of information storage systems and a lack of clarity on the roles of the departments among their primary clients – the Members of Parliament”
IDRIG organised two types of event in the ‘research week’ style – an ‘Induction Exhibition’ and a ‘Research and Information’ week – described in the extract below. Both appear more focused on the in-house research service than the exercise in Uganda was, or than the general ‘research week’ concept suggests.
Extract from ‘The IDRIG Concept’:
The then Director of Research led three (annual) ‘Research and Information’ weeks from 2017.
As reported by WFD in October 2020 when its support programme ended, “[that] the IDRIG has since received funding from the African Development Bank to provide a digital research and information service to MPs is testament to the success and promise of the initiative”.
More recently, in 2021 the Parliament of Ghana held a ‘Data Fair’ which drew on principles and approaches from the ‘research week’ model. The Fair was part of the ‘Data for Accountability’ project led by the African Centre for Parliamentary Affairs/ACEPA, with the participation of the Ghana Statistical Service and INASP. The project is funded by the Hewlett Foundation. A video on the Fair illustrates very well its similarity to a ‘research week’. (The link below starts at 01:49 with an introduction by Omar Seidu of the Ghana Statistical Service.)
[No information currently]
United Kingdom experience
The UK parliament has run a form of research week annually since 2018, under the title ‘Evidence Week’. Interestingly, this also uses a ‘neutral’ external partner (Sense about Science) to connect the parliamentary research services with the national research network. Sense about Science has published an overview of ‘Evidence Week’ – and there is more detail on the event below.
As far as can be understood from published material, this event is much more about raising the profile of external research than about the profile of parliamentary research services.
[Awaiting result of contacts with organisers.]
2021 Evidence Week
Three main parts:
“1. Opening event (online). The opening event will be held online Monday 1 November, via zoom. The format will consist of constituents and community groups asking questions to MPs and committee chairs on how parliament is using or scrutinising evidence on subjects that matter to them. MPs and experts will respond, with additional comments from information providers and analysts.
This is a fantastic opportunity to bring together MPs, researchers, and constituents, engaging with cutting edge research on policy decisions that the public have questions about.
2. 3 minute speed briefings and meetings with leading researchers (online and in person). Evidence Week will be hosting a series of virtual and in-person speed briefings from our research partners covering topics including, Net Zero, Health, and Data and AI. Face to Face briefings will run in Westminster, room TBC, on Tuesday 2 and Wednesday 3 November, with pods set up as a live exhibition. MPs, peers, and parliamentary staff can drop in any time and discuss the evidence and briefings presented. MPs and parliamentary staff will also be able to book individual online meetings with researchers to discuss in more detail their work and how evidence can be used in policy making decisions, throughout the entirety of the week.
3. Training for parliamentary staff on using and understanding data in their work on 5th November 1pm-3.30pm (online…)
A researcher from Imperial College has written a blog post on the experience in 2021.
“On the day in parliament, I hosted an exhibition ‘pod’ to share insights and resources with MPs and peers and answered questions. I had on average three minutes to share research findings with each visitor to my pod. MPs, peers, and their staff were very receptive to my research findings, and some of them booked a one-to-one to discuss further. It was a good experience for me, and the interactions were greatly appreciated by both sides. MPs, peers and their staff had a positive attitude towards input from scientists, which I found really encouraging.”
Previous Evidence Weeks
The first ‘Evidence week’ event in 2018 looked at how evidence could be used in policy, both in general and in relation to specific topics. These were covered in multiple 1-2 hour events over four days.
The second ‘Evidence week’ in 2019 took a different format. After a one-hour opening event there were “two days of 3-minute briefings at interactive ‘Evidence Pods’ in the Upper Waiting Hall, with more than 20 partners, on different facets of interrogating evidence, from drones to the census”. Although the first year was described as ‘successful’, it appears that a more rapid-fire format was seen as more accessible and/or more deliverable.
As reported by the main external partner, ‘Sense about Science’, the ‘Evidence week’ in 2020 was forced into a virtual format by the pandemic. The same concept of three-minute presentations – with the option of more in-depth discussion – was retained.
“In 2020, we were unable to hold a physical event in Upper Waiting hall where constituents and researchers can meet MPs to discuss urgent policy issues. With our communications partners, POST, the UK Statistics Authority, Office for Statistics Regulation, and Ipsos MORI, we created an innovative new way of replicating this experience online. From 16th- 20th November, Evidence Week in Westminster looked at evidence on a range of issues including Covid-19, AI, and climate, that Parliamentary committees, constituents and researchers are grappling with.”
“Through the online platform, MPs, Peers and parliamentary staff were able to hear 3-minute video briefings and jump into meeting rooms with the researchers themselves to instantly discuss emerging evidence and potential strategies for tackling complex policy problems.”
The University of Southampton published a blog post on their researchers’ experience of the event, including screen shots of MPs and their staff being briefed on Zoom.
Sense about Science published an interesting assessment of the 2020 event with data on attendance. This states that 34 Members of Parliament (of 650) and 10 members of the second chamber (of 783) engaged with the event. It is not clear if this statistic includes the impressive participation of 13 Committee Chairs in the opening event. In addition, 47 staff of MPs took part; it is reported that altogether 78 MPs or their offices participated. Twenty parliamentary and government officials also attended. The University of Southampton refers to 100 Members of the two chambers of Parliament being involved in the event in 2019, suggesting that engagement reduced in 2020 – due to the virtual format and/or for other reasons.
In its origin, and in its application in one case – the United Kingdom parliament – a ‘research week’ is a marketing tool for science in general, not the promotion of a parliamentary research service. In the UK the number of Members reached appears modest – around 5% directly involved in 2020, for example – and the content apparently does not include the parliamentary research services’ own material.
The Uganda case demonstrates, however, that a research week can act as a tool for service promotion. It then becomes a multipurpose tool: connecting up the national research network; promoting science to Members; shaping science communication to meet Member needs; and raising the profile of the parliamentary research service. It is a ‘big-bang’ solution that reaches Members and cuts through multiple problems, allowing a set of institutions and services to combine their promotional capacity into a single powerful campaign. Issues remain: is it the optimal tool for service promotion? Is it cost-effective for that purpose? And with multiple objectives come dispersion of effort, complexity and thus risk. Conceived in that way, a research week needs substantial resources and time to plan and implement. The results in Uganda were largely limited to one parliamentary term, and the service cannot afford to re-run the research week in the format it wants in the new term. In that case it was not sustainable, and it might be difficult for a parliament without significant external support to emulate it.
The Ghana example shows a smaller scope and an internal focus; it may not be comparable with the research weeks in Uganda or the UK. It appears more attainable and sustainable but it is not known if an event has happened since the WFD project ended.
The interim conclusion, pending more research, is that for raising the profile of a parliamentary research service a ‘research week’ has potential benefits but also potentially serious costs, challenges and risks. This is particularly the case if there are additional objectives of connecting the wider world of research to parliament. The search is on for a model with that scope which is sustainable by a parliament with limited resources.
EUSCEA provides an interesting set of current resources for organisers of science events. One link in the resources is to a ‘Science Communications Toolbox’ (in English, created by two Swedish organisations).
I recently had the honour of being invited as keynote speaker at the conference of the Association of Parliamentary Libraries of Asia and the Pacific. It was an invitation as Chair of IFLAPARL, and the conference was open to all interested worldwide – two very welcome gestures at a time when regional associations might benefit from closer links between themselves and with IFLAPARL at global level.
Although I was invited as Chair of IFLAPARL, the request was to give a personal view – sharing my “knowledge, experiences, and insights on library and research services that are responsive to the needs of parliaments and parliamentarians in winning challenges during times of crisis and the corollary human resource management and information access issues”.
The presentation slides are attached here but recordings of the full conference will be available online, as will the same presentation file [links to be added when available].
“If the clients understand what ‘policy analysis’ should be, a parliamentary research service which promises to do it is creating an expectation that cannot or should not be fulfilled”
Do parliamentary research services do “policy analysis”? I was asked that recently and my answer was “no” and if both question and answer seem strange, I agree. My answer would have been different a few years ago – we even created posts titled ‘Policy Analyst’ in my then service. So what’s going on?
Firstly, there is no disagreement that parliamentary researchers analyse policy. But, strictly, that is not the same as “policy analysis”. Saying that analysis of policy is not the same as policy analysis may sound very like the “Yes, Minister” official distinguishing between the policy of administration and the administration of policy. It is, though, an important distinction.
The LSE has been a highly-effective user of social media and these three articles provide some good reasons for a parliamentary research service to set up its own blog and Twitter account(s), and some excellent practical advice on how to do so.
Parliamentary library and research services have the mission to deliver high-quality services – but what does ‘quality’ mean? Put your own rankings on the cars above – what does it say about your idea of ‘quality’?
Quality management was a new frontier in the UK in the 1980s, but by the 2000s it had become simply a condition for staying in business across much of the corporate and public sector. This is not the case everywhere, and there are still challenges in public services – it is a lot easier to apply quality management methods in a car factory than in a professional service. For services and people new to quality management it is worth rehearsing the basics, with a particular focus on their application in services. The download is a presentation on quality management basics with some thoughts on how it can be applied to parliamentary research services.