‘Raising the profile’ / 1. Introduction

What methods are in use to raise the profile of parliamentary research services, and which methods do the professionals consider to be effective? These were the questions opened to subscribers to IFLAPARL’s mailing list – and the professional community beyond that – in the winter of 2021/22. An initial request round produced four very useful full-text responses, from Israel, Slovakia, Canada and New Zealand. Based partly on these responses, a survey form was designed, which drew 22 respondents. The report’s statistics are based on the survey responses, but the full-text answers have been used where relevant in the commentary.

What does ‘raising the profile’ mean?

‘Profile’ is about more than volume of use or reach amongst the target audience. It is also about intangibles such as reputation amongst Members and other decision-makers, the strength of the brand and the authority which its products command. There is a distinction between direct clients – typically support staff of Members and operational staff of the secretariat – and decision-makers – Members and senior officials of the secretariat. A service could have a high profile and a very good level of success, in terms of volume of use and satisfaction amongst direct clients, without achieving a high profile amongst decision-makers – and vice versa. A high profile amongst decision-makers is a useful asset and arguably essential if the service is to be truly valued in the institution. The focus in the survey was on the profile amongst Members.

‘Raising the profile’ has at least three main kinds of activity:

  1. Direct promotion of service use
  2. Indirect promotion of service use
  3. Building the reputation or brand of the service

‘Direct promotion of service use’ is straightforward advertising of specific services or products to target clients – with a priority to Members and their offices, but often extending to parliamentary officials, specialist external audiences and even the general public. This can be seen as similar to ‘sales’ advertising.

‘Indirect promotion of service use’ concerns promotional actions which advertise the service without a direct message about use – creating awareness of service availability amongst target clients rather than the ‘selling’ of a particular service or product. The use of social media to broadcast research products externally may be done with an aim of eventually reaching clients internally: the intention is to increase readership of a particular product, but also to create awareness of service availability. A simpler example of indirect promotion of use is giveaway branded gadgets, clothing or stationery.

‘Building the reputation or brand of the service’ refers to promotional actions which are not about direct ‘selling’ or awareness of specific products or services. For example, the involvement of a service in parliamentary strengthening projects elsewhere is one way to raise its profile within its own parliament. This may be because it brings the institution prestige, or because it validates the expertise of the service, or because it is seen to be ‘doing the right thing’ and gains reputation as a result. Another example is the publication of academic articles or academic conference presentations by research staff – it makes no direct contribution to service delivery or advertising, but it might enhance the reputation of the service.

Some actions can work on more than one plane – a seminar or workshop on research around a policy issue might build the reputation of the service as an intellectual centre while also including indirect and even direct promotion.

The survey and respondents

The survey form asked about forty different methods for profile-raising. The responses led to the identification of additional methods – the complete list is here, with links to examples in some cases. The survey also covered special measures adopted for new parliaments; how the profile and marketing methods are measured; and marketing strategy.

Respondents were given the choice to be identified in this report or to remain anonymous; and for those identified to have any description of their service or quote from their responses to be identifiable or to be anonymised. Of the 22 services responding to the formal survey, 18 agreed to be credited (see below) and of those 10 have allowed the content of their responses to be identifiable – all else is anonymised. If particular services are named or quoted and others not, it may be as a consequence of those choices rather than any other reason. The responding services are from across the globe.

Credited contributors in the first and/or second round: Slovakia (Department of Parliamentary Institute); Israel (Knesset Research & Information Centre); Canada (Library of Parliament); New Zealand (Parliamentary Library); Andalucia, Spain (Parlamento de Andalucía, Servicio de Biblioteca); Portugal (Divisão de Informação Legislativa e Parlamentar); Sweden (research service); Uganda (Department of Research Services); New South Wales, Australia (New South Wales Parliamentary Research Service); Zambia (Research Department); Queensland (Queensland Parliament – Research & Information Service); Hungary (Information Service for MPs – research service); Finland (Internal Research Service); Romania (Directorate for Studies and Legislative Documentation); Burundi (Cour des comptes – not a parliamentary research service in the conventional sense, but it does undertake research for the parliament); Malawi (Parliament of Malawi research service); North Macedonia (Parliamentary Institute); Pakistan (PIPS – Pakistan Institute for Parliamentary Services); Lithuania (Research Unit of the Information and Communication Department of the Office of the Seimas); Argentina (Dirección Servicios Legislativos).

Contributor services requesting anonymity: Four, not listed above.

2. Main findings and recommendations

  • All posts on ‘Raising the Profile’
  • Checklist of methods
  • Survey introduction
  • Main findings
  • Most popular methods
  • Most effective methods
  • Respondents’ comments on methods
  • Special activities for a new parliament
  • What is the value to Members?
  • Marketing strategies
  • Conclusions and recommendations

Raising the profile of a parliamentary research service – a checklist of methods

The following methods have been identified from professional experience and from detailed returns by several parliamentary research services. The original list of 40 items used in the ongoing survey is being extended as further methods are identified; additions are marked ‘New’ in the list below.

Further information, including results from the survey, will be added progressively, including pages on particular methods.

  1. Intranet site/pages dedicated to the research service
  2. External website/pages dedicated to the research service [UK – House of Commons library & research service]
  3. Social media (Twitter, Facebook, LinkedIn, Instagram, TikTok etc) [List of parliamentary library & research services & people on Twitter ; A blog on WordPress by a specialist research service for European Parliament committees ]
  4. Promotional emails – newsletters, adverts etc
  5. Promotional videos [European Parliamentary Research Service YouTube channel ; The history of POST (UK)].
  6. Promotional audio clips, podcasts [European Parliamentary Research Service podcasts]
  7. Promotional gadgets, clothing, bags etc
  8. Research products published/accessible externally [Ghana – research papers ; Knesset – research papers translated into English ; New Zealand ]
  9. Press releases and/or actively generating coverage in mainstream media
  10. Permanent displays of hard-copy research reports
  11. Special Products – e.g. condensed research reports [example from the European Parliament]
  12. Special publication with an overview of policy issues for a newly-elected parliament [example from the UK parliament, also available as a video with sign language]
  13. Publishes practical guide(s) for Members on how parliament works
  14. Branding on all research products
  15. Common visual identity for all products and promotional materials – same ‘look and feel’
  16. Presentation of services to groups of Members
  17. Presentation of the service in Committee meetings
  18. Presentation of the service to Members individually, interviews with Members
  19. Presentation of the service to party (research) staff [New]
  20. Account managers designated for relations with individual Members. [New] [There is a post about the ‘account manager’ concept – in general, not about the New Zealand case – here].
  21. ‘Floorwalking’ to meet Members and their staff. [New]
  22. Social events on service premises (‘morning tea’) to encourage contact with Members’ staff [New]
  23. Training/induction for Members in how to use the research service, benefits of research
  24. Training/induction for staff of Members in how to use the research service, benefits of research
  25. Physical presence of service where Members gather (e.g. desk in a lounge area)
  26. Knowledge-based events (seminars, conferences, workshops etc) around research themes
  27. ‘Open Day’ or ‘Research week’ events
  28. Personal meeting with Member when they request research
  29. Request for feedback by form when research is delivered to a Member
  30. Propose in-person discussion to get feedback when research is delivered to a Member
  31. Presence of research staff in Committee meetings
  32. Workshops with Committee Members or their staff to identify research priorities
  33. Formal Committee decisions on research topics
  34. Relevant research products proactively circulated to Committee Members
  35. Surveys of Members
  36. ‘Focus group’ meetings to discover client views of the service [New]
  37. Publication of performance data for the research service
  38. Publication of service standards for the research service or ‘Service Charter’ or ‘rules’ of the service
  39. Advisory board with Members [UK – the Board of POST] or similar advisory body [The ‘evidence caucus’ in Kenya]
  40. External engagement/publication by staff research experts
  41. International cooperation with other research services, engagement with international professional bodies [For example, at the global level, IFLAPARL]
  42. Provides parliamentary strengthening support to other parliaments
  43. Engagement of the service with wider research & professional networks
  44. Engagement in general inter-parliamentary activities, IPU events etc
  45. Internships for young people offered in the research service

‘Research weeks’ and parliamentary research services


This post follows from the checklist of methods to raise the profile of parliamentary research services.

The concept of a ‘research week’ was pioneered in parliamentary research services by Uganda in 2016. It has since been taken up by the parliaments of Ghana, Zimbabwe and the United Kingdom. The generic concept of ‘research week’ was not, however, invented in Uganda or for parliaments but creatively borrowed from the wider world of science communications (‘SciComm’).

According to the ‘White Book on Science Communications Events in Europe’, a ‘research week’ in SciComm might typically be used to:

  • promote awareness of science research activities to a wider public;
  • encourage dialogue between science and society;
  • foster a ‘scientific culture’ – public understanding of science;
  • develop interest in scientific careers.

It is generally intended to be a positive marketing effort rather than an occasion for critical scrutiny and debate.

The ‘White Book’ [PDF download], is a 2005 overview by EUSCEA – the European Science Engagement Association – of SciComm events which were then taking off across Europe. Today, ‘research weeks’ are widespread around the world, held by universities, academic societies, cities, regions – and parliaments.

The primary purposes of a ‘research week’ in a parliamentary context appear to be to connect the national research community with Members, to raise the profile of scientific evidence in policymaking and to ease the transmission of scientific content to the policy process. It is not ostensibly about raising the profile of the parliamentary research service with Members, even if in the Ugandan case, at least, this was the primary motivation. This post will consider ‘research weeks’ in terms of their impact with Members and on the profile of the parliamentary research service – rather than the effect on relations with the scientific community.

The Ugandan experience

Based on publicly available documents and on interviews with John Bagonza, Director of the Research Service, Parliament of Uganda and Emily Hayter, Programme Manager for INASP. Thanks to John and Emily for their assistance. Any errors or omissions are my responsibility.

The parliamentary research service in Uganda had concerns about their profile with Members and tried various measures to improve it, which had all failed. These methods included displays in the library and in the parliament restaurant, presentations to Members and individual meetings with Members and staff. The service still had ‘just a few clients’ and many Members did not know about the service or that it was free to use. The service also felt it was relatively unknown to the wider research community and even to government departments.

The service discovered the concept of ‘research week’ and wanted to use it as a big breakthrough event to raise their profile, but were unable to proceed until offered external financial support through the VakaYiko project. (The project was UK-government funded and implemented by a consortium led by INASP (the International Network for Availability of Scientific Publications) – a UK-based NGO.)

The first parliamentary Research Week was held on 23–26 August 2016, with the theme “Using evidence to strengthen Parliament”. The aim of the research week was to showcase the research products provided to Members of Parliament (MPs) by the Department of Research Services (DRS), and also to present other research organizations and academia as complementary sources of evidence for policy. For its first three days there were information stalls and presentations just outside the entrance to the Parliament of Uganda; it concluded on the fourth day with an off-site symposium. The event was organised jointly by DRS, the Uganda National Academy of Sciences (UNAS) and INASP.

The key targets for the research week were the MPs and staff members of parliament, and they were reached in significant numbers – 242 Members (of 427 at the time) and more than 100 staffers. High profile visitors included the Prime Minister and the Deputy Speaker as well as other office-holders of parliament. The event also attracted around 50 students and members of the general public.

Eighteen research organisations were represented at the event.

The DRS produced a report on the event in November 2016 (three months after it took place) and a paper for a meeting in London in April 2017.

INASP reported on the week in blog posts on 16 September 2016 and again on 16 December 2016, and also produced a video on the event.

Summary of results

The event generated 81 new research requests from Members, and the level of research requests broadly increased throughout the remainder of the parliamentary term, as reflected in the graph of outputs below. The relative decline in the most recent two years can be ascribed to the pandemic.

The profile of the research service was raised with Members and also with the parliamentary administration. The notion of using evidence from external research bodies was publicised and those bodies increased their understanding of what Members required and of how to cooperate with the parliamentary research service. The various institutions are reported to have had more contact after the event.

The research service had hoped that the success of the event would lead to additional parliamentary funding to continue running ‘research weeks’, possibly even on an annual basis. That funding has not materialised and there have been no further research weeks held. The new parliament elected in 2021 has 60% new Members and the research service is again facing a problem with its profile.

The research service did obtain a ‘small’ increase in its regular budget and it found it easier to obtain support from international collaborators after the event.

INASP brought in people from research services in Ghana and Zimbabwe to observe the exercise. Both parliaments went on to hold some form of ‘research week’.

Organising the event

The research service had virtually all of its 35 staff working full-time for two months to prepare the event, although the total staff input was not measured. The Director feels this timeline was very short and that 3–4 months would be optimal. The time was needed not only to organise the event but also to prepare content, including audio and video, as promotional material. In addition, the external institutions needed much more time than they were given to adapt their content to a parliamentary audience. They had to learn how to work in a parliamentary context, just as the parliament had to learn how to work with them. There were issues of unmet expectations and the consequences of a crowded programme: seven expected institutions (of 25) did not show up, apparently in part because of these issues. Overall, though, there was enthusiasm to be involved and to be given access to parliament.

The conception of the event and its realisation were mostly the work of the parliamentary research service; INASP provided only review of plans and technical advice. It is an event that could, then, in principle be managed by a research service with sufficient staff and resources.

There were some challenges during the event, notably in the different cultures, interests and expectations of the main participant groups – Members, academics and parliamentary researchers. The Members did not much attend the presentations that were held as side events to the exhibition, and the report on the symposium mentions an issue with their engagement. One of the key differences of interest was the Members’ focus on constituency matters – evidence about issues, needs and features of their constituency – and the academic interest in generic scientific and national policy issues.

Some risks did not materialise but are learning points for a future event. The combination of novelty, ambition and a tight timetable put some pressure on planning and delivery. In addition, the difficulty busy and motivated Members have in sticking to a planned timetable and agenda can be an issue in any parliamentary event, anywhere, and can disrupt even the best-planned programme.

Budget figures are not available, but there were some significant costs – marquee hire, conference facilities in a hotel, branded clothing and other giveaways, speaker fees/expenses, professional printing, and purchased press coverage. Interestingly, to the extent that this was funded by INASP, it was done through the Uganda National Academy of Sciences (UNAS) and not through the parliamentary research service, as INASP’s remit prohibited direct funding of parliamentary services. INASP found UNAS an invaluable partner, partly because it is an individual-membership organisation and so ‘neutral’ in a landscape of research bodies each with an institutional interest. This neutral local partner was critical to connecting with the national research system. UNAS was also helpful in being able to make quick financial decisions. The parliamentary research service reported that it did not use much of its own budget and that, in addition to the INASP contribution, the exhibitors were self-funding.

The INASP Programme Manager noted that the high-cost items were possibly not all functionally necessary but apparently had the desired effect in terms of attracting the target audience – the DRS knew its audience.

The large number of new requests in one week was, on the one hand, a successful result; on the other, it became a serious problem, as it exceeded the capacity of the service to deliver in a reasonable time. Any service organising a ‘big-bang’ event needs to consider the reputational risk of raising, then not meeting, expectations. There is no report of long-term damage in this case.

For the Director of the research service, these were the three key lessons for anyone else organising a ‘research week’:

  1. Have enough time to prepare the event and plan it thoroughly
  2. Engage the other institutions in good time, bearing in mind their need to adapt
  3. Don’t go into it without adequate (money) resources, including budget for promotion

For the INASP coordinator, the involvement of a local research partner – external to the parliament and any research vested interest – was critical.

Ghana experience

Based on correspondence with Mr Mohammed Hardi Nyagsi, former Director of Research at the Parliament of Ghana, and a report supplied by him. Any errors or omissions are my responsibility.

The ‘research week’ in Ghana had a different origin and purpose to that in Uganda. In the context of a memorandum of understanding (MOU) with the Westminster Foundation for Democracy (WFD), the Parliament of Ghana established the Inter-Departmental Research and Information Group (IDRIG)

“which commenced on a pilot basis in June 2016 [consisting of six] Departments, namely Research, Library, Hansard, Committees, ICT and Public Affairs Departments. These departments hitherto, were operating in silos with a significant amount of turf protections and inter-departmental rivalry being the norm. This led to duplication of information storage systems and a lack of clarity on the roles of the departments among their primary clients – the Members of Parliament”

The IDRIG concept – history, structure and programmes, IDRIG, 2018 [PDF download]

IDRIG organised two types of event in the ‘research week’ style – an ‘Induction Exhibition’ and a ‘Research and Information’ week; they are described in the extract below. Both appear to be more focused on the in-house research service than the exercise in Uganda, or than the general ‘research week’ concept would suggest.

Extract from ‘The IDRIG Concept’:

The then Director of Research led three (annual) ‘Research and Information’ weeks from 2017.

As reported by WFD in October 2020 when its support programme ended, “[that] the IDRIG has since received funding from the African Development Bank to provide a digital research and information service to MPs is testament to the success and promise of the initiative”.

More recently, in 2021 the Parliament of Ghana held a ‘Data Fair’ which drew on principles and approaches from the ‘research week’ model. The Fair was part of the ‘Data for Accountability’ project led by the African Centre for Parliamentary Affairs (ACEPA), with the participation of the Ghana Statistical Service and INASP. The project is funded by the Hewlett Foundation. A video on the Fair illustrates very well its similarity to a ‘research week’. (The link below starts at 01:49 with an introduction by Omar Seidu of the Ghana Statistical Service)

Zimbabwe experience

[No information currently]

United Kingdom experience

The UK parliament has run a form of research week annually since 2018, under the title ‘Evidence Week’. Interestingly, this also uses a ‘neutral’ external partner (Sense about Science) to connect the parliamentary research services with the national research network. Sense about Science has published an overview of ‘Evidence Week’ – and there is more detail on the event below.

As far as can be understood from published material, this event is much more about raising the profile of external research than about the profile of parliamentary research services.

[Awaiting result of contacts with organisers.]

2021 Evidence Week

Three main parts:

1. Opening event (online). The opening event will be held online on Monday 1 November, via Zoom. The format will consist of constituents and community groups asking questions to MPs and committee chairs on how parliament is using or scrutinising evidence on subjects that matter to them. MPs and experts will respond, with additional comments from information providers and analysts.

This is a fantastic opportunity to bring together MPs, researchers, and constituents, engaging with cutting edge research on policy decisions that the public have questions about.

2. Three-minute speed briefings and meetings with leading researchers (online and in person). Evidence Week will be hosting a series of virtual and in-person speed briefings from our research partners covering topics including Net Zero, Health, and Data and AI. Face-to-face briefings will run in Westminster (room TBC) on Tuesday 2 and Wednesday 3 November, with pods set up as a live exhibition. MPs, peers, and parliamentary staff can drop in any time and discuss the evidence and briefings presented. Throughout the week, MPs and parliamentary staff will also be able to book individual online meetings with researchers to discuss their work in more detail, and how evidence can be used in policy-making decisions.

3. Training for parliamentary staff on using and understanding data in their work on 5th November 1pm-3.30pm (online…)

From: https://post.parliament.uk/evidence-week-in-parliament-2021/
Partners in the 2021 ‘Evidence Week’ in the UK parliament

A researcher from Imperial College has written a blog post on the experience in 2021.

“On the day in parliament, I hosted an exhibition ‘pod’ to share insights and resources with MPs and peers and answered questions. I had on average three minutes to share research findings with each visitor to my pod. MPs, peers, and their staff were very receptive to my research findings, and some of them booked a one-to-one to discuss further. It was a good experience for me, and the interactions were greatly appreciated by both sides. MPs, peers and their staff had a positive attitude towards input from scientists, which I found really encouraging.”

Previous Evidence Weeks

The first ‘Evidence week’ event in 2018 looked at how evidence could be used in policy, both in general and in relation to specific topics. These were covered in multiple 1-2 hour events over four days.

The second ‘Evidence week’ in 2019 took a different format. After a one-hour opening event there were “two days of 3-minute briefings at interactive ‘Evidence Pods’ in the Upper Waiting Hall, with more than 20 partners, on different facets of interrogating evidence, from drones to the census”. Although the first year was described as ‘successful’, it appears that the more rapid-fire format was seen as more accessible and/or more deliverable.

As reported by the main external partner, ‘Sense about Science’, the ‘Evidence week’ in 2020 was forced into a virtual format by the pandemic. The same concept of three-minute presentations – with the option of more in-depth discussion – was retained.

“In 2020, we were unable to hold a physical event in Upper Waiting hall where constituents and researchers can meet MPs to discuss urgent policy issues. With our communications partners, POST, the UK Statistics Authority, Office for Statistics Regulation, and Ipsos MORI, we created an innovative new way of replicating this experience online. From 16th- 20th November, Evidence Week in Westminster looked at evidence on a range of issues including Covid-19, AI, and climate, that Parliamentary committees, constituents and researchers are grappling with.”

“Through the online platform, MPs, Peers and parliamentary staff were able to hear 3-minute video briefings and jump into meeting rooms with the researchers themselves to instantly discuss emerging evidence and potential strategies for tackling complex policy problems.”

The University of Southampton published a blog post on their researchers’ experience of the event, including screen shots of MPs and their staff being briefed on Zoom.

Sense about Science published an interesting assessment of the 2020 event with data on attendance. This states that 34 Members of Parliament (of 650) and 10 members of the second chamber (of 783) engaged with the event. It is not clear whether this statistic includes the impressive participation of 13 Committee Chairs in the opening event. In addition, 47 staff of MPs took part; it is reported that altogether 78 MPs or their offices participated. Twenty parliamentary and government officials also attended. The University of Southampton refers to 100 Members of the two chambers of Parliament being involved in the event in 2019, suggesting that engagement declined in 2020 – due to the virtual format and/or other reasons.
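The participation rates implied by these reported figures can be worked out directly. The counts below are those quoted above; the percentages are simply derived from them:

```python
# Engagement rates implied by the Sense about Science 2020 assessment:
# 34 of 650 MPs, 10 of 783 peers, 78 MPs participating directly or
# through their offices.
mps_engaged, mps_total = 34, 650
peers_engaged, peers_total = 10, 783
offices_engaged = 78

mp_rate = 100 * mps_engaged / mps_total        # ~5.2%
peer_rate = 100 * peers_engaged / peers_total  # ~1.3%
office_rate = 100 * offices_engaged / mps_total  # ~12.0%

print(f"MPs directly engaged: {mp_rate:.1f}%")
print(f"Peers engaged: {peer_rate:.1f}%")
print(f"MPs engaged directly or via their office: {office_rate:.1f}%")
```

The first figure is the basis for the ‘around 5%’ of Members described in the conclusions as directly involved; counting MPs’ offices roughly doubles the reach.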


In its origin, and in its application in one case – the United Kingdom parliament – a ‘research week’ is a marketing tool for science in general, not the promotion of a parliamentary research service. In the UK the number of Members reached appears modest – around 5% directly involved in 2020, for example – and the content apparently does not include the parliamentary research services’ own material.

The Uganda case demonstrates, however, that a research week can serve service promotion. It then becomes a multipurpose tool – connecting up the national research network; promoting science to Members; shaping science communication to meet Member needs; and raising the profile of the parliamentary research service. It is a ‘big-bang’ solution that reaches Members and cuts through multiple problems, and it allows a set of institutions and services to combine their promotional capacity into a single powerful campaign. Issues remain: is it the optimal tool for service promotion, and is it cost-effective for that purpose? Multiple objectives also bring dispersion of effort and complexity, and so risk. Conceived in that way, a research week needs substantial resources and time to plan and implement; the results in Uganda were largely limited to one parliamentary term, and the service cannot afford to re-run the research week in the format it wants in the new term. In that case it was not sustainable, and it might be difficult for a parliament without significant external support to emulate it.

The Ghana example shows a smaller scope and an internal focus; it may not be comparable with the research weeks in Uganda or the UK. It appears more attainable and sustainable, but it is not known whether an event has been held since the WFD project ended.

The interim conclusion, pending more research, is that for raising the profile of a parliamentary research service a ‘research week’ has potential benefits but also potentially serious costs, challenges and risks. This is particularly the case if there are additional objectives of connecting the wider world of research to parliament. The search is on for a model with that scope which is sustainable by a parliament with limited resources.

Further information

EUSCEA provides an interesting set of current resources for organisers of science events. One link in the resources is to a ‘Science Communications Toolbox’ (in English, created by two Swedish organisations).