Single-use food containers: call for evidence - summary of responses

Summary of responses from our call for evidence on tackling consumption of single-use food containers and other commonly littered or problematic single-use items. The report brings together a range of evidence and views from stakeholders to inform policy development on single-use plastics.


Overview of Responses

9. The call for evidence opened on 7 April 2022 and closed on 30 June 2022. 67 responses were received online via Citizen Space; of these, 26 were from individuals and 41 were from organisations (including individuals responding on behalf of organisations). A further two organisation responses were received separately, not via Citizen Space. Organisations that responded included charities, NGOs, community groups, local authorities, businesses and industry bodies. Responses to the call for evidence, where permission for publication was granted, can be found on Citizen Space.

10. There were 41 core questions: 6 on food containers and 7 on each of the other items. All questions were open-ended (free text).

11. Of the 69 respondents, not all answered all 41 questions, and the number of responses and the level of detail and evidence provided varied across questions. For each question covered in the report, a count of substantive responses is provided, split by individual and organisation response numbers. This is a simple count to illustrate where there was more engagement (and in some cases more evidence provided) across the questions. The count excludes responses such as ‘n/a’, ‘no comment’, ‘see other responses’ and ‘none’. Some responses were campaign-style (‘co-ordinated’) responses consisting of the same or similar content and information; for simplicity, these have been treated as individual, separate responses for the purposes of the count.

Approach to Analysis

12. The analysis was undertaken by Scottish Government analysts in the Rural & Environmental Science and Analytical Services division.

13. Responses to each question were reviewed and coded into themes, based on the nature and content of the evidence, opinions or arguments provided. This involved: i) reading through the responses; ii) labelling individual responses to questions according to themes; iii) reviewing areas of agreement or disagreement among these themes and drawing out the key points; and iv) writing a summary of the responses for each question.

14. The analysis is divided into sections based on the item under review, with commentary on each of the questions and themes raised for that item or product. Question numbers are included for ease of reference within this report. The original call for evidence ordered the questions differently: the 6 questions on food containers came first, followed by questions grouped by theme (environmental impact, size and nature of the market, effective actions, etc.) for the other items. The report has been structured by item so that each item can be considered separately and so that policy development can be aligned to the specific issues and concerns raised in each case. It is recognised that further analysis could be undertaken to explore the themes in more detail or to explore the evidence through particular lenses (e.g. by respondent type).

15. The analysis and commentary presented here report on the nature and content of the responses and the evidence provided. They do not assess the quality or validity of the evidence, opinions and arguments. All responses were reviewed, and the vast majority of the external evidence and references provided in responses (where accessible at the time of analysis) have been referred to in the analysis and recorded in the reference list, regardless of their type, quality or any perceived credibility (also see notes on p.60). Where respondents used figures and statistics, these were quoted and referenced where appropriate. They were not independently checked or verified beyond some general spot checks of the sources for quality assurance purposes. The report does not comment on the quality or accuracy of those figures and statistics; this would require further review and validation.

16. Some of the evidence was reviewed briefly to help understand the context or the potential reasons why respondents cited it and, as above, to cross-check that numbers or information were cited correctly. In some cases, assumptions and interpretations were made about the key message of a response and its likely context or logic.

17. The analysis did not aim to quantify in detail how many responses mentioned particular themes or how many respondents provided evidence. Instead, a more general framing has been used to illustrate who responded, what was expressed, and whether responses reflected the views of other respondents across the total dataset. The phrasing used in the analysis includes: ‘one response’, ‘some responses’, ‘a number of responses raised this theme’, ‘a small number of responses mentioned’, or ‘one co-ordinated response noted this’. This phrasing broadly indicates whether one or multiple respondents mentioned a particular idea or theme. In some cases, reference is made to whether the response was from an individual or an organisation and, where relevant for context and where the respondent gave permission, specific organisation names are included. The main focus of this analysis was to present the evidence, arguments and opinions provided, and the implications they may have for policy.

Contact

Email: socialresearch@gov.scot
