Until relatively recently, guidance on ecology report writing was sparse. The Chartered Institute of Ecology and Environmental Management has worked hard over the last few years to produce guidance documents covering the technical reports that commonly accompany planning applications. In addition, a British Standards Institution publication (BS 42020:2013 Biodiversity. Code of practice for planning and development) aims to bring clarity and consistency to reporting (again for planning purposes/regulatory approval).
During my time as a professional ecologist, I had the opportunity to lead on numerous schemes with a wide variety of objectives. A number of these were high-profile, public-realm projects that ran for several years and required a whole range of ecological assessments to be undertaken. Consequently, a large proportion of my time was spent writing reports and assessing schemes. Over the years I produced countless reports designed for a variety of purposes, including: Preliminary Ecological Appraisals (‘extended’ Phase 1 Habitat Surveys), protected species survey reports, Ecological Impact Assessments, Habitat Regulations Assessments (for plans and projects), management plans, BREEAM assessments, Corporate Social Responsibility reporting, Biodiversity Action Plans (habitat and species), ecology audits, mitigation strategies, enhancement and habitat creation strategies … the list goes on. I also reviewed colleagues’ reports prior to them being issued to clients.
Drawing on the experience gained over seventeen years (and a natural inclination towards attention to detail), I review reports on behalf of freelance ecologists and consultancies – prior to them being issued – and have seen first-hand how reporting varies significantly between organisations and individuals. Reviewing a report at arm’s length, with no prior involvement in the project, provides a unique viewpoint from which to identify issues such as clarity of reporting and the apparent robustness of the assessment process – highlighting queries that may be picked up by the regulators, and catching typos and inconsistencies that can be missed through familiarity or time pressure.
I thought it would be useful in this post to highlight some of the points that come up time and again when I’m reviewing reports:
I’m starting off with the basics, i.e. making sure that you are consistent throughout the report in how you represent the main terms, species, tables (etc.). The following examples will give you an idea of what I’m getting at:
- Abbreviations: I often find that an abbreviation suddenly appears in a report with no previous explanation of what it refers to. Similarly, wording can alternate at random between the full version and the abbreviation with no apparent rhyme or reason. The general rule of thumb is: the first time, write the term in full with the abbreviation in brackets; after that, use the abbreviation throughout the report.
- Scientific names: again, I see huge variation and inconsistency with this. The general rule is that a species’ scientific name follows the common name the first time the species is mentioned (with or without brackets, and always italicised).
- Titles: decide how these are going to be represented (usually a house-style decision) and then stick to it consistently throughout the report.
- Tables/diagrams/photos: make sure that these are easy to read (i.e. you don’t need a magnifying glass to read the detail, and they are not just a big fuzzy blur), are labelled correctly and consistently, are relevant, are referred to in the text, and look the same (e.g. table shading and appearance is consistent) … etc.
- References: be consistent with how these are written out, make sure they are included where necessary, make sure they are up to date (where referencing guidance and policy), and use cross-referencing footnotes throughout the report each time the document/organisation (etc.) is mentioned.
- Information and assessment: make sure that the information provided and the assessment process are consistent for all species and habitats, e.g. don’t include detailed methodology for a couple of survey types but not the rest; or the relevant points about the ecology of some species but not others; or assess one section in one way and the next in another; or detail your assessment process in the methodology, but then fail to follow it …
Ensure that your report is ordered logically so that each section builds on the last. The reader should be able to follow the assessment process, shouldn’t be left with unanswered questions, and shouldn’t have to jump back and forth between sections to understand the report.
Keep terminology consistent and in line with the aims of the report, e.g. if there is specified guidance that you are using to assess the significance of effects, make sure the relevant terminology is used (and an explanation provided, if necessary, so that the reader isn’t bamboozled by your terms).
Include the right detail. By that I mean: don’t include things that aren’t relevant to the report’s purpose (nobody wants to trawl through a load of irrelevant information, and it can detract from the important points you are drawing out), but do make sure that all relevant information is provided.
This is probably one of the main areas I find myself querying, and the most crucial to the report itself. Ecologists, in the main, are great at describing their survey findings, but can be less transparent in their assessment process (which is entirely absent from some reports). There are often giant leaps made from results to conclusions with little explanation in between. As a reader, I need to understand how you have reached the conclusions you have drawn. The assessment process should be clear, transparent and robust (as should your methodology, etc.). This is aided by building up the knowledge base of the reader throughout the report – hence the importance of ordering the sections logically.
I hope you have found the above points useful; this post isn’t intended to be an exhaustive account, but it should highlight some of the more commonly encountered issues flagged during my report reviews.