Availability:
Bardos, R.P. (2002) Report of the NICOLE Workshop: Cost-effective Site Characterisation - Dealing with uncertainties, innovation, legislation constraints, 18-19 April 2002, Pisa. Land Contamination and Reclamation 10 (3), 189-219. http://www.nicole.org
Report / download web link (=direct link):
http://www.nicole.org
Long description:
The 2002 NICOLE meeting in Pisa focused on site characterisation, and in particular on how site characterisation might be made more efficient and better value for money. Site characterisation costs are inevitable; however, NICOLE would like to find an optimum that balances the depth of knowledge of pollution necessary to manage risks against the cost. Insufficient site characterisation will ultimately lead to higher overall site management costs, as risk management cannot be effectively planned and carried out. Excessive site characterisation adds unnecessary costs for what is effectively redundant information from the perspective of contaminated site risk management.
The meeting included several papers comparing costs and regulatory perspectives in different Member States and in the USA. Detailed technical presentations were made on a number of emerging site characterisation techniques and statistical tools, including biosensors for toxicity assessment, chemical sensors for a wide range of organic and inorganic substances, geophysical techniques, and “direct push” technologies as an alternative to conventional drilling and excavation, together with the integration of sensors into these direct-push tools. Characterisation techniques for soil air, soil water, groundwater and the solid phase were discussed. A number of papers presented views on data interpretation, for example using geostatistics or fuzzy logic. Keynote presentations were made on site characterisation strategies, in particular the value of adopting dynamic work plans in site investigation. A number of clear messages came out of this workshop:
1. There has been a misplaced emphasis throughout the contaminated land community on the value of analytical precision for samples sent for off-site analysis. Frequently, the variation arising from analytical error is greatly exceeded by the variation arising from sampling; put another way, sampling generally introduces far more variation into the results than any lack of precision in the analysis. Hence it may be more effective to spend a greater proportion of site characterisation budgets on cheaper on-site measurement techniques, using off-site analyses for validation and verification (a brief worked example follows this list).
2. However, this generalisation is also an over-simplification. In fact a variety of site characterisation techniques are available, from on-site analyses through on-site sensors to geophysical techniques, each offering valuable “nuggets” of information. At the heart of any site characterisation work must be the derivation of a site conceptual model that integrates what is already known about a site and identifies both what still needs to be discovered and how that information should be used. Underpinning each site characterisation, therefore, must be a clear information objective.
3. Many site characterisation plans have been relatively inflexible: a rigid proforma of tasks is carried out, which is not necessarily affected by the site investigation findings in real time. Instead, a series of site investigations may take place on a more or less ad hoc basis to fill gaps in information. Several speakers highlighted the need for interaction between the site investigation tasks and what is being found out: feedback on a real-time basis. This kind of dynamic approach to site characterisation needs to be considered as part of the overall site investigation strategy, long before site characterisation equipment is deployed. It is also rather difficult to achieve using conventional sample-to-laboratory systems, as the laboratory data will take days at best to be generated, during which time site characterisation equipment can only be kept idle at great cost. Dynamic approaches to site investigation should also encompass remediation planning, as different remedial approaches have differing information requirements.
4. There have been tremendous strides in the development of tools for on-site use in site characterisation, in particular “direct-push” tools as an alternative to conventional drilling, and a variety of attachments and sensors which can be used with these direct-push tools to collect site investigation information. This real-time information is an important opportunity for the development and implementation of dynamic site characterisation strategies.
5. However, there is no “one best” solution. Choices of site investigation approach should be made on a rational basis, with the site conceptual model (or its development) as the goal. The site conceptual model also serves as the basis for risk assessment and for remediation. The information needed for risk assessment is not necessarily the same as that needed for remediation planning; a site conceptual model, however, provides a tool to integrate this information and to approach its collection in a systematic manner.
6. Site investigation data need to be integrated over time as well as over space, so that trends in contaminant behaviour can be assessed (a simple trend-test sketch follows this list).
7. The current hurdle to wider use of on-site technologies is that they have yet to be comprehensively benchmarked against more conventional measurements, and that it is not clear how they should be verified on a site-by-site basis. Such benchmarking can be particularly important for the acceptance of techniques by regulators. It may be an oversimplification, but the meeting divided on this point: industry sees the development of benchmarking and verification as something service providers should carry out, while service providers see it as something industry should contribute to as long-term beneficiaries (e.g. of better-value site investigations). Another important perspective is that benchmarking can be counter-productive if it is carried out in the absence of a comprehensive understanding of site conditions and of the analytical techniques serving as the benchmarks.
8. Statistical inference is an important part of the interpretation of site investigation data. A variety of techniques from population statistics and geostatistics have been applied, along with an emerging use of “fuzzy logic”. However, there was no clear technical consensus among the speakers on statistics at this meeting as to the optimum approach; rather, several techniques appear to have value and to be favoured by individual speakers. Statistics is seen by many stakeholders as a “black art”, and its underlying assumptions and limitations are not always made explicit. There is also something of a lag between the new possibilities opened up by dynamic site characterisation approaches and how these can best be exploited statistically. One thing, however, is clear: no matter how good the statistician, he or she cannot make up for inadequate data for heterogeneous ground environments (the final sketch following this list illustrates this point).
9. The surveys of site investigation costs and strategies showed significant variation both between and within countries. While some of this variation may be due to “physical” causes, e.g. different hydrogeologies, a substantial amount of variation is due to differences in regulatory and other stakeholder perspectives and requirements.
10. Regulatory systems do not necessarily take into account the opportunities afforded by the state of the art in site characterisation. The take-up of new ideas such as dynamic site characterisation varies from country to country: it is clearly constrained by regulation in some countries, whereas others recognise that a more flexible approach is necessary. This led some delegates and speakers to call for some form of European harmonisation of the regulation of site characterisation. It is not certain that this would benefit all countries: it would reduce flexibility and reduce forward-looking Member States’ ability to respond to new techniques.
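To make point 1 concrete, the following minimal Python sketch shows how independent error sources combine, and why tightening laboratory precision barely reduces overall uncertainty when sampling variability dominates. All figures (the relative standard deviations and sample counts) are invented for demonstration and do not come from the workshop report.

    import math

    # Assumed (hypothetical) relative standard deviations:
    rsd_sampling = 0.50   # field sampling of heterogeneous soil
    rsd_lab      = 0.05   # high-precision off-site laboratory analysis
    rsd_onsite   = 0.20   # cheaper, less precise on-site measurement

    def total_rsd(rsd_samp, rsd_anal):
        """Independent error sources combine as a root sum of squares."""
        return math.sqrt(rsd_samp**2 + rsd_anal**2)

    print(total_rsd(rsd_sampling, rsd_lab))     # ~0.50: lab precision buys little
    print(total_rsd(rsd_sampling, rsd_onsite))  # ~0.54: barely worse overall

    # The standard error of the site mean falls with sample count n, so the
    # same notional budget may be better spent on more on-site measurements:
    n_lab, n_onsite = 10, 30
    print(total_rsd(rsd_sampling, rsd_lab) / math.sqrt(n_lab))       # ~0.16
    print(total_rsd(rsd_sampling, rsd_onsite) / math.sqrt(n_onsite)) # ~0.10

Under these assumed numbers, thirty cheap on-site measurements pin down the site mean more tightly than ten precise laboratory analyses, which is the essence of the argument in point 1.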
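Point 6 can likewise be illustrated with a minimal trend test. The sketch below computes the Mann-Kendall S statistic, a standard non-parametric indicator of a monotonic trend in a monitoring time series; the quarterly concentrations are invented for demonstration only.

    # Hypothetical quarterly groundwater concentrations (mg/l) at one well:
    conc = [120, 115, 118, 102, 96, 99, 88, 81, 85, 72]

    def mann_kendall_s(x):
        """S > 0 suggests an upward trend, S < 0 a downward trend."""
        s = 0
        for i in range(len(x) - 1):
            for j in range(i + 1, len(x)):
                s += (x[j] > x[i]) - (x[j] < x[i])
        return s

    print(mann_kendall_s(conc))  # -39: strongly negative, concentrations falling

A full assessment would also compute the variance of S to test significance, but even this simple statistic only becomes meaningful once data from repeated sampling rounds are held together over time, which is the point being made.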
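Finally, the closing remark of point 8, that no statistician can rescue inadequate data in heterogeneous ground, can be demonstrated with a small simulation. The lognormal field parameters and sample counts below are invented for demonstration only.

    import random, statistics

    random.seed(1)
    # A strongly skewed, heterogeneous "contaminant field" (hypothetical):
    field = [random.lognormvariate(3.0, 1.5) for _ in range(10_000)]
    true_mean = statistics.mean(field)

    for n in (5, 20, 100):
        # Spread of the estimated site mean over 1000 repeated surveys,
        # each drawing n samples at random from the field:
        estimates = [statistics.mean(random.sample(field, n))
                     for _ in range(1000)]
        print(f"n={n:3d}: spread +/- {statistics.stdev(estimates):.1f} "
              f"around true mean {true_mean:.1f}")

With only a handful of samples, the estimated site mean scatters wildly around the true value regardless of how the numbers are subsequently treated; the density and placement of sampling, not statistical sophistication, control the reliability of the answer.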