- Station representativeness
- Handling changes in observation data uncertainty
- Performance criteria for high percentile values
- Data availability
- Application of the procedure to other parameters

When performing validation using the Delta Tool, it is helpful to look at both NOx and NO2: the former pollutant is less influenced by chemistry and is therefore a better measure of a model's ability to represent dispersion processes. The NOx uncertainty is not available, but for now it can be approximated by the NO2 uncertainty (Table ).


## Open issues - Guidance Document on Model Quality Objectives and Benchmarking (page 6/10, 23.11.2017)

## 5.5. Open issues

This section introduces a few topics on which there is as yet no consensus but which merit further consideration.

### Data assimilation
The AQD suggests the integrated use of modelling techniques and measurements to provide suitable information about the spatial and temporal distribution of pollutant concentrations. When it comes to validating these integrated data, several approaches can be found in the literature, all based on dividing the set of measurement data into two groups: one for the integration and one for the evaluation of the integrated fields. The challenge lies in how to select the set of validation stations. By repeating the procedure, e.g. with a Monte Carlo approach, until every station has been included at least once in the evaluation group, validation becomes possible for all stations. A specific case is the 'leaving one out' method, in which all stations are included in the integration except the single station to be validated. By repeating this procedure for each station in turn, all stations can be validated; 'leaving one out' therefore requires as many re-analyses as there are stations. FAIRMODE's Cross-Cutting Activity 'Modelling & Measurements' is currently investigating which of these methodologies is most robust and applicable in operational contexts.
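As an illustration, the 'leaving one out' loop can be sketched as follows. The station identifiers and values, and the use of a plain mean as a stand-in for the integration (re-analysis) step, are illustrative assumptions only; a real re-analysis would produce a full concentration field:

```python
# Hypothetical station observations (station id -> annual mean); values are made up.
observations = {"ST01": 21.4, "ST02": 34.0, "ST03": 28.7, "ST04": 19.9, "ST05": 25.3}

def integrate(obs_subset):
    """Stand-in for the measurement/model integration (re-analysis) step:
    here simply the mean of the assimilated stations."""
    return sum(obs_subset.values()) / len(obs_subset)

# One re-analysis per station: withhold the station, integrate the rest,
# then compare the integrated value against the withheld observation.
residuals = {}
for station in observations:
    training = {s: v for s, v in observations.items() if s != station}
    residuals[station] = integrate(training) - observations[station]

for station, r in sorted(residuals.items()):
    print(f"{station}: residual {r:+.2f}")
```

Note that the loop runs the integration once per station, which is exactly why 'leaving one out' requires as many re-analyses as there are stations.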
### Station representativeness

In the current approach only the uncertainty related to the measurement device is accounted for, but another source of divergence between model results and measurements is the lack of spatial representativeness of a given measurement station. Although objectives regarding the spatial representativeness of monitoring stations are set in the AQD, they are not always fulfilled under real-world conditions. The formulation proposed for the MQO and MPC could be extended to account for the lack of spatial representativeness once quantitative information on the effect of station (type) representativeness on measurement uncertainty becomes available.

### Handling changes in observation data uncertainty
As defined in Section 5.2, the MQO depends on the observation data uncertainty. As measurement techniques improve, this uncertainty will likely decrease over time. A consequence is that a model whose results complied with the MQO for one set of measurements could fail to fulfil the MQO for a new set of measurements obtained with the improved technique. A clear procedure is therefore needed for defining and updating the parameters used to quantify the observation data uncertainty.

### Performance criteria for high percentile values
The model quality objective described above provides insight into the average model performance but does not assess the model's capability to reproduce extreme events (e.g. exceedances). For this purpose a specific indicator is proposed:

MQI_perc = |M_perc − O_perc| / (2 · U(O_perc))    (29)

where "perc" is a selected percentile and M_perc and O_perc are the modelled and observed values corresponding to this percentile. The denominator U(O_perc) is given directly as a function of the measurement uncertainty characterizing the O_perc value. For pollutants for which exceedance limit values exist in the legislation, the percentile is chosen accordingly: for hourly NO2 the 99.8th percentile (the 19th highest value in 8760 hours), for the daily maximum 8-hour O3 the 92.9th percentile (the 26th highest value in 365 days), and for daily PM10 and PM2.5 the 90.1 percentile (the 36th highest value in 365 days). For general application, e.g. when no limit on the number of exceedances is defined in legislation, the 95th percentile is proposed. To calculate the uncertainty of the percentile, equation (22) is used with O = O_perc.
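A minimal sketch of the percentile indicator of equation (29), assuming a nearest-rank percentile definition and substituting a hypothetical flat 15% relative expanded uncertainty for the equation (22) parameterisation:

```python
import math

def percentile(values, p):
    """Nearest-rank p-th percentile (p in 0-100) of a sample."""
    s = sorted(values)
    k = max(0, min(len(s) - 1, math.ceil(p / 100 * len(s)) - 1))
    return s[k]

def mqi_perc(obs, mod, p, u_of_obs):
    """Percentile indicator |M_perc - O_perc| / (2 U(O_perc)), cf. eq. (29)."""
    o_p, m_p = percentile(obs, p), percentile(mod, p)
    return abs(m_p - o_p) / (2.0 * u_of_obs(o_p))

# Illustrative data; u is a stand-in for the real uncertainty function.
obs = [10, 20, 30, 40, 50, 60, 70, 80, 90, 100]
mod = [12, 18, 33, 37, 55, 58, 75, 78, 95, 90]
u = lambda o: 0.15 * o
print(mqi_perc(obs, mod, 95, u))
```

As with the MQO itself, a value below 1 would indicate that the model reproduces the selected percentile within the allowed margin.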
### Data availability

Currently a value of 75% is required in the benchmarking for all pollutants, both for the period considered as a whole and when time-averaging operations are performed. The Data Quality Objectives in Annex I of the AQD require a minimum measurement data capture of 90% for sulphur and nitrogen oxides, particulate matter (PM), CO and ozone; for ozone this is relaxed to 75% in winter. For benzene the Directive specifies 90% data capture (dc) with 35% time coverage (tc) for urban and traffic stations, and 90% tc for industrial sites. The 2004 Directive in Annex IV requires 90% dc with 50% tc for As, Cd and Ni, and 90% dc with 33% tc for BaP. As these requirements for minimum data capture and time coverage do not include losses of data due to the regular calibration or normal maintenance of the instrumentation, the minimum data capture requirements are, in accordance with the Commission Implementing Decision of 12 December 2011 laying down rules for the AQD, reduced by an additional 5%. For PM, for example, this reduces the required data capture from 90% to 85%.

In addition, Annex XI of the AQD provides criteria for checking validity when aggregating data and calculating statistical parameters. When calculating hourly averages, eight-hourly averages and daily averages from hourly values or eight-hourly averages, the required percentage of available data is set to 75%. For example, a daily average is only calculated if data for at least 18 hours are available. Similarly, the O3 daily maximum eight-hourly average can only be calculated if 18 eight-hourly values are available, each of which in turn requires 6 hourly values. This 75% availability is also required of the paired modelled and observed values. For yearly averages, Annex XI of the AQD requires 90% of the one-hour values, or, if these are not available, of the 24-hour values over the year to be available.
As this requirement again does not account for data losses due to regular calibration or normal maintenance, the 90% should, in line with the Implementing Decision mentioned above, be further reduced by 5% to 85%. Other criteria can be found in the assessment work presented in the EEA "Air quality in Europe" reports: 75% of valid data for PM10, PM2.5, NO2, SO2, O3 and CO, 50% for benzene, and 14% for BaP, Ni, As, Pb and Cd. In these cases it must also be ensured that the measurement data are evenly and randomly distributed across the year and the days of the week.
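The Annex XI 75% aggregation rule for daily averages can be sketched as follows; the hourly values are illustrative, with None marking a missing hour:

```python
def daily_average(hourly, hours_per_day=24, min_coverage=0.75):
    """Daily mean of the valid (non-None) hourly values, or None when fewer
    than 75% of the hours (18 of 24) are available, per Annex XI of the AQD."""
    valid = [v for v in hourly if v is not None]
    if len(valid) < min_coverage * hours_per_day:
        return None
    return sum(valid) / len(valid)

day_17h = [50.0] * 17 + [None] * 7   # 17 valid hours: below the 18-hour threshold
day_18h = [40.0] * 18 + [None] * 6   # 18 valid hours: the average is computed
print(daily_average(day_17h), daily_average(day_18h))
```

The same check, applied to paired modelled and observed values, implements the 75% availability requirement used in the benchmarking.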
## MPC fulfilment criteria: improved statistical basis for the MQO
By treating the requirement that the MPC should be fulfilled in at least 90% of the observation stations as a requirement on the confidence interval of the differences between observed and modelled values, an alternative basis for the MQO can be derived. The Model Quality Objective derived above rests on the simple principle of allowing a similar margin of tolerance to both model and observations.

Assume a set of normally distributed data pairs consisting of observations and model calculations, with standard deviations σ_O and σ_M. As the observations and model calculations are assumed to follow a normal distribution, their difference does too; let σ_d denote its standard deviation. For the MQO we require that 90% of the differences between observations and model results lie between −2U(O) and +2U(O). With U(O) = 2σ_O, this means that, in statistical terms, the 90% confidence interval of the differences is ±4σ_O. From this it follows that the 95% confidence interval of the differences, defined by ±2σ_d, is given by ±(2.0/1.64) · 4σ_O; the factor 2.0/1.64 = 1.22 accounts for the difference between the 90% and 95% confidence intervals. The standard deviation of the differences can therefore be expressed as σ_d = β σ_O, with β = 4/1.64 ≈ 2.44. Since observations and model results are assumed independent, σ_d² = σ_O² + σ_M², so that β² σ_O² = σ_O² + σ_M², or (β² − 1) σ_O² = σ_M², and finally σ_M = √(β² − 1) σ_O ≈ 2.2 σ_O.

Conclusion: the present Model Quality Objective (MQO) implies that the uncertainty in the model results can be up to roughly twice as high as the measurement uncertainty.
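A quick numerical check of this derivation, assuming U(O) = 2σ_O and the two-sided z-scores 1.64 and 2.0 for the 90% and 95% confidence intervals used in the text:

```python
import math

z90 = 1.64                        # two-sided z-score for a 90% confidence interval
beta = 4.0 / z90                  # sigma_d = beta * sigma_O, from 1.64*sigma_d = 4*sigma_O
ratio = math.sqrt(beta**2 - 1.0)  # sigma_M / sigma_O, from sigma_d^2 = sigma_O^2 + sigma_M^2
print(round(beta, 2), round(ratio, 2))
```

The printed ratio of roughly 2.2 is what supports the conclusion that the model uncertainty may be about twice the measurement uncertainty.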
### Application of the procedure to other parameters

Currently only PM, O3 and NO2 have been considered, but the methodology could be extended to other pollutants, such as the heavy metals and polycyclic aromatic hydrocarbons covered by Ambient Air Quality Directive 2004/107/EC. The focus of this document is clearly on applications related to the AQD and thus on the pollutants and temporal scales relevant to the AQD. However, the procedure can of course be extended to other variables, including meteorological data, as proposed in Pernigotti et al. (2014). In the Table below, values are proposed for the parameters in equations (23) and (29) for wind speed and temperature data.
## Table: List of the parameters used to calculate the uncertainty for the variables wind speed (WS) and temperature (TEMP)

| Variable | k | u_r | RV | α | Np | Nnp |
| --- | --- | --- | --- | --- | --- | --- |
| WS (test) | 2.00 | 0.130 | 5 m/s | 0.800 | NA | NA |
| TEMP (test) | 2.00 | 0.025 | 25 K | 1.000 | NA | NA |
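As an illustration, the table's parameters can be plugged into the uncertainty expression used for the MQO. The column roles assumed here (coverage factor k, relative uncertainty u_r, reference value RV, then α; only the α column label survives in the source) and the exact form of the expression are reconstructions, not taken verbatim from the document:

```python
import math

def expanded_uncertainty(obs, k, u_r, rv, alpha):
    """Sketch of the expanded measurement uncertainty parameterisation,
    U(O) = k * u_r * sqrt((1 - alpha^2) * O^2 + alpha^2 * RV^2).
    The mapping of table columns onto (k, u_r, RV, alpha) is an assumption."""
    return k * u_r * math.sqrt((1.0 - alpha**2) * obs**2 + alpha**2 * rv**2)

# WS (test) row of the table: k = 2.00, u_r = 0.130, RV = 5 m/s, alpha = 0.800,
# evaluated at an observed wind speed of 6 m/s.
print(round(expanded_uncertainty(6.0, 2.00, 0.130, 5.0, 0.800), 3))
```

With α close to 1, as for temperature, the uncertainty becomes nearly independent of the observed value and is dominated by the reference value RV.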