Belgium
First Substantive Report

1. INTRODUCTION

1.1. The aim of the CSD Work Programme on indicators

The aim of the CSD with respect to indicators of sustainable development is to have a tool available to measure progress towards sustainable development for all countries to use, based on their national priorities, by the year 2000. In order to meet this goal, the testing phase is planned from November 1996 until December 1999 and will attempt to assess and evaluate the efficiency and effectiveness of the preliminary list of 134 indicators available from the CSD working list as of August 1996. This list provides a menu of indicators from which countries can choose according to national goals, targets and priorities for sustainable development.

The main goal of the indicators programme is to create a viable and flexible system for monitoring progress on sustainable development strategies, policies and activities. The indicators are tools and not ends in themselves and should be (re)viewed and adapted according to the national decision-making processes. They will differ from country to country depending on the national objectives and targets, infrastructure, expertise and availability of data and information and its influence on the understanding of sustainable development. It is understood that the users of the indicators will have different needs and therefore the appropriate set of indicators will depend on their particular use. The selection of indicators, to the extent possible, should include the four categories of economic, environmental, social and institutional aspects of sustainable development as set out in the framework.

Regular reporting on the testing phase will assist countries and the CSD in the final assessment and evaluation of the usefulness of the CSD indicators.

In this context, we are requesting countries participating in the testing phase to fill out the following questionnaire. Please feel free to give additional comments wherever applicable with reference to the corresponding section of the report. You may use extra sheets for the comments, if needed.

1.2. Country description

Please give a short description of your country's background and aims for the testing programme, including a summary of progress to date and focus for the next year:

Context

In order to contribute to the implementation of Chapter 40 of Agenda 21 at the international level, Belgium has supported the efforts of the CSD to develop and adopt a work programme on ISDs. As a first step towards this objective, Belgium hosted an International Workshop on Indicators of Sustainable Development in 1995, organised jointly with the Government of Costa Rica, UNEP and SCOPE (Scientific Committee on Problems of the Environment).

Consensus was reached during this first Workshop on the principle of a Working List of ISDs (from which countries could select the indicators they may use in their national policies, according to their own problems, policies and targets) and on the need for a set of methodology sheets to design and explain each of these indicators. These principles and the list were adopted at the third and fourth sessions of the Commission on Sustainable Development, in April 1995 and 1996. In November 1996, Belgium hosted a second international workshop aimed at structuring and launching the testing of the indicators. This workshop succeeded in harmonising methodological approaches for the test and produced a set of guidelines for implementing it.

In the meantime, Belgium had decided to become a "testing country" in the CSD Program on Indicators of Sustainable Development. The decision was taken on the 4th of June 1996 by the Inter-ministerial Conference on the Environment (ICE), chaired by the Federal State Secretary for Environment and composed of the three Regional Ministers of Environment and the Federal Minister of Science.

Progress to date

As the ICE is thus responsible for the testing of the environmental part of the Working List, it can be regarded as the National Coordinating Mechanism (NCM) for the testing of environmental ISDs.

The ICE has established a Working Group in charge of implementing the test. This Working Group is mainly composed of representatives from the Regional and Federal Environment Ministries. As Belgium's international focal point for sustainable development, as well as the Belgian focal point for the testing of ISDs, the Federal Planning Bureau (FPB) also belongs to this Group; it assists the participants with all available information regarding the CSD process on indicators.

As a first step, the group assessed the availability of data in the different Regions and at the federal level. The group concluded that data are, in principle, immediately available for 21 of the 57 environmental indicators of sustainable development, and that five indicators are not relevant to the Belgian situation. The ICE then decided, on 12 November 1996, that Belgium would start the testing with an in-depth analysis and completion of the methodology sheets for three indicators: domestic consumption of water per capita, use of agricultural pesticides, and household waste disposed per capita. So far, only two of these indicators (domestic consumption of water per capita and use of agricultural pesticides) have been fully analysed with the help of the methodology sheets.

Regarding the other ISDs, the testing of social and economic ISDs, which started with an assessment of data availability, will continue with the evaluation of their methodology sheets. As for institutional ISDs, a research programme has produced an excellent report offering a framework for the elaboration and testing of indicators.

The testing process has thus been launched in Belgium at the political and administrative level for the environmental ISDs, but its further implementation, and the possibilities for opening it to major group participation, will depend on the level of human and financial resources allocated to it by all the departments concerned (not only Science and Environment).

The Federal Planning Bureau completed this Format for Reporting on Progress of the National Testing of Indicators of Sustainable Development, required by the CSD to assess the implementation of the test. The first draft was submitted for comments during a meeting of the Working Group (29/01/98). The comments and remarks formulated there helped to improve and finalize the present report.

Focus for the next year

Regarding the test of environmental indicators, the Working Group should evaluate the methodology sheets of the other 18 environmental indicators for which data are available.
Regarding the test of social and economic indicators, several working groups should be set up to assess the methodology sheets of these indicators.
Regarding the test of institutional indicators, no decision has yet been taken.

2. ORGANIZATION OF THE TESTING PHASE

2.1. Focal Point and National Coordinating Mechanism (NCM)

Focal Point of the Testing:

Bureau fédéral du Plan - Nadine Gouzée
Mailing address: 47/49 Avenue des Arts, B-1000 Bruxelles, Belgium
Telephone no.: (32-2) 507-7311; Fax no.: 507-7373; E-mail: ng@plan.be

Name of NCM for the Testing of Environmental Indicators:

Inter-ministerial Conference on Environment

Ministries and other bodies included in the NCM:

The Federal State Secretary for Environment (chair), the three Regional Ministers of Environment and the Federal Minister of Science.

Mandate and modus operandi (incl. schedule of meetings) of NCM:

The Inter-ministerial Conference on Environment is the official coordination body for environmental matters in Belgium. The Conference is in charge of all environmental issues that require coordination between the Regions and the federal authority, and takes the decisions related to this coordination.

2.2. Major groups and other stakeholders

2.2.1. List which major groups and stakeholders got involved, if any, at what stage of the testing programme, how this happened and what their contribution has been (for example, did they help to identify key national priorities, assist in indicator selection, or help in data collection, etc.):

Major group or stakeholder: NGOs
Phase of involvement: Not directly involved in the testing
Contribution: Through two contracts from the Environment Department, two environmental NGOs were able to address the issue of ISDs:
- they developed their own opinion on the CSD working list, pointed out gaps in it, identified key national priorities and proposed some new indicators;
- they increased public awareness of ISDs and informed other associations (mainly environmental ones) about this issue.

Major group or stakeholder: Scientific Community
Phase of involvement: Not directly involved in the testing
Contribution: Several research projects aimed at developing new indicators: general indicators of sustainable development, sectoral indicators, urban indicators, institutional indicators.

2.2.2. What were the criteria and/or mechanisms, if any, for the selection of the major groups and stakeholders involved?

NGOs: NGOs were involved in the development of ISDs through two contracts provided by the Environment Department. These two contracts went to two important environmental NGOs (Inter-Environnement Wallonie, Vlaams Overleg Duurzame Ontwikkeling), one in the Flemish Region and one in the Walloon Region.

Scientific Community: The Science Policy Department and the Environment Department financed some universities and research institutes to work on ISDs.

2.2.3. What are the main lessons learned regarding the involvement of major groups and other stakeholders during the testing?

NGOs: NGOs showed great interest in the development of ISDs and would like to be more closely involved in the process. They formulated very helpful comments on the working list of ISDs, as well as on the issues it does not cover.

Scientific Community: Much research is being conducted on the issue of indicators, but no single body centralizes this information (the Science Policy Department covers only part of the research). Coordination between the research projects should be increased to improve the process.

3. THE TESTING

3.1. Implementation process

3.1.1. Please describe the steps taken by your country to implement the testing of indicators.

Regarding environmental indicators:
  1. determine the NCM;
  2. NCM set up a working group to implement the test of the environmental ISDs;
  3. identify for each indicator whether the matter covered by the indicator is under federal and/or regional responsibilities;
  4. assess the availability of data: data are immediately available for 21 of the 57 indicators;
  5. decision of the NCM to test 3 of these indicators: domestic consumption of water per capita, use of agricultural pesticides, household waste disposed per capita;
  6. share the work between the members of the group, and start the evaluation of the methodology sheets of these 3 indicators;
  7. completion of two reports commenting on two of these indicators: domestic consumption of water per capita and use of agricultural pesticides;
  8. share the evaluation of the methodology sheets for the other 18 indicators among the members of the group.
Regarding social and economic indicators:
  1. determine whether these indicators are already used in our country;
  2. establish priority issues and indicators;
  3. assess the general availability of data;
  4. try to organize the following steps of the testing.

3.2. Institutional support and capacity-building

3.2.1. What capacity-building needs have been identified as necessary to implement the testing programme (e.g., number of people, financial needs)?

Regarding environmental indicators, each issue requires a specialist, or at least a person closely involved in that specific issue, to give relevant advice on the indicators. Many different people must therefore be involved in the testing. Moreover, as the Regions are the main authorities on environmental matters, this further multiplies the number of people dealing with the subject.

Regarding the general organization and momentum needed to implement the test, one person should be assigned full-time to the project. This would speed up the process and contribute to its effectiveness.

3.2.2. Was it necessary to provide additional training to staff in order to undertake the required data collection, compilation and analysis? If yes, please specify type and structure of the training.

Not yet, because the working group chose to start the testing only with the indicators for which data were directly available.

Regarding environmental data, a specific group is already in charge of collecting and compiling such data. It would be useful to collaborate with this group in the future in order to benefit from its capacities.

3.3. Twinning arrangements

3.3.1. Did you have twinning arrangements with any other country? If so, please specify:

Belgium organized two workshops in Ghent in collaboration (or twinning) with Costa Rica. However, no twinning was undertaken by Belgium for the implementation of the test at this stage. The main reason is that not enough human resources are available for this project, and collaboration with another country would require additional resources.

Country: no
Initiation of the Cooperation: /
Nature of Cooperation: /
Results: /
Future plans: Proposals for twinning or other cooperation agreements received by Belgium are currently being examined carefully, while additional capacity is being sought.

3.4. National strategies and indicator selection

3.4.1. Please use the table below to provide (if possible) a list of key national sustainable development plans and priorities, give a brief description of them and list what indicators are/will be used for monitoring this issue. This will facilitate identifying and comparing key indicators related to priority issues of sustainable development across countries and regions.

There is not yet a real sustainable development plan, either at the federal level or at the regional level.

Nevertheless, the Regions have already adopted various plans dealing with sustainable development themes. For example, the Walloon Region adopted an "Environment Plan for Sustainable Development" in March 1995, after public consultation; a new plan is foreseen within five years. This Region also has plans for development planning and for waste management, and plans for the management of water, nature, mobility, air and soil are foreseen in the near future. The Flemish Region has several plans related to the theme of sustainable development: an Environmental Management Plan, a programme to implement Agenda 21, Structural Development Planning Flanders, and Waste Planning. The Brussels Region has a Regional Development Plan (development planning, economy) and will soon adopt a Waste Plan, after public consultation; other plans, for the management of air quality and noise, are in development. Some of these plans include indicators, but not all of them. The use of ISDs could facilitate and encourage the monitoring of all these plans.

At the federal level, a Bill on the Coordination of the Federal Policy on Sustainable Development was voted last year; it binds the government to produce a report on sustainable development every two years and a sustainable development plan every four years.

The first report will be completed at the end of 1998. Indicators of sustainable development will be integrated into this report and could indicate the main trends in its priority sectors. Our aim is to use some of the ISDs from the working list. This would allow us to test, through the report, the technical quality of these indicators as well as their relevance for decision-making.

Please use a separate table for each key plan and priority.

National plan or strategy for sustainable development (if any):/
Key sustainable development priority:/
Indicators selected for monitoring this priority issue/plan:/

4. DECISION-MAKING ISSUES AND POLITICAL USEFULNESS

4.1. Integration of indicators in decision-making

4.1.1. Please describe how the indicators are or could be integrated into the decision-making process with respect to:

a) Application in policy analysis and planning;

In the Flemish Region, demographic and environmental indicators are published annually in VRIND. The Flemish Government has created a database, the Functional Regional Database (FRED), to cover all types of policy-relevant indicators. All Flemish agencies concerned with environment and sustainable development policies are connected to this database.

The Walloon Region publishes the Walloon State of the Environment every three years. It has proved an excellent decision-making tool for both the public and private sectors in the Walloon Region. The Walloon Government is also developing a central database of environmental data and indicators.

In the Brussels Region, the implementation of existing plans includes the use of indicators to assess the performance of the policies and actions launched in those fields (waste management and promotion of the biological heritage). About fifteen environmental and social indicators, grouped in a "dashboard", are published regularly, as are State of the Environment reports (1990, 1994). Since October 1996, an environmental statistical observatory has been in place to collect data and produce indicators. A report on the notion of "indicators of sustainability", which has different interpretations, has been published.

As noted in point 3.4.1, some ISDs will be used in the Federal Report on Sustainable Development that the Federal Planning Bureau has to prepare. This report will be "communicated to the Commission and to the other Minister, who transmits it to the Council of Ministers, to the legislative Chambers, to the Council and to the governments of the Regions and the Communities".

b) Application for integrated modeling, forecasting or scenario analysis;

The Belgian Bill states that the report has to include a section on forecasting: "a description of the development expected in case of unchanged policy and in case of a change in policy, according to a number of relevant scenarios". In this context, some indicators will be used to model and forecast different scenarios.

c) Application for integrated environmental reporting and assessment;

d) Green accounting;

Some research is being conducted to develop a system of green accounting. One of these projects is a European one piloted by Eurostat. However, this work is too recent to allow the research on green accounting and that on ISDs to be integrated.

e) Other application in communicating overall government performance to the public;
No

4.2. Integration of indicators in decision-making

4.2.1. How and when are indicator trends and the related analytical information conveyed to decision-makers (for example which media are used: print or electronic; short summaries, reports, or comprehensive publications)?

Almost all information is conveyed to decision-makers through reports and comprehensive publications.

4.3. First evaluation

4.3.1. Have you evaluated the usefulness of indicators to national decision-makers and if so, what are the results:

Here are the results obtained through the evaluation of the two indicators tested at this stage in Belgium:

Use of agricultural pesticides: this indicator is too global to be a relevant tool for decision-makers in Belgium, according to the Working Group. Decision-makers fear in particular that the indicator could be misinterpreted. More detailed indicators would be preferable and would be more likely to be used.

The debate on the indicator was nevertheless very helpful in developing reflection on the possibilities of using such indicators, an issue that was not on decision-makers' agendas and programmes before.

Domestic consumption of water per capita: this indicator could be relevant for decision-making in Belgium, but not in the way explained in the methodology sheet (access to a minimum quantity of water per person). To be relevant in Belgium, the indicator should show whether water is being wasted. A maximum level of water consumption per capita per day could help decision-makers.

5. OVERALL ASSESSMENT

Please note that the assessment of the individual indicators will be dealt with in Section Two of the report containing the Indicator Report Sheet.

5.1. Assessment of indicator menu, organization and methodology sheets

At this stage of the testing, no in-depth debate on the working list of ISDs and their methodology sheets has taken place in the Working Group. Nevertheless, some general remarks can be made.

5.1.1. Please give an overall assessment of the CSD indicator menu and its appropriateness for the purpose of monitoring progress towards sustainable development.

The working list seems too long. It discourages the people who have to test all these indicators and increases the difficulty of communicating about them.

There is some overlap between indicators that could be avoided. For example, there are four ISDs for poverty that have almost the same meaning.

5.1.2. Did your country find the organization as given in the CSD list of indicators a useful classification system?

If not, please indicate why according to organizational elements such as Agenda 21 chapters, the four categories or the driving force - state - response framework.

The driving force - state - response framework is not really appropriate for classifying social and economic indicators.

The division of the ISDs into four categories does not give a clear view of the interlinkages between the social, economic, environmental and institutional dimensions.

5.1.3. Please describe what efforts have been made to identify interlinkages between the selected indicators. If possible, please include in your response issues of aggregation.

No efforts have been made yet.

5.1.4. If applicable, please give your country's proposals for improvements in the overall organization and menu of indicators, keeping in mind that the assessment of the individual indicators will be dealt with in the Indicator report sheet.

No proposals have been made yet.

5.1.5. Please give an overall assessment of the usefulness of the methodology sheets developed for each indicator.

Our Working Group found the methodology sheets very useful for understanding and compiling the indicators tested. However, the working list is oriented towards international definitions. As the indicators have to be used at the country level, it would be useful to add to each sheet a paragraph on the national meaning of the indicator, which each country could fill in itself.

5.2. Lessons learnt and changes proposed

5.2.1. Describe the main lessons learnt in the testing so far, e.g., institutional, financial and data-related.

Successes:

Increased awareness, among the civil servants participating in the testing, of the need to implement sustainable development;

Raised awareness of the demand side of indicators and of the need to discuss their adequacy for the challenges of decision-making;

Clarified the aim of the testing as distinct from the collection of data to compile indicators: the testing is a demand-oriented activity, aiming to select the priority and relevant indicators for decision-making, whereas the collection of data to compile indicators is a supply-oriented activity.

Problems:

The testing is time-consuming. It is therefore necessary to organize it with structure and planning, to set a timetable for its implementation, and to try to keep to this timetable. Moreover, the issue of data availability should not overwhelm the task of the Working Group, which must focus on the technical adequacy and performance of the indicators for decision-making.

The comments made so far on the methodology sheets for the two indicators tested are not sufficiently structured and detailed to answer all the questions of the Indicator Report Sheet. For the indicators to be tested in the near future, the testing group will base its work directly on the Indicator Report Sheet. This will avoid needlessly multiplying the reporting work.

5.2.2. Will there be changes in the national testing programme as a response to the results of the first year of testing? If yes, please specify what you intend to change (e.g., do you intend to change the organizational structure of the NCM, or to use different or additional indicators and if so, which ones etc.).

Most changes have already been mentioned in the previous questions. They are the following:

- work directly with the IRS for the evaluation of the methodology sheets;
- give the process more structure, set up a timetable and try to follow it;
- concentrate on the technical adequacy and performance of indicators for decision-making, rather than on the availability of data;
- for the testing of social and economic ISDs, distinguish between the work on the technical features of the indicators and the work on their policy relevance.

6. INDICATOR REPORT SHEET (IRS):
Domestic consumption of water per capita

6.1. Name of indicator: Domestic consumption of water per capita

Key issue(s) addressed:

The Working Group agreed on the relevance of this indicator for assessing the sustainability of development. However, the indicator should also reflect that, in most developed countries, the problem caused by water consumption is one of waste, which jeopardizes the available resources.

This indicator falls under regional responsibilities and is computed at the regional level. This means that data have to be collected in the three Regions and harmonized to obtain one national indicator.

Category: environmental
Placement in Agenda 21: Chapter 18
Placement DSR: driving force

SUMMARY

Brief Definition:

Domestic consumption of water per capita is the amount of water consumed per person for the purposes of ingestion, hygiene, cooking, washing of utensils and other household purposes, including garden use. Where it is customary for domestic animals to be kept at or in the living environment, their needs are also included in the assessment.

The Working Group mentioned that this definition is different from the one used by Eurostat.

Units of measurement: litres per capita per day

Set Targets - International: Agenda 21 established a target of access to at least 40 litres per capita per day of safe water in urban areas by the year 2000.
Set Targets - National: A maximum target of 120 litres per capita per day has been proposed in the working group.

Current level:

Walloon Region: population 3,208,289 (1992); domestic consumption 145,875,000 m³; 127.57 litres/capita/day
Flemish Region: population 5,824,626 (1992); 92 litres/capita/day (water meters 0-250 m³), 124 litres/capita/day (water meters 0-1000 m³)
Brussels Region: population 954,931 (1995); domestic consumption 44,333,000 m³; 127.19 litres/capita/day
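
For reference, the per-capita values above follow from a simple conversion of an annual volume and a population into litres per capita per day. The short sketch below only illustrates that arithmetic and is not part of the CSD methodology sheet; it reproduces the published Brussels figure, whereas the Walloon and Flemish values also reflect the region-specific corrections (deduction of industrial and public consumption, water meter classes) described under point 2.4 of this sheet. The 120 litres/capita/day threshold used in the check is the national maximum proposed by the Working Group.

    # Illustrative conversion only (not the CSD methodology):
    # annual volume in m3 and population -> litres per capita per day.
    def litres_per_capita_per_day(annual_m3: float, population: int, days: int = 365) -> float:
        litres_per_year = annual_m3 * 1000          # 1 m3 = 1000 litres
        return litres_per_year / population / days

    # Brussels Region, 1995: 44,333,000 m3 for 954,931 inhabitants
    value = litres_per_capita_per_day(44_333_000, 954_931)
    print(round(value, 2))   # ~127.19 litres/capita/day
    print(value <= 120)      # False: above the proposed national maximum of 120 l/capita/day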

1. INFORMATION ON INDICATOR SELECTION

1.1. Was the indicator already being used in your country (Yes/No)?

NO

1.2. Why was this indicator selected? Was it because

a) it was related to national priorities on sustainable development?/
b) data was available?/
c) it was part of an existing indicator programme?/
d) other?/

If d), please expand on the reason and if possible, list selection criteria in order of importance: /

1.3. Please give an assessment to the extent possible of the appropriateness of the indicator for describing trends for the issue addressed. This may be in terms of how adequate, useful and problem oriented the indicator is.

This indicator is meaningful as a quantitative reference at the international level. At the national level, however, the problem is slightly different. In Belgium, almost 100% of the population has sufficient access to safe water. The indicator would therefore only be relevant with regard to a maximum level of water consumption, above which water is considered wasted. It is proposed that this indicator become, for a country like Belgium, a "waste indicator". The Working Group proposed a maximum level of 120 litres/capita/day.

2. INFORMATION REGARDING THE METHODOLOGY SHEET FOR THIS INDICATOR

2.1. Please give a brief overall assessment of the methodology sheet for this particular indicator.

No remarks have been made on the methodology sheet of this indicator. It was helpful for understanding the meaning of the indicator.

2.3. Please describe the underlying definitions and concepts if different from the CSD list of indicators and explain why these differing definitions and concepts were chosen:

Cf: 1.3

2.4. Describe the measurement method if different from the one in the CSD list of indicators and explain why this differing method was chosen:

Differences:

The main difference in the measurement method is that the indicator is computed on the basis of water distribution rather than water consumption. This creates two problems. Firstly, the water distributed through the public network is not consumed only by households; part of it may be consumed by small enterprises, and this part can only be deducted approximately. Secondly, some inhabitants draw water from their own wells, and this consumption is not taken into account.

The measurement method varies across the three Regions:

The Walloon Region computes water consumption on the basis of the water invoiced to all consumers (households and companies) connected to the public distribution network. Industrial and public consumption are then subtracted, and the resulting amount is divided by the number of inhabitants.

The Brussels Region computes water consumption on the basis of water meter readings. A distinction is made between industrial and household consumption: consumption is industrial when the water is used in the production process and forms part of the cost price of the final products, and domestic when it only covers household needs.

The Flemish Region computes water consumption on the basis of water meter readings (0-250 m³ and 0-1000 m³). It then determines the number of inhabitants connected to the distribution network and assesses, on the basis of these readings, the number of litres consumed per capita per day.
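
To make the Walloon approach above concrete, here is a minimal sketch with purely hypothetical figures: household consumption is obtained by subtracting industrial and public consumption from the water invoiced on the public network, then dividing by the number of inhabitants and converting to litres per day. It only illustrates the description given above and is not the Region's actual calculation routine or data.

    # Sketch of the Walloon measurement method described above; all figures are hypothetical.
    def walloon_domestic_consumption(invoiced_m3: float, industrial_m3: float,
                                     public_m3: float, inhabitants: int,
                                     days: int = 365) -> float:
        household_m3 = invoiced_m3 - industrial_m3 - public_m3   # remove non-domestic uses
        return household_m3 * 1000 / inhabitants / days          # m3 -> litres, per person per day

    # Hypothetical example: 200 million m3 invoiced, of which 40 million industrial
    # and 10 million public, for 3.2 million inhabitants connected to the network.
    print(round(walloon_domestic_consumption(200e6, 40e6, 10e6, 3_200_000), 1))   # ~128.4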

3. ASSESSMENT OF DATA

3.1. The following information regarding data quality, quantity and availability is requested. Indicate with a (X) or report as appropriate.

Indicator: Domestic consumption of water per capita
Does copyright apply: No
Restrictions on use: No
Additional cost involved to obtain data: No
Method of data collection: Monitoring
Update frequency:
