Data Quality Index Consultation - sub-section Overall Questions

Instructions for submitting your feedback
  1. Feel free to share your overall feedback on the Data Quality Index Consultation through the comment-box below.
  2. Consider the guiding questions as outlined.

Guiding questions:

  • In five words, how would you describe good quality IATI data?
  • Do you have suggestions on ways of using the Data Quality Index to incentivise better quality publishing and ensure increased attention by publishers on the transparency of aid data?
  • Have we missed any important measures in the DQI? If so, what is your additional proposal?
  • Should an overall summary index measure be added? If so, what weight should be given to each measure? Or should each measure be left as a separate assessment?



Comments (12)

Herman van Loon
  • In five words, how would you describe good quality IATI data?

    Data successfully used by others.

  • Do you have suggestions on ways of using the Data Quality Index to incentivise better quality publishing and ensure increased attention by publishers on the transparency of aid data?

    Publish the DQI in newsletters and other IATI publications.

  • Have we missed any important measures in the DQI? If so, what is your additional proposal?

    Regarding the availability of data, it is not enough that data have been published. Equally important is to assess that the published data are valid: e.g. only existing codes are used and references are made to existing identifiers.

  • Should an overall summary index measure be added? If so, what weight should be given to each measure? Or should each measure be left as a separate assessment?

    Yes, with emphasis on publishing valid, usable data. Weight should be given to elements relevant for network transparency and completeness of data, taking into account the role a publisher has in the network.

leo stolk
  • In five words, how would you describe good quality IATI data?

Data that is useful to, and used by, others and yourself.

  • Do you have suggestions on ways of using the Data Quality Index to incentivise better quality publishing and ensure increased attention by publishers on the transparency of aid data?

Make the DQI a dashboard, with recognition for degrees of quality in various categories. Allow publishers to use this recognition in their communications in an agreed way, with the IATI logo.

  • Have we missed any important measures in the DQI? If so, what is your additional proposal?

The length of trails and the last mile matter in terms of quality. Assessment of the effort put into helping the next peer publishers 'down the trail' could be a missing measure. How many recipient organisations publish in IATI?

  • Should an overall summary index measure be added? If so, what weight should be given to each measure? Or should each measure be left as a separate assessment?

I suggest calculating an average of the measure rankings, without too much emphasis: the individual scores per measure are most important.

matmaxgeds

Hi - my experience has been that what counts as 'good quality data' varies so much based on what you are trying to use it for that you probably need a different index/measure for each use case. E.g. if I want to know 'how much aid went to Mali in 2020', then there are many fields I am not bothered about, but comprehensiveness, use of the org_ids needed to avoid double counting, not deleting completed activities etc. are very important. That is completely different for other use cases, which would prioritise other fields. I think we need publishers to say 'I seek to meet use case X' so we can say: ok, then look at index Y, which specifically targets that use case.

Measures I would consider adding:

  1. RE document links - are the embedded links/contacts functioning e.g. some donors have a url for the project website that is just a link to the search function on their main website - this should not count. Links also need testing to see if they 404 etc.
  2. Whether old activities are published or are removed - and if the schedule for removal is shared e.g. in the org file (part of comprehensiveness e.g. if completed activities are removed in-year, you cannot make annual stats)
  3. Whether the org file indicates if the data is considered official by the publisher
  4. Whether the license allows commercial re-use
  5. RE sub-national location - I suggest removing a) the capital and b) the geographic center of the country or you will get too many false positives
  6. For coverage, the current measure will not capture cases where spend has been removed from the total expenditure element, e.g. by not including there the spend from activities that are also not being reported, such as when a publisher is only publishing a subset of their activities
  7. It might be worth a basic one on date failures e.g. start date after end date
  8. One on 'duplicate activities' as these do not consistently appear in datastores
  9. One on failures where the activity-status does not agree with the dates e.g. end date passed, marked as in implementation
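Checks 7 and 9 above are straightforward to automate. A minimal sketch, assuming ActivityStatus code "2" means Implementation as in the IATI codelist; the function name and field layout are illustrative, not part of any existing tool:

```python
from datetime import date

# IATI ActivityStatus codelist: code "2" = Implementation
# (assumption to re-check against the published codelist).
IMPLEMENTATION = "2"

def date_failures(start, end, status, today):
    """Return a list of consistency failures for one activity (sketch)."""
    failures = []
    if start and end and start > end:
        failures.append("start date after end date")
    if status == IMPLEMENTATION and end and end < today:
        failures.append("end date passed but activity marked as in implementation")
    return failures
```

For example, an activity with start 2021-06-01, end 2020-06-01 and status "2" checked on 2022-01-01 would be flagged for both failures.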

Great to see that the validator will be used. Is there somewhere in the SSOT etc. that explains what counts as a 'warning' vs an 'error' and why, so that publishers can understand the feedback? Otherwise these terms are rather arbitrary.

Yohanna Loucheur

Would agree with several of Matt's suggestions, especially regarding the basic data failure checks, and also the complexity of defining data quality - it really does depend on the use case one has in mind. It's almost like we need a big interactive dashboard to cover them adequately:

- press "Use Case A", a sub-set of measures adds up to provide a publisher's score in meeting it

- press "Use Case B", another sub-set (which includes the same mandatory elements but also some different measures) adds up and provides the relevant score (which may be completely different!);

- press "Humanitarian" or "Grand Bargain Use Case", the Grand Bargain elements add up to calculate the score. 

Granted, it would be hard to come up with a single, overall score for each publisher, but it would provide more useful guidance to data users.
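One way to realise this kind of per-use-case scoring is to keep the individual measure scores and apply a different weighting per use case. A sketch, where all measure names, use cases and weights are invented for illustration:

```python
# Hypothetical measure scores for one publisher, on a 0-1 scale.
scores = {"comprehensiveness": 0.8, "timeliness": 0.6, "org_ids": 0.9, "documents": 0.4}

# Each use case weights only the measures it cares about; weights sum to 1.
USE_CASE_WEIGHTS = {
    "country_spend": {"comprehensiveness": 0.5, "org_ids": 0.4, "timeliness": 0.1},
    "accountability": {"documents": 0.6, "timeliness": 0.4},
}

def use_case_score(scores, use_case):
    """Weighted sum of the measures relevant to one use case."""
    weights = USE_CASE_WEIGHTS[use_case]
    return sum(scores[measure] * weight for measure, weight in weights.items())
```

The same publisher can then score well for one use case and poorly for another, which matches the point that a single overall score hides more than it reveals.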

 

Evgenia Tyurina

Hello, everyone. Here is the ILO's feedback:

In five words, how would you describe good quality IATI data?

1. comprehensive

2. truthful/reliable

3. understandable

4. accessible

5. relevant

Have we missed any important measures in the DQI? If so, what is your additional proposal?

We do not have any additional proposals.

Should an overall summary index measure be added? If so, what weight should be given to each measure? Or should each measure be left as a separate assessment?

It would be good to have an overall summary index. The weight for each measure should be discussed separately. In our view, coverage, thematic focus of activities, participating org. info and timeliness of updates should be prioritised together with the financial information.

Anna de Vries

In five words, how would you describe good quality IATI data?

  • Valid, clear, measurable, complete, traceable.

Do you have suggestions on ways of using the Data Quality Index to incentivise better quality publishing and ensure increased attention by publishers on the transparency of aid data?

  • IATI remains a black box for many publishers; they are often under the impression that they are publishing correctly while there are still many errors or the data does not appear at all. If significant errors appear when publishing, the publisher could be actively notified that the publication is invalid.
  • Additionally, sending a report of the quality statistics to organisations on a more regular basis would be good, since many organisations are not yet familiar with the validator.
  • Many publishers are incentivised to improve data quality based on donor requirements, so making it insightful which errors are most important for the larger donors would help.

Have we missed any important measures in the DQI? If so, what is your additional proposal?

  • Agree that it would be a good idea to award additional ‘points’ if recipients also publish in IATI.
  • Whether the organisation has an open data policy or other information about their data choices available (could be a separate document type?).
  • Completeness of information about the publisher on the registry.

Should an overall summary index measure be added? If so, what weight should be given to each measure? Or should each measure be left as a separate assessment?

  • A summary would be a good idea, since you need to see the most important points at a glance and then be incentivised to dive deeper into the report. So a short assessment including the publisher's data quality score and/or ranking, plus the number of errors and their significance, would be very helpful. Additionally, for each measurement, users checking the errors should be steered towards guidance on how to resolve them.

Athira Lonappan

In five words, how would you describe good quality IATI data?

Accurate, Lean, Relevant, Purpose-driven

Do you have suggestions on ways of  using the Data Quality Index to incentivize better quality publishing and ensure increased attention by publishers on the transparency of aid data?

The statistics board could be made more attractive by adding a bit of gamification.
E.g. Using DQI to rank top 5 contributors of the month, have badges or rewards etc.

Have we missed any important measures in the DQI? If so, what is your additional proposal?

Right now the IATI Validator validates the schema but not the data itself. I read about the additional validation checks performed, but if the quality of data is to be optimised, it is really important to validate that all the data entered comes from the expected options. For example, a sector code which does not exist in the referenced DAC sector code list lowers the quality, so validating that information would help.
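This kind of codelist check is easy to automate. A minimal sketch, using a tiny invented subset of the DAC 5-digit sector codelist; a real check would load the full codelist published alongside the IATI Standard:

```python
# Illustrative subset only; the real DAC sector codelist has hundreds of codes.
DAC_SECTOR_CODES = {"11110", "12220", "15110", "31120", "72010"}

def invalid_sector_codes(sector_codes):
    """Return the sector codes used in an activity that are not on the codelist."""
    return [code for code in sector_codes if code not in DAC_SECTOR_CODES]
```

For example, `invalid_sector_codes(["11110", "99999"])` returns `["99999"]`, flagging the nonexistent code.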

 

Otto Reichner
  • In five words, how would you describe good quality IATI data?

Comprehensive, timely, frequent and qualitative

  • Have we missed any important measures in the DQI? If so, what is your additional proposal?

No; rather than adding measures, we should refine the way we measure the existing KPIs.

  • Should an overall summary index measure be added? If so, what weight should be given to each measure? Or should each measure be left as a separate assessment?

Yes, an overall summary would be good, with focus on comprehensiveness (mandatory and recommended fields), timeliness, frequency and valid financials.

Marie Maasbol

Please find below the feedback from the Commission (FPI, DG NEAR, ECHO and INTPA).

In five words, how would you describe good quality IATI data?

  • Availability, completeness, usefulness, accuracy, timeliness

Do you have suggestions on ways of using the Data Quality Index to incentivise better quality publishing and ensure increased attention by publishers on the transparency of aid data?

  • The results of the Index are very useful for providing simple messages on the importance of transparency to policy makers and administrative management. Being able to communicate the results in an official manner to show progress, lessons learned and areas for improvement will be key to incentivise support for better quality publishing. Being able to share a report of the Index will be key for political and communication purposes. This question also relates to the last question of this section.

Have we missed any important measures in the DQI? If so, what is your additional proposal?

  • The Commission would like to suggest that the Index include a degree of flexibility regarding organisation type, acknowledging the structural barriers an organisation might face because of its type. Examples have been mentioned in the Commission's comments throughout, such as for geolocation or for three-year forward-looking budgets. This would also affect a weighting system, as the weight should be distributed differently depending on the type of publisher (multilateral, bilateral, etc.).

Should an overall summary index measure be added? If so, what weight should be given to each measure? Or should each measure be left as a separate assessment?

  • A summary report of how well a publisher is doing in terms of an Index score will be a key outcome of this exercise. The summary should highlight how well a publisher has done within each of the measures outlined in the Index, in order to locate the areas where a publisher needs to improve, as well as for political and communication purposes.

Michelle Levesque

I will be the voice of dissent because I don't believe scores should be given to any of this. It often provides a false sense of quality. I've seen data from high-scoring publishers that is not quality (made-up publisher references) but they get full credit because the field is not blank. The statistics don't capture the gender policy marker in spite of the importance many data users and donors put on that item. Until there is a way to look at the data with the nuances mentioned above (type of publisher, GB/humanitarian publishers etc.), weights and scores will never be accurate.

Getting usable, comprehensive, truthful/reliable and accessible data can't be measured by the current scoring system. Matt's suggestion on capturing where data fails is more important than the current scoring mechanism.

 

 

Alex Tilley
  • In five words, how would you describe good quality IATI data?

Comprehensive, accurate, detailed, readable and standardised.

  • Do you have suggestions on ways of using the Data Quality Index to incentivise better quality publishing and ensure increased attention by publishers on the transparency of aid data?

We'd be very happy to share our experiences of driving improvements in quality/availability of data.

  • Have we missed any important measures in the DQI? If so, what is your additional proposal?

Human readability of text data fields and whether these make sense and meet the definitions outlined in the Standard.

Review of the quality of documents including: checking the correct documents are included under the corresponding codes, checking document dates, relevance to activities and whether they contain the relevant information for the document type.

  • Should an overall summary index measure be added? If so, what weight should be given to each measure? Or should each measure be left as a separate assessment?

This depends on what IATI would intend to do with the measure and what the agreed approach is to question 2 (on incentives and attention). For the Aid Transparency Index we produce an Index score with weightings for the 35 indicators that make up the total score. We base the weighting of the indicators on feedback from stakeholders regarding what information is most important for them. This includes carrying out surveys with civil society, researchers, technical experts and other data users.

Pelle Aardema
  • In five words, how would you describe good quality IATI data?

Data that can be successfully used by (many) others, not just a single application.

Rich data: covering many elements, not only financials but also narratives, documents and results.

  • Do you have suggestions on ways of using the Data Quality Index to incentivise better quality publishing and ensure increased attention by publishers on the transparency of aid data?

Publishers could be automatically notified of their DQI 'score'. Currently, many publishers are not even aware when they have published a file that doesn't validate or cannot be retrieved.

  • Have we missed any important measures in the DQI? If so, what is your additional proposal?

Matt made several good suggestions.

  • Should an overall summary index measure be added? If so, what weight should be given to each measure? Or should each measure be left as a separate assessment?

