Goals and success criteria

From DE4A
Revision as of 13:56, 8 November 2021 by Ard.vanderheijden

Back to main page of D4.7 Initial Running Phase Report

Back to Previous Chapter: 2. Current Status of Pilot

3.1 Goals and pilot success criteria

[Objectives and how they are satisfied in relation to the success criteria. Use D4.6 section 2.1 (Final version of success criteria and Common Pilot Criteria) as a basis.]

GOALS

Actor: Public authorities
ID A: Improve the quality of company data within the service fulfilment process by re-using data from authentic sources, thereby reducing manual work and lowering processing costs.

Actor: Companies
ID B: Reduce manual work, lower transaction costs and improve enrolment speed for the company when using the Once Only Principle.

Actor: Project
ID C: Evaluate the OOP components supporting the cross-border information flow:

- Assess the (technical) impact on national services/registers already in place

- Evaluate connections of national systems to the OOP TS

ID D: Evaluate whether the solutions designed for the DBA-specific challenges have proven adequate in piloting the DBA eProcedures:

- Usability of the harmonised Company Evidence model

- Degree to which powers must be validated

- Scalability of the solution for powers validation

- Usability and security of Explicit Request and Preview

- Need for record matching on Natural Persons

- Adequacy of patterns to keep data up to date

Success Criteria for Public Authorities

Pilot goal A: Improve the quality of company data within the service fulfilment process by re-using data from authentic sources, thereby reducing manual work and lowering processing costs.

ID: A1
Criterion: The DE recognizes that company data is of higher quality, more reliable and easier to process when using the OOP TS to retrieve it directly from the DO (e.g. data is available in an electronic, structured format for easy processing in the systems of the DE; data requires less correcting; data is kept up to date automatically; data is reliable and leads to fewer exceptions during processing; data is more meaningful, has fewer inconsistencies and errors, and is more complete).
Technical Common Criteria: Reusability, Transparency, Effectiveness & Efficiency, Administrative Simplification
Principles: U, A, L, V

ID: A2
Criterion: The DE recognizes that the method of powers validation provides data of higher quality and reliability, proving that the representative is sufficiently authorised to represent the company (e.g. authorisation data is easier to interpret; its authenticity is clear; the data is trustworthy; less manual work is needed to validate the user's powers to represent the company with documents proving the relationship of the user to the company; authorisation data requires less correcting; verification is easier).
Technical Common Criteria: Reusability, Transparency, Effectiveness & Efficiency, Administrative Simplification
Principles: U, A, L, V
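Criterion A1 expects company data retrieved via the OOP TS to arrive in an electronic, structured format that the DE's systems can process automatically. A minimal sketch of what that enables, using hypothetical field names rather than the actual harmonised Company Evidence model:

```python
from dataclasses import dataclass

# Hypothetical, heavily simplified record: the real harmonised Company
# Evidence model defines many more fields and controlled vocabularies.
@dataclass
class CompanyEvidence:
    company_id: str  # registration number at the authentic source (DO)
    legal_name: str
    legal_form: str
    country: str     # two-letter country code of the DO's member state

    def is_complete(self) -> bool:
        # Structured data lets the DE check completeness automatically,
        # instead of manually correcting free-text submissions.
        return all([self.company_id, self.legal_name,
                    self.legal_form, len(self.country) == 2])

evidence = CompanyEvidence("12345678", "Example B.V.", "BV", "NL")
print(evidence.is_complete())
```

Checks like this are what make "data requires less correcting" observable in practice: an incomplete record is rejected at intake rather than triggering manual rework later in the fulfilment process.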

3.2 Pilot dimensions

Qualitative description of lessons learned (technical, functional, process, data, usability, etc.) and preliminary conclusions on these dimensions, based on metrics, questionnaires and interviews. The dimensions target the scope of the piloted functionality and patterns (until delivery of the report).

3.2.1 Use

  • Overview
  • Initial feedback from focus group and real users
  • Initial results from use related metrics (logs)
  • Usefulness of DE4A patterns and components related to internal stakeholders take-up
  • Strategy on pilot use until final report

3.2.2 Value

  • Verified benefits with users and DEs / DOs
    • Contribution of pilot to DE4A benefits and to external community of SDG stakeholders
    • Pilot specific benefits

3.2.3 Learning towards Adoption

  • Approach to knowledge-building
  • Lessons learned from integration and testing useful for Adoption
  • Technical, semantic and organisational/legal knowledge provided to other WPs
  • Pilot learning for “Sustainable impact and new governance models” WP (to be agreed e.g. Sustainability recommendations, standardisation needs)
  • Lessons being learned from users (questionnaires & interviews)
  • Lessons being learned from DEs and DOs (results and outputs of questionnaires & interviews)
  • Other lessons from interaction with other initiatives (SEMPER, EBSI…)

3.3 Technical common criteria (questionnaire for evaluation?)

[Qualitative description of the preliminary conclusion per criterion. Written explanation of how the success criteria/metrics relate to the Technical Common Criteria, distributed by pilot dimensions.]

Openness => U, A

Transparency => U, V

Reusability => V, L

Technological neutrality and data portability => L

User-centricity => V

Inclusion and accessibility => U

Security and privacy => U

Administrative simplification => A, V

Effectiveness and efficiency => A, V
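The mapping above can be captured as a small lookup table and inverted to list, per pilot dimension, which criteria feed into it. One assumption, not stated on this page: U, A, L and V abbreviate the pilot dimensions Use, Adoption, Learning and Value from section 3.2.

```python
# Criterion-to-dimension mapping as given above.
# Assumption: U = Use, A = Adoption, L = Learning, V = Value (section 3.2).
CRITERIA_TO_DIMENSIONS = {
    "Openness": {"U", "A"},
    "Transparency": {"U", "V"},
    "Reusability": {"V", "L"},
    "Technological neutrality and data portability": {"L"},
    "User-centricity": {"V"},
    "Inclusion and accessibility": {"U"},
    "Security and privacy": {"U"},
    "Administrative simplification": {"A", "V"},
    "Effectiveness and efficiency": {"A", "V"},
}

# Invert the mapping: for each pilot dimension, the criteria that feed it.
by_dimension: dict[str, list[str]] = {}
for criterion, dims in CRITERIA_TO_DIMENSIONS.items():
    for dim in dims:
        by_dimension.setdefault(dim, []).append(criterion)

print(sorted(by_dimension["V"]))
```

The inverted view is the one the qualitative conclusions in this section would follow: each 3.2 dimension is evaluated against the criteria mapped to it.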

[Qualitative comments, with follow-up quantitative comments in the second iteration]

Next Chapter: 4. Pilot Procedures