Goals and success criteria
Revision as of 16:02, 9 November 2021

Back to main page of D4.7 Initial Running Phase Report

Back to Previous Chapter: 2. Current Status of Pilot

3.1 Goals and pilot success criteria

[Objectives and how they are satisfied in relation to the success criteria. Use D4.6 section 2.1 (Final version of success criteria and Common Pilot Criteria) as a basis.]

GOALS

Actor | ID | Goal
Public authorities | A | Improve the quality of Company data within the service fulfilment process by re-using data from authentic sources, thereby reducing manual work and lowering processing costs.
Companies | B | Reduce manual work, lower transaction costs and improve enrolment speed for the company when using the Once Only Principle.
Project | C | Evaluate the OOP components supporting the cross-border information flow:

- Assess the (technical) impact on national services/registers already in place
- Evaluate the connections of national systems to the OOP TS

Project | D | Evaluate whether the solutions designed for the DBA-specific challenges have proven adequate in piloting the DBA eProcedures:

- Usability of the harmonised Company Evidence model
- Degree to which powers must be validated
- Scalability of the solution for powers validation
- Usability and security of Explicit Request and Preview
- Need for record matching on Natural Persons
- Adequacy of patterns to keep data up-to-date

Success criteria for Public Authorities

ID Criterion Technical Common Criteria Principles

Pilot goal A: Improve the quality of Company data within the service fulfilment process by re-using data from authentic sources, thereby reducing manual work and lowering processing costs

A1

The DE recognizes that company data is of higher quality, more reliable and easier to process when using the OOP TS to retrieve company data directly from the DO (e.g. data is available in an electronic and structured format for easy processing in the systems of the DE, data requires less correcting, data is kept up to date automatically, data is reliable and leads to fewer exceptions when processing, and data is more meaningful, has fewer inconsistencies and errors, and is more complete).

Reusability, Transparency, Effectiveness & Efficiency, Administrative Simplification

U, A, L, V

A2

The DE recognizes that the method of powers validation provides data of higher quality and reliability, proving that the representative is sufficiently authorized to represent the company (e.g. authorisation data is easier to interpret, authenticity is clear, data is trustworthy, there is less manual work in validating the user's powers to represent the company with documents proving the relationship of the user to the company, authorisation data requires less correcting, and verification is easier).

Reusability, Transparency, Effectiveness & Efficiency, Administrative Simplification

U, A, L, V

Success criteria for Companies applying for a service

ID Criterion Technical Common Criteria Principles

Pilot goal B: Reduce manual work, lower transaction costs and improve enrolment speed for the company when using the Once Only Principle

B1

The user acknowledges the procedure for applying for a service to be effective and efficient (e.g. the procedure requires acceptable effort and cost, is not complex, has no language barriers and no interruptions; the user spends little manual time correcting company data and experiences no errors after finishing the enrolment process).

Reusability, Effectiveness & Efficiency, Administrative Simplification, Transparency

U, A, L, V

B2

The user acknowledges the method to prove their authorisation as effective and efficient (e.g. it requires little effort, is established with simple and effective communication, and is reliable).

Reusability, Effectiveness & Efficiency, Transparency, Security and Privacy

U, A, L, V

B3

The user acknowledges the duration of completing the online eProcedure activities to apply for a service as acceptable.

Effectiveness & Efficiency, Administrative Simplification

V, A

B4

The user saves time and/or cost when completing the eProcedure using the OOP TS.

Effectiveness & Efficiency

V, A

Success criteria and research questions for Pilot Technical Goals

ID | Criterion | Technical Common Criteria | Principles

Pilot goal C: Evaluate the OOP components supporting the cross-border information flow:

- Assess the technical impact on national services/registers already in place
- Evaluate the connections of national systems to the OOP TS

C1 | The DO believes the cost and effort of integrating to the DE4A Connector will eventually be outweighed by the benefits. | Openness, Technical Neutrality and Data Portability | U, A, V
C2 | The DE believes the cost and effort of integrating to the DE4A Connector will eventually be outweighed by the benefits. | Openness, Technical Neutrality and Data Portability | U, A, V
C3 | The DO believes the cost and effort of integrating to the Mandate Management System will eventually be outweighed by the benefits. | Openness, Technical Neutrality and Data Portability | U, L, V
C4 | The participating Member States believe the cost and effort of setting up and deploying the DE4A Connector in their national infrastructure will eventually be outweighed by the benefits. | Openness, Technical Neutrality and Data Portability | U, L, V

Pilot goal D: Evaluate whether the solutions designed for the DBA-specific challenges have proven adequate in piloting the DBA eProcedures

D1 | Has the Company Evidence Model proven adequate for cross-border exchange of information on companies for the DBA eProcedures? | Openness, Technical Neutrality and Data Portability, Reusability | U, V, L
D2 | Have the solutions to validate powers proven adequate for the eProcedures involved in piloting? | Reusability, Administrative Simplification | U, L
D3 | Have the explicit request and preview requirements as specified in the SDGR proven suitable for company eProcedures (representation scenarios)? | Administrative Simplification, User Centricity, Inclusion and Accessibility | U, L
D4 | Have the mechanisms for record matching at the DC proven adequate for the DBA eProcedures? | Administrative Simplification | U, L
D5 | Have the mechanisms to keep the company information up-to-date (second pilot iteration) proven adequate? | Administrative Simplification, Effectiveness & Efficiency | U, V
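When tallying questionnaire and interview results per criterion, the tables above lend themselves to a simple structured representation. The sketch below is illustrative only: the `Criterion` record, the sample entries and the helper are assumptions for a possible evaluation aid, not part of the DE4A tooling.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Criterion:
    """One pilot success criterion, as listed in section 3.1."""
    cid: str            # criterion ID, e.g. "A1", "C3"
    goal: str           # pilot goal letter: A, B, C or D
    dimensions: tuple   # pilot-dimension codes as printed: U, A, L, V

# A few entries transcribed from the tables above (codes as printed).
CRITERIA = [
    Criterion("A1", "A", ("U", "A", "L", "V")),
    Criterion("B4", "B", ("V", "A")),
    Criterion("C1", "C", ("U", "A", "V")),
    Criterion("D2", "D", ("U", "L")),
]

def by_dimension(code: str):
    """Return the IDs of all criteria that feed the given pilot dimension."""
    return [c.cid for c in CRITERIA if code in c.dimensions]

print(by_dimension("V"))  # criteria contributing to the Value dimension
```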

3.2 Pilot dimensions

Qualitative description of lessons learned (technical, functional, process, data, usability, etc.) and preliminary conclusions on these dimensions, based on metrics, questionnaires and interviews. The dimensions target the scope of the piloted functionality and patterns (until delivery of the report).

3.2.1 Use

  • Overview
  • Initial feedback from focus group and real users
  • Initial results from use related metrics (logs)
  • Usefulness of DE4A patterns and components related to internal stakeholders take-up
  • Strategy on pilot use until final report

3.2.2 Value

  • Verified benefits with users and DEs / DOs
    • Contribution of pilot to DE4A benefits and to external community of SDG stakeholders
    • Pilot specific benefits

3.2.3 Learning towards Adoption

  • Approach to knowledge-building
  • Lessons learned from integration and testing useful for Adoption
    • Components to be used (in the pilot) are sometimes distributed over several authorities in a Member State, requiring commitment from all of them. This commitment is not obvious and must be secured beforehand. Because the systems are distributed, the teams working on each system are distributed as well: collaboration takes more time, and within each team DE4A work must compete for priority. Finally, because many systems are involved (on both the DO and the DE side), each Member State needs a coordinating team with sufficient knowledge of the DE4A solution to make sure that planning and communication issues are resolved.
    • The speed of development varies per Member State, so readiness for testing (and piloting, for that matter) of combinations of Data Owners and Data Evaluators from different Member States is also distributed in time: MS A can have its DE ready months before MS B does (due to various impediments). Testing at fixed moments in time for all DEs and all DOs has proven unrealistic, as has starting all pilots at a single moment. DE/DO combinations will become available for testing and piloting over a period of several weeks or months.
    • Establish clear readiness criteria for DE/DO/DE4A Connector before starting connectathons.
    • Take the time to hand over the Solution Architecture to WP3 and WP5, and make sure that everything is understood.
    • Keep audit-trail and error-logging simple, considering the limited number of participating companies and the highly controlled fashion of the pilot.
    • Collect pilot data in the simplest way possible, without risking data loss or loss of integrity, but do not set up complex systems to collect data for metrics.
    • When integrating to the DT/DR, expect to run into pre-existing problems in the DO/DE systems that need resolving as well. This creates extra work, even though it is not directly caused by the integration with the DT/DR: the problems already existed in the DE/DO systems and were accepted because they caused no real issues until then, but they may need to be resolved in order to achieve good integration with the DT/DR.
    • Work with wireframes in order to have generic steps (like Explicit Request and Preview) implemented in a similar way in all MS.
    • Invest in good and clear documentation for developers in the MS, so they can get the DE4A Connector up and running, as well as integrate to the DE4A Connector, with minimal effort.
    • Perhaps a note on using Docker for the DE4A Connector? This caused issues during deployment as well as testing.
    • Perhaps a note on firewalls and DNS? These things caused several issues during connectathons.
    • Make sure to have a sufficient number of test eIDs available during development and testing.
    • Something on the availability and usability of eIDAS nodes for piloting? There were many challenges in this domain.
    • Using working assumptions to reduce uncertainties and secure progress seems to be good practice.
      • 1 evidence type, no multi-evidence, use full powers validation, etc.
  • Technical, semantic and organisational/legal knowledge provided to other WPs
    • Prepare the creation of an animation by setting up a good storyline and slides that illustrate the flow of the animation.
    • Slack seems to be a good means to have developers of different MS / WPs collaborate
  • Pilot learning for “Sustainable impact and new governance models” WP (to be agreed e.g. Sustainability recommendations, standardisation needs)
  • Lessons being learned from users (questionnaires & interviews)
  • Lessons being learned from DEs and DOs (results and outputs questionnaires & interviews)
  • Other lessons from interaction with other initiatives (SEMPER, EBSI…)
    • The added value of the OOP TS compared to other developments / pilots / experiments in the EU must be communicated continuously in order to keep authorities involved and secure commitment.
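Several of the lessons above (firewalls, DNS, readiness criteria before connectathons) come down to verifying that each party can resolve and reach its counterpart's endpoint before testing starts. The sketch below is a generic pre-check of that kind, not DE4A tooling; the hostname in the example is a placeholder.

```python
import socket

def check_endpoint(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if `host` resolves via DNS and `host:port` accepts a
    TCP connection within `timeout` seconds; False otherwise."""
    try:
        socket.getaddrinfo(host, port)  # DNS resolution check
        with socket.create_connection((host, port), timeout=timeout):
            return True                 # firewall / open-port check
    except OSError:                     # covers DNS failure, refusal, timeout
        return False

# Example: each MS could run this against its counterpart's Connector
# endpoint before a connectathon ("connector.example.org" is a placeholder).
if __name__ == "__main__":
    print(check_endpoint("connector.example.org", 443))
```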

3.3 Technical common criteria (questionnaire for evaluation?)

[Qualitative description of the preliminary conclusion per criterion. Written explanation of how the success criteria/metrics relate to the Technical Common Criteria, distributed by pilot dimensions.]

Openness => U, A

Transparency => U, V

Reusability => V, L

Technological neutrality and data portability => L

User-centricity => V

Inclusion and accessibility => U

Security and privacy => U

Administrative simplification => A, V

Effectiveness and efficiency => A, V
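The mapping above can also be captured programmatically when distributing questionnaire items over the pilot dimensions. The dictionary below simply transcribes the list above (dimension codes U, A, L, V as used in section 3.1); the inversion helper is an illustrative assumption.

```python
# Technical common criteria -> pilot-dimension codes, transcribed from the
# mapping above.
CRITERIA_TO_DIMENSIONS = {
    "Openness": ["U", "A"],
    "Transparency": ["U", "V"],
    "Reusability": ["V", "L"],
    "Technological neutrality and data portability": ["L"],
    "User-centricity": ["V"],
    "Inclusion and accessibility": ["U"],
    "Security and privacy": ["U"],
    "Administrative simplification": ["A", "V"],
    "Effectiveness and efficiency": ["A", "V"],
}

def dimensions_to_criteria():
    """Invert the mapping: which criteria inform each pilot dimension?"""
    inverted = {}
    for criterion, dims in CRITERIA_TO_DIMENSIONS.items():
        for d in dims:
            inverted.setdefault(d, []).append(criterion)
    return inverted
```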

[Qualitative comments, and follow-up with quantitative comments on second iteration]

Next Chapter: 4. Pilot Procedures