Goals and success criteria
[Work in progress]
3.1 Goals and pilot success criteria
Objectives and how they are satisfied in relation to success criteria. Use D4.6 section 2.1 (Final version of success criteria and Common Pilot Criteria) as a basis.
Goals (text copied from D4.6)
Actor | ID | Goal |
---|---|---|
Public authorities | A | Improve the quality of Company data within the service fulfilment process by re-using data from authentic sources, thereby reducing manual work and lowering processing costs. |
Companies | B | Reduce manual work, lower transaction costs and improve enrolment speed for the company when using the Once Only Principle. |
Project | C | Evaluate the OOP components supporting the cross-border information flow: assess the (technical) impact on national services/registers already in place; evaluate connections of national systems to the OOP TS. |
Project | D | Evaluate whether the solutions designed for the DBA-specific challenges have proven adequate in piloting the DBA eProcedures: usability of the harmonised Company Evidence model; degree to which powers must be validated; scalability of the solution for powers validation; usability and security of Explicit Request and Preview; need for record matching on natural persons; adequacy of patterns to keep data up-to-date. |
Success Criteria for Public Authorities

ID | Criterion | Technical Common Criteria | Principles |
---|---|---|---|
Pilot goal A: Improve the quality of Company data within the service fulfilment process by re-using data from authentic sources, thereby reducing manual work and lowering processing costs | |||
A1 | The DE recognizes that company data is of higher quality, more reliable and easier to process when using the OOP TS to retrieve company data directly from the DO (e.g. data is available in an electronic and structured format for easy processing in the systems of the DE, data requires less correcting, data is kept up to date automatically, data is reliable and leads to fewer exceptions when processing, data is more meaningful, has fewer inconsistencies and errors, and is more complete). | Reusability, Transparency, Effectiveness & Efficiency, Administrative Simplification | U, A, L, V |
A2 | The DE recognizes that the method of powers validation provides data of higher quality and reliability, proving that the representative is sufficiently authorized to represent the company (e.g. authorisation data is easier to interpret, authenticity is clear, data is trustworthy, there is less manual work in validating the user's powers to represent the company with documents proving the relationship of the user to the company, authorisation data requires less correcting, verification is easier). | Reusability, Transparency, Effectiveness & Efficiency, Administrative Simplification | U, A, L, V |
Success criteria for Companies applying for a service

ID | Criterion | Technical Common Criteria | Principles |
---|---|---|---|
Pilot goal B: Reduce manual work, lower transaction costs and improve enrolment speed for the company when using the Once Only Principle | |||
B1 | The user acknowledges the procedure for applying for a service to be effective and efficient (e.g. the procedure requires acceptable effort and cost, the procedure is not complex, has no language barriers and no interruptions, the user spends little manual time correcting company data, and experiences no errors after finishing the enrolment process). | Reusability, Effectiveness & Efficiency, Administrative Simplification, Transparency | U, A, L, V |
B2 | The user acknowledges the method to prove their authorisation as effective and efficient (e.g. requires little effort, is established with simple and effective communication, is reliable). | Reusability, Effectiveness & Efficiency, Transparency, Security and Privacy | U, A, L, V |
B3 | The user acknowledges the duration of completing the online eProcedure activities to apply for a service as acceptable. | Effectiveness & Efficiency, Administrative Simplification | V, A |
B4 | The user saves time and/or cost when completing the eProcedure using the OOP TS. | Effectiveness & Efficiency | V, A |
Success criteria and research questions for Pilot Technical Goals
ID | Criterion | Technical Common Criteria | Principles |
---|---|---|---|
Pilot goal C: Evaluate the OOP components supporting the cross-border information flow: assess technical impact on national services/registers already in place; evaluate connections of national systems to the OOP TS | |||
C1 | The DO believes the cost and effort for integrating to the DE4A Connector will eventually be outweighed by the benefits. | Openness, Technical Neutrality and Data Portability | U, A, V |
C2 | The DE believes the cost and effort for integrating to the DE4A Connector will eventually be outweighed by the benefits. | Openness, Technical Neutrality and Data Portability | U, A, V |
C3 | The DO believes the cost and effort for integrating to the Mandate Management System will eventually be outweighed by the benefits. | Openness, Technical Neutrality and Data Portability | U, L, V |
C4 | The participating Member States believe the cost and effort for setting up and deploying the DE4A Connector in their national infrastructure will eventually be outweighed by the benefits. | Openness, Technical Neutrality and Data Portability | U, L, V |
Pilot goal D: Evaluate whether the solutions designed for the DBA-specific challenges have proven adequate in piloting the DBA eProcedures | |||
D1 | Has the Company Evidence Model proven adequate for cross-border exchange of information on companies for the DBA eProcedures? | Openness, Neutrality and Data Portability, Reusability | U, V, L |
D2 | Have the solutions to validate powers proven adequate for the eProcedures involved in piloting? | Reusability, Administrative Simplification | U, L |
D3 | Have the explicit request and preview requirements as specified in the SDGR proven suitable for company eProcedures (representation scenarios)? | Administrative Simplification, User Centricity, Inclusion and Accessibility | U, L |
D4 | Have the mechanisms for record matching at the DC proven adequate for the DBA eProcedures? | Administrative Simplification | U, L |
D5 | Have the mechanisms to keep the company information up-to-date (second pilot iteration) proven adequate? | Administrative Simplification, Effectiveness & Efficiency | U, V |
3.2 Pilot dimensions
Qualitative description of lessons learned (technical, functional, process, data, usability, etc.) and preliminary conclusions on these dimensions, based on metrics, questionnaires and interviews. The dimensions target the scope of the piloted functionality and patterns (until delivery of the report).
3.2.1 Use
- Overview
- Initial feedback from focus group and real users
- Initial results from use related metrics (logs)
- Usefulness of DE4A patterns and components in relation to internal stakeholders' take-up
- Strategy on pilot use until final report
3.2.2 Value
- Verified benefits with users and DEs / DOs
- Contribution of pilot to DE4A benefits and to external community of SDG stakeholders
- Pilot specific benefits
3.2.3 Learning towards Adoption
- Approach to knowledge-building
- Lessons learned from analysing and designing cross-border OOP
# | Topic | Observation | Lesson for wider adoption and implementing the SDG |
---|---|---|---|
1 | Knowledge required | Designing the national integration required in-depth knowledge of both eIDAS and the OOTS. This knowledge (specifically the combination of both) is not broadly available in Member States. Knowledge of both domains should be brought together in order to prevent designs based on false assumptions about the other domain. | Invest time to bring together eIDAS and OOTS knowledge. This requires organising and prioritising, as this knowledge is scarce. |
2 | Powers validation | Validating full powers has proven to be a good first step in implementing cross-border OOP for businesses (requiring company representation). It allows moving ahead with eIDAS as it is available today, is acceptable to the DEs participating in the pilots and seems fitting for SMEs (most of the time it will be an official representative initiating doing business abroad). | Focus on implementing full-powers validation flows to start with. More fine-grained powers validation is required to fully implement the eProcedures, but it also adds complexity to the solutions. |
3 | Record matching | The pilot partners agreed to provide the national company registry numbers as the eIDASLegalIdentifier. This diminished the need to do record matching on companies at the data owner (see the sketch after this table). | DBA advises Member States to use the national company IDs as eIDASLegalIdentifiers when extending the pilot to an SDG-wide implementation. |
4 | Explicit request | In some cases, users need to express consent for the retrieval of attributes. In almost all cases when using the OOTS, the user needs to express an explicit request. Although legally sound, in practice the difference between the two is difficult for data evaluators to understand. DEs furthermore expect that users might ignore such requests and just click "OK". | DBA advises data evaluators to integrate the request for consent and the explicit request into one joint question to the user, to prevent adding to the confusion. |
5 | Multiple-MS scenarios | Three- or multiple-Member-State scenarios (add examples) tend to become very complex, requiring disproportionate resources to analyse, design and implement. | DBA advises Member States to start SDG implementation with the simplest interaction patterns, involving just two Member States. |
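To make the record-matching convention in row 3 concrete, the sketch below shows how a Data Owner could match an incoming eIDAS legal-person identifier against its company register when the national registry number is carried inside that identifier. This is a minimal, illustrative sketch only: it assumes the identifier follows the `origin-country/destination-country/identifier` layout of the eIDAS attribute profile, and the `registry` lookup is a hypothetical stand-in for a real register, not part of any DE4A component.

```python
from typing import Optional

# Illustrative sketch only: assumes the eIDAS legal-person identifier uses the
# "origin-country/destination-country/national-identifier" layout, and that the
# national part is the company registry number, as agreed by the pilot partners.
# The `registry` dictionary is a hypothetical stand-in for a Data Owner's register.

def extract_national_id(eidas_identifier: str) -> str:
    """Strip the two country-code prefixes and return the national part."""
    parts = eidas_identifier.split("/", 2)
    if len(parts) != 3:
        raise ValueError(f"Unexpected identifier layout: {eidas_identifier!r}")
    return parts[2]

def match_company(eidas_identifier: str, registry: dict) -> Optional[dict]:
    """Look up the company directly by its national registry number.
    Because the registry number travels in the eIDAS identifier, no fuzzy
    matching on company name or address is needed at the Data Owner."""
    return registry.get(extract_national_id(eidas_identifier))

# Hypothetical example: a Dutch company authenticating towards another Member State.
registry = {"12345678": {"name": "Example B.V.", "registration_number": "12345678"}}
print(match_company("NL/SE/12345678", registry))
```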
- The components to be used in the pilot are sometimes distributed over several authorities in a Member State, requiring commitment from all of these authorities. This commitment is not obvious and must be secured beforehand. Also, as the systems are distributed, the teams working on each system are distributed; collaboration takes more time and in each team a battle for prioritisation needs to be fought. Finally, because many systems are involved (on both the DO and the DE side), it is necessary to have a coordinating team (with sufficient knowledge of the solution) in each Member State to make sure that planning and communication issues are resolved.
- The speed of development varies per Member State. Therefore readiness for testing (and piloting, for that matter) of combinations of Data Owners and Data Evaluators from different Member States is also spread out in time: MS A can have their DE ready months before MS B (due to various impediments). Testing at fixed moments in time for all DEs and all DOs has proven unrealistic, as has starting all pilots at a single moment in time. DE/DO combinations will become available for testing and piloting over a period of several weeks or months.
- Establish clear readiness criteria for DE/DO/DE4A Connector before starting connectathons.
- Take the time for handing over Solution Architecture to WP3 and 5, and make sure that everything is understood.
- Keep audit-trail and error-logging simple, considering the limited number of participating companies and the highly controlled fashion of the pilot.
- Collect pilot data in the simplest way possible, without risking data loss or loss of integrity, but don't set up complex systems to collect data for metrics.
- When integrating with the DT/DR, expect to run into pre-existing problems in the DO/DE systems that need resolving as well. This creates extra work that is not directly caused by the integration with the DT/DR: the problems in the DE/DO systems already existed but were not causing real issues until then (they were accepted), yet they may need to be resolved in order to achieve a good integration with the DT/DR.
- Work with wireframes in order to have generic steps (like Explicit Request and Preview) implemented in a similar way in all MS.
- Invest in good and clear documentation for developers in the MS, so they can get the DE4A Connector up and running, as well as integrate with the DE4A Connector, with minimal effort.
- Perhaps a note on using Docker for the DE4A Connector? This caused issues during deployment as well as testing.
- Perhaps a note on firewalls and DNS? These things caused several issues during connectathons.
- Make sure to have a sufficient number of test eIDs available during development and testing.
- Something on the availability and usability of eIDAS nodes for piloting? There were many challenges in this domain.
- Using working assumptions to reduce uncertainties and secure progress seems to be good practice.
- One evidence type, no multi-evidence, use full-powers validation, etc.
- Obtaining certificates has proven not to be easy, due to the signing of documents and actually receiving the certificates.
- Technical, semantic and organisational/legal knowledge provided to other WPs
- Prepare the creation of an animation by setting up a good storyline and slides that illustrate the flow of the animation.
- Slack seems to be a good means for developers from different MS / WPs to collaborate.
- Also use input from the additional questionnaire regarding technical issues, the process of connecting (connectathons), documentation, etc.
- Pilot learning for “Sustainable impact and new governance models” WP (to be agreed e.g. Sustainability recommendations, standardisation needs)
- need for harmonisation of PoR scope (harmonised services; see the example harmonisation in DBA)
- need for harmonisation of event types (see the DBA example as well)
- need for a uniform way of communicating company representation in eIDAS
- A lot of discussion is going on about the value of piloting the DE4A OOTS while, at the same time, the Commission has been working on the SDG OOTS. Although this was a deliberately chosen strategy, it severely hindered the involvement of (organisations surrounding) the pilot partners.
- Lessons being learned from users (questionnaires & interviews)
- Lessons being learned from DEs and DOs (results and outputs questionnaires & interviews)
- Bank account information is not part of the CompanyEvidence, but it can be requested afterwards, in screens of the eProcedure after the exchange of evidence by the OOP TS. It seems that BIC codes are unique, but bank codes (like INGB etc.) are not unique across Europe (although they are unique within one country). The DE might do some kind of validation on the bank code and might then run into issues because new bank codes (from Romania, for example) are identical to bank codes in the country of the DE (the Netherlands, for example). DE systems should therefore consider closely how to work with bank information (when entered manually, but also in case of a future extension of the CompanyEvidence); see the sketch after this list.
- Other lessons from interaction with other initiatives (SEMPER, EBSI…)
- The added value of the OOP TS compared to other developments / pilots / experiments in the EU must be broadcast continuously in order to keep authorities involved and secure commitment.
- (Potential) differences between the DE4A scope and the Implementing Act distract and demotivate participants from prioritizing development and deployment of changes in their national infrastructures.
- Integration with or alignment to BRIS seems logical, but in practice is very difficult to achieve due to differences in legal context, purpose, allowed use, data definitions, partners involved, development planning, etc.
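The bank-code pitfall described in the bullet on bank account information above can be illustrated with a minimal sketch. The lookup table and IBAN slicing below are illustrative assumptions (the bank-code position varies per national IBAN format; here it is taken as the four characters after the check digits, which holds for Dutch and Romanian IBANs); the point is simply that a bank code should only ever be interpreted together with the country code.

```python
from typing import Optional

# Illustrative sketch only: KNOWN_BANKS is not a real bank registry, and the
# IBAN slicing assumes the national bank code sits directly after the check
# digits (true for NL and RO IBANs, but not for every country's IBAN format).

KNOWN_BANKS = {
    ("NL", "INGB"): "ING Bank (Netherlands)",
    ("RO", "INGB"): "ING Bank (Romania)",  # same four-letter code, different country
}

def bank_from_iban(iban: str) -> Optional[str]:
    """Resolve a bank from an IBAN using (country code, bank code) as the key.
    Using the bank code alone would wrongly treat the two entries above as one."""
    compact = iban.replace(" ", "").upper()
    country, bank_code = compact[:2], compact[4:8]
    return KNOWN_BANKS.get((country, bank_code))

print(bank_from_iban("NL91 INGB 0001 2345 67"))         # ING Bank (Netherlands)
print(bank_from_iban("RO49 INGB 0000 6012 3456 7890"))  # ING Bank (Romania)
```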
3.3 Technical common criteria (questionnaire for evaluation?)
[Qualitative description of the preliminary conclusion per criterion. Explanation (in writing) of how the success criteria / metrics relate to the Technical Common Criteria, distributed by pilot dimensions]
Openness => U, A
Transparency => U, V
Reusability => V, L
- eIDAS tech profile 1.2 is more prescriptive than profile 1.1. Depending on how profile 1.1 is implemented in a MS, it may not work with the implementation that 1.2 prescribes, introducing an error in the authentication process between MS A (using 1.1) and MS B (using 1.2).
Technological neutrality and data portability => L
User-centricity => V
Inclusion and accessibility => U
Security and privacy => U
Administrative simplification => A, V
Effectiveness and efficiency => A, V
- Not sure if this is the right place, but we should also address the design decisions we made and, for that matter, say something about how we comply with the SDG (or where we deviate and why; for example, we work with one company evidence type while the SDG states that a DE should not receive more attributes than it strictly needs for the procedure, which might not always be the case in DBA). Another example is that the preview is provided by the DE, meaning that the transfer has already taken place.
[Qualitative comments, and follow-up with quantitative comments on second iteration]