13.08.2022. The concept of privacy by design requires that privacy be taken into account throughout the entire lifetime of a project, from the early design stages to the operation of the system. This means that data protection measures are “baked into” the project rather than added as an “afterthought”.
The European Committee for Standardization, CEN/CENELEC, has recently published the new standard “EN 17529:2022 Data protection and privacy by design and by default”. The standard was prepared by WG 5 “Data Protection, Privacy and Identity Management” of CEN/CENELEC JTC 13 “Cybersecurity and Data Protection”. It is aimed at manufacturers and service providers, helping them implement data protection and privacy by design and by default early in the development of their products and services, so that these are “privacy ready” as early as possible. The document is applicable to all business sectors, including the security industry. The standard’s definitive text was made available on 18/05/2022.
Moreover, the concept of data protection by design is also defined in Article 25 of the GDPR, which requires entities responsible for the processing of personal data to implement measures that protect personal data both during the design and determination of the means of processing and at the time of processing itself.
In their seminal work, Cavoukian and Stoianov describe the 7 foundational principles of the Privacy-by-Design paradigm.
The first principle, proactive not reactive, dictates that a system employing the privacy-by-design approach should anticipate and prevent privacy-invasive issues before they occur. This means that strong privacy practices need to be adopted proactively, “early and consistently”.
The second principle, privacy as the default setting, requires that collected personal data be protected by default: the maximum level of privacy is enabled automatically, so that data subjects are protected without any action on their part.
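As a minimal sketch of privacy-protective defaults, consider a hypothetical settings object in which every privacy-relevant option defaults to its most protective value (all names and values below are illustrative, not part of any specific system):

```python
from dataclasses import dataclass

# Hypothetical user settings: every privacy-relevant option defaults to
# the most protective value, so a user who takes no action is protected.
@dataclass
class PrivacySettings:
    share_data_with_partners: bool = False   # sharing is opt-in, never opt-out
    profile_publicly_visible: bool = False   # profiles are private by default
    analytics_tracking: bool = False         # tracking requires explicit consent
    data_retention_days: int = 30            # shortest retention by default

# A newly registered user who changes nothing gets full protection.
settings = PrivacySettings()
```

The key design choice is that any step away from maximum privacy must be an explicit, deliberate action by the data subject.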
The third principle, privacy embedded into design, dictates that privacy is not a simple add-on to the system. Rather, it is embedded into the design process and the architecture of the system, becoming one of its essential components.
The fourth principle, full functionality, ensures that no unnecessary trade-offs are made for the sake of privacy alone: functionality should not be compromised unnecessarily, and it is possible to achieve full functionality while respecting user privacy.
The fifth principle, end-to-end security, requires that strong security measures and guarantees be applied before any data are collected and throughout the entire lifecycle of data collection and processing, ensuring that all collected data are securely stored and securely destroyed when they are no longer necessary.
The principle of visibility and transparency aims to ensure that, whatever technology and measures are implemented, the system operates according to its stated objectives and is subject to independent verification. The system’s components need to be visible and transparent at all times to both users and providers.
Finally, and most importantly, the principle of respect for user privacy requires that the interests of individual data subjects be protected through measures that promote “strong privacy defaults, appropriate notice and user-friendly options”.
Confidentiality, Integrity, and Availability (CIA): The GDPR dictates that confidentiality and integrity be protected. In addition, availability needs to be ensured: data need to be available when needed. This means that the systems used to store and process the data, the security measures used to protect them, and the communication channels used to access them must all function correctly. To ensure availability, the system must be protected against Denial-of-Service (DoS) attacks, which flood a system with incoming messages until it can no longer handle them all and is forced to shut down.
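One common building block of a DoS defence is request rate limiting. The sketch below shows a minimal sliding-window rate limiter (the class, limits, and client identifiers are illustrative assumptions, not a specific system's implementation):

```python
import time
from collections import defaultdict, deque

# Minimal sliding-window rate limiter: one basic way to preserve availability
# by rejecting clients that flood the system with requests.
class RateLimiter:
    def __init__(self, max_requests, window_seconds):
        self.max_requests = max_requests
        self.window = window_seconds
        self.history = defaultdict(deque)  # client id -> recent request times

    def allow(self, client_id, now=None):
        now = time.monotonic() if now is None else now
        q = self.history[client_id]
        while q and now - q[0] > self.window:  # drop timestamps outside window
            q.popleft()
        if len(q) >= self.max_requests:
            return False                        # over the limit: reject request
        q.append(now)
        return True

# Allow at most 3 requests per second per client (illustrative limit).
limiter = RateLimiter(max_requests=3, window_seconds=1.0)
results = [limiter.allow("client-a", now=0.1 * i) for i in range(5)]
```

A real deployment would typically place such limiting at the network edge (load balancer or reverse proxy), but the windowing logic is the same.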
Unlinkability, Transparency and Intervenability: The goal of unlinkability, as defined in ISO/IEC 15408-1:2009, is to ensure that a user may use their data in multiple services without third parties being able to link the different uses back to the same individual. It also requires that third parties be unable to determine whether the same user has caused different operations within the system. Transparency has been discussed in previous subsections.
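One well-known technique for unlinkability is to give the same user a different pseudonym in each service, derived with a keyed hash held only by the identity provider. A minimal sketch, with an illustrative key and identifiers:

```python
import hmac
import hashlib

# Illustrative secret held by the identity provider; in practice this would
# live in a key-management system, never in source code.
SECRET_KEY = b"identity-provider-secret"

def pseudonym(user_id, service_id):
    # HMAC over (user, service) yields an identifier that is stable within
    # one service but unlinkable across services without the secret key.
    msg = f"{user_id}:{service_id}".encode()
    return hmac.new(SECRET_KEY, msg, hashlib.sha256).hexdigest()

p_a = pseudonym("alice", "service-A")
p_b = pseudonym("alice", "service-B")
# Same user, different services -> different, mutually unlinkable identifiers,
# yet each service always sees the same pseudonym for returning users.
```

Without the secret key, the two services cannot tell that `p_a` and `p_b` belong to the same person.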
Intervenability dictates that a system needs to ensure that intervention by the data subjects is possible for all ongoing or planned processing of personal data. This enables the application of corrective measures whenever that is deemed necessary and ensures that essential rights of the data subjects like the right to rectification and erasure, the right to withdraw consent etc., that are defined in GDPR, can be accommodated at all times.
Privacy by design strategies
Building on the GDPR and the Privacy-by-Design key principles, Hoepman derives several design strategies that a system should follow in order to achieve the above-mentioned goals.
The minimise strategy limits the possible privacy impact of the system by ensuring that as little personal data as possible is collected.
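In practice, minimisation can be enforced at collection time by keeping only the fields the stated purpose requires and dropping everything else. A small sketch, with illustrative field names (not the project's actual data model):

```python
# Only the fields needed for the stated purpose are ever stored;
# everything else is discarded at the point of collection.
REQUIRED_FIELDS = {"patient_id", "session_date", "exercise_score"}

def minimise(record):
    return {k: v for k, v in record.items() if k in REQUIRED_FIELDS}

raw = {
    "patient_id": "p-017",
    "session_date": "2022-08-13",
    "exercise_score": 0.82,
    "home_address": "...",   # not needed for the purpose -> never stored
    "phone_number": "...",   # not needed for the purpose -> never stored
}
stored = minimise(raw)
```

Making the allow-list explicit in code also documents the purpose limitation: adding a new field forces a deliberate decision.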
Any collected personal data and their interrelationships should be hidden from plain view. The implementation of the hide strategy depends on the specific context of the data that need to be hidden and the specific access control rules that apply to them. In certain cases, this strategy is used to hide from everybody information that “spontaneously” emerges from the use of a system, while in other cases certain parties need to be given access to the data. In the latter case, the hide strategy needs to ensure that no other parties are given access, thus preserving confidentiality.
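A simple way to realise context-specific access rules is a per-role view that exposes only the attributes each party needs. The roles and fields below are illustrative assumptions:

```python
# Each role may see only the attributes its task requires; everything else
# stays hidden, implementing the hide strategy as access control.
ACCESS_RULES = {
    "clinician": {"patient_id", "exercise_score", "session_notes"},
    "researcher": {"exercise_score"},   # no identifying attributes
}

def visible_view(record, role):
    allowed = ACCESS_RULES.get(role, set())  # unknown roles see nothing
    return {k: v for k, v in record.items() if k in allowed}

record = {"patient_id": "p-017", "exercise_score": 0.82, "session_notes": "..."}
researcher_view = visible_view(record, "researcher")
```

In a real system this check would sit behind authentication and be combined with encryption at rest, but the principle is the same: access is denied by default and granted per context.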
The separate strategy dictates that personal data be processed in a distributed fashion, in separate compartments, whenever possible. By separating a user’s data, the system makes it difficult to create a complete profile of the user. This separation can also be used to achieve the purpose-limitation requirement. Moreover, a successful attack on one database will not leak all user data if the data are stored in different locations, which reduces the benefit-to-cost ratio of an attack.
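The compartmentalisation can be sketched as two stores linked only by an opaque key, so that a breach of either store alone yields no complete profile (store names and fields are illustrative):

```python
import secrets

# Identity data and usage data live in separate compartments, linked only
# by an opaque random key that is meaningless on its own.
identity_store = {}   # key -> who the person is
usage_store = {}      # key -> what the person did

def register(name, email):
    key = secrets.token_hex(8)   # opaque link between the two compartments
    identity_store[key] = {"name": name, "email": email}
    usage_store[key] = []
    return key

def log_usage(key, event):
    usage_store[key].append(event)

k = register("Alice", "alice@example.org")
log_usage(k, "completed session 1")
# usage_store leaked alone reveals activity but not identity, and vice versa.
```

In a real deployment the two stores would run on separate infrastructure, possibly under separate access credentials, strengthening the same idea.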
The aggregate strategy requires that “personal data should be processed at the highest level of aggregation and with the least possible detail that is still useful”. By aggregating data from multiple users, we reduce the sensitivity of the data: a successful attack that leaks the aggregated data reveals less information about individual users.
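A common safeguard when releasing aggregates is a minimum group size, so that a statistic is suppressed whenever the group is too small to hide individuals. A minimal sketch with an illustrative threshold:

```python
from statistics import mean

# Release only group-level statistics, and suppress groups too small
# to hide individual contributions (threshold is illustrative).
MIN_GROUP_SIZE = 5

def aggregate_scores(scores):
    if len(scores) < MIN_GROUP_SIZE:
        return None   # too few users: a mean could expose individuals
    return mean(scores)

released = aggregate_scores([0.5, 1.0, 0.5, 1.0, 0.75])   # large enough group
suppressed = aggregate_scores([0.7, 0.8])                  # group too small
```

Stronger guarantees (e.g. differential privacy) add calibrated noise on top of such thresholds, but a suppression rule is the simplest first step.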
The inform strategy requires that data subjects be informed whenever their personal data are processed. This includes information about which third parties may have access to their data, as well as about their rights and how to exercise them.
The control strategy complements the inform strategy: along with informing users about their rights, the system needs to allow them to exercise those rights, giving them control over their data through the ability to modify or remove them.
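The control strategy (and the intervenability goal above) can be sketched as a store that exposes the subject-facing operations behind the GDPR rights of access, rectification, and erasure. The class and method names are illustrative:

```python
# Minimal sketch of subject controls: inspect, correct, and delete one's
# own data, mirroring the GDPR rights of access, rectification, and erasure.
class PersonalDataStore:
    def __init__(self):
        self._records = {}

    def access(self, user_id):
        # Right of access: return a copy of everything held about the subject.
        return dict(self._records.get(user_id, {}))

    def rectify(self, user_id, field, value):
        # Right to rectification: correct or add a single attribute.
        self._records.setdefault(user_id, {})[field] = value

    def erase(self, user_id):
        # Right to erasure: remove all data held about the subject.
        self._records.pop(user_id, None)

store = PersonalDataStore()
store.rectify("u1", "email", "old@example.org")
store.rectify("u1", "email", "new@example.org")   # subject corrects a mistake
store.erase("u1")                                  # subject withdraws entirely
```

In a production system these operations would also need to propagate to backups and downstream processors; the sketch only shows the subject-facing interface.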
In the ReHyb project, we follow a rigorous ethical approach to handling our users’ personal data. Before any personal data are collected, users are informed of the type of data collected, the processing that will occur, and all their rights regarding their personal data. All data-collection processes have passed through official ethical procedures and have been approved by the local authorities. Oversight mechanisms have been put in place to guide and oversee the project’s development and to ensure that the privacy-by-design principles, the design strategies, and the privacy goals mentioned above are always respected and implemented.
A. Cavoukian and A. Stoianov, “Privacy by Design: The 7 Foundational Principles,” 2007. DOI: 10.1016/S0969-4765(07)70084-X.
J.-H. Hoepman, “Privacy Design Strategies,” IFIP AICT, vol. 428, pp. 446–459, 2014. [Online]. Available: http://privacypatterns.org/. Accessed: Sep. 02, 2020.