Tihan, Eusebiu Jean (2026), Information and Personal Freedom: Individual Informational Security, Intelligence Info, 5:2, DOI: 10.58679/II33199, https://www.intelligenceinfo.org/information-and-personal-freedom-individual-informational-security/
Abstract
This analysis examines the tension between the need for collective security and the right to privacy, exploring the concept of individual informational security in the digital age.
Keywords: informational security, privacy, informational autonomy, surveillance, digital democracy
Informație și libertate personală: Securitatea informațională individuală
Rezumat
Analiza tensiunii dintre necesitatea de securitate colectivă și dreptul la viața privată, examinând conceptul de securitate informațională individuală în era digitală.
Cuvinte cheie: securitate informațională, viață privată, autonomie informațională, supraveghere, democrație digitală
INTELLIGENCE INFO, Volume 5, Issue 2, June 2026, pp. xxx
ISSN 2821 – 8159, ISSN – L 2821 – 8159, DOI: 10.58679/II33199
URL: https://www.intelligenceinfo.org/information-and-personal-freedom-individual-informational-security/
© 2026 Eusebiu Jean TIHAN. Responsibility for the content, interpretations, and opinions expressed lies exclusively with the authors.
Information and Personal Freedom: Individual Informational Security
Psih. Eusebiu Jean TIHAN, MSc[1]
eusebiu.tihan@gmail.com
[1] Independent researcher, https://orcid.org/0009-0008-8316-3679
Objectives: This analysis examines the tension between the need for collective security and the right to privacy, exploring the concept of individual informational security in the digital age.
Methods: The study employs an interdisciplinary, integrative theoretical analysis, based on a critical examination of legal frameworks (constitutional and international), philosophical concepts, and relevant psychological theories.
Results: The digital identity extends and exposes the personality. The protection of informational autonomy, founded on the principles of necessity and proportionality, is essential for human dignity, social trust, and democratic legitimacy. Excessive surveillance generates negative psychosocial effects such as self-censorship and the erosion of social capital.
Conclusion: Individual informational security is a fundamental condition, not an obstacle, for genuine and sustainable national security in a democratic state.
Public Significance Statement: The study highlights that protecting the digital private sphere is vital for the health of democracy, social cohesion, and resilience against threats, and is just as important as traditional security measures.
Introduction: From Macro-Security to Micro-Security
Contemporary national security faces a fundamental paradox: to protect its citizens from external threats, the state needs to know a great deal about these citizens, yet this very effort of knowledge-gathering can itself become a threat to individual liberties. This tension between collective security and individual freedom represents one of the most complex challenges for the modern democratic state. This chapter explores this dynamic, examining how information, as a constitutive element of human identity, becomes simultaneously a tool for security and an object of protection.
The shift from an exclusively “macro” approach to security (focused on the state and its strategic interests) towards a perspective that integrates the “micro” dimension (centered on the individual and their fundamental rights) reflects a profound evolution in national security theory and practice. Whereas in the past individual security was seen primarily as a derivative of state security, it is now increasingly recognised that genuine national security cannot be built upon the ruins of individual security. On the contrary, a society where citizens feel protected in their fundamental rights and freedoms is a more resilient, cohesive, and ultimately safer society [1].
This chapter analyses this complex relationship on multiple levels: conceptual (defining informational identity and personal autonomy), legal (the normative framework regulating the tension between security and liberty), psychological (the impact of privacy violations on the individual and society), and operational (the implications for intelligence activities within a democratic state). In this endeavour, we aim to demonstrate that protecting individual informational security does not represent an obstacle to intelligence and operational activities, but rather a condition for their legitimacy and long-term effectiveness.
Methodological Positioning and Epistemological Limitations
This research adopts an interdisciplinary approach at the intersection of psychology, constitutional law, security studies, and the philosophy of technology to analyse the complex dynamic between individual informational security and collective security needs in a democracy.
The methodological positioning is based on theoretical and conceptual analysis, accompanied by a critical examination of the normative framework. It is not an original empirical study but a theoretical synthesis that integrates and interprets data and concepts from several disciplines:
- Legal and Normative Analysis: Critical examination of the constitutional framework, national legislation (with emphasis on Law no. 51/1991 and data protection regulations), and international norms (especially the GDPR and the European Convention on Human Rights). This includes analysis of relevant case law from the European Court of Human Rights (ECtHR) and the Constitutional Court of Romania.
- Conceptual-Philosophical Analysis: Exploration and development of key concepts such as informational identity, informational self-determination, and personal autonomy in the context of the digital era, with reference to the works of contemporary theorists (e.g., Floridi, Zuboff).
- Psychological Synthesis: Integration of theories and research from psychology (e.g., communication privacy management theory, the chilling effect, perceived control, vicarious trauma) to understand the impact of privacy violations on the individual and social cohesion.
- Contextual Security Analysis: Application of perspectives from security studies to reconceptualise the relationship between individual and collective security, exploring frameworks such as human security and resilient security.
The primary method is the critical analysis of literature and theoretical synthesis. This approach is suitable for mapping the complex and rapidly evolving landscape of challenges to individual informational security and for proposing conceptual directions for reflection and action.
The epistemological limitations of this approach are acknowledged:
- Lack of Original Empirical Data: The analysis relies heavily on secondary research and theoretical literature. It does not present new empirical data collected by the author (e.g., surveys, experiments, specific content analyses), which limits the ability to make new empirical claims about the specific perceptions or behaviours of the Romanian population. Generalising psychosocial conclusions based on studies from other contexts must be done cautiously.
- Complexity and Dynamism of the Subject: The fields of technology, cyber threats, and regulations evolve extremely rapidly. An analysis written at a given moment may quickly become partially outdated by technological or legislative developments, representing an intrinsic limitation of any academic endeavour in this domain.
- Challenges of Interdisciplinary Integration: The attempt to synthesise such diverse perspectives (legal, philosophical, psychological, technical) involves the risk of a certain superficiality or of “managing” the complexity of each field, which a monodisciplinary approach would avoid.
- Limited Access to Operational Practices and Data: The analysis of intelligence service activities and the implementation of regulations is based exclusively on public sources (laws, reports, published doctrine, academic literature). It was not possible to examine classified operational practices, internal decisions, or data regarding the volume and nature of interventions, which limits the perspective to the formal legal framework and declared principles.
Despite these limitations, the interdisciplinary and theoretical approach is considered valuable and necessary to understand the problem in its full complexity. It offers a conceptual map and an essential framework for reflection for future empirical research and for the formulation of balanced policies. This study does not aim to provide definitive empirical answers, but to structure the problem, elucidate fundamental tensions, and propose directions for the necessary reconciliation between security, freedom, and human dignity in the informational space.
1. Informational Identity: The Digital Extension of Personality
1.1 Evolution of the Concept of Identity
Human identity has evolved significantly with technological progress and social transformations. Whereas in traditional societies identity was largely determined by local and relational factors (belonging to a family, a community, a profession), in the modern era it has become increasingly bureaucratic and documentary. The emergence of birth certificates, passports, driving licences, and other identity documents represented a first stage in the process of formalising and externalising identity [2].
In the digital age, this process has intensified and radicalised. Today, the individual is no longer defined only by the physical documents they possess, but also by a complex “digital double” composed of: biometric data (fingerprints, facial recognition, DNA), biographical data (educational, professional, medical history), behavioural data (consumption patterns, cultural preferences, social relationships) and generic identification data (name, address, contact details). This informational aggregate does not merely describe the individual; it increasingly becomes a precondition for their access to rights, services, and opportunities in contemporary society [3].
1.2 Informational Self-Determination: From Concept to Right
The concept of informational self-determination (informationelle Selbstbestimmung) first appeared in German jurisprudence in the 1980s, in the context of a controversy regarding the population census. In a landmark 1983 decision, the German Federal Constitutional Court established that “in the context of automated data processing, the protection of the individual against the unlimited collection, storage, use, and transmission of their personal data is encompassed in the general right of personality” (BVerfGE 65, 1). This decision formulated the principle that every individual has the right to decide freely, in general, who knows what about them and when [4].
This concept was subsequently adopted and developed in European law, culminating in its inclusion in the European Union’s General Data Protection Regulation (GDPR). The GDPR enshrines not only the right to the protection of personal data but also a series of derived rights that constitute components of informational autonomy: the right of access, the right to rectification, the right to erasure (‘the right to be forgotten’), the right to restriction of processing, the right to data portability, and the right to object [5].
Informational self-determination thus represents a fundamental extension of personal autonomy in the digital age. It recognises that without control over the information that describes and defines us, individual freedom becomes illusory. As philosopher Luciano Floridi observes, in the “information society,” the individual is no longer just a biological organism but an “inforg” (informational entity), and protecting the integrity of this entity becomes an essential condition for human dignity [6].
1.3 Vulnerabilities of Digital Identity
While digital identity offers new opportunities for expression and social participation, it also creates specific vulnerabilities. Among the most significant are:
- Identity Theft: This represents one of the most frequent forms of cybercrime. According to a European Commission report, in 2022 approximately 8% of EU citizens reported being victims of identity theft or related fraud [7]. The impact of this phenomenon extends far beyond the financial dimension; it can profoundly affect victims’ lives through damage to reputation, blocking access to essential services, and causing significant psychological trauma.
- Excessive Profiling and Algorithmic Discrimination: Automated decision-making systems based on profiling individuals can perpetuate or even amplify social prejudices and inequalities. An analysis by ProPublica in 2016 demonstrated that an algorithm used in the US judicial system to predict recidivism risk was significantly more likely to classify black individuals as high-risk, even when they had a profile similar to white individuals classified as low-risk [8].
- Loss of Control Over Personal Information: Once disclosed, personal information is extremely difficult to retrieve. The phenomenon of “data exhaust” – the continuous generation of personal data through daily digital activities – creates a detailed portrait of the individual that far exceeds what they might know about themselves or wish to share with others [9].
- Identity Fragmentation: In the absence of a coherent digital identity management framework, individuals often face multiple partial and incompatible identities in different digital contexts, which can lead to confusion, conflict, and a loss of identity coherence.
These vulnerabilities are not merely technical or legal problems; they touch the essence of human autonomy and dignity. They highlight the need for a robust digital identity protection framework that balances the opportunities offered by digitalisation with the imperative of protecting fundamental freedoms.
2. The Tension Between Collective Security and Privacy: The Legal and Institutional Framework
2.1 The Constitutional Foundation of the Right to Privacy
In Romania, the right to privacy is protected as a fundamental right both by the Constitution and by international treaties to which Romania is a party. The Romanian Constitution, in Article 26, guarantees that “the private, family, and intimate life of the person is inviolable,” and Article 28 provides that “the security and secrecy of letters, telegrams, other postal dispatches, telephone conversations, and other legal means of communication are inviolable” [10]. These constitutional provisions are complemented and developed by the European Convention on Human Rights, whose Article 8 protects the right to respect for private and family life, home, and correspondence.
The case law of European courts, particularly the European Court of Human Rights (ECtHR), has developed a rich doctrine regarding the interpretation and application of this right, with direct implications for intelligence and operational activities. In the case of Klass v. Germany (1978), the ECtHR recognised that states have the right to adopt secret surveillance measures for the purpose of protecting national security, but emphasised that these measures must be subject to adequate safeguards to prevent abuses. The decision established several key principles that continue to guide European jurisprudence in this area: (1) the existence of a clear and accessible legal basis; (2) strict definition of the conditions under which the measures may be applied; (3) the existence of independent control over the use of measures; and (4) the existence of effective remedies for affected persons [11].
2.2 Principles of Necessity, Proportionality, and Subsidiarity
The tension between national security and the right to privacy is managed through the application of three fundamental principles: necessity, proportionality, and subsidiarity.
The principle of necessity requires that any interference with the right to privacy be justified by a compelling social need. In the context of national security, this means that surveillance or information-gathering measures cannot be used in a generalised or preventive manner, but only when there are concrete and justified suspicions regarding the existence of a real threat to national security. The Constitutional Court of Romania has repeatedly emphasised that “national security does not constitute a pretext for the arbitrary limitation of fundamental rights” [12].
The principle of proportionality requires that the adopted measure be appropriate to the aim pursued, be the least restrictive among the available measures, and not impose an excessive burden on the individual’s rights. In practice, this principle imposes a three-stage analysis: (1) the measure must be suitable for achieving the legitimate aim; (2) the measure must be necessary, in the sense that there is no less restrictive alternative; (3) the measure must be proportionate in the strict sense, meaning the benefits for national security must outweigh the harm caused to individuals’ rights [13].
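The three-stage analysis described above is, in practice, a qualitative judicial assessment, but its sequential logic can be made explicit in a short illustrative sketch. The `Measure` class and the numeric weighing in stage 3 are purely hypothetical simplifications introduced here for illustration; they are not part of any actual legal procedure.

```python
from dataclasses import dataclass

@dataclass
class Measure:
    suitable: bool           # stage 1: capable of achieving the legitimate aim
    least_restrictive: bool  # stage 2: no milder alternative would suffice
    security_benefit: float  # stage 3: estimated gain for national security
    rights_harm: float       # stage 3: estimated burden on individual rights

def passes_proportionality(m: Measure) -> bool:
    """Sequential sketch of the three-stage proportionality test.

    Each stage is a gate: failing an earlier stage makes the later
    ones irrelevant. Reducing stage 3 to a numeric comparison is a
    deliberate simplification of what is really a qualitative balancing.
    """
    if not m.suitable:
        return False  # stage 1 fails: measure cannot achieve the aim
    if not m.least_restrictive:
        return False  # stage 2 fails: a less restrictive option exists
    # stage 3: proportionality stricto sensu
    return m.security_benefit > m.rights_harm
```

The point of the sketch is the ordering: a highly effective measure (large stage-3 benefit) is still rejected if a less restrictive alternative exists, because stage 2 is evaluated first.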
The principle of subsidiarity stipulates that measures interfering with the right to privacy must be the last resort, after all other less restrictive means have been exhausted. In the context of intelligence and operational activities, this principle requires that secret surveillance or interception of communications be used only when conventional investigation methods have proven insufficient or inadequate.
2.3 The Romanian Normative Framework for Intelligence Activities
In Romania, intelligence and operational activities are regulated primarily by Law no. 51/1991 regarding Romania’s national security, with subsequent amendments and completions. This law establishes the attributions of the intelligence services, the conditions under which they can conduct surveillance and information-gathering activities, as well as the control mechanisms over their activity.
According to Article 19(1) of Law no. 51/1991, intelligence services may use special investigation means (including interception of communications, technical surveillance, and the use of informants) only “under the conditions of the law, for the purpose of preventing and detecting acts which constitute threats to national security.” The law explicitly enumerates the types of threats that can justify the use of these means: espionage, terrorism, treason, actions to undermine state power, sabotage, trafficking of weapons, ammunition, explosives, radioactive or narcotic substances, and other acts provided by criminal law that affect national security.
An important development in this legislative framework came with Law no. 235/2015, which regulates the retention of traffic and location data, an area previously governed at EU level by Directive 2006/24/EC on data retention (invalidated by the Court of Justice of the EU in 2014). This law established stricter rules for the retention of traffic and location data by providers of electronic communications services, as well as for access by competent authorities to this data. The law provides that data may be retained for a maximum period of 6 months in the case of traffic data and 12 months in the case of location data, and access to it is permitted only for the purpose of investigating serious crimes or threats to national security, with the prior approval of a prosecutor or a judge.
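The retention ceilings stated above (6 months for traffic data, 12 months for location data) amount to a simple rule that a provider's compliance tooling would have to enforce. The following is a minimal sketch of such a check; the constants and function names are hypothetical, and months are approximated as 30-day periods for illustration only.

```python
from datetime import date, timedelta

# Retention ceilings as stated in the text; the 30-day month is an
# illustrative approximation, not a statutory definition.
MAX_RETENTION = {
    "traffic": timedelta(days=6 * 30),    # traffic data: ~6 months
    "location": timedelta(days=12 * 30),  # location data: ~12 months
}

def must_be_deleted(kind: str, collected_on: date, today: date) -> bool:
    """True once the statutory retention window for this data kind has elapsed."""
    return today - collected_on > MAX_RETENTION[kind]
```

For example, traffic data collected on 1 January is past its window by 1 December of the same year, while location data collected on the same day is still within its 12-month window in June.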
2.4 Democratic Control Mechanisms
In a democratic society, the activity of intelligence services is subject to multiple control mechanisms, which aim to prevent abuses and guarantee respect for fundamental rights. In Romania, these mechanisms include:
- Parliamentary Control: According to Article 65 of Law no. 51/1991, the intelligence services are subject to the control of the Chamber of Deputies and the Senate through the parliamentary committees for human rights and for defence, public order, and national security. These committees have the right to request information and reports from the services, to hear their leaders, and to conduct on-site checks. Furthermore, Parliament appoints and oversees the activity of the Ombudsman, who can investigate complaints regarding the violation of fundamental rights by the intelligence services.
- Judicial Control: The activity of intelligence services is subject to the control of the courts in several ways. Firstly, the use of special investigation means requires the prior approval of a judge. Secondly, persons who consider that their rights have been violated by the services’ activity can appeal to the courts to obtain redress. Thirdly, courts can examine the legality of administrative acts issued by the services and can annul those that are illegal.
- Executive Control: The President of Romania, as head of state, and the Government, through the relevant ministries, exercise administrative control over the intelligence services. The Supreme Council of National Defence (CSAT), led by the President, has among its attributions the coordination and control of the intelligence services.
- Specialised Control: In 2005, Romania established the National Supervisory Authority for Personal Data Processing (ANSPDCP), whose mission is to supervise and ensure compliance with legislation regarding personal data protection. ANSPDCP can conduct inspections at the headquarters of intelligence services, can issue opinions and recommendations, and can apply sanctions in case of violations.
- Societal Control: The media and non-governmental organisations play a crucial role in monitoring the activity of intelligence services and in alerting public opinion regarding potential abuses. In Romania, organisations such as the Association for the Defence of Human Rights in Romania – Helsinki Committee (APADOR-CH) and the Centre for Independent Journalism have consistently worked to promote the transparency and accountability of intelligence services.
These multiple control mechanisms are not perfect and face constant challenges, especially regarding access to classified information and the ability to perform effective controls without compromising national security. Nevertheless, they represent an important network of checks and balances that contribute to preventing abuses and protecting fundamental rights in the context of intelligence and operational activities.
3. The Psychological Impact of Privacy Violations
3.1 Psychological Theories of Privacy
Psychology has approached the concept of privacy from multiple perspectives, highlighting its fundamental functions for mental health and personal development. Irwin Altman, one of the pioneers in the study of the psychology of personal space, defined privacy as “the selective control of access to the self or to one’s group” [14]. This definition highlights two crucial aspects: (1) privacy is a dynamic process, not a static state; (2) it involves exercising control over the boundaries between the self and the external world.
Sandra Petronio developed these ideas in the Communication Privacy Management Theory, which conceptualises privacy as a system of rules governing when, how, and with whom we share information about ourselves. According to this theory, individuals manage privacy boundaries through three main processes: (1) establishing rules regarding ownership and control over personal information; (2) coordinating boundaries with others through communication and negotiation; (3) managing turbulence when these boundaries are violated [15].
From a psychodynamic perspective, privacy is essential for the process of individuation and for maintaining a coherent sense of self. Psychoanalyst Donald Winnicott introduced the concept of the “potential space” or “area of experience” that exists between the individual and the external environment, an intermediate space where the individual can experiment, create, and develop without the pressure of conformity to external reality [16]. Violation of this space can disturb fundamental processes of development and adaptation.
3.2 The Chilling Effect
One of the most significant psychological impacts of surveillance on civil liberties is the “chilling effect.” This concept, originating in American law, refers to how the threat of sanctions or surveillance can cause individuals to limit their exercise of fundamental rights, even when these limits are not directly or legally imposed. In the context of state surveillance, the chilling effect manifests through multiple psychological mechanisms:
- Self-Censorship: Individuals may choose to limit their expression of political opinions, participation in political activities, or association with certain groups for fear that these activities will attract the attention of the authorities. A study conducted after Edward Snowden’s revelations about NSA surveillance programmes found that people who knew more about these programmes were less likely to search online for information about politically sensitive subjects, such as terrorism or explosives [17].
- Social Conformity: The threat of surveillance can intensify pressure for social and political conformity, discouraging deviance and dissent. This can lead to a “spiral of silence” in which persons with minority or unpopular opinions become increasingly reluctant to express their views, allowing majority opinions to become even more dominant [18].
- Political Alienation: The feeling of being surveilled can damage citizens’ trust in democratic institutions and the political process. This can lead to withdrawal from civic participation and decreased political engagement, undermining the foundations of participatory democracy.
The chilling effect is particularly problematic because it can be subtle and difficult to measure. Individuals may change their behaviour without being explicitly aware of this change or without being able to articulate the reasons for doing so. Moreover, the effect can be disproportionate on vulnerable or marginalised groups, who may be more sensitive to the threat of surveillance due to historical experiences or the perception of weaker legal protection.
3.3 Anxiety and Loss of Autonomy
Violation of privacy can cause or intensify various forms of anxiety and psychological stress. Psychologists have identified several mechanisms through which this occurs:
- Loss of Control: The perception that personal information is being collected, stored, and used without consent or control can undermine an individual’s sense of autonomy and self-efficacy. Perceived control theory suggests that the perceived ability to influence events in one’s own life is a crucial factor for psychological well-being and the ability to cope with stress [19]. When this capacity is eroded through loss of control over personal information, psychological resilience can be affected.
- Vulnerability and Distrust: Surveillance can create a feeling of vulnerability and suspicion. Individuals may become more cautious in their social interactions, more reluctant to open up or trust others, including institutions. This can lead to social isolation and a deterioration in the quality of interpersonal relationships.
- Cognitive Dissonance and Role Conflict: Persons who feel they must constantly monitor their behaviour to avoid unwanted surveillance may experience cognitive dissonance between their real behaviour and what they perceive as “safe.” This dissonance can cause stress and psychological exhaustion.
- Vicarious Trauma: Constant exposure to the potential threat of surveillance, even when it does not materialise concretely, can cause forms of stress and anxiety similar to those associated with direct exposure to threats. This phenomenon is known in psychology as “vicarious traumatisation” [20]. In therapeutic relationships, vicarious traumatisation is a transformation in the helper’s own perspective regarding safety, trust, and worldview, resulting from empathetic involvement with traumatised clients.
The impact of these psychological mechanisms can vary depending on individual characteristics, prior experiences, cultural context, and personality factors. However, research suggests the effects are real and significant, with important implications not only for individual well-being but also for the healthy functioning of society as a whole.
3.4 Impact on Social Cohesion and Institutional Trust
Violation of privacy does not only affect isolated individuals but also the broader social structure. Among the most important social effects are:
- Erosion of Social Capital: The concept of social capital, introduced by Robert Putnam, refers to social networks, norms of reciprocity, and trust that facilitate cooperation and coordination within a society [21]. Excessive surveillance can undermine social capital by damaging trust between citizens and between citizens and institutions. When individuals feel surveilled, they may become more suspicious of others’ intentions and more reluctant to engage in cooperative actions or share resources.
- Institutional Fragility: Trust in institutions is a crucial component of social and political stability. When citizens perceive that state institutions, especially those responsible for security, systematically violate the right to privacy, this trust can be eroded. Moreover, the loss of trust can become a vicious cycle: citizens become less willing to cooperate with institutions, which makes them less efficient and therefore less trustworthy.
- Social Polarisation: Surveillance can exacerbate existing social divisions. Groups that feel targeted by surveillance (based on ethnic, religious, political, or other characteristics) may become more alienated and suspicious of the majority. This can intensify intergroup tensions and undermine social solidarity.
- Normalisation of Surveillance: In the long term, constant exposure to surveillance can lead to its normalisation – a process by which practices that would have previously been considered unacceptable gradually become accepted as normal or inevitable. This phenomenon, sometimes called the “boiling frog effect,” can undermine society’s ability to protect fundamental rights by reducing sensitivity to their violation.
These social effects are particularly relevant for national security, as they touch the very foundations of societal resilience. A fragmented, distrustful, and polarised society is much more vulnerable to external and internal threats than a cohesive and united one. Therefore, protecting the right to privacy is not just a matter of individual rights but also an essential component of long-term national security.
4. Contemporary Challenges in Protecting Individual Informational Security
4.1 Technological Transformation and Emerging Threats
Rapid technological evolution has created new ways of collecting, processing, and using personal information, but also new vulnerabilities and threats to individual informational security. Among the most significant contemporary challenges are:
- The Internet of Things (IoT) and the Smart Home: The proliferation of connected devices in people’s homes – from smart thermostats and security systems to appliances and entertainment devices – has created a dense network of sensors that constantly collect data on occupants’ behaviour and preferences. This data can be extremely sensitive, revealing not only daily habits but also absences from home, health status, or family relationships. The security vulnerabilities of these devices and the lack of adequate data protection standards create significant risks to privacy [22].
- Advanced Biometrics: Technologies such as facial recognition, iris scanning, voice recognition, and other forms of advanced biometrics are increasingly widespread, both in commercial and governmental applications. While these technologies offer benefits in terms of security and convenience, they also raise profound issues related to consent, accuracy (including algorithmic bias), and potentially abusive uses. Furthermore, biometric data is considered sensitive data, as it is generally permanent and uniquely identifiable [23].
- Artificial Intelligence and Predictive Analysis: Artificial intelligence algorithms can analyse vast volumes of data to identify patterns, make predictions, and take decisions. In the security context, this can include analysing online behaviour to identify potential threats. However, these systems can perpetuate or even amplify existing biases, can make decisions based on correlations that do not reflect real causality, and can operate with an opacity that makes challenging or understanding their decisions difficult [24].
- Surveillance Capitalism: The term introduced by Shoshana Zuboff describes a new economic system in which human experience is treated as free raw material that can be transformed into behavioural data, which in turn can be sold for profit. This economic logic creates powerful incentives for the increasingly extensive and intrusive collection of personal data, often without individuals’ informed consent and with profound impacts on autonomy and democracy [9].
4.2 Changing Nature of Threats: From State to Non-State Actors
While traditional discussions about the tension between security and liberty have focused largely on the relationship between the individual and the state, contemporary threats to individual informational security increasingly come from non-state actors:
- Technology Corporations: Global tech companies collect, store, and process unprecedented volumes of personal data. While this data is often collected for commercial purposes (targeted advertising, service personalisation), it can also become the subject of government requests, the target of cyberattacks, or be used in ways that exceed users’ expectations and consent.
- Organised Cybercrime: Cybercrime groups have developed sophisticated capabilities to exploit vulnerabilities in IT systems and gain access to sensitive personal data. This data can then be used for identity theft, blackmail, financial fraud, or sale on the black market.
- Hacktivists and Influence Groups: Non-state political actors can use hacking and disinformation techniques to obtain and disclose personal data in order to influence political processes, discredit individuals or organisations, or cause social instability.
- Foreign States and Influence Operations: Nation-states continue to pose a significant threat to individual informational security, but their actions are increasingly mediated through non-state actors or conducted via techniques that make attribution difficult. Foreign influence operations may involve the abusive collection and use of personal data to manipulate public opinion or undermine democratic processes.
This diversification of threat sources creates complex challenges for traditional approaches to protecting individual informational security. While the existing legal framework offers some tools to address these threats (such as data protection regulations and criminal law on cybercrime), their effectiveness is often limited by the transnational nature of the threats, the speed of technological change, and the resource asymmetry between individuals and the major actors in the informational space.
4.3 Legal and Implementation Challenges
Protecting individual informational security faces multiple challenges in the legal and implementation domains:
- Jurisdictional Divergences and Conflict of Laws: In a globalised world, personal data frequently crosses national borders, creating conflicts between jurisdictions with different approaches to data protection. For example, the European GDPR imposes significant restrictions on the transfer of personal data to countries that do not provide an adequate level of protection, but these restrictions often clash with the extensive data collection practices of some global companies and the national security requirements of various states.
- Challenges in Law Enforcement: The complex technical nature of many violations can exceed the expertise of supervisory authorities, which do not always have specialists in cybersecurity, cryptography, or algorithmic engineering. This asymmetry of competence between supervisors and supervised entities (especially global tech platforms) creates a practical imbalance that undermines the effectiveness of the formal legal framework. Cross-border investigations also require international cooperation that can be slow, bureaucratic, and subject to divergent political interests.
- Algorithmic Opacity and Lack of Transparency: Many automated decision-making systems that process personal data operate as “black boxes” – even their creators cannot always explain precisely how they reach certain conclusions. This opacity creates profound challenges for fundamental principles of law, including the right to a fair trial, the right to an effective remedy, and the right to information.
- Informed Consent in Practice: Although the GDPR and other regulations make informed consent a cornerstone of data protection, in practice true and informed consent is often illusory. Terms and conditions are excessively long and complex, and individuals face power imbalances and manipulative designs (“dark patterns”) that hinder autonomous and informed decisions. These problems are exacerbated by “consent fatigue” – people’s tendency to click without reading, given the excessive volume of consent requests they encounter daily [25].
- Data Protection in the Context of International Security Cooperation: The exchange of information between the intelligence services of different states is essential for combating transnational threats but raises complex data protection issues. Legal instruments such as the US CLOUD Act or the EU–US Umbrella Agreement attempt to balance security needs with the protection of fundamental rights, but their practical implementation remains a constant challenge, especially regarding safeguards for the persons whose data is transferred.
4.4 Specific Vulnerabilities of Particular Groups
Certain groups are particularly vulnerable to violations of individual informational security:
- Children and Adolescents: Young people are often less aware of the risks of sharing personal information online and may be more susceptible to social pressures that encourage overexposure. Furthermore, data collected about children during their development can create permanent “digital footprints” that may affect future opportunities. The GDPR includes special provisions for the protection of children’s data, but their implementation remains problematic [26].
- Socially or Economically Vulnerable Persons: Individuals facing social marginalisation, poverty, or other forms of vulnerability may be less able to defend their privacy rights and more exposed to the exploitation of their data. For example, social assistance systems built on surveillance and profiling technologies can create “digital panopticons” that intensively monitor and control beneficiaries’ lives [27].
- Journalists, Activists, and Political Dissidents: These groups are often particular targets of surveillance by both states and non-state actors, because their activities can expose abuses of power or unpopular positions. Protecting sources and confidential communications is essential for press freedom and civic activity, but increasingly sophisticated surveillance technologies call into question these groups’ ability to operate safely.
- Ethnically or Religiously Marginalised Communities: Historically, these communities have often been disproportionately targeted by state surveillance. In the digital age, this surveillance can become more invasive and opaque. Predictive analysis and data-based profiling can perpetuate and amplify systemic biases, leading to “algorithmic discrimination” that disproportionately affects certain groups [28].
Recognising these differentiated vulnerabilities is essential for developing data protection policies that are truly equitable and effective. This requires an approach that takes into account unequal power relations, specific social contexts, and the cumulative effects of multiple forms of marginalisation.
5. Solutions and Future Directions
5.1 Improving the Legislative and Regulatory Framework
To address contemporary challenges in protecting individual informational security, continuous evolution of the legislative and regulatory framework is necessary:
- Modernising Data Protection Law: Although the GDPR represents an advanced framework for data protection, its implementation must evolve to keep pace with technological change. This could include: stricter regulations for emerging technologies such as facial recognition and IoT; more rigorous requirements for algorithmic transparency; and stronger enforcement mechanisms, including proportionate sanctions with a real deterrent effect on large corporations.
- Developing the Legal Framework for Artificial Intelligence: The European Union has adopted a comprehensive legal framework for artificial intelligence (the AI Act), which establishes rules for the responsible use of AI, including in the security domain. The framework imposes strict requirements on high-risk AI systems (including those used for surveillance in public spaces) and requires fundamental rights impact assessments; liability mechanisms for damages caused by AI systems remain to be consolidated [29].
- International Harmonisation of Regulations: To address the transnational nature of threats to individual informational security, closer international cooperation and greater harmonisation of legal frameworks are needed. This could include: agreements for the mutual recognition of data protection authorities’ decisions; global minimum standards for data protection; and efficient cooperation mechanisms for investigating and prosecuting transnational violations.
- Strengthening Democratic Control Mechanisms: Existing mechanisms for controlling intelligence service activities must be consolidated and adapted to new technological realities. This could include: clearer and stronger mandates for parliamentary oversight committees; adequate resources and technological expertise for control authorities; and more efficient transparency mechanisms that allow public scrutiny without compromising sensitive sources and methods.
5.2 Technical and Design Solutions
Technology can be part of the problem, but it can also be part of the solution. Several technical directions promise to improve individual informational security:
- Privacy by Design and Privacy by Default: These principles, enshrined in the GDPR, require that data protection be integrated into the design of systems and processes from the outset, not added later. Their effective implementation requires concrete methodological tools, technical standards, and certification mechanisms. For example, techniques such as data minimisation (collecting only what is necessary), pseudonymisation, and end-to-end encryption can significantly reduce risks to individual informational security [30].
- Privacy-Enhancing Technologies (PETs): An increasingly wide range of technologies is available to protect privacy while still allowing valuable uses of data. These include: secure multi-party computation, which allows joint analysis of data without revealing it; homomorphic encryption, which allows encrypted data to be processed without being decrypted; and differential privacy techniques, which add statistical “noise” to data to protect individuals while preserving its aggregate utility [31].
- Tools for Digital Autonomy: Developing tools that allow individuals to manage their digital identity and data more effectively could significantly enhance informational autonomy. These could include: self-sovereign identity wallets that let individuals control what information they share and with whom; tools for managing consent; and transparency scoring systems that evaluate and compare companies’ data protection practices.
- Algorithmic Verification and Auditing: To address algorithmic opacity, techniques and tools for verifying and auditing automated systems are needed. These could include: methods for detecting algorithmic bias; tools for explaining algorithmic decisions; and frameworks for the independent auditing of critical systems.
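To make the pseudonymisation technique mentioned above concrete, the sketch below derives a stable pseudonym from an identifier using a keyed hash. The key, field names, and truncation length are illustrative assumptions, not drawn from any standard cited in this article.

```python
import hashlib
import hmac

def pseudonymise(identifier: str, secret_key: bytes) -> str:
    """Map an identifier to a stable pseudonym via a keyed hash (HMAC-SHA256).

    The same person always receives the same pseudonym, so records can
    still be linked for analysis, but the mapping cannot be reversed
    without the secret key, which should be stored separately from the
    dataset (as the GDPR's definition of pseudonymisation requires).
    """
    digest = hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]  # truncated for readability (illustrative)

# Hypothetical usage: replace direct identifiers before analysis.
key = b"kept-separately-from-the-dataset"
record = {"patient_id": pseudonymise("patient-123", key), "diagnosis": "..."}
```

The keyed construction matters: a plain unkeyed hash of a small identifier space (names, national ID numbers) can be reversed by brute force, which is why the key must be held apart from the pseudonymised data.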
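The differential privacy idea described among the PETs can be sketched in a few lines: a counting query is answered with calibrated Laplace noise, so that any single person's presence in the dataset has only a bounded effect on the output. The dataset, epsilon value, and function names here are hypothetical.

```python
import random

def dp_count(records, predicate, epsilon: float) -> float:
    """Differentially private count query (illustrative sketch).

    A count has sensitivity 1 (adding or removing one person changes it
    by at most 1), so adding Laplace(0, 1/epsilon) noise yields
    epsilon-differential privacy. The noise is sampled as the difference
    of two exponential variables, which follows a Laplace distribution.
    """
    true_count = sum(1 for r in records if predicate(r))
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# Hypothetical query: how many users fall into a sensitive category?
records = [{"user": i, "sensitive": i % 7 == 0} for i in range(1000)]
noisy = dp_count(records, lambda r: r["sensitive"], epsilon=0.5)
```

A smaller epsilon means more noise and stronger privacy; the aggregate statistic stays usable while no individual answer can be confidently inferred from the output.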
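Finally, as a minimal example of the bias-detection methods listed above, the following sketch computes a demographic parity gap – the difference in positive-decision rates between groups – one of the simplest fairness metrics an independent auditor might report. The groups and decision data are invented for illustration.

```python
def demographic_parity_gap(decisions):
    """decisions: iterable of (group, positive) pairs, e.g. loan approvals.

    Returns the difference between the highest and lowest positive-decision
    rates across groups; 0.0 means all groups are treated identically on
    this particular metric.
    """
    totals = {}
    for group, positive in decisions:
        n, k = totals.get(group, (0, 0))
        totals[group] = (n + 1, k + (1 if positive else 0))
    rates = [k / n for n, k in totals.values()]
    return max(rates) - min(rates)

# Hypothetical audit data: group A approved 80%, group B approved 60%.
data = ([("A", True)] * 80 + [("A", False)] * 20
        + [("B", True)] * 60 + [("B", False)] * 40)
gap = demographic_parity_gap(data)  # 0.80 - 0.60 = 0.20
```

In practice an auditor would complement this with error-rate metrics (such as equalised odds), since a small parity gap can coexist with large disparities in false-positive or false-negative rates.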
5.3 Education, Awareness, and Digital Literacy
Improving individual informational security requires not only adequate legal frameworks and technical solutions but also an educated and aware population:
- Digital and Privacy Literacy: Educational programmes should cover not only basic technical skills but also an understanding of the risks and opportunities related to personal data, knowledge of data protection rights, and skills for actively managing one’s digital presence and identity. These programmes should start at an early age and continue throughout life.
- Training Specialists: Professionals from various fields – lawyers, journalists, social workers, teachers – need specialised training to fulfil their data protection responsibilities and to secure their own professional practices. In particular, employees of intelligence services and other law enforcement authorities need in-depth training in fundamental rights and data protection principles.
- Public Awareness and Advocacy: Non-governmental organisations, the media, and educational institutions play a crucial role in raising public awareness of the importance of individual informational security and the risks associated with its violation. Awareness campaigns, journalistic investigations, and advocacy work can contribute significantly to creating a culture of privacy protection.
- Public Dialogue and Civic Participation: Decisions regarding the balance between security and liberty must result from authentic public dialogue, not merely from technocratic decision-making. Policy-making processes should include effective, participatory public consultation mechanisms that allow citizens to help shape the frameworks that directly affect them.
5.4 Re-evaluating the Concept of Security
Ultimately, effectively protecting individual informational security may require a fundamental re-evaluation of the concept of security itself:
- From National Security to Human Security: The human security framework, promoted by the United Nations, shifts the focus from the security of the state to the security of individuals. This framework recognises that true security includes not only protection against physical violence but also the assurance of fundamental rights, including the right to privacy and informational autonomy [32].
- Security as a Human Right: Instead of being invoked as a justification for limiting rights, security could be reconceptualised as a human right in itself – the right to physical, psychological, and informational security. This shift in perspective could help reconcile the apparent tension between security and liberty by recognising that authentic security includes the protection of fundamental rights.
- Resilient Security versus Defensive Security: Instead of focusing exclusively on preventing threats through control and restriction, security approaches could place more emphasis on building resilience – the capacity of individuals and communities to cope with, adapt to, and recover from shocks and threats. This approach could reduce the need for intrusive surveillance measures, focusing instead on strengthening capacities and social structures.
Conclusion
Individual informational security represents an essential component of both fundamental rights and authentic national security. Understanding and protecting it requires a multidisciplinary approach that integrates legal, technological, psychological, and sociological perspectives.
This analysis has highlighted that the tension between security and liberty is not an absolute contradiction, but a complex dynamic that can and must be managed within a democratic framework. The solution lies not in an absolute choice between security and liberty, but in developing frameworks that rationally balance these fundamental values, with strict respect for the principles of necessity, proportionality, and subsidiarity.
The contemporary challenges are profound and multidimensional: accelerated technological transformation, the diversification of threat sources, the specific vulnerabilities of certain groups, and the limitations of existing legal frameworks. However, these challenges are not insurmountable. They demand responses that are equally complex and adaptive: modernised legal and regulatory frameworks, technology solutions centred on privacy protection, sustained education and awareness efforts, and, not least, a re-evaluation of the very concept of security.
For Romania, as a member state of the European Union and NATO, these discussions have immediate practical relevance. The effective implementation of the GDPR, the modernisation of the legislative framework for intelligence and operational activities under conditions of respect for fundamental rights, the development of necessary technical and legal competencies among security professionals – all these are challenges that require constant attention and resources.
Ultimately, success in protecting individual informational security will not be measured only by the absence of violations or by the complexity of regulatory frameworks, but by building a society where citizens can develop, express themselves, and participate in public life without fear of abuse, discrimination, or excessive control. Such a society is not only freer – it is also more resilient, more innovative, and, in the deepest sense of the word, safer.
Protecting individual informational security is not a luxury or a secondary concern in the context of national security priorities. It is a fundamental condition for the democratic legitimacy of the state, for citizens’ trust in institutions, and, ultimately, for the state’s ability to truly protect its fundamental interests and values. In an increasingly digitised and interconnected world, this protection becomes not only an ethical and legal necessity but an essential component of long-term national security.
References
- European Court of Human Rights. (1978). Klass and Others v. Germany, Application no. 5029/71.
- Caplan, J., & Torpey, J. (2001). Documenting Individual Identity: The Development of State Practices in the Modern World. Princeton University Press.
- Cheney-Lippold, J. (2017). We Are Data: Algorithms and the Making of Our Digital Selves. NYU Press.
- Bundesverfassungsgericht [BVerfG] [Federal Constitutional Court]. (1983). Volkszählungsurteil, 1 BvR 209/83 et al.
- European Parliament and Council. (2016). Regulation (EU) 2016/679 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation). Official Journal of the European Union, L 119/1.
- Floridi, L. (2014). The Fourth Revolution: How the Infosphere is Reshaping Human Reality. Oxford University Press.
- European Commission. (2023). Digital Economy and Society Index (DESI) 2022: Romania. Directorate-General for Communications Networks, Content and Technology.
- Angwin, J., Larson, J., Mattu, S., & Kirchner, L. (2016). Machine Bias: There’s Software Used Across the Country to Predict Future Criminals. And It’s Biased Against Blacks. ProPublica.
- Zuboff, S. (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. PublicAffairs.
- Constituția României [The Constitution of Romania]. (2003). Monitorul Oficial al României, Partea I, nr. 767 din 31 octombrie 2003.
- European Court of Human Rights. (1978). Klass and Others v. Germany, Application no. 5029/71.
- Curtea Constituțională a României [The Constitutional Court of Romania]. (2009). Decizia nr. 125/2009. Monitorul Oficial al României, Partea I, nr. 433 din 25 iunie 2009.
- Alexy, R. (2002). A Theory of Constitutional Rights. Oxford University Press.
- Altman, I. (1975). The Environment and Social Behavior: Privacy, Personal Space, Territory, Crowding. Brooks/Cole.
- Petronio, S. (2002). Boundaries of Privacy: Dialectics of Disclosure. SUNY Press.
- Winnicott, D. W. (1971). Playing and Reality. Tavistock Publications.
- Penney, J. W. (2016). Chilling Effects: Online Surveillance and Wikipedia Use. Berkeley Technology Law Journal, 31(1), 117–182.
- Noelle-Neumann, E. (1993). The Spiral of Silence: Public Opinion – Our Social Skin (2nd ed.). University of Chicago Press.
- Thompson, S. C. (1981). Will It Hurt Less If I Can Control It? A Complex Answer to a Simple Question. Psychological Bulletin, 90(1), 89–101.
- Figley, C. R. (Ed.). (1995). Compassion Fatigue: Coping with Secondary Traumatic Stress Disorder in Those Who Treat the Traumatized. Brunner/Mazel.
- Putnam, R. D. (2000). Bowling Alone: The Collapse and Revival of American Community. Simon & Schuster.
- Ziegeldorf, J. H., Morchon, O. G., & Wehrle, K. (2014). Privacy in the Internet of Things: Threats and Challenges. Security and Communication Networks, 7(12), 2728–2742.
- Mordini, E., & Tzovaras, D. (Eds.). (2012). Second Generation Biometrics: The Ethical, Legal and Social Context. Springer.
- O’Neil, C. (2016). Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown Publishing Group.
- Acquisti, A., Brandimarte, L., & Loewenstein, G. (2015). Privacy and Human Behavior in the Age of Information. Science, 347(6221), 509–514.
- Livingstone, S., & Stoilova, M. (2021). The 4Cs: Classifying Online Risk to Children. CO:RE Short Report Series on Key Topics. Leibniz Institute for Media Research.
- Eubanks, V. (2018). Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. St. Martin’s Press.
- Benjamin, R. (2019). Race After Technology: Abolitionist Tools for the New Jim Code. Polity Press.
- European Commission. (2021). Proposal for a Regulation Laying Down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act). COM(2021) 206 final.
- Cavoukian, A. (2009). Privacy by Design: The 7 Foundational Principles. Information and Privacy Commissioner of Ontario, Canada.
- Danezis, G., Domingo-Ferrer, J., Hansen, M., Hoepman, J. H., Metayer, D. L., Tirtea, R., & Schiffner, S. (2014). Privacy and Data Protection by Design – From Policy to Engineering. European Union Agency for Network and Information Security.
- United Nations Development Programme (UNDP). (1994). Human Development Report 1994: New Dimensions of Human Security. UNDP.