ARTICLE
11 February 2026

Data Privacy In The Age Of Artificial Intelligence

Egemenoglu

Contributor

Egemenoglu is one of the largest full-service law firms in Turkey, advising market-leading clients since 1968. Proud to serve many national and international clients across different sectors, the firm is valued by both its clients and the Turkish legal market for its fast, practical, rigorous, and solution-oriented work in a wide range of fields of expertise. Egemenoğlu has been recognized in various rankings by the world's leading rating institutions and legal guides: it is ranked as Recognized in the "Project and Finance" and "Mergers and Acquisitions" areas by IFLR 1000, placed among the top-tier law firms of Turkey by Legal 500 in the "Employment Law" and "Real Estate / Construction" areas, and noted as significant by Chambers & Partners in the "Employment Law" area.
By Oğuz Kayan

INTRODUCTION

The digital age is fundamentally transforming individual privacy. Artificial intelligence systems, big data analytics, and Internet of Things (IoT) technologies continuously collect data from every aspect of individuals' lives. These developments necessitate a reinterpretation of the principle of "privacy" not only against state interference in the classical sense, but also against interventions by the private sector and autonomous algorithms.

The purpose of this article is to discuss the constitutional, legal, and ethical dimensions of data privacy in the age of artificial intelligence; to reveal the inadequacies of existing protection mechanisms and to develop reform proposals for the future. In this context, the study examines the structural tension that law experiences in the face of technological transformation, critically evaluates the normative limits of the Personal Data Protection Law ("KVKK"), and proposes new categories of constitutional rights for the protection of the digital identity of the individual.

I. INADEQUACY OF CONSTITUTIONAL PROTECTION MECHANISMS

The right to privacy, guaranteed in Article 20 of the Constitution of the Republic of Türkiye, has undergone a semantic transformation since its adoption in 1982.1 While the traditional constitutional interpretation prioritizes protection from state interference, the problem that emerged in the last quarter of the twentieth century is the systematic erosion of privacy, no longer just by public power, but by private corporations and technological systems.2 The Internet of Things, big data analytics, and artificial intelligence systems collect personal data from the daily life of each individual without interruption.3 The inevitable legal consequence of this situation is the necessity of redefining the traditional concept of privacy.

The current situation is not simply a "lack of protection"; rather, it indicates the legal system's paradigmatic incompatibility with technological realities. The Personal Data Protection Law No. 6698 (KVKK) was enacted in 2016, at a time when IoT applications were not widespread and artificial intelligence systems operated at limited levels.4 Indeed, the law came into force just as Türkiye transitioned to 4G (LTE-Advanced (LTE-A), 3GPP Release 10 & Release 11) mobile communication technology on April 1, 2016.5 This timing is critically important: the widespread adoption of 4G significantly accelerated personal data processing. Today, data processing mechanisms have evolved to radically challenge the law's fundamental foundations. While the number of devices in the Internet of Things ecosystem grows rapidly, the uninterrupted data flow from each device is collected in central systems and analyzed by artificial intelligence, and these developments are being reshaped not only in Türkiye but globally. The first active use of 5G is planned to commence in Türkiye on April 1, 2026. Given a new generation of communication technology and the growing importance of personal data over the last ten years, its impact will undoubtedly be far greater.

Such analysis predicts future behaviors, not merely past ones. The data an individual produces over a lifetime can, through mathematical models, predict that person's future behavior with high accuracy. This new economic order, which Zuboff conceptualizes as "surveillance capitalism", is based on transforming individuals' behavioral data into predictable products.6 Legally, this means the individual's "free will" is mathematically predetermined, which undermines the foundation of human dignity. The individual's core area of privacy is steadily shrinking under this transformation, and the evolution of technological structures and operating systems further encroaches upon it.

The core problem is that it is technically and philosophically impossible to implement Article 20 of the Constitution in 2025 as it was in 1982. This is because the concept of "private life" has changed historically. In 1982, private life meant "protection from state intervention". Private life in 2025 means "the individual's control over the digital twin created by data processors through mathematical models." In 2030–2035 and the following years, this will be evaluated against an even more critical reality. This new reality requires a restructuring of the legal system.

II. TECHNOLOGICAL TRANSFORMATION AND STRUCTURAL DEFICIENCY OF LAW

A. Hardware and Software Revolution: The Speed at which the Law Cannot Keep Up

The technological leap of the last fifteen years has far exceeded the historical pace at which legal systems evolve. The exponential growth predicted by Moore's Law is evident not only in processing power but also in storage capacity, network bandwidth, and algorithmic complexity.7 In 2010, a smartphone performed millions of operations per second; by 2025, the same class of device can perform trillions. The processors in today's smartphones have reached capacities of 1-2 trillion operations per second (TOPS).8 In addition, the arrival of artificial intelligence language models has created a significant multiplier effect in information processing. This increase in hardware and software power has created data processing capacities that were unimaginable when the legal regulations were drafted. Data processing is no longer solely user-driven; with AI language models, it is also conducted through automation. Automation, of course, is not in itself evidence of artificial intelligence, but the fact that machines can now perform operations at a human level through machine decision mechanisms, and can link and analyze the results, has a qualitatively different effect.

5G technology, now on the agenda, has reduced data transfer times to milliseconds, and future 6G, 7G, and subsequent generations will be central to managing and operating even more complex systems. These connectivity generations enable real-time data processing: an individual's location, heart rate, tone of voice, and facial expression can be collected simultaneously, processed in the cloud, and fed into decision-making mechanisms. When defining the concept of "data processing", the law did not foresee that these operations could take place within microseconds; more precisely, legislators drafted normative regulations from a perspective far removed from the working principles of these systems. Vague concepts in the KVKK, such as "within a reasonable time", become meaningless in the face of systems that process data on a millisecond scale. Real-time data processing, in which the legislator's element of "being part of a data recording system" finds no counterpart, is increasingly entering our lives. Legal regulation processes must therefore first internalize the working principles of these technologies and combine them with legal principles. Legal evaluations that ignore the technical capabilities of the underlying infrastructure are insufficient to protect the essence of the right. Conversely, an approach suggesting that every transaction falling outside these normative regulations should be deemed prohibited, or even criminal, remains shallow and simplistic.

B. Artificial Intelligence and IoT: Legal Ambiguity of Complex Systems

The Internet of Things ecosystem has shattered the traditional "data controller-data processor" dichotomy of law. For example, in a smart home system, the thermostat manufacturer, software developer, cloud service provider, network operator, and AI algorithm developer are simultaneously involved in the data cycle. The rise of Web3 dominance also complicates establishing accountability. A breach in any link of this chain affects the entire system. However, law lacks the conceptual tools to describe this distributed liability structure. For this reason, the working principles of technical infrastructures should be defined, and the elements that need to be protected under basic legal principles should be evaluated together.

Web3 technologies, decentralized networks (blockchain), and smart contracts further exacerbate the problem. While traditional law rests on the concepts of "legal personality" and "jurisdiction", an autonomous protocol running on a blockchain neither belongs to any country nor has a specific legal personality. If such a protocol processes personal data, who is to be sued, and which court is competent? Holding the developer engineer, or the first party to publish the service, responsible is no different from the 1990s approach of imposing all criminal liability on website owners. The principles of internet law have since moved far beyond that approach and now progress by internalizing technical details, yet even this remains insufficient. Current legal regulations cannot answer these questions; even where answers exist, they create neither real legal protection nor an effect that safeguards the essence of the right. Moreover, the distributed nature of the data and the absence of a single addressee for processed personal data create further difficulties in implementation. New legal definitions and regulations for these instruments are inevitable; current regulations are insufficient to protect individuals' privacy.

C. The Necessity of Lawyer-Engineer Cooperation

The most fundamental problem of law is the tendency to enact regulations without understanding the technical reality. For example, if a legislator does not know that deletion is technically impossible in a distributed database while drafting a "deletion of personal data" provision, that provision will remain unenforceable. Similarly, while requiring "algorithm disclosure," the law ignores the fact that deep learning models consist of billions of parameters, and even the concept of "explainability" is controversial. 9

The education provided in law faculties treats technology as a "secondary subject". However, in 2025, a lawyer needs to understand the basics of machine learning, database architectures, cryptography principles, and network protocols. Otherwise, regulations are doomed to be "correct on paper but meaningless in practice." Moreover, this deficiency also creates serious gaps in the protection of individuals. It is imperative that lawyers serving on the KVKK Board maintain constant dialogue with software engineers, data scientists, and cybersecurity experts.

D. Interdisciplinary Legal Design: A New Methodology

When drafting a regulation, engineering teams should be required to submit a "technical feasibility report". For example, when setting a "data breach notification within three days" rule, one should ask how long it takes to detect a breach in a company's system with billions of records. Legal regulations that lack concrete reality and are made by ignoring complex engineering-based processes also create problems in data privacy. If the detection period technically requires five days, the three-day period is unreasonable. However, such determinations will only provide the necessary legal protection when they are made after the underlying architecture and its working principles are known and understood.

In the restructuring of the KVKK, a "technical impact assessment" should be conducted for each article. This evaluation should be conducted with the participation of software companies, cybersecurity firms, and academic institutions from the relevant industry. When drafting the legal text, a "technical application guide" should be provided alongside it. This guide should explain how the provision will be applied in practice, what technologies can be used, and the minimum technical standards.

Legal education should also change radically. "Law and Technology" should be a compulsory course in law faculties, and students should be trained in basic coding, data structures, and artificial intelligence ethics. In postgraduate specialization programs, lawyers should be encouraged to undertake joint projects with engineers. Only such a cultural change can prevent law from falling behind technology. From this point on, there will be no going back, and legal doctrine and regulatory processes should proceed with greater sensitivity to this structure.

III. STRUCTURAL CONTRADICTIONS OF KVKK: STATIC REGULATION, DYNAMIC SYSTEMS

A. Collapse of the Principle of Purpose Limitation

The "principle of purpose limitation" regulated in Article 4 of the KVKK requires that a specific purpose must be predefined when collecting data.10 However, modern AI systems are built on the exact opposite logic of this principle. The unclear purpose during data collection, the emergence of new processing purposes as the system learns, and the production of derived information from original data seriously undermine the functionality of this principle.

The problem of inferential data generation is particularly critical. Psychiatric characteristics can be predicted from financial transaction records, future disease risk can be predicted from medical data, and even political tendencies can be predicted from behavioral patterns.11 The current text of the KVKK is not clear about the legal status of this derived information. While the data subject consents to the original data, they cannot predict and control which inferences will be produced. The explicit consent mechanism is also unclear when artificial intelligence is involved in these processes. Determining what data is subject to processing becomes difficult due to the system's working principles. This situation also raises the question of whether consent is truly an expression of "free will".

B. Functional Collapse of the Consent Mechanism

In the digital ecosystem, consent has become a "relational necessity" rather than a genuinely free choice mechanism in the theoretical sense. Although legal data processing conditions are determined as a general rule, individuals' control and dominance over their own data is gradually decreasing. A system has emerged in which freedom of choice has disappeared and the erosion of individuals' personal rights to survive in the current order is accepted as the "new normal". Using social media services, accessing financial applications, or even using public services requires consent to extensive data processing. In this context, consent has become a "prerequisite for exercising fundamental rights and freedoms" within the scope of Article 12 of the Constitution, and has therefore become "conditional".12 Legally, this situation questions the legitimacy of consent.

The fact that the situation is particularly critical for people at social risk also brings the social justice dimension to the agenda. Individuals receiving social assistance agree to give up privacy control in order to benefit from state-provided services. This "power imbalance" requires the particular sensitivity of the legal system.

Today, with the acceleration of digital transformation, the concept of ownership is evolving towards an increasingly ambiguous boundary. Digital access models are replacing physical property; individuals are now subject to regular and continuous subscription systems instead of one-time payments to access services. This paradigm shift profoundly impacts not only economic relations but also individuals' control over their personal data.

Data has become the most valuable asset in digital ecosystems; however, dominance over this data is declining regardless of individuals' will. Users are forced to share personal information in exchange for access to services, lacking any real control over data processing mechanisms in the process. Thus, while individuals' privacy space is narrowing, the concept of data ownership is gradually being redefined in favor of corporate actors.

This transformation brings with it important legal and ethical discussions. In particular, the protection of personal data and the right to dispose of individuals' digital identities emerges as a critical issue in the digital society of the future.

C. Incompatibility of the Right to Erasure with Technical Reality

The right to have personal data erased, destroyed, or anonymized, regulated in Articles 7 and 11 of the KVKK among the rights of the data subject, faces applicability problems given the structural features of machine learning systems.13 Even if the data controller deletes the data, the parameters used to train the model remain permanently present in the system. Although "machine unlearning" technologies are in development, they cannot yet guarantee complete deletion.

This means that the data subject's right to erase their "digital trace" is limited due to technological limitations. Law should introduce new categories to resolve this contradiction. Concepts such as the "right to deactivation" can offer an alternative protection mechanism to the right of erasure: Even if the data is not deleted, it can be guaranteed that it will not be used for any future processing. In conclusion, the rapid proliferation of artificial intelligence and machine learning technologies necessitates a reevaluation of classical legal instruments regarding the protection of personal data. The current framework of the KVKK is designed for traditional data processing processes and cannot adequately cover the permanent data traces created by algorithmic systems. For this reason, it seems inevitable for the legislator to develop new categories of rights compatible with technological realities and to adopt dynamic and technology-neutral regulations that go beyond the "right to be forgotten".
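The proposed "right to deactivation" can be pictured as a simple exclusion register: even where raw data or model parameters cannot be fully erased, the subject is blocked from all future processing. The following is a minimal sketch under that assumption; the name `DeactivationRegistry` and its interface are illustrative inventions, not terms from the KVKK or any real system.

```python
from dataclasses import dataclass, field


@dataclass
class DeactivationRegistry:
    """Records data-subject IDs whose data must be excluded from all
    future processing, even if the underlying data cannot be erased."""
    deactivated: set = field(default_factory=set)

    def deactivate(self, subject_id: str) -> None:
        # The subject exercises the right to deactivation.
        self.deactivated.add(subject_id)

    def may_process(self, subject_id: str) -> bool:
        # Any future processing must first pass this gate.
        return subject_id not in self.deactivated


registry = DeactivationRegistry()
registry.deactivate("subject-42")
assert registry.may_process("subject-17")      # unaffected subjects remain processable
assert not registry.may_process("subject-42")  # deactivated subject is blocked going forward
```

The design point is that the guarantee is forward-looking: deletion of the past trace is not promised, only that the trace will never again feed a processing operation.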

IV. CONSTITUTIONAL AND LEGAL REFORMS

A. Constitutional Redefinition: The Right to Digital Privacy

The case law of the Constitutional Court describes the right to privacy as "indispensable for developing one's personality and enabling one's participation in social life." 14 However, this definition requires transformation in the digital age. Today, privacy means not only "protection from government interference", but also "the individual's control over the digital twin created by data processors through mathematical models."

The scope of Article 20 of the Constitution should be explicitly expanded. The new provision should explicitly protect four categories of data: original data, derived data, inferential data, and predictive profiling. More importantly, the subject of data should have the right to learn the reason for automated decisions that affect them in a way that an ordinary person can understand, without technical jargon.

The concept of digital sovereignty should be recognized as a fundamental right. This means that an individual has control authority similar to ownership rights over their own data. While social media companies generate commercial value by using individuals' data, it is legally and ethically problematic for the individual not to derive any benefit from this value.

B. KVKK Reform: Urgent Legal Interventions

New provisions titled "Personal Data Processing in Artificial Intelligence Systems" should be added to the KVKK. Currently, there are certain normative regulations, but these regulations lack transparency. In addition, there are no special provisions regarding artificial intelligence systems in the laws yet, but the necessary transformation has started globally in this regard. The articles to be enacted should include the obligation to disclose the reason for algorithmic decisions, the right to be informed prior to automated profiling activities, and the requirement to disclose the categories of data used in model training.15

The consent mechanism should be structured as a "tiered consent model". The data subject must be able to provide separate consents for inferential data and predictive profiling while consenting to the original data collection. This separation will ensure that consent is truly "informed consent".
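The tiered consent model described above can be sketched as a record in which each tier is granted and revoked independently, so that consent to original data collection never implies consent to inference or profiling. The tier names below are assumptions chosen for the example, not KVKK terminology.

```python
from enum import Enum


class ConsentTier(Enum):
    ORIGINAL_DATA = "collection of original data"
    INFERRED_DATA = "generation of inferential data"
    PREDICTIVE_PROFILING = "predictive profiling"


class ConsentRecord:
    def __init__(self):
        # Every tier defaults to "no consent"; nothing is bundled.
        self.granted = {tier: False for tier in ConsentTier}

    def grant(self, tier: ConsentTier) -> None:
        self.granted[tier] = True

    def revoke(self, tier: ConsentTier) -> None:
        self.granted[tier] = False

    def allows(self, tier: ConsentTier) -> bool:
        return self.granted[tier]


record = ConsentRecord()
record.grant(ConsentTier.ORIGINAL_DATA)
# Consent to original data does NOT carry over to profiling:
assert record.allows(ConsentTier.ORIGINAL_DATA)
assert not record.allows(ConsentTier.PREDICTIVE_PROFILING)
```

Separating the tiers in the data model itself is what makes "informed consent" checkable in practice rather than a single all-or-nothing checkbox.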

Data responsibility in the IoT ecosystem has become blurred. Responsibility is spread across the device manufacturer, service provider, network operator, and data analyst. The "chain of responsibility" principle should be introduced in the KVKK, the scope of responsibility of each actor should be precisely determined, and the obligations to take measures should be defined. The data controller-data processor structure alone will no longer suffice to carry these elements. The reason is clear: the focus should be on tomorrow's landscape, not today's perspective. Otherwise, very serious and complex legal problems will arise. Most importantly, individuals' "personal rights" will suffer irreparable damage and consequences.

C. Privacy by Design: Legal Obligation

The principle of "Privacy by Design" should be codified as a legal obligation, not just a recommendation. The marketability of an IoT device must depend on incorporating privacy protection mechanisms from the design stage. The KVKK Authority should supervise the implementation of these requirements through a certification system. In other words, control elements such as hardware standards, health effects, and security should also apply to software. Existing regulations need to be approached beyond the fact that the "personal data protection law" remains a law that cannot protect personal data.

The success of this application depends on collaboration between law and engineering/computer science. By establishing a Technical Advisory Board, it should be evaluated whether the decisions made by the KVKK are "technically feasible and practical". When the Personal Data Protection Law No. 6698 came into force in 2016, it adopted a very strict approach to the transfer of personal data abroad. In the first version of the law, data transfer abroad was prohibited as a rule and only possible in certain exceptional cases. Although this regulation was seen as an important step in terms of protecting personal data, it posed serious challenges in practice considering technological infrastructures and the internet ecosystem. With the law updates made in 2024, the rules regarding data transfer abroad underwent radical changes. In particular, the concept of "incidental transfer" emerged, providing flexibility for data transfer abroad under certain conditions. However, these changes were insufficient to address the fundamental issues encountered in practice. Determining when and where incidental transfer begins, as written in the provision and considered from a technical perspective, remains complicated.

Therefore, neither the original 2016 regulation nor the 2024 updates provide a framework fully compatible with technological infrastructures. The main reason is that regulations alone cannot produce results. For example, neither the user nor the data controller typically knows which servers are located abroad, whether within the BTK (Information and Communication Technologies Authority), the Access Providers Association, or individual Internet Service Providers; the user processing the data simply cannot know this. With current technical capabilities, it is almost impossible to determine to which country transmitted data is transferred. Even when a user sends a simple email, there is no way to know in which country the message will be stored, and a domain address alone does not reveal whether a service is hosted domestically or abroad. This severely limits the enforceability of the regulations and is a small example of how disconnected the legislator has become from technological realities. Legal regulations created from this perspective neither serve their purpose nor offer mechanisms that protect the essence of the right. It would not be wrong to say that regulations made in disregard of technological considerations have no effect in solving problems or protecting rights.

V. TRANSFORMATION OF THE LEGAL THOUGHT PARADIGM

A. From Absolutism to Pragmatism

The law has to move from the mission of "providing perfect privacy" to the mission of "controlled transparency and harm reduction". Complete privacy has become a technical impossibility in the digital age. In the context of Article 13 of the Constitution, under the "principle of proportionality", regulations should observe the principles of reasonable protection, gradual implementation, and technological neutrality.

B. Evolutionary Law: Dynamic Update Mechanism

The current structure of the KVKK is a static regulation. To adapt to the pace of technological development, a "sunset clause" mechanism should be established: each legal provision should be limited to a period of seven years, at the end of which the question "Is this provision still valid?" must be asked. If technology has advanced significantly, the provision should be completely rewritten.

In addition, a "Future Watch Board" should be established within the KVKK, which will follow technological developments and propose legal reforms in advance. For example, if it is known that quantum computers will break existing encryption, "post-quantum cryptography" standards should be made mandatory now.

VI. NEW RIGHTS AND PROTECTION MECHANISMS

A. The Right to Data: Recognition as a New Fundamental Right

The law should go beyond "data privacy" and recognize "data rights" as a fundamental right, encompassing the rights to access, use, modify, delete, and derive economic benefit from data. This would allow an individual to "rent" their own data for commercial purposes and generate income. Although aspects of this right are constitutionally regulated, the current provisions are ambiguous and insufficient. Data controllers are not transparent about their data processing activities and do not provide adequate access to the data. Moreover, data cannot easily be held and separated as if drawn out by a magnet; it is complex and intertwined, which makes classification difficult.

If health researchers want to use an individual's health data, that individual should be able to receive "data rent". This approach not only recognizes the concept of "data labor" but also creates an economically fair sharing mechanism. The data regulations that the European Union is preparing represent pioneering steps in this regard. Türkiye can adopt these regulations and adapt them to national conditions. If an activity is carried out on a person's data, a corresponding value should be created. The approach of obtaining information in connection with the purpose, proportionately and within limits, and using it as desired while ignoring individuals in the income model based on their data should no longer be considered correct. Regarding the different transformations of personal data due to technological innovations, special sensitivity and detailed provisions should be included in the regulations. The exponential growth of artificial intelligence and new technologies causes the provisions written in legal texts to be insufficient in their implementation. Dominance and control mechanisms over personal data should be added.

B. Social Justice Dimension: Protecting Vulnerable Groups

Data privacy protection is not only a matter of individual rights but also of social justice. Individuals with low income levels, low education levels, and high dependence on basic public services are most exposed to data abuse. Vulnerable groups such as social assistance recipients, disabled individuals, and the elderly population should have the right to enhanced protection.

A "Social Protection Principle" should be introduced in the KVKK, ensuring that consent from these groups is subject to stricter conditions and that stronger control mechanisms govern data processing concerning them. For example, data collection in the social assistance system must be minimal and must not condition the receipt of benefits on data sharing, because the personal data of individuals in vulnerable groups can be easily obtained, seriously damaging their privacy.

C. Control Mechanisms for Algorithmic Decision Making

The provision on automated decision-making in Article 11 of the KVKK needs comprehensive revision in light of today's practices. A new provision should cover not only automated decisions but also human decisions derived from them. For example, if an AI system suggests refusing a loan and a natural person, the inspector, merely approves that suggestion, the decision should still be treated as falling within the "automated decision" category.

The data subject should have the right to request an explanation for every decision that affects them. These explanations should not contain technical terminology such as "the algorithm logit value was x" and should be provided in language that the average person can understand. After disclosure, the individual should be able to request a re-evaluation by a non-automated mechanism, and this request should be answered within a reasonable time. This mechanism represents "algorithmic justice".
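As a minimal illustration of such a plain-language explanation requirement, the sketch below maps a model's top decision factors to jargon-free reasons. The factor names, decision wording, and the mapping itself are invented for this example; they do not come from any real scoring system.

```python
# Hypothetical mapping from internal model factors to everyday language.
PLAIN_LANGUAGE = {
    "debt_to_income": "your monthly debt is high relative to your income",
    "short_credit_history": "your credit history is shorter than two years",
    "recent_missed_payment": "a payment was missed in the last six months",
}


def explain_decision(decision: str, top_factors: list[str]) -> str:
    """Return an explanation free of technical terminology such as logit values."""
    reasons = [PLAIN_LANGUAGE.get(f, f) for f in top_factors]
    return (f"The application was {decision} mainly because "
            + "; ".join(reasons) + ".")


msg = explain_decision("refused", ["debt_to_income", "recent_missed_payment"])
assert "logit" not in msg  # no technical jargon reaches the data subject
assert "debt" in msg
```

The legal point the sketch captures is the direction of translation: the burden of converting model internals into language an average person can understand lies with the data controller, not the data subject.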

VII. DAILY LIFE AND PRACTICES

A. An Individual's Data Life: A Sample Daily Scenario

Consider a sample day: a smart alarm clock at 06:30, a smart shower at 07:00, a smart coffee machine at 07:30, a personalized news summary at 08:00, smart car navigation at 08:15, GPS location tracking at 08:30, behavioral analysis on the office computer at 12:00, a return-route suggestion at 18:00, viewing preferences at 20:00, and sleep data recording at 22:00. The amount of data generated daily keeps increasing: every new application, every new IoT technology, and every internet-connected piece of hardware and software is a further step toward profiling individuals and obtaining their personal data.

This data forms a digital twin that is much more detailed than the physical person: everything from heart rate, sleep patterns, spending habits, viewing preferences, to social relationships. An artificial intelligence system can analyze this data and make future predictions. These predictions are sold to insurance companies, employers, and even government agencies. The individual is completely unaware of these processes. This is where transparent regulations are needed.

Legally, this means that the individual's "future is mathematically predetermined". Free will is no longer a theoretical concept but a calculated prediction. This undermines the foundation of human dignity.

B. Hierarchical Protection of Biometric Data

Biometric data, such as facial recognition data, fingerprints, iris scans, and voice recordings, are direct representations of one's physical identity. This data is unchangeable, irreplaceable, and fixed for life: a forgotten password can be changed, but a biometric data breach is irreversible. This fact shows that biometric data must be subject to a special protection regime. The conditions for processing special categories of personal data are already more stringent; however, decisions of the Personal Data Protection Authority and the courts clearly show that the existing regulations do not provide sufficient protection.

In Türkiye, smart city applications, airport security systems, banking transactions, and even biometric identification systems, which are becoming increasingly common in private sector offices, constitute an area that is not yet sufficiently regulated. Although Article 6 of the KVKK16 refers to biometric data in the category of special categories of personal data, there are no clear standards regarding the collection, storage, and sharing of this data with third parties.

With a new provision to be added to the KVKK, a "layered protection model" should be adopted for the collection of biometric data. The model should comprise three levels:

- Level one: data stored only on the local device and never uploaded to a central system.
- Level two: data used for temporary verification purposes and automatically deleted after processing.
- Level three: data retained in a central system as required by law, subject to strict encryption and access control.

Finally, the structural and operational details of how these mechanisms function should be specified and publicly disclosed. Merely enacting normative regulations is not enough to protect the essence of the right; without such operational detail, the provisions cannot be properly implemented and remain a dead letter.
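
For illustration only, the three levels can be expressed as a small rule table. The tier names and the boolean rules below are hypothetical labels invented for this sketch; they are not terms from the KVKK or any existing regulation.

```python
from enum import Enum

# Hypothetical sketch of the proposed three-level model; tier names
# and rules are illustrative, not drawn from the KVKK.

class BiometricTier(Enum):
    LOCAL_ONLY = 1          # level 1: stored on-device, never uploaded
    EPHEMERAL = 2           # level 2: one-time verification, auto-deleted
    CENTRAL_RESTRICTED = 3  # level 3: central retention under legal mandate

def storage_rules(tier: BiometricTier) -> dict:
    """Map each tier to the storage rules described in the text."""
    return {
        BiometricTier.LOCAL_ONLY:
            {"central_upload": False, "delete_after_use": False, "encryption_required": True},
        BiometricTier.EPHEMERAL:
            {"central_upload": False, "delete_after_use": True, "encryption_required": True},
        BiometricTier.CENTRAL_RESTRICTED:
            {"central_upload": True, "delete_after_use": False, "encryption_required": True},
    }[tier]

print(storage_rules(BiometricTier.EPHEMERAL)["delete_after_use"])  # True
print(storage_rules(BiometricTier.LOCAL_ONLY)["central_upload"])   # False
```

Expressing the model this way makes the regulatory question concrete: for each tier, the permitted storage location, retention behavior, and security obligations can be stated as checkable rules rather than general principles.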

C. Children's Digital Privacy: The Principle of Proactive Protection

In the digital age, children are the most vulnerable actors in data production. Through educational platforms, game applications, social media, and smart toys, children's behavior, preferences, and learning processes are constantly recorded. The future use of this data against the child is a socially unacceptable risk. First, parents often do not fully understand what data is being collected and how it is being used. Second, children cannot be expected to understand the value and risks of their personal data without "digital literacy" training.

A "digital privacy education" course should be included in Türkiye's education system. Starting at the primary school level, children should be taught about data privacy, the protection of personal information, and the consequences of their digital footprint, along with internet security. In addition, the "precautionary principle" should be applied to digital services aimed at children, and the commercial use of children's data should be restricted.

VIII. TECHNOLOGICAL AND INTERNATIONAL DIMENSIONS

A. AI Transparency and the Right to Explanation

The issue of transparency in algorithmic decisions is not just a technical issue but a foundation of democratic legitimacy. Critical decisions, such as rejecting an individual's loan application, eliminating a job application, raising insurance premiums, or being deemed ineligible for social assistance, are increasingly being made by AI systems. Explaining the reasons for these decisions to the individual should be a legal requirement.

An "explainability certificate" should be made mandatory for artificial intelligence systems used in the public sector and critical private sector areas (finance, health, justice, education). This certification validates the system's capacity to explain its decisions in an understandable manner. The certification process should be carried out by independent technical auditors and renewed periodically.

B. Gaps in Cross-Border Data Transfer

The global nature of the digital economy reveals that data flows cannot be limited by national borders. The data of Turkish citizens is constantly transferred abroad through cloud services, international social media platforms, and services offered by multinational companies. Although Article 9 of the KVKK17 regulates these transfers, there are serious difficulties in practical implementation.

Türkiye should adopt a "data localization" policy for critical data categories. In particular, data related to public services, health, finance, and national security must be stored on servers located within Türkiye's borders. However, localizing all data is economically and technically impractical. For this reason, a "hybrid model" is proposed: the first category of data (critical sensitive data) should remain domestic, the second category of data (commercial data) can be transferred to countries with adequate protection standards, and the third category of data (anonymized data) can be transferred freely.
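
The proposed hybrid model can be sketched as a simple decision rule. This is an illustrative sketch only: the category names, country codes, and the adequacy set below are placeholder assumptions and do not reflect any actual adequacy decision under Article 9 of the KVKK.

```python
# Illustrative sketch of the proposed hybrid transfer model. The
# category names and the adequacy set are hypothetical placeholders,
# not actual adequacy decisions under Article 9 of the KVKK.

ADEQUATE_DESTINATIONS = {"DE", "NL"}  # placeholder adequacy decisions

def may_transfer(category: str, destination: str) -> bool:
    """Apply the three-category rule: critical data stays domestic,
    commercial data requires an adequacy decision, anonymized data
    may be transferred freely."""
    if category == "critical":
        return destination == "TR"
    if category == "commercial":
        return destination == "TR" or destination in ADEQUATE_DESTINATIONS
    if category == "anonymized":
        return True
    raise ValueError(f"unknown data category: {category}")

print(may_transfer("critical", "US"))     # False
print(may_transfer("commercial", "DE"))   # True
print(may_transfer("anonymized", "JP"))   # True
```

The sketch shows the design choice behind the hybrid model: rather than a single rule for all data, the permissibility of a transfer turns on the category of the data and the status of the destination.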

C. Rapid Response System in Data Breaches

Data breaches are an inevitable reality of the digital age. No matter how strong the cybersecurity measures are, zero risk is not possible. Therefore, the legal system should focus on rapid post-breach response and harm minimization strategies rather than aiming to prevent breaches altogether.

Article 12 of the KVKK18 regulates the notification of data breaches to the Authority and to the persons concerned. However, the current regulation is insufficient as to the timing and content of the notification and the support to be provided to individuals affected by the breach. "Data breach insurance" should be made mandatory for organizations that process data on a large scale. Such insurance would guarantee direct compensation to affected individuals, allowing those harmed to obtain redress without lengthy legal proceedings. For this to work, the Personal Data Protection Authority must not only publish and announce breach notifications but also have an active duty, and a corresponding mechanism, to determine the concrete damage caused by a breach. This task could be carried out by a sub-commission within the Authority or by a joint commission working in an integrated manner with the new cybersecurity law. In this regard, regulatory authorities should also evaluate possible detection mechanisms.

Today, although regulations on the protection of personal data aim to protect the rights of individuals, serious gaps remain in practice. A simple example makes this concrete: suppose you deposited money into your account at a bank, and one day the bank's vault was broken into and the money stolen. The bank informed the public as follows:

"The money in your accounts has been stolen. Respectfully announced to the public."

Does the bank's responsibility towards its customers end when this information is provided? Of course not. The bank has to fulfill both its legal and factual responsibilities towards the depositor.

So why is the situation different for personal data? In the event of a data breach, the relevant institution is only obliged to notify the breach. Individuals whose personal data has been breached, already victims, are left to pursue the process themselves after this notification, navigating court and litigation proceedings on their own.

This leads to double victimization: first your data is breached, and then you must navigate a complex process alone to protect your rights. This gap clearly reveals how inadequate the existing regulations are in protecting individuals. As with financial losses, institutions must take more effective responsibility for personal data breaches, and the mechanisms that protect individuals' rights must be strengthened.

IX. INSTITUTIONALIZATION AND AWARENESS

A. Institutionalization of AI Ethics Committees

Given the rapid development of technology, the law is always one step behind. To close this gap, "ethical guidance" mechanisms should be established in addition to legal regulations. Every organization developing artificial intelligence systems should establish an independent Ethics Committee and evaluate the social impacts of the technology.

An "Artificial Intelligence Ethical Audit Unit" should be established within the KVKK Authority. This unit should periodically audit AI systems used in the public and private sectors for ethical standards. Audits should cover algorithmic discrimination, data minimization, transparency, and accountability criteria.

B. Digital Literacy: Social Awareness Strategy

Even the most advanced legal regulations will be ineffective if society is unaware of these issues. The level of digital literacy in Türkiye is low, especially regarding data privacy. Most individuals do not know how their data is collected, where it is stored, or who has access to it. This ignorance paves the way for abuse.

Under the coordination of the Ministry of National Education, a comprehensive "Digital Privacy Awareness Campaign" can be launched. This campaign should not be limited to schools but can be supported by adult education programs, public service announcements, social media campaigns, and workshops held in community centers.

X. CONCLUSION AND RECOMMENDATIONS

Establishing a Human-Centered Digital Legal Order

The issue of data privacy and the protection of privacy will be one of the critical areas where the Turkish legal system and democratic institutions are "tested" in the 2025-2035 period. The decisions made here will reveal whether Türkiye can create a digital order that puts human dignity at the center. Moreover, this is a problem that is on the agenda of not only Türkiye but the whole world.

The law should adopt a strategy of "technology orientation" instead of "technology resistance". The basic strategy should be implemented in a coordinated manner across eight elements: the obligation of privacy by design; recognition (and expansion) of the right to data as a fundamental right; democratic control of algorithmic decision-making; an evolutionary legal paradigm; the principle of social justice protection; layered protection of biometric data; guarantees for children's digital rights; and a vision of technological sovereignty. Together, these strategies build a constructive bridge between law and technology. Of course, the topics discussed here reflect only part of the subject; a deeper and more comprehensive examination is possible. This study uses illustrative topics and evaluations to reveal concrete existing deficiencies and to indicate the direction in which regulatory proposals could develop.

The success of the legal system depends on the transition from static rules to a dynamic system of principles. Technology changes every day; the law has to keep up with this pace. But technical regulations are not the only thing that needs to change—legal thinking itself needs to be radically transformed.

Three critical transformations must occur: First, data privacy must move from a "negative right" (non-interference) to a "positive right" (control and management). The individual should not only be protected but should also be able to actively manage their data. Second, the law should move from a "punishment-oriented" approach to a "prevention-oriented" approach. Instead of imposing penalties after data breaches occur, systems should be established to prevent breaches from occurring. Third, regulations should be "value-based" rather than "technology-based". No matter what technology is used, the value that needs to be protected is human dignity.

The fundamental principle is simple but powerful: Law exists to protect people, not technology. The raison d'être of law is to guarantee human dignity and freedom. Technology is a tool that makes human life easier; under no circumstances can it become a system that instrumentalizes human beings. If this principle is remembered and implemented with determination, Türkiye can set an exemplary model for the world with its "fair data law" that puts human dignity at the center. Such an approach not only safeguards the rights of individuals but also ensures that technology thrives within ethical boundaries.

The main function of law is to establish a balance between the individual and power. In the modern age, that power lies not only with the state but also with technology companies. When personal data becomes an element of economic value, the individual's right to privacy is overshadowed by market dynamics. At this point, the right to privacy guaranteed by Article 20 of the Constitution and regulations such as the KVKK must provide not only theoretical protection but also a de facto guarantee. When the law is not applied in a way that protects individual autonomy, technology exploits legal loopholes and reduces the individual to a "data source".

Therefore, data law is not only a technical regulatory field but is also directly related to fundamental rights and freedoms law and personality law. The European Union's General Data Protection Regulation ("GDPR") epitomizes this understanding. The GDPR does not limit data processing activities solely to the principle of transparency and consent; it also grants the individual rights such as the "right to be forgotten", "data portability", and "protection against automated decision-making". Similarly, Türkiye needs to strengthen the KVKK, facilitate the process of individuals seeking their rights after a breach, and impose proactive responsibility on institutions. The purpose of law is not to eliminate victimization after a breach but to prevent the breach.

Otherwise, uncontrolled technology will turn individuals into "data carriers"; human beings will cease to be the subject of their own life and become the object of algorithms. Autonomy and freedom will remain only a theoretical concept; in practice, they will become a dream. This undermines the fundamental values of democratic societies and eliminates the right of individuals to have a say over their own identities. The duty of law is to stop this process and establish an order that puts people at the center.

Türkiye has a great opportunity before it: to develop a data law model that puts human dignity at the center, protects the autonomy of the individual, and limits technology to ethical principles. Such a model can set an example not only at the national level but also on a global scale. The biggest challenge of the future is not to control technology but to keep technology at the service of people. Law is the most powerful tool in this struggle.

Footnotes

1 Kaboğlu, İ. Ö. (2002). Özgürlükler Hukuku: İnsan Haklarının Hukuksal Yapısı. Ankara: İmge Kitabevi, s. 234-267; Gözler, K. (2011). Türk Anayasa Hukuku Dersleri. Bursa: Ekin Yayınevi, s. 456-478.

2 Solove, D. J. (2008). Understanding Privacy. Harvard University Press, s. 1-23.

3 Weber, R. H. (2010). "Internet of Things–New security and privacy challenges". Computer Law & Security Review, 26(1), 23-30

4 6698 Sayılı Kişisel Verilerin Korunması Kanunu, RG: 07.04.2016, Sayı: 29677.

5 Bilgi Teknolojileri ve İletişim Kurumu (BTK), Türkiye Elektronik Haberleşme Sektörü Üç Aylık Pazar Verileri Raporu – 2017 1. Çeyrek, Ankara 2017, s. 15 vd.; ayrıca bkz. 3rd Generation Partnership Project (3GPP), LTE Release 10 & beyond (LTE-Advanced), (Release 10 Tanıtım Sayfası), https://www.3gpp.org/specifications-technologies/releases/release-10 ; ETSI, 3GPP TR 36.912 V10.0.0, Feasibility study for Further Advancements for E-UTRA (LTE-Advanced), 2011,

6 Zuboff, S. (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. New York: PublicAffairs, s. 63-97

7 Moore, G. E. (1965). "Cramming More Components onto Integrated Circuits." Electronics, 38(8), 114–117.

8 Apple Inc., "iPhone 15 Pro – Tech Specs", 2023; Qualcomm Technologies Inc., "Snapdragon 8 Gen 3 Mobile Platform", 2023, her iki kaynakta da ilgili yongaların yapay zekâ hızlandırıcılarının TOPS seviyeleri belirtilmektedir.

9 Arrieta, A. B., et al. (2020). "Explainable Artificial Intelligence (XAI): Concepts, Taxonomies, Opportunities and Challenges toward Responsible AI". Information Fusion, 58, s. 82-115.

10 6698 sayılı Kişisel Verilerin Korunması Kanunu, m. 4/2-a (RG: 07.04.2016/29677): "Kişisel verilerin işlenmesinde aşağıdaki ilkelere uyulması zorunludur: a) Hukuka ve dürüstlük kurallarına uygun olma.

11 Küzeci, Elif (2020).Kişisel Verilerin Korunması. 4. Baskı, Turhan Kitabevi, Ankara, s. 145-189.

12 T.C. Anayasası m. 12. "I. Temel hak ve hürriyetlerin niteliği Madde 12 – Herkes, kişiliğine bağlı, dokunulmaz, devredilmez, vazgeçilmez temel hak ve hürriyetlere sahiptir. Temel hak ve hürriyetler, kişinin topluma, ailesine ve diğer kişilere karşı ödev ve sorumluluklarını da ihtiva eder.

13 6698 sayılı Kişisel Verilerin Korunması Kanunu (KVKK), m. 7, m. 11, m. 17 (RG: 07.04.2016/29677):

"Kişisel verilerin silinmesi, yok edilmesi veya anonim hâle getirilmesi - MADDE 7- (1) Bu Kanun ve ilgili diğer kanun hükümlerine uygun olarak işlenmiş olmasına rağmen, işlenmesini gerektiren sebeplerin ortadan kalkması hâlinde kişisel veriler resen veya ilgili kişinin talebi üzerine veri sorumlusu tarafından silinir, yok edilir veya anonim hâle getirilir. (2) Kişisel verilerin silinmesi, yok edilmesi veya anonim hâle getirilmesine ilişkin diğer kanunlarda yer alan hükümler saklıdır. (3) Kişisel verilerin silinmesine, yok edilmesine veya anonim hâle getirilmesine ilişkin usul ve esaslar yönetmelikle düzenlenir.

İlgili kişinin hakları - MADDE 11- (1) Herkes, veri sorumlusuna başvurarak kendisiyle ilgili; a) Kişisel veri işlenip işlenmediğini öğrenme, b) Kişisel verileri işlenmişse buna ilişkin bilgi talep etme, c) Kişisel verilerin işlenme amacını ve bunların amacına uygun kullanılıp kullanılmadığını öğrenme, ç) Yurt içinde veya yurt dışında kişisel verilerin aktarıldığı üçüncü kişileri bilme, d) Kişisel verilerin eksik veya yanlış işlenmiş olması hâlinde bunların düzeltilmesini isteme, e) 7 nci maddede öngörülen şartlar çerçevesinde kişisel verilerin silinmesini veya yok edilmesini isteme, f) (d) ve (e) bentleri uyarınca yapılan işlemlerin, kişisel verilerin aktarıldığı üçüncü kişilere bildirilmesini isteme, g) İşlenen verilerin münhasıran otomatik sistemler vasıtasıyla analiz edilmesi suretiyle kişinin kendisi aleyhine bir sonucun ortaya çıkmasına itiraz etme, ğ) Kişisel verilerin kanuna aykırı olarak işlenmesi sebebiyle zarara uğraması hâlinde zararın giderilmesini talep etme, haklarına sahiptir.

Suçlar - MADDE 17- (1) Kişisel verilere ilişkin suçlar bakımından 26/9/2004 tarihli ve 5237 sayılı Türk Ceza Kanununun 135 ila 140 ıncı madde hükümleri uygulanır. (2) Bu Kanunun 7 nci maddesi hükmüne aykırı olarak; kişisel verileri silmeyen veya anonim hâle getirmeyenler 5237 sayılı Kanunun 138 inci maddesine göre cezalandırılır.

14 Anayasa Mahkemesi, B. No: 2018/11988, 10/03/2022, § 52.

15 European Commission (2024). "AI Act enters into force". Erişim: https://commission.europa.eu/news-and-media/news/ai-act-enters-force-2024-08-01_en.

16 6698 sayılı Kişisel Verilerin Korunması Kanunu (KVKK), m. 6 (RG: 07.04.2016/29677): "Özel nitelikli kişisel verilerin işlenme şartları - MADDE 6- (1) Kişilerin ırkı, etnik kökeni, siyasi düşüncesi, felsefi inancı, dini, mezhebi veya diğer inançları, kılık ve kıyafeti, dernek, vakıf ya da sendika üyeliği, sağlığı, cinsel hayatı, ceza mahkûmiyeti ve güvenlik tedbirleriyle ilgili verileri ile biyometrik ve genetik verileri özel nitelikli kişisel veridir. (2) Özel nitelikli kişisel verilerin, ilgilinin açık rızası olmaksızın işlenmesi yasaktır. (3) Birinci fıkrada sayılan sağlık ve cinsel hayat dışındaki kişisel veriler, kanunlarda öngörülen hâllerde ilgili kişinin açık rızası aranmaksızın işlenebilir. Sağlık ve cinsel hayata ilişkin kişisel veriler ise ancak kamu sağlığının korunması, koruyucu hekimlik, tıbbî teşhis, tedavi ve bakım hizmetlerinin yürütülmesi, sağlık hizmetleri ile finansmanının planlanması ve yönetimi amacıyla, sır saklama yükümlülüğü altında bulunan kişiler veya yetkili kurum ve kuruluşlar tarafından ilgilinin açık rızası aranmaksızın işlenebilir. (4) Özel nitelikli kişisel verilerin işlenmesinde, ayrıca Kurul tarafından belirlenen yeterli önlemlerin alınması şarttır.

17 6698 sayılı Kişisel Verilerin Korunması Kanunu (KVKK), m. 9 (RG: 07.04.2016/29677; Değişik: 02.03.2024 t. 7499 s. K. m.34): "Kişisel verilerin yurt dışına aktarılması - MADDE 9- (1) Kişisel veriler, 5 inci ve 6 ncı maddelerde belirtilen şartlardan birinin varlığı ve aktarımın yapılacağı ülke, ülke içerisindeki sektörler veya uluslararası kuruluşlar hakkında yeterlilik kararı bulunması halinde, veri sorumluları ve veri işleyenler tarafından yurt dışına aktarılabilir. (2) Yeterlilik kararı, Kurul tarafından verilir ve Resmî Gazete'de yayımlanır. (...) (4) Kişisel veriler, yeterlilik kararının bulunmaması durumunda, (...) aşağıda belirtilen uygun güvencelerden birinin taraflarca sağlanması halinde (...) yurt dışına aktarılabilir: (...) b) Ortak ekonomik faaliyette bulunan teşebbüs grubu bünyesindeki şirketlerin (...) Kurul tarafından onaylanan bağlayıcı şirket kurallarının varlığı, c) Kurul tarafından ilan edilen (...) standart sözleşmenin varlığı (...)

18 6698 sayılı Kişisel Verilerin Korunması Kanunu (KVKK), m. 12 (RG: 07.04.2016/29677): "Veri güvenliğine ilişkin yükümlülükler - MADDE 12- (1) Veri sorumlusu; a) Kişisel verilerin hukuka aykırı olarak işlenmesini önlemek, b) Kişisel verilere hukuka aykırı olarak erişilmesini önlemek, c) Kişisel verilerin muhafazasını sağlamak, amacıyla uygun güvenlik düzeyini temin etmeye yönelik gerekli her türlü teknik ve idari tedbirleri almak zorundadır. (2) Veri sorumlusu, kişisel verilerin kendi adına başka bir gerçek veya tüzel kişi tarafından işlenmesi hâlinde, birinci fıkrada belirtilen tedbirlerin alınması hususunda bu kişilerle birlikte müştereken sorumludur. (3) Veri sorumlusu, kendi kurum veya kuruluşunda, bu Kanun hükümlerinin uygulanmasını sağlamak amacıyla gerekli denetimleri yapmak veya yaptırmak zorundadır. (4) Veri sorumluları ile veri işleyen kişiler, öğrendikleri kişisel verileri bu Kanun hükümlerine aykırı olarak başkasına açıklayamaz ve işleme amacı dışında kullanamazlar. Bu yükümlülük görevden ayrılmalarından sonra da devam eder. (5) İşlenen kişisel verilerin kanuni olmayan yollarla başkaları tarafından elde edilmesi hâlinde, veri sorumlusu bu durumu en kısa sürede ilgilisine ve Kurula bildirir. Kurul, gerekmesi hâlinde bu durumu, kendi internet sitesinde ya da uygun göreceği başka bir yöntemle ilan edebilir.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
