ARTICLE
23 March 2026

World Consumer Rights Day (Part 1): Digital Fairness Act - The Road To Fitness For Online Consumers?

William Fry

Contributor

William Fry is a leading corporate law firm in Ireland, with over 350 legal and tax professionals and more than 500 staff. The firm's client-focused service combines technical excellence with commercial awareness and a practical, constructive approach to business issues. The firm advises leading domestic and international corporations, financial institutions and government organisations. It regularly acts on complex, multi-jurisdictional transactions and commercial disputes.

To celebrate World Consumer Rights Day, the William Fry Technology team prepared a three-part series highlighting key EU consumer law developments set to shape consumer digital contracts in 2026.

This Part 1 sets the scene by examining the policy direction of travel through the proposed Digital Fairness Act. Parts 2 and 3 then look more closely at how this broader shift is taking shape in practice – including potential changes to the classification of streaming services and the introduction of a mandatory online “withdrawal button”.

Origin of the DFA

In 2024, the European Commission (Commission) conducted a Digital Fairness Fitness Check (Fitness Check) to assess whether the current patchwork of EU consumer protection laws provides adequate safeguards to consumers from exploitative online practices. While the Fitness Check confirmed that existing laws remain suitable to regulate these practices, it also identified shortcomings driven by various factors. These included: a lack of compliance from traders due to legal uncertainty, fragmented enforcement by competent authorities and the growing complexity of the digital environment.

To address these shortcomings, the Commission proposed the Digital Fairness Act (DFA), launching a targeted public consultation and call for evidence in July 2025 before publishing any draft legislative text of the DFA. That consultation concluded in late 2025, and the Commission has recently published a factual summary report of the responses received. The report provides early and valuable insights into where stakeholders see the greatest risks to digital consumers. It also highlights where existing law is viewed as falling short and where regulatory intervention under the DFA is most strongly supported.

Drawing on those consultation outcomes, in this article we identify:

  • the key themes emerging from stakeholder feedback;
  • the organisations that should closely follow the development of the DFA; and
  • the areas where regulatory change now appears most likely.

Who is impacted by the DFA?

The DFA is set to reshape the landscape for businesses engaging directly with consumers online. Organisations operating in the business-to-consumer digital space, especially those with customer-facing platforms, subscription-based models, personalised marketing or dynamic pricing, should be particularly alert to the DFA's requirements.

Industries likely to feel the greatest impact include e-commerce, streaming and subscription services, airline and travel platforms, online video games and delivery apps.

In short, any business offering products or services directly to consumers online will need to review its practices to ensure compliance with the new legislation.

Focus areas of the DFA consultation

In many respects, existing EU consumer protection legislation already regulates aspects of the issues identified through the Fitness Check. Key issues explored through the public consultation are:

  • Whether existing frameworks can be improved (and amended) to bridge identified gaps and support more effective enforcement; or
  • Whether a new, standalone legislative instrument is required to address online‑specific risks.

While the consultation sought views on each approach, the Commission's decision to advance the DFA suggests that a new, dedicated legislative response is currently the preferred option.

The consultation attracted over 3,300 responses, the majority of which were submitted by consumers, alongside detailed input from businesses, industry associations, regulators and civil society organisations. Although stakeholder views diverged in certain areas, particularly between consumer and business respondents, the feedback revealed a number of consistent themes. These themes provide early insights into the areas where regulatory intervention is most strongly supported, as well as those where appetite for new rules is more limited.

Against that backdrop, the Commission consulted on the following gaps and areas of legal uncertainty:

1. Dark patterns

Understanding dark patterns

Dark patterns are online design tactics that unfairly steer, deceive or manipulate users towards choices that benefit businesses, often without the user's awareness. Examples include false urgency, emotional manipulation or misleading consent options. These can lead to unintended purchases, unwanted subscriptions or users unwittingly sharing data.

Existing rules and gaps

As we previously explored here, several existing laws and guidelines, including the Digital Services Act (DSA), Data Act, AI Act and European Data Protection Board (EDPB) guidance under the GDPR, address dark patterns from different perspectives. However, this patchwork legal framework has resulted in fragmented regulation and inconsistent enforcement, prompting calls for clearer and harmonised rules. This has led to questions about whether a dedicated regime under the DFA is needed to tackle these deceptive design practices more effectively.

Feedback from consultation

The consultation responses show strong support for new EU action in this area. Consumers overwhelmingly want clearer and more enforceable rules to address dark patterns. Businesses were more cautious, stressing the need for legal certainty and proportionality. Even so, most stakeholders agreed that relying on the current framework has produced inconsistent application and enforcement. This strengthens the case for greater clarity and consistency under the proposed DFA.

2. Addictive design of digital products

Understanding addictive designs

Addictive designs are features that encourage users, especially vulnerable groups such as children, to spend excessive time or money on a service, ultimately to the commercial benefit of the business. Examples include video games with loot boxes or in-app currencies, which many stakeholders argue obscure real costs and promote further spending.

Existing rules and gaps

Again, while there is no single EU or Irish law directly regulating addictive designs in digital products, several laws address related risks. For example:

  • The Unfair Commercial Practices Directive (UCPD) prohibits misleading and aggressive commercial practices, which may apply to certain addictive features if they influence transactional decisions.
  • The DSA, particularly Article 25, prohibits online platforms from designing interfaces that materially impair users' ability to make free and informed decisions and includes specific protections for minors under Article 28 (which we explored in detail here).
  • The General Data Protection Regulation (GDPR) can also be relevant where addictive design involves unlawful processing of personal data, especially in respect of profiling or automated decision-making via recommender systems.
  • The AI Act prohibits manipulative or exploitative techniques that cause significant harm, which could potentially encompass certain addictive design features.

Despite these protections, concerns about addictive design remain. Some argue that strict bans may be needed. Others warn that too much regulation could harm the user experience and reduce revenue from features like in-game purchases, which could also slow innovation.

Feedback from consultation

The consultation responses show this tension clearly. Many respondents supported new binding rules to limit addictive design features, including switching these features off by default and letting users opt in. Support for full bans was more mixed, especially outside the context of protecting minors. This suggests the Commission may prefer targeted restrictions rather than broad prohibitions.

3. Unfair personalised advertising and pricing targeting

Understanding unfair personalisation

Personalisation often relies on profiling and automated decision-making, using personal data to tailor ads or offers to individual users. These techniques can become unfair or exploitative, particularly where users are targeted based on sensitive or vulnerable characteristics, such as financial stress, emotional difficulties or age.

Personalised pricing raises similar concerns. While it can deliver benefits like tailored recommendations or dynamic pricing, it can also disadvantage vulnerable groups, including older adults and children, and may undermine consumer trust.

Existing rules and gaps

Several EU laws already address aspects of unfair personalisation. For instance, Article 26 of the DSA prohibits online platforms from targeting users with ads based on profiling in certain circumstances. In addition, Article 22 of the GDPR restricts automated individual decision‑making, including profiling, unless specific conditions are met. The UCPD can apply where personalised tactics mislead or pressure users into purchases, and the AI Act restricts manipulative or exploitative techniques that could cause significant harm.

However, despite this framework, consumers often struggle to understand how their data is used. Profiling methods can be opaque, website designs can obscure meaningful choices and opt‑outs may be difficult to find or ineffective. These gaps have intensified concerns about transparency, fairness and the protection of vulnerable groups.

Feedback from consultation

Consultation respondents strongly supported tighter controls on unfair personalisation practices, especially where targeting exploits consumer vulnerabilities. Stakeholders expressed particular concern about personalised advertising and pricing aimed at minors and other at‑risk groups.

There was also notable support for limiting or banning practices such as “drip” pricing, misleading “starting from” prices and deceptive discounts. Overall, respondents called for clearer, more consistent rules to address the risks that current laws do not fully resolve.

4. Contract cancellation and digital subscriptions

Understanding challenges with cancellations and digital subscriptions

Contract cancellation and digital subscription management have long been recognised as problem areas for consumers. These issues featured heavily in the Commission's earlier Fitness Check, which found that users often struggle to locate or complete cancellation processes and may face obstacles when trying to end subscriptions. However, in the recent consultation, this topic attracted less attention than other elements of the DFA. Respondents were more divided, and many provided “no opinion,” suggesting it was not viewed as the most urgent area for new EU intervention.

Existing rules and gaps

Many stakeholders felt that current EU consumer laws already address most of the problems with subscription and cancellation practices. They pointed to the Consumer Rights Directive (CRD), the UCPD and the DSA's interface‑design obligations as tools regulators can already use to act against unfair or obstructive cancellation flows. From this perspective, the main challenge is inconsistent enforcement, not gaps in the law. Some respondents noted that more prescriptive rules could create operational burdens or duplicate existing obligations.

Consultation findings

Feedback from the consultation did show some support for improvements, such as clearer or simpler cancellation flows. However, there was little agreement on introducing strict or highly detailed new rules. Businesses, in particular, warned about the practical challenges of further regulation in an area that is already covered by several legal instruments.

Overall, the consultation suggests that the Commission may address cancellation-related issues indirectly, for example through broader DFA measures targeting dark patterns and manipulative interface design, rather than creating new standalone rules for subscription cancellation.

5. Misleading social media influencer marketing

Understanding misleading influencer marketing

Commercial activities on social media, particularly influencer marketing, have led to several negative consequences for consumers. Problematic practices include the use of ambiguous terms such as “collaboration” or “partnership” instead of clearly identifying posts as advertisements. Even when disclaimers are present, many consumers still interpret the content as personal recommendations rather than paid promotions.

Additional concerns involve the promotion of unhealthy foods, alcohol, vaping products, and high-risk financial services like cryptocurrency trading. In some instances, these are targeted at minors, potentially influencing their eating habits and behaviours in harmful ways.

Existing rules and gaps

Despite the joint release of updated Guidance for Influencers Advertising and Marketing by the Advertising Standards Authority (ASA) and the Competition and Consumer Protection Commission (CCPC) in 2023, transparency in this sector is still lacking. Enforcement was strengthened further through a new data‑sharing agreement between the CCPC and ASA in August 2025. However, these steps have not fully resolved ongoing issues.

Regulation under Article 26 of the DSA, as well as oversight of audio‑visual media service providers through the Audio‑Visual Media Services Directive, also remains insufficient to address persistent problems in this space. These gaps are a key reason why the DFA proposes a more focused approach to misleading social media influencer marketing.

Feedback from consultation

A majority of respondents supported stronger disclosure rules for influencer marketing, including clearer labelling of paid content and greater accountability for both brands and platforms. Many also raised concerns about content aimed at minors and the promotion of high‑risk products, reinforcing the need for more prescriptive safeguards.

What to expect next in the DFA process

The Commission's public consultation on the DFA has closed, meaning it will progress to the impact assessment phase, drawing on the consultation responses and other evidence gathered. A draft legislative proposal for the DFA is expected to follow once the impact assessment is complete, with current indications pointing to publication during 2026.

Given the strong consumer support identified in the consultation responses (particularly in relation to dark patterns, addictive design and influencer marketing), businesses operating in the digital consumer space should expect changes to their existing obligations in these areas. While the full scope of the DFA is a work in progress, the consultation results highlight where businesses may need to adjust their compliance and design practices. These insights provide a useful starting point for any readiness assessments or audits, and online businesses will need to be agile to deal with any impacts of the Commission's DFA agenda.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
