The Impact of Data Privacy and AI Regulations in Europe for SaaS Companies

    Running ClicData, a data management, automation, and analytics platform with customers across the globe, has given me a unique perspective on data privacy and AI regulations.

    Every day, we process vast amounts of data for thousands of customers, ensuring their information is managed securely and efficiently. Privacy and security are not just compliance checkboxes for us — they are fundamental to the trust our customers place in us.

    As AI continues to evolve and integrate into data management, analytics, and decision-making, regulatory landscapes are shifting to keep pace. Governments and industry bodies worldwide are rolling out new privacy frameworks, imposing stricter compliance measures, and redefining what responsible AI usage looks like. Balancing innovation with regulatory compliance is becoming more complex, and businesses like ours must stay ahead of these changes to safeguard both our customers and our operations.

    In this article, I’ll share my thoughts on data privacy and AI regulations, specifically in Europe, which seems to be pioneering these rules more than other regions, or at least in a stricter way. I’ll touch on how they impact businesses, where they fall short, and how to keep your business “safe” and in line with said regulations. Whether you’re a data professional, a business leader, or simply curious about the evolving digital landscape, understanding these changes is critical to shaping a responsible and secure data-driven future.

    Size Matters

    Despite being a smaller company, we definitely punch above our weight, and I don’t just mean against our competitors: I would wager that the compliance policies we have put in place are adhered to better than at some of the larger companies. This is, of course, a huge advantage. A smaller company is easier to monitor than a larger one, and we can apply new and corrective measures faster. That is a huge positive for us and good news for our customers.

    However, there is a negative side to being small related to the upkeep of the “bureaucratic” side of compliance. For our ISO 27001 compliance alone, we have over 70 policy and procedure documents to review throughout the year, and we produce at least as many evidence documents. Add SOC 2, GDPR, HIPAA, and a few others, and we are running into hundreds of documents and processes that are audited every year or every other year.

    For a small company with a global product, hiring the staff required to maintain, monitor, and improve any kind of compliance or regulatory program is simply not feasible, unless you pay third-party companies and consultants, which comes with its own challenges and benefits as well.

    When we started ClicData, I ran all the compliance programs myself. Even though a CEO should be independent from the DPO, the CISO, and so on, size once again did not permit us to simply hire staff and train them on our platform solely for compliance purposes, an activity that still does not require someone full time.

    So, my first points of focus were ISO 27001 and SOC 2. Both are geared towards customers on both sides of the Atlantic, and each looks at compliance from a different angle: one more technical, the other more focused on company structure and finance.

    With GDPR compliance under way, we then looked at what we needed to do to be compliant, ensuring we were covered and had all the processes defined and documented.

    Your Data. Our Data. Their Data.

    A second challenge that we faced was the grouping and classification of the data. Each policy and regulation applies to specific data, such as private data. We offer a platform that allows our customers to store data, which we in turn rely on Microsoft Azure to physically store, yet we do not access or know that data, so we face challenges simply in defining what data we are documenting.

    Not knowing whether the data contains personal or health information, we are unable to act on Data Processing Agreements or Business Associate Agreements immediately, and we rely on our customers to ensure they follow the applicable regulations and have the necessary permissions for the data they are using.

    The first years of GDPR were confusing, to say the least, and it seems we are still discussing regulations that are only partially implemented to this date.

    Just as regulations were stabilizing, AI emerged as a new challenge—learning from, generating, and exponentially increasing data volumes.

    And the regulatory bodies are catching up as well, specifically in Europe.

    The EU Data Privacy Landscape

    Whereas the US and many other countries show a trend of convergence in principles (transparency, user control, security) but divergence in implementation, Europe is attempting to boil the ocean and converge both principles and implementation, at least across all EU member states.

    Mainly, the EU is continuing its efforts in:

    • Expansion of GDPR to address AI and data marketplaces (e.g., the EU AI Act and the EU Data Act).
    • Attempts to regulate electronic communication through the ePrivacy Regulation.
    • Harmonization efforts for global data transfers.

    1. The EU ePrivacy Regulation

    The rule known as the ePrivacy Regulation, which dates from 2017 and has been in limbo since 2020, sought to create a level playing field between Big Tech and telecom operators.

    Basically, this rule attempts to protect the confidentiality of electronic communications and regulate the use of cookies and similar tracking technologies. It complements the General Data Protection Regulation (GDPR) by specifically addressing issues related to online privacy, including direct marketing, electronic communication security, and tracking technologies like cookies, fingerprinting, and behavioral advertising. Unlike GDPR, which focuses on the processing of personal data, ePrivacy applies to all electronic communications data, regardless of whether it contains personal data.

    The main focus of this policy is cookies and tracking people across sites and devices, but it reaches WhatsApp and Skype as well by requiring them to operate as traditional telecommunications operators, which follow much stricter rules than VoIP operators.

    2. The EU AI Act

    The European Union’s Artificial Intelligence Act (AI Act) is a landmark regulation establishing a comprehensive legal framework for artificial intelligence (AI) within the EU. Adopted on May 21, 2024, with its first provisions applying from February 2025, this act can be summarized as follows:

    1. Risk-Based Classification: The AI Act categorizes AI systems based on the level of risk they pose:
      • Unacceptable Risk: AI applications deemed to pose unacceptable risks, such as government-run social scoring systems, are prohibited.
      • High Risk: Applications like CV-scanning tools that rank job applicants are considered high-risk and are subject to strict legal requirements, including robust risk management, data governance, and human oversight.
      • Limited and Minimal Risk: Other AI applications fall into these categories and are subject to lighter regulations or merely transparency obligations.
    2. Scope of Application: The regulation applies to all sectors (excluding military) and all types of AI, affecting providers, deployers, importers, distributors, and product manufacturers with links to the EU market.
    3. Innovation and Testing Support: To foster AI innovation, the AI Act requires national authorities to provide testing environments that simulate real-world conditions. This initiative aims to support companies, especially small and medium-sized enterprises (SMEs), in developing and testing general-purpose AI models before public deployment.

    In the same way that GDPR is country-agnostic as long as it relates to EU citizens’ data, the AI Act also crosses borders: it applies regardless of where the technology originates, where the company is located, or which training data is used.

    Most of the obligations fall upon the developers of high-risk AI systems, and for most use cases the risk is minimal. Nonetheless, this act has been in the spotlight in recent events and has created friction between the industry, the regulatory bodies, and consumers.

    From the industry side, the same argument comes up time and time again: if you regulate it, you will stifle innovation. The usual players, Facebook and Google, are typically the first ones to cry out, but now we also have OpenAI boss Sam Altman arguing that “European regulators, who are working on the application of the AI Act, must think about the consequences of their decisions on tomorrow’s opportunities, especially at a time when the rest of the world is advancing,” as per his letter to the French newspaper Le Monde on February 8th, 2025.

    Mario Draghi, the former president of the European Central Bank, claims there is an “innovation gap” when comparing Europe with the U.S. and China, due to regulations.

    This is history repeating itself: the EU was criticized by the top tech giants (Facebook, Google, X, etc.) for enforcing the toughest security and privacy law, the infamous GDPR.

    Fast forward a year or two, and we now have GDPR-like protections in several US states and more than just a few countries, most likely as a way to have legal grounds to sue the tech giants in the first place.

    In short, the technology sector wants little to no regulation of the content available on the internet, content that was not created by them but that they use freely to promote their business and tools. The same content that writers, reporters, painters, musicians, developers, and general consumers publish through their own sites is now free for all. Except, of course, for those willing and able to fight back, such as digital music associations or artists that see the potential to sue for copyright infringement.

    Whereas Google pays licensing fees to certain newspapers and news agencies for its news feed, OpenAI and other similar platforms pay nothing.

    3. The EU Data Act

    The European Union’s Data Act is a significant legislative measure designed to regulate data access and usage across all economic sectors within the EU. Adopted on December 13, 2023, and published in the Official Journal on December 22, 2023, the regulation is set to become applicable on September 12, 2025.

    In summary, this act covers the data that GDPR does not and focuses on fair data-sharing agreements between businesses and governments.

    The key objectives of the Data Act are:

    1. Enhancing Data Accessibility: The Act aims to make more data available for use by clarifying who can access and utilize data generated in the EU, thereby fostering a competitive data market and stimulating data-driven innovation.
    2. Ensuring Fairness in Data Economy: It seeks to establish harmonized rules on fair access and use of data, ensuring that the value derived from data is equitably distributed among the actors in the data economy.
    3. Facilitating Business-to-Government Data Sharing: The regulation promotes the use of privately held data by public sector bodies, especially during public emergencies, to improve public services and inform policymaking.

    What is critical to many software and hardware manufacturers are the outcomes of these objectives:

    1. User Rights to Data: Users of connected devices, such as Internet of Things (IoT) products, are granted the right to access data generated by their use. This empowers consumers and businesses to gain insights from their own usage data.
    2. Obligations for Data Holders: Entities that control data are required to make it available to the user upon request, under fair, reasonable, and non-discriminatory terms.
    3. Unfair Contract Terms: The Act prohibits the imposition of unfair contractual terms related to data access and use, particularly in agreements between businesses of differing bargaining power.
    4. Cloud Service Portability: Providers of data processing services, including cloud and edge services, must ensure that customers can switch to other providers without facing undue obstacles, thereby preventing vendor lock-in.

    Organizations operating within the EU will need to assess their data management practices to ensure compliance with the Data Act. This includes reviewing data access policies, contractual agreements, and technical infrastructures to facilitate data sharing and portability. Businesses are encouraged to establish clear data governance frameworks that align with the Act’s provisions to avoid potential legal challenges and to capitalize on new opportunities in the EU’s data economy.

    The Data Act represents a pivotal step in the EU’s broader strategy to create a unified digital market, promoting innovation while safeguarding fairness and privacy in the rapidly evolving data landscape.

    For those (like me) who are starting to lose track of the regulations around private data, here is a nice summary:

    • Main Purpose: GDPR protects personal data and privacy rights; the Data Act regulates access to and sharing of data (including IoT and industrial data).
    • Applies To: GDPR applies to any organization processing EU residents’ personal data; the Data Act applies to any business handling IoT-generated data, cloud services, and industrial data.
    • Type of Data: GDPR covers personal data (identifiable individuals); the Data Act covers non-personal and industrial data, but also personal data if generated by IoT devices.
    • Consent Required? Under GDPR, yes: individuals must consent to data processing. Under the Data Act, no: it focuses on fair data-sharing agreements between businesses and governments.
    • Data Transfers: GDPR strongly regulates transfers, with restrictions on international transfers; the Data Act allows businesses and public authorities to request access to data under certain conditions.
    • Enforcement: GDPR is enforced by Data Protection Authorities (DPAs) and the European Data Protection Board (EDPB); the Data Act by national authorities and the European Commission.

    Where this act is going to be interesting is that any data created or generated by a person, independent of what software or hardware was used to do so, must be made available to that person. The amount of data stored by third-party apps that is not accessible today is staggering, and this means that all these manufacturers will be required to, at minimum, expose an API that their customers can use to pull their own data, irrespective of whether it is a like on a post or the temperature of their household three months ago.
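
    To make that concrete, here is a minimal sketch, not ClicData’s actual API and not a design prescribed by the Data Act itself, of the kind of self-service export endpoint such an obligation points towards. The endpoint path, the demo token, and the in-memory event store are all hypothetical placeholders.

    # Hypothetical sketch of a "pull my own data" endpoint using FastAPI.
    # Everything here (path, token, sample events) is illustrative only.
    from datetime import datetime, timezone
    from fastapi import FastAPI, Header, HTTPException

    app = FastAPI()

    # Stand-in for whatever a vendor actually stores about each user:
    # IoT readings, likes, telemetry, usage logs, and so on.
    FAKE_EVENTS = {
        "user-123": [
            {"timestamp": "2025-01-05T08:00:00Z", "type": "thermostat_reading", "value": 19.5},
            {"timestamp": "2025-01-06T12:30:00Z", "type": "post_like", "value": "post-42"},
        ]
    }

    def resolve_user(token: str) -> str:
        # Placeholder authentication: map a bearer token to a user id.
        if token == "Bearer demo-token":
            return "user-123"
        raise HTTPException(status_code=401, detail="invalid token")

    @app.get("/v1/my-data/export")
    def export_my_data(authorization: str = Header(...)):
        # Return everything held about the authenticated user in a
        # machine-readable format, in the spirit of the Data Act's access right.
        user_id = resolve_user(authorization)
        return {
            "user_id": user_id,
            "generated_at": datetime.now(timezone.utc).isoformat(),
            "events": FAKE_EVENTS.get(user_id, []),
        }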

    How to stay “safe”

    I am not sure that we can stay safe from the regulations, as they continue to evolve as much as the technology that requires those same regulations to exist. Nonetheless, here are some top-of-mind points for your organization:

    • Know Your Data: Inventory what personal data you collect, where it’s stored, and which laws apply (e.g., EU customer data -> GDPR, California residents -> CCPA, etc.).
    • Embrace Privacy by Design: Build privacy and security into products and processes from the start. For example, minimize data collection to what you truly need, and integrate consent or opt-out mechanisms seamlessly for users in different regions.
    • Governance: Establish a privacy governance framework – possibly led by a Data Protection Officer or Chief Privacy Officer – to monitor regulatory changes and maintain compliance. This includes training employees on data handling and updating policies as laws change.
    • Technical Measures: Invest in cybersecurity and consider technical tools for compliance (like consent management platforms for websites, encryption for data at rest and in transit, and data loss prevention tools). Strong security not only prevents breaches but is a legal requirement (and a mitigating factor if something does go wrong).
    • Documentation and Transparency: Be prepared to demonstrate compliance. Maintain records of consent, processing activities (GDPR’s Article 30 records), risk assessments, and third-party data processing agreements. Also, be transparent with users/customers – clear privacy notices help satisfy legal requirements and build trust. A minimal sketch of such a processing-activities record follows this list.
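
    As an illustration of the documentation point above, here is a sketch of an Article 30-style record of processing activities kept as structured data so it can be versioned and exported for an audit. The ProcessingActivity structure and every field value are hypothetical examples, not ClicData’s actual register.

    # Minimal sketch of a GDPR Article 30-style record of processing activities,
    # kept as plain structured data so it can be versioned and exported for audits.
    # All field values below are hypothetical examples.
    import json
    from dataclasses import dataclass, asdict, field

    @dataclass
    class ProcessingActivity:
        name: str                    # what the processing is
        purpose: str                 # why the data is processed
        data_categories: list        # e.g. contact details, usage metrics
        data_subjects: list          # e.g. customers, employees
        recipients: list             # processors / third parties involved
        retention: str               # how long the data is kept
        security_measures: list = field(default_factory=list)

    register = [
        ProcessingActivity(
            name="Customer account management",
            purpose="Provide and support the analytics platform",
            data_categories=["name", "business email", "login history"],
            data_subjects=["customers"],
            recipients=["Microsoft Azure (hosting)"],
            retention="Duration of contract + 3 years",
            security_measures=["encryption at rest", "role-based access control"],
        ),
    ]

    # Export the register for an auditor or a customer due-diligence request.
    print(json.dumps([asdict(a) for a in register], indent=2))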

    In Summary

    Here are some things you should take away from this article:

    • Innovation in tech directly generates the need for regulations, which in turn also foster more innovation in tech.
    • Regulations are always evolving, and many are too complex to read, let alone understand, so much so that small and medium companies hardly stand a chance of being 100% sure they are in line with them.
    • Tech and government will always be battling it out to find an agreed middle ground, and the consumer is most likely the one who loses, one way or another.

    Having said the above, you should take these regulations as guidance on how to improve the security and safety of your business, your customers, and your team. They can be used for good, improving the security and safety of everyone you deal with. You just need to identify what is critical and what is a “big company” problem, because many of the rules exist because the big companies abused the privacy and trust of consumers in the first place.