How will looming big tech regulations change the way online platforms operate?

Key Points

  • The growth of online services has created substantial wealth, but it has also given rise to new inequalities, acute consumer-protection issues and troubling concentrations of power.
  • As people become more dependent on the internet, both personally and professionally, support for regulation of online services has increased. We believe regulatory scrutiny will intensify.
  • Regulations could have substantial consequences for the evolution of the online world.

Over recent decades, the technology industry has grown rapidly, and online services have become a necessary and pervasive element of daily life. The proliferation of online services has created substantial wealth but also new inequalities, acute consumer-protection issues and troubling concentrations of power. Civil liberties have been compromised by pervasive digital surveillance. The misuse of people’s data has led to new privacy problems, manipulation of consumer behavior and discrimination against consumers. Further adverse consequences include radicalization and misinformation, exploitation of small businesses, abuse of market power, flawed algorithms and stifled innovation.

In our view, many of these challenges are avoidable, and online services’ negative effects on the economy, civil rights, privacy and consumer protection are not a required ‘cost’ of a thriving online culture, economy and innovation. In reality, these harmful effects are often the product of business decisions, market failures, regulatory flaws and enforcement oversights that together have created an atmosphere in which destructive and exploitative practices among online companies are considered industry standard. As people become more dependent on the internet, both personally and professionally, support for government regulation of online services has increased, and we believe regulatory scrutiny will intensify. Concerns about antitrust, data usage, customer privacy and content moderation, among others, are likely to be addressed by policymakers. Legislators are also proposing industry guardrails around misinformation and synthetic content such as deepfakes.

Currently, there are five existing or proposed regulations across the US, the UK and the European Union (EU). Each jurisdiction has one piece of legislation that seeks to regulate data privacy and content moderation, while the US and the EU each also have an antitrust measure. Thus far, the EU is leading the regulatory charge to rein in big tech, with UK and US legislative proposals still pending.

Data Privacy and Content Moderation Regulations

American Data Privacy and Protection Act (ADPPA)

The American Data Privacy and Protection Act is a federal online privacy bill put forward in July 2022. While the proposed legislation passed the House Energy and Commerce Committee with near unanimity, the ADPPA has not been enacted. The Act seeks to set rules for all stages of data collection and usage, including the types of data that can be collected, allowable collection methods, uses of collected data, and data protection.

The ADPPA takes a data-minimization approach, under which user data could be collected only for one of 17 permitted purposes, including authenticating users, preventing fraud and completing transactions. All other data collection is prohibited. The bill states plainly that organizations cannot collect more data than they reasonably need, and it provides a list of acceptable reasons a company might require data. This approach stands in contrast to consent-based regulations, which can lead to an endless stream of privacy pop-ups.
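For illustration, the allowlist logic at the heart of the data-minimization model can be sketched in a few lines of Python. This is a hypothetical sketch only; the purpose names below are illustrative stand-ins, not the bill’s official list of 17 permitted purposes.

```python
# Hypothetical sketch of an ADPPA-style data-minimization gate.
# The purposes below are illustrative examples, not the bill's
# official list of 17 permitted purposes.
PERMITTED_PURPOSES = {
    "authenticate_user",
    "prevent_fraud",
    "complete_transaction",
}

def may_collect(purpose: str) -> bool:
    """Allow data collection only for a permitted purpose.

    Under a data-minimization regime the default answer is 'no':
    anything outside the allowlist is prohibited, with no consent
    pop-up available to override the rule.
    """
    return purpose in PERMITTED_PURPOSES

# Collecting data to authenticate a login would be permitted;
# collecting it to build an advertising profile would not.
assert may_collect("authenticate_user")
assert not may_collect("ad_profiling")
```

The key design point is the default: under consent-based regimes, collection is allowed unless the user objects, whereas here collection is prohibited unless it matches a permitted purpose.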

Additionally, the bill offers significant protection against targeted advertising. Under the ADPPA, targeted advertising using sensitive data, such as health information or the personal data of minors, is banned. Furthermore, targeted advertising is permitted only with data collected and processed in accordance with the ADPPA, thereby excluding data collected outside the data-minimization principle. Targeted advertising using data related to race, color, religion, national origin, gender, sexual orientation or disability is prohibited. While first-party advertisement targeting is allowed under the act, third-party data collection and processing is restricted, and data brokers must comply with Federal Trade Commission auditing standards.

Consumers can also opt out of third-party data transfers. While some regulations only allow consumers to restrict the sale of their data to third parties, the ADPPA lets them restrict all transfers of their data. Furthermore, third-party data brokers would be required to register on both a public database and a “do not collect” registry similar to the “do not call” list used for avoiding robocalls. Individuals could request that all registered entities refrain from collecting their data.

The bill gives consumers various rights over covered data, including the right to access, correct and delete their data held by a particular covered entity.

Data-security measures are also a large focus of the proposed regulations. Under the bill, organizations would have to implement a privacy-by-design principle, designing products and services that use as little data as possible while protecting consumer privacy. Additionally, the ADPPA would require organizations to disclose the types of data collected, how the data is used, how long it is retained, and whether it is accessible to the People’s Republic of China, Russia, Iran or North Korea. Most businesses would have to submit a biennial privacy impact assessment, while large data holders would be required to perform annual privacy impact assessments on their algorithms and provide evidence of strict internal data-processing controls.

While the ADPPA does not itself prescribe penalties for violations, it refers to the Federal Trade Commission Act, under which penalties range between $40,000 and $50,000. Individuals can bring civil actions, including class actions, against any entity that has violated their rights under the law. However, individuals must first give the offending company notice and an opportunity to cure: the company then has 45 days to remedy the alleged violation, and if it does so, the courts can dismiss the civil action.

UK Online Safety Bill

The UK Online Safety Bill is a proposed Act that seeks to protect users from illegal material, including content related to child abuse, terrorism and self-harm. It offers a ‘triple shield’ of protection: platforms must remove illegal content, take down material that violates their own terms and conditions, and give users tools to filter out unwanted content. The Online Safety Bill was originally published as a draft in May 2021 but has not yet been approved.

The regulation would affect not only social-media giants, but also a wide range of internet services—from dating apps and search engines to online marketplaces as well as consumer cloud storage and even video games—that allow relevant user interaction. More simply, it covers any company in the world that hosts user-generated content and is accessible by UK residents.

Currently, if a user posts illegal or harmful content online, the internet platform has a liability shield: the platform does not become liable until it is made aware of the harmful content, at which point it must act to remove it. Under this bill, however, companies would be required to monitor actively for illegal content and remove it immediately. Additionally, search engines and other platforms that host third-party, user-generated content would be required to protect users from fraudulent advertising. Finally, all companies would be required to provide mechanisms that allow users to report harmful content or activity easily.

The regulation proposes a two-tier categorization of online service companies. Top-tier companies would be those with the largest online presences and high-risk features; the remainder would be considered second tier. Tier-one companies would be required to assess the risk of their platforms causing significant physical or psychological harm through illegal content. Additionally, these companies would be legally required to publish transparency reports illustrating their plans to tackle online harms. They would also need to define acceptable ‘legal but harmful’ content in their terms and conditions and enforce this transparently and consistently.

Importantly, the bill details content moderation for children. Many social-media sites do not permit children under 13 to join, while accounts for children under 18 may be allowed but with limited functionality. The bill would require companies to detail their age-limit enforcement plans. It seeks to protect children by requiring organizations to quickly remove illegal content, content promoting self-harm, and other harmful and age-inappropriate content (even if it is legal).

However, skeptics of the regulation warn that the bill would ultimately limit free speech and encroach on personal freedoms by disabling end-to-end encryption. They also question whether content that is legal but harmful should be removed at all.

Under the terms of the bill, the regulator could fine offending companies up to 10% of their global revenues. The UK’s Office of Communications (Ofcom) would also have the power to block non-compliant services from being accessed in the UK. In addition, a recent amendment would impose direct liability on platform executives, introducing prison sentences of up to two years for repeated failures to comply with Ofcom instructions.

EU Digital Services Act (DSA)

The EU is well ahead of the US and the UK on big-tech regulation, having created a global benchmark for content moderation and targeted advertising on social-media and technology platforms. The DSA aims to create a digital space in which the fundamental rights of all digital-service users are protected. The law requires internet giants to remove flagged illegal content quickly and limits personalized targeted-advertising practices. The DSA was approved in October 2022, and compliance will be phased in for affected service providers.

Though similar to the UK’s Online Safety Bill, the DSA covers a much broader sweep of the online world. Large online platforms (those with more than 45 million active users per month) will be monitored more closely, and these businesses will have to conduct an annual assessment to identify systemic risks associated with their services. They will also be required to publish reports detailing their efforts to curb societal risks, including restrictions on free speech, public-health misinformation and risks to electoral processes.

Additionally, these large platforms must publish their terms and conditions for user-generated content, which will fall under both European and national laws. Platforms will have to implement notice-and-action mechanisms and internal complaint-handling systems for their decisions to remove or disable access to illegal information. Finally, platforms are expected to boost their efforts fighting disinformation campaigns, cyber violence against women, and falsehoods on health, including Covid misinformation, among other harmful content.

The DSA grants authorities permission to access collected data, upon request, for the purposes of monitoring and assessing compliance with the act. Additionally, platforms must explain the design, logic, functioning and testing of their algorithmic systems. The DSA also provides a crisis mechanism that enables the European Commission to intervene during threats to public security or health.

To avoid lengthy terms and conditions, the bill requires providers to publish a concise and easily accessible summary of their terms and conditions in a machine-readable format and to inform users of any significant changes. The law also sets parameters on sensitive data, which cannot be used to profile users or serve as the basis of targeted advertising. In addition, under the DSA, children’s data cannot be used for profiling or targeted advertising at all.
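The DSA does not prescribe a particular schema for these summaries, but a machine-readable version might look something like the following sketch; the field names and values are assumptions for illustration only.

```python
import json

# Hypothetical machine-readable terms-and-conditions summary.
# The DSA requires a concise, machine-readable summary but does not
# mandate this schema; all field names here are illustrative.
tos_summary = {
    "service": "ExamplePlatform",      # illustrative provider name
    "last_updated": "2023-01-01",      # illustrative date
    "content_moderation": "Illegal content is removed upon notice.",
    "targeted_advertising": {
        "uses_sensitive_data": False,  # prohibited under the DSA
        "profiles_minors": False,      # prohibited under the DSA
    },
}

print(json.dumps(tos_summary, indent=2))
```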

The DSA imposes strict penalties for violations of its regulations, fining offending platforms up to 6% of their annual global revenues.

Data Privacy and Content Moderation Regulations – Implications for Online Services

Data privacy and content moderation laws are expected to have far-reaching consequences for online services, especially companies with large user populations. To comply with the new rules, companies will most likely need to increase their operating expenditure, with spending allocated to content moderation, including hiring additional employees and adopting new software systems. Technology companies that offer content-moderation services would benefit from this new obligation, along with artificial-intelligence platforms that moderate content. Additionally, age-detection and verification processes would require investment to develop and deploy.

While these pieces of legislation are expected to have a large impact in the UK and Europe, the ADPPA’s ‘opt-out’ mechanism should create comparatively little friction for companies, as users would need to update their settings to make changes, which happens only on a limited basis. The companies likely to feel the greatest pain from these laws are third-party data brokers and large companies that rely on third-party data.

The effectiveness of targeted advertising is likely to decrease, which could affect the pricing power of technology companies, the overall growth in digital spend, and traffic-acquisition costs.

Finally, a lack of controversial content could dampen user engagement.  

Antitrust Regulations

EU Digital Markets Act (DMA)

The EU Digital Markets Act seeks to establish a level playing field for online services, fostering innovation, growth and competitiveness across the European market. It sets clear rules to ensure that no single large online platform, known as a gatekeeper, dictates the rules of the online world. The regulation came into force in November 2022 and applies in a phased manner.

The DMA defines gatekeepers as core platform service providers with annual revenue within the EU of at least €7.5 billion in each of the past three years, or with a market valuation of at least €75 billion. In addition, designated gatekeepers must have at least 45 million monthly end users in the EU and at least 10,000 EU business users.
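Expressed as code, the designation test reads as a conjunction of a size criterion and a reach criterion. The sketch below assumes the thresholds as summarized above; the regulation’s full legal test contains additional nuances, such as the multi-year look-back, that are omitted here.

```python
# Rough sketch of the DMA gatekeeper thresholds summarized above;
# the regulation's full legal test has additional nuances.
def is_gatekeeper(eu_revenue_eur: float,
                  market_cap_eur: float,
                  monthly_end_users: int,
                  eu_business_users: int) -> bool:
    meets_size = eu_revenue_eur >= 7.5e9 or market_cap_eur >= 75e9
    meets_reach = (monthly_end_users >= 45_000_000
                   and eu_business_users >= 10_000)
    return meets_size and meets_reach

# Example: a platform with an €80bn valuation, 50m monthly end users
# and 12,000 EU business users would meet the designation criteria,
# even with EU revenue below the €7.5bn threshold.
assert is_gatekeeper(5e9, 80e9, 50_000_000, 12_000)
```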

Gatekeepers are required to ensure effective interoperability of operating systems, hardware and software. This will force gatekeepers to allow electronic-device users to uninstall pre-installed software and install third-party equivalents, including payment systems. Additionally, smaller platforms can request that gatekeepers allow their users to exchange messages and files, or make video calls, across messaging applications. This provision should expand consumer choice and help to counter network effects.

When displaying goods and services to consumers, gatekeepers must apply transparent, fair and non-discriminatory ranking conditions. They are no longer allowed to treat their own goods and services more favorably than those of third parties that are using the gatekeeper platform to connect with consumers.

The DMA also requires large gatekeepers to provide pricing and performance transparency to business users that advertise on their platforms. Users must give explicit consent before a gatekeeper can use their personal data for targeted advertising. The act not only requires permission to use personal data, but also limits the reuse, without prior consent, of personal data collected for one service in another service. Finally, if a user denies a gatekeeper’s request for personal data, the gatekeeper can repeat the consent request no more than once a year.

The act seeks to protect mobile-application developers by requiring gatekeepers to allow developers to use the services they prefer, such as payment systems. Gatekeepers cannot force application developers to use the gatekeepers’ own services in order to have their software appear in the gatekeeper’s application store.

The DMA also imposes penalties for violations, including fines of up to 10% of global annual revenues, rising to 20% for repeat violations. In cases of systematic infringement, the European Commission also has the authority to impose additional remedies necessary to achieve compliance. These can include structural remedies such as the forced sale of parts of the gatekeeper’s business, or a prohibition on acquiring other companies in the digital sector.
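As a back-of-the-envelope illustration of the scale of these fines, the caps can be computed directly from revenue. This sketch simply applies the percentages described above and ignores any discretion the Commission has in setting actual fine levels.

```python
def max_dma_fine(global_revenue_eur: float, repeat: bool = False) -> float:
    """Upper bound on a DMA fine: 10% of global annual revenue,
    rising to 20% for repeat violations (per the caps described above)."""
    rate = 0.20 if repeat else 0.10
    return global_revenue_eur * rate

# A gatekeeper with €100bn in global annual revenue would face a cap
# of roughly €10bn for a first violation and €20bn if repeated.
print(max_dma_fine(100e9))               # ~1e10  (€10bn)
print(max_dma_fine(100e9, repeat=True))  # ~2e10  (€20bn)
```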

American Innovation and Choice Online Act (AICO)

The AICO Act is a proposed US antitrust bill that focuses on large-scale platforms with a market capitalization of at least $600 billion over the last 12 months, or with at least 1 billion global users or 50 million active US users. While the Senate Committee on the Judiciary voted to advance the legislation in January 2022, it has not yet been enacted. Similar to the EU’s DMA, the AICO Act seeks to prevent platforms from favoring their own products or services, especially in instances where competition is materially harmed.

To stop large online platforms from favoring their own services, the AICO Act prohibits self-preferencing algorithms and any kind of discrimination against businesses listing their products on these platforms. Additionally, platforms cannot hamper third-party competitors that compete directly with the platform’s own product offerings.

The bill also prohibits platforms from making access or preferential status conditional on a user’s purchase or use of other products offered by the platform operator that are not unique to the platform.

The bill seeks to ensure that rivals have the same access to the platform, and to its software, hardware and operating system, as the platform’s own services, unless this would create a significant cybersecurity risk. Companies also cannot prevent users from uninstalling software or changing default settings.

Finally, the legislation would make it unlawful for platforms to use non-public data generated on the platform to develop their own products or services. The bill seeks to ensure that online businesses can access data and contact their users on large online platforms, and it would prevent platforms from using that data to compete unfairly or to promote their own brands.

Antitrust Regulations – Implications for Online Services

In the spirit of antitrust regulation, these legislative measures strive to restrict e-commerce and search platforms from manipulating their rankings, search results, review systems and design to favor their own products over competitors’ products.

While such laws create protections for smaller online-service companies, they may create risks for larger online-service platforms. For example, a structural change to the integrated advertising-technology stack could lower operational efficiency in targeting and serving advertisements, thereby threatening a gatekeeper’s strong market position. Additionally, financial implications include higher regulatory costs, slower advertising growth, higher traffic-acquisition costs, and the need for more research and development as targeted-advertising capabilities weaken.

However, these laws are likely to have many positive aspects for users. The current mobile smartphone duopoly imposes high costs on mobile applications that wish to be included in mobile application stores, and these fees can be passed on to consumers as higher prices. Reintroducing competition in application stores, removing competitive pricing restrictions and giving developers more selling options could generate more competitive markets for digital applications, ultimately producing lower costs for the services used daily. Finally, users who value their privacy would be able to remove insecure mobile applications and choose more privacy-sensitive alternatives.

Conclusion

In our view, regulations could have substantial consequences for the evolution of the online world. Regulations governing data privacy and content moderation will have the greatest influence on the business plans of online platforms that host user-generated content and rely heavily on advertising for revenue. Furthermore, we believe that antitrust laws will be essential to safeguard consumers from unfair commercial practices, and to ensure fair competition in a digital market dominated by a handful of companies.

Author

Onkar Jagtap

Responsible investment analyst*

PAST PERFORMANCE IS NOT NECESSARILY INDICATIVE OF FUTURE RESULTS. Any reference to a specific security, country or sector should not be construed as a recommendation to buy or sell this security, country or sector. Please note that strategy holdings and positioning are subject to change without notice. For additional Important Information, click on the link below.

Important information

For Institutional Clients Only. Issued by Newton Investment Management North America LLC ("NIMNA" or the "Firm"). NIMNA is a registered investment adviser with the US Securities and Exchange Commission ("SEC") and subsidiary of The Bank of New York Mellon Corporation ("BNY Mellon"). The Firm was established in 2021, comprised of equity and multi-asset teams from an affiliate, Mellon Investments Corporation. The Firm is part of the group of affiliated companies that individually or collectively provide investment advisory services under the brand "Newton" or "Newton Investment Management". Newton currently includes NIMNA and Newton Investment Management Ltd ("NIM") and Newton Investment Management Japan Limited ("NIMJ").

Material in this publication is for general information only. The opinions expressed in this document are those of Newton and should not be construed as investment advice or recommendations for any purchase or sale of any specific security or commodity. Certain information contained herein is based on outside sources believed to be reliable, but its accuracy is not guaranteed.

Statements are current as of the date of the material only. Any forward-looking statements speak only as of the date they are made, and are subject to numerous assumptions, risks, and uncertainties, which change over time. Actual results could differ materially from those anticipated in forward-looking statements. No investment strategy or risk management technique can guarantee returns or eliminate risk in any market environment and past performance is no indication of future performance.

Information about the indices shown here is provided to allow for comparison of the performance of the strategy to that of certain well-known and widely recognized indices. There is no representation that such index is an appropriate benchmark for such comparison.

This material (or any portion thereof) may not be copied or distributed without Newton’s prior written approval.

In Canada, NIMNA is availing itself of the International Adviser Exemption (IAE) in the following Provinces: Alberta, British Columbia, Manitoba and Ontario and the foreign commodity trading advisor exemption in Ontario. The IAE is in compliance with National Instrument 31-103, Registration Requirements, Exemptions and Ongoing Registrant Obligations.
