26 Aug 2023
CNN: The world’s largest tech companies must comply with a sweeping new European law starting yesterday that affects everything from social media moderation to targeted advertising and counterfeit goods in e-commerce — with possible ripple effects for the rest of the world.
The unprecedented EU measures for online platforms will apply to companies including Amazon, Apple, Google, Meta, Microsoft, Snapchat and TikTok, among many others, reflecting one of the most comprehensive and ambitious efforts by policymakers anywhere to regulate tech giants through legislation. It could lead to fines for some companies and to changes in software affecting consumers.
The rules seek to address some of the most serious concerns that critics of large tech platforms have raised in recent years, including the spread of misinformation and disinformation; possible harms to mental health, particularly for young people; rabbit holes of algorithmically recommended content; a lack of transparency; and the spread of illegal or fake products on virtual marketplaces.
Although the European Union's Digital Services Act (DSA) passed last year, companies have had until now to prepare for its enforcement.
Yesterday marked the arrival of a key compliance deadline — after which tech platforms with more than 45 million EU users will have to meet the obligations laid out in the law.
The EU also says the law intends “to establish a level playing field to foster innovation, growth and competitiveness both in the European Single Market and globally.” The action reinforces Europe’s position as a leader in checking the power of large US tech companies.
For all platforms, not just the largest ones, the DSA bans data-driven targeted advertising aimed at children, as well as targeted ads to all internet users based on protected characteristics such as political affiliation, sexual orientation and ethnicity. The restrictions apply to all kinds of online ads, including commercial advertising, political advertising and issue advertising. (Some platforms had already in recent years rolled out restrictions on targeted advertising based on protected characteristics.)
The law bans so-called “dark patterns,” or the use of subtle design cues that may be intended to nudge consumers toward giving up their personal data or making other decisions that a company might prefer. An example of a dark pattern commonly cited by consumer groups is when a company tries to persuade a user to opt into tracking by highlighting an acceptance button with bright colours, while simultaneously downplaying the option to opt out by minimising that choice’s font size or placement.
The law also requires all online platforms to offer ways for users to report illegal content and products and for them to appeal content moderation decisions. And it requires companies to spell out their terms of service in an accessible manner.
For the largest platforms, the law goes further. Companies designated as Very Large Online Platforms or Very Large Online Search Engines will be required to undertake independent risk assessments focused on, for example, how bad actors might try to manipulate their platforms, or use them to interfere with elections or to violate human rights — and companies must act to mitigate those risks. And they will have to set up repositories of the ads they’ve run and allow the public to inspect them.