
- Starting in 2024, Meta will require advertisers running political or issue ads on its platforms to disclose when their ads contain content that was digitally created or altered with AI.
- Advertisers will make these disclosures when they submit new ads, particularly when the ad features digitally manipulated content that could mislead viewers.
- The new rules aim to combat deepfakes and misleading digitally manipulated media, ensuring transparency in political and issue advertising.
- Disclosures are required for ads featuring photorealistic people who do not exist, realistic-looking events that never happened, and realistic depictions of events that are not genuine recordings.
- Basic digital edits such as image sharpening and cropping are exempt from the disclosure requirement, and the disclosures will be published in Meta's Ad Library.