US to Require Watermarking of ‘Synthetic Content’ Created for Government

US Executive Order on AI

The President of the United States, Joe Biden, has issued the US’s first Executive Order on artificial intelligence. The Executive Order, which does not require any action by Congress or state legislatures to take effect, outlines new standards for AI safety and security, including measures for AI-generated content. It follows the European Union’s AI Act in the broader push to legislate the use of artificial intelligence.

The Executive Order is broad in scope, covering a wide range of AI applications. In a section that will have implications for the language industry, the US Administration highlighted the need to establish standards and best practices for identifying and labeling synthetic content, as well as for establishing the authenticity and provenance of digital content produced by the Federal Government or on its behalf.

In the Executive Order’s opening statement, Biden said that “my Administration will help develop effective labeling and content provenance mechanisms, so that Americans are able to determine when content is generated using AI and when it is not. These actions will provide a vital foundation for an approach that addresses AI’s risks without unduly reducing its benefits.”

The Executive Order recognizes the wide range of content types that will be impacted by these measures, defining synthetic content as “images, videos, audio clips, and text, that has been significantly modified or generated by algorithms, including by AI.”

In practical terms, US government agencies that produce digital content will need to embed watermarks in media and text that record the provenance and authenticity of the original content.
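The Executive Order does not yet specify how provenance should be recorded; the forthcoming Department of Commerce guidance will. As a purely illustrative sketch, one common building block for provenance systems is a cryptographic hash of the content bound to a metadata record, so that any later alteration can be detected. All names, fields, and the schema below are assumptions for illustration only, not anything the Executive Order prescribes.

```python
import hashlib
import json


def provenance_record(content: bytes, producer: str, generator: str) -> dict:
    """Build a minimal, hypothetical provenance record for a piece of content.

    The fields (producer, generator, sha256) are illustrative; a real
    scheme would follow whatever metadata standard the official
    guidance eventually adopts.
    """
    return {
        "producer": producer,    # e.g. the issuing agency
        "generator": generator,  # e.g. the AI system used, if any
        "sha256": hashlib.sha256(content).hexdigest(),
    }


def verify(content: bytes, record: dict) -> bool:
    """Check that the content still matches the hash stored in its record."""
    return hashlib.sha256(content).hexdigest() == record["sha256"]


if __name__ == "__main__":
    doc = "Public service announcement text.".encode("utf-8")
    record = provenance_record(doc, producer="Example Agency", generator="example AI model")
    print(json.dumps(record, indent=2))
    print(verify(doc, record))                 # unchanged content verifies
    print(verify(doc + b" tampered", record))  # altered content does not
```

A hash-based record like this only detects tampering; labeling content as AI-generated, and surviving format conversion or re-encoding, are separate problems that the assessment mandated by the Executive Order is meant to address.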

While no mention is made of multilingual content or translation, major institutional buyers in the United States commission a range of visual, audio, and text-based content for multiple purposes and for a multilingual audience. Applying watermarks across this range of media and in multiple languages may create additional challenges for language service providers if watermarking is to be implemented by the supplier.

The US administration will likely require foreign-language content to follow these new guidelines, and it remains to be seen how official guidance will impact AI-enabled machine translation or AI-generated audiovisual content in multiple languages.

Biden did, however, reference the breadth of areas the Executive Order will impact, citing “investments in AI-related education, training, development, research, and capacity, while simultaneously tackling novel intellectual property (IP) questions and other problems to protect inventors and creators.”

Biden went on to say that the US Federal Government will “enforce existing consumer protection laws and principles and enact appropriate safeguards against fraud, unintended bias, discrimination, infringements on privacy, and other harms from AI.”

Next Steps

Upon release of the Executive Order, the US Secretary of Commerce will have up to eight months to assess existing standards, tools, methods, and practices for authenticating content, tracking its provenance, detecting and labeling synthetic content, and testing software designed to achieve these goals.

The US Secretary of Commerce will then have six months to develop guidance based on the existing tools and practices and, once complete, a further six months to issue this guidance on labeling and authenticating synthetic content to federal agencies.

Once implemented, the Biden-Harris Administration hopes that this national guidance will ultimately set an example for the private sector and governments around the world to adopt similar initiatives.