
Bing EU Digital Services Act Transparency Report

This report is prepared semi-annually, beginning in October 2023, pursuant to the requirements of the EU Digital Services Act (DSA).

In accordance with the requirements of Regulation (EU) 2022/2065 (the EU Digital Services Act or DSA) for Very Large Online Search Engines, Bing provides the following report on content moderation activities engaged in during the period of January 1 – June 30, 2024.

DSA Article 15(1)(a): Government Orders from Member States

  • During the relevant period, Bing received zero orders from EU Member States’ authorities to act against illegal content provided by recipients of the service.[1]
  • During the relevant period, Bing received zero orders from EU Member States’ authorities requesting specific information about individual recipients of the service.

[1] “Recipients of the service” does not include the owners of websites indexed by the online search engine. DSA Recital 77.  Please see Microsoft's CSR Reports Hub for additional information on content removals regarding indexed website content.

DSA Article 15(1)(c): Own-Initiative Content Moderation

  • This section describes activities Bing undertakes to detect and address illegal content or information in violation of Bing’s terms and conditions that is provided by recipients of the service, and the activities Bing’s generative artificial intelligence (AI) features[2] undertake to address recipient accounts in violation of their terms of use and code of conduct.[3]

    Use of AI-based classifiers on search prompts. Traditional web search begins with a user search query, namely the input (text, voice, or image) a user sends to Bing from the search bar. Similarly, during the relevant period, to initiate a search in Bing’s new AI-enhanced search experiences, a user submits text, voice, images, and/or other enabled queries as input to the AI model; this input is known as a “prompt”. The model then performs the relevant searches of the Bing service and generates a response (or, in the case of Image Creator in Bing, generates an image). Bing uses classifiers (machine learning models that help sort data into labeled classes or categories) and content filters on user search queries and prompts to help mitigate harm or prevent misuse. Examples include requests for information that could potentially lead to users being unexpectedly exposed to self-harm, violence, graphic content, hateful content, or misleading information. Flags from these classifiers may lead to mitigations, such as not returning generated content to the user, diverting the user to a different topic, or redirecting the user from AI-enhanced search to traditional web search. Bing tracks accuracy metrics for these measures, such as precision, recall, error rate, under-blocking, and over-blocking, to help monitor the interventions’ effectiveness and help ensure Bing does not unduly limit free access to information. Accuracy metrics are overseen by human reviewers and based on Bing content policies.
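
The accuracy metrics named above (precision, recall, error rate, under-blocking, over-blocking) can be illustrated with a minimal sketch for a binary "block / allow" classifier. This is a generic textbook formulation, not Bing's internal implementation; the function and field names are hypothetical.

```python
# Illustrative sketch only: generic accuracy metrics for a block/allow
# classifier. Names and data shapes are hypothetical, not Bing's schema.

def classifier_metrics(decisions):
    """decisions: list of (predicted_block, actually_violating) booleans,
    e.g. from human review of a sample of classifier outcomes."""
    tp = sum(1 for p, a in decisions if p and a)          # correctly blocked
    fp = sum(1 for p, a in decisions if p and not a)      # benign content blocked
    fn = sum(1 for p, a in decisions if not p and a)      # violating content missed
    total = len(decisions)
    return {
        "precision":      tp / (tp + fp) if tp + fp else 0.0,
        "recall":         tp / (tp + fn) if tp + fn else 0.0,
        "error_rate":     (fp + fn) / total if total else 0.0,
        "over_blocking":  fp / total if total else 0.0,
        "under_blocking": fn / total if total else 0.0,
    }

# Example: 10 human-reviewed decisions
sample = ([(True, True)] * 4 + [(True, False)] * 1 +
          [(False, True)] * 2 + [(False, False)] * 3)
print(classifier_metrics(sample))
```

Over-blocking (false positives) is what "unduly limit[ing] free access to information" refers to, while under-blocking (false negatives) measures missed harmful content; tracking both captures the trade-off a single accuracy number would hide.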

    Automated content detection – Bing Visual Search. Bing’s Visual Search feature allows users to upload an image and search for similar images or ask questions about the image. As part of Microsoft’s longstanding commitment to preventing the spread of child sexual exploitation and abuse imagery (CSEAI), Bing uses hash matching technologies to detect matches of previously identified CSEAI. In the context of the immediate search, the use of these technologies furthers Bing’s goal to avoid inadvertently surfacing potentially harmful web content to users. More broadly, images that have been used as queries in Bing Visual Search may contribute to training Bing’s image-matching algorithms; by scanning images that users attempt to upload, Bing helps to ensure that CSEAI is not included in the Visual Search training data. Please see below for additional details about these processes.

    Bing’s generative AI features (powered by Microsoft Copilot and Image Creator) enforcement actions. For the Copilot in Bing and Image Creator in Bing features, Bing took action on recipient accounts where violations of the Copilot Terms of Use[4] were found, either temporarily or permanently limiting their access to that of an unauthenticated user,[5] which has the same functionalities but a limited number of conversation turns. Image Creator in Bing (which includes Image Creator services in Copilot) took action on recipient accounts where violations of the Image Creator Terms of Use[6] were found, either temporarily or permanently suspending their account access to Image Creator.

    Training and assistance. Human reviewers receive extensive training on our policies, including the rationale behind them and how to apply them accurately and consistently. Decisions are periodically checked to ensure the policies are being applied consistently. Ongoing coaching and training are provided to review teams as legal obligations evolve, new types of harms emerge, or policies otherwise need to adapt. For high-consequence harms, like child sexual exploitation and abuse, specialized teams receive additional focused training. Microsoft provides a program to support the mental and emotional wellbeing of Microsoft employees whose work may bring them into contact with objectionable material. This program provides resources such as one-on-one counseling, monthly education sessions, on-demand small group sessions, virtual community-of-practice gatherings, and access to program manager office hours. Microsoft requires our vendors to provide wellness programs for any vendor employees working with objectionable material.

  • During the relevant period, Bing took voluntary actions to detect and block the upload of 123,835 items of suspected CSEAI content provided by recipients of the service. These items were identified through the use of automated content detection in Bing Visual Search as described above.
  • Bing does not provide capabilities for users to share content or interact with other users on the platform. As such, during the relevant period, Bing did not take measures that affected a recipient’s right to share content or interact with other users on the service due to illegal content or violations of terms and conditions. In the case of Bing’s generative AI features (which are powered by Microsoft Copilot and Image Creator in Bing), there are some scenarios where a user may be suspended or blocked from using the Copilot or Image Creator services due to violations of the Terms of Use. Note that in Copilot in Bing (previously known as Bing Chat) and in Image Creator in Bing (previously known as Image Creator from Designer) there is still no ability for users to upload, post or share content on the service. Account restrictions are not related to actions affecting other users, but instead relate to users attempting to generate content for their own use that violates relevant terms or codes of conduct.

    During the relevant period, Copilot (including Copilot in Bing features) temporarily restricted the recipient’s ability to interact with the product (thus limiting the number of turns in a conversation) in 68 instances and permanently restricted the recipient’s ability to interact with the product in 20 instances as a result of violations of the Copilot Terms of Use and Code of Conduct, thereby preventing further exploitation of Copilot safety systems. In each of these instances, the violation was due to “jailbreaks”, i.e., attempts to bypass Copilot safety systems.

    During the relevant period, Image Creator in Bing (which includes Image Creator services in other Microsoft offerings, such as Designer and Copilot) temporarily or permanently restricted recipient account access to Image Creator services in 26,318 instances as a result of violations of the Image Creator Terms of Use. In the relevant period, these actions were taken as a result of users attempting to bypass Image Creator in Bing safety systems.

  • [2] During the relevant period, Bing’s generative AI features included Microsoft Copilot (including Copilot in Bing features) and Image Creator in Bing. Please see Copilot Terms of Use and Image Creator Terms of Use for additional information.
  • [3] Bing search and generative AI features do not generally conduct “content moderation” as that term is defined in the Digital Services Act, because content is neither provided by recipients of the service nor hosted by Bing. Search queries, similar to user prompts, trigger systems that ensure the services work as intended. The outputs of these systems are not provided by a recipient of the service. Nevertheless, we have provided additional descriptions of how these systems operate.
  • [4] Please see Copilot Terms of Use for additional information.
  • [5] An unauthenticated user is defined as a user who is not logged into their Microsoft account when accessing Copilot features.
  • [6] Please see Image Creator Terms of Use for additional information.

DSA Article 15(1)(d): Appeals

  • During the relevant period:

    • Bing received zero appeals of the types of decisions described above;

    • Copilot (including Copilot in Bing features) received zero appeals of the types of decisions described above;

    • Image Creator in Bing (which includes Image Creator services in other Microsoft offerings including Designer, Copilot, etc.) received 13,284 appeals[7] of the types of decisions described above.
[7] The number of appeals for Image Creator in Bing is represented at the global level. A recipient whose account access to Image Creator in Bing features was suspended as a result of violations of the Image Creator Terms of Use may have appealed the suspension multiple times.

DSA Article 15(1)(e): Automated Content Detection

  • Automated content detection – Bing Visual Search. As described above, Bing relies on the hash matching technologies PhotoDNA and MD5 to detect matches of previously identified CSEAI in images uploaded to the Bing service or an associated website by recipients of the service using the Bing Visual Search feature. This is one element of Microsoft’s overall commitment to prevent the spread of CSEAI, as described more fully in its Digital Safety Content Report and other public announcements.

    Hash-matching technology works by using a mathematical algorithm to create a unique signature (known as a “hash”) for digital images and videos. The hashing technology then compares the hashes generated from content provided by the recipient of the service with hashes of reported (known) CSEAI, in a process called “hash matching”.

    A layered approach to detection of CSEAI is applied in this context, combining both hash-matching technology and manual review. Microsoft implements its own hash verification process in which Microsoft-trained analysts review and confirm images associated with hashes provided from non-profits and other industry partners. Microsoft also implements an additional manual review process as an ongoing hash quality check. Reversal rates of the initial content moderation decision (for example, on appeal) are tracked, as a reflection of Microsoft’s application of hash-matching technology.
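
The exact-match half of this layered approach can be sketched with Python's standard-library MD5 implementation. This is a simplified illustration under stated assumptions: the hash set and image bytes are made up, and PhotoDNA (a perceptual hash that also matches visually similar images after resizing or re-encoding) is proprietary and not shown.

```python
# Illustrative sketch of exact-match hash matching as described above.
# Only MD5 (stdlib) is shown; PhotoDNA perceptual hashing is proprietary.
import hashlib

def md5_hash(image_bytes: bytes) -> str:
    """Create the unique signature ("hash") for this exact sequence of bytes."""
    return hashlib.md5(image_bytes).hexdigest()

def is_known_match(image_bytes: bytes, known_hashes: set) -> bool:
    """Hash matching: compare the upload's hash against hashes of known content."""
    return md5_hash(image_bytes) in known_hashes

# Hypothetical data: in practice the known-hash set comes from vetted,
# analyst-confirmed hash lists supplied by non-profits and industry partners.
known = {md5_hash(b"previously-identified-image-bytes")}
assert is_known_match(b"previously-identified-image-bytes", known)
assert not is_known_match(b"a-different-image", known)
```

An exact-match digest like MD5 changes completely if even one byte differs, which is why the described pipeline layers a perceptual technology such as PhotoDNA, plus human hash verification, on top of it.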

DSA Article 42(3): Information on monthly active users

  • Information about the average monthly active users of the Bing service in the European Union is published semi-annually. The most recent information is available on the EU Digital Services Act information page and reports approximately 132 million average monthly users in the EU during the six-month period ending June 30, 2024. The table below details the monthly active users for each EU Member State during this period. Note that these numbers may include overlap in recipients of the service who accessed Bing from multiple Member States during the relevant time period.

Average monthly active users (MAU) for each EU Member State

EU Member State      Average MAU (million)
Austria              3.6
Belgium              4.7
Bulgaria             1.0
Croatia              0.8
Cyprus               0.3
Czech Republic       3.6
Denmark              2.3
Estonia              0.4
Finland              1.9
France               22.1
Germany              29.5
Greece               1.8
Hungary              2.1
Ireland              2.8
Italy                13.4
Latvia               0.5
Lithuania            0.7
Luxembourg           0.3
Malta                0.2
Netherlands          8.8
Poland               10.0
Portugal             3.3
Romania              2.3
Slovak Republic      1.2
Slovenia             0.6
Spain                12.7
Sweden               4.1

This information was compiled pursuant to the Digital Services Act and thus may differ from other user metrics published by Bing.
