
Managing items in batches is the best way to ensure that you never get throttled by the service (see Push API Limits - Recommended Maximum Number of Items/Security Identities per Hour). A batch Push API operation allows you to forward a large number of push operations to the service using only a few Push API calls, rather than performing hundreds (or thousands) of single Push API calls to achieve the same result.

Performing batch Push API operations is slightly more complex than performing single Push API operations, as doing so involves three distinct steps which are detailed in this topic.

Best Practices:

  • Update your source activity status
    Unless you are merely testing the service and do not mind having incomplete source activity logs, consider setting your Push source to an active status before performing this operation (see Updating the Status of a Push Source).
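As a sketch of this best practice, the following Python snippet builds the request that updates the activity status of a Push source before a batch operation. The organization ID, source ID, and access token are placeholders, and the endpoint shape follows the Push API source status operation:

```python
# Sketch of updating a Push source activity status before a batch operation.
# All IDs and the API key below are placeholders, not real values.
ORG_ID = "myorgid"          # placeholder organization ID
SOURCE_ID = "mysourceid"    # placeholder Push source ID
API_KEY = "xx-key"          # placeholder access token

def build_status_request(status_type):
    """Build the POST request that sets the Push source activity status."""
    url = (f"https://api.cloud.coveo.com/push/v1/organizations/{ORG_ID}"
           f"/sources/{SOURCE_ID}/status?statusType={status_type}")
    headers = {"Authorization": f"Bearer {API_KEY}"}
    return url, headers

# Before the batch operation, mark the source as actively refreshing:
url, headers = build_status_request("REFRESH")
```

Once the batch operation in Step 3 has returned, the same request with a different statusType value can set the source back to an idle state.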

Step 1 - Create a File Container

See Creating a File Container.
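As a reminder of what Step 1 returns, the following Python sketch parses a file container creation response. The field values are illustrative placeholders; the field names (uploadUri, fileId, requiredHeaders) follow the file container model:

```python
import json

# Sketch of handling the file container creation response.
# The values below are illustrative placeholders, not a real container.
sample_response = json.loads("""
{
  "uploadUri": "https://s3.amazonaws.com/some/presigned/upload/uri",
  "fileId": "b5e8767e-8f0d-4a89-9095-1e0a270148b1",
  "requiredHeaders": {
    "Content-Type": "application/octet-stream",
    "x-amz-server-side-encryption": "AES256"
  }
}
""")

upload_uri = sample_response["uploadUri"]      # PUT target for Step 2
file_id = sample_response["fileId"]            # used in the Step 3 query string
put_headers = sample_response["requiredHeaders"]  # must be sent with the PUT
```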

Step 2 - Upload the Content Update into the File Container

Perform a PUT request to upload the push operations required for your content update into the Amazon S3 file container you created in Step 1 - Create a File Container.


In the request body (see Item Models - BatchDocumentBody):

  • For each item you include in the AddOrUpdate array (see Item Models - DocumentBody):
    • Replace <MyItemMetadata>* by any number of arbitrary metadata key-value pairs you want to include along with the item you are adding or updating (see Pushing Item Metadata).
    • Replace <MyItemToAddOrUpdateURI> by the URI of the item to add or update.
    • Replace <"Data"|"CompressedBinaryData"|"CompressedBinaryDataFileId"> by the property you want to use to push the item data. You must also replace <MyItemDataOrFileId> accordingly (see Pushing Item Data).
    • If you are using the CompressedBinaryData or the CompressedBinaryDataFileId property to push item data (see Using the CompressedBinaryData Property and Using the CompressedBinaryDataFileId Property):

      • Replace <"Uncompressed"|"Deflate"|"GZip"|"LZMA"|"ZLib"> by the actual compression algorithm that was applied to the item data.


        The compressionType value is case sensitive.

      Otherwise, you do not need to include the compressionType property at all.
    • Replace <MyItemDataFileExtension> by the actual file extension which the Push API should use to interpret the item data (e.g., .txt, .html, etc.). This value must include a preceding dot (.) character.

      Best Practice:

      While specifying a FileExtension is optional, doing so is considered best practice.

    • If you want to identify the item as the child of another item in the index, replace <MyItemParentId> by the URI of the parent item (see Understanding the ParentId Property).
      Otherwise, you do not need to include the ParentId property at all in your request body.
    • If the target Push source is SECURED, include the Permissions property to specify which security identities are allowed (or denied) access to the item.

      Otherwise, you do not need to include the Permissions property at all.

  • For each item you include in the Delete array (see Item Models - DeletedItem):
    • Replace <MyItemToDeleteURI> by the URI of the item to delete.
    • Set the deleteChildren property to true if you want to delete all items identified as descendants of the specified item to delete, or set it to false otherwise (see Understanding the ParentId Property).


      The deleteChildren property is set to false by default.
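The request body rules above can be sketched in Python. The item URIs, metadata, and data below are illustrative placeholders, and the exact property casing should be validated against the Item Models reference:

```python
import base64
import json
import zlib

# Sketch of a minimal batch body with one AddOrUpdate item and one Delete item.
# All URIs and metadata values are illustrative placeholders.
raw_data = b"<html><body>Hello, batch!</body></html>"

# ZLib-compress the item data, then Base64-encode it for the JSON body.
compressed = base64.b64encode(zlib.compress(raw_data)).decode("ascii")

batch_body = {
    "AddOrUpdate": [
        {
            "DocumentId": "https://example.com/items/42",  # <MyItemToAddOrUpdateURI>
            "Title": "Hello, batch!",                      # arbitrary metadata key-value
            "CompressedBinaryData": compressed,            # item data
            "CompressionType": "ZLib",                     # case sensitive
            "FileExtension": ".html"                       # preceding dot required
        }
    ],
    "Delete": [
        {
            "DocumentId": "https://example.com/items/41",  # <MyItemToDeleteURI>
            "DeleteChildren": True                         # also delete descendants
        }
    ]
}

payload = json.dumps(batch_body)
```

The resulting payload string is what you PUT to the uploadUri of the file container, along with the required headers returned in Step 1.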

A successful response (200 OK) has no content, but indicates that the content update was successfully uploaded to the Amazon S3 file container.

Sample Request

Request - Uploading a batch of push operations into a file container
Successful response - 200 OK

Step 3 - Push the File Container into a Push Source

Use the Add, update or delete a large number of encrypted items in a source operation to push the Amazon S3 file container into a Push source.

In the request path:

  • Replace {organizationId} by the ID of your Coveo organization.
  • Replace {sourceId} by the ID of the target Push source.

In the query string:

  • Replace <MyFileId> by the fileId value you got from Step 1 - Create a File Container.

In the Authorization HTTP header:

  • Replace <MyAccessToken> by an access token (e.g., an API key) that is allowed to push content to the target Push source.
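Putting the path, query string, and header together, a Python sketch of this request could look as follows (all IDs and the access token are placeholders, and the endpoint shape follows the Push API batch operation):

```python
# Sketch of the Step 3 request that pushes the file container into the source.
# All IDs and the access token below are placeholders, not real values.
ORG_ID = "myorgid"
SOURCE_ID = "mysourceid"
API_KEY = "xx-key"
FILE_ID = "b5e8767e-8f0d-4a89-9095-1e0a270148b1"  # fileId from Step 1

url = (f"https://api.cloud.coveo.com/push/v1/organizations/{ORG_ID}"
       f"/sources/{SOURCE_ID}/documents/batch?fileId={FILE_ID}")
headers = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}
# Performing a PUT request on this URL with these headers should
# return 202 Accepted on success.
```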

A successful response (202 Accepted) has no content, but indicates that the operation was successfully forwarded to the service and that the batch of items is now enqueued to be processed by the Coveo Cloud V2 indexing pipeline.


This does not imply that all items in the batch were successfully added, updated and/or deleted in the target Push source (see Push API Best Practices - How to Validate the Indexing Status of Pushed Items). 

Best Practice:

If you previously set your Push source to an active status, you should consider setting it back to the IDLE status once this operation has successfully returned (see Updating the Status of a Push Source). 

Sample Request

Request - Pushing a file container into a Push source
Successful response - 202 Accepted