chore: Update discovery artifacts (#2490)
## Deleted keys were detected in the following stable discovery artifacts:
chromeuxreport v1 https://togithub.com/googleapis/google-api-python-client/commit/f09220b9e7387bcb9040ebf9117920403d114775
dataplex v1 https://togithub.com/googleapis/google-api-python-client/commit/c778bebdd26b105146e60878c8752732c0ab32d0
integrations v1 https://togithub.com/googleapis/google-api-python-client/commit/fc0f18c1a239b0768145adcdd95d79d108ed6e9d
redis v1 https://togithub.com/googleapis/google-api-python-client/commit/f429467ba93f8eba02736e66527330e07e6e0497

## Deleted keys were detected in the following pre-stable discovery artifacts:
redis v1beta1 https://togithub.com/googleapis/google-api-python-client/commit/f429467ba93f8eba02736e66527330e07e6e0497

## Discovery Artifact Change Summary:
feat(aiplatform): update the api https://togithub.com/googleapis/google-api-python-client/commit/a98ec30c0ed3ac32f632638f69f495cbf3c665be
feat(alloydb): update the api https://togithub.com/googleapis/google-api-python-client/commit/c8e352ecd6cdaa4a168407a65473c0405d89106c
feat(androiddeviceprovisioning): update the api https://togithub.com/googleapis/google-api-python-client/commit/a079abe0cecd41a9f9377adfbf63a2675e68a748
fix(androidpublisher): update the api https://togithub.com/googleapis/google-api-python-client/commit/525818e5998dfcc45bc90c4d82e956a075cad51f
feat(backupdr): update the api https://togithub.com/googleapis/google-api-python-client/commit/03cd7e9e23655aab788f55f12a502d5000388cff
feat(bigquerydatatransfer): update the api https://togithub.com/googleapis/google-api-python-client/commit/cfe94c54888495b6d1c7512f41b87f50247eae42
feat(chromeuxreport): update the api https://togithub.com/googleapis/google-api-python-client/commit/f09220b9e7387bcb9040ebf9117920403d114775
fix(cloudkms): update the api https://togithub.com/googleapis/google-api-python-client/commit/6c1fc37695edd1c4f702a39ffc49dad871eedd22
feat(composer): update the api https://togithub.com/googleapis/google-api-python-client/commit/3651bad2eef9c9524474d96213ca38a44d24ef1a
feat(compute): update the api https://togithub.com/googleapis/google-api-python-client/commit/01fe4afd9a7396266ddd2de6167a14c7adf33deb
feat(connectors): update the api https://togithub.com/googleapis/google-api-python-client/commit/5b1ebb68ea2743bebbff24c081204757bd4d4c33
feat(container): update the api https://togithub.com/googleapis/google-api-python-client/commit/ca3a5b71aae9a2b37f675e85fe61d0acfe3943a9
feat(datamigration): update the api https://togithub.com/googleapis/google-api-python-client/commit/b92c092cdd0ce2cf44d2bdeef2fe70c374c0f120
feat(dataplex): update the api https://togithub.com/googleapis/google-api-python-client/commit/c778bebdd26b105146e60878c8752732c0ab32d0
feat(discoveryengine): update the api https://togithub.com/googleapis/google-api-python-client/commit/ff0e83e3337996256ac729994180af2b2dc1f5fb
feat(dlp): update the api https://togithub.com/googleapis/google-api-python-client/commit/ae1cdfb26ef2419bd89fcdf5925bc6a997714786
feat(eventarc): update the api https://togithub.com/googleapis/google-api-python-client/commit/a3024a70c2c68d4f6b88c4c41d499336081f671b
feat(firebaseml): update the api https://togithub.com/googleapis/google-api-python-client/commit/8e4579653aad999f4bfea3e3ea422aa04b531e18
feat(integrations): update the api https://togithub.com/googleapis/google-api-python-client/commit/fc0f18c1a239b0768145adcdd95d79d108ed6e9d
feat(merchantapi): update the api https://togithub.com/googleapis/google-api-python-client/commit/a104d707f66ac0801a9aaea065562ec7e0d2b7a4
feat(networkmanagement): update the api https://togithub.com/googleapis/google-api-python-client/commit/cbac29bd3df64dc80a548c740b6eb9ef37c8243c
feat(networkservices): update the api https://togithub.com/googleapis/google-api-python-client/commit/7f33d633e8019dfd6662103cfd8c0fe4b911fde8
feat(places): update the api https://togithub.com/googleapis/google-api-python-client/commit/9b0d7aac56cc0f3745def9a9674f61186177004c
feat(redis): update the api https://togithub.com/googleapis/google-api-python-client/commit/f429467ba93f8eba02736e66527330e07e6e0497
feat(retail): update the api https://togithub.com/googleapis/google-api-python-client/commit/e731bb10759475f959e4487d33d726ede3d3f51a
feat(run): update the api https://togithub.com/googleapis/google-api-python-client/commit/791f0bba36aa4696568eb8a96a7c5f96aee995f8
feat(securitycenter): update the api https://togithub.com/googleapis/google-api-python-client/commit/665e8f4fac0a46547b786e66500f9342ea40c6bb
feat(servicenetworking): update the api https://togithub.com/googleapis/google-api-python-client/commit/d5cfc5ce55dbdff98c0c3b8b58c1f55482f1131a
feat(spanner): update the api https://togithub.com/googleapis/google-api-python-client/commit/4c728c2e855bfc992b4d07dd1cd3b77c0e77e74e
feat(storage): update the api https://togithub.com/googleapis/google-api-python-client/commit/667799e130d86d7295c35fae123b98fa906758ca
feat(testing): update the api https://togithub.com/googleapis/google-api-python-client/commit/8b681d9fa931dfc2553eab6548a8861143225011
feat(tpu): update the api https://togithub.com/googleapis/google-api-python-client/commit/763eb25ea434ca1cd3b9960d034eda7ab6d02086
feat(walletobjects): update the api https://togithub.com/googleapis/google-api-python-client/commit/cfd989a5b83cf12fd3671b871ad96c09d3165979
feat(youtube): update the api https://togithub.com/googleapis/google-api-python-client/commit/d825a0cc811da01ea843be734b094fc15cce9e70
yoshi-code-bot authored Sep 24, 2024
1 parent 4c0fc11 commit 930f0bb
Showing 254 changed files with 19,228 additions and 1,630 deletions.
2 changes: 1 addition & 1 deletion docs/dyn/admin_directory_v1.groups.html
@@ -222,7 +222,7 @@ <h3>Method Details</h3>
Allowed values
ASCENDING - Ascending order.
DESCENDING - Descending order.
-userKey: string, Email or immutable ID of the user if only those groups are to be listed, the given user is a member of. If it&#x27;s an ID, it should match with the ID of the user object.
+userKey: string, Email or immutable ID of the user if only those groups are to be listed, the given user is a member of. If it&#x27;s an ID, it should match with the ID of the user object. Cannot be used with the `customer` parameter.
x__xgafv: string, V1 error format.
Allowed values
1 - v1 error format
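The hunk above adds a note that the `userKey` parameter of the Directory API's `groups.list` method cannot be combined with the `customer` parameter. A minimal sketch of that call with the generated Python client follows; the credential setup, scope, and example address are assumptions, not part of this commit:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Assumed credential setup; any google-auth Credentials object with a
# suitable Directory API scope would work the same way.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/admin.directory.group.readonly"],
)

service = build("admin", "directory_v1", credentials=creds)

# List only the groups the given user is a member of. Per the updated
# documentation, userKey must not be combined with the customer parameter.
response = service.groups().list(userKey="user@example.com").execute()
for group in response.get("groups", []):
    print(group["email"])
```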
164 changes: 161 additions & 3 deletions docs/dyn/aiplatform_v1.endpoints.html

Large diffs are not rendered by default.

164 changes: 161 additions & 3 deletions docs/dyn/aiplatform_v1.projects.locations.endpoints.html

Large diffs are not rendered by default.

10 changes: 5 additions & 5 deletions docs/dyn/aiplatform_v1.projects.locations.featureGroups.html
@@ -125,7 +125,7 @@ <h3>Method Details</h3>
&quot;bigQuerySource&quot;: { # The BigQuery location for the input content. # Required. Immutable. The BigQuery source URI that points to either a BigQuery Table or View.
&quot;inputUri&quot;: &quot;A String&quot;, # Required. BigQuery URI to a table, up to 2000 characters long. Accepted forms: * BigQuery path. For example: `bq://projectId.bqDatasetId.bqTableId`.
},
-&quot;dense&quot;: True or False, # Optional. If set, all feature values will be fetched from a single row per unique entityId including nulls. If not set, will collapse all rows for each unique entityId into a singe row with any non-null values if present, if no non-null values are present will sync null. ex: If source has schema (entity_id, feature_timestamp, f0, f1) and values (e1, 2020-01-01T10:00:00.123Z, 10, 15) (e1, 2020-02-01T10:00:00.123Z, 20, null) If dense is set, (e1, 20, null) is synced to online stores. If dense is not set, (e1, 20, 15) is synced to online stores.
+&quot;dense&quot;: True or False, # Optional. If set, all feature values will be fetched from a single row per unique entityId including nulls. If not set, will collapse all rows for each unique entityId into a singe row with any non-null values if present, if no non-null values are present will sync null. ex: If source has schema `(entity_id, feature_timestamp, f0, f1)` and the following rows: `(e1, 2020-01-01T10:00:00.123Z, 10, 15)` `(e1, 2020-02-01T10:00:00.123Z, 20, null)` If dense is set, `(e1, 20, null)` is synced to online stores. If dense is not set, `(e1, 20, 15)` is synced to online stores.
&quot;entityIdColumns&quot;: [ # Optional. Columns to construct entity_id / row keys. If not provided defaults to `entity_id`.
&quot;A String&quot;,
],
@@ -144,7 +144,7 @@ <h3>Method Details</h3>
&quot;updateTime&quot;: &quot;A String&quot;, # Output only. Timestamp when this FeatureGroup was last updated.
}

-featureGroupId: string, Required. The ID to use for this FeatureGroup, which will become the final component of the FeatureGroup&#x27;s resource name. This value may be up to 60 characters, and valid characters are `[a-z0-9_]`. The first character cannot be a number. The value must be unique within the project and location.
+featureGroupId: string, Required. The ID to use for this FeatureGroup, which will become the final component of the FeatureGroup&#x27;s resource name. This value may be up to 128 characters, and valid characters are `[a-z0-9_]`. The first character cannot be a number. The value must be unique within the project and location.
x__xgafv: string, V1 error format.
Allowed values
1 - v1 error format
@@ -229,7 +229,7 @@ <h3>Method Details</h3>
&quot;bigQuerySource&quot;: { # The BigQuery location for the input content. # Required. Immutable. The BigQuery source URI that points to either a BigQuery Table or View.
&quot;inputUri&quot;: &quot;A String&quot;, # Required. BigQuery URI to a table, up to 2000 characters long. Accepted forms: * BigQuery path. For example: `bq://projectId.bqDatasetId.bqTableId`.
},
-&quot;dense&quot;: True or False, # Optional. If set, all feature values will be fetched from a single row per unique entityId including nulls. If not set, will collapse all rows for each unique entityId into a singe row with any non-null values if present, if no non-null values are present will sync null. ex: If source has schema (entity_id, feature_timestamp, f0, f1) and values (e1, 2020-01-01T10:00:00.123Z, 10, 15) (e1, 2020-02-01T10:00:00.123Z, 20, null) If dense is set, (e1, 20, null) is synced to online stores. If dense is not set, (e1, 20, 15) is synced to online stores.
+&quot;dense&quot;: True or False, # Optional. If set, all feature values will be fetched from a single row per unique entityId including nulls. If not set, will collapse all rows for each unique entityId into a singe row with any non-null values if present, if no non-null values are present will sync null. ex: If source has schema `(entity_id, feature_timestamp, f0, f1)` and the following rows: `(e1, 2020-01-01T10:00:00.123Z, 10, 15)` `(e1, 2020-02-01T10:00:00.123Z, 20, null)` If dense is set, `(e1, 20, null)` is synced to online stores. If dense is not set, `(e1, 20, 15)` is synced to online stores.
&quot;entityIdColumns&quot;: [ # Optional. Columns to construct entity_id / row keys. If not provided defaults to `entity_id`.
&quot;A String&quot;,
],
@@ -274,7 +274,7 @@ <h3>Method Details</h3>
&quot;bigQuerySource&quot;: { # The BigQuery location for the input content. # Required. Immutable. The BigQuery source URI that points to either a BigQuery Table or View.
&quot;inputUri&quot;: &quot;A String&quot;, # Required. BigQuery URI to a table, up to 2000 characters long. Accepted forms: * BigQuery path. For example: `bq://projectId.bqDatasetId.bqTableId`.
},
-&quot;dense&quot;: True or False, # Optional. If set, all feature values will be fetched from a single row per unique entityId including nulls. If not set, will collapse all rows for each unique entityId into a singe row with any non-null values if present, if no non-null values are present will sync null. ex: If source has schema (entity_id, feature_timestamp, f0, f1) and values (e1, 2020-01-01T10:00:00.123Z, 10, 15) (e1, 2020-02-01T10:00:00.123Z, 20, null) If dense is set, (e1, 20, null) is synced to online stores. If dense is not set, (e1, 20, 15) is synced to online stores.
+&quot;dense&quot;: True or False, # Optional. If set, all feature values will be fetched from a single row per unique entityId including nulls. If not set, will collapse all rows for each unique entityId into a singe row with any non-null values if present, if no non-null values are present will sync null. ex: If source has schema `(entity_id, feature_timestamp, f0, f1)` and the following rows: `(e1, 2020-01-01T10:00:00.123Z, 10, 15)` `(e1, 2020-02-01T10:00:00.123Z, 20, null)` If dense is set, `(e1, 20, null)` is synced to online stores. If dense is not set, `(e1, 20, 15)` is synced to online stores.
&quot;entityIdColumns&quot;: [ # Optional. Columns to construct entity_id / row keys. If not provided defaults to `entity_id`.
&quot;A String&quot;,
],
@@ -325,7 +325,7 @@ <h3>Method Details</h3>
&quot;bigQuerySource&quot;: { # The BigQuery location for the input content. # Required. Immutable. The BigQuery source URI that points to either a BigQuery Table or View.
&quot;inputUri&quot;: &quot;A String&quot;, # Required. BigQuery URI to a table, up to 2000 characters long. Accepted forms: * BigQuery path. For example: `bq://projectId.bqDatasetId.bqTableId`.
},
-&quot;dense&quot;: True or False, # Optional. If set, all feature values will be fetched from a single row per unique entityId including nulls. If not set, will collapse all rows for each unique entityId into a singe row with any non-null values if present, if no non-null values are present will sync null. ex: If source has schema (entity_id, feature_timestamp, f0, f1) and values (e1, 2020-01-01T10:00:00.123Z, 10, 15) (e1, 2020-02-01T10:00:00.123Z, 20, null) If dense is set, (e1, 20, null) is synced to online stores. If dense is not set, (e1, 20, 15) is synced to online stores.
+&quot;dense&quot;: True or False, # Optional. If set, all feature values will be fetched from a single row per unique entityId including nulls. If not set, will collapse all rows for each unique entityId into a singe row with any non-null values if present, if no non-null values are present will sync null. ex: If source has schema `(entity_id, feature_timestamp, f0, f1)` and the following rows: `(e1, 2020-01-01T10:00:00.123Z, 10, 15)` `(e1, 2020-02-01T10:00:00.123Z, 20, null)` If dense is set, `(e1, 20, null)` is synced to online stores. If dense is not set, `(e1, 20, 15)` is synced to online stores.
&quot;entityIdColumns&quot;: [ # Optional. Columns to construct entity_id / row keys. If not provided defaults to `entity_id`.
&quot;A String&quot;,
],
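The changes in this file raise the `featureGroupId` length limit from 60 to 128 characters and reformat the `dense` example rows as inline code. Below is a hedged sketch of a `featureGroups.create` call using only the fields visible in this diff; the project, location, table URI, and ID are placeholders, and Application Default Credentials plus any required regional endpoint configuration are assumed:

```python
from googleapiclient.discovery import build

# Application Default Credentials are assumed to be available.
aiplatform = build("aiplatform", "v1")

parent = "projects/my-project/locations/us-central1"  # placeholder

feature_group = {
    "bigQuery": {
        "bigQuerySource": {
            # Required. BigQuery table or view holding the feature data.
            "inputUri": "bq://my-project.my_dataset.my_table",  # placeholder
        },
        # Optional. Columns used to construct entity_id / row keys;
        # defaults to `entity_id` when omitted.
        "entityIdColumns": ["entity_id"],
        # Optional. When True, every unique entityId is served from a single
        # row (nulls included) instead of collapsing rows into one.
        "dense": False,
    },
}

operation = (
    aiplatform.projects()
    .locations()
    .featureGroups()
    .create(
        parent=parent,
        # Per the updated docs: up to 128 characters of [a-z0-9_],
        # not starting with a number, unique within project and location.
        featureGroupId="my_feature_group",
        body=feature_group,
    )
    .execute()
)
print(operation["name"])  # long-running operation to poll for completion
```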
@@ -184,7 +184,7 @@ <h3>Method Details</h3>
&quot;updateTime&quot;: &quot;A String&quot;, # Output only. Timestamp when this FeatureView was last updated.
&quot;vertexRagSource&quot;: { # A Vertex Rag source for features that need to be synced to Online Store. # Optional. The Vertex RAG Source that the FeatureView is linked to.
&quot;ragCorpusId&quot;: &quot;A String&quot;, # Optional. The RAG corpus id corresponding to this FeatureView.
-&quot;uri&quot;: &quot;A String&quot;, # Required. The BigQuery view/table URI that will be materialized on each manual sync trigger. The table/view is expected to have the following columns at least: Field name Type Mode corpus_id STRING REQUIRED/NULLABLE file_id STRING REQUIRED/NULLABLE chunk_id STRING REQUIRED/NULLABLE chunk_data_type STRING REQUIRED/NULLABLE chunk_data STRING REQUIRED/NULLABLE embeddings FLOAT REPEATED file_original_uri STRING REQUIRED/NULLABLE
+&quot;uri&quot;: &quot;A String&quot;, # Required. The BigQuery view/table URI that will be materialized on each manual sync trigger. The table/view is expected to have the following columns and types at least: - `corpus_id` (STRING, NULLABLE/REQUIRED) - `file_id` (STRING, NULLABLE/REQUIRED) - `chunk_id` (STRING, NULLABLE/REQUIRED) - `chunk_data_type` (STRING, NULLABLE/REQUIRED) - `chunk_data` (STRING, NULLABLE/REQUIRED) - `embeddings` (FLOAT, REPEATED) - `file_original_uri` (STRING, NULLABLE/REQUIRED)
},
}

@@ -403,7 +403,7 @@ <h3>Method Details</h3>
&quot;updateTime&quot;: &quot;A String&quot;, # Output only. Timestamp when this FeatureView was last updated.
&quot;vertexRagSource&quot;: { # A Vertex Rag source for features that need to be synced to Online Store. # Optional. The Vertex RAG Source that the FeatureView is linked to.
&quot;ragCorpusId&quot;: &quot;A String&quot;, # Optional. The RAG corpus id corresponding to this FeatureView.
-&quot;uri&quot;: &quot;A String&quot;, # Required. The BigQuery view/table URI that will be materialized on each manual sync trigger. The table/view is expected to have the following columns at least: Field name Type Mode corpus_id STRING REQUIRED/NULLABLE file_id STRING REQUIRED/NULLABLE chunk_id STRING REQUIRED/NULLABLE chunk_data_type STRING REQUIRED/NULLABLE chunk_data STRING REQUIRED/NULLABLE embeddings FLOAT REPEATED file_original_uri STRING REQUIRED/NULLABLE
+&quot;uri&quot;: &quot;A String&quot;, # Required. The BigQuery view/table URI that will be materialized on each manual sync trigger. The table/view is expected to have the following columns and types at least: - `corpus_id` (STRING, NULLABLE/REQUIRED) - `file_id` (STRING, NULLABLE/REQUIRED) - `chunk_id` (STRING, NULLABLE/REQUIRED) - `chunk_data_type` (STRING, NULLABLE/REQUIRED) - `chunk_data` (STRING, NULLABLE/REQUIRED) - `embeddings` (FLOAT, REPEATED) - `file_original_uri` (STRING, NULLABLE/REQUIRED)
},
}</pre>
</div>
@@ -509,7 +509,7 @@ <h3>Method Details</h3>
&quot;updateTime&quot;: &quot;A String&quot;, # Output only. Timestamp when this FeatureView was last updated.
&quot;vertexRagSource&quot;: { # A Vertex Rag source for features that need to be synced to Online Store. # Optional. The Vertex RAG Source that the FeatureView is linked to.
&quot;ragCorpusId&quot;: &quot;A String&quot;, # Optional. The RAG corpus id corresponding to this FeatureView.
-&quot;uri&quot;: &quot;A String&quot;, # Required. The BigQuery view/table URI that will be materialized on each manual sync trigger. The table/view is expected to have the following columns at least: Field name Type Mode corpus_id STRING REQUIRED/NULLABLE file_id STRING REQUIRED/NULLABLE chunk_id STRING REQUIRED/NULLABLE chunk_data_type STRING REQUIRED/NULLABLE chunk_data STRING REQUIRED/NULLABLE embeddings FLOAT REPEATED file_original_uri STRING REQUIRED/NULLABLE
+&quot;uri&quot;: &quot;A String&quot;, # Required. The BigQuery view/table URI that will be materialized on each manual sync trigger. The table/view is expected to have the following columns and types at least: - `corpus_id` (STRING, NULLABLE/REQUIRED) - `file_id` (STRING, NULLABLE/REQUIRED) - `chunk_id` (STRING, NULLABLE/REQUIRED) - `chunk_data_type` (STRING, NULLABLE/REQUIRED) - `chunk_data` (STRING, NULLABLE/REQUIRED) - `embeddings` (FLOAT, REPEATED) - `file_original_uri` (STRING, NULLABLE/REQUIRED)
},
},
],
@@ -586,7 +586,7 @@ <h3>Method Details</h3>
&quot;updateTime&quot;: &quot;A String&quot;, # Output only. Timestamp when this FeatureView was last updated.
&quot;vertexRagSource&quot;: { # A Vertex Rag source for features that need to be synced to Online Store. # Optional. The Vertex RAG Source that the FeatureView is linked to.
&quot;ragCorpusId&quot;: &quot;A String&quot;, # Optional. The RAG corpus id corresponding to this FeatureView.
-&quot;uri&quot;: &quot;A String&quot;, # Required. The BigQuery view/table URI that will be materialized on each manual sync trigger. The table/view is expected to have the following columns at least: Field name Type Mode corpus_id STRING REQUIRED/NULLABLE file_id STRING REQUIRED/NULLABLE chunk_id STRING REQUIRED/NULLABLE chunk_data_type STRING REQUIRED/NULLABLE chunk_data STRING REQUIRED/NULLABLE embeddings FLOAT REPEATED file_original_uri STRING REQUIRED/NULLABLE
+&quot;uri&quot;: &quot;A String&quot;, # Required. The BigQuery view/table URI that will be materialized on each manual sync trigger. The table/view is expected to have the following columns and types at least: - `corpus_id` (STRING, NULLABLE/REQUIRED) - `file_id` (STRING, NULLABLE/REQUIRED) - `chunk_id` (STRING, NULLABLE/REQUIRED) - `chunk_data_type` (STRING, NULLABLE/REQUIRED) - `chunk_data` (STRING, NULLABLE/REQUIRED) - `embeddings` (FLOAT, REPEATED) - `file_original_uri` (STRING, NULLABLE/REQUIRED)
},
}

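The repeated hunks above reformat the required BigQuery schema for `vertexRagSource.uri` into an explicit column list. As a small illustration, the corresponding fragment of a FeatureView body might look like the following sketch; the corpus id and URI are placeholders:

```python
# Fragment of a FeatureView resource, as documented in the hunks above.
# The BigQuery view/table behind `uri` must expose at least: corpus_id,
# file_id, chunk_id, chunk_data_type, chunk_data, file_original_uri
# (STRING, NULLABLE/REQUIRED) and embeddings (FLOAT, REPEATED).
feature_view_fragment = {
    "vertexRagSource": {
        "ragCorpusId": "1234567890",          # placeholder RAG corpus id
        "uri": "bq://my-project.rag.chunks",  # placeholder view/table URI
    },
}
```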