chore: Update discovery artifacts (#2298)
## Deleted keys were detected in the following stable discovery artifacts:
aiplatform v1 https://togithub.com/googleapis/google-api-python-client/commit/6f772fd6a95bf5b75b513ee971d67e9d65d45198
batch v1 https://togithub.com/googleapis/google-api-python-client/commit/dfe37392b9264470dbac3a0fe76d63d18b2f8dfd
connectors v1 https://togithub.com/googleapis/google-api-python-client/commit/7f2654064fa1a2ecafb4b77f59fa466ad1dbfc4a
datacatalog v1 https://togithub.com/googleapis/google-api-python-client/commit/2be21be6bced617d7649810ba7bbd750e5e876a0
gkehub v1 https://togithub.com/googleapis/google-api-python-client/commit/b1823e826dd40c4d9504374478887cc8f8ae5d9b
metastore v1 https://togithub.com/googleapis/google-api-python-client/commit/03d9f29e20ea4909b11777dbff60b5b2c2f0d957

## Deleted keys were detected in the following pre-stable discovery artifacts:
aiplatform v1beta1 https://togithub.com/googleapis/google-api-python-client/commit/6f772fd6a95bf5b75b513ee971d67e9d65d45198
datacatalog v1beta1 https://togithub.com/googleapis/google-api-python-client/commit/2be21be6bced617d7649810ba7bbd750e5e876a0
gkehub v1alpha https://togithub.com/googleapis/google-api-python-client/commit/b1823e826dd40c4d9504374478887cc8f8ae5d9b
gkehub v1beta https://togithub.com/googleapis/google-api-python-client/commit/b1823e826dd40c4d9504374478887cc8f8ae5d9b
metastore v1alpha https://togithub.com/googleapis/google-api-python-client/commit/03d9f29e20ea4909b11777dbff60b5b2c2f0d957
metastore v1beta https://togithub.com/googleapis/google-api-python-client/commit/03d9f29e20ea4909b11777dbff60b5b2c2f0d957
networksecurity v1beta1 https://togithub.com/googleapis/google-api-python-client/commit/a2e83231ab8ff47ce376869820872a02c6dbefed

## Discovery Artifact Change Summary:
feat(aiplatform): update the api https://togithub.com/googleapis/google-api-python-client/commit/6f772fd6a95bf5b75b513ee971d67e9d65d45198
feat(alloydb): update the api https://togithub.com/googleapis/google-api-python-client/commit/4ee2fcb08390011f600e0eb10c7ca187e0541241
feat(androidpublisher): update the api https://togithub.com/googleapis/google-api-python-client/commit/dc4e5e9fa277266469125fd34157059b6b5c47fb
feat(baremetalsolution): update the api https://togithub.com/googleapis/google-api-python-client/commit/79aa42dc6cea9849f6c675829e4d3a0c44f32a7d
feat(batch): update the api https://togithub.com/googleapis/google-api-python-client/commit/dfe37392b9264470dbac3a0fe76d63d18b2f8dfd
feat(bigquery): update the api https://togithub.com/googleapis/google-api-python-client/commit/804ef48218f875432437922e32706d1321019539
feat(cloudbilling): update the api https://togithub.com/googleapis/google-api-python-client/commit/bc285d7f5a537459962f76caa86c53a5aa24065d
feat(clouddeploy): update the api https://togithub.com/googleapis/google-api-python-client/commit/97509bb1c146902091c9ad3162160a7ef23009a2
feat(connectors): update the api https://togithub.com/googleapis/google-api-python-client/commit/7f2654064fa1a2ecafb4b77f59fa466ad1dbfc4a
feat(container): update the api https://togithub.com/googleapis/google-api-python-client/commit/67a8d7a791104026e5d15328f00adf072a2f3b88
feat(datacatalog): update the api https://togithub.com/googleapis/google-api-python-client/commit/2be21be6bced617d7649810ba7bbd750e5e876a0
feat(dataflow): update the api https://togithub.com/googleapis/google-api-python-client/commit/0c5d8136b39cc9d4fc41ffc146ef335f5799f655
feat(dns): update the api https://togithub.com/googleapis/google-api-python-client/commit/35ef8109d1a46bdd98d85412b4d705a45ce028f7
feat(gkehub): update the api https://togithub.com/googleapis/google-api-python-client/commit/b1823e826dd40c4d9504374478887cc8f8ae5d9b
feat(gkeonprem): update the api https://togithub.com/googleapis/google-api-python-client/commit/14b80769f732372d3b04497d0ca34303e7c98d1f
feat(metastore): update the api https://togithub.com/googleapis/google-api-python-client/commit/03d9f29e20ea4909b11777dbff60b5b2c2f0d957
feat(monitoring): update the api https://togithub.com/googleapis/google-api-python-client/commit/ab03f9c7832ef722bdbff6bec1b1c4fb893c9420
feat(networkmanagement): update the api https://togithub.com/googleapis/google-api-python-client/commit/3ddf64cdd3002df20b6ba1d0225d2e641291230a
feat(networksecurity): update the api https://togithub.com/googleapis/google-api-python-client/commit/a2e83231ab8ff47ce376869820872a02c6dbefed
feat(networkservices): update the api https://togithub.com/googleapis/google-api-python-client/commit/9490509110585776a9f4676fd76248308494cc29
feat(playdeveloperreporting): update the api https://togithub.com/googleapis/google-api-python-client/commit/8e799b77800cfaa3f759cc52ef36b73533fe66d0
feat(pubsub): update the api https://togithub.com/googleapis/google-api-python-client/commit/ce81218ef023be0c925ba73512faa25cd96a4a70
feat(script): update the api https://togithub.com/googleapis/google-api-python-client/commit/a2c6f6f73228ce5612ce238b25b8ad95895d9eee
feat(securitycenter): update the api https://togithub.com/googleapis/google-api-python-client/commit/9d79a685e826b937ca8f4aa5e83fac3c940976c0
feat(vpcaccess): update the api https://togithub.com/googleapis/google-api-python-client/commit/7e2f29aeef1f60b36d310664a709f7bc911f16d6
yoshi-code-bot committed Dec 12, 2023
1 parent 0e3ea0c commit 79fee0e
Showing 411 changed files with 16,872 additions and 2,071 deletions.
@@ -288,10 +288,10 @@ <h3>Method Details</h3>
&quot;instancesFormat&quot;: &quot;A String&quot;, # Required. The format in which instances are given, must be one of the Model&#x27;s supported_input_storage_formats.
},
&quot;instanceConfig&quot;: { # Configuration defining how to transform batch prediction input instances to the instances that the Model accepts. # Configuration for how to convert batch prediction input instances to the prediction instances that are sent to the Model.
- &quot;excludedFields&quot;: [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ &quot;excludedFields&quot;: [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
&quot;A String&quot;,
],
- &quot;includedFields&quot;: [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ &quot;includedFields&quot;: [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
&quot;A String&quot;,
],
&quot;instanceType&quot;: &quot;A String&quot;, # The format of the instance that the Model accepts. Vertex AI will convert compatible batch prediction input instance formats to the specified format. Supported values are: * `object`: Each input is converted to JSON object format. * For `bigquery`, each row is converted to an object. * For `jsonl`, each line of the JSONL input must be an object. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. * `array`: Each input is converted to JSON array format. * For `bigquery`, each row is converted to an array. The order of columns is determined by the BigQuery column order, unless included_fields is populated. included_fields must be populated for specifying field orders. * For `jsonl`, if each line of the JSONL input is an object, included_fields must be populated for specifying field orders. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. If not specified, Vertex AI converts the batch prediction input as follows: * For `bigquery` and `csv`, the behavior is the same as `array`. The order of columns is the same as defined in the file or table, unless included_fields is populated. * For `jsonl`, the prediction instance format is determined by each line of the input. * For `tf-record`/`tf-record-gzip`, each record will be converted to an object in the format of `{&quot;b64&quot;: }`, where `` is the Base64-encoded string of the content of the record. * For `file-list`, each file in the list will be converted to an object in the format of `{&quot;b64&quot;: }`, where `` is the Base64-encoded string of the content of the file.
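The `instanceType` description above specifies how Vertex AI converts each input row into the instance sent to the model. As an illustration only (not library code), the `object` and `array` conversion rules quoted in the docs can be sketched in plain Python; the function name and sample row below are hypothetical:

```python
def convert_row(row: dict, instance_type: str, included_fields=None):
    """Sketch of the documented instanceType conversion for one input row."""
    if instance_type == "object":
        # 'object': the row stays a JSON object, optionally filtered
        # down to included_fields.
        if included_fields:
            return {k: row[k] for k in included_fields if k in row}
        return dict(row)
    if instance_type == "array":
        # 'array': emit values in included_fields order; order matters here,
        # which is why the docs require included_fields for JSONL arrays.
        # Fall back to the row's own field order when none is given.
        fields = included_fields or list(row)
        return [row[f] for f in fields]
    raise ValueError(f"unsupported instanceType: {instance_type}")

row = {"age": 31, "income": 52000, "id": "u1"}
print(convert_row(row, "array", ["age", "income"]))  # [31, 52000]
```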
@@ -558,10 +558,10 @@ <h3>Method Details</h3>
&quot;instancesFormat&quot;: &quot;A String&quot;, # Required. The format in which instances are given, must be one of the Model&#x27;s supported_input_storage_formats.
},
&quot;instanceConfig&quot;: { # Configuration defining how to transform batch prediction input instances to the instances that the Model accepts. # Configuration for how to convert batch prediction input instances to the prediction instances that are sent to the Model.
- &quot;excludedFields&quot;: [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ &quot;excludedFields&quot;: [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
&quot;A String&quot;,
],
- &quot;includedFields&quot;: [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ &quot;includedFields&quot;: [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
&quot;A String&quot;,
],
&quot;instanceType&quot;: &quot;A String&quot;, # The format of the instance that the Model accepts. Vertex AI will convert compatible batch prediction input instance formats to the specified format. Supported values are: * `object`: Each input is converted to JSON object format. * For `bigquery`, each row is converted to an object. * For `jsonl`, each line of the JSONL input must be an object. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. * `array`: Each input is converted to JSON array format. * For `bigquery`, each row is converted to an array. The order of columns is determined by the BigQuery column order, unless included_fields is populated. included_fields must be populated for specifying field orders. * For `jsonl`, if each line of the JSONL input is an object, included_fields must be populated for specifying field orders. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. If not specified, Vertex AI converts the batch prediction input as follows: * For `bigquery` and `csv`, the behavior is the same as `array`. The order of columns is the same as defined in the file or table, unless included_fields is populated. * For `jsonl`, the prediction instance format is determined by each line of the input. * For `tf-record`/`tf-record-gzip`, each record will be converted to an object in the format of `{&quot;b64&quot;: }`, where `` is the Base64-encoded string of the content of the record. * For `file-list`, each file in the list will be converted to an object in the format of `{&quot;b64&quot;: }`, where `` is the Base64-encoded string of the content of the file.
@@ -870,10 +870,10 @@ <h3>Method Details</h3>
&quot;instancesFormat&quot;: &quot;A String&quot;, # Required. The format in which instances are given, must be one of the Model&#x27;s supported_input_storage_formats.
},
&quot;instanceConfig&quot;: { # Configuration defining how to transform batch prediction input instances to the instances that the Model accepts. # Configuration for how to convert batch prediction input instances to the prediction instances that are sent to the Model.
- &quot;excludedFields&quot;: [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ &quot;excludedFields&quot;: [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
&quot;A String&quot;,
],
- &quot;includedFields&quot;: [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ &quot;includedFields&quot;: [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
&quot;A String&quot;,
],
&quot;instanceType&quot;: &quot;A String&quot;, # The format of the instance that the Model accepts. Vertex AI will convert compatible batch prediction input instance formats to the specified format. Supported values are: * `object`: Each input is converted to JSON object format. * For `bigquery`, each row is converted to an object. * For `jsonl`, each line of the JSONL input must be an object. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. * `array`: Each input is converted to JSON array format. * For `bigquery`, each row is converted to an array. The order of columns is determined by the BigQuery column order, unless included_fields is populated. included_fields must be populated for specifying field orders. * For `jsonl`, if each line of the JSONL input is an object, included_fields must be populated for specifying field orders. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. If not specified, Vertex AI converts the batch prediction input as follows: * For `bigquery` and `csv`, the behavior is the same as `array`. The order of columns is the same as defined in the file or table, unless included_fields is populated. * For `jsonl`, the prediction instance format is determined by each line of the input. * For `tf-record`/`tf-record-gzip`, each record will be converted to an object in the format of `{&quot;b64&quot;: }`, where `` is the Base64-encoded string of the content of the record. * For `file-list`, each file in the list will be converted to an object in the format of `{&quot;b64&quot;: }`, where `` is the Base64-encoded string of the content of the file.
@@ -1153,10 +1153,10 @@ <h3>Method Details</h3>
&quot;instancesFormat&quot;: &quot;A String&quot;, # Required. The format in which instances are given, must be one of the Model&#x27;s supported_input_storage_formats.
},
&quot;instanceConfig&quot;: { # Configuration defining how to transform batch prediction input instances to the instances that the Model accepts. # Configuration for how to convert batch prediction input instances to the prediction instances that are sent to the Model.
- &quot;excludedFields&quot;: [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ &quot;excludedFields&quot;: [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
&quot;A String&quot;,
],
- &quot;includedFields&quot;: [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ &quot;includedFields&quot;: [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
&quot;A String&quot;,
],
&quot;instanceType&quot;: &quot;A String&quot;, # The format of the instance that the Model accepts. Vertex AI will convert compatible batch prediction input instance formats to the specified format. Supported values are: * `object`: Each input is converted to JSON object format. * For `bigquery`, each row is converted to an object. * For `jsonl`, each line of the JSONL input must be an object. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. * `array`: Each input is converted to JSON array format. * For `bigquery`, each row is converted to an array. The order of columns is determined by the BigQuery column order, unless included_fields is populated. included_fields must be populated for specifying field orders. * For `jsonl`, if each line of the JSONL input is an object, included_fields must be populated for specifying field orders. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. If not specified, Vertex AI converts the batch prediction input as follows: * For `bigquery` and `csv`, the behavior is the same as `array`. The order of columns is the same as defined in the file or table, unless included_fields is populated. * For `jsonl`, the prediction instance format is determined by each line of the input. * For `tf-record`/`tf-record-gzip`, each record will be converted to an object in the format of `{&quot;b64&quot;: }`, where `` is the Base64-encoded string of the content of the record. * For `file-list`, each file in the list will be converted to an object in the format of `{&quot;b64&quot;: }`, where `` is the Base64-encoded string of the content of the file.
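The `instanceConfig` fields touched by this diff are plain keys in the `BatchPredictionJob` request body. A minimal sketch of assembling such a body by hand, assuming the aiplatform v1 field names shown above; the helper name, model path, and GCS URIs are placeholders, and the mutual-exclusivity check mirrors the rule stated in the field descriptions:

```python
def make_batch_prediction_body(model, input_uris, output_uri,
                               included_fields=None, excluded_fields=None):
    """Build a hypothetical BatchPredictionJob request body (dict)."""
    # Per the field docs: when one list is populated, the other must be empty.
    if included_fields and excluded_fields:
        raise ValueError("included_fields and excluded_fields are mutually exclusive")
    instance_config = {"instanceType": "object"}
    if included_fields:
        instance_config["includedFields"] = list(included_fields)
    if excluded_fields:
        instance_config["excludedFields"] = list(excluded_fields)
    return {
        "displayName": "example-batch-job",
        "model": model,  # e.g. projects/.../locations/.../models/...
        "inputConfig": {
            # Per the updated docs, field filtering applies to JSONL,
            # BigQuery, or TfRecord input (CSV was dropped in this diff).
            "instancesFormat": "jsonl",
            "gcsSource": {"uris": list(input_uris)},
        },
        "instanceConfig": instance_config,
        "outputConfig": {
            "predictionsFormat": "jsonl",
            "gcsDestination": {"outputUriPrefix": output_uri},
        },
    }
```

A body built this way would be passed to the generated client's `projects.locations.batchPredictionJobs.create` method; validating the exclusivity rule locally avoids a round trip just to get the server-side error.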
