
Commit 6f772fd

Committed Dec 12, 2023
feat(aiplatform): update the api
#### aiplatform:v1

The following keys were deleted:
- schemas.GoogleCloudAiplatformV1FunctionCall (Total Keys: 5)
- schemas.GoogleCloudAiplatformV1FunctionResponse (Total Keys: 5)
- schemas.GoogleCloudAiplatformV1Part.properties.functionCall.$ref (Total Keys: 1)
- schemas.GoogleCloudAiplatformV1Part.properties.functionResponse.$ref (Total Keys: 1)

The following keys were added:
- schemas.CloudAiLargeModelsVisionEmbedVideoResponse (Total Keys: 4)
- schemas.CloudAiLargeModelsVisionFilteredText (Total Keys: 7)
- schemas.CloudAiLargeModelsVisionGenerateVideoResponse (Total Keys: 9)
- schemas.CloudAiLargeModelsVisionImage (Total Keys: 13)
- schemas.CloudAiLargeModelsVisionMedia (Total Keys: 7)
- schemas.CloudAiLargeModelsVisionNamedBoundingBox (Total Keys: 17)
- schemas.CloudAiLargeModelsVisionRaiInfo (Total Keys: 7)
- schemas.CloudAiLargeModelsVisionReasonVideoResponse (Total Keys: 8)
- schemas.CloudAiLargeModelsVisionRelativeTemporalPartition (Total Keys: 6)
- schemas.CloudAiLargeModelsVisionSemanticFilterResponse (Total Keys: 5)
- schemas.CloudAiLargeModelsVisionVideo (Total Keys: 5)
- schemas.CloudAiNlLlmProtoServiceCandidate (Total Keys: 10)
- schemas.CloudAiNlLlmProtoServiceCitation (Total Keys: 13)
- schemas.CloudAiNlLlmProtoServiceContent (Total Keys: 5)
- schemas.CloudAiNlLlmProtoServiceFunctionCall (Total Keys: 5)
- schemas.CloudAiNlLlmProtoServiceFunctionResponse (Total Keys: 5)
- schemas.CloudAiNlLlmProtoServiceGenerateMultiModalResponse (Total Keys: 7)
- schemas.CloudAiNlLlmProtoServicePart (Total Keys: 24)
- schemas.CloudAiNlLlmProtoServicePromptFeedback (Total Keys: 6)
- schemas.CloudAiNlLlmProtoServiceSafetyRating (Total Keys: 5)
- schemas.CloudAiNlLlmProtoServiceUsageMetadata (Total Keys: 8)
- schemas.GoogleCloudAiplatformV1CustomJobSpec.properties.models (Total Keys: 2)
- schemas.GoogleCloudAiplatformV1PublisherModel.properties.versionState.type (Total Keys: 1)
- schemas.GoogleCloudAiplatformV1PublisherModelCallToActionDeploy.properties.publicArtifactUri.type (Total Keys: 1)
- schemas.GoogleCloudAiplatformV1PublisherModelResourceReference.properties.useCase.type (Total Keys: 1)
- schemas.GoogleTypeDate (Total Keys: 8)
- schemas.IntelligenceCloudAutomlXpsMetricEntry (Total Keys: 14)
- schemas.IntelligenceCloudAutomlXpsReportingMetrics (Total Keys: 7)

#### aiplatform:v1beta1

The following keys were deleted:
- schemas.GoogleCloudAiplatformV1beta1GenerationConfig.properties.echo.type (Total Keys: 1)
- schemas.GoogleCloudAiplatformV1beta1GenerationConfig.properties.frequencyPenalty (Total Keys: 2)
- schemas.GoogleCloudAiplatformV1beta1GenerationConfig.properties.logitBias (Total Keys: 3)
- schemas.GoogleCloudAiplatformV1beta1GenerationConfig.properties.logprobs (Total Keys: 2)
- schemas.GoogleCloudAiplatformV1beta1GenerationConfig.properties.presencePenalty (Total Keys: 2)

The following keys were added:
- resources.projects.resources.locations.resources.publishers.resources.models.methods.getIamPolicy (Total Keys: 14)
- schemas.CloudAiLargeModelsVisionEmbedVideoResponse (Total Keys: 4)
- schemas.CloudAiLargeModelsVisionFilteredText (Total Keys: 7)
- schemas.CloudAiLargeModelsVisionGenerateVideoResponse (Total Keys: 9)
- schemas.CloudAiLargeModelsVisionImage (Total Keys: 13)
- schemas.CloudAiLargeModelsVisionMedia (Total Keys: 7)
- schemas.CloudAiLargeModelsVisionNamedBoundingBox (Total Keys: 17)
- schemas.CloudAiLargeModelsVisionRaiInfo (Total Keys: 7)
- schemas.CloudAiLargeModelsVisionReasonVideoResponse (Total Keys: 8)
- schemas.CloudAiLargeModelsVisionRelativeTemporalPartition (Total Keys: 6)
- schemas.CloudAiLargeModelsVisionSemanticFilterResponse (Total Keys: 5)
- schemas.CloudAiLargeModelsVisionVideo (Total Keys: 5)
- schemas.CloudAiNlLlmProtoServiceCandidate (Total Keys: 10)
- schemas.CloudAiNlLlmProtoServiceCitation (Total Keys: 13)
- schemas.CloudAiNlLlmProtoServiceContent (Total Keys: 5)
- schemas.CloudAiNlLlmProtoServiceFunctionCall (Total Keys: 5)
- schemas.CloudAiNlLlmProtoServiceFunctionResponse (Total Keys: 5)
- schemas.CloudAiNlLlmProtoServiceGenerateMultiModalResponse (Total Keys: 7)
- schemas.CloudAiNlLlmProtoServicePart (Total Keys: 24)
- schemas.CloudAiNlLlmProtoServicePromptFeedback (Total Keys: 6)
- schemas.CloudAiNlLlmProtoServiceSafetyRating (Total Keys: 5)
- schemas.CloudAiNlLlmProtoServiceUsageMetadata (Total Keys: 8)
- schemas.GoogleCloudAiplatformV1beta1CustomJobSpec.properties.models (Total Keys: 2)
- schemas.GoogleCloudAiplatformV1beta1GenerateContentRequest.properties.endpoint.deprecated (Total Keys: 1)
- schemas.GoogleCloudAiplatformV1beta1PublisherModel.properties.versionState.type (Total Keys: 1)
- schemas.GoogleCloudAiplatformV1beta1PublisherModelCallToActionDeploy.properties.publicArtifactUri.type (Total Keys: 1)
- schemas.GoogleCloudAiplatformV1beta1PublisherModelResourceReference.properties.useCase.type (Total Keys: 1)
- schemas.IntelligenceCloudAutomlXpsMetricEntry (Total Keys: 14)
- schemas.IntelligenceCloudAutomlXpsReportingMetrics (Total Keys: 7)
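The changelog above is a key-level diff of two versions of the API's discovery document: every dotted schema path present in only one version is reported as deleted or added, with a per-subtree key count. A minimal sketch of how such a diff can be computed, using toy stand-in dicts rather than the real discovery documents:

```python
def flatten_keys(node, prefix=""):
    """Recursively flatten a nested discovery-style dict into dotted key paths."""
    keys = []
    for name, value in node.items():
        path = f"{prefix}.{name}" if prefix else name
        keys.append(path)
        if isinstance(value, dict):
            keys.extend(flatten_keys(value, path))
    return keys

def diff_keys(old, new):
    """Return (deleted, added) dotted key paths between two schema trees."""
    old_keys, new_keys = set(flatten_keys(old)), set(flatten_keys(new))
    return sorted(old_keys - new_keys), sorted(new_keys - old_keys)

# Toy schemas standing in for the v1 discovery documents before and after.
before = {"schemas": {"GoogleCloudAiplatformV1FunctionCall": {"type": "object"}}}
after = {"schemas": {"GoogleTypeDate": {"type": "object"}}}
deleted, added = diff_keys(before, after)
```

The "Total Keys" figure in the real changelog counts every path under the named subtree, which is why removing a single schema can delete several keys at once.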
1 parent 04bc978

17 files changed: +2156 −375 lines
 

docs/dyn/aiplatform_v1.projects.locations.batchPredictionJobs.html (+8 −8)
@@ -288,10 +288,10 @@ <h3>Method Details</h3>
  &quot;instancesFormat&quot;: &quot;A String&quot;, # Required. The format in which instances are given, must be one of the Model&#x27;s supported_input_storage_formats.
  },
  &quot;instanceConfig&quot;: { # Configuration defining how to transform batch prediction input instances to the instances that the Model accepts. # Configuration for how to convert batch prediction input instances to the prediction instances that are sent to the Model.
- &quot;excludedFields&quot;: [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ &quot;excludedFields&quot;: [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
  &quot;A String&quot;,
  ],
- &quot;includedFields&quot;: [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ &quot;includedFields&quot;: [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
  &quot;A String&quot;,
  ],
  &quot;instanceType&quot;: &quot;A String&quot;, # The format of the instance that the Model accepts. Vertex AI will convert compatible batch prediction input instance formats to the specified format. Supported values are: * `object`: Each input is converted to JSON object format. * For `bigquery`, each row is converted to an object. * For `jsonl`, each line of the JSONL input must be an object. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. * `array`: Each input is converted to JSON array format. * For `bigquery`, each row is converted to an array. The order of columns is determined by the BigQuery column order, unless included_fields is populated. included_fields must be populated for specifying field orders. * For `jsonl`, if each line of the JSONL input is an object, included_fields must be populated for specifying field orders. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. If not specified, Vertex AI converts the batch prediction input as follows: * For `bigquery` and `csv`, the behavior is the same as `array`. The order of columns is the same as defined in the file or table, unless included_fields is populated. * For `jsonl`, the prediction instance format is determined by each line of the input. * For `tf-record`/`tf-record-gzip`, each record will be converted to an object in the format of `{&quot;b64&quot;: }`, where `` is the Base64-encoded string of the content of the record. * For `file-list`, each file in the list will be converted to an object in the format of `{&quot;b64&quot;: }`, where `` is the Base64-encoded string of the content of the file.
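The documented change drops CSV from the list of input formats supported by `excludedFields`/`includedFields`, and the two fields remain mutually exclusive. A small sketch of that mutual-exclusion rule as a client-side check (`validate_instance_config` is a hypothetical helper, not part of the library):

```python
def validate_instance_config(config):
    """Enforce the rule documented for instanceConfig: when included_fields
    is populated, excluded_fields must be empty, and vice versa."""
    included = config.get("includedFields", [])
    excluded = config.get("excludedFields", [])
    if included and excluded:
        raise ValueError("includedFields and excludedFields are mutually exclusive")
    return config

# A valid config: only includedFields is populated.
ok = validate_instance_config({"instanceType": "array", "includedFields": ["f1", "f2"]})
```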
@@ -558,10 +558,10 @@ <h3>Method Details</h3>
  &quot;instancesFormat&quot;: &quot;A String&quot;, # Required. The format in which instances are given, must be one of the Model&#x27;s supported_input_storage_formats.
  },
  &quot;instanceConfig&quot;: { # Configuration defining how to transform batch prediction input instances to the instances that the Model accepts. # Configuration for how to convert batch prediction input instances to the prediction instances that are sent to the Model.
- &quot;excludedFields&quot;: [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ &quot;excludedFields&quot;: [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
  &quot;A String&quot;,
  ],
- &quot;includedFields&quot;: [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ &quot;includedFields&quot;: [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
  &quot;A String&quot;,
  ],
  &quot;instanceType&quot;: &quot;A String&quot;, # The format of the instance that the Model accepts. Vertex AI will convert compatible batch prediction input instance formats to the specified format. Supported values are: * `object`: Each input is converted to JSON object format. * For `bigquery`, each row is converted to an object. * For `jsonl`, each line of the JSONL input must be an object. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. * `array`: Each input is converted to JSON array format. * For `bigquery`, each row is converted to an array. The order of columns is determined by the BigQuery column order, unless included_fields is populated. included_fields must be populated for specifying field orders. * For `jsonl`, if each line of the JSONL input is an object, included_fields must be populated for specifying field orders. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. If not specified, Vertex AI converts the batch prediction input as follows: * For `bigquery` and `csv`, the behavior is the same as `array`. The order of columns is the same as defined in the file or table, unless included_fields is populated. * For `jsonl`, the prediction instance format is determined by each line of the input. * For `tf-record`/`tf-record-gzip`, each record will be converted to an object in the format of `{&quot;b64&quot;: }`, where `` is the Base64-encoded string of the content of the record. * For `file-list`, each file in the list will be converted to an object in the format of `{&quot;b64&quot;: }`, where `` is the Base64-encoded string of the content of the file.
@@ -870,10 +870,10 @@ <h3>Method Details</h3>
  &quot;instancesFormat&quot;: &quot;A String&quot;, # Required. The format in which instances are given, must be one of the Model&#x27;s supported_input_storage_formats.
  },
  &quot;instanceConfig&quot;: { # Configuration defining how to transform batch prediction input instances to the instances that the Model accepts. # Configuration for how to convert batch prediction input instances to the prediction instances that are sent to the Model.
- &quot;excludedFields&quot;: [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ &quot;excludedFields&quot;: [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
  &quot;A String&quot;,
  ],
- &quot;includedFields&quot;: [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ &quot;includedFields&quot;: [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
  &quot;A String&quot;,
  ],
  &quot;instanceType&quot;: &quot;A String&quot;, # The format of the instance that the Model accepts. Vertex AI will convert compatible batch prediction input instance formats to the specified format. Supported values are: * `object`: Each input is converted to JSON object format. * For `bigquery`, each row is converted to an object. * For `jsonl`, each line of the JSONL input must be an object. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. * `array`: Each input is converted to JSON array format. * For `bigquery`, each row is converted to an array. The order of columns is determined by the BigQuery column order, unless included_fields is populated. included_fields must be populated for specifying field orders. * For `jsonl`, if each line of the JSONL input is an object, included_fields must be populated for specifying field orders. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. If not specified, Vertex AI converts the batch prediction input as follows: * For `bigquery` and `csv`, the behavior is the same as `array`. The order of columns is the same as defined in the file or table, unless included_fields is populated. * For `jsonl`, the prediction instance format is determined by each line of the input. * For `tf-record`/`tf-record-gzip`, each record will be converted to an object in the format of `{&quot;b64&quot;: }`, where `` is the Base64-encoded string of the content of the record. * For `file-list`, each file in the list will be converted to an object in the format of `{&quot;b64&quot;: }`, where `` is the Base64-encoded string of the content of the file.
@@ -1153,10 +1153,10 @@ <h3>Method Details</h3>
  &quot;instancesFormat&quot;: &quot;A String&quot;, # Required. The format in which instances are given, must be one of the Model&#x27;s supported_input_storage_formats.
  },
  &quot;instanceConfig&quot;: { # Configuration defining how to transform batch prediction input instances to the instances that the Model accepts. # Configuration for how to convert batch prediction input instances to the prediction instances that are sent to the Model.
- &quot;excludedFields&quot;: [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ &quot;excludedFields&quot;: [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
  &quot;A String&quot;,
  ],
- &quot;includedFields&quot;: [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ &quot;includedFields&quot;: [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
  &quot;A String&quot;,
  ],
  &quot;instanceType&quot;: &quot;A String&quot;, # The format of the instance that the Model accepts. Vertex AI will convert compatible batch prediction input instance formats to the specified format. Supported values are: * `object`: Each input is converted to JSON object format. * For `bigquery`, each row is converted to an object. * For `jsonl`, each line of the JSONL input must be an object. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. * `array`: Each input is converted to JSON array format. * For `bigquery`, each row is converted to an array. The order of columns is determined by the BigQuery column order, unless included_fields is populated. included_fields must be populated for specifying field orders. * For `jsonl`, if each line of the JSONL input is an object, included_fields must be populated for specifying field orders. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. If not specified, Vertex AI converts the batch prediction input as follows: * For `bigquery` and `csv`, the behavior is the same as `array`. The order of columns is the same as defined in the file or table, unless included_fields is populated. * For `jsonl`, the prediction instance format is determined by each line of the input. * For `tf-record`/`tf-record-gzip`, each record will be converted to an object in the format of `{&quot;b64&quot;: }`, where `` is the Base64-encoded string of the content of the record. * For `file-list`, each file in the list will be converted to an object in the format of `{&quot;b64&quot;: }`, where `` is the Base64-encoded string of the content of the file.
