feat(aiplatform): update the api
#### aiplatform:v1

The following keys were deleted (a sketch of the affected `Part` fields follows this list):
- schemas.GoogleCloudAiplatformV1FunctionCall (Total Keys: 5)
- schemas.GoogleCloudAiplatformV1FunctionResponse (Total Keys: 5)
- schemas.GoogleCloudAiplatformV1Part.properties.functionCall.$ref (Total Keys: 1)
- schemas.GoogleCloudAiplatformV1Part.properties.functionResponse.$ref (Total Keys: 1)
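
With `functionCall` and `functionResponse` removed from the v1 `Part` schema in this revision, tool-calling parts are only described by the v1beta1 surface. A minimal sketch of those part shapes, assuming they mirror the corresponding v1beta1 schemas; the function name and arguments are placeholders:

```python
# Hedged sketch: Part payloads carrying a function call / function response,
# following the v1beta1 shape. As of this revision the v1 discovery schema
# no longer exposes these fields on Part.
function_call_part = {
    "functionCall": {
        "name": "get_weather",           # hypothetical function name
        "args": {"location": "Boston"},  # hypothetical arguments
    }
}

function_response_part = {
    "functionResponse": {
        "name": "get_weather",
        "response": {"temperature_c": 3},  # hypothetical tool output
    }
}
```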

The following keys were added:
- schemas.CloudAiLargeModelsVisionEmbedVideoResponse (Total Keys: 4)
- schemas.CloudAiLargeModelsVisionFilteredText (Total Keys: 7)
- schemas.CloudAiLargeModelsVisionGenerateVideoResponse (Total Keys: 9)
- schemas.CloudAiLargeModelsVisionImage (Total Keys: 13)
- schemas.CloudAiLargeModelsVisionMedia (Total Keys: 7)
- schemas.CloudAiLargeModelsVisionNamedBoundingBox (Total Keys: 17)
- schemas.CloudAiLargeModelsVisionRaiInfo (Total Keys: 7)
- schemas.CloudAiLargeModelsVisionReasonVideoResponse (Total Keys: 8)
- schemas.CloudAiLargeModelsVisionRelativeTemporalPartition (Total Keys: 6)
- schemas.CloudAiLargeModelsVisionSemanticFilterResponse (Total Keys: 5)
- schemas.CloudAiLargeModelsVisionVideo (Total Keys: 5)
- schemas.CloudAiNlLlmProtoServiceCandidate (Total Keys: 10)
- schemas.CloudAiNlLlmProtoServiceCitation (Total Keys: 13)
- schemas.CloudAiNlLlmProtoServiceContent (Total Keys: 5)
- schemas.CloudAiNlLlmProtoServiceFunctionCall (Total Keys: 5)
- schemas.CloudAiNlLlmProtoServiceFunctionResponse (Total Keys: 5)
- schemas.CloudAiNlLlmProtoServiceGenerateMultiModalResponse (Total Keys: 7)
- schemas.CloudAiNlLlmProtoServicePart (Total Keys: 24)
- schemas.CloudAiNlLlmProtoServicePromptFeedback (Total Keys: 6)
- schemas.CloudAiNlLlmProtoServiceSafetyRating (Total Keys: 5)
- schemas.CloudAiNlLlmProtoServiceUsageMetadata (Total Keys: 8)
- schemas.GoogleCloudAiplatformV1CustomJobSpec.properties.models (Total Keys: 2)
- schemas.GoogleCloudAiplatformV1PublisherModel.properties.versionState.type (Total Keys: 1)
- schemas.GoogleCloudAiplatformV1PublisherModelCallToActionDeploy.properties.publicArtifactUri.type (Total Keys: 1)
- schemas.GoogleCloudAiplatformV1PublisherModelResourceReference.properties.useCase.type (Total Keys: 1)
- schemas.GoogleTypeDate (Total Keys: 8)
- schemas.IntelligenceCloudAutomlXpsMetricEntry (Total Keys: 14)
- schemas.IntelligenceCloudAutomlXpsReportingMetrics (Total Keys: 7)

#### aiplatform:v1beta1

The following keys were deleted (all `GenerationConfig` properties; a request-body sketch follows this list):
- schemas.GoogleCloudAiplatformV1beta1GenerationConfig.properties.echo.type (Total Keys: 1)
- schemas.GoogleCloudAiplatformV1beta1GenerationConfig.properties.frequencyPenalty (Total Keys: 2)
- schemas.GoogleCloudAiplatformV1beta1GenerationConfig.properties.logitBias (Total Keys: 3)
- schemas.GoogleCloudAiplatformV1beta1GenerationConfig.properties.logprobs (Total Keys: 2)
- schemas.GoogleCloudAiplatformV1beta1GenerationConfig.properties.presencePenalty (Total Keys: 2)
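
Requests against this revision should drop the removed `GenerationConfig` fields. A minimal sketch of a v1beta1 `generateContent`-style request body under that assumption; the remaining options shown are standard `GenerationConfig` fields and the prompt text is a placeholder:

```python
# Hedged sketch: generationConfig without the fields deleted in this revision
# (echo, frequencyPenalty, logitBias, logprobs, presencePenalty).
body = {
    "contents": [
        {"role": "user", "parts": [{"text": "Summarize this changelog."}]}
    ],
    "generationConfig": {
        "temperature": 0.2,
        "topP": 0.95,
        "topK": 40,
        "maxOutputTokens": 256,
        # "frequencyPenalty": 0.5,  # no longer accepted by the v1beta1 schema
        # "logprobs": 5,            # no longer accepted by the v1beta1 schema
    },
}
```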

The following keys were added:
- resources.projects.resources.locations.resources.publishers.resources.models.methods.getIamPolicy (Total Keys: 14) — see the usage sketch after this list
- schemas.CloudAiLargeModelsVisionEmbedVideoResponse (Total Keys: 4)
- schemas.CloudAiLargeModelsVisionFilteredText (Total Keys: 7)
- schemas.CloudAiLargeModelsVisionGenerateVideoResponse (Total Keys: 9)
- schemas.CloudAiLargeModelsVisionImage (Total Keys: 13)
- schemas.CloudAiLargeModelsVisionMedia (Total Keys: 7)
- schemas.CloudAiLargeModelsVisionNamedBoundingBox (Total Keys: 17)
- schemas.CloudAiLargeModelsVisionRaiInfo (Total Keys: 7)
- schemas.CloudAiLargeModelsVisionReasonVideoResponse (Total Keys: 8)
- schemas.CloudAiLargeModelsVisionRelativeTemporalPartition (Total Keys: 6)
- schemas.CloudAiLargeModelsVisionSemanticFilterResponse (Total Keys: 5)
- schemas.CloudAiLargeModelsVisionVideo (Total Keys: 5)
- schemas.CloudAiNlLlmProtoServiceCandidate (Total Keys: 10)
- schemas.CloudAiNlLlmProtoServiceCitation (Total Keys: 13)
- schemas.CloudAiNlLlmProtoServiceContent (Total Keys: 5)
- schemas.CloudAiNlLlmProtoServiceFunctionCall (Total Keys: 5)
- schemas.CloudAiNlLlmProtoServiceFunctionResponse (Total Keys: 5)
- schemas.CloudAiNlLlmProtoServiceGenerateMultiModalResponse (Total Keys: 7)
- schemas.CloudAiNlLlmProtoServicePart (Total Keys: 24)
- schemas.CloudAiNlLlmProtoServicePromptFeedback (Total Keys: 6)
- schemas.CloudAiNlLlmProtoServiceSafetyRating (Total Keys: 5)
- schemas.CloudAiNlLlmProtoServiceUsageMetadata (Total Keys: 8)
- schemas.GoogleCloudAiplatformV1beta1CustomJobSpec.properties.models (Total Keys: 2)
- schemas.GoogleCloudAiplatformV1beta1GenerateContentRequest.properties.endpoint.deprecated (Total Keys: 1)
- schemas.GoogleCloudAiplatformV1beta1PublisherModel.properties.versionState.type (Total Keys: 1)
- schemas.GoogleCloudAiplatformV1beta1PublisherModelCallToActionDeploy.properties.publicArtifactUri.type (Total Keys: 1)
- schemas.GoogleCloudAiplatformV1beta1PublisherModelResourceReference.properties.useCase.type (Total Keys: 1)
- schemas.IntelligenceCloudAutomlXpsMetricEntry (Total Keys: 14)
- schemas.IntelligenceCloudAutomlXpsReportingMetrics (Total Keys: 7)
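
The added `getIamPolicy` method hangs off `projects.locations.publishers.models` (the item flagged above). A minimal sketch of calling it with the discovery-based client, assuming application-default credentials; the project, location, publisher, and model names are placeholders:

```python
from googleapiclient import discovery

# Build the v1beta1 Vertex AI client (uses application-default credentials).
aiplatform = discovery.build("aiplatform", "v1beta1")

# Placeholder resource name for a publisher model.
resource = (
    "projects/my-project/locations/us-central1/publishers/google/models/my-model"
)

# Fetch the IAM policy for the publisher model.
policy = (
    aiplatform.projects()
    .locations()
    .publishers()
    .models()
    .getIamPolicy(resource=resource)
    .execute()
)
print(policy)
```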
yoshi-automation committed Dec 12, 2023
1 parent 04bc978 commit 6f772fd
Showing 17 changed files with 2,156 additions and 375 deletions.
@@ -288,10 +288,10 @@ <h3>Method Details</h3>
"instancesFormat": "A String", # Required. The format in which instances are given, must be one of the Model's supported_input_storage_formats.
},
"instanceConfig": { # Configuration defining how to transform batch prediction input instances to the instances that the Model accepts. # Configuration for how to convert batch prediction input instances to the prediction instances that are sent to the Model.
- "excludedFields": [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ "excludedFields": [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
"A String",
],
- "includedFields": [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ "includedFields": [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
"A String",
],
"instanceType": "A String", # The format of the instance that the Model accepts. Vertex AI will convert compatible batch prediction input instance formats to the specified format. Supported values are: * `object`: Each input is converted to JSON object format. * For `bigquery`, each row is converted to an object. * For `jsonl`, each line of the JSONL input must be an object. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. * `array`: Each input is converted to JSON array format. * For `bigquery`, each row is converted to an array. The order of columns is determined by the BigQuery column order, unless included_fields is populated. included_fields must be populated for specifying field orders. * For `jsonl`, if each line of the JSONL input is an object, included_fields must be populated for specifying field orders. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. If not specified, Vertex AI converts the batch prediction input as follows: * For `bigquery` and `csv`, the behavior is the same as `array`. The order of columns is the same as defined in the file or table, unless included_fields is populated. * For `jsonl`, the prediction instance format is determined by each line of the input. * For `tf-record`/`tf-record-gzip`, each record will be converted to an object in the format of `{"b64": }`, where `` is the Base64-encoded string of the content of the record. * For `file-list`, each file in the list will be converted to an object in the format of `{"b64": }`, where `` is the Base64-encoded string of the content of the file.
@@ -558,10 +558,10 @@ <h3>Method Details</h3>
"instancesFormat": "A String", # Required. The format in which instances are given, must be one of the Model's supported_input_storage_formats.
},
"instanceConfig": { # Configuration defining how to transform batch prediction input instances to the instances that the Model accepts. # Configuration for how to convert batch prediction input instances to the prediction instances that are sent to the Model.
- "excludedFields": [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ "excludedFields": [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
"A String",
],
- "includedFields": [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ "includedFields": [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
"A String",
],
"instanceType": "A String", # The format of the instance that the Model accepts. Vertex AI will convert compatible batch prediction input instance formats to the specified format. Supported values are: * `object`: Each input is converted to JSON object format. * For `bigquery`, each row is converted to an object. * For `jsonl`, each line of the JSONL input must be an object. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. * `array`: Each input is converted to JSON array format. * For `bigquery`, each row is converted to an array. The order of columns is determined by the BigQuery column order, unless included_fields is populated. included_fields must be populated for specifying field orders. * For `jsonl`, if each line of the JSONL input is an object, included_fields must be populated for specifying field orders. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. If not specified, Vertex AI converts the batch prediction input as follows: * For `bigquery` and `csv`, the behavior is the same as `array`. The order of columns is the same as defined in the file or table, unless included_fields is populated. * For `jsonl`, the prediction instance format is determined by each line of the input. * For `tf-record`/`tf-record-gzip`, each record will be converted to an object in the format of `{"b64": }`, where `` is the Base64-encoded string of the content of the record. * For `file-list`, each file in the list will be converted to an object in the format of `{"b64": }`, where `` is the Base64-encoded string of the content of the file.
@@ -870,10 +870,10 @@ <h3>Method Details</h3>
"instancesFormat": "A String", # Required. The format in which instances are given, must be one of the Model's supported_input_storage_formats.
},
"instanceConfig": { # Configuration defining how to transform batch prediction input instances to the instances that the Model accepts. # Configuration for how to convert batch prediction input instances to the prediction instances that are sent to the Model.
- "excludedFields": [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ "excludedFields": [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
"A String",
],
- "includedFields": [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ "includedFields": [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
"A String",
],
"instanceType": "A String", # The format of the instance that the Model accepts. Vertex AI will convert compatible batch prediction input instance formats to the specified format. Supported values are: * `object`: Each input is converted to JSON object format. * For `bigquery`, each row is converted to an object. * For `jsonl`, each line of the JSONL input must be an object. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. * `array`: Each input is converted to JSON array format. * For `bigquery`, each row is converted to an array. The order of columns is determined by the BigQuery column order, unless included_fields is populated. included_fields must be populated for specifying field orders. * For `jsonl`, if each line of the JSONL input is an object, included_fields must be populated for specifying field orders. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. If not specified, Vertex AI converts the batch prediction input as follows: * For `bigquery` and `csv`, the behavior is the same as `array`. The order of columns is the same as defined in the file or table, unless included_fields is populated. * For `jsonl`, the prediction instance format is determined by each line of the input. * For `tf-record`/`tf-record-gzip`, each record will be converted to an object in the format of `{"b64": }`, where `` is the Base64-encoded string of the content of the record. * For `file-list`, each file in the list will be converted to an object in the format of `{"b64": }`, where `` is the Base64-encoded string of the content of the file.
@@ -1153,10 +1153,10 @@ <h3>Method Details</h3>
"instancesFormat": "A String", # Required. The format in which instances are given, must be one of the Model's supported_input_storage_formats.
},
"instanceConfig": { # Configuration defining how to transform batch prediction input instances to the instances that the Model accepts. # Configuration for how to convert batch prediction input instances to the prediction instances that are sent to the Model.
- "excludedFields": [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ "excludedFields": [ # Fields that will be excluded in the prediction instance that is sent to the Model. Excluded will be attached to the batch prediction output if key_field is not specified. When excluded_fields is populated, included_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
"A String",
],
- "includedFields": [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, CSV, BigQuery or TfRecord.
+ "includedFields": [ # Fields that will be included in the prediction instance that is sent to the Model. If instance_type is `array`, the order of field names in included_fields also determines the order of the values in the array. When included_fields is populated, excluded_fields must be empty. The input must be JSONL with objects at each line, BigQuery or TfRecord.
"A String",
],
"instanceType": "A String", # The format of the instance that the Model accepts. Vertex AI will convert compatible batch prediction input instance formats to the specified format. Supported values are: * `object`: Each input is converted to JSON object format. * For `bigquery`, each row is converted to an object. * For `jsonl`, each line of the JSONL input must be an object. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. * `array`: Each input is converted to JSON array format. * For `bigquery`, each row is converted to an array. The order of columns is determined by the BigQuery column order, unless included_fields is populated. included_fields must be populated for specifying field orders. * For `jsonl`, if each line of the JSONL input is an object, included_fields must be populated for specifying field orders. * Does not apply to `csv`, `file-list`, `tf-record`, or `tf-record-gzip`. If not specified, Vertex AI converts the batch prediction input as follows: * For `bigquery` and `csv`, the behavior is the same as `array`. The order of columns is the same as defined in the file or table, unless included_fields is populated. * For `jsonl`, the prediction instance format is determined by each line of the input. * For `tf-record`/`tf-record-gzip`, each record will be converted to an object in the format of `{"b64": }`, where `` is the Base64-encoded string of the content of the record. * For `file-list`, each file in the list will be converted to an object in the format of `{"b64": }`, where `` is the Base64-encoded string of the content of the file.
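
The hunks above only remove `CSV` from the list of inputs that support `excludedFields`/`includedFields`; the fields themselves behave as documented. A minimal sketch of where `instanceConfig` sits in a `BatchPredictionJob` request body, assuming JSONL input in Cloud Storage; all resource names, bucket paths, and field names are placeholders:

```python
# Hedged sketch: instanceConfig inside a BatchPredictionJob body, following
# the documented semantics (only one of includedFields / excludedFields set).
batch_prediction_job = {
    "displayName": "example-batch-job",
    "model": "projects/my-project/locations/us-central1/models/1234567890",
    "inputConfig": {
        "instancesFormat": "jsonl",
        "gcsSource": {"uris": ["gs://my-bucket/instances.jsonl"]},
    },
    "instanceConfig": {
        # Convert each JSONL object to a JSON array; includedFields order
        # determines the order of values in the array.
        "instanceType": "array",
        "includedFields": ["feature_a", "feature_b"],
        # excludedFields must stay empty when includedFields is populated.
    },
    "outputConfig": {
        "predictionsFormat": "jsonl",
        "gcsDestination": {"outputUriPrefix": "gs://my-bucket/output/"},
    },
}
```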
