---
subcategory: "BigQuery"
page_title: "Google: google_bigquery_dataset"
description: |-
  Datasets allow you to organize and control access to your tables.
---

# google_bigquery_dataset

Datasets allow you to organize and control access to your tables.

To get more information about Dataset, see:

* [API documentation](https://cloud.google.com/bigquery/docs/reference/rest/v2/datasets)
* How-to Guides
    * [Datasets Intro](https://cloud.google.com/bigquery/docs/datasets-intro)

~> **Warning:** You must specify the `role` field using the legacy format `OWNER` instead of `roles/bigquery.dataOwner`. The API accepts both formats but always returns the legacy format, which results in Terraform showing a permanent diff on each plan and apply operation.

## Example Usage - Bigquery Dataset Basic

```hcl
resource "google_bigquery_dataset" "dataset" {
  dataset_id                  = "example_dataset"
  friendly_name               = "test"
  description                 = "This is a test description"
  location                    = "EU"
  default_table_expiration_ms = 3600000

  labels = {
    env = "default"
  }

  access {
    role          = "OWNER"
    user_by_email = google_service_account.bqowner.email
  }

  access {
    role   = "READER"
    domain = "hashicorp.com"
  }
}

resource "google_service_account" "bqowner" {
  account_id = "bqowner"
}
```

## Example Usage - Bigquery Dataset Cmek

```hcl
resource "google_bigquery_dataset" "dataset" {
  dataset_id                  = "example_dataset"
  friendly_name               = "test"
  description                 = "This is a test description"
  location                    = "US"
  default_table_expiration_ms = 3600000

  default_encryption_configuration {
    kms_key_name = google_kms_crypto_key.crypto_key.id
  }
}

resource "google_kms_crypto_key" "crypto_key" {
  name     = "example-key"
  key_ring = google_kms_key_ring.key_ring.id
}

resource "google_kms_key_ring" "key_ring" {
  name     = "example-keyring"
  location = "us"
}
```
## Example Usage - Bigquery Dataset Authorized Dataset

```hcl
resource "google_bigquery_dataset" "public" {
  dataset_id                  = "public"
  friendly_name               = "test"
  description                 = "This dataset is public"
  location                    = "EU"
  default_table_expiration_ms = 3600000

  labels = {
    env = "default"
  }

  access {
    role          = "OWNER"
    user_by_email = google_service_account.bqowner.email
  }

  access {
    role   = "READER"
    domain = "hashicorp.com"
  }
}

resource "google_bigquery_dataset" "dataset" {
  dataset_id                  = "private"
  friendly_name               = "test"
  description                 = "This dataset is private"
  location                    = "EU"
  default_table_expiration_ms = 3600000

  labels = {
    env = "default"
  }

  access {
    role          = "OWNER"
    user_by_email = google_service_account.bqowner.email
  }

  access {
    role   = "READER"
    domain = "hashicorp.com"
  }

  access {
    dataset {
      dataset {
        project_id = google_bigquery_dataset.public.project
        dataset_id = google_bigquery_dataset.public.dataset_id
      }
      target_types = ["VIEWS"]
    }
  }
}

resource "google_service_account" "bqowner" {
  account_id = "bqowner"
}
```

## Argument Reference

The following arguments are supported:

* `dataset_id` - (Required) A unique ID for this dataset, without the project name. The ID must contain only letters (a-z, A-Z), numbers (0-9), or underscores (_). The maximum length is 1,024 characters.

* `max_time_travel_hours` - (Optional) Defines the time travel window in hours. The value can be from 48 to 168 hours (2 to 7 days).

* `access` - (Optional) An array of objects that define dataset access for one or more entities. Structure is documented below.

* `default_table_expiration_ms` - (Optional) The default lifetime of all tables in the dataset, in milliseconds. The minimum value is 3600000 milliseconds (one hour).

  Once this property is set, all newly-created tables in the dataset will have an `expirationTime` property set to the creation time plus the value in this property, and changing the value will only affect new tables, not existing ones. When the `expirationTime` for a given table is reached, that table will be deleted automatically. If a table's `expirationTime` is modified or removed before the table expires, or if you provide an explicit `expirationTime` when creating a table, that value takes precedence over the default expiration time indicated by this property.

* `default_partition_expiration_ms` - (Optional) The default partition expiration for all partitioned tables in the dataset, in milliseconds.

  Once this property is set, all newly-created partitioned tables in the dataset will have an `expirationMs` property in the `timePartitioning` settings set to this value, and changing the value will only affect new tables, not existing ones. The storage in a partition will have an expiration time of its partition time plus this value. Setting this property overrides the use of `defaultTableExpirationMs` for partitioned tables: only one of `defaultTableExpirationMs` and `defaultPartitionExpirationMs` will be used for any new partitioned table. If you provide an explicit `timePartitioning.expirationMs` when creating or updating a partitioned table, that value takes precedence over the default partition expiration time indicated by this property.

* `description` - (Optional) A user-friendly description of the dataset.

* `friendly_name` - (Optional) A descriptive name for the dataset.

* `labels` - (Optional) The labels associated with this dataset. You can use these to organize and group your datasets.

* `location` - (Optional) The geographic location where the dataset should reside. See official docs.

  There are two types of locations: regional and multi-regional. A regional location is a specific geographic place, such as Tokyo, and a multi-regional location is a large geographic area, such as the United States, that contains at least two geographic places.

  The default value is the multi-regional location `US`. Changing this forces a new resource to be created.

* `default_encryption_configuration` - (Optional) The default encryption key for all tables in the dataset. Once this property is set, all newly-created partitioned tables in the dataset will have their encryption key set to this value, unless the table creation request (or query) overrides the key. Structure is documented below.

* `project` - (Optional) The ID of the project in which the resource belongs. If it is not provided, the provider project is used.

* `delete_contents_on_destroy` - (Optional) If set to `true`, delete all the tables in the dataset when destroying the resource; otherwise, destroying the resource will fail if tables are present (a combined sketch follows this list).
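
The following is a minimal sketch combining several of the optional arguments above; the dataset ID, label, and numeric values are placeholders rather than recommendations:

```hcl
resource "google_bigquery_dataset" "example" {
  dataset_id  = "example_dataset" # placeholder ID
  description = "Dataset with explicit expiration and time travel settings"
  location    = "US"

  # New tables default to a one-hour lifetime; partitioned tables use the
  # partition expiration (one day here) instead of the table expiration.
  default_table_expiration_ms     = 3600000
  default_partition_expiration_ms = 86400000

  # Time travel window of 72 hours (allowed range is 48 to 168).
  max_time_travel_hours = 72

  # Allow destroy to proceed even if the dataset still contains tables.
  delete_contents_on_destroy = true

  labels = {
    env = "default"
  }
}
```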

The `access` block supports:

* `domain` - (Optional) A domain to grant access to. Any users signed in with the specified domain will be granted the specified access.

* `group_by_email` - (Optional) An email address of a Google Group to grant access to.

* `role` - (Optional) Describes the rights granted to the user specified by the other member of the access object. Basic, predefined, and custom roles are supported. Predefined roles that have equivalent basic roles are swapped by the API to their basic counterparts. See official docs.

* `special_group` - (Optional) A special group to grant access to (see the sketch after this list). Possible values include:

  * `projectOwners`: Owners of the enclosing project.

  * `projectReaders`: Readers of the enclosing project.

  * `projectWriters`: Writers of the enclosing project.

  * `allAuthenticatedUsers`: All authenticated BigQuery users.

* `user_by_email` - (Optional) An email address of a user to grant access to. For example: fred@example.com.

* `view` - (Optional) A view from a different dataset to grant access to. Queries executed against that view will have read access to tables in this dataset. The `role` field is not required when this field is set. If that view is updated by any user, access to the view needs to be granted again via an update operation. Structure is documented below.

* `dataset` - (Optional) Grants all resources of particular types in a particular dataset read access to the current dataset. Structure is documented below.
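
The sketch below shows group- and special-group-based `access` entries; the dataset ID and group address are placeholders:

```hcl
resource "google_bigquery_dataset" "example" {
  dataset_id = "example_dataset" # placeholder ID
  location   = "EU"

  # Read access for every member of a Google Group (placeholder address).
  access {
    role           = "READER"
    group_by_email = "analysts@example.com"
  }

  # Write access for all writers of the enclosing project.
  access {
    role          = "WRITER"
    special_group = "projectWriters"
  }
}
```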

The `view` block supports:

* `dataset_id` - (Required) The ID of the dataset containing this table.

* `project_id` - (Required) The ID of the project containing this table.

* `table_id` - (Required) The ID of the table. The ID must contain only letters (a-z, A-Z), numbers (0-9), or underscores (_). The maximum length is 1,024 characters.
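
For example, a `view` grant that authorizes an existing view (here a hypothetical `google_bigquery_table.example_view` defined elsewhere in the configuration) might look like this sketch:

```hcl
resource "google_bigquery_dataset" "example" {
  dataset_id = "example_dataset" # placeholder ID
  location   = "EU"

  # Authorize the view to read this dataset; no `role` is needed for view grants.
  access {
    view {
      project_id = google_bigquery_table.example_view.project
      dataset_id = google_bigquery_table.example_view.dataset_id
      table_id   = google_bigquery_table.example_view.table_id
    }
  }
}
```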

The `dataset` block supports:

* `dataset` - (Required) The dataset this entry applies to. Structure is documented below.

* `target_types` - (Required) Which resources in the dataset this entry applies to. Currently, only views are supported, but additional target types may be added in the future. Possible values: `VIEWS`.

The nested `dataset` block supports:

* `dataset_id` - (Required) The ID of the dataset containing this table.

* `project_id` - (Required) The ID of the project containing this table.

The `default_encryption_configuration` block supports:

* `kms_key_name` - (Required) Describes the Cloud KMS encryption key that will be used to protect the destination BigQuery table. The BigQuery Service Account associated with your project requires access to this encryption key.

## Attributes Reference

In addition to the arguments listed above, the following computed attributes are exported:

* `id` - an identifier for the resource with format `projects/{{project}}/datasets/{{dataset_id}}`

* `creation_time` - The time when this dataset was created, in milliseconds since the epoch.

* `etag` - A hash of the resource.

* `last_modified_time` - The date when this dataset or any of its tables was last modified, in milliseconds since the epoch.

* `self_link` - The URI of the created resource.
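
These computed attributes can be referenced from the rest of a configuration, for example surfaced as outputs (assuming the `dataset` resource from the examples above):

```hcl
# Expose selected computed attributes of the dataset.
output "dataset_self_link" {
  value = google_bigquery_dataset.dataset.self_link
}

output "dataset_last_modified_time" {
  value = google_bigquery_dataset.dataset.last_modified_time
}
```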

## Timeouts

This resource provides the following Timeouts configuration options:

* `create` - Default is 20 minutes.
* `update` - Default is 20 minutes.
* `delete` - Default is 20 minutes.
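
These defaults can be overridden with a standard `timeouts` block; the 30-minute values below are arbitrary placeholders:

```hcl
resource "google_bigquery_dataset" "example" {
  dataset_id = "example_dataset" # placeholder ID
  location   = "US"

  # Extend the create and delete timeouts beyond the 20-minute defaults.
  timeouts {
    create = "30m"
    delete = "30m"
  }
}
```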

## Import

Dataset can be imported using any of these accepted formats:

```
$ terraform import google_bigquery_dataset.default projects/{{project}}/datasets/{{dataset_id}}
$ terraform import google_bigquery_dataset.default {{project}}/{{dataset_id}}
$ terraform import google_bigquery_dataset.default {{dataset_id}}
```

## User Project Overrides

This resource supports User Project Overrides.
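
As a sketch, a user project override is typically enabled at the provider level; the billing project ID below is a placeholder:

```hcl
provider "google" {
  # Bill and attribute quota for API calls to this project instead of the
  # resource's own project.
  user_project_override = true
  billing_project       = "my-billing-project" # placeholder project ID
}
```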