This repository has been archived by the owner on Nov 29, 2023. It is now read-only.

chore: transition the library to microgenerator (#56)
* chore: remove old GAPIC code

* Regenerate the library with microgenerator

* Fix docs toctree includes

* Update Python version compatibility in README

* Adjust samples

* Fix datatransfer shim unit test

* Reduce required coverage threshold

The generated code tests do not cover all code paths after all...

* Simplify TransferConfig instantiation in sample

* Add UPGRADING guide

* Update UPGRADING.md (method name)

Co-authored-by: Bu Sun Kim <8822365+busunkim96@users.noreply.github.com>

plamut and busunkim96 committed Sep 21, 2020
1 parent b2c296e commit ad6d893
Showing 60 changed files with 11,686 additions and 11,930 deletions.
43 changes: 43 additions & 0 deletions .kokoro/populate-secrets.sh
Original file line number Diff line number Diff line change
@@ -0,0 +1,43 @@
#!/bin/bash
# Copyright 2020 Google LLC.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

set -eo pipefail

function now { date +"%Y-%m-%d %H:%M:%S" | tr -d '\n' ;}
function msg { println "$*" >&2 ;}
function println { printf '%s\n' "$(now) $*" ;}


# Populates requested secrets set in SECRET_MANAGER_KEYS from service account:
# kokoro-trampoline@cloud-devrel-kokoro-resources.iam.gserviceaccount.com
SECRET_LOCATION="${KOKORO_GFILE_DIR}/secret_manager"
msg "Creating folder on disk for secrets: ${SECRET_LOCATION}"
mkdir -p ${SECRET_LOCATION}
for key in $(echo ${SECRET_MANAGER_KEYS} | sed "s/,/ /g")
do
  msg "Retrieving secret ${key}"
  docker run --entrypoint=gcloud \
    --volume=${KOKORO_GFILE_DIR}:${KOKORO_GFILE_DIR} \
    gcr.io/google.com/cloudsdktool/cloud-sdk \
    secrets versions access latest \
    --project cloud-devrel-kokoro-resources \
    --secret ${key} > \
    "${SECRET_LOCATION}/${key}"
  if [[ $? == 0 ]]; then
    msg "Secret written to ${SECRET_LOCATION}/${key}"
  else
    msg "Error retrieving secret ${key}"
  fi
done
50 changes: 13 additions & 37 deletions .kokoro/release/common.cfg
@@ -23,42 +23,18 @@ env_vars: {
value: "github/python-bigquery-datatransfer/.kokoro/release.sh"
}

# Fetch the token needed for reporting release status to GitHub
before_action {
fetch_keystore {
keystore_resource {
keystore_config_id: 73713
keyname: "yoshi-automation-github-key"
}
}
}

# Fetch PyPI password
before_action {
fetch_keystore {
keystore_resource {
keystore_config_id: 73713
keyname: "google_cloud_pypi_password"
}
}
}

# Fetch magictoken to use with Magic Github Proxy
before_action {
fetch_keystore {
keystore_resource {
keystore_config_id: 73713
keyname: "releasetool-magictoken"
}
}
# Fetch PyPI password
before_action {
fetch_keystore {
keystore_resource {
keystore_config_id: 73713
keyname: "google_cloud_pypi_password"
}
}
}

# Fetch api key to use with Magic Github Proxy
before_action {
fetch_keystore {
keystore_resource {
keystore_config_id: 73713
keyname: "magic-github-proxy-api-key"
}
}
}
# Tokens needed to report release status back to GitHub
env_vars: {
key: "SECRET_MANAGER_KEYS"
value: "releasetool-publish-reporter-app,releasetool-publish-reporter-googleapis-installation,releasetool-publish-reporter-pem"
}
15 changes: 10 additions & 5 deletions .kokoro/trampoline.sh
@@ -15,9 +15,14 @@

set -eo pipefail

python3 "${KOKORO_GFILE_DIR}/trampoline_v1.py" || ret_code=$?
# Always run the cleanup script, regardless of the success of bouncing into
# the container.
function cleanup() {
  chmod +x ${KOKORO_GFILE_DIR}/trampoline_cleanup.sh
  ${KOKORO_GFILE_DIR}/trampoline_cleanup.sh
  echo "cleanup";
}
trap cleanup EXIT

chmod +x ${KOKORO_GFILE_DIR}/trampoline_cleanup.sh
${KOKORO_GFILE_DIR}/trampoline_cleanup.sh || true

exit ${ret_code}
$(dirname $0)/populate-secrets.sh # Secret Manager secrets.
python3 "${KOKORO_GFILE_DIR}/trampoline_v1.py"
7 changes: 5 additions & 2 deletions README.rst
@@ -48,11 +48,14 @@ dependencies.

Supported Python Versions
^^^^^^^^^^^^^^^^^^^^^^^^^
Python >= 3.5
Python >= 3.6

Deprecated Python Versions
^^^^^^^^^^^^^^^^^^^^^^^^^^
Python == 2.7. Python 2.7 support will be removed on January 1, 2020.
Python == 2.7.

The last version of this library compatible with Python 2.7 is
``google-cloud-bigquery-datatransfer==1.1.1``.


Mac/Linux
211 changes: 211 additions & 0 deletions UPGRADING.md
@@ -0,0 +1,211 @@
<!--
Copyright 2020 Google LLC
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
https://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->


# 2.0.0 Migration Guide

The 2.0 release of the `google-cloud-bigquery-datatransfer` client is a significant
upgrade based on a [next-gen code generator](https://github.com/googleapis/gapic-generator-python),
and includes substantial interface changes. Existing code written for earlier versions
of this library will likely require updates to use this version. This document
describes the changes that have been made, and what you need to do to update your usage.

If you experience issues or have questions, please file an
[issue](https://github.com/googleapis/python-bigquery-datatransfer/issues).


## Supported Python Versions

> **WARNING**: Breaking change
The 2.0.0 release requires Python 3.6+.


## Import Path

> **WARNING**: Breaking change
The library was moved into the `google.cloud.bigquery` namespace, so existing imports
need to be updated.

**Before:**
```py
from google.cloud import bigquery_datatransfer
from google.cloud import bigquery_datatransfer_v1
```

**After:**
```py
from google.cloud.bigquery import datatransfer
from google.cloud.bigquery import datatransfer_v1
```


## Method Calls

> **WARNING**: Breaking change
Methods that send requests to the backend expect request objects. We provide a script
that will convert most common use cases.

* Install the library

```sh
python3 -m pip install google-cloud-bigquery-datatransfer
```

* The script `fixup_datatransfer_v1_keywords.py` is shipped with the library. It expects
an input directory (with the code to convert) and an empty destination directory.

```sh
$ scripts/fixup_datatransfer_v1_keywords.py --input-directory .samples/ --output-directory samples/
```

**Before:**
```py
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()

parent_project = "..."
transfer_config = {...}
authorization_code = "..."

response = client.create_transfer_config(
    parent_project, transfer_config, authorization_code=authorization_code
)
```


**After:**
```py
from google.cloud.bigquery import datatransfer

client = datatransfer.DataTransferServiceClient()

parent_project = "..."
transfer_config = {...}
authorization_code = "..."

response = client.create_transfer_config(
    request={
        "parent": parent_project,
        "transfer_config": transfer_config,
        "authorization_code": authorization_code,
    }
)
```

### More Details

In `google-cloud-bigquery-datatransfer<2.0.0`, parameters required by the API were positional
parameters and optional parameters were keyword parameters.

**Before:**
```py
def create_transfer_config(
    self,
    parent,
    transfer_config,
    authorization_code=None,
    version_info=None,
    service_account_name=None,
    retry=google.api_core.gapic_v1.method.DEFAULT,
    timeout=google.api_core.gapic_v1.method.DEFAULT,
    metadata=None,
):
```

In the `2.0.0` release, methods that interact with the backend have a single
positional parameter `request`. Method docstrings indicate whether a parameter is
required or optional.

Some methods have additional keyword-only parameters. The available parameters depend
on the [`google.api.method_signature` annotation](https://github.com/googleapis/python-bigquery-datatransfer/blob/master/google/cloud/bigquery_datatransfer_v1/proto/datatransfer.proto#L80)
specified by the API producer.


**After:**
```py
def create_transfer_config(
    self,
    request: datatransfer.CreateTransferConfigRequest = None,
    *,
    parent: str = None,
    transfer_config: transfer.TransferConfig = None,
    retry: retries.Retry = gapic_v1.method.DEFAULT,
    timeout: float = None,
    metadata: Sequence[Tuple[str, str]] = (),
) -> transfer.TransferConfig:
```

> **NOTE:** The `request` parameter and flattened keyword parameters for the API are
> mutually exclusive. Passing both will result in an error.

Both of these calls are valid:

```py
response = client.create_transfer_config(
    request={
        "parent": project_path,
        "transfer_config": {"foo": "bar"},
    }
)
```

```py
response = client.create_transfer_config(
    parent=project_path,
    transfer_config={"foo": "bar"},
)
```

This call is _invalid_ because it mixes `request` with a keyword argument `transfer_config`.
Executing this code will result in an error:

```py
response = client.create_transfer_config(
    request={"parent": project_path},
    transfer_config={"foo": "bar"},
)
```

> **NOTE:** The `request` parameter of some methods can also contain a richer set of
> options that are not available as explicit keyword-only parameters; these _must_ be
> passed through `request`.

## Removed Utility Methods

> **WARNING**: Breaking change
Most utility methods such as `project_path()` have been removed. The paths must
now be constructed manually:

```py
project_path = f"projects/{PROJECT_ID}"
```

The only two that remain are `transfer_config_path()` and `parse_transfer_config_path()`.
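
For the other removed helpers, a small local stand-in is easy to keep. A minimal sketch (the helper names here are hypothetical, and the `projects/{id}` format follows the usual Google Cloud resource-path convention; check each method's docstring for the exact format it expects):

```py
# Hypothetical local replacements for the removed path helpers.
def project_path(project_id: str) -> str:
    """Build the resource path the API expects as a ``parent`` argument."""
    return f"projects/{project_id}"


def parse_project_path(path: str) -> str:
    """Inverse helper: recover the project ID from a resource path."""
    prefix = "projects/"
    if not path.startswith(prefix):
        raise ValueError(f"not a project path: {path!r}")
    return path[len(prefix):]
```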


## Removed `client_config` Parameter

The client can no longer be constructed with the `client_config` argument; this
deprecated argument has been removed. To customize retry and timeout settings for a
particular method, pass custom `retry` and `timeout` arguments when invoking it.
1 change: 1 addition & 0 deletions docs/UPGRADING.md
6 changes: 6 additions & 0 deletions docs/datatransfer_v1/services.rst
Original file line number Diff line number Diff line change
@@ -0,0 +1,6 @@
Services for Google Cloud Bigquery Datatransfer v1 API
======================================================

.. automodule:: google.cloud.bigquery.datatransfer_v1.services.data_transfer_service
:members:
:inherited-members:
5 changes: 5 additions & 0 deletions docs/datatransfer_v1/types.rst
Original file line number Diff line number Diff line change
@@ -0,0 +1,5 @@
Types for Google Cloud Bigquery Datatransfer v1 API
===================================================

.. automodule:: google.cloud.bigquery.datatransfer_v1.types
:members:
6 changes: 0 additions & 6 deletions docs/gapic/v1/api.rst

This file was deleted.

5 changes: 0 additions & 5 deletions docs/gapic/v1/types.rst

This file was deleted.
