This repository has been archived by the owner on Nov 29, 2023. It is now read-only.

chore: Adjust gapic namespace to google.cloud.bigquery_datatransfer #61

Merged 3 commits on Sep 29, 2020
19 changes: 0 additions & 19 deletions CONTRIBUTING.rst
@@ -80,25 +80,6 @@ We use `nox <https://nox.readthedocs.io/en/latest/>`__ to instrument our tests.

.. nox: https://pypi.org/project/nox/

Note on Editable Installs / Develop Mode
========================================

- As mentioned previously, using ``setuptools`` in `develop mode`_
or a ``pip`` `editable install`_ is not possible with this
library. This is because this library uses `namespace packages`_.
For context see `Issue #2316`_ and the relevant `PyPA issue`_.

Since ``editable`` / ``develop`` mode can't be used, packages
need to be installed directly. Hence your changes to the source
tree don't get incorporated into the **already installed**
package.

.. _namespace packages: https://www.python.org/dev/peps/pep-0420/
.. _Issue #2316: https://github.com/GoogleCloudPlatform/google-cloud-python/issues/2316
.. _PyPA issue: https://github.com/pypa/packaging-problems/issues/12
.. _develop mode: https://setuptools.readthedocs.io/en/latest/setuptools.html#development-mode
.. _editable install: https://pip.pypa.io/en/stable/reference/pip_install/#editable-installs

*****************************************
I'm getting weird errors... Can you help?
*****************************************
20 changes: 18 additions & 2 deletions UPGRADING.md
@@ -38,8 +38,24 @@ The 2.0.0 release requires Python 3.6+.

> **WARNING**: Breaking change

The library was moved into the `google.cloud.bigquery` namespace. Existing imports
need to be updated.

### Version 2.1.0

The library had its old namespace restored, since importing from
`google.cloud.bigquery` clashed with the `google-cloud-bigquery` library when the
latter was also installed.

The import paths that were changed in version `2.0.0` should be reverted:

```py
from google.cloud import bigquery_datatransfer
from google.cloud import bigquery_datatransfer_v1
```
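
For illustration, here is a minimal usage sketch under the restored namespace (it assumes `google-cloud-bigquery-datatransfer>=2.1.0` is installed; the project ID is a placeholder):

```py
from google.cloud import bigquery_datatransfer

# Client construction is unchanged apart from the import path.
client = bigquery_datatransfer.DataTransferServiceClient()

# List the transfer configs under a placeholder project.
parent = "projects/your-project-id"
for config in client.list_transfer_configs(parent=parent):
    print(config.display_name)
```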
Comment on lines +42 to +53 (Contributor Author):

This assumes we will release the namespace adjustment in v2.1.0.

Version 2.1.0 can be viewed as a non-trivial "fix" of the v2.0.0 release, but if we want to release it as v3.0.0, please comment below.


### Version 2.0.0

(obsolete) The library was moved into the `google.cloud.bigquery` namespace. Existing
imports need to be updated, unless you are using version `>=2.1.0`.

**Before:**
```py
# ...
```
@@ -1,6 +1,6 @@
Services for Google Cloud Bigquery Datatransfer v1 API
======================================================

.. automodule:: google.cloud.bigquery.datatransfer_v1.services.data_transfer_service
.. automodule:: google.cloud.bigquery_datatransfer_v1.services.data_transfer_service
:members:
:inherited-members:
@@ -1,5 +1,5 @@
Types for Google Cloud Bigquery Datatransfer v1 API
===================================================

.. automodule:: google.cloud.bigquery.datatransfer_v1.types
.. automodule:: google.cloud.bigquery_datatransfer_v1.types
:members:
1 change: 1 addition & 0 deletions docs/conf.py
@@ -39,6 +39,7 @@
"sphinx.ext.autosummary",
"sphinx.ext.intersphinx",
"sphinx.ext.coverage",
"sphinx.ext.doctest",
"sphinx.ext.napoleon",
"sphinx.ext.todo",
"sphinx.ext.viewcode",
4 changes: 2 additions & 2 deletions docs/index.rst
@@ -8,8 +8,8 @@ API Reference
.. toctree::
:maxdepth: 2

Client <datatransfer_v1/services>
Types <datatransfer_v1/types>
Client <bigquery_datatransfer_v1/services>
Types <bigquery_datatransfer_v1/types>


Migration Guide
@@ -15,84 +15,84 @@
# limitations under the License.
#

from google.cloud.bigquery.datatransfer_v1.services.data_transfer_service.async_client import (
from google.cloud.bigquery_datatransfer_v1.services.data_transfer_service.async_client import (
DataTransferServiceAsyncClient,
)
from google.cloud.bigquery.datatransfer_v1.services.data_transfer_service.client import (
from google.cloud.bigquery_datatransfer_v1.services.data_transfer_service.client import (
DataTransferServiceClient,
)
from google.cloud.bigquery.datatransfer_v1.types.datatransfer import (
from google.cloud.bigquery_datatransfer_v1.types.datatransfer import (
CheckValidCredsRequest,
)
from google.cloud.bigquery.datatransfer_v1.types.datatransfer import (
from google.cloud.bigquery_datatransfer_v1.types.datatransfer import (
CheckValidCredsResponse,
)
from google.cloud.bigquery.datatransfer_v1.types.datatransfer import (
from google.cloud.bigquery_datatransfer_v1.types.datatransfer import (
CreateTransferConfigRequest,
)
from google.cloud.bigquery.datatransfer_v1.types.datatransfer import DataSource
from google.cloud.bigquery.datatransfer_v1.types.datatransfer import DataSourceParameter
from google.cloud.bigquery.datatransfer_v1.types.datatransfer import (
from google.cloud.bigquery_datatransfer_v1.types.datatransfer import DataSource
from google.cloud.bigquery_datatransfer_v1.types.datatransfer import DataSourceParameter
from google.cloud.bigquery_datatransfer_v1.types.datatransfer import (
DeleteTransferConfigRequest,
)
from google.cloud.bigquery.datatransfer_v1.types.datatransfer import (
from google.cloud.bigquery_datatransfer_v1.types.datatransfer import (
DeleteTransferRunRequest,
)
from google.cloud.bigquery.datatransfer_v1.types.datatransfer import (
from google.cloud.bigquery_datatransfer_v1.types.datatransfer import (
GetDataSourceRequest,
)
from google.cloud.bigquery.datatransfer_v1.types.datatransfer import (
from google.cloud.bigquery_datatransfer_v1.types.datatransfer import (
GetTransferConfigRequest,
)
from google.cloud.bigquery.datatransfer_v1.types.datatransfer import (
from google.cloud.bigquery_datatransfer_v1.types.datatransfer import (
GetTransferRunRequest,
)
from google.cloud.bigquery.datatransfer_v1.types.datatransfer import (
from google.cloud.bigquery_datatransfer_v1.types.datatransfer import (
ListDataSourcesRequest,
)
from google.cloud.bigquery.datatransfer_v1.types.datatransfer import (
from google.cloud.bigquery_datatransfer_v1.types.datatransfer import (
ListDataSourcesResponse,
)
from google.cloud.bigquery.datatransfer_v1.types.datatransfer import (
from google.cloud.bigquery_datatransfer_v1.types.datatransfer import (
ListTransferConfigsRequest,
)
from google.cloud.bigquery.datatransfer_v1.types.datatransfer import (
from google.cloud.bigquery_datatransfer_v1.types.datatransfer import (
ListTransferConfigsResponse,
)
from google.cloud.bigquery.datatransfer_v1.types.datatransfer import (
from google.cloud.bigquery_datatransfer_v1.types.datatransfer import (
ListTransferLogsRequest,
)
from google.cloud.bigquery.datatransfer_v1.types.datatransfer import (
from google.cloud.bigquery_datatransfer_v1.types.datatransfer import (
ListTransferLogsResponse,
)
from google.cloud.bigquery.datatransfer_v1.types.datatransfer import (
from google.cloud.bigquery_datatransfer_v1.types.datatransfer import (
ListTransferRunsRequest,
)
from google.cloud.bigquery.datatransfer_v1.types.datatransfer import (
from google.cloud.bigquery_datatransfer_v1.types.datatransfer import (
ListTransferRunsResponse,
)
from google.cloud.bigquery.datatransfer_v1.types.datatransfer import (
from google.cloud.bigquery_datatransfer_v1.types.datatransfer import (
ScheduleTransferRunsRequest,
)
from google.cloud.bigquery.datatransfer_v1.types.datatransfer import (
from google.cloud.bigquery_datatransfer_v1.types.datatransfer import (
ScheduleTransferRunsResponse,
)
from google.cloud.bigquery.datatransfer_v1.types.datatransfer import (
from google.cloud.bigquery_datatransfer_v1.types.datatransfer import (
StartManualTransferRunsRequest,
)
from google.cloud.bigquery.datatransfer_v1.types.datatransfer import (
from google.cloud.bigquery_datatransfer_v1.types.datatransfer import (
StartManualTransferRunsResponse,
)
from google.cloud.bigquery.datatransfer_v1.types.datatransfer import (
from google.cloud.bigquery_datatransfer_v1.types.datatransfer import (
UpdateTransferConfigRequest,
)
from google.cloud.bigquery.datatransfer_v1.types.transfer import EmailPreferences
from google.cloud.bigquery.datatransfer_v1.types.transfer import ScheduleOptions
from google.cloud.bigquery.datatransfer_v1.types.transfer import TransferConfig
from google.cloud.bigquery.datatransfer_v1.types.transfer import TransferMessage
from google.cloud.bigquery.datatransfer_v1.types.transfer import TransferRun
from google.cloud.bigquery.datatransfer_v1.types.transfer import TransferState
from google.cloud.bigquery.datatransfer_v1.types.transfer import TransferType
from google.cloud.bigquery_datatransfer_v1.types.transfer import EmailPreferences
from google.cloud.bigquery_datatransfer_v1.types.transfer import ScheduleOptions
from google.cloud.bigquery_datatransfer_v1.types.transfer import TransferConfig
from google.cloud.bigquery_datatransfer_v1.types.transfer import TransferMessage
from google.cloud.bigquery_datatransfer_v1.types.transfer import TransferRun
from google.cloud.bigquery_datatransfer_v1.types.transfer import TransferState
from google.cloud.bigquery_datatransfer_v1.types.transfer import TransferType

__all__ = (
"CheckValidCredsRequest",
@@ -28,9 +28,9 @@
from google.auth import credentials # type: ignore
from google.oauth2 import service_account # type: ignore

from google.cloud.bigquery.datatransfer_v1.services.data_transfer_service import pagers
from google.cloud.bigquery.datatransfer_v1.types import datatransfer
from google.cloud.bigquery.datatransfer_v1.types import transfer
from google.cloud.bigquery_datatransfer_v1.services.data_transfer_service import pagers
from google.cloud.bigquery_datatransfer_v1.types import datatransfer
from google.cloud.bigquery_datatransfer_v1.types import transfer
from google.protobuf import duration_pb2 as duration # type: ignore
from google.protobuf import field_mask_pb2 as field_mask # type: ignore
from google.protobuf import struct_pb2 as struct # type: ignore
@@ -32,9 +32,9 @@
from google.auth.exceptions import MutualTLSChannelError # type: ignore
from google.oauth2 import service_account # type: ignore

from google.cloud.bigquery.datatransfer_v1.services.data_transfer_service import pagers
from google.cloud.bigquery.datatransfer_v1.types import datatransfer
from google.cloud.bigquery.datatransfer_v1.types import transfer
from google.cloud.bigquery_datatransfer_v1.services.data_transfer_service import pagers
from google.cloud.bigquery_datatransfer_v1.types import datatransfer
from google.cloud.bigquery_datatransfer_v1.types import transfer
from google.protobuf import duration_pb2 as duration # type: ignore
from google.protobuf import field_mask_pb2 as field_mask # type: ignore
from google.protobuf import struct_pb2 as struct # type: ignore
@@ -17,8 +17,8 @@

from typing import Any, AsyncIterable, Awaitable, Callable, Iterable, Sequence, Tuple

from google.cloud.bigquery.datatransfer_v1.types import datatransfer
from google.cloud.bigquery.datatransfer_v1.types import transfer
from google.cloud.bigquery_datatransfer_v1.types import datatransfer
from google.cloud.bigquery_datatransfer_v1.types import transfer


class ListDataSourcesPager:
@@ -25,8 +25,8 @@
from google.api_core import retry as retries # type: ignore
from google.auth import credentials # type: ignore

from google.cloud.bigquery.datatransfer_v1.types import datatransfer
from google.cloud.bigquery.datatransfer_v1.types import transfer
from google.cloud.bigquery_datatransfer_v1.types import datatransfer
from google.cloud.bigquery_datatransfer_v1.types import transfer
from google.protobuf import empty_pb2 as empty # type: ignore


@@ -26,8 +26,8 @@

import grpc # type: ignore

from google.cloud.bigquery.datatransfer_v1.types import datatransfer
from google.cloud.bigquery.datatransfer_v1.types import transfer
from google.cloud.bigquery_datatransfer_v1.types import datatransfer
from google.cloud.bigquery_datatransfer_v1.types import transfer
from google.protobuf import empty_pb2 as empty # type: ignore

from .base import DataTransferServiceTransport, DEFAULT_CLIENT_INFO
@@ -27,8 +27,8 @@
import grpc # type: ignore
from grpc.experimental import aio # type: ignore

from google.cloud.bigquery.datatransfer_v1.types import datatransfer
from google.cloud.bigquery.datatransfer_v1.types import transfer
from google.cloud.bigquery_datatransfer_v1.types import datatransfer
from google.cloud.bigquery_datatransfer_v1.types import transfer
from google.protobuf import empty_pb2 as empty # type: ignore

from .base import DataTransferServiceTransport, DEFAULT_CLIENT_INFO
@@ -18,7 +18,7 @@
import proto # type: ignore


from google.cloud.bigquery.datatransfer_v1.types import transfer
from google.cloud.bigquery_datatransfer_v1.types import transfer
from google.protobuf import duration_pb2 as duration # type: ignore
from google.protobuf import field_mask_pb2 as field_mask # type: ignore
from google.protobuf import timestamp_pb2 as timestamp # type: ignore
6 changes: 3 additions & 3 deletions samples/create_scheduled_query.py
@@ -20,9 +20,9 @@

def sample_create_transfer_config(project_id, dataset_id, authorization_code=""):
# [START bigquerydatatransfer_create_scheduled_query]
from google.cloud.bigquery import datatransfer_v1
from google.cloud import bigquery_datatransfer

client = datatransfer_v1.DataTransferServiceClient()
client = bigquery_datatransfer.DataTransferServiceClient()

# TODO(developer): Set the project_id to the project that contains the
# destination dataset.
@@ -54,7 +54,7 @@ def sample_create_transfer_config(project_id, dataset_id, authorization_code=""):

parent = f"projects/{project_id}"

transfer_config = datatransfer_v1.types.TransferConfig(
transfer_config = bigquery_datatransfer.TransferConfig(
destination_dataset_id=dataset_id,
display_name="Your Scheduled Query Name",
data_source_id="scheduled_query",
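The hunk above ends before the config is submitted. As a hedged sketch of the rest of the flow (not part of this diff; the real sample also sets query parameters and a schedule, and the project ID below is a placeholder), the constructed `TransferConfig` is passed to `create_transfer_config`:

```py
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()

parent = "projects/your-project-id"  # placeholder project path

transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id="your_dataset_id",
    display_name="Your Scheduled Query Name",
    data_source_id="scheduled_query",
)

# Submit the config; the API returns the created TransferConfig resource.
response = client.create_transfer_config(
    parent=parent,
    transfer_config=transfer_config,
)
print(f"Created scheduled query: {response.name}")
```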
4 changes: 2 additions & 2 deletions samples/snippets/quickstart.py
@@ -17,9 +17,9 @@

def run_quickstart(project="my-project"):
# [START bigquerydatatransfer_quickstart]
from google.cloud.bigquery import datatransfer
from google.cloud import bigquery_datatransfer

client = datatransfer.DataTransferServiceClient()
client = bigquery_datatransfer.DataTransferServiceClient()

# TODO: Update to your project ID.
# project = "my-project"
4 changes: 2 additions & 2 deletions samples/tests/conftest.py
@@ -39,9 +39,9 @@ def credentials():

@pytest.fixture(scope="module")
def bqdts_client(credentials):
from google.cloud.bigquery import datatransfer_v1
from google.cloud import bigquery_datatransfer

return datatransfer_v1.DataTransferServiceClient(credentials=credentials)
return bigquery_datatransfer.DataTransferServiceClient(credentials=credentials)


@pytest.fixture(scope="module")
4 changes: 2 additions & 2 deletions samples/update_transfer_config.py
@@ -20,9 +20,9 @@

def sample_update_transfer_config(config_name, display_name):
# [START bigquerydatatransfer_update_transfer_config]
from google.cloud.bigquery import datatransfer_v1
from google.cloud import bigquery_datatransfer

client = datatransfer_v1.DataTransferServiceClient()
client = bigquery_datatransfer.DataTransferServiceClient()
# TODO(developer): Set the config_name which user wants to update.
# config_name = "your-created-transfer-config-name"

@@ -37,7 +37,7 @@ def partition(
return results[1], results[0]


class datatransferCallTransformer(cst.CSTTransformer):
class bigquery_datatransferCallTransformer(cst.CSTTransformer):
CTRL_PARAMS: Tuple[str] = ('retry', 'timeout', 'metadata')
METHOD_TO_PARAMS: Dict[str, Tuple[str]] = {
'check_valid_creds': ('name', ),
@@ -103,7 +103,7 @@ def fix_files(
in_dir: pathlib.Path,
out_dir: pathlib.Path,
*,
transformer=datatransferCallTransformer(),
transformer=bigquery_datatransferCallTransformer(),
):
"""Duplicate the input dir to the output dir, fixing file method calls.

@@ -136,7 +136,7 @@

if __name__ == '__main__':
parser = argparse.ArgumentParser(
description="""Fix up source that uses the datatransfer client library.
description="""Fix up source that uses the bigquery_datatransfer client library.

The existing sources are NOT overwritten but are copied to output_dir with changes made.

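For context, a hedged sketch of the rewriting this fixup script performs: flattened positional arguments are folded into a single `request` dict, while the control parameters (`retry`, `timeout`, `metadata`) are left as keywords. The resource name below is a placeholder, not taken from this diff.

```py
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()
name = "projects/your-project-id/dataSources/scheduled_query"  # placeholder

# 1.x-style call with a flattened positional argument:
#   client.check_valid_creds(name)
# After running scripts/fixup_bigquery_datatransfer_v1_keywords.py, the call
# uses a single request mapping instead:
client.check_valid_creds(request={"name": name})
```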
5 changes: 1 addition & 4 deletions setup.py
@@ -57,9 +57,6 @@
if "google.cloud" in packages:
namespaces.append("google.cloud")

if "google.cloud.bigquery" in packages:
namespaces.append("google.cloud.bigquery")

setuptools.setup(
name=name,
version=version,
@@ -87,7 +84,7 @@
install_requires=dependencies,
extras_require=extras,
python_requires=">=3.6",
scripts=["scripts/fixup_datatransfer_v1_keywords.py"],
scripts=["scripts/fixup_bigquery_datatransfer_v1_keywords.py"],
include_package_data=True,
zip_safe=False,
)
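
For reference, a minimal sketch of the namespace-package pattern this hunk edits; the `find_packages` filtering and the `namespace_packages=` keyword are assumptions about the elided parts of `setup.py`, not content shown in this diff.

```py
import setuptools

# Ship only the google.* packages. With the gapic code now under
# google/cloud/bigquery_datatransfer*, no google.cloud.bigquery
# namespace package needs to be declared.
packages = [
    pkg for pkg in setuptools.find_packages() if pkg.startswith("google")
]

namespaces = ["google"]
if "google.cloud" in packages:
    namespaces.append("google.cloud")

setuptools.setup(
    name="google-cloud-bigquery-datatransfer",
    packages=packages,
    namespace_packages=namespaces,
)
```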