
Add more product tests coverage for native filesystem GCS and Azure #21958

Merged

Conversation

@anusudarsan (Member) commented May 13, 2024

Release notes

(x) This is not user-visible or is docs only, and no release notes are required.

@ebyhr (Member) commented May 13, 2024

/test-with-secrets sha=c7a2a6a227c9d6e03270146169028a3b49a1c57b


The CI workflow run with tests that require additional secrets has been started: https://github.com/trinodb/trino/actions/runs/9070514362

}

@Test(groups = {HIVE_AZURE, PROFILE_SPECIFIC_TESTS})
public void testBasicWriteOperations()
Member
Would it make sense to add a test for reading external tables, and/or tables created by Hive? This would verify the compatibility of native FS libs with those other systems.

Member Author

TestAzureBlobFileSystem tests this, but the GCS counterpart might be missing. I will add it.


@Test(groups = {ICEBERG_AZURE, PROFILE_SPECIFIC_TESTS})
public void testBasicWriteOperations()
{
Contributor

I see that the tests present here are in line with io.trino.tests.product.deltalake.TestDeltaLakeGcs, which is not really a compatibility test against Spark.
Ideally we should ensure that Spark can read what Trino writes, and the other way around, for all the storages.
Yes, this would potentially involve creating new testing Docker images.
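To make the suggestion concrete: a cross-engine compatibility check typically writes with one engine and reads with the other. The sketch below is illustrative pseudocode in the style of Trino's product-test (tempto) framework, using its onTrino()/onSpark() query executors; the test group, catalog, schema, and table names are assumptions for illustration, not the actual tests added in this PR.

```java
// Illustrative sketch only: assumes Trino's product-test framework
// (io.trino.tests.product.utils.QueryExecutors.onTrino / onSpark).
// Group, catalog, and schema names here are hypothetical.
@Test(groups = {ICEBERG_GCS, PROFILE_SPECIFIC_TESTS})
public void testSparkReadsTrinoTable()
{
    String baseTableName = "test_spark_reads_trino_" + randomNameSuffix();
    String trinoTableName = "iceberg.test_compat." + baseTableName;
    String sparkTableName = "iceberg_test.test_compat." + baseTableName;
    try {
        // Write with Trino...
        onTrino().executeQuery("CREATE TABLE " + trinoTableName + " (a BIGINT, b VARCHAR)");
        onTrino().executeQuery("INSERT INTO " + trinoTableName + " VALUES (1, 'x')");
        // ...and verify Spark reads the same rows back
        assertThat(onSpark().executeQuery("SELECT a, b FROM " + sparkTableName))
                .containsOnly(row(1L, "x"));
    }
    finally {
        onTrino().executeQuery("DROP TABLE IF EXISTS " + trinoTableName);
    }
}
```

A mirror-image test (Spark writes, Trino reads) would complete the round trip.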

Member Author

OK, this part wasn't clear to me. I will work on adding these.

Contributor

cc @ebyhr for visibility on the testing docker images front.

Member Author

I will add Spark compatibility tests as a follow-up. Addressed other comments.

@anusudarsan anusudarsan force-pushed the anu/extend-product-tests-for-native-fs branch from c7a2a6a to fc3ae05 Compare May 17, 2024 22:00
@ebyhr (Member) commented May 20, 2024

/test-with-secrets sha=fc3ae0584a8f19288711f8925da0a568cdbc5f1a


The CI workflow run with tests that require additional secrets has been started: https://github.com/trinodb/trino/actions/runs/9153277764

@@ -5,6 +5,11 @@
<value>false</value>
</property>

<property>
Contributor

What happened when this parameter was not specified?

Member Author

The "with space" test in Hive GCS failed with: Caused by: io.trino.spi.TrinoException: java.lang.IllegalArgumentException: Invalid bucket name (....) or object name (env_multinode_gcs_..../test_path_special_character7ehvy1chkk/part=with space). This is true for legacy FS tests as well.
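The property itself is truncated in the diff excerpt above, so the following is only an illustration of the hive-site.xml entry shape. The key shown, fs.gs.path.encoding, is an assumption based on the Hadoop GCS connector's setting that controls how special characters such as spaces are encoded in object names; it is not necessarily the exact property this PR added.

```xml
<!-- Illustrative only: the exact property added by this PR is truncated
     in the diff. fs.gs.path.encoding is the gcs-connector setting that
     governs encoding of special characters in object paths. -->
<property>
    <name>fs.gs.path.encoding</name>
    <value>uri-path</value>
</property>
```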

@findinpath (Contributor) left a comment

LGTM % the comment on special-character handling in the partition name test.

@anusudarsan anusudarsan force-pushed the anu/extend-product-tests-for-native-fs branch from fc3ae05 to e39e770 Compare May 22, 2024 15:39
@anusudarsan (Member Author)

I added a new commit for Spark compatibility tests (1828858), PTAL. The image is based off trinodb/docker-images#198.

@anusudarsan anusudarsan force-pushed the anu/extend-product-tests-for-native-fs branch from 17f4c57 to faa0d12 Compare May 24, 2024 16:40
@anusudarsan (Member Author)

@ebyhr can you re-trigger CI with secrets, please?

@ebyhr (Member) commented May 24, 2024

/test-with-secrets sha=faa0d123b2a6011135ba585bc635b4bb79a33e9f


The CI workflow run with tests that require additional secrets has been started: https://github.com/trinodb/trino/actions/runs/9230141387

@ebyhr ebyhr force-pushed the anu/extend-product-tests-for-native-fs branch from faa0d12 to dae880a Compare May 28, 2024 10:36
String sparkTableName = format("%s.test_compat.%s", getSparkCatalog(), baseTableName);
String trinoTableName = format("%s.test_compat.%s", getCatalogName(), baseTableName);
try {
onTrino().executeQuery(format("CREATE SCHEMA %s.test_compat WITH (location = '%s')", getCatalogName(), schemaLocation));
Member

No need to handle in this PR, but we should avoid creating a new schema per test.

@ebyhr (Member) commented May 28, 2024

/test-with-secrets sha=dae880a9b3ae965391c7fed516bf105fd62e6b3e


The CI workflow run with tests that require additional secrets has been started: https://github.com/trinodb/trino/actions/runs/9268029862

@ebyhr ebyhr merged commit eca7ab3 into trinodb:master May 28, 2024
46 checks passed
@github-actions github-actions bot added this to the 449 milestone May 28, 2024
@anusudarsan anusudarsan deleted the anu/extend-product-tests-for-native-fs branch May 28, 2024 13:02