I installed gemini following the docker-compose documentation, then ran:
> docker-compose exec gemini ./hash /repositories
It fails with:
```
Running Hashing as Apache Spark job, master: local[*]
Hashing 2 repositories in: '/repositories' ()
file:/repositories/186661fcc144ea9c6ed0a57e706b94a3526a7daa.siva
file:/repositories/c093e97dc56b5f91d93f503bf69b5ebc126d2e36.siva
Exception in thread "main" java.io.IOException: Failed to open native connection to Cassandra at {172.18.0.4}:9042
    at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:168)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$8.apply(CassandraConnector.scala:154)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$8.apply(CassandraConnector.scala:154)
    at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:32)
    at com.datastax.spark.connector.cql.RefCountedCache.syncAcquire(RefCountedCache.scala:69)
    at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:57)
    at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:79)
    at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:111)
    at tech.sourced.gemini.cmd.HashSparkApp$.delayedEndpoint$tech$sourced$gemini$cmd$HashSparkApp$1(HashSparkApp.scala:175)
    at tech.sourced.gemini.cmd.HashSparkApp$delayedInit$body.apply(HashSparkApp.scala:38)
    at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
    at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
    at scala.App$$anonfun$main$1.apply(App.scala:76)
    at scala.App$$anonfun$main$1.apply(App.scala:76)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
    at scala.App$class.main(App.scala:76)
    at tech.sourced.gemini.cmd.HashSparkApp$.main(HashSparkApp.scala:38)
    at tech.sourced.gemini.cmd.HashSparkApp.main(HashSparkApp.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:755)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: com.datastax.driver.core.exceptions.NoHostAvailableException: All host(s) tried for query failed (tried: scylla/172.18.0.4:9042 (com.datastax.driver.core.exceptions.TransportException: [scylla/172.18.0.4:9042] Cannot connect))
    at com.datastax.driver.core.ControlConnection.reconnectInternal(ControlConnection.java:233)
    at com.datastax.driver.core.ControlConnection.connect(ControlConnection.java:79)
    at com.datastax.driver.core.Cluster$Manager.init(Cluster.java:1483)
    at com.datastax.driver.core.Cluster.getMetadata(Cluster.java:399)
    at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:161)
    ... 27 more
```
Hi @amotzhoshen!
Sorry for the late response. Could you please verify that ScyllaDB started correctly for you?
You can see the status of the containers with `docker-compose ps` and view the logs with `docker-compose logs scylla`.
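Beyond checking the container status, the `NoHostAvailableException` above means the driver could not open a TCP connection to the CQL port, so a quick reachability probe can narrow things down. Here is a minimal sketch in Python; the hostname `scylla` is taken from the error message (`scylla/172.18.0.4:9042`) and assumes you run the probe on the same Docker network (otherwise substitute the host/port you expose):

```python
import socket

def cql_port_open(host: str, port: int = 9042, timeout: float = 3.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds.

    9042 is the default CQL native-transport port used by
    Cassandra/ScyllaDB and seen in the stack trace above.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused, timed out, or host not resolvable.
        return False

if __name__ == "__main__":
    host = "scylla"  # assumed compose service name, per the error message
    status = "reachable" if cql_port_open(host) else "unreachable"
    print(f"{host}:9042 is {status}")
```

If the port is unreachable, the usual causes are Scylla still bootstrapping (it can take a while on first start) or the container having exited, which `docker-compose logs scylla` should reveal.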