Tag: spark cassandra connector

Unable to use Cassandra on Docker from Spark

I have a Zeppelin notebook running on Docker. I use the following code to query Cassandra:

```scala
import org.apache.spark.sql.cassandra._

val cqlContext = new CassandraSQLContext(sc)
cqlContext.sql("select * from demo.table").collect.foreach(println)
```

However, I get this error:

```
import org.apache.spark.sql.cassandra._
cqlContext: org.apache.spark.sql.cassandra.CassandraSQLContext = org.apache.spark.sql.cassandra.CassandraSQLContext@395e28a8
com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: Cannot build a cluster without contact points
	at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2199)
	at com.google.common.cache.LocalCache.get(LocalCache.java:3932)
	at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3936)
	at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4806)
	at org.apache.spark.sql.cassandra.CassandraCatalog.lookupRelation(CassandraCatalog.scala:28)
	at org.apache.spark.sql.cassandra.CassandraSQLContext$$anon$2.org$apache$spark$sql$catalyst$analysis$OverrideCatalog$$super$lookupRelation(CassandraSQLContext.scala:219)
	at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$$anonfun$lookupRelation$3.apply(Catalog.scala:137)
	at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$$anonfun$lookupRelation$3.apply(Catalog.scala:137)
	at scala.Option.getOrElse(Option.scala:120)
	at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$class.lookupRelation(Catalog.scala:137)
	at org.apache.spark.sql.cassandra.CassandraSQLContext$$anon$2.lookupRelation(CassandraSQLContext.scala:219)
	at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$5.applyOrElse(Analyzer.scala:143)
	at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$5.applyOrElse(Analyzer.scala:138) […]
```
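The message "Cannot build a cluster without contact points" means the connector was given no Cassandra address at all: `spark.cassandra.connection.host` was never set, so the driver has nothing to connect to. A minimal sketch of the fix, assuming the property is added to `spark-defaults.conf` (or the Zeppelin Spark interpreter settings) — the address below is an assumption and must be replaced with an address of the Cassandra container that is reachable from the Spark driver:

```
# Sketch of spark-defaults.conf additions (values are assumptions):
# use the Cassandra container's IP or a hostname resolvable from the driver
spark.cassandra.connection.host  172.17.0.2
spark.cassandra.connection.port  9042
```

With the host set, `CassandraSQLContext` can build the cluster object and resolve `demo.table`.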

Problem connecting to Cassandra via Spark from Java

I have servers running Docker, on which I created 3 Cassandra nodes, 2 Spark worker nodes, and one Spark master node. Now I want to connect to Spark from a Java application on my laptop. My Java code is:

```java
public SparkTestPanel(String id, User user) {
    super(id);
    form = new Form("form");
    form.setOutputMarkupId(true);
    this.add(form);
    SparkConf conf = new SparkConf(true);
    conf.setAppName("Spark Test");
    conf.setMaster("spark://172.11.100.156:9050");
    conf.set("spark.cassandra.connection.host", "cassandra-0");
    conf.set("spark.cassandra.connection.port", "9042");
    conf.set("spark.cassandra.auth.username", "cassandra");
    conf.set("spark.cassandra.auth.password", "cassandra");
    JavaSparkContext sc = null;
    try {
        sc = new JavaSparkContext(conf);
        CassandraTableScanJavaRDD<com.datastax.spark.connector.japi.CassandraRow> cassandraTable =
            javaFunctions(sc).cassandraTable("test", "test_table");
        List<com.datastax.spark.connector.japi.CassandraRow> collect = cassandraTable.collect();
        for (com.datastax.spark.connector.japi.CassandraRow cassandraRow : […]
```
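A common cause of failures in this setup is that `spark.cassandra.connection.host` is set to `cassandra-0`, a hostname that typically only resolves inside the Docker network, not from a laptop running the driver. Before debugging the connector itself, it can help to check whether the contact point is even reachable from the driver machine. The sketch below does that with plain JDK sockets; the host name and port are taken from the question and are assumptions about this particular setup:

```java
import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.net.Socket;

public class ContactPointCheck {

    // Returns true if the host name resolves and the port accepts a TCP
    // connection within the timeout; false on DNS failure, refusal, or timeout.
    static boolean reachable(String host, int port, int timeoutMs) {
        try (Socket socket = new Socket()) {
            InetAddress addr = InetAddress.getByName(host); // DNS resolution step
            socket.connect(new InetSocketAddress(addr, port), timeoutMs);
            return true;
        } catch (Exception e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // "cassandra-0" usually resolves only inside the Docker network, so from
        // an outside laptop this check is expected to fail (assumption).
        System.out.println("cassandra-0:9042 reachable? "
                + reachable("cassandra-0", 9042, 2000));
    }
}
```

If this prints `false` from the laptop, the fix is to point `spark.cassandra.connection.host` at an address the driver can actually reach, e.g. the Docker host's IP with Cassandra's 9042 port published.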