1. Starting Spark with ./bin/spark-shell fails with the exception: java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries!
   Fix: add export SPARK_LOCAL_IP="127.0.0.1" to spark-env.sh (note: straight quotes, not curly ones, or the shell will not parse the line).
2. Java Kafka producer error: ERROR kafka.utils.Utils$ ...
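The SPARK_LOCAL_IP fix can be applied from the shell; a minimal sketch, assuming Spark's standard layout where spark-env.sh lives under conf/ in the Spark home directory (the exact path on your install may differ):

```shell
# Force the Spark driver to bind to the loopback address.
# Useful when the machine's hostname does not resolve to a
# locally bindable address (a common cause of the 16-retries error).
echo 'export SPARK_LOCAL_IP="127.0.0.1"' >> conf/spark-env.sh

# Verify the line was written with plain ASCII double quotes:
tail -n 1 conf/spark-env.sh
```

If conf/spark-env.sh does not exist yet, Spark ships a conf/spark-env.sh.template that can be copied first; the appended export takes effect on the next spark-shell start.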