1. Mixing Scala code into a Java project
After compilation the Scala classes cannot be found. Add the scala-maven-plugin to pom.xml:
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <version>3.2.0</version>
  <executions>
    <execution>
      <id>scala-compile-first</id>
      <phase>process-resources</phase>
      <goals>
        <goal>add-source</goal>
        <goal>compile</goal>
      </goals>
    </execution>
    <execution>
      <id>test-compile-scala</id>
      <phase>test-compile</phase>
      <goals>
        <goal>add-source</goal>
        <goal>testCompile</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <scalaVersion>${scala.binary.version}</scalaVersion>
  </configuration>
</plugin>
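The add-source goal registers the conventional src/main/scala directory as an extra source root, and binding compile to the process-resources phase makes the Scala classes available before Maven's default Java compilation, which is what resolves the "Scala code not found" error. A typical mixed layout (module, package, and file names here are hypothetical):

```text
my-app/
├── pom.xml
├── src/main/java/com/example/JavaCaller.java     <- Java code calling the Scala class
└── src/main/scala/com/example/ScalaHelper.scala  <- compiled first (process-resources phase)
```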
2. To reduce time spent chasing problems, use JDK 1.8; otherwise all kinds of strange issues appear.
3. Using a specific JDK with Dubbo
Add the following to init.sh:
shopt -s expand_aliases                  # Bash ignores aliases in scripts unless this is set
export JAVA_HOME="/usr/local/jdk18"
alias java="/usr/local/jdk18/bin/java"   # force this JDK's java binary
alias nohup=":;"                         # neutralize nohup: the command runs directly instead of being detached
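The shopt line is needed because Bash does not expand aliases in non-interactive scripts by default, so without it the java and nohup aliases above would be silently ignored. A minimal demonstration (greet is a throwaway alias, not part of the original script):

```shell
#!/bin/bash
shopt -s expand_aliases   # aliases are off by default in scripts
alias greet="echo hello"
greet                     # expands to: echo hello
```

Note that an alias only affects lines parsed after its definition, so in init.sh the alias lines must come before any command that uses java or nohup.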
4. Jar conflicts
Use mvn dependency:tree to inspect the dependency tree, then exclude the lower-version jars.
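Running mvn dependency:tree -Dverbose additionally marks which duplicate versions Maven omitted, which helps locate the offender. The lower version is then excluded at the dependency that drags it in transitively; a sketch, with hypothetical groupId/artifactId values:

```xml
<dependency>
  <groupId>com.example</groupId>
  <artifactId>some-lib</artifactId>
  <version>1.2.0</version>
  <exclusions>
    <exclusion>
      <!-- hypothetical lower-version transitive jar being excluded -->
      <groupId>org.example</groupId>
      <artifactId>old-conflicting-jar</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```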
5. Spark startup exception:
Exception in thread "main" java.net.BindException: Cannot assign requested address: bind: Service 'sparkDriver' failed after 16 retries (on a random free port)!
Consider explicitly setting the appropriate binding address for the service 'sparkDriver' (for example spark.driver.bindAddress for SparkDriver) to the correct binding address.
When integrating with Dubbo, set the SPARK_LOCAL_IP variable by adding the following to init.sh:
export SPARK_LOCAL_IP="127.0.0.1"
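In script form (the echo is only a sanity check, not part of the fix):

```shell
# Pin Spark's driver-side networking to loopback (local/dev setup).
export SPARK_LOCAL_IP="127.0.0.1"
echo "SPARK_LOCAL_IP=${SPARK_LOCAL_IP}"   # prints SPARK_LOCAL_IP=127.0.0.1
```

For a single job, passing --conf spark.driver.bindAddress=127.0.0.1 to spark-submit (the setting named in the error message) achieves the same effect without a global environment variable.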