This is part of a series on building a Spring Cloud base framework, covering Shiro integration, MySQL master-slave replication, Seata, Activiti, Drools, common Hadoop big-data components, Keepalived + Nginx HTTPS configuration, and more.
Creating a Spring Boot + Scala project
a. The Scala SDK is required: extract scala-2.11.12.tgz and add the extracted scala directory as a library via Project Structure → Libraries → "+";
b. The Scala module should be a plain Maven project: as a Spring Boot project, spark-submit could not find the specified class under any of the packaging approaches found online;
Option 1: fails with Spring Boot; not tested with plain Maven;
<plugin>
    <groupId>net.alchim31.maven</groupId>
    <artifactId>scala-maven-plugin</artifactId>
    <version>4.2.0</version>
    <executions>
        <execution>
            <id>compile-scala</id>
            <phase>compile</phase>
            <goals>
                <goal>add-source</goal>
                <goal>compile</goal>
            </goals>
        </execution>
        <execution>
            <id>test-compile-scala</id>
            <phase>test-compile</phase>
            <goals>
                <goal>add-source</goal>
                <goal>testCompile</goal>
            </goals>
        </execution>
    </executions>
</plugin>
Option 2: fails with Spring Boot; works with a plain Maven project;
<plugin>
    <groupId>org.scala-tools</groupId>
    <artifactId>maven-scala-plugin</artifactId>
    <version>2.15.2</version>
    <executions>
        <execution>
            <id>scala-compile-first</id>
            <goals>
                <goal>compile</goal>
                <goal>testCompile</goal>
            </goals>
            <configuration>
                <includes>
                    <include>**/*.scala</include>
                </includes>
                <scalaVersion>2.12.10</scalaVersion>
                <args>
                    <arg>-target:jvm-1.5</arg>
                </args>
            </configuration>
        </execution>
    </executions>
</plugin>
Option 3: fails with Spring Boot (the produced jar cannot be unpacked), although online sources say it works;
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.4</version>
    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
            <version>${spring-boot.version}</version>
        </dependency>
    </dependencies>
    <configuration>
        <keepDependenciesWithProvidedScope>false</keepDependenciesWithProvidedScope>
        <createDependencyReducedPom>false</createDependencyReducedPom>
        <filters>
            <filter>
                <artifact>*:*</artifact>
                <excludes>
                    <exclude>META-INF/*.SF</exclude>
                    <exclude>META-INF/*.DSA</exclude>
                    <exclude>META-INF/*.RSA</exclude>
                </excludes>
            </filter>
        </filters>
        <transformers>
            <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                <resource>META-INF/spring.handlers</resource>
            </transformer>
            <transformer implementation="org.springframework.boot.maven.PropertiesMergingResourceTransformer">
                <resource>META-INF/spring.factories</resource>
            </transformer>
            <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                <resource>META-INF/spring.schemas</resource>
            </transformer>
            <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
            <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                <mainClass>${start-class}</mainClass>
            </transformer>
        </transformers>
    </configuration>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
        </execution>
    </executions>
</plugin>
c. spark-submit reports that the class cannot be found, plus native-library problems;
Reference: https://blog.csdn.net/qq_44642612/article/details/104379893
Reference: https://www.likecs.com/show-308516231.html (includes a fix for the native-library issue)
d. org/apache/spark/streaming/kafka010/KafkaUtils$ cannot be found;
i1. Rebuilding with the POM's scalaVersion matched to the IDEA Scala SDK made no difference;
i2. The missing class comes from the Spark Streaming + Kafka integration jar, spark-streaming-kafka-0-10_2.12-3.0.0.jar; copy it into the spark install directory's jars folder (a call-site sketch follows below);
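For context, KafkaUtils is a Scala object, compiled to exactly the KafkaUtils$ class named in the error, and it lives in the spark-streaming-kafka-0-10 artifact. A minimal sketch of the call site that triggers the failure (topic name, group id, and bootstrap servers are placeholders, not from the original project):
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.kafka010.{ConsumerStrategies, KafkaUtils, LocationStrategies}
import org.apache.spark.streaming.{Seconds, StreamingContext}

object KafkaStreamDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("kafka-stream-demo")
    val ssc = new StreamingContext(conf, Seconds(5))

    // Placeholder connection settings -- adjust to the real cluster
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "demo-group",
      "auto.offset.reset" -> "latest"
    )

    // This call resolves KafkaUtils$ at runtime; it fails with
    // NoClassDefFoundError when the integration jar is absent from spark/jars
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      LocationStrategies.PreferConsistent,
      ConsumerStrategies.Subscribe[String, String](Seq("demo-topic"), kafkaParams)
    )
    stream.map(_.value()).print()

    ssc.start()
    ssc.awaitTermination()
  }
}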
e. Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.internal.Logging.$init$(Lorg/apache/spark/internal/Logging;)V
i1. spark/bin/spark-shell shows the cluster environment:
spark: 2.3.0
scala: 2.11.8
Changing the Scala and Spark versions in the Maven dependencies to match still left the problem in place;
i2. Fix: delete spark-streaming-kafka-0-10_2.12-3.0.0.jar from /spark/jars/ and keep spark-streaming-kafka-0-10_2.11-2.3.0.jar; a POM sketch aligned with these versions follows the reference below;
Reference: https://www.codenong.com/54085459/
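The root cause is Scala binary incompatibility: an artifact with the _2.12 suffix cannot run against a Scala 2.11.8 / Spark 2.3.0 installation. A hedged sketch of POM properties aligned with the spark-shell output above (the property names are my own convention, not from the original POM):
<properties>
    <scala.version>2.11.8</scala.version>
    <scala.binary.version>2.11</scala.binary.version>
    <spark.version>2.3.0</spark.version>
</properties>

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka-0-10_${scala.binary.version}</artifactId>
    <version>${spark.version}</version>
</dependency>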
f. java.lang.NoClassDefFoundError: org/apache/kafka/clients/consumer/Consumer
Reference: http://www.manongjc.com/detail/15-jqnjpwtzjnjnbhh.html
Copy kafka-clients-0.10.0.1.jar into spark/jars; the jar version must match the one shown in the IDEA Maven library list;
Alternatively, put kafka-clients-0.10.0.1.jar in the same directory as the jar being submitted and pass --jars kafka-clients-0.10.0.1.jar to spark-submit (example command below);
== Similar errors have been recorded for Spark itself;
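A hedged example of the --jars variant (the class name, master URL, and application jar name are placeholders):
spark-submit \
    --class com.example.KafkaStreamDemo \
    --master spark://master-host:7077 \
    --jars kafka-clients-0.10.0.1.jar \
    my-spark-app.jar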
g. SerializationException: Could not read JSON: Unexpected character ('\'
Fix: switch the reads/writes to redisTemplate.opsForValue().get/set;
h. Service 'MasterUI' could not bind on port 8080. Attempting port 8081 (raised at Spark startup; 8080 is the web UI port)
lsof -i:8080 finds no process holding the port; the only workaround was to add a port mapping for the Docker container;
i. io.lettuce.core.RedisCommandTimeoutException: Command timed out after 3 second(s)
Cause: a bug in the Lettuce connection pool
Fix 1: switch the connection pool to Jedis
Fix 2: send a manual heartbeat:
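// Touch a key every 10 seconds so the Lettuce connection never sits idle;
// the "heartbeat" key is arbitrary and need not exist.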
@Scheduled(cron = "0/10 * * * * *")
public void timer() {
    redisTemplate.opsForValue().get("heartbeat");
}
j. Redis command timed out; nested exception is io.lettuce.core.RedisCommandTimeoutException
Remove the timeout setting (e.g. spring.redis.timeout) from the Redis configuration;
k. Once spark-submit is running, the console is flooded with log output
Edit spark/conf/log4j.properties so that only error-level logs are printed (minimal change below);
Reference: https://blog.csdn.net/XnCSD/article/details/100586224 (detailed guide to spark-submit)
Reference: https://blog.csdn.net/dkl12/article/details/84140956 (logging configuration)
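Assuming the file was copied from Spark's bundled log4j.properties.template, which already defines a console appender, raising the root level is enough:
log4j.rootCategory=ERROR, console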
l. Garbled Spring Boot log output via logback/slf4j
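No fix was recorded for this last item. A common remedy, assuming the garbling is a console-encoding mismatch rather than a logback bug, is to pin the encoder charset in logback-spring.xml (the appender name and pattern here are placeholders):
<appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
    <encoder class="ch.qos.logback.classic.encoder.PatternLayoutEncoder">
        <!-- Force UTF-8 output regardless of the platform default encoding -->
        <charset>UTF-8</charset>
        <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger{36} - %msg%n</pattern>
    </encoder>
</appender>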