『壹』 How do you set up a Spark source-reading environment in Eclipse?
Part 1: Software installation
1. Install the JDK (version 1.7.0_11)
2. Install Scala (version 2.11.2)
3. Install Scala IDE (version 3.0.4)
Part 2: Unpack the source package downloaded from the official site, or locate the Spark sources checked out via Git.
I used the spark-1.1.1 release (the latest version at the time). As an aside, IDEA 13 already supports sbt natively, so no sbt plugin is needed if you work in IDEA instead.
Downloading the source (with git):
# Master development branch
git clone git://github.com/apache/spark.git
# 1.1 maintenance branch with stability fixes on top of Spark 1.1.1
git clone git://github.com/apache/spark.git -b branch-1.1
Updating the source (syncing the latest changes with git): inside an existing clone, run
git pull
(or clone a fresh copy over HTTPS: git clone https://github.com/apache/spark.git)
Part 3: Build the Eclipse project for the Scala sources with sbt. The detailed steps are as follows.
1. Open a command prompt (cmd) and cd into the source directory. My copy of the Spark 1.1.1 sources lives in the folder E:\Spark計算框架的研究\spark_1_1_1_eclipse. From there, run the sbt command.
2. sbt then resolves and downloads the jars needed for compilation and drops you at its interactive prompt. At that prompt, run the eclipse command: sbt compiles the sources and generates the Eclipse project files. (The eclipse command comes from an sbt plugin; see the note after step 4.)
3. Open Scala IDE and choose File → Import → Existing Projects into Workspace → Next, then select the Eclipse project that sbt just built (E:\Spark計算框架的研究\spark_1_1_1_eclipse).
4. With the steps above, the Eclipse project generated by sbt is imported into the Eclipse IDE.
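A note on step 2: the eclipse command is not built into sbt; it is provided by the sbteclipse plugin. If sbt does not recognize the command, register the plugin globally first. A minimal sketch for sbt 0.13, using the same plugin version that appears in the plugins.sbt of section 『叄』 below, placed in ~/.sbt/0.13/plugins/plugins.sbt:

addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.5.0")

After restarting sbt, running eclipse generates the .project and .classpath files that Scala IDE imports in step 3.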
Eclipse then showed errors. The jars I imported are the ones in the following folder:
E:\Spark計算框架的研究\spark_1_1_1_eclipse\lib_managed\bundles
The Problems view reported the same problem four times (Resource: spark-core, Location: Unknown, Type: Scala Classpath Problem), once for each of the following jars:
akka-remote_2.10-2.2.3-shaded-protobuf.jar
akka-slf4j_2.10-2.2.3-shaded-protobuf.jar
akka-testkit_2.10-2.2.3-shaded-protobuf.jar
akka-zeromq_2.10-2.2.3-shaded-protobuf.jar
each with the identical description: "<jar name> is cross-compiled with an incompatible version of Scala (2.10). In case of errorneous report, this check can be disabled in the compiler preference page."
I have not resolved these compatibility warnings in this setup. They appear because the jars are _2.10 builds while Scala 2.11.2 is installed; replacing the jars with versions built for the matching Scala version fixes the problem.
『貳』 How do you configure sbt's build.sbt so that dependencies are packaged into the build as well?
First, the bottom line: the problem was solved, and the solution was simply a matter of configuring the sbt-assembly plugin. I could not make sense of its documentation the first two times I read it; a couple of days later I read it again carefully, roughly understood it, and felt confident enough to try it.
The assembly plugin's stated goal is:
The goal is simple: Create a fat JAR of your project with all of its dependencies.
In other words, it packages the project's dependency jars into the single jar it builds. My error, Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils$, occurred precisely because the jar containing KafkaUtils had not been packaged into my application jar.
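For this specific KafkaUtils failure, the missing class ships in the spark-streaming-kafka artifact, so on the dependency side the fix is to declare it without the "provided" scope, letting sbt-assembly bundle it into the fat JAR. A sketch of the build.sbt line, assuming a Spark 1.x project (the version shown is illustrative and must match your Spark version):

libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" % "1.5.2"

(spark-core itself normally stays "provided", since the cluster supplies it at runtime.)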
Configuring and enabling the plugin
Here is my Scala project's directory layout:
.
├── assembly.sbt
├── build.sbt
├── project
├── README.md
├── run-assembly.sh
├── run.sh
├── src
└── target
How the plugin is wired in depends on the sbt version; see the sbt-assembly documentation for details.
Mine is sbt 0.13.8, so the following line goes into project/assembly.sbt (create the file if it does not exist):
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.0")
Configuring the assembly options
The options live in a second assembly.sbt, created in the project root.
With the plugin in place you can already build with
sbt assembly
but because my project pulls in a large number of dependencies, the build ran into conflicts between duplicate files from different jars. At that point a merge strategy (Merge Strategy) has to be configured; a sketch follows below.
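A minimal sketch of what the root assembly.sbt can contain, mirroring the merge rules used in the build.sbt of section 『叄』 below: conflicting .properties files keep the first copy found, and everything else falls back to the plugin's default strategy.

assemblyMergeStrategy in assembly := {
  // duplicate .properties files: keep the first occurrence
  case PathList(ps @ _*) if ps.last endsWith ".properties" => MergeStrategy.first
  // everything else: defer to sbt-assembly's default strategy
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}

No imports are needed: in .sbt files the plugin's keys (assemblyMergeStrategy, MergeStrategy, PathList) are brought in automatically.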
『叄』 How do you set up the cassandra library in pom.xml?
(Although the question mentions pom.xml, the walkthrough below declares the library through sbt.)
1. The code (CassandraTest.scala):
// CassandraTest.scala
import org.apache.spark.{Logging, SparkContext, SparkConf}
import com.datastax.spark.connector.cql.CassandraConnector

object CassandraTestApp {
  def main(args: Array[String]) {
    // Spark and Cassandra addresses; here everything runs on this machine
    val SparkMasterHost = "127.0.0.1"
    val CassandraHost = "127.0.0.1"

    val conf = new SparkConf(true)
      .set("spark.cassandra.connection.host", CassandraHost)
      .set("spark.cleaner.ttl", "3600")
      .setMaster("local[12]")
      .setAppName("CassandraTestApp")

    // Connect to the Spark cluster:
    lazy val sc = new SparkContext(conf)

    // Preparation statements, executed when the connection is made
    CassandraConnector(conf).withSessionDo { session =>
      session.execute("CREATE KEYSPACE IF NOT EXISTS test WITH REPLICATION = {'class': 'SimpleStrategy', 'replication_factor': 1}")
      session.execute("CREATE TABLE IF NOT EXISTS test.key_value (key INT PRIMARY KEY, value VARCHAR)")
      session.execute("TRUNCATE test.key_value")
      session.execute("INSERT INTO test.key_value (key, value) VALUES (1, 'first row')")
      session.execute("INSERT INTO test.key_value (key, value) VALUES (2, 'second row')")
      session.execute("INSERT INTO test.key_value (key, value) VALUES (3, 'third row')")
    }

    // Load the connector
    import com.datastax.spark.connector._

    // Read table test.kv and print its contents:
    val rdd = sc.cassandraTable("test", "key_value").select("key", "value")
    rdd.collect().foreach(row => println(s"Existing Data: $row"))

    // Write two new rows to the test.kv table:
    val col = sc.parallelize(Seq((4, "fourth row"), (5, "fifth row")))
    col.saveToCassandra("test", "key_value", SomeColumns("key", "value"))

    // Assert the two new rows were stored in the test.kv table:
    assert(col.collect().length == 2)

    col.collect().foreach(row => println(s"New Data: $row"))
    println(s"Work completed, stopping the Spark context.")
    sc.stop()
  }
}
2. Directory layout. Since the build tool is sbt, the layout must follow sbt conventions; the essential pieces are build.sbt and the src directory, the rest is generated automatically.
qpzhang@qpzhangdeMac-mini:~/scala_code/CassandraTest$ ll
total 8
drwxr-xr-x   6 qpzhang  staff  204 11 26 12:14 ./
drwxr-xr-x  10 qpzhang  staff  340 11 25 17:30 ../
-rw-r--r--   1 qpzhang  staff  460 11 26 10:11 build.sbt
drwxr-xr-x   3 qpzhang  staff  102 11 25 17:42 project/
drwxr-xr-x   3 qpzhang  staff  102 11 25 17:32 src/
drwxr-xr-x   6 qpzhang  staff  204 11 25 17:55 target/
qpzhang@qpzhangdeMac-mini:~/scala_code/CassandraTest$ tree src/
src/
└── main
    └── scala
        └── CassandraTest.scala
qpzhang@qpzhangdeMac-mini:~/scala_code/CassandraTest$ cat build.sbt
name := "CassandraTest"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.2" % "provided"
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "1.5.0-M2"
assemblyMergeStrategy in assembly := {
  case PathList(ps @ _*) if ps.last endsWith ".properties" => MergeStrategy.first
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
Note that the sbt installed here was the then-latest 0.13, with the assembly plugin added. (A gripe about sbt: it downloads piles upon piles of jars, and without a proxy the wait can be very long.)
qpzhang@qpzhangdeMac-mini:~/scala_code/CassandraTest$ cat ~/.sbt/0.13/plugins/plugins.sbt
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "2.5.0")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.1")
3. Compile and package with sbt. In the directory containing build.sbt, start the sbt shell with the sbt command, then build with compile and package with assembly. Along the way I ran into the sbt-assembly duplicate-file error; see the merge-strategy notes in section 『貳』 above.
> compile
[success] Total time: 0 s, completed 2015-11-26 10:11:50
> assembly
[info] Including from cache: slf4j-api-1.7.5.jar
[info] Including from cache: metrics-core-3.0.2.jar
[info] Including from cache: netty-codec-4.0.27.Final.jar
[info] Including from cache: netty-handler-4.0.27.Final.jar
[info] Including from cache: netty-common-4.0.27.Final.jar
[info] Including from cache: joda-time-2.3.jar
[info] Including from cache: netty-buffer-4.0.27.Final.jar
[info] Including from cache: commons-lang3-3.3.2.jar
[info] Including from cache: jsr166e-1.1.0.jar
[info] Including from cache: cassandra-clientutil-2.1.5.jar
[info] Including from cache: joda-convert-1.2.jar
[info] Including from cache: netty-transport-4.0.27.Final.jar
[info] Including from cache: guava-16.0.1.jar
[info] Including from cache: spark-cassandra-connector_2.10-1.5.0-M2.jar
[info] Including from cache: cassandra-driver-core-2.2.0-rc3.jar
[info] Including from cache: scala-reflect-2.10.5.jar
[info] Including from cache: scala-library-2.10.5.jar
[info] Checking every *.class/*.jar file's SHA-1.
[info] Merging files
[warn] Merging 'META-INF/INDEX.LIST' with strategy 'discard'
[warn] Merging 'META-INF/MANIFEST.MF' with strategy 'discard'
[warn] Merging 'META-INF/io.netty.versions.properties' with strategy 'first'
[warn] Merging 'META-INF/maven/com.codahale.metrics/metrics-core/pom.xml' with strategy 'discard'
[warn] Merging 'META-INF/maven/com.datastax.cassandra/cassandra-driver-core/pom.xml' with strategy 'discard'
[warn] Merging 'META-INF/maven/com.google.guava/guava/pom.xml' with strategy 'discard'
[warn] Merging 'META-INF/maven/com.twitter/jsr166e/pom.xml' with strategy 'discard'
[warn] Merging 'META-INF/maven/io.netty/netty-buffer/pom.xml' with strategy 'discard'
[warn] Merging 'META-INF/maven/io.netty/netty-codec/pom.xml' with strategy 'discard'
[warn] Merging 'META-INF/maven/io.netty/netty-common/pom.xml' with strategy 'discard'
[warn] Merging 'META-INF/maven/io.netty/netty-handler/pom.xml' with strategy 'discard'
[warn] Merging 'META-INF/maven/io.netty/netty-transport/pom.xml' with strategy 'discard'
[warn] Merging 'META-INF/maven/joda-time/joda-time/pom.xml' with strategy 'discard'
[warn] Merging 'META-INF/maven/org.apache.commons/commons-lang3/pom.xml' with strategy 'discard'
[warn] Merging 'META-INF/maven/org.joda/joda-convert/pom.xml' with strategy 'discard'
[warn] Merging 'META-INF/maven/org.slf4j/slf4j-api/pom.xml' with strategy 'discard'
[warn] Strategy 'discard' was applied to 15 files
[warn] Strategy 'first' was applied to a file
[info] SHA-1:
[info] Packaging /Users/qpzhang/scala_code/CassandraTest/target/scala-2.10/CassandraTest-assembly-1.0.jar
[info] Done packaging.
[success] Total time: 19 s, completed 2015-11-26 10:12:22
4. Submit to Spark and check the result:
qpzhang@qpzhangdeMac-mini:~/project/spark-1.5.2-bin-hadoop2.6$ ./bin/spark-submit --class "CassandraTestApp" --master local[4] ~/scala_code/CassandraTest/target/scala-2.10/CassandraTest-assembly-1.0.jar
...
15/11/26 11:40:23 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0, localhost, NODE_LOCAL, 26660 bytes)
15/11/26 11:40:23 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
15/11/26 11:40:23 INFO Executor: Fetching http://10.60.215.42:57683/jars/CassandraTest-assembly-1.0.jar
15/11/26 11:40:23 INFO CassandraConnector: Connected to Cassandra cluster: TestCluster
15/11/26 11:40:23 INFO Utils: Fetching http://10.60.215.42:57683/jars/CassandraTest-assembly-1.0.jar to /private/var/folders/2l//T/spark-4030cadf-8489-4540-976e-e98eedf50412/userFiles-63085bda-aa04-4906-9621-c1cedd98c163/.tmp
15/11/26 11:40:23 INFO Executor: Adding file:/private/var/folders/2l//T/spark-4030cadf-8489-4540-976e-e98eedf50412/userFiles-63085bda-aa04-4906-9621-c1cedd98c163/CassandraTest-assembly-1.0.jar to class loader
15/11/26 11:40:24 INFO Cluster: New Cassandra host localhost/127.0.0.1:9042 added
15/11/26 11:40:24 INFO CassandraConnector: Connected to Cassandra cluster: TestCluster
15/11/26 11:40:25 INFO Executor: Finished task 0.0 in stage 0.0 (TID 0). 2676 bytes result sent to driver
15/11/26 11:40:25 INFO TaskSetManager: Finished task 0.0 in stage 0.0 (TID 0) in 2462 ms on localhost (1/1)
15/11/26 11:40:25 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
15/11/26 11:40:25 INFO DAGScheduler: ResultStage 0 (collect at CassandraTest.scala:32) finished in 2.481 s
15/11/26 11:40:25 INFO DAGScheduler: Job 0 finished: collect at CassandraTest.scala:32, took 2.940601 s
Existing Data: CassandraRow{key: 1, value: first row}
Existing Data: CassandraRow{key: 2, value: second row}
Existing Data: CassandraRow{key: 3, value: third row}
...
15/11/26 11:40:27 INFO TaskSchedulerImpl: Removed TaskSet 3.0, whose tasks have all completed, from pool
15/11/26 11:40:27 INFO DAGScheduler: ResultStage 3 (collect at CassandraTest.scala:41) finished in 0.032 s
15/11/26 11:40:27 INFO DAGScheduler: Job 3 finished: collect at CassandraTest.scala:41, took 0.046502 s
New Data: (4, fourth row)
New Data: (5, fifth row)
Work completed, stopping the Spark context.
And the data in cassandra:
cqlsh:test> select * from key_value;
 key | value
-----+------------
   5 |  fifth row
   1 |  first row
   2 | second row
   4 | fourth row
   3 |  third row
(5 rows)
Up to this point things went fairly smoothly; apart from the assembly duplicate-file issue, everything was OK.
『肆』 A very simple Java interface compilation problem, please help!
Could it be that the interface merely declares something without actually implementing any methods?
And that after you compiled the interface, by the time you went on to compile the other class file, the interface's class file had reached the end of its life cycle and been automatically destroyed?
What if you skip compiling the interface and compile your LoginAction file directly — does that work?
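For what it's worth, a compiled .class file is not destroyed after compilation, and javac resolves referenced interfaces on its own: if the interface's .java source sits in the same directory (or on the sourcepath), running javac LoginAction.java compiles both the class and the interface in one go.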
『伍』 A question about Java interfaces and reflection: it compiles fine but throws an error at runtime
Class clazz = Class.forName(propName);
That is the line throwing the exception. Print propName and see what it actually contains;
most likely the value was never read from the properties file.
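A minimal sketch of that debugging advice, with hypothetical file and key names (app.properties and impl.class are placeholders, not from the question): read the properties file, print the value, and fail with a clear message before Class.forName ever sees a bad name.

import java.io.FileInputStream
import java.util.Properties

object ReflectCheck {
  def main(args: Array[String]): Unit = {
    val props = new Properties
    val in = new FileInputStream("app.properties") // hypothetical properties file
    try props.load(in) finally in.close()

    val propName = props.getProperty("impl.class") // hypothetical key; null if missing
    println("propName = " + propName)              // the debug print suggested above
    require(propName != null, "key 'impl.class' not found in app.properties")

    val clazz = Class.forName(propName) // throws ClassNotFoundException for a wrong name
    println("Loaded: " + clazz.getName)
  }
}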
『陸』 A Protel/Altium Designer compilation problem: every unconnected pin on a component is reported at compile time. How do I deal with this?
That is nothing to worry about. In general, errors are worth a look, while warnings can be left alone; just make sure the logic of the schematic itself is correct.
『柒』 Which tools need to be installed to compile Scala with sbt on Linux?
1. Download the platform-independent sbt archive, sbt-0.13.5.tgz.
2. Create a directory and unpack the archive into it:
$ sudo mkdir /opt/scala/sbt
$ sudo tar zxvf sbt-0.13.5.tgz -C /opt/scala/
3. Create a script file for launching sbt: pick a location and create the launcher script there, e.g. /opt/scala/s... (a sketch of a typical launcher follows below)
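The truncated step 3 above creates a small launcher script. A sketch of what it typically contains, assuming the layout from step 2 (the memory options are illustrative and adjustable):

#!/bin/bash
# launch sbt via the launcher jar shipped inside the unpacked archive
java -Xms512M -Xmx1536M -Xss1M -XX:+CMSClassUnloadingEnabled -jar /opt/scala/sbt/bin/sbt-launch.jar "$@"

Save it, for example, as /opt/scala/sbt/sbt, make it executable with chmod u+x, and add its directory to PATH so that sbt can be started from any project directory.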
『捌』 A very simple Java interface compilation problem
I suggest reviewing the scopes of public, protected, private, and the default (no modifier) access levels.
『玖』 How do you pick the Scala version sbt compiles against?
Write the following in build.sbt:
scalaVersion := "2.10.5"
and sbt will compile against the library for that version of Scala.
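If the project has to build against several Scala versions, sbt can also cross-build. A minimal sketch in build.sbt:

crossScalaVersions := Seq("2.10.5", "2.11.7")

Prefixing a task with + (for example sbt +package) then runs it once for each listed version.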
『拾』 Altium Designer 6.9: compilation produces many warnings, and nearly every port is flagged. Details inside.
Hold Ctrl and use the mouse to drag a component away. If the wires move with the component, they are genuinely connected (and then I don't understand the warnings either). If the wires do not follow, the connection is wrong: there is no wire between the components — they were simply butted up against each other. Components must be joined by actual wires.