
16/03/30 23:23:20 WARN TaskSetManager: Lost task 1.0 in stage 0.0 (TID 1, 10.208.91.144): java.lang.NoClassDefFoundError: edu/stanford/nlp/trees/TreebankLanguagePack
    at java.lang.Class.getDeclaredFields0(Native Method)
    at java.lang.Class.privateGetDeclaredFields(Class.java:2499)
    at java.lang.Class.getDeclaredField(Class.java:1951)
    at java.io.ObjectStreamClass.getDeclaredSUID(ObjectStreamClass.java:1659)
    at java.io.ObjectStreamClass.access$700(ObjectStreamClass.java:72)
    at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:480)
    at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:468)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.io.ObjectStreamClass.<init>(ObjectStreamClass.java:468)
    at java.io.ObjectStreamClass.lookup(ObjectStreamClass.java:365)
    at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:602)
    at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)
    at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
    at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
    at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
    at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:69)
    at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:95)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:58)
    at org.apache.spark.scheduler.Task.run(Task.scala:70)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: edu.stanford.nlp.trees.TreebankLanguagePack
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 58 more

What does this exception in Spark mean?

16/03/30 23:23:20 INFO TaskSetManager: Starting task 1.1 in stage 0.0 (TID 2, 10.208.91.144, PROCESS_LOCAL, 5942 bytes)
16/03/30 23:23:20 INFO TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0) on executor 10.208.91.144: java.lang.NoClassDefFoundError (edu/stanford/nlp/trees/TreebankLanguagePack) [duplicate 1]
16/03/30 23:23:20 INFO TaskSetManager: Starting task 0.1 in stage 0.0 (TID 3, 10.208.91.144, PROCESS_LOCAL, 1435 bytes)
16/03/30 23:23:20 WARN TransportChannelHandler: Exception in connection from /10.208.91.144:61788
java.io.IOException: An existing connection was forcibly closed by the remote host
    at sun.nio.ch.SocketDispatcher.read0(Native Method)
    at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:43)
    at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
    at sun.nio.ch.IOUtil.read(IOUtil.java:192)
    at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
    at io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:311)
    at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:881)
    at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:225)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:119)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
    at java.lang.Thread.run(Thread.java:745)
16/03/30 23:23:20 ERROR TaskSchedulerImpl: Lost executor 0 on 10.208.91.144: remote Rpc client disassociated
16/03/30 23:23:20 INFO TaskSetManager: Re-queueing tasks for 0 from TaskSet 0.0
16/03/30 23:23:20 WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://[email protected]:61767] has failed, address is now gated for [5000] ms. Reason is: [Disassociated].
16/03/30 23:23:20 WARN TaskSetManager: Lost task 1.1 in stage 0.0 (TID 2, 10.208.91.144): ExecutorLostFailure (executor 0 lost)
16/03/30 23:23:20 WARN TaskSetManager: Lost task 0.1 in stage 0.0 (TID 3, 10.208.91.144): ExecutorLostFailure (executor 0 lost)
16/03/30 23:23:20 INFO DAGScheduler: Executor lost: 0 (epoch 0)
16/03/30 23:23:20 INFO BlockManagerMasterEndpoint: Trying to remove executor 0 from BlockManagerMaster.
16/03/30 23:23:20 INFO BlockManagerMasterEndpoint: Removing block manager BlockManagerId(0, 10.208.91.144, 61786)
16/03/30 23:23:20 INFO BlockManagerMaster: Removed 0 successfully in removeExecutor
16/03/30 23:23:20 INFO AppClient$ClientActor: Executor updated: app-20160330232314-0002/0 is now EXITED (Command exited with code 50)
16/03/30 23:23:20 INFO SparkDeploySchedulerBackend: Executor app-20160330232314-0002/0 removed: Command exited with code 50
16/03/30 23:23:20 ERROR SparkDeploySchedulerBackend: Asked to remove non-existent executor 0
16/03/30 23:23:20 INFO AppClient$ClientActor: Executor added: app-20160330232314-0002/1 on worker-20160330231130-10.208.91.144-61218 (10.208.91.144:61218) with 4 cores
16/03/30 23:23:20 INFO SparkDeploySchedulerBackend: Granted executor ID app-20160330232314-0002/1 on hostPort 10.208.91.144:61218 with 4 cores, 512.0 MB RAM
16/03/30 23:23:20 INFO AppClient$ClientActor: Executor updated: app-20160330232314-0002/1 is now RUNNING
16/03/30 23:23:20 INFO AppClient$ClientActor: Executor updated: app-20160330232314-0002/1 is now LOADING
16/03/30 23:23:23 INFO SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://[email protected]:61815/user/Executor#-238863041]) with ID 1
16/03/30 23:23:23 INFO TaskSetManager: Starting task 0.2 in stage 0.0 (TID 4, 10.208.91.144, PROCESS_LOCAL, 1435 bytes)
16/03/30 23:23:23 INFO TaskSetManager: Starting task 1.2 in stage 0.0 (TID 5, 10.208.91.144, PROCESS_LOCAL, 5942 bytes)
16/03/30 23:23:24 INFO BlockManagerMasterEndpoint: Registering block manager 10.208.91.144:61834 with 265.4 MB RAM, BlockManagerId(1, 10.208.91.144, 61834)
16/03/30 23:23:24 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 10.208.91.144:61834 (size: 30.4 KB, free: 265.4 MB)
16/03/30 23:23:24 INFO TaskSetManager: Lost task 1.2 in stage 0.0 (TID 5) on executor 10.208.91.144: java.lang.NoClassDefFoundError (edu/stanford/nlp/trees/TreebankLanguagePack) [duplicate 2]
16/03/30 23:23:24 INFO TaskSetManager: Starting task 1.3 in stage 0.0 (TID 6, 10.208.91.144, PROCESS_LOCAL, 5942 bytes)
16/03/30 23:23:24 INFO TaskSetManager: Lost task 0.2 in stage 0.0 (TID 4) on executor 10.208.91.144: java.lang.NoClassDefFoundError (edu/stanford/nlp/trees/TreebankLanguagePack) [duplicate 3]
16/03/30 23:23:24 INFO TaskSetManager: Starting task 0.3 in stage 0.0 (TID 7, 10.208.91.144, PROCESS_LOCAL, 1435 bytes)
16/03/30 23:23:25 WARN TransportChannelHandler: Exception in connection from /10.208.91.144:61835
java.io.IOException: An existing connection was forcibly closed by the remote host
    at sun.nio.ch.SocketDispatcher.read0(Native Method)
    at sun.nio.ch.SocketDispatcher.read(SocketDispatcher.java:43)
    at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
    at sun.nio.ch.IOUtil.read(IOUtil.java:192)
    at sun.nio.ch.SocketChannelImpl.read(SocketChannelImpl.java:380)
    at io.netty.buffer.PooledUnsafeDirectByteBuf.setBytes(PooledUnsafeDirectByteBuf.java:311)
    at io.netty.buffer.AbstractByteBuf.writeBytes(AbstractByteBuf.java:881)
    at io.netty.channel.socket.nio.NioSocketChannel.doReadBytes(NioSocketChannel.java:225)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:119)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
    at java.lang.Thread.run(Thread.java:745)
16/03/30 23:23:25 ERROR TaskSchedulerImpl: Lost executor 1 on 10.208.91.144: remote Rpc client disassociated
16/03/30 23:23:25 INFO TaskSetManager: Re-queueing tasks for 1 from TaskSet 0.0
16/03/30 23:23:25 WARN TaskSetManager: Lost task 0.3 in stage 0.0 (TID 7, 10.208.91.144): ExecutorLostFailure (executor 1 lost)
16/03/30 23:23:25 WARN ReliableDeliverySupervisor: Association with remote system [akka.tcp://[email protected]:61815] has failed, address is now gated for [5000] ms. Reason is: [Disassociated].
16/03/30 23:23:25 ERROR TaskSetManager: Task 0 in stage 0.0 failed 4 times; aborting job
16/03/30 23:23:25 WARN TaskSetManager: Lost task 1.3 in stage 0.0 (TID 6, 10.208.91.144): ExecutorLostFailure (executor 1 lost)
16/03/30 23:23:25 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
16/03/30 23:23:25 INFO TaskSchedulerImpl: Cancelling stage 0
16/03/30 23:23:25 INFO DAGScheduler: ResultStage 0 (saveAsTextFile at Polarity.java:62) failed in 8.085 s
16/03/30 23:23:25 INFO DAGScheduler: Job 0 failed: saveAsTextFile at Polarity.java:62, took 8.447334 s
16/03/30 23:23:25 INFO DAGScheduler: Executor lost: 1 (epoch 1)
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 7, 10.208.91.144): ExecutorLostFailure (executor 1 lost)
Driver stacktrace:
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1266)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1257)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1256)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1256)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:730)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1450)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1411)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
Exception in thread "main" java.io.FileNotFoundException: G:\Workspace\DSS\TextFile\part-00000 (The system cannot find the file specified)
    at java.io.FileInputStream.open0(Native Method)
    at java.io.FileInputStream.open(FileInputStream.java:195)
    at java.io.FileInputStream.<init>(FileInputStream.java:138)
    at java.io.FileInputStream.<init>(FileInputStream.java:93)
    at java.io.FileReader.<init>(FileReader.java:58)
    at com.Polarity.read(Polarity.java:94)
    at com.Polarity.main(Polarity.java:288)
16/03/30 23:23:25 INFO BlockManagerMasterEndpoint: Trying to remove executor 1 from BlockManagerMaster.
16/03/30 23:23:25 INFO BlockManagerMasterEndpoint: Removing block manager BlockManagerId(1, 10.208.91.144, 61834)
16/03/30 23:23:25 INFO BlockManagerMaster: Removed 1 successfully in removeExecutor
16/03/30 23:23:25 INFO AppClient$ClientActor: Executor updated: app-20160330232314-0002/1 is now EXITED (Command exited with code 50)
16/03/30 23:23:25 INFO SparkDeploySchedulerBackend: Executor app-20160330232314-0002/1 removed: Command exited with code 50
16/03/30 23:23:25 ERROR SparkDeploySchedulerBackend: Asked to remove non-existent executor 1
16/03/30 23:23:25 INFO AppClient$ClientActor: Executor added: app-20160330232314-0002/2 on worker-20160330231130-10.208.91.144-61218 (10.208.91.144:61218) with 4 cores
16/03/30 23:23:25 INFO SparkDeploySchedulerBackend: Granted executor ID app-20160330232314-0002/2 on hostPort 10.208.91.144:61218 with 4 cores, 512.0 MB RAM
16/03/30 23:23:25 INFO SparkContext: Invoking stop() from shutdown hook
16/03/30 23:23:25 INFO AppClient$ClientActor: Executor updated: app-20160330232314-0002/2 is now RUNNING
16/03/30 23:23:25 INFO AppClient$ClientActor: Executor updated: app-20160330232314-0002/2 is now LOADING
16/03/30 23:23:25 INFO SparkUI: Stopped Spark web UI at http://10.208.91.144:4040
16/03/30 23:23:25 INFO DAGScheduler: Stopping DAGScheduler
16/03/30 23:23:25 INFO SparkDeploySchedulerBackend: Shutting down all executors
16/03/30 23:23:25 INFO SparkDeploySchedulerBackend: Asking each executor to shut down
16/03/30 23:23:25 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/03/30 23:23:25 INFO Utils: path = C:\Users\Developer\AppData\Local\Temp\spark-618a9039-a9d4-4fb2-bdc8-408d1e7f3c0e\blockmgr-4436a6d6-fca4-4190-ac2a-48c8ebd4e7db, already present as root for deletion.
16/03/30 23:23:25 INFO MemoryStore: MemoryStore cleared
16/03/30 23:23:25 INFO BlockManager: BlockManager stopped
16/03/30 23:23:25 INFO BlockManagerMaster: BlockManagerMaster stopped
16/03/30 23:23:25 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/03/30 23:23:25 INFO RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
16/03/30 23:23:25 INFO SparkContext: Successfully stopped SparkContext
16/03/30 23:23:25 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
16/03/30 23:23:25 INFO Utils: Shutdown hook called
16/03/30 23:23:25 INFO Utils: Deleting directory C:\Users\Developer\AppData\Local\Temp\spark-618a9039-a9d4-4fb2-bdc8-408d1e7f3c0e

This is my pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>Fyp</groupId>
  <artifactId>DSS</artifactId>
  <version>0.0.1-SNAPSHOT</version>

  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>2.3.2</version>
        <configuration>
          <source>1.8</source>
          <target>1.8</target>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-jar-plugin</artifactId>
        <configuration>
          <archive>
            <manifest>
              <addClasspath>true</addClasspath>
              <classpathPrefix>lib/</classpathPrefix>
              <mainClass>com.Polarity</mainClass>
            </manifest>
          </archive>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <executions>
          <execution>
            <phase>compile</phase>
            <goals>
              <goal>compile</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>

  <dependencies>
    <!-- Import Spark -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.4.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.10</artifactId>
      <version>1.4.0</version>
    </dependency>
    <dependency>
      <groupId>org.springframework</groupId>
      <artifactId>spring-core</artifactId>
      <version>2.5</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.10</artifactId>
      <version>1.3.1</version>
    </dependency>
    <dependency>
      <groupId>jaws</groupId>
      <artifactId>jaws</artifactId>
      <version>1.2</version>
      <type>jar</type>
      <scope>system</scope>
      <systemPath>G:/Workspace/DSS/lib/jaws-bin.jar</systemPath>
    </dependency>
    <dependency>
      <groupId>commons-logging</groupId>
      <artifactId>commons-logging</artifactId>
      <version>1.1.3</version>
    </dependency>
    <dependency>
      <groupId>edu.stanford.nlp</groupId>
      <artifactId>stanford-corenlp</artifactId>
      <version>3.5.2</version>
    </dependency>
    <dependency>
      <groupId>edu.stanford.nlp</groupId>
      <artifactId>stanford-parser</artifactId>
      <version>2.0.2</version>
    </dependency>
    <dependency>
      <groupId>edu.stanford.nlp</groupId>
      <artifactId>stanford-corenlp</artifactId>
      <version>3.5.0</version>
      <classifier>models</classifier>
    </dependency>
    <dependency>
      <groupId>com.googlecode.json-simple</groupId>
      <artifactId>json-simple</artifactId>
      <version>1.1</version>
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.11</version>
      <scope>test</scope>
    </dependency>
  </dependencies>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>

</project>

NoClassDefFoundError = you don't have the required jar in your project; you need to add it to your classpath – Whitefret


But I did add a Maven dependency for the jar –


So why does this error occur? –

Answers


The first line of the stack trace tells you what the problem is:

java.lang.NoClassDefFoundError: edu/stanford/nlp/trees/TreebankLanguagePack

If you search Google or StackOverflow for NoClassDefFoundError, you will find articles describing how to solve this problem. It is not specific to Apache Spark; it is a general Java issue. In all likelihood your Java classpath is configured incorrectly: you are missing a jar on the classpath, you have the wrong version of a jar, or you have not included all the required class files on the classpath.

You need to work out which jar contains the class edu.stanford.nlp.trees.TreebankLanguagePack. A Google search for that class name suggests you are missing stanford-parser.jar. You may be missing other jars as well.
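
If it is unclear whether (and from which jar) a class is visible at runtime, a small diagnostic like the sketch below can help. This is only an illustration, not part of the original answer (ClasspathCheck is a made-up name); run it on the JVM whose classpath you are checking. On a Spark cluster the driver may well find the class while the executors, which have their own classpath, do not:

public class ClasspathCheck {
    public static void main(String[] args) {
        try {
            // Class name taken from the stack trace above
            Class<?> cls = Class.forName("edu.stanford.nlp.trees.TreebankLanguagePack");
            // The CodeSource reveals which jar the class was actually loaded from
            // (it can be null for classes on the bootstrap classpath)
            System.out.println("Loaded from: "
                    + cls.getProtectionDomain().getCodeSource().getLocation());
        } catch (ClassNotFoundException e) {
            System.out.println("Not on the classpath: " + e.getMessage());
        }
    }
}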


UPDATE: Now that you have posted your Maven config: I think you have specified an old version of stanford-parser that does not contain the TreebankLanguagePack class. Try this instead:

<dependency> 
    <groupId>edu.stanford.nlp</groupId> 
    <artifactId>stanford-parser</artifactId> 
    <version>3.6.0</version> 
</dependency> 
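
(Whichever version you choose, it is worth keeping the Stanford artifacts consistent: the pom above mixes stanford-corenlp 3.5.2, the 3.5.0 models, and stanford-parser 2.0.2, and mismatched releases are a common source of exactly this kind of missing-class error.)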

Isn't that what Maven is for? – Whitefret


Well, maybe so. He asked what the exception means. –


Is this exception because I didn't add the jars to the SparkContext? –


I solved this problem. The error was caused by jar files that had not been attached to the SparkContext, so they were never shipped to the executors.

Here are the jars I added:

// Dependency jars that every executor needs on its classpath
String[] jars = {
    "lib/stanford-corenlp-1.3.5.jar",
    "lib/stanford-parse-models-1.3.2.jar",
    "lib/stanford-parser.jar",
    "lib/stanford-parser-3.5.2-javadoc.jar",
    "lib/stanford-postagger.jar",
    "lib/stanford-postagger-3.5.2.jar",
    "lib/stanford-postagger-3.5.2-javadoc.jar",
    "lib/org.springframework.core-3.0.3.RELEASE.jar"
};
SparkConf sparkConf = new SparkConf()
    .setAppName("DSS")
    .setMaster("spark://192.168.1.100:7077")
    .setJars(jars);  // distributes the listed jars to the executors
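
Note that setJars is what actually ships these files to the executors; the paths are resolved on the driver's machine, so they must exist there. If you launch the application with spark-submit instead, its --jars option achieves the same thing.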

There is a better way to solve problems like this: using sbt assembly (or a similar plugin), you can build a single fat jar for your Spark application, so all dependencies travel with it.

AFAIK this is the default choice pretty much everywhere.
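
Since the project in question uses Maven rather than sbt, the closest equivalent is the maven-shade-plugin. Below is a minimal sketch, not taken from the original answer (the plugin version shown is an assumption; pick a current release). It goes under <build><plugins> in the pom, and mvn package then produces a single jar that already contains the Stanford classes, so nothing needs to be listed in setJars:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <!-- Version is an assumption; use a current release -->
  <version>2.4.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- Preserve the Main-Class entry so the shaded jar stays runnable -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
            <mainClass>com.Polarity</mainClass>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>

In such a setup the Spark dependencies are usually marked <scope>provided</scope> so that the (large) Spark jars are not bundled into the fat jar.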