Python error: org.apache.spark.SparkException: Python worker failed to connect back.

Guest · 2024-04-23

org.apache.spark.SparkException: Python worker failed to connect back.

22/12/06 15:04:14 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
22/12/06 15:04:27 ERROR Executor: Exception in task 7.0 in stage 0.0 (TID 7)
org.apache.spark.SparkException: Python worker failed to connect back.
    at org.apache.spark.api.python.PythonWorkerFactory.createSimpleWorker(PythonWorkerFactory.scala:189)
    at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:109)
    at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:124)
    at org.apache.spark.api.python.BasePythonRunner.compute(PythonRunner.scala:164)
    at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:65)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:365)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:329)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
    at org.apache.spark.scheduler.Task.run(Task.scala:136)
    at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:548)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1504)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.SocketTimeoutException: Accept timed out
    at java.net.DualStackPlainSocketImpl.waitForNewConnection(Native Method)
    at java.net.DualStackPlainSocketImpl.socketAccept(DualStackPlainSocketImpl.java:131)
    at java.net.AbstractPlainSocketImpl.accept(AbstractPlainSocketImpl.java:535)
    at java.net.PlainSocketImpl.accept(PlainSocketImpl.java:189)
    at java.net.ServerSocket.implAccept(ServerSocket.java:545)
    at java.net.ServerSocket.accept(ServerSocket.java:513)
    at org.apache.spark.api.python.PythonWorkerFactory.createSimpleWorker(PythonWorkerFactory.scala:176)
    ... 14 more
22/12/06 15:04:27 WARN TaskSetManager: Lost task 7.0 in stage 0.0 (TID 7) (172.23.48.1 executor driver): org.apache.spark.SparkException: Python worker failed to connect back.
    at org.apache.spark.api.python.PythonWorkerFactory.createSimpleWorker(PythonWorkerFactory.scala:189)
    at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:109)
    at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:124)
    at org.apache.spark.api.python.BasePythonRunner.compute(PythonRunner.scala:164)
    at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:65)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:365)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:329)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
    at org.apache.spark.scheduler.Task.run(Task.scala:136)
    at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:548)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1504)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.SocketTimeoutException: Accept timed out
    at java.net.DualStackPlainSocketImpl.waitForNewConnection(Native Method)
    at java.net.DualStackPlainSocketImpl.socketAccept(DualStackPlainSocketImpl.java:131)
    at java.net.AbstractPlainSocketImpl.accept(AbstractPlainSocketImpl.java:535)
    at java.net.PlainSocketImpl.accept(PlainSocketImpl.java:189)
    at java.net.ServerSocket.implAccept(ServerSocket.java:545)
    at java.net.ServerSocket.accept(ServerSocket.java:513)
    at org.apache.spark.api.python.PythonWorkerFactory.createSimpleWorker(PythonWorkerFactory.scala:176)
    ... 14 more
22/12/06 15:04:27 ERROR TaskSetManager: Task 7 in stage 0.0 failed 1 times; aborting job
Traceback (most recent call last):
  File "E:\学习资料\编程项目\Python\test\pythonProject1\pyspark\03_数据计算_map方法.py", line 26, in <module>
    print(rdd2.collect())
  File "E:\module\learn\Python\lib\site-packages\pyspark\rdd.py", line 1197, in collect
    sock_info = self.ctx._jvm.PythonRDD.collectAndServe(self._jrdd.rdd())
  File "E:\module\learn\Python\lib\site-packages\py4j\java_gateway.py", line 1321, in __call__
    return_value = get_return_value(
  File "E:\module\learn\Python\lib\site-packages\py4j\protocol.py", line 326, in get_return_value
    raise Py4JJavaError(
py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.collectAndServe.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 7 in stage 0.0 failed 1 times, most recent failure: Lost task 7.0 in stage 0.0 (TID 7) (172.23.48.1 executor driver): org.apache.spark.SparkException: Python worker failed to connect back.
    at org.apache.spark.api.python.PythonWorkerFactory.createSimpleWorker(PythonWorkerFactory.scala:189)
    at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:109)
    at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:124)
    at org.apache.spark.api.python.BasePythonRunner.compute(PythonRunner.scala:164)
    at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:65)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:365)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:329)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
    at org.apache.spark.scheduler.Task.run(Task.scala:136)
    at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:548)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1504)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.SocketTimeoutException: Accept timed out
    at java.net.DualStackPlainSocketImpl.waitForNewConnection(Native Method)
    at java.net.DualStackPlainSocketImpl.socketAccept(DualStackPlainSocketImpl.java:131)
    at java.net.AbstractPlainSocketImpl.accept(AbstractPlainSocketImpl.java:535)
    at java.net.PlainSocketImpl.accept(PlainSocketImpl.java:189)
    at java.net.ServerSocket.implAccept(ServerSocket.java:545)
    at java.net.ServerSocket.accept(ServerSocket.java:513)
    at org.apache.spark.api.python.PythonWorkerFactory.createSimpleWorker(PythonWorkerFactory.scala:176)
    ... 14 more
Driver stacktrace:
    at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2672)
    at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2608)
    at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2607)
    at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
    at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2607)
    at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1182)
    at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1182)
    at scala.Option.foreach(Option.scala:407)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1182)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2860)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2802)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2791)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
    at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:952)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2228)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2249)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2268)
    at org.apache.spark.SparkContext.runJob(SparkContext.scala:2293)
    at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1021)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
    at org.apache.spark.rdd.RDD.withScope(RDD.scala:406)
    at org.apache.spark.rdd.RDD.collect(RDD.scala:1020)
    at org.apache.spark.api.python.PythonRDD$.collectAndServe(PythonRDD.scala:180)
    at org.apache.spark.api.python.PythonRDD.collectAndServe(PythonRDD.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:282)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.ClientServerConnection.waitForCommands(ClientServerConnection.java:182)
    at py4j.ClientServerConnection.run(ClientServerConnection.java:106)
    at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.spark.SparkException: Python worker failed to connect back.
    at org.apache.spark.api.python.PythonWorkerFactory.createSimpleWorker(PythonWorkerFactory.scala:189)
    at org.apache.spark.api.python.PythonWorkerFactory.create(PythonWorkerFactory.scala:109)
    at org.apache.spark.SparkEnv.createPythonWorker(SparkEnv.scala:124)
    at org.apache.spark.api.python.BasePythonRunner.compute(PythonRunner.scala:164)
    at org.apache.spark.api.python.PythonRDD.compute(PythonRDD.scala:65)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:365)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:329)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
    at org.apache.spark.scheduler.Task.run(Task.scala:136)
    at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:548)
    at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1504)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:551)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    ... 1 more
Caused by: java.net.SocketTimeoutException: Accept timed out
    at java.net.DualStackPlainSocketImpl.waitForNewConnection(Native Method)
    at java.net.DualStackPlainSocketImpl.socketAccept(DualStackPlainSocketImpl.java:131)
    at java.net.AbstractPlainSocketImpl.accept(AbstractPlainSocketImpl.java:535)
    at java.net.PlainSocketImpl.accept(PlainSocketImpl.java:189)
    at java.net.ServerSocket.implAccept(ServerSocket.java:545)
    at java.net.ServerSocket.accept(ServerSocket.java:513)
    at org.apache.spark.api.python.PythonWorkerFactory.createSimpleWorker(PythonWorkerFactory.scala:176)
    ... 14 more

Process finished with exit code 1
The root cause is in the "Caused by" line: Spark launched a Python worker process, but the worker never connected back before the socket accept timed out. This usually happens because PySpark cannot find the right Python interpreter. The fix is to import the os module and tell PySpark where the interpreter lives.

Solution:
Set the path to the Python interpreter (this must run before the SparkContext is created):

import os
os.environ['PYSPARK_PYTHON'] = "E:\\module\\learn\\Python\\python.exe"
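If you would rather not hard-code the path, a more portable variant is to reuse the interpreter that is already running the script. This is a minimal sketch using only the standard library; the assertion is just a sanity check, not part of the fix:

```python
import os
import sys

# sys.executable is the interpreter running this script, so the path
# is guaranteed to exist on this machine - no hard-coded location.
os.environ['PYSPARK_PYTHON'] = sys.executable

# Sanity check: if this path pointed at nothing, Spark's worker launch
# would time out with the same "failed to connect back" error.
assert os.path.isfile(os.environ['PYSPARK_PYTHON'])
print(os.environ['PYSPARK_PYTHON'])
```

Either way, the environment variable has to be set before the SparkContext is constructed, because Spark reads it when it launches worker processes.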


Result:
The script now runs successfully:

E:\module\learn\Python\python.exe E:/学习资料/编程项目/Python/test/pythonProject1/pyspark/03_数据计算_map方法.py
22/12/06 15:09:57 WARN Shell: Did not find winutils.exe: java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/12/06 15:09:57 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[10, 20, 30, 40, 50]

Process finished with exit code 0
